Why a 16MP sensor is enough for everyday use

Sure, no-one denies that, but generally we don't call designing the software...
I didn't realize you think the sensor is the only hardware that matters in compiling the image.
In the case of a sensor with on-chip ADCs and a digital output, it is. So now you have learned something.
That explains a lot. Maybe you should ask someone about application-specific integrated circuits (ASICs), A/D converters and CPUs.
Who do you think I should ask? CPUs, by the way are things that run software. If you're using a CPU to 'compile the image', you're using software.
Then come back and we can talk.
Well, maybe if you're one of the people I could be asking, perhaps you might explain: if you have a sensor in which the ADCs are on chip, the output is digital, and the internal operating parameters (such as clock frequencies, capture windows, capture timing, read chain gain) are controlled by programmable registers, just what are these 'ASICs and A/D converters' you speak of doing? Given, that is, that the CPU you mentioned will be running software which will cause those registers to be programmed with the values chosen by the software developers.

BTW, I have actually designed an ASIC for interfacing a Sony sensor to an application processor. It doesn't change the numbers coming out of the sensor; its only task is to translate Sony's proprietary data format (8-lane sub-LVDS) to a format that the application processor can accept (in this case MIPI CSI-2). So when you come up with your answer, you need to concentrate on ASICs which actually alter the performance of the sensor.
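To make the 'programmable registers' point concrete, here is a minimal sketch of what configuring such a sensor amounts to in firmware. The register addresses, names and the mock bus are all hypothetical, purely for illustration, and not taken from any real Sony datasheet:

```python
# Minimal sketch: tuning a digital-output sensor is a matter of writing
# values into its register file. All addresses here are hypothetical.

class MockSensorBus:
    """Stands in for the I2C/SPI link to the sensor's control registers."""
    def __init__(self):
        self.regs = {}

    def write(self, addr, value):
        self.regs[addr] = value

# Hypothetical register map (illustrative only):
REG_ANALOG_GAIN  = 0x0204   # read-chain gain code
REG_COARSE_ITIME = 0x0202   # integration time, in line periods
REG_PLL_MULT     = 0x0306   # pixel-clock multiplier

def configure_capture(bus, gain_code, itime_lines, pll_mult):
    """The camera's software chooses these values; no external analog
    hardware is involved once the sensor's output is already digital."""
    bus.write(REG_ANALOG_GAIN, gain_code)
    bus.write(REG_COARSE_ITIME, itime_lines)
    bus.write(REG_PLL_MULT, pll_mult)
    return bus.regs

bus = MockSensorBus()
regs = configure_capture(bus, gain_code=0x20, itime_lines=1080, pll_mult=8)
print(regs)  # the sensor's behaviour is now fully determined by these values
```

Two cameras driving the same part with different values here (gain split, readout timing) would measure differently, with no hardware difference at all.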

--

263, look deader.
 
Nikon made a statement to the effect that the sensor is their design
Cool.

But again you are very confused.
The one confused here is you.

Suggesting Bob "should ask someone about application-specific integrated circuits (ASIC), A/D converters and CPU" when you yourself need all the help on those matters you can get is beyond amusing. Your ADC reference is quite hilarious.

The matter Bob pointed out was that quoting as a source something that starts with a monstrous marketing hyperbole is appalling.

It'd be better to spend your time learning; arguing with people who know stuff is a waste of your time, unless you are here to entertain them. If that was your intention, you have succeeded.
I never said Nikon manufactured it or developed it.
I said Nikon often gets more out of the sensors than Sony if you go by DxO. I provided proof (links to DxO comparisons) too.

I also said there is more to creating the image file than just the sensor, which is obviously correct.

I hope that helps your confusion.
..
getting better results than Sony according to DxO
Bob answered you, including the DxO part, here https://www.dpreview.com/forums/post/63012234

Ciao.
 
Sure, no-one denies that, but generally we don't call designing the software...
I didn't realize you think the sensor is the only hardware that matters in compiling the image.
In the case of a sensor with on-chip ADCs and a digital output, it is. So now you have learned something.
Most already knew that. But the whole chain is not, LOL!!!!! :D
That explains a lot. Maybe you should ask someone about application-specific integrated circuits (ASICs), A/D converters and CPUs.
Who do you think I should ask? CPUs, by the way are things that run software.
That is not 100% true. You need more to run software, just like you need more than just a sensor.

And your implication that CPUs do not vary and make no difference, that is laughable.
Then come back and we can talk.
Well, maybe if you're one of the people I could be asking, perhaps you might explain: if you have a sensor in which the ADCs are on chip, the output is digital, and the internal operating parameters (such as clock frequencies, capture windows, capture timing, read chain gain) are controlled by programmable registers, just what are these 'ASICs and A/D converters' you speak of doing? Given, that is, that the CPU you mentioned will be running software
Again you imply that all CPUs are the same and have no effect on the resulting file.

Ouch.

Tell me again how the CPU (hardware) and firmware (software) have no effect on the resulting image! LOL!!!

According to you only the sensor matters.

But then you are only trying to cover your FAILURE when you said Nikon doesn't score higher on DxO than Sony despite using the same sensors. Get back to that topic. :D
 
Nikon made a statement to the effect that the sensor is their design
Cool.

But again you are very confused.
The one confused here is you.

Suggesting Bob "should ask someone about application-specific integrated circuits (ASIC), A/D converters and CPU" when you yourself need all the help on those matters you can get is beyond amusing. Your ADC reference is quite hilarious.

The matter Bob pointed out was that quoting as a source something that starts with a monstrous marketing hyperbole is appalling.

It'd be better to spend your time learning; arguing with people who know stuff is a waste of your time, unless you are here to entertain them. If that was your intention, you have succeeded.
It's quite easy to think you know more than you do. Years ago I had a huge argument with Iliah about the workings of the Sony IMX028 sensor found in the D3X and Sony A900, and how and why Nikon was getting better results than Sony from it. I had some experience using Sony sensors (but not that one) and had developed a theory about how Nikon was getting more from it by reading the sensor multiple times and averaging the results (hence the slow 14-bit read rate). I was so taken by the theory, I wasn't letting go of it for anything. What I didn't know was that Iliah was actually using it. Anyway, it turned out that the sensor had a 14-bit mode that operated at 1/4 the speed of the 12-bit mode (as single-slope converters do). The reason that Nikon appeared to get more out of the sensor was just that they were using it in 14-bit mode and Sony in 12-bit mode. The D3X was only ever tested in 14-bit mode. Ingenious as my theory was, it was nonsense.
 
Nikon made a statement to the effect that the sensor is their design
Cool.

But again you are very confused.
The one confused here is you.

Suggesting Bob "should ask someone about application-specific integrated circuits (ASIC), A/D converters and CPU" when you yourself need all the help on those matters you can get is beyond amusing. Your ADC reference is quite hilarious.
Actually I was correct.

And Bob was 100% wrong when he said Nikon never gets more out of Sony sensors (going by DxO scores).

That was proven false, and Bob is only trying to change the subject as a distraction.

https://www.dpreview.com/forums/post/63013547
 
Sorry Bob.

Let's recap. According to DxO, Nikon often gets better ratings with Sony sensors than Sony does. This despite you falsely saying otherwise. (Bob said of the results and DxO scores: "Essentially, they are the same".)
Here is the D800 he said gets the same results as the A7R

https://www.dxomark.com/Cameras/Compare/Side-by-side/Nikon-D800-versus-Sony-A7R___792_917

Another classic example is D600 vs. A7 where there is even more of a difference.

https://www.dxomark.com/Cameras/Compare/Side-by-side/Nikon-D600-versus-Sony-A7___834_916

There is a bigger difference between the D7200 and A6000 (or even the A6300/A6500, which came a few years later).

https://www.dxomark.com/Cameras/Compare/Side-by-side/Sony-A6000-versus-Nikon-D7200___942_1020

Sorry Bob, despite the same sensor, the results are NOT "Essentially... the same".
 
Sure, no-one denies that, but generally we don't call designing the software...
I didn't realize you think the sensor is the only hardware that matters in compiling the image.
In the case of a sensor with on-chip ADCs and a digital output, it is. So now you have learned something.
Most already knew that. But the whole chain is not, LOL!!!!! :D
So, please do explain. Digital data is read from the sensor into the image processor. What, apart from software running on the image processor, is going to change the characteristics of that data with respect to properties like dynamic range, signal to noise and so on? Obviously I've been lucky enough to stumble across a real expert who can teach me something here, so now I'm eager to find out what this hardware is that changes properties already encoded into digital values without performing digital processing on them.
That explains a lot. Maybe you should ask someone about application-specific integrated circuits (ASICs), A/D converters and CPUs.
Who do you think I should ask? CPUs, by the way are things that run software.
That is not 100% true. You need more to run software, just like you need more than just a sensor.

And your implication that CPUs do not vary and make no difference, that is laughable.
I'd also appreciate a little education about the notion of Turing completeness and computability, since it seems that my new-found expert has discovered that the idea no longer applies.
Then come back and we can talk.
Well, maybe if you're one of the people I could be asking, perhaps you might explain: if you have a sensor in which the ADCs are on chip, the output is digital, and the internal operating parameters (such as clock frequencies, capture windows, capture timing, read chain gain) are controlled by programmable registers, just what are these 'ASICs and A/D converters' you speak of doing? Given, that is, that the CPU you mentioned will be running software
Again you imply that all CPUs are the same and have no effect on the resulting file.

Ouch.
As I said, I'd like a little education about how and when the concept of Turing completeness was overturned. I seem to have missed the paper. Strange, I'd have expected a little more furore in academic circles when it happened. I must have been asleep.
Tell me again how the CPU (hardware) and firmware (software) have no effect on the resulting image! LOL!!!
Well, I'm one of those old-fashioned people who believe that if a problem is computable, then the answer will be the same, whichever Turing-complete machine you use to compute it. You seem to know differently, so as I've said, I'd very much appreciate a link to the paper in which all of Turing's (not to mention Church's) work was overturned. It would be hugely interesting.
According to you only the sensor matters.
Which isn't what I said, but never mind. What I said is if the sensor gives a digital output, and is controlled by a digital input, there isn't very much opportunity to change what it does except by programming. That's my experience using this kind of sensor. You clearly know better, in which case I'd like you to be a lot more specific, because it appears that I've been doing things wrongly all this time.
But then you are only trying to cover your FAILURE when you said Nikon doesn't score higher on DxO than Sony despite using the same sensors. Get back to that topic. :D
I didn't say that either. What I did say was that the DxOmark examples you posted were the same to within the margin of error of DxO's measurements.

--
263, look deader.
 
Sorry Bob.

Let's recap. According to DxO, Nikon often gets better ratings with Sony sensors than Sony does. This despite you falsely saying otherwise. (Bob said of the results and DxO scores: "Essentially, they are the same".)
Here is the D800 he said gets the same results as the A7R

https://www.dxomark.com/Cameras/Compare/Side-by-side/Nikon-D800-versus-Sony-A7R___792_917

Another classic example is D600 vs. A7 where there is even more of a difference.

https://www.dxomark.com/Cameras/Compare/Side-by-side/Nikon-D600-versus-Sony-A7___834_916

There is a bigger difference between the D7200 and A6000 (or even the A6300/A6500, which came a few years later).

https://www.dxomark.com/Cameras/Compare/Side-by-side/Sony-A6000-versus-Nikon-D7200___942_1020

Sorry Bob, despite the same sensor, the results are NOT "Essentially... the same".
And you should be sorry. You just said the A6000 has a Toshiba DKAO HEZ1 TOS-5105 inside, and not an IMX210.

How much further does your ignorance go?
 
Sorry Bob.

Let's recap. According to DxO, Nikon often gets better ratings with Sony sensors than Sony does. This despite you falsely saying otherwise. (Bob said of the results and DxO scores: "Essentially, they are the same".)
Please don't accuse me of putting around falsehoods whilst making false claims about what I actually said. I have never said anywhere that Nikon doesn't get better DxOmark ratings with Sony sensors than Sony does, and if you're going to claim otherwise, please link to and quote where I am supposed to have said that.
What I said was that those two are the same to within DxO's margin of error, and indeed very likely the same to within individual sample variance. If you look at the differences, there is a 0.3-bit difference in what they call 'colour depth', a 0.3 EV difference in DR and a 4% difference in 'low light ISO'; all of those are within DxOmark's likely margin of error, given the methods they use.
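As a sketch of the arithmetic, here is that comparison spelled out, using the differences quoted above and tolerances that are my own assumed error bars, not DxOMark's published figures:

```python
# Check whether each quoted metric difference could be explained by
# measurement error. The tolerance values are assumptions for
# illustration, not figures published by DxOMark.

def within_margin(delta, tolerance):
    """True if a measured difference is no larger than the error bar."""
    return abs(delta) <= tolerance

comparisons = {
    # metric: (D800 vs A7R difference, assumed margin of error)
    "colour depth (bits)": (0.3, 0.5),
    "dynamic range (EV)":  (0.3, 0.3),
    "low-light ISO (%)":   (4.0, 5.0),
}

verdicts = {m: within_margin(d, t) for m, (d, t) in comparisons.items()}
print(verdicts)  # every difference falls inside its assumed margin
```

With those (assumed) error bars, none of the three differences distinguishes the cameras; a larger gap, say a full EV of DR, would.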
Another classic example is D600 vs. A7 where there is even more of a difference.

https://www.dxomark.com/Cameras/Compare/Side-by-side/Nikon-D600-versus-Sony-A7___834_916
Maybe you missed the whole controversy about the lack of an uncompressed raw on the A7? DxOmark does its analysis on raw files. If the raw files have compression artefacts, it's going to affect things.
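To show where compression artefacts of that kind come from, here is a toy block-compression scheme in the same spirit as the lossy raw reported for the early A7 bodies: each block keeps its minimum at full precision and encodes the rest as coarse steps. The layout is simplified and hypothetical, not Sony's actual format:

```python
# Toy lossy raw compression: per-block minimum plus coarse deltas.
# A simplified illustration, not Sony's real bit layout.

def compress_block(pixels, delta_bits=7):
    lo, hi = min(pixels), max(pixels)
    step = max(1, (hi - lo) >> delta_bits)   # delta quantisation step
    return lo, step, [(p - lo) // step for p in pixels]

def decompress_block(lo, step, codes):
    return [lo + c * step for c in codes]

# A smooth block survives losslessly (its range fits the delta budget)...
smooth = [512, 514, 515, 513, 512, 516, 511, 513]
lo, step, codes = compress_block(smooth)
assert decompress_block(lo, step, codes) == smooth

# ...but one bright pixel (e.g. a specular highlight at an edge) widens
# the range, so the quiet pixels are quantised more coarsely.
edgy = [512, 514, 515, 513, 2048, 516, 511, 513]
lo, step, codes = compress_block(edgy)
restored = decompress_block(lo, step, codes)
errors = [abs(a - b) for a, b in zip(edgy, restored)]
print(max(errors))  # nonzero: the source of the edge artefacts
```

The point is that the error shows up precisely at high-contrast edges, which is where the A7's raw artefacts were reported, and it happens before any measurement tool ever sees the data.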
There is a bigger difference between D7200 and A6000 (or even the A6300/A6500 which camera a few years later).

https://www.dxomark.com/Cameras/Compare/Side-by-side/Sony-A6000-versus-Nikon-D7200___942_1020
The D7200 and A6000 have completely different sensors: the D7200 uses a Toshiba HEZ1 TOS-5105, the A6000 a Sony IMX-193.
Sorry Bob, despite the same sensor, the results are NOT "Essentially... the same".
Erm, not actually 'the same sensor' in the last case. But I only said 'essentially the same' in the case of the D800/A7R, which as I said is within the margin of error. I also explained about the D7200, which you seem to have ignored, because it doesn't fit your thesis. And then the other is comparing badly lossy compressed raw with uncompressed raw. Hmm.

BTW, the only point I've been making is that Nikon did not 'develop' the D800 sensor as they claimed, and that any differences in the case of these shared sensors are not down to differences in sensor design or development, but down to software, which includes the way the sensor is configured in software: things such as readout cycle, gain choices and so on. In the one case where you have shown a performance difference between two cameras using the same sensor, the cause is a software matter, namely how the raw is compressed (software choices such as gain and readout cycle also make a difference to DxO metrics). You seem to be suggesting that Nikon has some magic extra hardware. It isn't true. And even if it were, it wouldn't justify a claim that they 'developed' the sensor. That claim is just false.

--
263, look deader.
 
That explains a lot. Maybe you should ask someone about application-specific integrated circuits (ASICs), A/D converters and CPUs.
Who do you think I should ask? CPUs, by the way are things that run software.
That is not 100% true. You need more to run software, just like you need more than just a sensor.

And your implication that CPUs do not vary and make no difference, that is laughable.
I'd also appreciate a little education about the notion of Turing completeness and computability, since it seems that my new-found expert has discovered that the idea no longer applies.
Joke's on you and Turing too, Bob. You've been arguing with a computer this whole time (the "Brainiac") and didn't realize it!
 
That explains a lot. Maybe you should ask someone about application-specific integrated circuits (ASICs), A/D converters and CPUs.
Who do you think I should ask? CPUs, by the way are things that run software.
That is not 100% true. You need more to run software, just like you need more than just a sensor.

And your implication that CPUs do not vary and make no difference, that is laughable.
I'd also appreciate a little education about the notion of Turing completeness and computability, since it seems that my new-found expert has discovered that the idea no longer applies.
Joke's on you and Turing too, Bob. You've been arguing with a computer this whole time (the "Brainiac") and didn't realize it!
:-D
 
If the “best” sensor is essential then we must all delete our previous images the moment we “upgrade” as anything previous isn’t worthy. In fact every photo ever taken should be obliterated from existence the minute a new wonder sensor is released. Who wants to make their eyes bleed looking at Ansel Adams or Cartier-Bresson..... they used film. How old school and totally unacceptable ..........
 
According to DxO Nikon often gets better ratings with Sony sensors than Sony does.
I wholeheartedly agree that even though they were said to have the same sensor, the pictures from the D600 were cleaner, with a bit more DR.

My thought was that Sony used less voltage or something to keep the energy use down to prevent overheating. And I read that the layer with OSPDAF pixels possibly adversely affected IQ, even if just slightly. Whatever the reason, my family and I saw a difference even under the same conditions.
 
According to DxO Nikon often gets better ratings with Sony sensors than Sony does.
I wholeheartedly agree that even though they were said to have the same sensor, the pictures from the D600 were cleaner, with a bit more DR.

My thought was that Sony used less voltage or something to keep the energy use down to prevent overheating.
Sony drives these sensors with 5V (for the sensor array), 3.3V (for the mixed-signal circuitry) and 1.8V (for the logic and data signalling). There is also a dedicated reset supply, for which Sony uses 4.5V. All those seem to be fairly standard default values (and are the values specified for the part). I have no information on what Nikon uses, but I would be very surprised indeed if they deviated from the specified voltages.

It's very hard to see how changing any of those voltages would increase IQ. Increasing the digital supply voltages would allow faster clocking, but that wouldn't affect IQ. Increasing the sensor supply voltage won't increase the charge collection. Increasing the reset voltage would, but then it would drive the pixel output out of range of the read circuitry, the voltage swing of which is dependent on internally generated biases, which aren't going to be changed by a change of supply voltage. Using higher-than-specified voltages would yield no greater IQ and would significantly reduce reliability.
And I read that the layer with OSPDAF pixels possibly adversely affected IQ, even if just slightly. Whatever the reason, my family and I saw a difference even under the same conditions.
The main thing in that example, as I said, was the very poor raw format at the time, with a non-optional and artefact-ridden compression scheme. They put it right later. But that's that specific example. There are many opportunities to tune the usage of these sensors to get very different results from the same sensor, and performance differences between cameras using the same sensor are not uncommon.

Possibly the most extreme example is the difference between the D7100 and D7200, which have been confirmed to use the same sensor by Chipworks and also by a Nikon statement to Imaging Resource. Nonetheless the D7200 shows significantly lower read noise and less pattern noise. I suspect that the D7100 was somewhat underperforming due to Nikon engineers' lack of familiarity with a new sensor source; that is, they were programming it as though it was a Sony. Sony sensors have pretty low pattern noise due to the hybrid analog/digital CDS scheme, which isn't available to any other manufacturer (except Aptina, which has never used it, so far as I know), being well wrapped up in patent protection. The usual way of cancelling fixed pattern noise is to characterise it using the 'optical black' pixels round the border of the array and then digitally subtract it, and it's quite possible that the D7200 software paid more attention to this.

There is also evidence that Nikon is manipulating the gain differently between the two cameras. I don't have specific information on this Toshiba sensor, but many sensors have two stages of variable gain, and the characteristics change depending on how the selected gain is shared between the two stages. All these changes are made by programming registers within the sensor, not by external hardware choices, which is why I said to brainiac that this is about programming. It's about programming to use the hardware you have most effectively.
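The 'characterise from the optical black pixels, then subtract digitally' step can be sketched in a few lines of NumPy. The frame sizes and the per-row-offset noise model here are simplified assumptions, just to show the principle:

```python
# Sketch of fixed-pattern-noise correction using optical-black pixels.
# Array sizes and noise magnitudes are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
rows, active_cols, ob_cols = 480, 640, 16

# Per-row fixed-pattern offset: the 'banding' we want to remove.
row_offset = rng.normal(0.0, 5.0, (rows, 1))

# Optical-black border pixels see no light: offset plus read noise only.
dark = row_offset + rng.normal(0.0, 2.0, (rows, ob_cols))
# Active pixels see a flat 100-count scene plus the same offset.
img = 100.0 + row_offset + rng.normal(0.0, 2.0, (rows, active_cols))

# Characterise each row's offset from its black pixels, then subtract
# it digitally from the active pixels.
estimate = dark.mean(axis=1, keepdims=True)
corrected = img - estimate

banding_before = img.mean(axis=1).std()        # ~5: the pattern dominates
banding_after = corrected.mean(axis=1).std()   # much smaller residual
print(banding_before, banding_after)
```

This is pure digital post-processing of already-digital data, so how aggressively it is applied is a firmware decision, which is consistent with two cameras sharing a sensor but showing different pattern noise.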
 
According to DxO Nikon often gets better ratings with Sony sensors than Sony does.
I wholeheartedly agree that even though they were said to have the same sensor, the pictures from the D600 were cleaner, with a bit more DR.

My thought was that Sony used less voltage or something to keep the energy use down to prevent overheating. And I read that the layer with OSPDAF pixels possibly adversely affected IQ, even if just slightly. Whatever the reason, my family and I saw a difference even under the same conditions.
To add to Bob's excellent account on hardware and firmware that programs the sensor itself, there is another thing you guys are missing.

Out-of-camera JPEGs are what a company considers to be good enough and cost-effective. Comparing out-of-camera JPEGs, one can't be certain what causes the differences between the cameras. It can easily be the conversion itself.

If you are comparing raw conversions using the manufacturer's raw converter, the above equally applies.

But if you are comparing using a third-party converter, you are introducing yet another very important variable: how good that third party understands the raw format for particular cameras.

The same applies to DxO results: they depend on knowledge of the raw format. If the test misses important metadata instructions (and I can easily bring in examples where the raw data itself is not sufficient for a test; it needs to be correctly interpreted through additional metadata instructions), the test results are skewed.

So when you say you see a difference, it may be just because of the conversion and an insufficient understanding of the raw format.
 
We can speculate all we want, but Nikon does something to score higher over at DxO with the same sensor. Maybe it is the rest of the HW, or what they do with the data. Who cares.

The fact remains, Nikon gets different results.
 
We can speculate all we want, but Nikon does something to score higher over at DxO with the same sensor. Maybe it is the rest of the HW,
No, it isn't 'the rest of the HW', for the reasons discussed earlier. Most of what I said wasn't 'speculation', by the way. It's how it is.
or what they do with the data. Who cares.
Well clearly someone does, because the idea that the difference is in the software that controls the sensor seems to be unacceptable to some.
The fact remains, Nikon gets different results.
But that was never a question anyone was posing. The question was whether Nikon 'developed' the D800 sensor, and the answer is no. The rest has just been a huge detour on the part of some Nikon fans who can't live with Nikon buying commodity sensors from Sony just like everyone else. Nikon also does design its own sensors, but not the one in the D800, or for that matter the D850.

Given that we're on the mFT forum, and Nikon is something of an irrelevance here, it's worth noting that the same thing happens in mFT. Olympus and Panasonic get different results using the same sensor. And like for the Nikon sensor, it isn't in the hardware (unless it's greater processing capacity to do some more real-time image processing), because the sensor has a digital interface.

--
263, look deader.
 
I agree. Bob is floundering because his initial point was a complete fail.

It's obvious the old D7200 with a Sony sensor performs better than any Sony APS-C camera with the same sensor (one of several examples). That's been proven over and over. Bobn tried to tell us they were the same, and looked pretty silly. Now he is floundering and trying desperately to change the subject. This is common in forums with certain types.

When he finally admits he was wrong we can move on. But we all know he'll stick his head in the sand.
 
We can speculate all we want, but Nikon does something to score higher over at DxO with the same sensor. Maybe it is the rest of the HW,
No, it isn't 'the rest of the HW', for the reasons discussed earlier. Most of what I said wasn't 'speculation', by the way. It's how it is.
or what they do with the data. Who cares.
Well clearly someone does, because the idea that the difference is in the software that controls the sensor seems to be unacceptable to some.
The fact remains, Nikon gets different results.
But that was never a question anyone was posing. The question was whether Nikon 'developed' the D800 sensor, and the answer is no. The rest has just been a huge detour on the part of some Nikon fans who can't live with Nikon buying commodity sensors from Sony just like everyone else. Nikon also does design its own sensors, but not the one in the D800, or for that matter the D850.

Given that we're on the mFT forum, and Nikon is something of an irrelevance here, it's worth noting that the same thing happens in mFT. Olympus and Panasonic get different results using the same sensor. And like for the Nikon sensor, it isn't in the hardware (unless it's greater processing capacity to do some more real-time image processing), because the sensor has a digital interface.
Those who went for a detour are after a "bigger point". They refuse to acknowledge that not only their "bigger point" is not where this discussion started and there was no reason to bring it up at all, but also that said "bigger point" largely failed.

There are a few cameras for astronomers that use the same sensor as the D800, and the only improvement that happened there is due to sensor cooling. Even attempts to switch the sensor to 16-bit mode failed.

But imagine the "bigger point" supporters' disappointment when they discover that a lot of other hardware, as well as a huge chunk of the firmware and software in Nikon cameras, was also outsourced, that camera body design is often outsourced, or that Nikon are not making their own lithium batteries, magnesium, or (fill in the blank).
 
