Hey fellow photogs!
I've been diving deep into the technical aspects of dynamic range lately, and I've come across some conflicting information regarding the impact of sensor size. Some argue that larger sensors inherently have better dynamic range, while others believe that advancements in technology are closing the gap. The recent BSI and partially stacked sensor designs also play a role here.
But the latest sensor designs, like BSI or the partially stacked sensor in the recent mid-range Nikon, give up a little dynamic range in favor of more speed, so I wouldn't say the last few years have given us as many benefits as the first two decades of digital photography did.
And if you're a landscape photographer who shoots a lot of scenes with high dynamic range, then a large sensor DOES make a fairly big difference, especially if you try to recover a little shadow detail.
Yes, that’s true, especially if you print big.
That being said, photographers who actually take advantage of this are not plentiful. Personally, I take landscapes with both my full frame gear and micro four thirds gear, and like both.
Exactly. What kind of difference does the viewer see in the end result on a phone or tablet between MFT and MF?
What is at issue is the number of photons captured in a pixel (or photodiode) well.
Really? Why is that the only thing at issue? Do you imagine that the DR of a whole image is the same as the DR of one of its constituent pixels?
This (more pixel well depth) will be reflected in higher dynamic range, meaning fewer blown highlights and more detail in the darkest areas.
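A rough sketch of what I mean, in code. The full-well and read-noise numbers are made up for illustration, not measurements from any real camera:

```python
import math

def pixel_dr_stops(full_well_electrons, read_noise_electrons):
    # Engineering DR of a single pixel, in stops: the ratio of the largest
    # recordable signal (full-well capacity) to the noise floor (read noise).
    return math.log2(full_well_electrons / read_noise_electrons)

# Illustrative numbers only, not measured values for any real sensor:
deep_well    = pixel_dr_stops(90000, 3)   # ~14.9 stops
shallow_well = pixel_dr_stops(30000, 3)   # ~13.3 stops
print(f"deep well: {deep_well:.1f} stops, shallow well: {shallow_well:.1f} stops")
```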
If this theory of yours were correct, we'd expect cameras with different sensor sizes but similar pixel sizes to have similar DR, and cameras with the same sensor size but different pixel sizes to have DR that differs by the difference in pixel size.
Let's look at the PDR performance of some actual cameras with similar pixel technology to test your theory.
Consider the Z6, Z7 and Z50. All have similar sensor technology.
The Z50 and Z7 have very similar pixel sizes, but the Z7's sensor has more than twice the area. The Z6's pixels are nearly twice as large as the Z7's, but its sensor is the same size.
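Back-of-the-envelope, from the published sensor dimensions and resolutions (approximate, rounded figures):

```python
# Approximate published sensor dimensions (mm) and image dimensions (pixels)
cameras = {
    "Z6":  {"width": 35.9, "height": 23.9, "px_w": 6048, "px_h": 4024},
    "Z7":  {"width": 35.9, "height": 23.9, "px_w": 8256, "px_h": 5504},
    "Z50": {"width": 23.5, "height": 15.7, "px_w": 5568, "px_h": 3712},
}

for name, c in cameras.items():
    pitch_um = c["width"] / c["px_w"] * 1000   # pixel pitch in microns
    area_mm2 = c["width"] * c["height"]        # sensor area in mm^2
    print(f"{name}: pitch ~{pitch_um:.2f} um, sensor ~{area_mm2:.0f} mm^2")

# Roughly: Z6 ~5.9 um, Z7 ~4.3 um, Z50 ~4.2 um pitch;
# FF ~858 mm^2 vs DX ~369 mm^2, i.e. about 2.3x the area.
```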
It seems that the data do not support your theory. The Z6's pixels are nearly a whole stop larger than those of the Z7, yet the difference in PDR is practically non-existent at ISO 100 and typically about 1/3 of a stop across most of the graph. The Z7 and Z50 have pixels of similar size, yet the difference in PDR between them is twice as large as the difference between the Z6 and Z7.
BSI sensors close to double the amount of photon capture available per unit area of a sensor.
This may be true of tiny pixels on phone-camera sensors, but it is not true of pixels on MFT, APS-C, FF, or MF sensors. On these larger pixels, the switch to BSI makes only a small fraction of a stop of difference, because the non-sensing components on the front side make up a much smaller portion of the pixel surface area. These components do not scale in size with pixel size.
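A crude way to see why: treat the front-side wiring as a fixed-width opaque border around each pixel and compare how much of the pixel it eats at different pitches. The 0.3 µm border width is purely an illustrative assumption, not a real process figure:

```python
def fsi_fill_factor(pitch_um, wiring_border_um=0.3):
    # Fraction of a front-side-illuminated pixel that still gathers light,
    # assuming a fixed-width opaque wiring border around the photodiode.
    # The 0.3 um border is an illustrative assumption, not a process number.
    active = max(pitch_um - 2 * wiring_border_um, 0.0)
    return (active / pitch_um) ** 2

for pitch in (1.0, 2.0, 4.3, 5.9):   # phone-ish, compact, Z7-ish, Z6-ish pitches (um)
    print(f"{pitch} um pixel: ~{fsi_fill_factor(pitch):.0%} light-gathering area on FSI")

# BSI moves the wiring behind the photodiode, so its benefit is roughly
# 1 / fill_factor: huge for ~1 um phone pixels, modest for 4-6 um pixels.
```

Real sensors also use microlenses to funnel light past the wiring, which narrows the FSI/BSI gap further, so treat this only as an intuition aid.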
Increasing the MP count for a given sensor area will, however, lessen the possible dynamic range.
The increasing number of pixels does reduce DR, but not in proportion to the increase in pixel count. The decrease in PDR is due to the increased read noise associated with reading out a larger number of pixels in the same amount of time.
Now, this is my understanding, and having seen examples of it through the years of sensor development, I think it is true.
Yet I've shown you data indicating it is not actually true.
Let me introduce you to a different theory.
DR of an image depends mostly on the SNR of the image. The SNR of an image depends primarily on the amount of light captured in the image. For a given technology, sensors of the same size capture about the same amount of light at the same exposure, regardless of pixel size/count. Sensors of different sizes capture light in proportion to the difference in surface area of the whole sensor.
SNR is secondarily affected by read noise, with the effect of read noise becoming more significant at lower exposures. Major factors affecting read noise are the number and speed of read operations.
A factor of two difference in sensor area has a larger effect on SNR than a factor of 2 difference in read operation counts and/or speed.
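If it helps, here's a minimal sketch of that model: photon shot noise plus read noise added in quadrature. The electron counts and read-noise values are illustrative assumptions, not measurements from any camera:

```python
import math

def snr(signal_e, read_noise_e):
    # SNR for a patch of the image: photon shot noise (sqrt of the signal)
    # and read noise added in quadrature. Units are electrons.
    return signal_e / math.sqrt(signal_e + read_noise_e**2)

# Illustrative deep-shadow numbers. Same exposure, same framing:
# a sensor with 2x the area collects 2x the light for that patch of the scene.
big   = snr(signal_e=400, read_noise_e=5)    # larger sensor
small = snr(signal_e=200, read_noise_e=5)    # half the area
noisy = snr(signal_e=200, read_noise_e=10)   # half the area AND 2x the read noise

print(f"2x sensor area:  +{math.log2(big / small):.2f} stops of shadow SNR")   # ~ +0.54
print(f"2x read noise:   -{math.log2(small / noisy):.2f} stops of shadow SNR") # ~ -0.21
```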
Applying this to the Z6, Z7 and Z50 data, we start with the Z6 and Z7 capturing about the same amount of light at the same exposure. This means they will have similar DR. The Z50, in contrast, captures about half as much light at a given exposure, because its sensor has about half the surface area. This gives it about one stop lower DR. Then we have to account for the fact that the Z7 has about twice as many pixels as the Z6 and Z50. This results in a reduction of its DR by about a third of a stop. This explanation corresponds to the actual data.
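To put rough numbers on that accounting (sensor areas and pixel counts are approximate, and the one-third-of-a-stop-per-doubling readout penalty is just the rule of thumb from above, not something derived from first principles):

```python
import math

# Approximate sensor areas (mm^2) and pixel counts (MP)
z6  = {"area": 858, "mp": 24.5}
z7  = {"area": 858, "mp": 45.7}
z50 = {"area": 369, "mp": 20.9}

def rough_dr_delta(cam, ref, readout_penalty_per_doubling=1/3):
    # Rough DR difference (stops) of cam relative to ref:
    # + log2 of the area ratio (more light captured at the same exposure)
    # - a rule-of-thumb read-noise penalty per doubling of pixel count.
    light_term   = math.log2(cam["area"] / ref["area"])
    readout_term = readout_penalty_per_doubling * math.log2(cam["mp"] / ref["mp"])
    return light_term - readout_term

print(f"Z7 vs Z6:  {rough_dr_delta(z7, z6):+.2f} stops")   # ~ -0.3 (same area, more pixels)
print(f"Z50 vs Z6: {rough_dr_delta(z50, z6):+.2f} stops")  # ~ -1.1 (half the area)
print(f"Z50 vs Z7: {rough_dr_delta(z50, z7):+.2f} stops")  # ~ -0.8
```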