Dynamic Range: Does Sensor Size Really Matter?

That said, the device with a lower pixel count (but bigger pixels) has a better chance of taking better photos, depending on how large the picture is. This is one reason some pros say that a 6-megapixel camera takes photos that are just as good, am I right?
Some pros know nothing about the technology, so they might say anything.

There is a widespread myth that, since larger pixels capture more light than smaller pixels, a sensor made up of fewer large pixels will make less noisy images than a sensor of the same size made up of more, smaller pixels. If this myth were correct, the noise performance of sensors of a given size would vary in proportion to pixel size, and sensors of different sizes that had the same pixel size would have the same noise performance. Neither is the case in reality. The myth fails to account for the fact that sensors of a given size but different pixel sizes all capture about the same amount of light for a given exposure, and that sensors of different sizes but the same pixel size capture significantly different amounts of light.
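To make that concrete, here is a minimal sketch (toy photon counts, using numpy) of two same-size sensors under the same exposure: one with big pixels and one with four times as many pixels at a quarter the area. Viewed at the same output scale, the image-level shot-noise SNR is essentially identical:

    import numpy as np

    rng = np.random.default_rng(0)

    # Same sensor area, same exposure -> same total light either way.
    mean_big = 4000  # photons per big pixel (made-up number)
    big = rng.poisson(mean_big, size=(500, 500))

    # Four times as many pixels, each collecting a quarter of the light.
    small = rng.poisson(mean_big / 4, size=(1000, 1000))
    # View the fine sensor at the coarse sensor's scale (2x2 binning).
    binned = small.reshape(500, 2, 500, 2).sum(axis=(1, 3))

    print(big.mean() / big.std())        # ~63, i.e. sqrt(4000)
    print(binned.mean() / binned.std())  # also ~63: same shot-noise SNR

The per-pixel SNR of the small-pixel sensor is of course lower, but at a common viewing size that difference washes out, which is the point the paragraph above is making.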

There usually is a difference in noise performance between sensors of a given size but different pixel sizes and counts, but that difference is much smaller than the difference in pixel size would suggest. The reason for the differing performance is that camera makers want to achieve certain frame rates, and reading out a larger number of pixels at those rates generates more read noise.
 
I would assume there are boundaries between pixels (whether large or small) where one pixel ends and another starts, sort of like eggs in an egg carton or the honeycomb of a beehive. An example is the pixel size of a Nikon D90 compared to a D3200. The D90 pixel is 106 percent larger than the D3200's, so slightly over two D3200 pixels could fit inside one D90 pixel. In the middle of the D90 pixel's footprint there would be a boundary between two D3200 pixels, and there would be roughly twice as many boundaries between D3200 pixels as between the D90's. The D90 pixel, by contrast, has no boundary in the middle, and those boundaries are probably what creates the loss of data that manifests as noise in photographs.
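As a sanity check on that 106 percent figure, the arithmetic can be redone from the published spec-sheet numbers (the sensor dimensions below are approximate):

    # Rough photosite-area arithmetic from published specs (approximate).
    def pixel_area_um2(width_mm, height_mm, px_w, px_h):
        """Average photosite footprint in square micrometres."""
        return (width_mm * 1000 / px_w) * (height_mm * 1000 / px_h)

    d90 = pixel_area_um2(23.6, 15.8, 4288, 2848)    # Nikon D90, 12 MP
    d3200 = pixel_area_um2(23.2, 15.4, 6016, 4000)  # Nikon D3200, 24 MP

    print(f"D90:   {d90:.1f} um^2")     # ~30.5
    print(f"D3200: {d3200:.1f} um^2")   # ~14.8
    print(f"ratio: {d90 / d3200:.2f}")  # ~2.06, about 106% larger by area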
 
I hope this topic has not been answered already: Does sensor size and/or sensor resolution matter in terms of picture quality?
Sure, they both matter. If they didn't, there'd be very little reason for people to use cameras with anything but the smallest, lowest-resolution sensors.
I looked over the pictures these two cameras took under the same lighting conditions, and the photos were identical. To the unaided eye it is impossible to spot any difference; the pictures they take under the same conditions are the spitting image of each other.
I sometimes think we are putting a lot of resolution into our cameras to justify the cost, and as a marketing strategy.
You can think that, but the answer to your questions is still yes, both matter.
I am not going to be splitting hairs with you, ishwanu. Regardless of what you show here, my mind is made up. Sorry, I cannot agree with you.
Interesting. Then why are you not using the smallest sensor cameras with the lowest resolution instead of your D90 and D3200? Isn't it because you think they give you an image quality advantage? If so, it would be illogical to suggest that others who use cameras with even larger sensors and more pixels don't enjoy an image quality advantage that your cameras can't provide. How can you disagree with that? If it's just that your personal threshold is lower than the thresholds of those others, that's fine. We all have different needs and expectations.
 
I would assume there are boundaries between pixels (whether large or small) where one pixel ends and another starts, sort of like eggs in an egg carton or the honeycomb of a beehive. An example is the pixel size of a Nikon D90 compared to a D3200. The D90 pixel is 106 percent larger than the D3200's, so slightly over two D3200 pixels could fit inside one D90 pixel. In the middle of the D90 pixel's footprint there would be a boundary between two D3200 pixels, and there would be roughly twice as many boundaries between D3200 pixels as between the D90's. The D90 pixel, by contrast, has no boundary in the middle, and those boundaries are probably what creates the loss of data that manifests as noise in photographs.
You assume things without researching their truth.

Camera sensors have a microlens over every pixel, and whatever 'boundaries' might exist between them make little difference for anything. In fact, test measurements of your two cameras show that the higher resolution sensor in the D3200 outperforms the one in the D90 in every metric including noise.

https://www.dxomark.com/Cameras/Compare/Side-by-side/Nikon-D90-versus-Nikon-D3200___439_801

 
I would assume there are boundaries between pixels (whether large or small) where one pixel ends and another starts,
Yes, your assumption is correct. Your implied assumption that the boundary is large enough to have a significant effect on noise and DR performance is incorrect.
sort of like eggs in an egg carton or the honeycomb of a beehive. An example is the pixel size of a Nikon D90 compared to a D3200. The D90 pixel is 106 percent larger than the D3200's, so slightly over two D3200 pixels could fit inside one D90 pixel. In the middle of the D90 pixel's footprint there would be a boundary between two D3200 pixels, and there would be roughly twice as many boundaries between D3200 pixels as between the D90's. The D90 pixel, by contrast, has no boundary in the middle, and those boundaries are probably what creates the loss of data that manifests as noise in photographs.
You seem unaware that camera makers place microlenses over each pixel on the sensor in order to redirect light that would otherwise fall on non-light-sensitive areas onto the light-sensitive areas of the pixels. There are boundaries between these lenses too, but they are so small that the noise from light lost at those boundaries is much less significant than read noise, which in turn is much less significant than the shot-noise differences that come from sensor size (not pixel size).

You made the same error with regard to the difference in sensor sizes between the D90 and the D3200. You identified that a difference exists, but you failed to appreciate how small it is, and thus failed to realize that its effect on performance would be negligible compared to the effects of other differences such as exposure, sensor size class (MF/FF/APS-C/MFT/1"/...), and even read noise.
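To put rough numbers on that hierarchy, here is a toy noise budget for a single shadow pixel (all values are illustrative assumptions, in electrons); even a generous 5% boundary light loss barely moves the needle:

    import math

    signal = 100.0        # photoelectrons collected in a shadow area
    read_noise = 3.0      # plausible per-pixel read noise, modern sensor
    boundary_loss = 0.05  # suppose microlens gaps lose 5% of the light

    # Shot noise is the square root of the collected signal; read noise
    # combines with it in quadrature.
    snr_ideal = signal / math.hypot(math.sqrt(signal), read_noise)

    lossy = signal * (1 - boundary_loss)
    snr_lossy = lossy / math.hypot(math.sqrt(lossy), read_noise)

    print(f"{snr_ideal:.2f} vs {snr_lossy:.2f}")  # ~9.58 vs ~9.32: ~3% hit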
 
I would assume there are boundaries between pixels (whether large or small) where one pixel ends and another starts,
Yes, your assumption is correct. Your implied assumption that the boundary is large enough to have a significant effect on noise and DR performance is incorrect.
sort of like eggs in an egg carton or the honeycomb of a beehive. An example is the pixel size of a Nikon D90 compared to a D3200. The D90 pixel is 106 percent larger than the D3200's, so slightly over two D3200 pixels could fit inside one D90 pixel. In the middle of the D90 pixel's footprint there would be a boundary between two D3200 pixels, and there would be roughly twice as many boundaries between D3200 pixels as between the D90's. The D90 pixel, by contrast, has no boundary in the middle, and those boundaries are probably what creates the loss of data that manifests as noise in photographs.
You seem unaware that camera makers place microlenses over each pixel on the sensor in order to redirect light that would otherwise fall on non-light-sensitive areas onto the light-sensitive areas of the pixels. There are boundaries between these lenses too, but they are so small that the noise from light lost at those boundaries is much less significant than read noise, which in turn is much less significant than the shot-noise differences that come from sensor size (not pixel size).
Once the intensity of light falls below (even slightly) the requirement(s) even these microlenses will fail at their (the lenses') boundaries and noise will start to appear where these boundaries are. This is one reason that lighting conditions should always be ideal in photography.
You made the same error with regard to the difference in sensor sizes between the D90 and the D3200. You identified that a difference exists, but you failed to appreciate how small it is, and thus failed to realize that its effect on performance would be negligible compared to the effects of other differences such as exposure, sensor size class (MF/FF/APS-C/MFT/1"/...), and even read noise.
I just glanced at a comparison made in one online article showing the image sensors side-by-side, giving me the impression the D90 sensor was much bigger. I did not consider their dimensions.
 
You assume things without researching their truth.
Yes, assuming is never a good idea. I had a boss once who would chew you out if you said you assumed something.
 
...

You seem unaware that camera makers place microlenses over each pixel on the sensor in order to redirect light that would otherwise fall on non-light-sensitive areas onto the light-sensitive areas of the pixels. There are boundaries between these lenses too, but they are so small that the noise from light lost at those boundaries is much less significant than read noise, which in turn is much less significant than the shot-noise differences that come from sensor size (not pixel size).
Once the intensity of light falls below (even slightly) the requirement(s)
What requirements? The limit to transmission is between 1 and 0 photons.
even these microlenses will fail at their (the lenses') boundaries and noise will start to appear where these boundaries are.
Nope. Noise cannot appear at the boundaries because boundaries collect no light. Light is collected by discrete photodiodes, one per pixel.
This is one reason that lighting conditions should always be ideal in photography.
The likelihood that any given photon will or will not be refracted by a lens does not depend on the number of photons arriving at the lens at the same time.
 
That said, the device with a lower pixel count (but bigger pixels) has a better chance
Why chance? Pros don't leave anything to chance.
of taking better photos,
Better how?
depending on how large the picture is.

This is one reason some pros say that a 6-megapixel camera takes photos that are just as good, am I right?
Definitely. If you print postcards, even 6 MP is a bit of overkill; 3 is plenty.
 
I would assume there are boundaries between pixels (whether large or small) where one pixel ends and another starts, sort of like eggs in an egg carton or the honeycomb of a beehive. An example is the pixel size of a Nikon D90 compared to a D3200. The D90 pixel is 106 percent larger than the D3200's, so slightly over two D3200 pixels could fit inside one D90 pixel. In the middle of the D90 pixel's footprint there would be a boundary between two D3200 pixels, and there would be roughly twice as many boundaries between D3200 pixels as between the D90's. The D90 pixel, by contrast, has no boundary in the middle, and those boundaries are probably what creates the loss of data that manifests as noise in photographs.
I haven't seen any evidence of this.

Quantum efficiency does not actually vary much over a wide range of pixel densities.
 
That said, the device with a lower pixel count (but bigger pixels) has a better chance of taking better photos, depending on how large the picture is. This is one reason some pros say that a 6-megapixel camera takes photos that are just as good, am I right?
IMO, it is generally true of most cameras.
I just noticed this, but I'm not sure what you're agreeing with. Does the device (camera) with a lower pixel count also have a larger sensor area, or the same sensor area?

If the sensor in the camera with a lower pixel count has the same area as the one with a higher pixel count, we would have to establish what 'better' means. Does it mean improved dynamic range based on lower read noise - but maybe at the expense of lower detail recorded in the result?
BUT, very high-end and expensive MF cameras have much improved read noise because of sensor/pixel manufacturing techniques. It shows up as a cleaner image with regard to noise in shadow areas, which of course gives greater dynamic range overall. To see if my opinion is correct, look at images with the same pixel density and exposure time from high-end and less expensive cameras. The point is that there is a way to address read noise, but it is expensive.
So, was your agreement based mostly on the differences in read noise?
 
That said, the device with a lower pixel count (but bigger pixels) has a better chance of taking better photos, depending on how large the picture is. This is one reason some pros say that a 6-megapixel camera takes photos that are just as good, am I right?
IMO, it is generally true of most cameras.
I just noticed this, but I'm not sure what you're agreeing with. Does the device (camera) with a lower pixel count also have a larger sensor area, or the same sensor area?
Same area
If the sensor in the camera with a lower pixel count has the same area as the one with a higher pixel count, we would have to establish what 'better' means. Does it mean improved dynamic range based on lower read noise - but maybe at the expense of lower detail recorded in the result?
Yes, less read noise at the expense of lower resolution.
BUT, very high-end and expensive MF cameras have much improved read noise because of sensor/pixel manufacturing techniques. It shows up as a cleaner image with regard to noise in shadow areas, which of course gives greater dynamic range overall. To see if my opinion is correct, look at images with the same pixel density and exposure time from high-end and less expensive cameras. The point is that there is a way to address read noise, but it is expensive.
So, was your agreement based mostly on the differences in read noise?
Yes. Read noise is a main contributor to noise in shadow or dark areas; it is an S/N-ratio issue. Each pixel adds a bit of read noise, so a sensor of the same size but with more pixels adds more read noise as the pixel count goes up, compared to fewer pixels on the same-size sensor. But if a sensor has manufacturing techniques that lessen read noise through better electronic circuitry, a cleaner image will result (because of less noise per pixel), more detail can be obtained in shadow areas, and so greater detail at both ends of the light range is possible. Some of the higher-end cameras can be seen with this advantage.
 
That said, the device with a lower pixel count (but bigger pixels) has a better chance of taking better photos, depending on how large the picture is. This is one reason some pros say that a 6-megapixel camera takes photos that are just as good, am I right?
IMO, it is generally true of most cameras.
I just noticed this, but I'm not sure what you're agreeing with. Does the device (camera) with a lower pixel count also have a larger sensor area, or the same sensor area?
Same area
If the sensor in the camera with a lower pixel count has the same area as the one with a higher pixel count, we would have to establish what 'better' means. Does it mean improved dynamic range based on lower read noise - but maybe at the expense of lower detail recorded in the result?
Yes, less read noise at the expense of lower resolution.
BUT, very high-end and expensive MF cameras have much improved read noise because of sensor/pixel manufacturing techniques. It shows up as a cleaner image with regard to noise in shadow areas, which of course gives greater dynamic range overall. To see if my opinion is correct, look at images with the same pixel density and exposure time from high-end and less expensive cameras. The point is that there is a way to address read noise, but it is expensive.
So, was your agreement based mostly on the differences in read noise?
Yes. Read noise is a main contributor to noise in shadow or dark areas; it is an S/N-ratio issue. Each pixel adds a bit of read noise, so a sensor of the same size but with more pixels adds more read noise as the pixel count goes up, compared to fewer pixels on the same-size sensor. But if a sensor has manufacturing techniques that lessen read noise through better electronic circuitry, a cleaner image will result (because of less noise per pixel), more detail can be obtained in shadow areas, and so greater detail at both ends of the light range is possible. Some of the higher-end cameras can be seen with this advantage.
Nine years ago, DPReview said the read noise difference from smaller pixels is only a minor consideration. If by 'manufacturing techniques that lessen read noise through better electronic circuitry' you mean pretty much every sensor in cameras made over the last several years, then your point is valid. I think it's pretty rare now to find sensors with otherwise similar technology where the pixel pitch makes any important overall noise difference.

https://www.dpreview.com/articles/5365920428/the-effect-of-pixel-and-sensor-sizes-on-noise/2

'The key thing to note is that the shot noise difference (from sensor size) plays a role at all ISOs, and usually has more impact than is made by difference in pixel size or technology enhancements.'
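For a sense of scale, the sensor-size term the article is talking about works out roughly like this (illustrative sensor areas; same exposure and framing assumed):

    import math

    # Total light scales with sensor area at a fixed exposure.
    full_frame = 36.0 * 24.0  # ~864 mm^2
    aps_c = 23.5 * 15.6       # ~367 mm^2

    light_ratio = full_frame / aps_c    # ~2.36x total light
    snr_ratio = math.sqrt(light_ratio)  # ~1.53x shot-noise SNR
    stops = math.log2(light_ratio)      # ~1.24 EV

    print(f"{light_ratio:.2f}x light, {snr_ratio:.2f}x SNR, {stops:.2f} EV")

A shot-noise advantage of well over a stop dwarfs the fractions of a stop that pixel-pitch differences typically account for.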
 
BUT, very high-end and expensive MF cameras have much improved read noise because of sensor/pixel manufacturing techniques. It shows up as a cleaner image with regard to noise in shadow areas, which of course gives greater dynamic range overall. To see if my opinion is correct, look at images with the same pixel density and exposure time from high-end and less expensive cameras. The point is that there is a way to address read noise, but it is expensive.
So, was your agreement based mostly on the differences in read noise?
Yes. Read noise is a main contributor to noise in shadow or dark areas; it is an S/N-ratio issue. Each pixel adds a bit of read noise, so a sensor of the same size but with more pixels adds more read noise as the pixel count goes up, compared to fewer pixels on the same-size sensor. But if a sensor has manufacturing techniques that lessen read noise through better electronic circuitry, a cleaner image will result (because of less noise per pixel), more detail can be obtained in shadow areas, and so greater detail at both ends of the light range is possible. Some of the higher-end cameras can be seen with this advantage.
Nine years ago, DPReview said the read noise difference from smaller pixels is only a minor consideration. If by 'manufacturing techniques that lessen read noise through better electronic circuitry' you mean pretty much every sensor in cameras made over the last several years, then your point is valid. I think it's pretty rare now to find sensors with otherwise similar technology where the pixel pitch makes any important overall noise difference.

https://www.dpreview.com/articles/5365920428/the-effect-of-pixel-and-sensor-sizes-on-noise/2

'The key thing to note is that the shot noise difference (from sensor size) plays a role at all ISOs, and usually has more impact than is made by difference in pixel size or technology enhancements.'
Thanks for the article. I had not read that before. It has a lot of detail showing the contribution of 'shot noise.' By 'manufacturing techniques', I was referring to something beyond BSI, which has become standard. It is hard to find information on the exact differences in very high-end and expensive MF sensors, but I recall having seen samples showing noticeably cleaner images at high ISOs for 100MP sensors when compared to FF sensors.
 
This is the real question.
If you're a studio photographer, you set up and control the lighting, and DR won't matter because you won't need to lift shadows in post.

To cut through the chaff here: yes, a larger sensor can give you wider DR, but smaller-sensor cameras have improved, and noise-reduction software like DxO has also improved. Full-frame cameras cost more, weigh more, and the lenses are heavier. I love my 70-200 f/2.8, but carrying that with two other f/2.8 lenses is a workout! Two strobes weigh far less, that's for sure.

You can absolutely deliver professional results with a crop-sensor camera. Which system you choose depends a lot on the genres you're chasing, your budget, and even weight requirements.



For me, I like to photograph musicians. The light is horrible, and we have extreme brightness next to extreme darkness in the images. I shoot RAW and most often cut the highlights way down and push the shadows way up, at ISOs that can be very high. So FF is best for this, along with great editing software.
 
For me, I like to photograph musicians. The light is horrible, and we have extreme brightness next to extreme darkness in the images. I shoot RAW and most often cut the highlights way down and push the shadows way up, at ISOs that can be very high. So FF is best for this, along with great editing software.
High-DR FF, of course. I don't think the FF cameras from 18 to 20 years ago can hold a candle to modern APS-C sensors in the shadows.

Your type of photography also begs for lenses with minimal contrast loss and imaged flare, with no filters unless absolutely necessary, and a very clean lens. A lens hood is useful, too, when there are bright direct light sources outside the FOV that still hit the lens directly.

It may seem on the surface that it is logical to embrace low optical contrast with high-contrast scenes so as to fit the capture within the camera's DR, but these two DRs are apples and oranges; the optical/scene DR is a range of light levels, while the sensor DR is the range between a clipping point and a certain maximum acceptable noise. When you have milky, clouded shadows because of the optics, you have extra light there, and extra light causes MORE photon noise, which has nothing to do with the actual subject matter, so it really is just extra noise. Once you remove that milky haze from the shadows, all of the extra photon noise remains, so the shadows are noisier than if you had used a very high-contrast lens to begin with. If you say, "I wanted that blanket of light's effect in the shadows anyway," then you could just use a high-contrast lens and add a blanket of light in software, with NO added noise.
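That shadow-flare point is easy to demonstrate numerically. A minimal sketch (toy photon counts, using numpy): subtracting the veiling flare in post restores the mean brightness, but the flare's photon noise stays behind:

    import numpy as np

    rng = np.random.default_rng(1)
    n = 1_000_000

    subject, flare = 20, 80  # photons per shadow pixel (made-up numbers)

    clean = rng.poisson(subject, n)                 # high-contrast lens
    hazy = rng.poisson(subject + flare, n) - flare  # flare lifted in post

    print(clean.mean(), clean.std())  # ~20, ~4.5 (sqrt(20))
    print(hazy.mean(), hazy.std())    # ~20, ~10  (sqrt(100): noise stays)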
 
Yes. Read noise is a main contributive to noise in shadow or dark areas. S/N ratio. Each pixel will add a bit read noise and therefore a sensor of the same size but more pixels will be adding to the read noise as the pixel number goes up compared to less pixels on the same size sensor.
The read noise of individual pixels does not simply "add" into more image noise. When additive Gaussian noise at the pixel level is proportional to pixel spacing, it is equalized in aggregate at the image level. So, if you quadruple the total pixel count of a same-size sensor, you get the same image-level read noise as long as each of the smaller pixels has half the read noise of the original large pixel, which is actually quite typical.
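A quick simulation of that aggregation (hypothetical read-noise figures, in electrons) shows the quadrature sum equalizing at the image level:

    import numpy as np

    rng = np.random.default_rng(2)

    sigma_big = 6.0    # read noise of one large pixel (made-up figure)
    sigma_small = 3.0  # half that, for pixels at half the pitch

    big = rng.normal(0.0, sigma_big, size=(500, 500))
    small = rng.normal(0.0, sigma_small, size=(1000, 1000))
    # Aggregate each 2x2 block of small pixels to the big-pixel scale.
    binned = small.reshape(500, 2, 500, 2).sum(axis=(1, 3))

    print(big.std())     # ~6.0
    print(binned.std())  # ~6.0 as well: sqrt(4) * 3 = 6 in quadrature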

Your observation is confounded, perhaps, by a very real technological challenge: reading lots of pixels off a large sensor at high speed, in order to minimize rolling-shutter artifacts and allow useful flash sync in e-shutter modes. Large sensors don't seem to handle those speed demands gracefully. Take any current best-in-class m43 or 1" sensor and compare it against the Sony 61MP FF sensor cropped to the same size, and against the most recent 24MP FFs cropped likewise: the small sensors will be in the ballpark of the cropped 24MP FFs (similar or slightly cleaner), and significantly cleaner than the cropped 61MP FF.
 
For me, I like to photograph musicians. The light is horrible, and we have extreme brightness next to extreme darkness in the images. I shoot RAW and most often cut the highlights way down and push the shadows way up, at ISOs that can be very high. So FF is best for this, along with great editing software.
High-DR FF, of course. I don't think the FF cameras from 18 to 20 years ago can hold a candle to modern APS-C sensors in the shadows.

Your type of photography also begs for lenses with minimal contrast loss and imaged flare, with no filters unless absolutely necessary, and a very clean lens. A lens hood is useful, too, when there are bright direct light sources outside the FOV that still hit the lens directly.

It may seem on the surface that it is logical to embrace low optical contrast with high-contrast scenes so as to fit the capture within the camera's DR, but these two DRs are apples and oranges; the optical/scene DR is a range of light levels, while the sensor DR is the range between a clipping point and a certain maximum acceptable noise. When you have milky, clouded shadows because of the optics, you have extra light there, and extra light causes MORE photon noise, which has nothing to do with the actual subject matter, so it really is just extra noise. Once you remove that milky haze from the shadows, all of the extra photon noise remains, so the shadows are noisier than if you had used a very high-contrast lens to begin with. If you say, "I wanted that blanket of light's effect in the shadows anyway," then you could just use a high-contrast lens and add a blanket of light in software, with NO added noise.
Thank you! You're a great asset to the community.

No matter how much I learn and how good I get, I always learn more.

How do you assess whether a lens is high or low contrast? I have all of my lenses listed below, but the one I use the most is the cheapest — the Canon EF 85mm f/1.8. I just like the look I get with that lens. I hated the 50mm f/1.4 because it had poor focus accuracy, but that seems to be resolved with the R series cameras. I have the 24-70 f/2.8L and the 70-200 f/2.8L, but I mostly use them only if it’s daytime or outdoors. For the indoor shows, I usually like the 85 the most.





 
