More pixels are better for noise reduction. Really?

Started May 8, 2015 | Discussions thread
Great Bustard Forum Pro • Posts: 45,431
I can answer that.

AceP wrote:

I've read several posts recently that claim more pixels actually have no effect or even a beneficial effect on reducing noise. Whilst in the past, there was a chorus from many that DEMANDED the megapixel race please stop as we don't want the increase in noise attributed to smaller, more dense pixels.

So please explain (or provide a link to explain) this situation where the 36MP (left) samples have more noise in the shadows than the 16MP (right) samples (after they have been normalized to the same print size):

Same with 36MP compared to 24MP

Noticed the same with the new 50MP Canon samples: they have more noise.

First of all, regarding the thread title ("More pixels are better for noise reduction. Really?") versus the examples posted: downsampling is not noise filtering. Extra resolution helps little, if at all, with noise when downsampling, but it helps a lot when applying noise filtering, which is a far better way to reduce noise than downsampling. How much noise filtering helps, however, depends heavily on the photo having the additional detail to work with. So if the conditions of the photo were such that more pixels did not resolve much more detail (e.g., a photo suffering from missed focus, motion blur, etc.), noise filtering won't help nearly as much as it could.

That said, there are two primary sources of noise in a photo:

  • Photon noise (noise from the light itself -- more light, less noisy photo)
  • Electronic noise (noise from the sensor and supporting hardware)

The only aspect of the sensor that affects photon noise is the QE (Quantum Efficiency -- the proportion of light falling on the sensor that is recorded). For sensors of the same generation, the QE is remarkably consistent regardless of sensor size, pixel size, or brand. There are exceptions, of course, such as sensors using BSI tech, but for the most part it holds.

The electronic noise, however, can vary a lot based on a great number of variables, including the ISO setting (and, contrary to what most believe, it is *less* at higher ISO settings, not more) and the pixel count.

The electronic noise is small compared to the photon noise except for the portions of the photo made with very little light, such as deep shadows at base ISO and progressively more of the photo as the ISO rises.

So, let's consider the D810, D750, and D4s in your comparisons. The QEs for the sensors are 47%, 51%, and 52%, respectively -- essentially identical. Thus, no difference in photon noise.

In terms of the electronic noise, we need to normalize it to the same proportion of the photo. I like to use the µphoto (millionth of a photo) as it is convenient computationally (it also represents one pixel of a photo displayed at 1200x800 pixels on a monitor, which is a common display size for a photo). Thus, one µphoto represents 36 pixels for the D810, 24 pixels for the D750, and 16 pixels for the D4s.
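As a quick sanity check on the µphoto unit (the megapixel counts below are the cameras' nominal specs, not taken from the text above):

```python
# One µphoto = one millionth of the photo. A 1200x800 display holds
# 960,000 pixels -- roughly one million -- so one display pixel is
# approximately one µphoto at that common display size.
display_pixels = 1200 * 800
print(display_pixels)  # 960000

# Sensor pixels per µphoto = total pixels / 1,000,000, i.e. the
# megapixel count (nominal specs assumed here):
for name, megapixels in [("D810", 36.3), ("D750", 24.3), ("D4s", 16.2)]:
    print(name, round(megapixels * 1e6 / 1e6), "pixels / µphoto")
```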

Noise is the standard deviation of the recorded signal from the mean signal, and standard deviations add in a peculiar manner called a quadrature sum. For example, if we have four pixels with an electronic noise of 2 electrons/pixel, the combined noise isn't 2+2+2+2=8 electrons, but rather sqrt(2²+2²+2²+2²)=4 electrons. So, the electronic noise / µphoto for the D810, D750, and D4s, respectively, at ISO 6400 is 15.6, 11.8, and 7.6 electrons.
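The quadrature sum is easy to verify numerically. The per-pixel electronic noise figures below are my back-calculated assumptions, chosen to be consistent with the per-µphoto totals quoted above:

```python
import math

def quad_sum(values):
    # Standard deviations combine as the square root of the sum of squares.
    return math.sqrt(sum(v * v for v in values))

# Four pixels at 2 electrons each combine to 4 electrons, not 8:
print(quad_sum([2, 2, 2, 2]))  # 4.0

# Per-µphoto noise = per-pixel noise * sqrt(pixels per µphoto).
# Per-pixel values (electrons/pixel at ISO 6400) are assumed, not quoted:
for name, pixels, per_pixel in [("D810", 36, 2.6),
                                ("D750", 24, 2.4),
                                ("D4s", 16, 1.9)]:
    print(name, round(per_pixel * math.sqrt(pixels), 1))  # 15.6 / 11.8 / 7.6
```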

Let's discuss the significance of this in terms of the noise we see in the photo. The saturation/µphoto for the D810, D750, and D4s at ISO 6400 is 28548, 29040, and 29472 electrons/µphoto, respectively. These are almost identical, as they should be, since the sensors have the same area. The photon noise is the square root of the signal, so let's use 29000 electrons as the max signal and tabulate the photon noise (in electrons), starting at max saturation and stepping down in one-stop intervals:

  • Max saturation: 170
  • 1 stop down: 120
  • 2 stops down: 85
  • 3 stops down: 60
  • 4 stops down: 43
  • 5 stops down: 30
  • 6 stops down: 21
  • 7 stops down: 15
  • 8 stops down: 11
  • 9 stops down: 8
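The table above is just "photon noise = square root of the signal", halving the signal per stop; a few lines of code reproduce it:

```python
import math

# Photon noise (electrons) at one-stop intervals below saturation,
# using the 29000 e-/µphoto max signal from above.
max_signal = 29000
for stops in range(10):
    signal = max_signal / 2 ** stops
    print(f"{stops} stops down: {math.sqrt(signal):.0f} electrons")
```

The printed values match the bullet list above (170 at saturation down to 8 at 9 stops down).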

What do we see here? The electronic noise matters as much as the photon noise 7 stops down from full saturation with the D810, 8 stops down with the D750, and 9 stops down with the D4s. Of course, the electronic noise is a factor long before it is at parity with the photon noise, but we can clearly see that the electronic noise is worse for sensors with more pixels, although we are talking about ISO 6400 here. At lower ISO settings, we have to go progressively further down the DR before the electronic noise is a factor (and, conversely, it becomes a factor earlier still as we go to higher ISO settings).

Let's be more specific still and compute the NSR (Noise-to-Signal Ratio) by combining the photon noise and electronic noise (keeping in mind that they add in quadrature, as described above). The table below is in the following format: NSRs for the D810 / D750 / D4s:

  • Max saturation: 0.6% / 0.6% / 0.6%
  • 1 stop down: 0.8% / 0.8% / 0.8%
  • 2 stops down: 1.2% / 1.2% / 1.2%
  • 3 stops down: 1.7% / 1.7% / 1.7%
  • 4 stops down: 2.5% / 2.4% / 2.4%
  • 5 stops down: 3.7% / 3.6% / 3.4%
  • 6 stops down: 5.8% / 5.4% / 5.0%
  • 7 stops down: 9.6% / 8.4% / 7.4%
  • 8 stops down: 16.7% / 14.0% / 11.5%
  • 9 stops down: 30.6% / 24.7% / 18.9%
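This table, too, follows mechanically from the numbers already given: photon noise and per-µphoto electronic noise added in quadrature, divided by the signal. A sketch:

```python
import math

# Electronic noise per µphoto at ISO 6400, from the figures above:
cameras = {"D810": 15.6, "D750": 11.8, "D4s": 7.6}
max_signal = 29000  # electrons / µphoto at saturation

for stops in range(10):
    signal = max_signal / 2 ** stops
    photon = math.sqrt(signal)
    # Photon and electronic noise add in quadrature; NSR = noise / signal.
    row = " / ".join(
        f"{100 * math.sqrt(photon**2 + e**2) / signal:.1f}%"
        for e in cameras.values()
    )
    print(f"{stops} stops down: {row}")
```

Running this reproduces the D810 / D750 / D4s columns above, e.g. 30.6% / 24.7% / 18.9% at 9 stops down.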

As we can see, the electronic noise begins to matter only in the portions of the photo made with less and less light. You can clearly see that for the very dark portions of the photo, which is what your linked examples showed (deep shadows at ISO 6400), there is a clear disadvantage to more pixels with regards to noise. Further compounding the problem is the lack of detail in those shadows, meaning that even if noise filtering were applied, it would be of limited utility.

So, are more pixels more noisy than fewer pixels? For sensors of the same generation, yes they are. Is it significant? Not until higher ISO settings. Can noise filtering (as opposed to downsampling) tip the balance in favor of the sensor with more pixels? Yes, it can, if the scene is such that the sensor with more pixels is able to record more detail and the light is not so low that the electronic noise dominates the photon noise.

In fact, you can do a simple experiment to see for yourself. Take two photos of a detailed static scene from the same position, one using twice the focal length of the other (e.g. 50mm and 100mm). Use 4x the exposure time and 1/4 the ISO setting for the 50mm photo. For example, if the 100mm photo is at f/5.6 1/800 ISO 6400, then shoot the 50mm photo at f/5.6 1/200 ISO 1600. The reason for f/5.6 is so the photos are sharp, the reason for 1/800 and 1/200 is so camera shake does not adversely affect sharpness, and the reason for ISO 6400 and ISO 1600 is so that the photos are noisy.
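The "same total light" claim can be checked with back-of-the-envelope arithmetic: at the same f-number, light per unit sensor area scales with shutter time, and the cropped region of the 50mm shot covers 1/4 of the frame. A sketch (relative units, not real photometry):

```python
# Same f-number in both shots, so per-area exposure scales with shutter time.
t_100 = 1 / 800   # shutter time, 100mm shot
t_50 = 1 / 200    # shutter time, 50mm shot (4x longer)

full_area = 1.0   # the whole 100mm frame, relative sensor area
crop_area = 0.25  # the 50mm shot cropped to the 100mm framing -> 1/4 the area

light_100 = t_100 * full_area
light_50 = t_50 * crop_area
print(light_100, light_50)  # equal: same total light, hence same photon noise
```

This is why the crop of the 50mm photo simulates a sensor with 4x the pixel count of the 100mm sensor at equal total light.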

Now crop the 50mm photo to the same framing as the 100mm photo. The different exposure times on the two photos ensure the same amount of light was used to create the cropped 50mm photo as the whole 100mm photo, so this simulates two sensors of the same size with the same tech where one sensor has 4x the pixel count as the other. Apply noise filtering to the 100mm photo until it matches the detail in the 50mm photo. Display both photos at the same size. What do you notice? How might that change based on the conditions of the photo?

I hope that explains the situation.
