f-number equivalent between m43 and FF

Started Mar 25, 2014 | Discussions thread
HumanTarget Senior Member • Posts: 1,369
Re: f-number equivalent between m43 and FF

D Cox wrote:

HumanTarget wrote:

D Cox wrote:

The final SNR depends (assuming technology of the same generation) on the amount of light falling on each pixel. This determines the uncertainty of the measurement of light by that individual photodetector. The more photons, the more certain the measurement.

Actually, shot noise increases with the number of photons (it grows as roughly the square root of the photon count), so the more photons, the more absolute noise, even though the SNR is increased. And a lot of cameras have reduced read noise at higher ISO settings, in which case they get better measurements with lower light levels (though at a reduced SNR).
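To put rough numbers on that, here's a quick Poisson simulation (my own sketch, not anything from this thread; the photon counts are arbitrary) showing the absolute noise rising with the photon count while the SNR improves as roughly its square root:

```python
# Rough illustration of photon shot noise (illustrative values only).
import numpy as np

rng = np.random.default_rng(0)

for mean_photons in (100, 1_000, 10_000):
    # Simulate one pixel sampled many times under a constant light level.
    samples = rng.poisson(mean_photons, size=100_000)
    noise = samples.std()           # shot noise ~ sqrt(mean_photons)
    snr = samples.mean() / noise    # SNR ~ sqrt(mean_photons)
    print(f"mean={mean_photons:6d}  noise={noise:7.1f}  SNR={snr:6.1f}")
```

More light means more noise in absolute terms, but the signal grows faster, so the picture still looks cleaner.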

The uncertainty is what you call the SNR. I find it better to think of a single measurement (one exposure, one pixel) as having uncertainty rather than noise. If you call it noise, you get confused with the image noise across an array of pixels, which results from the random uncertainty of each pixel measurement.

The uncertainty is the noise, which is why it's called noise.  A better SNR does not give you better certainty; it just makes the uncertainty less important.  Noise is noise, whether from one pixel sampled multiple times, or from an array of pixels.

The signal in a photographic image is the differences between pixels. The uncertainty (error bars) for each pixel gives a random, or noise, component when comparing the pixels.

Your definition of signal would suggest that a noisy image has a stronger signal (more difference between pixels).

A noisy image might have a higher amplitude, but the whole problem of noise is that it is inextricably combined with the image signal. There is no a priori way to know that the sky doesn't really have a lot of random variations in it. We call this "noise" because we know from other experience that the sky doesn't "really" look dotty.

We know that the sky has a lot of random variations in it, because that's the way light works.  Light is noisy.

The total area of the sensor is irrelevant. If you put the same lens on various sizes of sensor, the noise level in the part of the image that they all record will be identical. (Assuming they are the same generation of technology.)

But why would you crop the image of the larger sensor? I know of nobody who always crops their images to a lowest common denominator. That totally defeats the purpose of a larger sensor. More information gives you a better image.

You would crop it only for the purpose of comparing noise levels between sensors. It is essential to keep the variables to a minimum, and adding varying amounts of enlargement to the mix confuses the comparison.

But by comparing different sized pixels, you're doing the same thing.  You're comparing one surface that can hold more light to one that holds less.  If you had two ideal/perfect sensors of the same size, but one had twice the pixels, the one with more pixels would appear noisier per pixel, because the sensor with fewer pixels wouldn't have the resolution to show the noise inherent in the light itself.  By your logic, though, it would be the worse-performing sensor, even though it adds no noise of its own.
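Here's a minimal sketch of what I mean (my own illustration, assuming two ideal noiseless sensors of the same size; I've used 4x the pixels rather than 2x so the binning works out to a clean 2x2). The same total light spread over more pixels looks noisier per pixel, but bin the fine-pixel sensor back down to the coarse grid and the noise is the same:

```python
import numpy as np

rng = np.random.default_rng(1)
photons_per_coarse_pixel = 10_000   # arbitrary illustrative value

# "Fewer, bigger pixels": one Poisson draw per coarse pixel.
coarse = rng.poisson(photons_per_coarse_pixel, size=(100, 100))

# "More, smaller pixels": each fine pixel collects a quarter of the light.
fine = rng.poisson(photons_per_coarse_pixel / 4, size=(200, 200))
binned = fine.reshape(100, 2, 100, 2).sum(axis=(1, 3))  # 2x2 bin back down

def rel_noise(img):
    return img.std() / img.mean()   # relative noise (1/SNR)

print("coarse pixels:", rel_noise(coarse))
print("fine pixels  :", rel_noise(fine))     # higher per-pixel noise
print("fine, binned :", rel_noise(binned))   # ~same as coarse
```

The per-pixel comparison punishes the higher-resolution sensor for resolving noise that was in the light all along.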

The question of why you might see more noise in a bigger print from the same area of a sensor is interesting, but I don't think it is a property of the sensor. Pixel size is.

But the sensor is made up of a number of pixels, isn't it?  If you compare a V6 engine to a V8 engine, you wouldn't compare a single cylinder of one to a single cylinder of the other; you'd compare the overall output of all of them.
