Getting down to the nitty-gritty about noise and its effect on IQ

Started Apr 15, 2010 | Discussions thread
This thread is locked.
Crocodile Gena Senior Member • Posts: 1,017

It's no secret that if we compare 100% crops from two equally efficient sensors of the same format (sensor size) but different pixel counts (pixel sizes), the crop from the sensor with more pixels will appear more noisy.

However, such a comparison is made at different levels of magnification. Much more useful, in terms of delivered IQ, is to compare the same portion of the scene at the same display size, as opposed to the same number of pixels, as this relates to the appearance of the final image.

To that end, let's consider the following samples:

[image samples: 100% crops from a Canon 50D (15 MP) and a Canon 40D (10 MP), compared at the same display size]

Does it really make sense to call the 50D pics "more noisy"? Or, instead, does it make more sense to say the noise is basically the same, but the 40D pics are simply "more blurry"?

In my estimation, the latter description seems to more accurately represent the images.

Noise is a vector. That is, it has two components: amplitude and frequency. The greater the standard deviation of the recorded signal about the true (mean) signal, the greater the amplitude. The greater the number of samples (pixels), the higher the maximum frequency at which that noise is expressed.
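
For concreteness, here is a minimal numerical sketch of the amplitude component, assuming uncorrelated Gaussian noise and hypothetical numbers (a uniform patch at value 100 with sigma = 4); nothing here is measured from the actual 50D/40D samples:

```python
import numpy as np

# A minimal sketch (hypothetical numbers): noise "amplitude" as the standard
# deviation of the recorded signal about the true (mean) signal.
rng = np.random.default_rng(0)

true_signal = 100.0                                   # uniform patch value
recorded = true_signal + rng.normal(0.0, 4.0, 10**6)  # one million noisy samples

amplitude = recorded.std()   # ~4.0: deviation from the mean signal
print(f"noise amplitude (std dev): {amplitude:.2f}")
# The sample (pixel) count sets the maximum spatial frequency at which that
# amplitude is expressed: more pixels -> finer-grained, higher-frequency noise.
```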

I hope we can all agree that a greater amplitude of noise is undesirable, whereas a greater frequency of noise is desirable. The problem comes from comparing the amplitudes of noise at different frequencies. If one image has a greater amplitude of noise at a higher frequency than another image, which has a lower amplitude of noise at a lower frequency (the case for the images above), does that make it "more noisy"?

In my opinion, the image that most accurately represents the scene is the less noisy of the two, which is consistent with one definition of noise: "unwanted signal".

Along these lines, I would argue that if the higher-pixel-count image can be downsampled (and/or have NR applied) so that its maximum noise frequency matches that of the lower-pixel-count image, and its noise amplitude then matches, or falls below, that of the lower-pixel-count image, then the higher-pixel-count image is not "more noisy".
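
As a sanity check on that claim, here is a minimal numerical sketch, assuming uncorrelated Gaussian noise and hypothetical sigmas (4 for the higher-pixel-count image, 3 for the lower), not measurements from the actual samples:

```python
import numpy as np

# A minimal sketch: downsample the higher-pixel-count image by 2x2 block
# averaging so its maximum noise frequency matches the lower-pixel-count
# image, then compare noise amplitudes (standard deviations).
rng = np.random.default_rng(1)

hi_res = 100.0 + rng.normal(0.0, 4.0, (2000, 2000))  # more pixels, sigma = 4
lo_res = 100.0 + rng.normal(0.0, 3.0, (1000, 1000))  # fewer pixels, sigma = 3

# Each output pixel is the mean of a 2x2 block; averaging 4 uncorrelated
# samples cuts the noise amplitude by sqrt(4) = 2.
down = hi_res.reshape(1000, 2, 1000, 2).mean(axis=(1, 3))

print(f"hi-res std before downsampling: {hi_res.std():.2f}")  # ~4.00
print(f"hi-res std after  downsampling: {down.std():.2f}")    # ~2.00
print(f"lo-res std:                     {lo_res.std():.2f}")  # ~3.00
```

With these assumed numbers, the downsampled image ends up with a lower noise amplitude than the lower-pixel-count image at the same maximum frequency, which is exactly the sense in which I'd say it is not "more noisy".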

In other words, using scalar descriptors ("more" / "less") to describe noise as a single-valued quantity makes sense only if the maximum frequencies are the same. But if we are comparing vector quantities (amplitude / frequency), such a simplistic comparison does not accurately relate to the IQ of the final image.

I'd be pleased to hear the opinions of those who have an interest in this topic.
