tony field
Forum Pro
I just took a photograph, in the sunshine on a lovely warm Chinook day, of a half-naked lady.

Some folks believe SNR is baked in when the image is made and does not change with display/print size. I happen to think that it does change.

Thanks for your replies. I believe the key point is that when we talk about observing with our eyes, we have to compare the SNR of the image on a display or in a print.
Assume that shot noise is the dominant noise source, so per-pixel SNR = sqrt(signal) ∝ sqrt(photosite area). Comparing two sensors at the same output (display or print) size:

SNR of output from Sensor A / SNR of output from Sensor B
= sqrt(Ra / Rb) * (SNRa / SNRb), where Ra and Rb are the pixel counts of Sensors A and B
= sqrt(Ra / Rb) * sqrt(photosite area of A / photosite area of B)   (because SNR = sqrt(signal) ∝ sqrt(photosite area))
= sqrt(Ra / Rb) * sqrt((area of whole Sensor A / Ra) / (area of whole Sensor B / Rb))
= sqrt(area of whole Sensor A / area of whole Sensor B)
= 1 when the two sensors have the same total area.
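As a sanity check on that algebra, here is a minimal NumPy sketch (my own illustration, not anything from the thread): two hypothetical flat-field exposures of equal total sensor area, where Sensor B has four times the pixel count of Sensor A, so each B photosite collects a quarter of the photons on average. The photon counts are made-up numbers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sensors of equal total area: B has 4x the pixel count
# of A, so each B photosite collects 1/4 as many photons on average.
mean_photons_a = 4000
a = rng.poisson(mean_photons_a, size=(500, 500))
b = rng.poisson(mean_photons_a / 4, size=(1000, 1000))

def snr(img):
    """Flat-field SNR: mean signal over shot-noise standard deviation."""
    return img.mean() / img.std()

# View both at the same output size: average B's 2x2 pixel blocks down
# to A's resolution (this is the sqrt(Ra/Rb) factor in the derivation).
b_out = b.reshape(500, 2, 500, 2).mean(axis=(1, 3))

print(f"per-pixel SNR, Sensor A:   {snr(a):6.1f}")     # ~ sqrt(4000) ~ 63
print(f"per-pixel SNR, Sensor B:   {snr(b):6.1f}")     # ~ sqrt(1000) ~ 32
print(f"output SNR, B downsampled: {snr(b_out):6.1f}")  # ~ 63, same as A
```

The ratio of the two output SNRs comes out at 1, as the last line of the derivation predicts.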
Pixel dimensions, linear scale, image std. deviation:
3840x5760, 100%, 64.47
1920x2880, 50%, 64.54
960x1440, 25%, 64.77
Resizing the 25% image back up to full resolution (100%) with bicubic (automatic) interpolation, I get:
3840x5760, 100%, 64.69
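For anyone who wants to repeat the measurement outside Photoshop, a rough Pillow/NumPy equivalent of what I did is below. The filename is a placeholder, I measure on a grayscale conversion, and plain bicubic stands in for Photoshop's "Bicubic Automatic", so the exact numbers will differ.

```python
import numpy as np
from PIL import Image

def std_dev(im):
    """Standard deviation over all pixels of a grayscale image."""
    return np.asarray(im, dtype=np.float64).std()

img = Image.open("input.jpg").convert("L")  # placeholder filename
w, h = img.size
print(f"{w}x{h}  100%  std = {std_dev(img):.2f}")

# Downscale to 50% and 25% of the linear dimensions with bicubic
for scale in (0.5, 0.25):
    small = img.resize((round(w * scale), round(h * scale)), Image.BICUBIC)
    print(f"{small.width}x{small.height}  {scale:.0%}  std = {std_dev(small):.2f}")

# Resize the 25% copy back up to full resolution
quarter = img.resize((w // 4, h // 4), Image.BICUBIC)
restored = quarter.resize((w, h), Image.BICUBIC)
print(f"{w}x{h}  25% -> 100%  std = {std_dev(restored):.2f}")
```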
From this I infer that, since the total-image standard deviation is virtually unchanged by any of these up- or down-scaling operations (i.e., in MBP's terms, a "change of the display/print size"), the signal-to-noise ratio is invariant under up/down scaling. Of course, any mathematical operation on an image must introduce at least a slight amount of noise from integer rounding, but here it is inconsequential.

After each resize, with an obvious change in image detail, you have a unique new image with its own SNR and standard deviation, but they are so close in noise terms as to be effectively "the same".
Is this a valid interpretation?
--
Charles Darwin: "ignorance more frequently begets confidence than does knowledge."
tony
http://www.tphoto.ca