The magnitude of smearing due to diffraction
will be an absolute length beneath which two
points cannot be distinguished. This length is
defined solely by the wavelength of light and
the f-number of the optical system. Sensor
size and pixel resolution do not define this
diffraction length.
Agreed.
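To put a number on that length: the usual figure is the Airy disk diameter, d = 2.44 * wavelength * f-number (the diameter out to the first minimum of the diffraction pattern). A quick Python sketch, assuming green light at 550 nm; the f-numbers are just examples (note that f/8 lands right at the 0.01 mm figure you use below):

# Airy disk diameter: d = 2.44 * wavelength * f_number.
# Assumes green light at 550 nm; f-numbers are illustrative.
WAVELENGTH_MM = 550e-6  # 550 nm expressed in millimetres

def airy_disk_diameter_mm(f_number):
    """Absolute diffraction-limited length in the focal plane (mm)."""
    return 2.44 * WAVELENGTH_MM * f_number

for n in (2.8, 5.6, 8, 16):
    print(f"f/{n}: {airy_disk_diameter_mm(n):.4f} mm")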
.
So we have an absolute length, say for example
0.01 mm, beneath which nothing can be resolved
in the focal plane. I do not see how sensor size
can therefore determine the effective blurring due
to diffraction.
It's because when one resolution-limited system (such as a stopped-down lens) passes its signal on to another resolution-limited system (such as a sensor with a finite number of discrete pixels), then the resolution of the signal dropping off the end of the chain is not simply equal to the minimum of the resolution limits of the chain's individual elements. Instead, the resolution limits will interact in a fairly complex way. This always gets ignored in those foolish articles, essays, and web pages addressing pixel pitch and Airy disk diameters.
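A common rule of thumb (only an approximation, but it captures the interaction) is to combine the blur contributions in quadrature rather than taking the worst one alone. A minimal sketch, assuming roughly Gaussian blur kernels so that the root-sum-of-squares combination is fair; the function name is mine:

import math

def combined_blur_mm(diffraction_blur_mm, pixel_pitch_mm):
    """Rule-of-thumb combination of two blur contributions.

    Adding the blur lengths in quadrature approximates how
    cascaded resolution limits interact: the result is worse
    than the worst single limit, not simply equal to it.
    """
    return math.hypot(diffraction_blur_mm, pixel_pitch_mm)

# 0.01 mm diffraction blur on a sensor with 0.005 mm pixel pitch:
print(combined_blur_mm(0.01, 0.005))  # ~0.0112 mm, worse than either alone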
.
The only thing which matters is the distance
between one pixel and another.
No. The only thing that matters (for the degree of blur introduced by diffraction) is the distance from the left border of your frame to the right one, i.e. the sensor size.
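In other words, what counts is the blur as a fraction of the frame width. A tiny sketch of that ratio, assuming 550 nm light and a 36-mm-wide frame (both values are just for illustration):

# Effective blur = absolute diffraction length / frame width.
# Assumes 550 nm light and a 36-mm-wide frame (illustrative).
FRAME_WIDTH_MM = 36.0

for f_number in (4, 8, 16, 32):
    airy_mm = 2.44 * 550e-6 * f_number
    print(f"f/{f_number}: {airy_mm / FRAME_WIDTH_MM:.2e} of the frame width")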
.
As a matter of fact, sensor size is the only factor determining
the effective degree of blur from diffraction. And contrary to
common belief, pixel density has nothing to do with it.
OK. But in that case an APS-C sensor with 4 pixels and an APS-C
sensor with 100 MP will have the same 'effective degree of blur
from diffraction' according to you ...
Exactly.
.
... which is difficult to believe, to say the least.
What is so difficult about it? Obviously you are confusing pixel resolution (or the lack thereof, see your four-pixel sensor) with the degree of blur from diffraction. The pixel count of the sensor will provide a certain resolution; it will be higher with more pixels, or lower with fewer pixels. The effect of diffraction subtracts from that. The amount that gets subtracted depends on aperture and sensor size but not on pixel density. Actually, you can subtract significantly less than the equivalent of one pixel-to-pixel distance, and it will still make a visible difference.
In order not to make a visible difference, the amount subtracted would have to be much less than a single pixel ... so with your hypothetical four-pixel sensor the loss indeed wouldn't matter. But note the emphasis on the word 'visible'! In mathematical terms, even your four-pixel sensor would still suffer from diffraction, to the same (absolute) degree as a four-million-pixel sensor of the same size. Only in relative terms would the loss from diffraction be virtually zero, because the original resolution is so low to begin with. However, in real life there are no four-pixel sensors ...
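To make the four-pixel example concrete: the absolute blur is identical for both sensors, but measured against the pixel pitch it is negligible for the coarse sensor and huge for the fine one. A sketch, assuming an APS-C width of 23.6 mm, f/11, and 550 nm light (all numbers illustrative):

# Same frame width, same aperture => same absolute diffraction blur.
# Only the pixel pitch differs between the two sensors.
AIRY_MM = 2.44 * 550e-6 * 11  # ~0.0148 mm for both sensors

for name, pixels_across in (("four-pixel sensor", 2), ("24-MP sensor", 6000)):
    pitch_mm = 23.6 / pixels_across
    print(f"{name}: blur = {AIRY_MM / pitch_mm:.2%} of one pixel pitch")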
So when two sensors have different sizes but the same (or almost the same) pixel pitch, then the single pixel's contribution to the final image from the larger sensor will be smaller, hence less blur from diffraction. Basically the same is true when the larger sensor has the same pixel count as the smaller one. Then the individual pixel's contribution to the final image will be equal for both sensors, but the (relative) loss from diffraction on the single pixel will be less for the larger sensor (remember: the absolute loss is always the same, no matter how big or small the individual pixel, as long as the aperture remains the same). Either way, the larger sensor wins, no matter what the pixel density is.
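Both comparisons in one small sketch (f/11, 550 nm; the sensor widths and pixel counts are made-up but typical): the blur as a fraction of the frame depends only on the sensor width, whether you hold the pixel pitch or the pixel count constant.

AIRY_MM = 2.44 * 550e-6 * 11  # same aperture => same absolute blur

scenarios = [
    ("APS-C, 24 MP",      23.6, 6000),
    ("full frame, 24 MP", 36.0, 6000),  # same pixel count, larger sensor
    ("full frame, 56 MP", 36.0, 9153),  # same pixel pitch as the APS-C
]
for name, width_mm, px_across in scenarios:
    pitch_mm = width_mm / px_across
    print(f"{name}: blur/frame = {AIRY_MM / width_mm:.2e}, "
          f"blur/pixel = {AIRY_MM / pitch_mm:.2f}")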
Having said that, nobody who doesn't own a 35-mm-format DSLR camera should feel bad! The margin by which it wins over an APS-C-format camera in terms of diffraction blur is minuscule, and in 95 % of all cases it gets completely obscured by other real-world effects affecting (read: reducing) image quality, which usually are one or two orders of magnitude greater than the difference in diffraction blur between the APS-C and 35-mm formats.
Regards,
Olaf
--
Olaf Ulrich, Germany