Re: How to see diffraction in photos
cba_melbourne wrote:
finnan haddie wrote:
cba_melbourne wrote:
finnan haddie wrote:
cba_melbourne wrote:
finnan haddie wrote:
cba_melbourne wrote:
...
But the effects of diffraction only become apparent in our picture when the size of the Airy disc exceeds the pixel size of our sensor. And the smaller the pixel size, the sooner (at a wider aperture) that happens.
On a 20 MPix MFT sensor that would mean around f/2.8.
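That f/2.8 figure can be sanity-checked with the standard Airy disc formula, d = 2.44 * lambda * N (diameter to the first minimum). This is a minimal sketch, assuming green light (~550 nm) and a 20 MP Four Thirds sensor of about 17.3 mm width over roughly 5184 pixels; those sensor numbers are approximations, not quoted from the thread.

```python
def airy_disc_diameter_um(f_number, wavelength_nm=550.0):
    """Airy disc diameter (to the first minimum) in micrometres: d = 2.44 * lambda * N."""
    return 2.44 * wavelength_nm * 1e-3 * f_number

# Assumed 20 MP Four Thirds geometry: ~17.3 mm sensor width over ~5184 pixels
pixel_pitch_um = 17.3 * 1000 / 5184  # about 3.3 um

for n in (1.8, 2.8, 4.0, 5.6, 8.0):
    d = airy_disc_diameter_um(n)
    verdict = "exceeds pixel pitch" if d > pixel_pitch_um else "within pixel pitch"
    print(f"f/{n}: Airy disc {d:.2f} um ({verdict})")
```

Under these assumptions the Airy disc (~3.8 um at f/2.8) first overtakes the ~3.3 um pixel pitch right around f/2.8, which is consistent with the figure quoted above.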
Interesting, is that with the Bayer mask, or without as in a black and white sensor?
For both types of sensors. A Bayer mask just adds funny false colors at the edges.
No, it increases the effective pixel size. Two green, one blue and one red pixel are demosaiced, and result in a larger pixel size for the purpose of diffraction visibility.
Diffraction doesn't care about Bayer masks, as you can easily see in RAW files. Nor does diffraction care whether you develop your picture in black and white or in a color rendition of your choice.
If you say so...
but I disagree
Maybe this helps: https://www.scantips.com/lights/diffraction.html
Not really. It's a rambling discussion primarily about the tradeoffs between diffraction and DOF and barely touches on the critical factor here - the tradeoff between optical aberrations and diffraction (and the role that pixel size plays).
What matters is at which aperture we can start just barely noticing a loss of sharpness on a 20 MP sensor camera, when comparing identical pictures taken at various apertures.
And I'd like to claim that with the best-resolving 10% of lenses we have in m43, we can just barely see the diffraction effect taking off at f/5.6. For the remaining 90% of lenses, it is at f/8 or somewhere in between. But hey, your eyes may be better than mine.
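One way to put numbers on that claim is to compare a diffraction-limited lens's Rayleigh resolution limit, roughly 1 / (1.22 * lambda * N) line pairs per mm, against the sensor's Nyquist limit. This is a hedged sketch, again assuming 550 nm light and an approximate 20 MP Four Thirds pixel pitch of ~3.3 um; visible softening typically begins well before the two curves cross, since lens aberrations and diffraction blur combine.

```python
def rayleigh_limit_lpmm(f_number, wavelength_mm=0.00055):
    """Rayleigh resolution limit of a diffraction-limited lens, in line pairs per mm."""
    return 1.0 / (1.22 * wavelength_mm * f_number)

# Assumed sensor geometry: ~17.3 mm width over ~5184 pixels (20 MP Four Thirds)
pitch_mm = 17.3 / 5184
nyquist_lpmm = 1.0 / (2.0 * pitch_mm)  # sensor Nyquist, ~150 lp/mm

for n in (2.8, 4.0, 5.6, 8.0, 11.0):
    print(f"f/{n}: diffraction limit {rayleigh_limit_lpmm(n):.0f} lp/mm "
          f"vs sensor Nyquist {nyquist_lpmm:.0f} lp/mm")
```

Under these assumptions, the diffraction limit at f/5.6 (~266 lp/mm) still sits well above the ~150 lp/mm Nyquist frequency, and only drops below it around f/11, which is why the onset of visible softening at f/5.6 to f/8 is a gradual, lens-dependent effect rather than a hard cutoff.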
Now you're singing a very different tune from when you wrote:
The effects of diffraction very much depend on sensor pixel size (and on the presence or not of an anti aliasing filter in the sensor stack). The smaller the pixel size, the sooner (at a smaller F/number) will diffraction begin to limit the image resolution.
I don't believe anybody in this sub-discussion has disputed that lens quality plays a major role with regard to which f-stop we begin to "see the diffraction effect taking off." Of course lens quality is critical to when diffraction becomes the primary source of blur! The dispute is about whether the visibility of the "diffraction effect taking off" varies because of pixel size, not because of lens quality.
Your original assertion was that there has been (must be!) a shift in the f-stop at which "the diffraction effect [visibly and measurably] takes off" because there's been a progressive decrease in pixel pitch in mFT cameras over the years. Rather than celebrating the increased sensor resolution this implies, and therefore the increased overall system resolution, you're fretting about Airy disc limits. Well, to date, I've seen no evidence that supports your bolded claim above in the real world of photography, and I provided counter-evidence as well (DXOMark sharpness profiles for virtually all lenses that have been tested on cameras with differing pixel pitches) that you chose to ignore.