How to see diffraction in photos

knickerhawk Veteran Member • Posts: 7,552
Re: How to see diffraction in photos

cba_melbourne wrote:

knickerhawk wrote:

cba_melbourne wrote:

knickerhawk wrote:

cba_melbourne wrote:

finnan haddie wrote:

cba_melbourne wrote:

finnan haddie wrote:

cba_melbourne wrote:

finnan haddie wrote:

cba_melbourne wrote:

...

But the effects of diffraction only become apparent in our pictures when the size of the Airy disc exceeds the pixel size of our sensor. And the smaller the pixel size, the sooner that happens.

On a 20 MPix MFT sensor that would mean around f/2.8.
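A quick back-of-the-envelope check in Python (assuming 550 nm green light, the first-minimum Airy diameter d = 2.44·λ·N, and a nominal 5184-pixel-wide, 17.3 mm MFT sensor; these figures are illustrative, not from any specific camera):

```python
# Airy disc diameter vs. pixel pitch on a nominal 20MP MFT sensor.
WAVELENGTH_MM = 550e-6            # 550 nm green light, in mm
SENSOR_WIDTH_MM = 17.3            # MFT sensor width
PIXELS_ACROSS = 5184              # 20MP MFT: 5184 x 3888

pixel_pitch_mm = SENSOR_WIDTH_MM / PIXELS_ACROSS   # ~3.34 um

for n in (2.8, 4.0, 5.6, 8.0, 11.0):
    airy_mm = 2.44 * WAVELENGTH_MM * n             # first-minimum diameter
    print(f"f/{n}: Airy disc {airy_mm * 1000:.1f} um "
          f"= {airy_mm / pixel_pitch_mm:.1f} pixels")

# At f/2.8 the disc is ~3.8 um, just over one ~3.3 um pixel,
# which is where the "around f/2.8" figure comes from.
```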

Interesting. Is that with a Bayer mask, or without one, as in a black-and-white sensor?

For both types of sensors. A Bayer mask just adds funny false colors at the edges.

No, it increases the effective pixel size. Two green, one blue and one red pixel are demosaiced together, resulting in a larger effective pixel size for the purpose of diffraction visibility.

Diffraction doesn't care about Bayer masks, as you can easily see in RAW files. Nor does diffraction care whether you develop your picture in black and white or in the color of your choice.
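The wavelength dependence is also easy to put numbers on; diffraction happens in the lens, per wavelength, before any color filter array is involved. A minimal sketch (the nominal channel wavelengths below are my assumption):

```python
# The Airy disc scales with wavelength, so red light blurs more than blue
# at the same f-number, regardless of any Bayer mask downstream.
def airy_diameter_um(wavelength_nm: float, f_number: float) -> float:
    """First-minimum Airy disc diameter in micrometres."""
    return 2.44 * (wavelength_nm / 1000.0) * f_number

for name, wl_nm in (("blue", 450), ("green", 550), ("red", 640)):
    print(f"{name} ({wl_nm} nm) at f/8: {airy_diameter_um(wl_nm, 8):.1f} um")

# blue ~8.8 um, green ~10.7 um, red ~12.5 um: that per-channel mismatch is
# one source of the false color fringing at high-contrast edges.
```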

If you say so...

but I disagree

Maybe this helps: https://www.scantips.com/lights/diffraction.html

Not really. It's a rambling discussion primarily about the tradeoffs between diffraction and DOF and barely touches on the critical factor here - the tradeoff between optical aberrations and diffraction (and the role that pixel size plays).

Do a search in the text for "pixel" and "bayer"

Oh please. We've both been posting here long enough for you to know that I don't need a remedial education on what "pixel" and "bayer" mean.

Or read this article: https://panoramashots.co.uk/technical-notes/light-as-waves-diffraction-blur/

Once again, you are posting articles that don't address (or at least don't cogently address) the specific argument here. A better explanation of the specific dispute we're having can be found in Jim Kasson's post here and in his blog post on this very issue here. If you've ever hung out on the PST forum, you should be very familiar with Jim Kasson and the depth of his technical knowledge.

What matters is at which aperture we can start to just barely notice a loss of sharpness on a 20MP sensor camera, when comparing identical pictures taken at various apertures.

And I'd claim that with the best-resolving 10% of lenses we have in m43, we can just barely see the diffraction effect taking off at f/5.6. For the remaining 90% of lenses, it is at f/8 or somewhere in between. But hey, your eyes may be better than mine.

Now you're singing a very different tune

Not at all

from when you wrote:

The effects of diffraction very much depend on sensor pixel size (and on the presence or not of an anti aliasing filter in the sensor stack). The smaller the pixel size, the sooner (at a smaller F/number) will diffraction begin to limit the image resolution.

That is still 100% correct

For the final time, my beef is that the way you've phrased that bolded statement is wrong or at the very least extremely misleading.

I don't believe anybody in this sub-discussion has disputed that lens quality plays a major role with regard to which f-stop we begin to "see the diffraction effect taking off." Of course lens quality is critical to when diffraction becomes the primary source of blur! The dispute is about whether the visibility of the "diffraction effect taking off" varies because of pixel size, not because of lens quality.

Your original assertion was that there has been (must be!) a shift in the f-stop at which "the diffraction effect [visibly and measurably] takes off" because there's been a progressive decrease in pixel pitch in mFT cameras over the years.

absolutely, yes

Rather than celebrating the increased sensor resolution this implies, and with it the increased overall system resolution, you're fretting about Airy disk limits. Well, to date I've seen no evidence in the real world of photography that supports your bolded claim above, and I provided counter-evidence (DXOMark sharpness profiles for virtually all lenses that have been tested on cameras with differing pixel pitches) that you chose to ignore.

DXOMark results do indeed change with camera resolution.

Of course they do. That's not the issue, though. Your specific wording of how diffraction affects image blur implies that the f-stop at which peak resolution occurs changes when camera resolution changes. It doesn't, and that's what the DXOMark tests show, even as they also show that overall image resolution differs between cameras of differing sensor resolution.

You cannot directly compare lens tests they did years ago with lesser resolution cameras to their latest results.

DXOMark hasn't changed the basic sharpness testing it does or how those test results are displayed in the Sharpness Profile charts. Besides, for purposes of our dispute, it's really the testing at different f-stops for a given camera that matters more. I assume you're not actually trying to argue that DXOMark changes its lens testing methodology between the f-stops it measures for a given camera?!?

Most m43 lenses greatly outresolve even our current 20MP sensors. If that were not so, pixel-shift HR mode would not work, nor would it make any sense to use teleconverters or extension tubes instead of digital zoom. All those features only work and provide a benefit over digital zoom because many m43 lenses have good enough resolution for a 60 or 70MP sensor.

And it is exactly this "lens outresolving our current sensor" that causes diffraction to "kick in" at a certain aperture. If it were not for this outresolving, diffraction would become visible in our pictures perfectly gradually, in a linear fashion.
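One way to see why it looks like a "kick-in" is a toy root-sum-of-squares blur model (a simplification; the 2 um residual aberration figure below is made up for illustration, not measured from any lens):

```python
# Toy model: total blur as the quadrature sum of aberration blur and
# diffraction blur, for a lens that outresolves the sensor.
AIRY_PER_STOP_UM = 2.44 * 0.55     # Airy diameter per unit f-number at 550 nm

def total_blur_um(aberration_um: float, f_number: float) -> float:
    diffraction_um = AIRY_PER_STOP_UM * f_number
    return (aberration_um**2 + diffraction_um**2) ** 0.5

SHARP_LENS_UM = 2.0                # assumed residual aberration blur

for n in (2.8, 4.0, 5.6, 8.0, 11.0):
    print(f"f/{n}: total blur {total_blur_um(SHARP_LENS_UM, n):.1f} um")

# With only ~2 um of aberration blur, diffraction dominates almost from the
# start, so on a fine-pitched sensor the softening appears to "kick in"
# as soon as the total exceeds a pixel or two.
```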

If you still have one of the early 12MP cameras, take a series of pictures at different apertures. Then repeat, with the same lens, on your current 20MP camera. You should notice a shift in the aperture at which diffraction starts to visibly degrade image sharpness.

Actually, I did that test with my EPL1 (before it died) vs. my EM1iii using the Oly 75mm prime. Peak center sharpness for both was at about f/3.5, which means diffraction "starts to visibly degrade image sharpness" only at higher f-numbers, and at the same point on both cameras.

Likewise, you will notice a similar shift when you do the same series of pictures on your 20MP camera, once with a top quality lens and once with a "cheaper" old adapted lens with much less resolution.

The "shift" that occurs with the cheaper/old adapted lens is due to the balance of lens aberration induced blur relative to diffraction, NOT because of the differences in sensor resolution. And you don't even need to compare multiple lenses to see this because blur-inducing lens aberrations generally occur at differing amounts across the image field projected by any lens. Peak resolution at or near-center is almost always achieved at a lower f-number than peak resolution at the perimeters of the lens - again, due to the relative difference in contribution in overall blur presented to the sensor made by lens aberrations and diffraction. Same lens. Same f-stop. Same sensor. Same pixel pitch. Same pixel aperture. Different visible image sharpness in different parts of the image. Hmmmmm...how do you reconcile that with your claim?

The cheap lens will only start to show diffraction softening at f/8 or f/11, whereas the expensive top-quality lens may show it already at f/4 or f/5.6. That is because the top-quality lens will already be nearly "diffraction limited" wide open, whereas the cheap old lens will only begin to be "diffraction limited" at f/8 or f/11; before that, its resolution is limited by its huge aberrations and not by diffraction. And you will notice one more interesting thing: once past the aperture at which the cheap lens starts showing diffraction, both lenses will have exactly the same resolution, because both are now limited by diffraction, and it affects both exactly the same.
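That convergence drops straight out of the same toy quadrature model; here it is with two assumed aberration levels (again illustrative numbers, not measurements of real lenses):

```python
# Two lenses in the toy quadrature model: a sharp one and a cheap one.
AIRY_PER_STOP_UM = 2.44 * 0.55     # Airy diameter per unit f-number at 550 nm

def total_blur_um(aberration_um: float, f_number: float) -> float:
    diffraction_um = AIRY_PER_STOP_UM * f_number
    return (aberration_um**2 + diffraction_um**2) ** 0.5

LENSES = {"sharp": 2.0, "cheap": 10.0}   # assumed aberration blur in um

for n in (2.8, 4.0, 5.6, 8.0, 11.0, 16.0):
    row = "  ".join(f"{name}: {total_blur_um(ab, n):5.1f} um"
                    for name, ab in LENSES.items())
    print(f"f/{n:>4}: {row}")

# The cheap lens's total barely moves until f/8-f/11, then both curves
# converge as diffraction dominates both. Note that pixel pitch never
# entered the calculation of where either lens peaks.
```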

In the end, this dispute is about the care with which we say things and what we mean when we say them. It's important on a forum such as this to be careful with our wording since it's so easy to make claims that mislead others.

One last try. Look, it is that easy:

No, it's not "that easy". It's more like it's this hard:

Where the magenta blur I've added represents the product of all lens aberrations, which decrease as aperture is reduced. You continue to fixate exclusively on diffraction-induced blur, while I continue to fixate on total optical blur presented to those pixels. As long as you isolate and consider only one element of optical blur (i.e., diffraction), you quickly stumble into the weird logic I described in my response to ahaslett, especially with respect to the implications for how we use cameras with different pixel sizes/apertures. I think it's probably time for us both to move on, since we're largely talking past each other and repeating ourselves in our emphasis of different aspects of image blur and its relationship to pixel size/aperture.
