Diffraction Limit

Started Aug 28, 2013 | Discussions thread
Detail Man Forum Pro • Posts: 16,785
Re: Yep ...

Dr_Jon wrote:

It's a nightmare reading through all of this and I'm not sure I'm cool on people knocking Airy disks so much (well, if that's what they are doing, with my Astronomical hat on I tend to see them a lot).

Luckily I like the unquoted content of your last post on this, although it does tend to some pointing out of fine details over what's useful in the real world, IMHO, that can perhaps be less helpful to people. Can I add some stuff (feel free to disagree, I learned the other day rushing something off before going out can go spectacularly wrong on me... and I'm due out the door in 4 mins and counting...)

There is no such thing as a "diffraction limit" except when the resolution falls to zero.
- well, yes that's true, except it is actually useful to have an idea where diffraction is going to really start killing the sharpness so worth remembering this so-called non-existent limit, while understanding it isn't one you can't bypass (at a cost). On m43 I tend to remember f8 is somewhere not to go below without thinking carefully, for example (I try to stick to f5.6 and up a lot of the time, but I have lenses that are really good at f4-f5.6). There might be diffraction at f2.8 but at that level it's more something for scientific argument than for photographers to worry about.

Hi Jon,

At F-Numbers higher than any F-Number setting that (may) result in a defined "peak" (relative maximum) in the magnitude of the spatial frequency response (the "contrast" level of the MTF) - where the lens-system (itself) is referred to as being "diffraction limited" - the amount of further MTF response "roll-off" at which one's mind's eye makes an aesthetic value-judgement of "un-sharpness" (?) is unclear even to ourselves.

It likely varies with viewing conditions, mood, not to mention the individual nature of the viewed image itself. (At least) one individual can perhaps eventually "iterate" towards settings that satisfy their own expectations.

Rather than beginning at the F-Number "sweet spot" of the "diffraction limited" range of a lens-system's F-Number adjustment, the tangible (albeit still "gradual") "damage" occurs when the MTF of the lens-system itself drops below (say) 10% contrast - taking the rest of the system (the optical filter-stacks, photosite dimensions, and de-mosaicing) "down" with it, by attenuating the highest spatial frequencies of the net composite system (MTF) response at the "front end" of the "signal path".
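To make the "10% contrast" point above concrete, here is a minimal sketch of the diffraction-limited MTF of an ideal circular aperture, and where it first drops below 10% contrast. The specific numbers (550 nm green light, f/8, the 10% threshold, the 0.1 cycles/mm search step) are my assumptions for illustration, not values from the post.

```python
import math

def diffraction_mtf(nu, wavelength_mm, f_number):
    """Diffraction-limited MTF of an ideal circular aperture (in focus,
    no aberrations) at spatial frequency nu in cycles/mm."""
    nu_c = 1.0 / (wavelength_mm * f_number)  # cutoff: zero contrast beyond this
    x = nu / nu_c
    if x >= 1.0:
        return 0.0
    return (2.0 / math.pi) * (math.acos(x) - x * math.sqrt(1.0 - x * x))

# Assumed example values: green light (550 nm) at f/8.
wavelength_mm = 550e-6
N = 8
nu = 0.0
# Step upward in frequency until contrast first falls below 10%.
while diffraction_mtf(nu, wavelength_mm, N) > 0.10:
    nu += 0.1
print(f"Diffraction MTF falls below 10% contrast near {nu:.0f} cycles/mm at f/{N}")
```

Note that the cutoff frequency is exactly 1/(Wavelength x F-Number) - which is why the post frames its estimates in terms of that product.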

My estimates of the range of such critical Wavelength multiplied by F-Number products are here:


There is a point where diffraction softening becomes the dominant source of blur, and this point will vary from lens to lens, as well as where in the frame we are looking (the corners typically, but not always, lag about a stop behind the center for DSLR lenses).
- not arguing

All systems suffer the same diffraction softening at the same DOF.
- Okay, except other factors will affect how much it troubles you, so again many readers here will find it not so useful when choosing whether to use a FF or m43 camera to shoot something.

More pixels, all else equal, will *always* resolve more detail.
- True, although a chunk of the time it will be to so small a degree that you don't care; the rest of the system needs to be in the ball-park. The number of times a friend's D800 out-resolves my 5DmkII is smaller than you'd think, as the pixel effect gets lost in other factors. (He has consumery long lenses, for example.)

All systems do not necessarily resolve equally at the same DOF, as diffraction is one of many sources of blur. However, as the DOF deepens, the resolution decreases, and the resolution differences between systems narrows, typically becoming trivial by f/16 on mFT (f/32 on FF and f/5.6 on an FZ200), regardless of how sharp the lens is or how many pixels the sensor has.
- okay
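The f/16 mFT, f/32 FF, f/5.6 FZ200 correspondence above follows from scaling the f-number by the ratio of crop factors. A small sketch, with crop factors assumed for illustration (2.0 for mFT, and ~5.6 for the FZ200's small sensor, consistent with the f/16-to-f/5.6 ratio in the text):

```python
# Assumed crop factors (FF as the reference format).
CROP = {"FF": 1.0, "mFT": 2.0, "FZ200": 5.6}

def equivalent_f_number(n, from_fmt, to_fmt):
    """Scale an f-number between formats so that DOF (and diffraction blur
    relative to frame size) match, assuming the same angle of view and
    the same final display size."""
    return n * CROP[from_fmt] / CROP[to_fmt]

print(equivalent_f_number(16, "mFT", "FF"))      # 32.0
print(equivalent_f_number(16, "mFT", "FZ200"))   # ~5.7
```

Same DOF, same diffraction softening relative to the frame - which is the equivalence the quoted claims rest on.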

Anyway, I should ask a question - are you unhappy about applying Airy disk maths to either a CoC for viewing a picture of size X at distance Y or using a CoC for the presumed resolution limit of a camera in the 2-3 pixel pitch range (depending on AA filters, de-Bayering algorithms, etc.)?

Thinking in the spatial domain in terms of Airy disk nodes and photosite sizes does not really yield the kind of "intuitively obvious" visualizations that one might imagine to exist. Any focusing-error will have profound (and complicated) additional ("bandlimiting") effects on the lens-system MTF (over and above what we are already discussing).

It is much more useful to perform the analysis in the form of working with MTF (spatial frequency) response-curves. This enables one to simply multiply together all of the (spatial frequency domain) individual MTFs (of the lens, the optical low-pass filter, and the photosite dimensions themselves). After that come the significant effects of de-mosaicing algorithms, too.
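The multiply-the-MTFs approach can be sketched in a few lines. The component models and parameters here are my simplifying assumptions, not from the post: an ideal diffraction-limited lens at f/8 in 550 nm light, a square 4-micron photosite with 100% fill factor (a |sinc| response), and an idealized two-spot birefringent OLPF whose beam displacement equals the pixel pitch (a |cos| response, which nulls at Nyquist by design). De-mosaicing is left out.

```python
import math

# Assumed component parameters.
WAVELENGTH_MM = 550e-6   # 550 nm green light
F_NUMBER = 8.0
PITCH_MM = 0.004         # 4-micron photosite pitch, 100% fill factor

def mtf_diffraction(nu):
    """Diffraction-limited lens MTF (ideal circular aperture, in focus)."""
    x = nu * WAVELENGTH_MM * F_NUMBER  # nu relative to the cutoff 1/(lambda*N)
    if x >= 1.0:
        return 0.0
    return (2.0 / math.pi) * (math.acos(x) - x * math.sqrt(1.0 - x * x))

def mtf_pixel(nu):
    """|sinc| response of the square photosite aperture."""
    arg = math.pi * nu * PITCH_MM
    return 1.0 if arg == 0.0 else abs(math.sin(arg) / arg)

def mtf_olpf(nu):
    """Idealized two-spot beam-splitter OLPF: |cos(pi * nu * displacement)|."""
    return abs(math.cos(math.pi * nu * PITCH_MM))

def mtf_system(nu):
    # The composite response is simply the product of the component MTFs.
    return mtf_diffraction(nu) * mtf_pixel(nu) * mtf_olpf(nu)

for nu in (0, 25, 50, 75, 100, 125):
    print(f"{nu:3d} cycles/mm: system MTF = {mtf_system(nu):.3f}")
```

Note how the composite curve is always at or below the weakest component - which is the sense in which a soft "front end" takes the rest of the system "down" with it.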

Not unlike an audio system (where we look at the final "frequency spectrum" that the ear hears rather than laboriously chart the time-domain impulse-shapes). It's the same when we are talking about the "spatial frequency domain" (in imagery) as opposed to the "time domain" in the case of audio analysis. The "time domain" does not tell us very much, and it is infinitely more mathematically laborious to numerically analyze. The "frequency response" tells the story.

Oh, I should say the point being working out when you are likely to really start caring about diffraction effects, but I'll avoid the "L" word. Okay, re-reading that maybe I should have said "Limit" word...

(I think it is something worth knowing, BTW.)
