Diffraction Limit

Started Aug 28, 2013 | Discussions thread
Great Bustard Forum Pro • Posts: 40,846
As Sir George Mallory said, "Because we can". : )

zodiacfml wrote:

The OP is correct, diffraction is simply that.

You make a compelling case, sir!

Everything else constant except changing the f number of the lens, diffraction just limits the information we can capture.

Yes. However, lens aberrations also limit that information, and each lens on each system reaches its balance point for maximum information capture at a different aperture (not to mention other sources of blur, such as DOF, motion, and noise).

Imagine the aperture of the lens as a filter/hole and photons of light as balls. For a diffraction limited f-stop, larger photons/balls such as the wavelengths of RED get filtered and can't get through this hole. So, what gets through that hole is always constant regardless of what catches/absorbs it (such as a sensor format or number of pixels).

All wavelengths get through the hole, it's just that the longer wavelengths smear out more than shorter wavelengths when passing through the aperture.
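To put a rough number on how much longer wavelengths smear out more, here's a minimal sketch using the standard Airy disk formula for a circular aperture, d = 2.44 x wavelength x f-number (the wavelengths are nominal illustrative values, not anything from this thread):

```python
# Airy disk diameter (to the first minimum) for a circular aperture:
# d = 2.44 * wavelength * f_number. Wavelengths here are nominal
# (green ~550 nm, red ~650 nm) and chosen purely for illustration.

def airy_diameter_um(wavelength_nm, f_number):
    """Diameter of the Airy disk in micrometers."""
    return 2.44 * (wavelength_nm / 1000.0) * f_number

for color, wl in [("green", 550), ("red", 650)]:
    print(f"{color} at f/8: {airy_diameter_um(wl, 8):.2f} um")
```

At any given f-stop, red light produces a noticeably larger blur spot than green, which is the "smears out more" above in concrete terms.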

If we have a large basket (such as a pixel) that two small green balls can fit in, then we are resolution limited. If we have very small baskets (small pixels) but one large red ball comes in, the large ball is sliced into pieces to fit into those baskets. What I'm trying to say here is that smaller pixels don't add detail to a diffraction-limited capture; it's just over-sampled.

The photon will either be recorded by a specific pixel or not. The smaller the aperture, the wider the range of pixels that might absorb the photon. In other words, the sensor will record a specific position for the photon, but the smaller the aperture, the greater the chance that the wrong position is recorded.

That said, smaller pixels for a given sensor size (greater sampling) will still record a more accurate image, on average, than larger pixels. It's just that this greater accuracy becomes trivial by some point (in my opinion, that point is f/16 on mFT, f/32 on FF, f/5.6 on the FZ200, etc.).
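One way to see where that "trivial by some point" threshold comes from is to compare the Airy disk diameter to the pixel pitch. A minimal sketch, assuming the same Airy formula as above and illustrative pixel pitches (roughly 3.7 um for a 16 MP mFT sensor and 4.9 um for a 36 MP FF sensor; both are assumed values, not from this thread):

```python
# Sketch: how many pixel widths the diffraction blur spans at a given
# f-number. Pixel pitches are illustrative assumptions (~3.7 um for a
# 16 MP mFT sensor, ~4.9 um for a 36 MP FF sensor).

def airy_diameter_um(wavelength_nm, f_number):
    """Airy disk diameter (first minimum), d = 2.44 * wavelength * N."""
    return 2.44 * (wavelength_nm / 1000.0) * f_number

def pixels_spanned(f_number, pitch_um, wavelength_nm=550):
    """Number of pixel widths the Airy disk covers."""
    return airy_diameter_um(wavelength_nm, f_number) / pitch_um

print(f"mFT at f/16: {pixels_spanned(16, 3.7):.1f} pixel widths")
print(f"FF  at f/32: {pixels_spanned(32, 4.9):.1f} pixel widths")
```

Once the blur spot spans several pixels, each extra pixel mostly re-samples the same blur, which is why finer sampling stops paying off past a certain f-stop.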

Yet, why do people say that smaller pixels (high-res sensors) are bad in terms of diffraction? Simple: you just lose efficiency due to diminishing returns, specifically the efficiency of image file sizes. We still get more detail on higher resolution sensors, but for very little gain.

Yes. The blur of diffraction is obscured by the blur of larger pixels. Thus, we notice the effects of diffraction sooner with smaller pixels because we can. With larger pixels, we could not see the diffraction as early since the diffraction itself was absorbed by the blur of larger pixels.

I would like to see a test on this subject since there are two Nikon DSLRs perfect for this, a D3s and a D800.

Here's an excellent demonstration of pixel size vs aperture with regard to diffraction:


In matters of DOF and varying sensor formats on diffraction, I think it is a complicated yet futile discussion since we are lens limited anyway, by the smaller formats.

Actually, smaller formats are more diffraction limited than larger formats, and larger formats are more lens limited, relative to diffraction, than smaller formats. This is the source of the oft-touted "wide open advantage" that so many mFT and 4/3 users cite without understanding the implications.

For example, let's consider a 45 / 1.8 on mFT vs an 85 / 1.8 on FF. Wide open at f/1.8 on mFT is equivalent to stopped down to f/3.5 on FF. So, while the 45 / 1.8 is "sharp wide open" and peaking at, say, f/2.8, the FF lens is not sharp until, say, f/2.8 and does not peak until f/5.6.

So, in reality, the FF lens gets sharp a bit earlier than the mFT lens in equivalent terms, but both peak at the same equivalent aperture due to diffraction.
