Diffraction and the 4/3 sensor

Well, it is not directly a function of sensor size. It is directly a function of aperture diameter. Diffraction is what occurs when light interacts with an edge. Light that comes close to an edge is deflected from its path. At smaller aperture diameters, a greater proportion of all the light passing through the lens passes close enough to the edge to be deflected. As the portion of light deflected increases, the softness becomes more evident.
This analysis is wrong. In fact, the rays that are near the edge are the good rays, the ones that contribute most to the resolution (reduction in size of the Airy disk).
I suggest you do some reading so you can correct your misconceptions. The photons that pass closest to the edge of the aperture are the ones whose path is interfered with.

From Wiki:

"Diffraction refers to various phenomena that occur when a wave encounters an obstacle or a slit. It is defined as the bending of light around the corners of an obstacle or aperture."

There are no obstacles in the centre of the aperture.
 
That is one of the most misinformative parts of a site that is full of errors. Pixel size has no effect on diffraction, save that if the pixel size is coarse enough to obscure things on the scale of diffraction blurring, you won't see it.
The most interesting part is that you and Great Bustard are contradicting your own opinions.

First you both claim that the information stated on the site is false, and then right after that you both go on to say "but you won't see it...", just as the site says.

This is because you seem to think of everything in ones and zeros, in a purely theoretical manner, where you separate the aperture and the sensor as two totally different units that have nothing to do with each other, just so you can handle each aspect in isolation rather than as a combination that creates something visual.

Diffraction is purely an effect that is judged only by "seeing it". If you can't see the effect, it doesn't matter whether it is there or not. Even if you can see it, it is still a question of opinion whether it matters (like, does it matter that your shoes wear out when you walk in them?).

Every single person should be able to agree that f/22 clearly suffers from softening caused by diffraction! But at the same time every single one can agree that they will not see that diffraction softening in a 640x480 image! If we had a 4/3" sensor with only 640x480 pixels, we would never see the diffraction effect the way we see it with 16 Mpix.

We could take the same image at f/2.8 or f/22 and there wouldn't be any visual difference with such a low pixel density.

Or do you both claim that if you were given a few such 640x480 images for comparison, you could immediately tell which one was taken at f/2.8 and which at f/22?

Pixel density isn't the source of the diffraction, but it plays its part in capturing it.
 
It's worth mentioning that Canon's RAW converter, I believe, actually has algorithms that can deconvolve the blur that comes from diffraction to some extent, as opposed to Roger's clever sharpening method in the article above.
And we should all already know that at least Olympus TruePic does this in the body, and does a very good job of it too for usual purposes.

 
Well, it is not directly a function of sensor size. It is directly a function of aperture diameter. Diffraction is what occurs when light interacts with an edge. Light that comes close to an edge is deflected from its path. At smaller aperture diameters, a greater proportion of all the light passing through the lens passes close enough to the edge to be deflected. As the portion of light deflected increases, the softness becomes more evident.

The aperture of the 400mm lens opened at f/32 is much, much wider than the 12mm opened at f/4.

400/32 = 12.5 mm

12/4 = 3.0 mm

That's more than 300% wider (in diameter)

Why is the diffraction more??
 
Yes, we know diffraction is always there. But from a practical point of view, I'm much more concerned with when diffraction would be serious enough to affect IQ. Cambridgeincolour simply provides a quantitative method allowing me to predict where that point would be, and, coincidence or not, it matches the (limited) reality I know. I would be happy to use another source if I could get a similar result from their work.
CiC's take on diffraction is at the pixel level, as opposed to the image level. What CiC says about diffraction is akin to when people say lenses won't work on higher resolution sensors because they're not sharp enough.

In any case, consider the following statement that was just made in another post:

Mathematically with 16Mpix the diffraction is f/7.2 so f/7.1 is the "non-diffraction" limit.

Simply not true, and I think we both know where that "diffraction limit" came from.

The reality is that just as the sharpness of a lens limits the additional resolution that more pixels offer, diffraction also limits the additional resolution that more pixels offer.

In other words, more smaller pixels doesn't result in a lens delivering photos that have less detail. Likewise, more smaller pixels doesn't result in a reduction of detail due to diffraction.
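For what it's worth, here is a rough sketch (in Python, with assumed numbers: 550 nm light, a 17.3 x 13 mm Four Thirds sensor, and an "Airy disk spans about two pixels" criterion, none of which come from the posts above) of where a pixel-pitch-based "diffraction limited aperture" figure like f/7.2 comes from. Different sites use different thresholds, which is why the quoted limits vary -- and, as argued above, the number describes when the pixels begin to resolve the blur, not a cliff beyond which smaller pixels make the photo worse.

```python
# Hedged sketch: where a pixel-pitch-based "diffraction limited aperture"
# comes from. Assumed numbers (not from this thread): 550 nm green light,
# a 17.3 x 13.0 mm Four Thirds sensor with ~16 MP, and the rule of thumb
# that diffraction "sets in" once the Airy disk spans roughly two pixels.

sensor_width_mm = 17.3
pixels_across = 4608          # ~16 MP Four Thirds sensor
wavelength_um = 0.55          # green light, in micrometres

pixel_pitch_um = sensor_width_mm * 1000 / pixels_across    # ~3.75 um
airy_um_per_stop = 2.44 * wavelength_um                    # Airy diameter = 2.44 * lambda * N

for pixels_spanned in (2.0, 2.5):
    n = pixels_spanned * pixel_pitch_um / airy_um_per_stop
    print(f"Airy disk spans {pixels_spanned} px at about f/{n:.1f}")
# -> roughly f/5.6 and f/7.0 depending on the threshold chosen.  Whatever
# the exact figure, it marks where the pixels start to resolve the blur,
# not a point beyond which smaller pixels make the photo worse.
```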
Thank you, and now I finally understand the difference between us.
I'm not sure which differences you are referring to. However, the way I see things is that there are the facts, the interpretation of the facts with regard to the visual properties of the photo, and, finally, whether or not it is relevant to one's photography.

So, while I am perfectly good with people saying that the facts are irrelevant to their photography, I am not good with the incorrect info being presented as fact, or correct info being misrepresented out of context.
Take it easy, happy shooting :-)
To you as well!
 
Well, it is not directly a function of sensor size. It is directly a function of aperture diameter. Diffraction is what occurs when light interacts with an edge. Light that comes close to an edge is deflected from its path. At smaller aperture diameters, a greater proportion of all the light passing through the lens passes close enough to the edge to be deflected. As the portion of light deflected increases, the softness becomes more evident.
This analysis is wrong. In fact, the rays that are near the edge are the good rays, the ones that contribute most to the resolution (reduction in size of the Airy disk).
I suggest you do some reading so you can correct your misconceptions. The photons that pass closest to the edge of the aperture are the ones whose path is interfered with.

From Wiki:

"Diffraction refers to various phenomena that occur when a wave encounters an obstacle or a slit. It is defined as the bending of light around the corners of an obstacle or aperture."

There are no obstacles in the centre of the aperture.
It is the interference that enhances the sharpness for large aperture sizes.

Diffraction in general is any effect that is caused by the interference of light waves with each other. The present context (image sharpness as a function of aperture size) is not addressed by the Wiki article. Image blur caused by diffraction can be explained as follows:

Consider a point source of light focused to a point on the sensor. Now consider another point on the sensor near the image point. Ideally, all the light appears at the image point and none at the nearby point. For rays near the center of the beam, the distances to the image point and the nearby point are nearly the same so the waves arrive with nearly the same phase, resulting in light at both points. This is the blurring effect of diffraction. Rays that are far from the beam center (and refracted more) arrive at a steep angle. Thus the path lengths to the image point and the nearby point are significantly different so there is a phase difference and the nearby point sees less light than the image point because of interference.

With many rays far from the center (large aperture), the relative light intensity at the nearby point is small, which means a sharp image. The iris edge (the obstacle which defines the aperture) really has nothing to do with this. It is that the rays farther from the center have a larger interference effect and result in making the image of the point (Airy disk) smaller. A large aperture allows for more of these good far-from-center rays.
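A minimal numeric illustration of that last point (assuming an ideal circular aperture and 550 nm light; the 50 mm focal length is just a placeholder): the first-zero diameter of the Airy disk on the sensor is d = 2.44·λ·f/D, so for a fixed focal length a wider aperture gives a smaller Airy disk.

```python
# Minimal numeric check (assumed: ideal circular aperture, 550 nm light,
# a placeholder 50 mm focal length): Airy disk first-zero diameter on the
# sensor is d = 2.44 * lambda * f / D, so a wider aperture -> smaller disk.

wavelength_mm = 550e-6
focal_length_mm = 50.0

for aperture_diameter_mm in (3.125, 6.25, 12.5, 25.0):    # f/16, f/8, f/4, f/2
    n = focal_length_mm / aperture_diameter_mm
    airy_um = 2.44 * wavelength_mm * n * 1000
    print(f"D = {aperture_diameter_mm:6.3f} mm (f/{n:g}): Airy disk ~ {airy_um:4.1f} um")
```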
 
That's pretty much what MTF-50 charts show, except they do so in numerical form. As JK linked above, lensrentals has an excellent article that includes both MTF-50 charts and actual photos.

It's worth mentioning that Canon's RAW converter, I believe, actually has algorithms that can deconvolve the blur that comes from diffraction to some extent, as opposed to Roger's clever sharpening method in the article above. DxOMark does a similar thing with their RAW converter, except it is for lens aberrations, as opposed to diffraction.
In the dim recesses of my memory I thought that the Oly Truepic VII chip (E-M1 and a few more) has diffraction correction.

Finally after pounding Google a bit I found reference to "resolution management and diffraction degradation compensation" when talking about the chip. From http://fourthirds-user.com/2013/09/olympus_omd_em1_new_features_explained_.php/a

Seems to be mainly done by smart sharpening and contrast fiddles.

So testing for diffraction may be distorted by the actions of the Truepic chip at play.

Regards..... Guy
 
Heck, I'd do it but no time, too busy planing down timber to the size it should be.
I hear that a Japanese planer is sharper than a British planer...
When my Taiwanese planer is adjusted down to small apertures I see that the timber gets diffracted.

Regards....... Guy
 
That is one of the most misinformative parts of a site that is full of errors. Pixel size has no effect on diffraction, save that if the pixel size is coarse enough to obscure things on the scale of diffraction blurring, you won't see it.
The most interesting part is that you and Great Bustard are contradicting your own opinions.
Let's see how this goes.
First you both claim that the information stated on the site is false.
Neither of us has said that. What we have said is that some things are either wrong or misrepresented. For example, his Diffraction Limited Calculator is based on pixel size, when, in fact, pixel size plays a very small role -- it is the lens that determines the diffraction limited aperture. For example, in another post you said:

Mathematically with 16Mpix the diffraction is f/7.2 so f/7.1 is the "non-diffraction" limit.

Simply not true, and that statement was made as a direct result of what you read on CiC.
And then right after that you both go on to say "but you won't see it...", just as the site says.
Not sure what you mean here. For sure, the effects of diffraction are often, if not usually, obscured by other forms of blur.
This is because you seem to think of everything in ones and zeros, in a purely theoretical manner, where you separate the aperture and the sensor as two totally different units that have nothing to do with each other, just so you can handle each aspect in isolation rather than as a combination that creates something visual.
Well, there's gravity, air resistance, and the rotation of the Earth in a ballistics problem. Sometimes, one effect is so dominant over the others that you can ignore the other effects (e.g., a shot put throw need only consider gravity). Other times, one needs to consider the other effects in equal measure.
Diffraction is purely an effect that is judged only by "seeing it". If you can't see the effect, it doesn't matter whether it is there or not. Even if you can see it, it is still a question of opinion whether it matters (like, does it matter that your shoes wear out when you walk in them?).
How much diffraction affects the "success" of the photo is another matter altogether. For example, f/5.6 1/400 is twice as noisy as f/2.8 1/400, all else equal, but that doesn't mean that the f/5.6 photo appears noisy.
Every single person should be able to agree that f/22 clearly suffers from softening caused by diffraction! But at the same time every single one can agree that they will not see that diffraction softening in a 640x480 image! If we had a 4/3" sensor with only 640x480 pixels, we would never see the diffraction effect the way we see it with 16 Mpix.
The effects of diffraction, for sure, can be obscured with fewer larger pixels. However, the diffraction is every bit there with the 0.3 MP photo as it is with the 16 MP photo. More to the point, the 16 MP photo isn't at a disadvantage compared to the 0.3 MP photo in any way, shape, or form.
We could take the same image at f/2.8 or f/22 and there wouldn't be any visual difference with such a low pixel density.
Sure, but that's like saying that the 40-150 / 2.8 is no sharper than the 40-150 / 4-5.6.
Or do you both claim that if you were given a few such 640x480 images for comparison, you could immediately tell which one was taken at f/2.8 and which at f/22?
No such claim was stated or implied.
Pixel density isn't the source of the diffraction, but it plays its part in capturing it.
More smaller pixels, all else equal, simply record the image projected on the sensor with more accuracy. The diffraction is in the projected image, the pixels merely record the diffraction that already exists with various degrees of precision depending on the severity of the diffraction and the number of pixels sampling the image.
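As a rough illustration of that sampling argument, here is a toy 1-D simulation (my own assumptions, not anyone's claim above: a hard edge for the scene, Gaussian blurs of two widths standing in for f/2.8 and f/22, and box-filter pixels). The blur is present in both projected images; the coarse sampling simply cannot tell the two captures apart, while the fine sampling can.

```python
import numpy as np

x = np.linspace(-1.0, 1.0, 20000)
scene = (x > 0).astype(float)                   # ideal scene: a step edge

def blur(signal, sigma):
    """Gaussian blur as a crude stand-in for diffraction."""
    k_x = np.linspace(-0.2, 0.2, 4001)
    k = np.exp(-k_x**2 / (2 * sigma**2))
    return np.convolve(signal, k / k.sum(), mode="same")

def sample(signal, n_pixels):
    """Average over n_pixels equal bins -- a crude model of a pixel row."""
    usable = len(signal) // n_pixels * n_pixels
    return signal[:usable].reshape(n_pixels, -1).mean(axis=1)

mild = blur(scene, 0.002)     # stand-in for the f/2.8 projected image
heavy = blur(scene, 0.02)     # stand-in for the f/22 projected image

for n in (25, 400):           # "0.3 MP-ish" row vs "16 MP-ish" row
    a, b = sample(mild, n), sample(heavy, n)
    mid = slice(n // 4, 3 * n // 4)             # ignore convolution edge effects
    print(f"{n:4d} pixels across: max difference between captures = "
          f"{np.abs(a[mid] - b[mid]).max():.3f}")
# The coarse sampling reports the two apertures as essentially identical;
# the fine sampling clearly separates them -- yet the projected blur was
# the same in both cases.
```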
 
Weren't Ansel Adams and friends called the f/64 Club? Of course that was using an entirely different camera system, so probably not comparable.
Equivalent to f/4 on mFT, actually.
There is no aperture equivalence among formats. f/64 is f/64, always, everywhere. In combination with a designated film speed, the light-meter-calculated shutter speed is the same on all camera formats. The photography triangle is thus complete: the aperture, the speed of the film, the shutter speed.

No 'equivalence' there.
The appearance of diffraction, like noise, DOF, resolution, etc., does, indeed, depend on the display medium, display size, viewing distance, and visual acuity, as well as the scene itself.
And one million more factors: the quality of light captured, the microcontrast of the lens, tonal bandwidth of the lens, quality of the lens coatings, glass composition, glass thickness, number of corrective elements, and so on, and on ...
 
Well, it is not directly a function of sensor size. It is directly a function of aperture diameter. Diffraction is what occurs when light interacts with an edge. Light that comes close to an edge is deflected from its path. At smaller aperture diameters, a greater proportion of all the light passing through the lens passes close enough to the edge to be deflected. As the portion of light deflected increases, the softness becomes more evident.
The aperture of the 400mm lens opened at f/32 is much, much wider than the 12mm opened at f/4.

400/32 = 12.5 mm

12/4 = 3.0 mm

That's more than 300% wider (in diameter)

Why is the diffraction more??
Because the focal length is shorter, so the blur doesn't have as much space to spread. The smaller the hole the light passes through, the quicker the light spreads. The further from the hole the image is formed, the further the light has spread.

Thus, the diameter of the spread (d) is inversely proportional to the diameter of the hole (a = aperture diameter) and directly proportional to the distance traveled (f = focal length):
  • d ∝ (1/a)(f) = f/a
Hence, you can see that the width of the spread is proportional to the [reciprocal of the] relative aperture, not the aperture diameter.

However, in terms of its effect on the photo, we have to consider what proportion of the photo that spread covers. For example, a spread of 1mm represents twice as much blur on an mFT sensor than on a FF sensor. On the other hand, for the same framing and DOF, a FF lens would use twice the focal length and the same aperture diameter resulting in twice the spread which would then be the same proportion of a FF sensor that half the spread is on an mFT sensor. Thus, the effects of diffraction are the same for the same framing and DOF on all systems.
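Putting assumed numbers on that paragraph (550 nm light, nominal sensor diagonals of ~21.6 mm for mFT and ~43.3 mm for full frame): with the same framing and DOF the aperture diameter is the same, and the Airy disk then covers the same fraction of the frame on both formats.

```python
# Assumed numbers: 550 nm light, sensor diagonals ~21.6 mm (mFT) and
# ~43.3 mm (full frame).  Same framing and DOF -> same aperture diameter.

wavelength_mm = 550e-6

def airy_fraction(focal_mm, aperture_diameter_mm, sensor_diag_mm):
    """Airy disk diameter (2.44 * lambda * f/D) as a fraction of the frame diagonal."""
    airy_mm = 2.44 * wavelength_mm * focal_mm / aperture_diameter_mm
    return airy_mm / sensor_diag_mm

print(f"mFT, 12 mm with a 3 mm aperture (f/4): {airy_fraction(12, 3, 21.6):.2e} of the frame")
print(f"FF,  24 mm with a 3 mm aperture (f/8): {airy_fraction(24, 3, 43.3):.2e} of the frame")
# -> the same fraction of the photo, hence the same visible diffraction.
```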
 
Weren't Ansel Adams and friends called the f/64 Club? Of course that was using an entirely different camera system, so probably not comparable.
Equivalent to f/4 on mFT, actually.
There is no aperture equivalence among formats. f/64 is f/64, always, everywhere. In combination with a designated film speed, the light-meter-calculated shutter speed is the same on all camera formats. The photography triangle is thus complete: the aperture, the speed of the film, the shutter speed.
Nothing you wrote above contradicts *anything* in Equivalence.
No 'equivalence' there.
Because you didn't list any quantities that resulted in equivalent aspects of the displayed photo.
The appearance of diffraction, like noise, DOF, resolution, etc., does, indeed, depend on the display medium, display size, viewing distance, and visual acuity, as well as the scene itself.
And one million more factors: the quality of light captured, the microcontrast of the lens, tonal bandwidth of the lens, quality of the lens coatings, glass composition, glass thickness, number of corrective elements, and so on, and on ...
Yeah -- I guess it's all completely unknowable. An Olympus 7-14 / 4 on an E5 isn't equivalent to a Panasonic 7-14 / 4 on an EM5, then, right? After all, totally different lens designs. And a 45 / 1.8 on an EM5 isn't equivalent to the same 45 / 1.8 on an EM1.2 because the sensors are totally different, right? Not to mention the focusing, fps, etc., etc., etc. And coincidences like this or this, or any number of countless other examples, are just examples of a broken clock being right twice a day, yes?

In short, the only thing equivalent to Lens A on Camera A is Lens A on Camera A, yes? Each camera-lens combination is so unique that there is nothing like it with any other camera and lens, right?
 
Weren't Ansel Adams and friends called the f/64 Club? Of course that was using an entirely different camera system, so probably not comparable.
Equivalent to f/4 on mFT, actually.
There is no aperture equivalence among formats. f/64 is f/64, always, everywhere. In combination with a designated film speed, the light-meter-calculated shutter speed is the same on all camera formats. The photography triangle is thus complete: the aperture, the speed of the film, the shutter speed.
Nothing you wrote above contradicts *anything* in Equivalence.
No 'equivalence' there.
Because you didn't list any quantities that resulted in equivalent aspects of the displayed photo.
The appearance of diffraction, like noise, DOF, resolution, etc., does, indeed, depend on the display medium, display size, viewing distance, and visual acuity, as well as the scene itself.
And one million more factors: the quality of light captured, the microcontrast of the lens, tonal bandwidth of the lens, quality of the lens coatings, glass composition, glass thickness, number of corrective elements, and so on, and on ...
Yeah -- I guess it's all completely unknowable. An Olympus 7-14 / 4 on an E5 isn't equivalent to a Panasonic 7-14 / 4 on an EM5, then, right? After all, totally different lens designs. And a 45 / 1.8 on an EM5 isn't equivalent to the same 45 / 1.8 on an EM1.2 because the sensors are totally different, right? Not to mention the focusing, fps, etc., etc., etc. And coincidences like this or this, or any number of countless other examples, are just examples of a broken clock being right twice a day, yes?

In short, the only thing equivalent to Lens A on Camera A is Lens A on Camera A, yes? Each camera-lens combination is so unique that there is nothing like it with any other camera and lens, right?
Joe

Are you agreeing that different body and lens combinations can render images differently in nominally equivalent situations? Or are you making the point that most of the time images taken with equivalent parameters look pretty much the same? If you had a strongly lit flower and two lenses with different amounts of SA, the glow would be different, for example.

Equivalence has a spooky ability to predict the settings that will produce the images I expect. However different lenses and bodies do produce different images in nominally equivalent situations.

Andrew
 
Weren't Ansel Adams and friends called the f/64 Club? Of course that was using an entirely different camera system, so probably not comparable.
Equivalent to f/4 on mFT, actually.
There is no aperture equivalence among formats. f/64 is f/64, always, everywhere. In combination with a designated film speed, the light-meter-calculated shutter speed is the same on all camera formats. The photography triangle is thus complete: the aperture, the speed of the film, the shutter speed.
Nothing you wrote above contradicts *anything* in Equivalence.
Quick correction to what I wrote just above -- the film speed (or ISO setting on the camera) is not part of exposure, per se. It only matters inasmuch as we adjust the aperture, exposure time, and/or flash power to "accommodate" the film speed / ISO setting.
No 'equivalence' there.
Because you didn't list any quantities that resulted in equivalent aspects of the displayed photo.
The appearance of diffraction, like noise, DOF, resolution, etc., does, indeed, depend on the display medium, display size, viewing distance, and visual acuity, as well as the scene itself.
And one million more factors: the quality of light captured, the microcontrast of the lens, tonal bandwidth of the lens, quality of the lens coatings, glass composition, glass thickness, number of corrective elements, and so on, and on ...
Yeah -- I guess it's all completely unknowable. An Olympus 7-14 / 4 on an E5 isn't equivalent to a Panasonic 7-14 / 4 on an EM5, then, right? After all, totally different lens designs. And a 45 / 1.8 on an EM5 isn't equivalent to the same 45 / 1.8 on an EM1.2 because the sensors are totally different, right? Not to mention the focusing, fps, etc., etc., etc. And coincidences like this or this, or any number of countless other examples, are just examples of a broken clock being right twice a day, yes?

In short, the only thing equivalent to Lens A on Camera A is Lens A on Camera A, yes? Each camera-lens combination is so unique that there is nothing like it with any other camera and lens, right?
Joe

Are you agreeing that different body and lens combinations can render images differently in nominally equivalent situations?
Absolutely! In fact, I *specifically* spelled that out in the Equivalence Essay:

http://www.josephjamesphotography.com/equivalence/#quick
  • Elements of IQ, such as bokeh, color, flare handling, distortion, etc., as well as elements of operation, such as AF speed/accuracy, size, weight, etc., are not covered in this use of the term "equivalent". For example, the Canon 50 / 1.4 on the Canon 5D (13 MP FF) is equivalent to the Sigma 50 / 1.4A on the Nikon D810 (36 MP FF) despite the fact that the latter system will have significantly higher resolution, lower noise, better bokeh, etc., etc..
Or are you making the point that most of the time images taken with equivalent parameters look pretty much the same?
Equivalent photos will look the same with regards to perspective, [diagonal] framing, DOF, motion blur, and brightness. By adding in conditions with regards to the hardware, more can be said about other visual properties of the photo (conditions for same noise and same resolution are spelled out in bullet points in the link above).
If you had a strongly lit flower and two lenses with different amounts of SA, the glow would be different, for example.
Sure. Flare can be radically different. Field curvature can wreak havoc. Distortion can be a hassle. Etc., etc., etc.
Equivalence has a spooky ability to predict the settings that will produce the images I expect. However different lenses and bodies do produce different images in nominally equivalent situations.
Absolutely, and this is all spelled out *explicitly* in the Equivalence Essay.
 
Weren't Ansel Adams and friends called the f/64 Club? Of course that was using an entirely different camera system, so probably not comparable.
Equivalent to f/4 on mFT, actually.
Horrors! Is there such a thing as being completely blown away by diffraction? I have a few lenses that I can adapt that reach f/32 (not f/64 - which I had thought was chosen because it was about the practical limit of lens design and should have made it possible to get close to the perfectly precise scientific reproduction that Ansel set out to make his own).

I am sitting here looking at a Mamiya 645 145mm f/4.0 Soft Focus lens which is not only medium format but quite capable of being used on a 4/3 sensor. But maybe it is not quite so bad converted to the 4/3 sensor - but then this whole discussion seems to hinge on "actual" aperture and not "equivalent" aperture - so if I sound a little confused it is because I am confused.
A bit of the confusion comes from using a notation that obscures rather than explains. The aperture on your Mamiya is not 'f4.0' but 'f/4.0'. 'f' is 145mm, so 'f/4.0' is 145/4.0 = 36.25mm.

If you use this on a Four Thirds sensor, the lens doesn't change; it still has a focal length of 145mm and an aperture of 36.25mm. It's going to produce a rather different picture on your FT sensor than it does on the Mamiya, though. It will produce the picture that a 472mm lens would on the Mamiya. The diffraction blur on the sensor remains just the same, but to view the final image, you're going to enlarge the FT frame 3.25 times more than you would the 645 image, so the blur will look 3.25 times larger. To get the same diffraction from a real 472mm lens on the Mamiya, it needs the same aperture diameter, 36.25mm, which is an f-number of f/13.
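The arithmetic in that reply, written out as a quick check (the 3.25x factor between 645 and Four Thirds is the one used above):

```python
focal_mm = 145.0
f_number = 4.0
crop_645_to_ft = 3.25         # figure used in the post above

aperture_diameter_mm = focal_mm / f_number          # unchanged by the sensor behind it
equiv_focal_mm = focal_mm * crop_645_to_ft          # field of view it gives on FT, in 645 terms
equiv_f_number = equiv_focal_mm / aperture_diameter_mm

print(aperture_diameter_mm)   # 36.25 mm
print(equiv_focal_mm)         # 471.25 mm (rounded to 472 above)
print(equiv_f_number)         # 13.0 -> hence "f/13"
```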
Dear old Ansel will be spinning in his grave ....
I doubt it, he understood all this stuff very well indeed.
To diffraction this must be like tacking down linoleum with a sledge hammer :)
Not when you're working with a large format camera, where 145mm is a wide angle lens.
 
That's pretty much what MTF-50 charts show, except they do so in numerical form. As JK linked above, lensrentals has an excellent article that includes both MTF-50 charts and actual photos.

It's worth mentioning that Canon's RAW converter, I believe, actually has algorithms that can deconvolve the blur that comes from diffraction to some extent, as opposed to Roger's clever sharpening method in the article above. DxOMark does a similar thing with their RAW converter, except it is for lens aberrations, as opposed to diffraction.
In the dim recesses of my memory I thought that the Oly Truepic VII chip (E-M1 and a few more) has diffraction correction.

Finally after pounding Google a bit I found reference to "resolution management and diffraction degradation compensation" when talking about the chip. From http://fourthirds-user.com/2013/09/olympus_omd_em1_new_features_explained_.php/a

Seems to be mainly done by smart sharpening and contrast fiddles.
Deconvolution. The diffraction point spread function is predictable, so you can deconvolve it against the image. The problem with it is that you can get nasty artifacts and, since it is essentially boosting the high spatial frequencies, noise is boosted as well.
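For anyone curious, here is a minimal sketch of that kind of PSF deconvolution (Richardson-Lucy, which is one common choice -- not a claim about what TruePic or Canon's converter actually use). It assumes a known, shift-invariant PSF and a noiseless image; with real noise the iterations amplify it, which is exactly the artifact problem mentioned above.

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(blurred, psf, iterations=30):
    """Plain Richardson-Lucy deconvolution with a known PSF."""
    estimate = np.full_like(blurred, 0.5)
    psf_mirror = psf[::-1, ::-1]
    for _ in range(iterations):
        reblurred = fftconvolve(estimate, psf, mode="same")
        ratio = blurred / np.maximum(reblurred, 1e-12)
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    return estimate

# Toy example: blur a single bright point with a small Gaussian standing in
# for the diffraction PSF, then try to recover it.
g = np.exp(-np.linspace(-2.0, 2.0, 9) ** 2)
psf = np.outer(g, g)
psf /= psf.sum()

scene = np.zeros((64, 64))
scene[32, 32] = 1.0
blurred = fftconvolve(scene, psf, mode="same")

restored = richardson_lucy(blurred, psf)
print(f"peak before deconvolution: {blurred.max():.3f}, after: {restored.max():.3f}")
```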
So testing for diffraction may be distorted by the actions of the Truepic chip at play.

Regards..... Guy
 
There is also help with reducing diffraction in your own processing of the RAW: the latest version of Capture One Pro 10 has an option in the lens tools to reduce the effect of diffraction. They have lens profiles for our lenses, and since it increases the processing time quite a bit you cannot have it on by default, so it is doing some very complex image analysis and processing.
 
Diffraction is a matter of physics. For that reason it doesn't depend on the quality of the lens. There may be small differences for reasons I won't bother to go into, but if you compare lens MTF curves, you will see that they gradually converge at smaller apertures.

Diffraction limited means a lens that is so well corrected it is limited mainly by diffraction. For most lenses, that only happens beyond a certain aperture; for high quality lenses it happens at a wider aperture than low quality lenses.

Whether diffraction is perceived as a problem depends upon the underlying image, how it is displayed and a variety of subjective factors.

However, we can say objectively:

1. That there is a point at which the pixel pitch of the sensor is the limiting factor on resolution, so diffraction can be more or less ignored. That is why diffraction is sometimes linked to pixel pitch.

2. There is a point at which stopping down to obtain depth of field means that you actually end up blurring the zone of focus due to diffraction to a greater degree than the out-of-focus areas would be blurred at a wider aperture. As depth of field depends on the same (partly subjective) factors as diffraction itself, this may or may not be noticeable. It does mean that stopping down beyond f/22 on full frame or f/11 on mFT will result in a more evenly blurred image, but not more average resolution across the frame (see the quick numeric check below).

For a variety of reasons, however, diffraction does tend to respond well to sharpening in post processing, so that can also be a factor when deciding whether to use very small apertures.
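Rough numbers behind point 2 above, under my own assumptions (550 nm light and the conventional circles of confusion of about 0.030 mm for full frame and 0.015 mm for mFT): the Airy disk reaches the size of the CoC at roughly f/22 and f/11 respectively.

```python
wavelength_mm = 550e-6

for fmt, coc_mm in (("full frame", 0.030), ("mFT", 0.015)):
    n = coc_mm / (2.44 * wavelength_mm)        # f-number where Airy diameter = CoC
    print(f"{fmt}: Airy disk reaches the CoC at about f/{n:.0f}")
# -> roughly f/22 and f/11, matching the figures in point 2.
```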
 
