Reaching the diffraction limit sooner with higher-megapixel cameras

David_Winston
I am trying to wrap my head around certain things. One of them is diffraction: according to Cambridge in Colour, it depends on the pixel pitch of the sensor, the aperture of the lens and, of course, the final size and viewing distance of the print. Neglecting print size and viewing distance and focusing just on aperture and pixel pitch: will increased megapixels lead to hitting the diffraction limit sooner?

So if you were to jam 60 or 70 megapixels into a full-frame sensor, wouldn't you reach diffraction very soon? I think it was a manager from Fujifilm who said that full frame's limit would be around 60 MP to 100 MP. Was that what he was referring to?

I did some testing with an A7 III and an A7R III, shooting a lens testing chart with the same lens at various apertures (f/5.6 to f/11); f/5.6 is where diffraction starts to kick in. When I downsampled the A7R III image (both with no sharpening), I still found the A7R III to be sharper at all apertures.

Why is that so? Is the gain in resolution outweighing the diffraction? Or is the jump from 24 MP to 42 MP just not that big?

Or should I have tested at smaller apertures (maybe then I would have seen the difference)? I rarely use f/16, so I didn't feel I should test it.

What do you think? Will 70 MP full-frame cameras even be useful?
 
Solution
1. The point at which diffraction-induced blur becomes a factor that limits lens resolution. Stopping a lens down minimises most aberrations but increases the size of the blur disc from diffraction. As a result, most lenses offer more resolution when stopped down to some extent, then less resolution when stopped down further. When a lens is described as 'diffraction limited' from a particular aperture, that means it is sharpest at that aperture. This is in theory independent of sensor pixel count and pixel size, though in practice, as most lens tests are carried out on camera, it may be confused with point 2 below.

2. The point at which diffraction-induced blur becomes an important factor in practice. When diffraction...
That is about the “raw” optical image (the formula relates to the Airy disk size that I referred to); it does not rule out subsequent deconvolution image processing to partially undo the effects of diffraction. Bart van der Wolf has done some writing and demos of this, described briefly at https://openphotographyforums.com/f...tion-of-diffraction-with-deconvolution.12555/
This is a really bad simulation; it is a well-known "crime" in certain circles. He simulates diffraction with a discrete convolution with a very rough 9x9 block and then inverts it. Real diffraction convolves a continuous image with a continuous kernel, which is then sampled, with all the noise and discretization errors associated with that. Even then, you can see that the detail in the blinds on the lower right is mostly lost.
Maybe, but diffraction deconvolution is an established method in microscopy and such, with plenty of peer-reviewed sources.
Well, the one you cited is not one of those. Of course, nowadays you can publish everything somewhere...
I agree that there is a theoretical limit on how much improvement can be made, but it is quite a bit better than is often inferred by ignoring the possibility of deconvolution. IIRC, the actual hard limit on resolution in cycles per mm is 1/(wavelength times F-number), and since you need (at least) two pixels per cycle [Nyquist], that puts a rough limit on "useful" pixel size at about (wavelength times F-number)/2.

For the typical 550 nm wavelength of visible light, that gives a "useful pixel size limit" of about (0.275 times F-number) microns, so for example at f/2, pixels down to 0.55 microns could be "useful". Conversely, the smallest current pixels in 35mm format, 4 microns, are only "hard diffraction limited" at about f/14 with optimal processing (and otherwise optically perfect lenses). Without deconvolution, diffraction effects are noticeable by about f-number = (2 times pixel pitch in microns), so f/8.
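For anyone who wants to plug in numbers, here is a small Python sketch of the rules of thumb above. The 550 nm wavelength and the Nyquist factor of two are from the post; the example pitches are assumptions, and no Bayer CFA or noise is modelled:

```python
# Rough diffraction rules of thumb from the post above.
WAVELENGTH_UM = 0.55  # 550 nm green light, in microns

def min_useful_pixel_um(f_number):
    """Smallest 'useful' pixel size: (wavelength * F-number) / 2."""
    return WAVELENGTH_UM * f_number / 2

def f_stop_where_noticeable(pitch_um):
    """Without deconvolution: diffraction noticeable by about
    f-number = 2 * pixel pitch in microns."""
    return 2 * pitch_um

def f_stop_hard_limit(pitch_um):
    """With optimal processing: a pitch is 'hard diffraction limited'
    when pitch = (wavelength * F-number) / 2."""
    return 2 * pitch_um / WAVELENGTH_UM

for pitch in (6.0, 4.0, 2.0):
    print(f"{pitch} um pixels: noticeable ~f/{f_stop_where_noticeable(pitch):.0f}, "
          f"hard limit ~f/{f_stop_hard_limit(pitch):.1f}")
print(f"f/2 'useful' pixel floor: {min_useful_pixel_um(2):.2f} um")
```

For 4 micron pixels this prints roughly f/8 and f/14.5, matching the figures in the post.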
I did not check those numbers but you seem to ignore the Bayer array. Also, there is noise, etc. Finally, some healthy amount of oversampling is always desirable.
 
Canon’s DPP can correct diffraction in dual pixel raw files.
No, it can't.

It's a marketing claim. Diffraction is not correctable.
Could be. I haven’t seen any tests or articles on it, but it’s in the manual and has an adjustment in DPP.

Since there are two “sensors” (for lack of a better word) for each pixel, and each is a different size, it might see the diffraction differently on each and have the information to correct it that way.

I’ll ask Canon and maybe test it some day.

You would think that DPR would want to debunk the claim, if it wasn’t true.
You really believe that?
After reading the rest of this thread, I have to say yes, within limits.
 
After reading the rest of this thread, I have to say yes, within limits.
I do not.
 
I just use Bart van der Wolf's because it had illustrations! The peer-reviewed stuff on legitimate medical research sites is a tougher read, as is anything on PSFs and 2D Fourier transforms.

The Bayer CFA certainly complicates things, and is both good and bad. The bad is only having data for a given color channel at some locations. The ironic good is that the resolution is lower than “two pixel widths per cycle”, so pixels have to get smaller before diffraction is the main limit. One naive guideline is that it could be a useful moiré mitigation to have the Airy disk cover a full quartet of photosites (a rough calculation follows the diagram below):

G R
B G
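To put a rough number on that quartet guideline, here is a quick Python calculation, assuming 550 nm light and the usual Airy first-zero diameter of 2.44 × wavelength × F-number; the 4 micron pitch is just an example value:

```python
# When does the Airy disk span a 2x2 Bayer quartet (two pixel pitches)?
# Solve 2.44 * wavelength * N = 2 * pitch for the f-number N.
WAVELENGTH_UM = 0.55  # assumed 550 nm light

def f_stop_covering_quartet(pitch_um):
    return 2 * pitch_um / (2.44 * WAVELENGTH_UM)

print(f"4 um pixels: Airy disk spans the quartet by ~f/{f_stop_covering_quartet(4.0):.1f}")
# -> roughly f/6
```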
 
I do not.
I hear some compelling technical arguments that it is true. Diffraction is not random.

Since it is not really advertised, but just in the technical material, I suspect that it actually works, but only for a small amount of diffraction, such as a stop or half a stop past the theoretical point where diffraction occurs.

If it were big, they would advertise it. If it were just marketing hype, what good does it do if they don't advertise it?

I would love to read something authoritative on the subject. Most of us are just hacks here, engaging in speculation.
 
Since it is not really advertised, but just in the technical material, I suspect that it actually works, but only for a small amount of diffraction, such as a stop or half a stop past the theoretical point where diffraction occurs.
When does it occur?
If it were big, they would advertise it. If it were just marketing hype, what good does it do if they don't advertise it?
It "works" partly, as it should.
I would love to read something authoritative on the subject. Most of us are just hacks here, engaging in speculation.
I am not a hack.
 
I am not a hack.
Cool. Are you an optical engineer or a physicist?

I’m a hack. I’m just a real estate guy.
 
Cool. Are you an optical engineer or a physicist?
A mathematician.
 
Or should I have tested at smaller apertures (maybe then I would have seen the difference)?
Yes, test at smaller apertures: photograph scenes you normally photograph and note the differences.

Regarding smaller apertures: some years ago I tested with this scene to see how much I could stop down for DOF and still have good resolution in the foliage in the foreground. f/14 worked in this case.

[two images of the test scene]

- Richard

--
Careful photographers run their own tests. — Fred Picker
 
Yes, test at smaller apertures: photograph scenes you normally photograph and note the differences.

Regarding smaller apertures: some years ago I tested with this scene to see how much I could stop down for DOF and still have good resolution in the foliage in the foreground. f/14 worked in this case.
Yes, knowing your own gear is paramount.

I tested one of my own lenses on my particular camera and found I would still be quite happy enough using it at f/16 if I particularly wanted a deeper depth of field or slower shutter speed...

(100% crops at f/5.6, f/8, f/11, f/16 and f/22)
 
Most lens aberrations improve as you stop down; diffraction gets worse.
As shown in the Canon EOS R white paper:

[Generic curve showing the two boundaries to resolution in all lenses]
 
A mathematician.
Your opinion has more value than mine, then.
 
Diffraction is the result of a physical process at the edge of your aperture. Depending on the wavelength, light is diffracted to a certain, fixed extent at the aperture.

The size of your aperture will have an influence on the resulting diffraction.

The size of the diffraction blur will be the same no matter which sensor you have in your camera.

But if you have higher resolution, your sensor may resolve a diffraction effect which would not be visible on a sensor with lower pixel density and less resolution.

For this reason, the higher the pixel density of a sensor, the more sensitive the camera will be to displaying diffraction effects.

The good thing regarding diffraction is that it can be corrected relatively easily. You need a program that offers deconvolution sharpening to reduce diffraction effects. Some cameras have diffraction correction as a feature in JPG processing. For Pentax cameras, I can tell you that it works with high efficiency.
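For anyone curious what deconvolution sharpening looks like in practice, here is a minimal sketch using scikit-image's Richardson-Lucy routine. The Gaussian stand-in for the diffraction PSF (a true Airy pattern would be more faithful), its size and sigma, and the iteration count are all illustrative assumptions, not a model of any particular camera:

```python
# Blur a test image with a diffraction-like PSF, then deconvolve it.
import numpy as np
from scipy.signal import convolve2d
from skimage import data, restoration

def gaussian_psf(size=9, sigma=1.5):
    """Normalised Gaussian kernel as a crude stand-in for a diffraction PSF."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return psf / psf.sum()

image = data.camera() / 255.0                     # test image scaled to [0, 1]
psf = gaussian_psf()
blurred = convolve2d(image, psf, mode="same", boundary="symm")
restored = restoration.richardson_lucy(blurred, psf, 30)  # 30 iterations
```

With a known PSF the restored image recovers a fair amount of the lost contrast, which is the same principle the in-camera corrections rely on.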

Best regards

Holger
 
Will increased megapixels lead to hitting the diffraction limit sooner?
What has happened is that you have been misled by Cambridge in Colour (sorry, no award, you're not the first; there have been thousands), then done your own experiments, which prove that site's diffraction article to be the nonsense it is (and a good few of its other articles too; it's a site to beware of). Your cognitive dissonance is that you think it's authoritative and can't quite believe your own tests. They gave the right result. Here's why.

Where CIC goes wrong is that they think diffraction blur is somehow masked by pixelisation: if it's smaller than twice the pixel size, it somehow disappears. This would happen if every Airy disc were neatly aligned with the pixel grid, so there was never sampling across boundaries, but in real life that doesn't happen. To analyse what goes on when different sources of blur combine (in this case the diffraction blur and pixelisation, plus the aberration blur from the lens), you need a mathematical operation called convolution, 'running' one function over another. Look at it this way and the results show exactly what your experiments showed: higher pixel count gives higher resolution even when diffraction limited, though the further into diffraction (or, for that matter, aberration blur) you get, the more it's a case of diminishing returns.
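To make the convolution point concrete, here is a small 1-D sketch: a fine sinusoidal detail is blurred by a diffraction-like kernel, then averaged over box "pixels" of two different pitches. All the sizes are made-up illustration values; the point is only that the finer sampling retains more contrast even though the diffraction blur is identical in both cases:

```python
import numpy as np

STEP_UM = 0.1
x = np.arange(12000) * STEP_UM                    # 1.2 mm on a fine grid
signal = 0.5 + 0.5 * np.sin(2 * np.pi * x / 16)   # detail with a 16 um period

gx = np.arange(-50, 51) * STEP_UM                 # diffraction-like blur, sigma = 2 um
kernel = np.exp(-gx**2 / (2 * 2.0**2))
kernel /= kernel.sum()
blurred = np.convolve(signal, kernel, mode="same")[100:-100]  # trim edge effects

def contrast(pitch_um):
    """Average over box 'pixels' of the given pitch; return Michelson contrast."""
    n = int(pitch_um / STEP_UM)
    px = blurred[: (len(blurred) // n) * n].reshape(-1, n).mean(axis=1)
    return (px.max() - px.min()) / (px.max() + px.min())

print(f"6 um pixels: contrast {contrast(6.0):.3f}")
print(f"3 um pixels: contrast {contrast(3.0):.3f}")  # finer pitch keeps more detail
```

The diffraction blur never "disappears" at the coarser pitch; the coarser pixels simply throw away some of what survives it.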
 
Will increased megapixels lead to hitting the diffraction limit sooner?
Here is a nice write-up on lens diffraction that is worth reading:

 
I tested one of my own lenses on my particular camera and found I would still be quite happy enough using it at f/16 if I particularly wanted a deeper depth of field or slower shutter speed...
Congratulations! I haven't seen many on these forums who test this way.

- Richard
 
Most lens aberrations improve as you stop down; diffraction gets worse.
Those are isolated curves, of course. The combined curve is not simply the lower of the two at each position; rather, it falls short of that peak where the two cross. Blurs combine in quadrature, just like noise. There is no thresholding effect.
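As a numeric illustration of that quadrature point, here is a toy model: an aberration blur that shrinks as you stop down, combined with a diffraction blur that grows. The aberration term (k/N, with k = 40) is a made-up stand-in, not measured lens data:

```python
import math

WAVELENGTH_UM = 0.55  # assumed 550 nm light

def total_blur_um(n, k=40.0):
    aberration = k / n                          # improves on stopping down
    diffraction = 1.22 * WAVELENGTH_UM * n      # Airy radius, grows with N
    return math.hypot(aberration, diffraction)  # combine in quadrature

stops = [2, 2.8, 4, 5.6, 8, 11, 16, 22]
blur, best = min((total_blur_um(n), n) for n in stops)
print(f"sharpest near f/{best} (combined blur {blur:.1f} um)")  # -> f/8 here
```

Note how the minimum sits near where the two terms are equal, and the combined blur there is larger than either term alone, i.e. below the crossing-point peak.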

--
John
 
Right: under any given viewing conditions (displayed image size/viewing distance), a higher-resolution (higher pixel count) sensor at the same f-stop will give a somewhat sharper image, not a worse one. That is, "line pairs per picture height" improves at least somewhat. The image may not be as sharp as with the same sensor used at a lower f-stop, but increasing sensor resolution at a given f-stop never makes overall resolution worse.

At worst, the resolution gains from increasing pixel count become smaller and smaller once the pixel size in microns is significantly smaller than the aperture ratio.

Since increasing sensor resolution is easier/cheaper than increasing lens resolution, it makes sense to me to have sensors that out-resolve one's best lenses somewhat, so as to get the best out of all one's lenses.
The problem with sensors out-resolving lenses is that you get MTFs that look like this:

[MTF chart: performance falling off towards the edges of the frame]

rather than this:

[MTF chart: more even performance across the frame]
Which could kind of limit your composition options if you position subjects towards one side of the frame. You could have a situation where one side of someone's face is visibly softer than the other despite it all being in focus. Probably not that big a deal, but definitely a side effect with certain lenses. And with high-res bodies getting so cheap (I wouldn't doubt my A7R2 dipping under $1K by Christmas), it's not unlikely that people will pair them with cheaper glass.

--
Sometimes I take pictures with my gear- https://www.flickr.com/photos/41601371@N00/
 
