You've no doubt heard warnings about the effects of diffraction at small apertures. But what exactly is it? Our resident mad scientist, Don Komarechka, explains what diffraction is and how it impacts your photos.
Don - Thanks for the excellent video. The obvious point of it was to educate concerning diffraction. Somehow that offends people because they get acceptably sharp photos using smaller apertures. What is "acceptable" is, of course, highly subjective, and depending on your type of photography it may not be that relevant. While the macro example is used to show the extreme case, those of us who are mostly landscape shooters will see the detriment of diffraction in certain photos. Many photographers learned that depth of field required f16 or above, when very often f11 would give perfectly suitable depth of field without focus stacking. As others have said below, you can't get back missing detail with sharpening. If the type of photography you do depends on ultimate sharpness, then be aware of this. If it doesn't, do as you wish. But diffraction is a scientific fact. If you live in a fact-free world, believe what you want.
Yes, those are wonderful images that, at a very small size, look acceptably sharp. But how would they look as a larger print? On the surface it seems to me that looking at these small images is like looking at the display on your camera, where everything looks acceptably sharp, and then you go into your image-editing software of choice and see that it's not so sharp. Wondering what your experience is with this?
I make poster size prints (60cmx90cm) on canvas (the ultimate matte paper) and get lots of compliments about the detail in them. I do not allow myself to crop in post, so I can send all of the pixels my camera captures to a printer. Am I losing some detail to diffraction? Sure, but you'll only notice when pixel peeping and no one prints 100% crops or saves them to their desktop as wallpaper...
Apparent diffraction is also relative to the size of the sensor/film because of the need to magnify the image more - or less - for a given final image size. The old 8x10 lenses could stop down past f22 easily because the final print would usually be near the size of the negative. Guys like Weegee would just stop down massively, range focus, and fire off flash, assured everything would look sharp because the end result would be a contact print.
For me I set a limit of about f16 on FF, but usually call it done by f11 - if I was shooting APS-C, I'd cut that by a stop, i.e. not go past f8.
F11 on FF is suffering some diffraction, but because the corners and edges even up, once again, apparent sharpness is good. If it is straight repro work, I'd never go beyond f8 on FF.
I work with no AA filter and very high-resolving lenses, so YMMV.
”You know, I have one simple request. And that is to have sharks with frickin' laser beams attached to their heads!” Thanks for fulfilling it! Enjoyable video as always, this macro shot comparison shows diffraction effects very well.
I don't believe anything he says. Oh wait, he's wearing a white coat. I believe it all! haha nice vid. I didn't know what it was, but I knew not to shoot much smaller than f8 due to it.
Nice video. It would be great to have some guidelines for different camera, sensor size & resolution, and lens combinations (to avoid diffraction). E.g. on a 12MP FF sensor can f16 be used OK? But on a 45MP FF sensor is f11 the limit?
Because of all the various sensor sizes, apertures, factors such as pupil ratio etc. on the lens to consider, it's best to plug in the details of your specific equipment rather than try to cover every possible camera and lens scenario in a video.
It's sensor-size and wavelength dependent. So what you think is an acceptable value for visible light centered around 550nm will not be acceptable if you photograph in IR at 850nm. UV photography, on the other hand, gets a boost: you can stop down one or two stops further while avoiding diffraction issues.
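To put rough numbers on that wavelength dependence, here is a quick sketch using the standard Airy-disk approximation (first-minimum diameter ≈ 2.44 · λ · N); the f/11 value is just an example:

```python
# Airy disk first-minimum diameter on the sensor, in micrometers.
def airy_disk_diameter_um(f_number, wavelength_nm):
    return 2.44 * (wavelength_nm / 1000.0) * f_number

for wavelength_nm, label in [(380, "near UV"), (550, "visible green"), (850, "near IR")]:
    d = airy_disk_diameter_um(11, wavelength_nm)
    print(f"f/11 at {wavelength_nm} nm ({label}): Airy disk ~ {d:.1f} um")
# 850 nm gives a disk roughly 1.5x larger than 550 nm, and 380 nm roughly 0.7x,
# i.e. roughly one stop of headroom either way (the diameter scales with N,
# and one stop changes N by sqrt(2)).
```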
Not sure about that laser illustration. Even fast f2 telescopes show an Airy disc on stars, and it is a common optical phenomenon used to collimate them.
In other words the so-called science does not add up. I am calling bull crap on the entire video. For years I have done, and seen many other astronomers use, telescopes at f30, and some of the best planetary photos that I have seen are taken at f50. Also, after reading this https://jonrista.com/2013/03/24/the-diffraction-myth/ I am inclined to think that this video is just paid for by someone pushing products.
The size of the aperture is related to the focal length and the f-stop. The aperture is FL/f-stop. So a 60mm lens at f19 has an aperture diameter of 3.2mm. A telescope of, say, FL=1200mm at f50 would have an aperture of 24mm, which is more like the 60mm lens at f2.4 (25mm). So for long focal length lenses / telescopes you can use higher f-numbers before seeing the effects of diffraction.
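A minimal sketch of that arithmetic (just the relation stated above, diameter = focal length / f-number):

```python
# Physical aperture (entrance pupil) diameter from focal length and f-number.
def aperture_diameter_mm(focal_length_mm, f_number):
    return focal_length_mm / f_number

print(aperture_diameter_mm(60, 19))    # ~3.2 mm: 60mm lens at f/19
print(aperture_diameter_mm(1200, 50))  # 24 mm:   1200mm telescope at f/50
print(aperture_diameter_mm(60, 2.4))   # 25 mm:   60mm lens at f/2.4
```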
@Grimstod, interesting that you say that the "science doesn't add up," when the article you cite says the following: "When it comes to diffraction at small apertures like f/16 or f/22, it is true that your IQ drops at those levels. Diffraction puts an upper limit on the resolving power of a lens, and each narrower aperture lowers that limit. Diffraction, however, blurs an image in a very even, very predictable way. That means softening caused by diffraction can fairly easily be corrected with some sharpening while post-processing." This fellow believes sharpening solves it, not that the science isn't accurate. Test it yourself. Do a series of photos with high-frequency information (i.e. blades of grass, needles on a fir tree, etc.), run through various apertures, then look at them magnified (what a large print might look like), and then try to sharpen the softer images and see what you get. I have tried this and it is beyond reasonable question that diffraction is real.
SUBSTANTIALLY MISLEADING - because it deals only with a single lens aberration.
There are many other aberrations - that usually prevent the diffraction limit at a wide aperture (yes, there is a diffraction limit even at f2) from being reached.
It is not uncommon to encounter a lens where other lens aberrations reduce image resolution at f2 or f2.8 to below that at f16.
Providing a lens has enough resolution for the intended purpose at a particular aperture - either wide or small - it does the job.
Love the lab coat. Gives the video a spurious impression of scientific authority/ respectability even though not shot in a lab. Maybe that's his normal dress.
I want to thank you again for the wonderful video and stirring the dark matter inside our heads. Probably my favorite video I have seen here in a long time. I can't tell you how many times I have taught workshops and had a student begin to create a landscape image where basically the whole scene was going to be at infinity and I would always prompt them with the question "what aperture are you going to use and why"? They usually reply with f11 or f16 because they want to maximize their depth of field. I too fell into this trap on my first trip many many years ago standing at the edge of the Grand Canyon and I couldn't understand why all my Kodachrome 25 images shot at f/22 were quite soft. It wasn't until I started to grasp the idea of diffraction and depth of field that I understood my problem. Thanks again.
Glad you enjoyed the video! There is a matter of "final intent" as well. Making an image for Instagram or social media in general? You can get away with a lot when critical resolution isn't paramount. Landscape photography also differs from macro in the sense that the smaller effective aperture is a more concerning factor. F/22 at extreme magnification is much more problematic than F/22 at infinity focus.
I suppose the proper answer exists with the weaving of art and science together into a fabric that becomes the final image. :)
First of all, deconvolution software in Lightroom and Photoshop has made diffraction much less of a concern so his dire portrayal of the phenomenon is misguided and harmful to beginners who might not know the other side of the story. Second, he inserts his own bias without explaining that there is an equally valid point of view. Discussing the tradeoff between diffraction and depth of field he says "it's not a compromise I'm willing to make" without discussing the opposite point of view. Finally, he doesn't mention the very simple solution to small apertures—more light. Photographers do not accept the light as it is if it doesn't suit their purpose. They use reflectors, strobes, and continuous light to provide the ƒstop they need to create the image they have in mind.
The problem with deconvolution software, as good as it may be, is that the starting point is already corrupted; sure, you can make things look better, but you can't recover the detail as it is in the original. I think beginners should try anything once regardless of what others say. His bias is perfectly valid as it is his video; if you think you could do better, then do so! Finally, more light will not improve diffraction, it will only help you get more noticeable diffraction should you wish to venture that way.
@MrBrightSide: “deconvolution … has made diffraction much less of a concern”
Correct — if you replace "much less" by "a bit". Diffraction is two-pronged: at some (spatial) frequencies (or: "at some level of detail"), it just weakens the signal. This decreases the S/N ratio — which cannot be "undone" — but if your noise was very small, this may not matter.
However, at high enough spatial frequencies (fine enough detail), the CUT-OFF IS ABSOLUTE. No signal passes at all. No matter how small your noise is, there is NOTHING TO RESTORE by deconvolution.
If you understand these two issues (the S/N ratio and the absolute cut-off), then indeed deconvolution may give a significant visual improvement (I would say, make things “about 2 times sharper”).
@stevevelvia50: interesting that the "diffraction myth" article doesn't mention magnification at all. The posted video primarily deals with diffraction at much higher (effective) f-numbers than the f22 or f32 that are the largest the "myth article" knows about.
Use an f22 aperture at 10x magnification in the real world (effective aperture around f242) and see how happy you are with sharpening algorithms that make up details which were never captured in the first place.
My concern is details of tiny insects. Deconvolution software can change dull pictures into crispy ones, but it cannot bring to light details that have been lost by using too small an aperture. Test your camera and lens combination and conclude what works for you.
Nicely explained. Whenever photo stacking is not an option in macro (I am chasing a bee handheld...) at about half life size 1:2 or 1:1, I just close the aperture to f16 or f22 and flash. I know I get diffraction, but there is a chance that the eyes of the bee are in focus and, due to DOF, more of the eye as well. In non-lab circumstances one just has to compromise so the final image looks best (viewed as a photo one may enjoy looking at rather than a set of pixels).
Ok, I need a camera with built-in focus-stacking coupled with a TOF sensor (or better yet, LIDAR).
I've tried Panasonic's in-camera focus-stacking, but it leaves a lot to be desired. The clunky interface takes a while to implement, and I get better results just using small aperture and associated diffraction. They really need to work on this.
Olympus does it best, since the EM5II came out. My old EM1.1s do it very well after a firmware upgrade and the latest EM1s do it extremely well handheld.
I took a handheld 8-shot focus-stacked image of a massive pelican (trying to keep warm, so it was very static) with my EM1x and 300mmF4 at around f6.3. No single shot at any aperture would get the entire beast in focus. However, this did the trick.
I don't know which Panasonic camera you tried for auto focus bracketing, but even my old GX8, which only uses highly compressed 4K video, can turn out really impressive results with good stacking software. So the G9, which shoots much improved 6K, should be capable of turning out stunning results! But I moved to the Olympus EM1 mk2 (before the G9 was launched) and it shoots full-resolution RAW & JPEG brackets with as fine or as coarse a step as required, and you can program it for up to 999 shots. Results are amazing. What's not to like :-)
@Adrian Harris, I am talking about in-camera focus-stacking, not focus-bracketing. I have focus-stacking in my G85, but it is clunky, takes many menu presses, and I'm less than happy with the results. I don't think it would take much on Panasonic's part to 1) make this a one-touch operation/function, and 2) clean up the results so that they are on par with output from Helicon Focus, Zerene, etc. This becomes especially so with 8K captures.
This very sensible explanation will never gain traction here because it contradicts the dominant narrative that your pictures will be unbearably blurry if you use any aperture smaller than ƒ8 and that you must only shoot at ƒ1.8 and wider so that you can luxuriate in the glow of "total light."
Thanks for the great video Don! It seems from things that I've read that the diffraction starts becoming an issue in regular photography at f-stops that vary due to sensor size. From what I've read, you start seeing it beyond f5.6 on M43, f8 on APS-C, and f11 on FF. Does that sound right?
Everybody - Don has a very interesting podcast, "Photo Geek Weekly". Check it out!
You're correct in a sense! However, if you go one step further, it's more about the density of the pixels on the sensor (they're actually called "photosites"; pixels are what happens when the RAW red/blue/green individual photosites get demosaiced into "pixels" that each have RGB values).
A 20MP MFT sensor would have a similar density to a 40MP full frame camera (2x crop factor). The same would be true when comparing a 24MP APS-C camera to a 36MP full frame camera. The diffraction is noted in a 100% view of the pixels equally. However, since you have fewer pixels on a smaller sensor, you're more likely to view the image at 100% and notice this.
It's all about the physical "light collection area" compared to the size at which it is then viewed.
Don, I think you made an important mistake. When comparing across sensors you naturally want to compare at equivalent FL, and the 1/2x f-number of M43 (or 2/3x for APS-C) comes from the same physical aperture, meaning different f-numbers for different FLs. If you view different sensors as crops from the same lens, then an 80MP FF sensor faces the same amount of diffraction as a 20MP M43 sensor, i.e. both start to see diffraction at f5.6 and you cannot get away with closing down further on FF.
If you look at the camera specifications, the Olympus E-M5 II is the same pixel density as the 26MP Fuji X-T3, the 61MP Sony a7R IV, and the 100MP Fuji GFX100.
Don already pointed out the most important thing: "since you have fewer pixels on a smaller sensor, you're more likely to view the image at 100% and notice this." It's an intrinsic "problem" of M43 and probably the reason why we usually don't see good sunstars (which would need f11) from M43 lenses.
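For anyone who wants to check the density argument with actual numbers, here is a rough pixel-pitch calculation; the sensor widths and aspect ratios are nominal, not exact specifications:

```python
import math

# Approximate pixel pitch (um) from sensor width, megapixel count and aspect ratio.
def pixel_pitch_um(sensor_width_mm, megapixels, aspect=(3, 2)):
    horizontal_pixels = math.sqrt(megapixels * 1e6 * aspect[0] / aspect[1])
    return sensor_width_mm * 1000.0 / horizontal_pixels

print(round(pixel_pitch_um(17.3, 20, (4, 3)), 2))  # 20MP Four Thirds -> ~3.4 um
print(round(pixel_pitch_um(36.0, 45, (3, 2)), 2))  # 45MP full frame  -> ~4.4 um
print(round(pixel_pitch_um(36.0, 80, (3, 2)), 2))  # 80MP full frame  -> ~3.3 um
# A 20MP Four Thirds sensor roughly matches the pitch of an ~80MP full-frame
# sensor, which is the correction made a couple of comments up.
```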
If you have a fast lens, like an f2.0 that you stop down to f16, you will get a lot of diffraction, because all the light coming through the big front lens has to bend to pass the small aperture opening.
Then will the diffraction be smaller if you started off with a slower f5.6 lens stopped down to f16, as the light coming through the smaller front lens would have to bend less to get through the same opening?
And if that is the case, could you not just use the fast f2 lens, but put a lens cap on it with a small hole, like in the video, to get around the bending of the light at smaller apertures, as it would then just be a matter of exposure?
If you stop an F/2 lens down to F/8, and you do the same with a lens that has an aperture of F/4 wide open, you end up with roughly the same light hitting the sensor. (note that it would be identical if we were talking about T stops, not F stops)
Yes, the exposure will be the same, but you will still get more diffraction from the F2 lens, wouldn't you? The front lens is bigger, so it lets in more light that needs to be bent around the aperture blades. So smaller front opening, less light to be bent = less diffraction. Is that correct?
HjVN If your F2 lens and F5.6 lens have the same focal length (say 50mm in this case), then both at F16 will have an identical diameter hole in the iris. The hole in the iris is calculated by dividing the focal length by the F number. Your F2 50mm lens wide open will have an iris diameter of 25mm (50 divided by 2 = 25mm); your F5.6 50mm lens wide open will have an iris diameter of 8.9mm. Stopped down to F16, both the 50mm F2 lens and the 50mm F5.6 lens will have an iris diameter of 3.1mm.
Okay davev8. If you take your 50mm lens stopped down to f16, it will have an iris opening of 3.1mm. Now if you put a lens cap on it and make a little pinhole in the cap of 2mm, smaller than the iris opening, will you then still get diffraction at the iris?
I imagine a pinhole that small will completely upset the optical formula of the lens; it may work like another element in front of the lens. But the mathematical formula for calculating diffraction (I am not an expert on algebra, I read the accompanying text) says it's the ratio of the size of the light wave and the hole the light is going through that determines the amount of diffraction. I am guessing that since the pinhole will not change the size of the light wave, and the 3.1mm iris is unaltered in size, the diffraction formula will still give the same answer. As the pinhole is smaller, if its calculation is made it may have more diffraction and add to it. An afterthought: since waves can cancel each other out if they fall wrong, if the light went through 2 holes one after the other at a set distance apart - though the spacing may be very critical, like to a fraction of a light wavelength - maybe it will un-diffract the light. But what do I know, I only went to school sometimes.
You can replace the internal iris with an equivalent circular aperture stop just in front of your lens! For a 50mm lens and f/2 you will need a simple 25mm diameter hole. The diffraction effect will also be exactly the same. No light "is bent" differently.
@Franz: you forgot to add … as far as diffraction is concerned.
Moving the aperture to a different position inside the lens would change a lot of OTHER properties of the lens. However, the diffraction would not change indeed!
[In the pedantic mode:] Moreover, your argument works only when the focusing distance is “large enough”. Otherwise when you move the aperture closer to the object, you need to make the aperture smaller (to preserve the cone I discussed — referenced above).
I have a question and a request. First, how is pixel shift affected? You said larger pixels will be affected less, but can we overcome part of diffraction by using pixel shift with a 20MP camera to create an 80MP image vs using an 80MP sensor? (I know smartphones are relying on stacking for many things these days.)
And is there a chart of how much diffraction affects sharpness as the aperture gets smaller? Some people act like diffraction is almost an all-or-nothing thing and any diffraction is bad or cannot be overcome.
This is a VERY interesting point. Pixel-shift high resolution modes effectively make the pixel size smaller and reveal more of the underlying diffraction than you would have otherwise seen.
I use pixel-shift with macro photography quite regularly, because I get farther away from my subject and crop in. The results are great, but only if I loosen up on my aperture, which negates the benefit of the added depth of field from being farther away. I have taken some images in the high resolution mode that suffer from diffraction when shot with the Lumix 24-105 F/4 lens @ F/13, noticed when cropped down. Lesson learned - I usually limit the minimum aperture size to F/5.6 or F/8. Diffraction would still likely play a role in limiting the total resolution, but much less than what is gained by quadrupling the resolution.
There are a lot of moving parts to figure this all out. Because of that, it's hard to create a chart with exact values for every scenario. There are calculators, though!
Diffraction has nothing to do with, and is not caused by, your physical pixels; it depends ONLY on the focal ratio of your lens. The sampling of the diffraction pattern (the point spread function) is dominated by your pixels (be they real ones or subsampled by e.g. pixel shift).
Edward Weston and other members of the f64 Group often photographed at f64 and even f90 but many forget that Edward Weston made 8x10 prints using his 8x10 negatives. You can shoot all day long with your digital camera and not worry about diffraction if you are willing to make images only as large as your sensor ;-)
Excellent point! If you had an enormous sensor, or a giant piece of film, and you made a comparatively sized print, you'd be fine. No issue! Try to do the same with a sensor 46mm across with 60MP worth of resolution, and diffraction will slap you in the face. :)
Diffraction is correlated to DoF. If you aim for the same DoF you will have the same amount of diffraction regardless of format. If you achieve that by a large aperture number on a large format or a small aperture number on a small format doesn't matter
While there is a correlation, it's not directly causation. There are a lot of different elements at play. Depth of Field is affected by a number of factors, including focal length, aperture, and distance from your subject.
@Don Komarechka: “While there is a correlation, it's not directly causation.”
Panther fan is absolutely right. Diffraction is a very simple issue (provided you look at it squinting in a particular way), and there is a very simple relation between it and defocusing: A PARTICLE-WAVE DUALISM or DIFFRACTION AND DEFOCUSING ARE HEISENBERG-DUAL. (Mathematically, one would say that they are FOURIER-dual.)
Here two Heisenberg-dual “values” are related to each other by the Heisenberg indeterminacy principle. In particular, when you try to “squeeze” indeterminacy of one (by restricting the light rays by an aperture), you would “widen” the indeterminacy of the other (the diffraction blur).
In particular: if you know EXACTLY how bokeh grows when you defocus, you can¹⁾ find EXACTLY the shape of the diffraction pattern. (So the relation is quantitative, not qualitative.)
¹⁾ In fact to do this in a simple way (the Fourier transform) one needs to know not only the intensity, but also the phase of light in “the bokeh pattern”. But again, mathematically speaking, one determines the other.
For example, a bokeh with a particular shape of corners would cause the particular sun-star pattern (and the smaller the bokeh circle, the larger the sun-star pattern).
⁜⁜⁜⁜⁜⁜⁜⁜⁜⁜⁜⁜⁜⁜⁜⁜⁜ HOW TO SQUINT
I promised that there is a very simple way to treat diffraction:
LOOK AT DIFFRACTION IN THE OBJECT SPACE
For example, if you want to control the diffraction blur of an eyelash, do not think about its image on the sensor. Think about THE OBJECT ITSELF. Just say something like: I want the eyelash to be blurred by ¼ of its thickness (say, by 0.05mm).
The answer: MEASURED THIS WAY, the diffraction blur is COMPLETELY DETERMINED by the cone of light rays from the source which can reach the sensor.
(Since 0.05mm≈100 wavelengths, the cone should have the angle of 1/100 radians — or the entry pupil = 0.01 object distance.)
So: connect a particular place on the eyelash with the entry pupil of the lens. You get a conic shape. THIS is what causes the diffraction. (And obviously, the same cone determines the defocus/bokeh.)
(This approach works at least when the lens aberrations — except defocusing — are negligible. Moreover, the light loss in the lens should be the same for all rays. Otherwise one should modulate the interior of the cone before taking the Fourier transform. For example, “the onion bokeh” changes the details of the shape of the diffraction blur.)
CONCLUSION: the details of the lens/the sensor do not matter (as far as aberrations do not matter). Only the size and the position of “the real entry pupil” matter. This “use the cone” recipe works no matter what is the object distance; it works for objects near the edge of a fisheye lens. Etc.
This immediately answers at least 3 questions asked in the comments: • What happens if one adds “another diagram” (a lens cap with a small hole). • Why one can use ƒ/96 with 8×10 cameras. • How diffraction changes in macro when one changes the focusing distance.
… — as well as many other questions which are more or less impossible to answer other ways. Like what is the diffraction near the edge of a fisheye lens. (Answer: the pupil, as visible from the object, is foreshortened, so it looks not like ∅5mm, but like an ellipse, say 0.5mm×5mm — horizontally and vertically. Hence the diffraction spot "on the object" is going to be extended 10x in the horizontal direction. In particular, the only thing you NEED to know about the lens is how the entry pupil is foreshortened when you move away from the axis.)
Any other PARTICULAR questions about diffraction? ALL of them are easy to answer using this approach. (This is what is called Fourier optic — but I “moved” the “measurements” sensor→object.)
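If it helps, here is a small numeric sketch of that object-space recipe; the eyelash blur and the 300mm working distance are assumptions chosen only for illustration:

```python
# Blur at the object ~ wavelength / cone_angle, where
# cone_angle = entrance_pupil_diameter / object_distance.
def entrance_pupil_for_blur_mm(blur_mm, object_distance_mm, wavelength_mm=0.00055):
    cone_angle_rad = wavelength_mm / blur_mm      # ~1/100 rad for a 0.05 mm blur
    return object_distance_mm * cone_angle_rad    # required entrance pupil, mm

# A 0.05 mm blur on an eyelash 300 mm away needs an entrance pupil of only ~3.3 mm:
print(entrance_pupil_for_blur_mm(0.05, 300))
```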
The comment about pixel size was a bit unfortunate, as pixel size doesn't affect diffraction at all. It merely samples the already-affected-by-diffraction image at a higher frequency. The phenomenon of diffraction is independent of the resolution of the sensor.
You're absolutely correct, but diffraction would not be visible if the pixels are larger. By the same virtue, an image taken on larger format film at smaller apertures will show less evidence of diffraction on a print because the physical area of the film is larger. Higher resolution cameras suffer from the effects of diffraction more than lower resolution cameras. Why buy a 50MP camera when your use case will only give you 20MP of actual resolution? These are the limits we play against as photographers. :)
If you take a 36mm x 24mm sensor image and compare it to a 4" x 5" film image, both printed at 8" x 10", you would notice diffraction much more readily on the digital image.
Edit: I should state that this is true of very small apertures that would readily present the effects of diffraction on the smaller sensor that would be less evident on the larger film format.
Don, even if the Airy disk is larger it helps to have more photosites, as each can only sample a single color channel. Would you rather know some point of detail landed on an RGB "Red" big pixel, and not know if the point had some blue or green in it, or have 4 smaller points there, and know and confirm it had 10% blue, 60% green and 30% red? Now you know the color of the point. Of course, in low light you have noise, but following the same example, the one with 4x the pixels was able to get 1/2 * 60% + 1/4 * 10% + 1/4 * 30% vs 1 * 30%. Which one got more total light and thus less shot noise, in addition to better guessing which color this was? So an RGB sensor with 4x the density makes total sense. Now also the Airy disc is circular. Higher density will map the real center better, aiding deconvolution. But of course, if possible and needed, one can go to 645 or larger sensor formats.
Pardon my ignorance BUT: How does diffraction impact 4x5 cameras with exposures in the f22 to f32 range that achieve tack sharp images? In my film days, at around f45, I first saw some softness creep in with a 4x5. What is the issue? Is it the size of the sensor/film? Distance to subject? Maybe a chart showing diffraction occurrence, by f stop, by film/sensor size would have been more helpful. I think most would be disappointed if they saw the real limitations of digital sensors in this regard.
Diffraction has nothing to do with digital vs film. It happens on both mediums.
Also for the same subject, at the same distance and FoV, you can estimate diffraction well by using equivalent apertures. Larger formats allow for slower apertures as they can "handle" larger airy disks
4x5 has a crop factor of ~0.27 So F45 on 4x5 is equivalent to F12 on 35mm, or F8 on APS-C both in terms of DOF as well as in terms of the diffraction limit
There is no format that can offer deeper DoF with "less diffraction". They are exactly correlated to one another. So no format has an advantage here
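Putting numbers on that equivalence (the ~0.27 crop factor for 4x5 is the approximation quoted above):

```python
# Equivalent f-number at the same field of view and subject distance:
# multiply by the ratio of format diagonals (the "crop factor").
def equivalent_f_number(f_number, crop_factor):
    return f_number * crop_factor

print(round(equivalent_f_number(45, 0.27), 1))     # 4x5 at f/45  -> ~f/12 in 35mm terms
print(round(equivalent_f_number(12, 1 / 1.5), 1))  # f/12 full frame -> ~f/8 on APS-C
```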
Simple answer (I'll let someone else go into more detail though): when you enlarge an image from a 4/3rds sensor to, let's say, 16x20, you have to enlarge it a lot more than, say, a medium format image, and a heck of a lot more than a large format image. The diffraction becomes a lot more visible the more you enlarge the final image. I was shooting my 6x9 camera at F22-F29, but my FF camera beyond F11 is questionable, and an M43rds camera beyond F5.6 is questionable. If you aren't enlarging as much or viewing as magnified (closely), you can get away with tighter apertures. That all said, I'm sure someone else can answer this even better, but you should have a general idea/direction now.
Diffraction is due to the physical size of the hole and the wavelength of the electromagnetic radiation (light, microwave, IR, etc). The f/number is a ratio between the lens focal length and the aperture size (the hole).
I think this is a good explanation, but there are more factors with general photography that need to be considered. This goes down the rabbit hole where you have people running around screaming "DLA" without it being a problem in a lot of cases.
For macro, I could see awareness of the aperture and effective aperture being far more important, but for general-purpose photography not as much, because we tend to want to see the image in its entirety.
For more general photography, the Airy disc of diffraction is only noticeable once it becomes larger than the circle of confusion, which is determined by your image magnification and your viewing distance.
You can shoot well beyond the aperture at DLA, if you know how you are displaying the image and in what context.
Cambridge in Colour has an interesting "advanced" diffraction calculator for this.
@rrc1967 DLA is more of an issue than you suggest. I add a quote from your link above:
As a result of the sensor's anti-aliasing filter (and the Rayleigh criterion above), an airy disk can have a diameter of about 2-3 pixels before diffraction limits resolution (assuming an otherwise perfect lens).
However, diffraction will likely have a visual impact prior to reaching this diameter. Diffraction will cause a loss of contrast prior to reaching the Nyquist sampling limit. Note also that many DSLR/MILC do not have an anti-aliasing filter.
The Diffraction Limit Calculator in the link uses a Maximum Circle of Confusion: 32 µm. This may be OK for small prints viewed at normal distances and was a standard in film days. It doesn't suffice in the pixel peeping modern world of 17 element, 1 kg+ normal lenses.
Again, we're not talking pixel peeping. Pixel peepers will never be satisfied, but people who print and shoot landscapes will and can use a much wider aperture range than what DLA suggests.
*pixel peeping* on a monitor creates a false impression of diffraction.
Also, the diffraction limit calculator that you are referring to is the basic one. I clearly stated the advanced calculator.
As an example, I can shoot with a 100MP full frame camera at f/8, display it on a 1m wide print, and not be able to detect diffraction when viewing the image in its entirety.
However, if you stress over artificial constraints of per-pixel diffraction, you are limited to less than f/5.6 on that same camera, thus limiting your creative options.
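A rough way to sanity-check the print-viewing argument: compare the Airy disk to a circle of confusion. The 32 µm CoC is the value quoted above, and the Airy formula is the usual 2.44·λ·N approximation at 550 nm; these are assumptions, not values from the calculator itself:

```python
# Compare the Airy disk diameter to a circle of confusion for whole-image viewing.
def airy_disk_um(f_number, wavelength_nm=550):
    return 2.44 * (wavelength_nm / 1000.0) * f_number

coc_um = 32.0  # classic full-frame CoC for a modest print at normal viewing distance
for n in (5.6, 8, 11, 16, 22):
    d = airy_disk_um(n)
    print(f"f/{n}: Airy disk ~ {d:.0f} um, CoC {coc_um:.0f} um -> "
          f"{'diffraction visible' if d > coc_um else 'not yet visible at this CoC'}")
# By this (lenient) criterion even f/22 stays just under a 32 um CoC; the
# disagreement above is really about whether 32 um is an acceptable standard.
```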
Well-executed video Don. I would work a bit on eliminating the lip-smacking from your elocution, either the habit or in post. This is coming from someone who wrestles with his own speech quirks in post. 😀
Just a note: the effective aperture is only for exposure and not diffraction. Diffraction is related to the physical size of the aperture and its distance to the film/sensor plane, and of course the wavelength.
I'd just like to add... If you change the distance between the aperture and the film/sensor plane by focusing closer or adding extension tubes, the effective aperture and diffraction will change accordingly and together. However, if you leave the aperture in one spot but gain magnification by adding close-up lenses (achromats highly preferred), then the diffraction shouldn't change.
The "effective aperture" is not just the amount of light required, this number also has a direct impact on the way diffraction occurs in the lens. The two factors (amount of light needed and diffraction) are tightly linked when you're dealing with an increase in magnification.
Easy to test this out practically as well - just shoot at 1x-5x @ F/8 and see the quality start dropping as the magnification increases. Diffraction at work. :)
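For reference, a sketch of the simplified relation behind those numbers (ignoring the pupil ratio, which comes up further down the thread as an additional factor):

```python
# Effective aperture grows with magnification: N_eff ~ N_set * (1 + magnification).
def effective_f_number(f_number, magnification):
    return f_number * (1 + magnification)

for m in (1, 2, 3, 4, 5):
    print(f"f/8 set at {m}x  -> effective ~f/{effective_f_number(8, m):.0f}")
print(f"f/22 set at 10x -> effective ~f/{effective_f_number(22, 10):.0f}")  # the f/242 case above
```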
Your test moves the aperture further from the film/sensor plane, you can obtain more magnification in other ways that don't move the aperture further from the sensor plane.
This is true - and in most scenarios the aperture will change position when your magnification increases well beyond 1:1 lifesize - such as adding extension tubes, etc. - it's all part of the process.
I'll be keeping in touch with you by PM's as when the covid lockdowns lift I'll be finishing a few photo-scientific projects you would be interested in. On another note I recall Venus Optics making the claim that wider angle lenses tend to have less diffraction, I suspect that is because the aperture is much closer to the film plane. Now by less I don't know how much less, I've never tried to find that out.
There are SO many factors to consider here. Pupil ratio is an element we didn't discuss in this video which can have a dramatic impact on the calculation of effective aperture and also impacts diffraction - this is also a factor to consider when using close-up filters. A scientific rabbit hole that would have been lost on the majority of the audience of the video. :)
Well explained and illustrated! The 'effective aperture' is not that important for exposure, because the shutter time will be automatically adjusted, but it is very important in macro photography to test your lens and see at which aperture 'diffraction takes over', that is, where resolution becomes less thanks to diffraction. At 5x, as in the video, you can go to f/4 or perhaps f/4.5 without losing resolution. At f/16, as in the video, the effect is enormous, very well shown.
The "effective aperture" is not just the amount of light required, this number also has a direct impact on the way diffraction occurs in the lens. The two factors (amount of light needed and diffraction) are tightly linked when you're dealing with an increase in magnification.
Easy to test this out practically as well - just shoot at 1x-5x @ F/8 and see the quality start dropping as the magnification increases. Diffraction at work. :)
Using the Olympus E-M1 II for focus stacking, I have to say that the results are nice; they are even better with Helicon Focus. But it is a good thing to have it in camera, so I have a preview of the finished stack. Too bad that I cannot redo the stack from the stored raw files.