Help me understand these DoF calculators

BBViet

I went to http://www.cambridgeincolour.com/tutorials/dof-calculator.htm to try the dof calculator and what perplexed me was that by changing the format from 1.5 crop to full frame and keeping all other parameters the same I get a deeper dof.

I have always thought that using the same lens at the same aperture and focusing at the same distance would yield the same dof no matter what sensor was put behind the lens.

Can anyone help me understand this?
 
I went to http://www.cambridgeincolour.com/tutorials/dof-calculator.htm to try the dof calculator and what perplexed me was that by changing the format from 1.5 crop to full frame and keeping all other parameters the same I get a deeper dof.

I have always thought that using the same lens at the same aperture and focusing at the same distance would yield the same dof no matter what sensor was put behind the lens.

Can anyone help me understand this?
My understanding of what's going on is that, for a start, using the same lens (e.g. a 50mm) will give different angles of view on different sensor sizes but it will give the same DoF at the sensor. However, the factor you haven't considered is the degree of enlargement for screen display or for printing.

DoF is a perceptual matter. Lets assume that under given viewing conditions the average human eye will perceive a dot 0.1mm across as a small disc rather than as a point. Anything smaller than this will look like a point. An image is made up of such points.

If you take a sharply focused image of a thin line, for example, and print it to a size (say 8"x12" or 20x30cm) where the line is less than 0.1mm thick then it will look sharp. If the line is slightly out of focus at the sensor then it will be wider at that stage, so you won't be able to print quite so big and have a sharp looking line.

To print an 8"x12 from FF requires that you enlarge the whole sensor image by a factor of 8x. To get the same size print from an APS-C sensor you need to enlarge 12x.

If the FF image has a line 0.01mm across it will still look sharp enlarged by 8x (0.08mm wide) . Meanwhile the APS-C image will look fuzzy because it will be enlarged by 12x , making it 0.12mm wide -- enough for the human eye to see the blur.

All this has nothing to do with the fact that you get different angles of view.
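To make the enlargement argument concrete, here is a minimal Python sketch, assuming the 0.1 mm visible-blur threshold and an 8"x12" print (long edge roughly 305 mm) as in the example above; the 8x and 12x figures in the post are these same ratios, rounded.

```python
# Minimal sketch of the enlargement argument, assuming a 0.1 mm visible-blur
# threshold in the print and an 8"x12" print (long edge ~305 mm), as above.
PRINT_WIDTH_MM = 305.0
VISIBLE_BLUR_MM = 0.1

def enlargement(sensor_width_mm):
    """Linear magnification from sensor to print."""
    return PRINT_WIDTH_MM / sensor_width_mm

for name, sensor_width in [("Full frame", 36.0), ("APS-C", 24.0)]:
    m = enlargement(sensor_width)
    blur_in_print = 0.01 * m   # the 0.01 mm-wide line from the example
    verdict = "looks sharp" if blur_in_print < VISIBLE_BLUR_MM else "looks blurred"
    print(f"{name}: {m:.1f}x enlargement, 0.01 mm becomes {blur_in_print:.2f} mm -> {verdict}")
```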
 
The original poster is right, but most here will not agree. As to more enlargement: if the sensor were film, this might apply, but digital is digital. You are not enlarging the signal used to form the image. The digital image is made up of pixels. Aside from possibly better-quality pixels coming from larger photosites, a pixel is a pixel; the program that makes images out of them has no idea what the physical dimensions of the sensor or of the individual pixels are. They only provide information for that particular pixel. So a 20 MP sensor, whether tiny or large, still provides the computer in your camera with 20 MP of information to make an image from. Neither one requires more or less enlargement. It's really easy to prove to yourself, but it's also easy to interpret the results the way you want them to be.

There is no way, with this hypothetical setup (same lens, same aperture, same distance), to produce comparable images for a side-by-side print comparison by enlarging the image from the small sensor: it is already larger than the one from the larger sensor. The only way to compare similar printed images is to enlarge the image from the large sensor so that objects are the same size as in the print from the small sensor, then cut away the portion of the large-sensor image that was not recorded by the small sensor. Now we have two prints as identical as we are going to get them. You can never make the image recorded by the small sensor look like the one from the large sensor in this case, because the images are completely different: the large image includes things that simply are not there in the small-sensor image.
 
The original poster is right, but most here will not agree. As to more enlargement: if the sensor were film, this might apply, but digital is digital. You are not enlarging the signal used to form the image. The digital image is made up of pixels. Aside from possibly better-quality pixels coming from larger photosites, a pixel is a pixel; the program that makes images out of them has no idea what the physical dimensions of the sensor or of the individual pixels are. They only provide information for that particular pixel. So a 20 MP sensor, whether tiny or large, still provides the computer in your camera with 20 MP of information to make an image from. Neither one requires more or less enlargement. It's really easy to prove to yourself, but it's also easy to interpret the results the way you want them to be.
I'm trying to work out what you are talking about here. The sensor is just a tool for passing an optical image to the viewing medium. Making an 8"x12" print of a FF size optical image requires less magnification than from an APS-C size optical image.

I'll go through the process at sensor level.

If an optical image is formed on the surface of a sensor, a sharply focused point image centred on a particular sensel (sensor pixel) should record only on that sensel. If it is sufficiently out of focus it will record on a group of sensels.

OK so far?

Assume a pair of 24MP (to make LxH calculations easier, giving 6000 x 4000 sensels) sensors, FF and APS-C. There will be more sensels per square millimetre on the smaller sensor than the larger one.

OK so far?

On a FF sensor each sensel (including its share of the masked area separating it from its neighbours) will be 36/6000 mm square. That's 0.006mm high and wide.

On APS-C it will be 0.004mm square.

OK so far?

An ideal optical image of a sharply focused point (e.g. a star) will only register on one sensel. If you throw that image increasingly out of focus it will eventually also register on adjacent sensels, giving a group of pixels in the final image, rather than a single pixel.

OK so far?

As the APS-C sensels are more closely spaced than the FF sensels, this result will become apparent when the optical image on the APS-C sensor is less out of focus than the one on the FF sensor.

OK so far?

I need to go out now, but let me know if I've gone wrong so far.
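For what it's worth, the sensel-pitch arithmetic above checks out; a quick sketch, assuming 6000 photosites across a 36 mm FF sensor and a roughly 24 mm wide APS-C sensor, as in the post:

```python
# Sensel pitch = sensor width / photosites across, per the figures above.
def sensel_pitch_mm(sensor_width_mm, pixels_across=6000):
    return sensor_width_mm / pixels_across

for name, width in [("FF", 36.0), ("APS-C (approx.)", 24.0)]:
    pitch = sensel_pitch_mm(width)
    print(f"{name}: {pitch:.4f} mm per sensel "
          f"-> a blur spot wider than {pitch:.4f} mm spills onto neighbouring sensels")
```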
 
The original poster is right, but most here will not agree. As to more enlargement: if the sensor were film, this might apply, but digital is digital. You are not enlarging the signal used to form the image. The digital image is made up of pixels. Aside from possibly better-quality pixels coming from larger photosites, a pixel is a pixel; the program that makes images out of them has no idea what the physical dimensions of the sensor or of the individual pixels are. They only provide information for that particular pixel. So a 20 MP sensor, whether tiny or large, still provides the computer in your camera with 20 MP of information to make an image from. Neither one requires more or less enlargement. It's really easy to prove to yourself, but it's also easy to interpret the results the way you want them to be.

There is no way, with this hypothetical setup (same lens, same aperture, same distance), to produce comparable images for a side-by-side print comparison by enlarging the image from the small sensor: it is already larger than the one from the larger sensor. The only way to compare similar printed images is to enlarge the image from the large sensor so that objects are the same size as in the print from the small sensor, then cut away the portion of the large-sensor image that was not recorded by the small sensor. Now we have two prints as identical as we are going to get them. You can never make the image recorded by the small sensor look like the one from the large sensor in this case, because the images are completely different: the large image includes things that simply are not there in the small-sensor image.
I cannot work out your mindset and thoughts either. It seems closer to off-topic than a solution for the OP. Many of the facts are right, some are wrong, and they all have one thing in common: they have close to nothing to do with our problem.

You have it wrong anyway: you don't need to compare two different sensor sizes.

Take a Sony A7S with its 12 MP sensor, and then an A7R or a Canon 5DS, the resolution beasts.

For the same viewing size (which the DoF calculator does not even ask for, silly as that is), the DoF will be sufficient or insufficient for all of those bodies at once.

But once you blast the output to the monitor for 1:1 viewing, you get more magnification from higher-resolution sensors, and less DoF from them too. It doesn't matter whether we're viewing a crop sensor or a FF sensor; the pixels are physically present in the same space behind the lens. What matters is how magnified the view is.
 
My understanding of what's going on is that, for a start, using the same lens (e.g. a 50mm) will give different angles of view on different sensor sizes but it will give the same DoF at the sensor.
No.
However, the factor you haven't considered is the degree of enlargement for screen display or for printing.
Yes.
DoF is a perceptual matter.
And that's why I said no the first time. There is no DOF at the sensor since there's no perception there. There's depth-of-focus (which can be defined in a different way) but that's another topic.

Your explanation is basically correct: more enlargement from the smaller sensor means more enlargement of the blur circles as well, which means shallower DOF. The reason larger sensors are thought to have shallower DOF is that the enlargement effect is linear, while the effect of the focal length change you need to keep the same framing (you have to zoom in with the larger sensor to keep the same angle of view you had with the smaller sensor) is quadratic. This means that, for the same framing, the shallower DOF from the increased focal length dominates the deeper DOF you would naturally get from a larger sensor with its reduced enlargement.
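A rough numeric illustration of that linear-versus-quadratic point, using the common far-from-macro approximation DoF ~ 2*N*c*s^2/f^2; the CoC values (0.030 mm FF, 0.020 mm APS-C) and the 3 m distance are assumptions for the sketch, not values from the thread:

```python
# Same framing on both formats: 50 mm on APS-C vs. 75 mm on FF, both at f/4.
# The CoC scales linearly with format (x1.5), but focal length enters squared
# (x2.25), so the FF shot ends up with about 1.5x less DoF.
def approx_dof_mm(f_mm, N, coc_mm, s_mm):
    return 2 * N * coc_mm * s_mm**2 / f_mm**2

s, N = 3000.0, 4.0   # 3 m subject distance, f/4
print(f"APS-C, 50 mm f/4: ~{approx_dof_mm(50.0, N, 0.020, s):.0f} mm total DoF")
print(f"FF,    75 mm f/4: ~{approx_dof_mm(75.0, N, 0.030, s):.0f} mm total DoF")
```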
 
The original poster is right, but most here will not agree. As to more enlargement: if the sensor were film, this might apply, but digital is digital. You are not enlarging the signal used to form the image. The digital image is made up of pixels. Aside from possibly better-quality pixels coming from larger photosites, a pixel is a pixel; the program that makes images out of them has no idea what the physical dimensions of the sensor or of the individual pixels are. They only provide information for that particular pixel. So a 20 MP sensor, whether tiny or large, still provides the computer in your camera with 20 MP of information to make an image from. Neither one requires more or less enlargement. It's really easy to prove to yourself, but it's also easy to interpret the results the way you want them to be.
I'm trying to work out what you are talking about here.
I wouldn't - you'll just wrap your head in knots.
 
If you'll study this image and apply some deductive reasoning based on the parameters of the DoF calculator, it'll answer your question plus many of the questions regarding format equivalency.



[image from DPR's review of the Canon 5D]

I went to http://www.cambridgeincolour.com/tutorials/dof-calculator.htm to try the dof calculator and what perplexed me was that by changing the format from 1.5 crop to full frame and keeping all other parameters the same I get a deeper dof.

I have always thought that using the same lens at the same aperture and focusing at the same distance would yield the same dof no matter what sensor was put behind the lens.

Can anyone help me understand this?


--
Once you've done fifty, everything else is iffy.
 
I went to http://www.cambridgeincolour.com/tutorials/dof-calculator.htm to try the dof calculator and what perplexed me was that by changing the format from 1.5 crop to full frame and keeping all other parameters the same I get a deeper dof.

I have always thought that using the same lens at the same aperture and focusing at the same distance would yield the same dof no matter what sensor was put behind the lens.
If you have a telephoto zoom lens with a DoF scale, observe the DoF at a specific f-stop at different focal lengths. The shortest FL is the wide-angle end and the longest FL is the telephoto end. I assume you are aware that DoF has an inverse relationship with FL.

I observed that a 50mm lens designed for a FF camera becomes roughly equivalent to a 75mm lens when fitted to a camera whose sensor is roughly a 1.5x crop of FF.

I have a Nikon FF and a Nikon DX. I observed that when a 50mm FX lens is mounted on the DX body, the image is magnified, essentially turning that 50mm into a 75mm lens. However, the physical FL of the lens, not the effective FL, is recorded in the NEF.


Here is a jury-rigged set-up.

Camera mounted on a tripod. Lens is Nikkor FX 24-85mm set at FL=50mm.
Tripod location is fixed. First shot is with D800. D800 is dismounted from the tripod and replaced with D5100. The FX 24-85mm lens is mounted to the D5100. Focus was on the 9.99 sticker on the lower center area of the photo.

The EXIF identifies the camera used for the photo.

[attached: the two unedited sample photos, one from the D800 and one from the D5100]
 
I went to http://www.cambridgeincolour.com/tutorials/dof-calculator.htm to try the dof calculator and what perplexed me was that by changing the format from 1.5 crop to full frame and keeping all other parameters the same I get a deeper dof.

I have always thought that using the same lens at the same aperture and focusing at the same distance would yield the same dof no matter what sensor was put behind the lens.

Can anyone help me understand this?
The easiest way to help you understand what's going on is to tell you that if you took a photo of the same scene from the same position with the same focal length and relative aperture using cameras with different sensor sizes, printed the photos out at a size proportional to the sensor size (e.g. 1.5x at 12x18 inches and FF at 18x27 inches), and cropped all the photos to the same framing, then the DOFs would be the same.

Alternatively, photos of the same scene taken from the same position with the same framing and aperture diameter (e.g. the aperture diameter for 100mm f/2 is 100mm / 2 = 50mm) displayed at the same size and viewed from the same distance will have the same DOF for all systems.

For example, photos of a scene from the same position taken at 100mm f/4 on 1.5x and 150mm f/6 on FF will have the same DOF if displayed at the same size.

For a more in-depth (and technical) explanation, see here.
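A hedged check of that last example using the standard hyperfocal-based DoF formulas; the CoC values (0.030 mm for FF, 0.020 mm for 1.5x crop) and the 5 m focus distance are assumptions, since the thread doesn't specify a CoC convention:

```python
def dof_mm(f_mm, N, coc_mm, s_mm):
    """Total depth of field for a lens of focal length f_mm at f-number N,
    circle of confusion coc_mm, focused at distance s_mm."""
    H = f_mm ** 2 / (N * coc_mm) + f_mm              # hyperfocal distance
    near = s_mm * (H - f_mm) / (H + s_mm - 2 * f_mm)
    far = s_mm * (H - f_mm) / (H - s_mm)
    return far - near

s = 5000.0  # 5 m
print(f"1.5x crop, 100 mm f/4: {dof_mm(100.0, 4.0, 0.020, s):.0f} mm")
print(f"FF,        150 mm f/6: {dof_mm(150.0, 6.0, 0.030, s):.0f} mm")
# The two values agree to within about 1% -- effectively the same DoF.
```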
 
...what perplexed me was that by changing the format from 1.5 crop to full frame and keeping all other parameters the same I get a deeper dof.
Your statement is correct, but you must consider that when you change from full frame to 1.5 crop using the same lens, you have to increase the distance to the focused object by 50% in order to maintain it at the same relative size.

Using the calculator I got the following:

1. Full frame, 50mm lens, f:4, object at 2 meters: Total DOF = 0.4m

2. 1.5 crop frame, 50mm lens, f:4, object at 2 meters: Total DOF = 0.26m

3. 1.5 crop frame, 50mm lens, f:4, object at 3 meters: Total DOF = 0.6m

Cases 1 and 3 are equivalent in the sense that both give the same relative size of the object on the frame. As you see, full frame gives a shallower (smaller) DOF.

mapril
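The three results above can be reproduced approximately with the standard DoF formula; a sketch, assuming the calculator uses CoC values of about 0.030 mm for full frame and 0.020 mm for a 1.5x crop (the small differences from the numbers above come down to the exact CoC convention the site uses):

```python
def total_dof_m(f_mm, N, coc_mm, s_m):
    """Total DoF in metres for focal length f_mm, f-number N, CoC coc_mm, focus at s_m metres."""
    s = s_m * 1000.0
    H = f_mm ** 2 / (N * coc_mm) + f_mm
    near = s * (H - f_mm) / (H + s - 2 * f_mm)
    far = s * (H - f_mm) / (H - s)
    return (far - near) / 1000.0

print(f"1. FF,   50 mm f/4 at 2 m: ~{total_dof_m(50, 4, 0.030, 2):.2f} m")  # ~0.38 m
print(f"2. crop, 50 mm f/4 at 2 m: ~{total_dof_m(50, 4, 0.020, 2):.2f} m")  # ~0.25 m
print(f"3. crop, 50 mm f/4 at 3 m: ~{total_dof_m(50, 4, 0.020, 3):.2f} m")  # ~0.57 m
```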
 
I went to http://www.cambridgeincolour.com/tutorials/dof-calculator.htm to try the dof calculator and what perplexed me was that by changing the format from 1.5 crop to full frame and keeping all other parameters the same I get a deeper dof.

I have always thought that using the same lens at the same aperture and focusing at the same distance would yield the same dof no matter what sensor was put behind the lens.
@Great Bustard - you seem to be ignoring the OP's point (bolded above).
Can anyone help me understand this?
The easiest way to help you understand what's going on is to tell you that if you took a photo of the same scene from the same position with the same focal length and relative aperture using cameras with different sensor sizes,
I agree, except for the relative aperture, because that deviates from the OP's issue.
printed the photos out at a size proportional to the sensor size (e.g. 1.5x at 12x18 inches and FF at 18x27 inches), and cropped all the photos to the same framing, then the DOFs would be the same.
Why all the fuss about printing when printing will introduce another unwanted element? Just doing a side-by-side (split screen) of unretouched photos is easier. The two photos I posted above will do just as well. Those are unretouched and not cropped; the file size was reduced to meet DPR's upload requirement.
Alternatively, photos of the same scene taken from the same position with the same framing and aperture diameter (e.g. the aperture diameter for 100mm f/2 is 100mm / 2 = 50mm) displayed at the same size and viewed from the same distance will have the same DOF for all systems.
Aperture diameter is irrelevant for three reasons: (1) it is almost physically impossible to measure the aperture diameter; (2) the lens already has a marked aperture ring; (3) changing the aperture setting is not in line with the OP.

I posted your suggestion at least two hours ahead of your posting. Look above your post.
For example, photos of a scene from the same position taken at 100mm f/4 on 1.5x and 150mm f/6 on FF will have the same DOF if displayed at the same size.
Again, using two different apertures and FLs does not address the spirit of the OP.
For a more in-depth (and technical) explanation, see here.
 
I went to http://www.cambridgeincolour.com/tutorials/dof-calculator.htm to try the dof calculator and what perplexed me was that by changing the format from 1.5 crop to full frame and keeping all other parameters the same I get a deeper dof.

I have always thought that using the same lens at the same aperture and focusing at the same distance would yield the same dof no matter what sensor was put behind the lens.

Can anyone help me understand this?
The main factors in Depth of Field are aperture diameter and angle of view. If you know these two things, you don't need to know focal length, or sensor size.

However, we generally do know sensor size and focal length, and therefore the formulas use these to compute angle of view and aperture diameter. Note that aperture diameter is the absolute, not the relative, diameter. A 100mm lens at f/2 and a 150mm lens at f/3 both have an aperture diameter of 50mm.

The reason we don't need to know sensor size is that Depth of Field is a property of the final print. Standard Depth of Field formulas make assumptions about how the print will be viewed. If these assumptions are not correct, then the results won't match reality.

It's easy to show that DoF varies with viewing conditions. Make a 1x2 inch print and an 8x10 inch print. You should see that the depth of field looks greater in the 1x2 print. This is also why depth of field can look good on the camera's small rear LCD preview, but be too shallow in the final print.

.

Let's look at this another way. In order for something to look in focus in the print, we want the amount of focus error in the print to be smaller than a certain amount (typically called the circle of confusion). With smaller sensors, you need higher magnification for the same size print (where "magnification" is the ratio of print size to sensor size). Therefore the circle of confusion on a smaller sensor must be smaller than on a larger sensor, as it will experience more magnification.
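Two small calculations fall out of this post: the absolute aperture diameter is just focal length divided by f-number, and one common (though not universal) convention derives the CoC from the sensor diagonal divided by 1500, which is how the CoC ends up smaller for smaller sensors. The d/1500 divisor here is an assumption; calculators differ.

```python
import math

def aperture_diameter_mm(f_mm, f_number):
    return f_mm / f_number

def coc_mm(sensor_w_mm, sensor_h_mm, divisor=1500.0):
    # One common convention: CoC = sensor diagonal / 1500 (an assumption, not from the thread).
    return math.hypot(sensor_w_mm, sensor_h_mm) / divisor

print(aperture_diameter_mm(100, 2), aperture_diameter_mm(150, 3))  # 50.0 50.0 -- same diameter
print(f"FF CoC    ~{coc_mm(36.0, 24.0):.3f} mm")     # ~0.029 mm
print(f"APS-C CoC ~{coc_mm(23.6, 15.7):.3f} mm")     # ~0.019 mm
```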
 
I went to http://www.cambridgeincolour.com/tutorials/dof-calculator.htm to try the dof calculator and what perplexed me was that by changing the format from 1.5 crop to full frame and keeping all other parameters the same I get a deeper dof.

I have always thought that using the same lens at the same aperture and focusing at the same distance would yield the same dof no matter what sensor was put behind the lens.
@Great Bustard - you seem to be ignoring the OP's point (bolded above).
Actually, the first scenario given by GB is exactly what the OP asked: same focal length and shooting distance, relative aperture (aka f-number); with the printed output sized proportional to the sensor size -- in that case the OP's understanding would be confirmed; depth of field would be the same.

However, if the print size in both cases is the same -- which is how photos would be typically viewed and how they would be properly compared; and which is the assumption built into the DoF calculator -- then the depth of field yielded by the smaller sensor would be shallower.

GB is pointing out that the evaluation of depth of field requires accounting for all of the variables, including not just the shooting parameters, but all of the steps to producing a final, viewable image.
Can anyone help me understand this?
The easiest way to help you understand what's going on is to tell you that if you took a photo of the same scene from the same position with the same focal length and relative aperture using cameras with different sensor sizes,
I agree, except for the relative aperture, because that deviates from the OP's issue.
The relative aperture is the f-number. Although the OP used the vernacular "aperture", it is best to be clear whether one means the actual aperture size, or that size relative to the focal length, which is what is more commonly meant. GB was being clear about that.
printed the photos out at a size proportional to the sensor size (e.g. 1.5x at 12x18 inches and FF at 18x27 inches), and cropped all the photos to the same framing, then the DOFs would be the same.
Why all the fuss about printing when printing will introduce another unwanted element? Just doing a side-by-side (split screen) of unretouched photos is easier. The two photos I posted above will do just as well. Those are unretouched and not cropped; the file size was reduced to meet DPR's upload requirement.
Assuming you mean your earlier post, it isn't clear what point you are making with those. But if images are taken with different sensor sizes under identical shooting conditions (including focal length, relative aperture, and subject distance) and are then displayed at the same size and viewing distance, the one from the smaller sensor will exhibit less depth of field. Is that what you were illustrating?
Alternatively, photos of the same scene taken from the same position with the same framing and aperture diameter (e.g. the aperture diameter for 100mm f/2 is 100mm / 2 = 50mm) displayed at the same size and viewed from the same distance will have the same DOF for all systems.
Aperture diameter is irrelevant for three reasons: (1) it is almost physically impossible to measure the aperture diameter; (2) the lens already has a marked aperture ring; (3) changing the aperture setting is not in line with the OP.
I posted your suggestion at least two hours ahead of your posting. Look above your post.
Well, here you are kind of off the rails. The actual aperture diameter (not the relative aperture) is what directly determines depth of field. As GB points out, if those are normalized (and everything else, including framing, is the same), then depth of field will be the same. Understanding that is crucial. Incidentally, one need not measure the actual aperture diameter; it is easy and convenient to determine it from the known focal length and relative aperture, as GB shows.
For example, photos of a scene from the same position taken at 100mm f/4 on 1.5x and 150mm f/6 on FF will have the same DOF if displayed at the same size.
Again, using two different apertures and FLs does not address the spirit of the OP.
The OP wanted to better understand how the various parameters work interdependently to influence depth of field, including cases where the depth of field is the same (as he expected) and why they would be different, as he discovered from the DoF calculator. GB's discussion is 100% on point.
For a more in-depth (and technical) explanation, see here.
Dave
 
Please look at the photos I posted and the accompanying description of how they were taken.
The EXIF has the information of the lens setting as the OP asked.

"same lens at the same aperture and focusing at the same distance would yield the same dof no matter what sensor was put behind the lens."

Lens used was Nikkor FX; Aperture f/4.0; FL=50mm
Cameras used: Nikon D800 and Nikon D5100, alternately mounted on a tripod.
Vector of camera to subject, including height and angle, practically the same to within 1/4 inch and perhaps 0.01 degree.

Any changes from the condition (bolded) skirts around the issue posted by the OP.
I went to http://www.cambridgeincolour.com/tutorials/dof-calculator.htm to try the dof calculator and what perplexed me was that by changing the format from 1.5 crop to full frame and keeping all other parameters the same I get a deeper dof.

I have always thought that using the same lens at the same aperture and focusing at the same distance would yield the same dof no matter what sensor was put behind the lens.
@Great Bustard - you seem to be ignoring the OP's point (bolded above).
Actually, the first scenario given by GB is exactly what the OP asked: same focal length and shooting distance, relative aperture (aka f-number); with the printed output sized proportional to the sensor size -- in that case the OP's understanding would be confirmed; depth of field would be the same.
The OP stated same aperture, not relative aperture. Moreover, the OP did not mention prints.
However, if the print size in both cases is the same -- which is how photos would be typically viewed and how they would be properly compared; and which is the assumption built into the DoF calculator -- then the depth of field yielded by the smaller sensor would be shallower.

GB is pointing out that the evaluation of depth of field requires accounting for all of the variables, including not just the shooting parameters, but all of the steps to producing a final, viewable image.
This sounds like a matter of interpretation and opinion.

Let's agree to disagree.
Can anyone help me understand this?
The easiest way to help you understand what's going on is to tell you that if you took a photo of the same scene from the same position with the same focal length and relative aperture using cameras with different sensor sizes,
I agree, except for the relative aperture, because that deviates from the OP's issue.
The relative aperture is the f-number. Although the OP used the vernacular "aperture", it is best to be clear whether one means the actual aperture size, or that size relative to the focal length, which is what is more commonly meant. GB was being clear about that.
I go by what the OP wrote, not what I think he meant. Since the OP specified the same lens at the same aperture, how would the FL change? Why introduce "noise" in the form of relative aperture to FL?
printed the photos out at a size proportional to the sensor size (e.g. 1.5x at 12x18 inches and FF at 18x27 inches), and cropped all the photos to the same framing, then the DOFs would be the same.
Why all the fuss about printing when printing will introduce another unwanted element? Just doing a side-by-side (split screen) of unretouched photos is easier. The two photos I posted above will do just as well. Those are unretouched and not cropped; the file size was reduced to meet DPR's upload requirement.
Assuming you mean your earlier post, it isn't clear what point you are making with those. But if images are taken with different sensor sizes under identical shooting conditions (including focal length, relative aperture, and subject distance) and are then displayed at the same size and viewing distance, the one from the smaller sensor will exhibit less depth of field. Is that what you were illustrating?
Again, pictures are worth a thousand words. The two photos I posted are the originals, with no post-processing, so as to show the effect on the DoF of using A LENS on cameras with different sensor sizes. You don't have to use the photos I posted. Do your own experiment and decide for yourself.
Alternatively, photos of the same scene taken from the same position with the same framing and aperture diameter (e.g. the aperture diameter for 100mm f/2 is 100mm / 2 = 50mm) displayed at the same size and viewed from the same distance will have the same DOF for all systems.
Aperture diameter is irrelevant for three reasons: (1) it is almost physically impossible to measure the aperture diameter; (2) the lens already has a marked aperture ring; (3) changing the aperture setting is not in line with the OP.
I posted your suggestion at least two hours ahead of your posting. Look above your post.
Well, here you are kind of off the rails. The actual aperture diameter (not the relative aperture) is what directly determines depth of field. As GB points out, if those are normalized (and everything else, including framing, is the same), then depth of field will be the same. Understanding that is crucial. Incidentally, one need not measure the actual aperture diameter; it is easy and convenient to determine it from the known focal length and relative aperture, as GB shows.
You keep on insisting "relative aperture". It is the same lens, hence, the physical FL has not changed. Please do not change the conditions specified by the OP.
For example, photos of a scene from the same position taken at 100mm f/4 on 1.5x and 150mm f/6 on FF will have the same DOF if displayed at the same size.
Again, using two different apertures and FLs does not address the spirit of the OP.
The OP wanted to better understand how the various parameters work interdependently to influence depth of field, including cases where the depth of field is the same (as he expected) and why they would be different, as he discovered from the DoF calculator. GB's discussion is 100% on point.
Perhaps the OP wanted to know everything about those conditions you stated
BUT the OP did not post what you are suggesting.

It would be ideal for the OP to clarify his intent, otherwise, this loop will never end even if the cows come home.
 
Please look at the photos I posted and the accompanying description of how they were taken.
The EXIF has the information of the lens setting as the OP asked.

"same lens at the same aperture and focusing at the same distance would yield the same dof no matter what sensor was put behind the lens."
Which is not true. Which was GB's point.
I go by what the OP wrote, not what I think he meant. Since the OP specified the same lens at the same aperture, how would the FL change? Why introduce "noise" in the form of relative aperture to FL?
Because that's what f-stop is, and most people use the term "aperture" interchangeably with "f-stop", even though aperture diameter (entrance pupil diameter) is probably more appropriately called "aperture" and f-stop should be called "relative aperture".
 
I went to http://www.cambridgeincolour.com/tutorials/dof-calculator.htm to try the dof calculator and what perplexed me was that by changing the format from 1.5 crop to full frame and keeping all other parameters the same I get a deeper dof.

I have always thought that using the same lens at the same aperture and focusing at the same distance would yield the same dof no matter what sensor was put behind the lens.

Can anyone help me understand this?
The main factors in Depth of Field are aperture diameter and angle of view.
And focus distance (subject distance).
 
BBViet wrote:.

I have always thought that using the same lens at the same aperture and focusing at the same distance would yield the same dof no matter what sensor was put behind the lens.
Your assumption is wrong. You can take a single image file, and the Depth of Field will vary depending on how you crop it, what size you print it, and even the viewing distance.

Depth of Field calculators assume that you won't crop, and make assumptions about typical print sizes and viewing distances.

Consider a full frame and a 1.5 crop body. Assume same subject distance for all shots, and an uncropped 8x12 print from the image file.

If both cameras have the same focal length and f/stop (perhaps 100mm at f/2.8) then the print from the full frame body will have more depth of field. The print from the full frame will show a wider field of view.

If you put a 150mm lens on the full frame, and a 100 mm on the crop body, again both at f/2.8, then the print from the full frame body will have less depth of field. Both prints will show the same field of view.

If you put a 150mm lens on the full frame, and a 100mm lens on the crop, and set the same aperture diameter (say 50mm, which is 100mm at f/2 and 150mm at f/3), then the prints from both cameras will have the same depth of field, field of view, and approximately the same overall image noise level (assuming both cameras use modern sensors of similar technology).

In terms of camera settings, if the field of view and aperture diameter are the same, then the depth of field will be the same. (Note that cropping changes the field of view).
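For anyone who wants to see those three cases as numbers, here is a sketch using the standard hyperfocal-based formula, with assumed CoC values of 0.030 mm (FF) and 0.020 mm (1.5x crop) and a subject at 10 m; the exact figures depend on those assumptions, but the ordering (more, less, equal DoF) is the point.

```python
def dof_mm(f_mm, N, coc_mm, s_mm):
    H = f_mm ** 2 / (N * coc_mm) + f_mm
    near = s_mm * (H - f_mm) / (H + s_mm - 2 * f_mm)
    far = s_mm * (H - f_mm) / (H - s_mm)
    return far - near

s = 10_000.0  # 10 m
# 1. Same focal length and f-stop: the FF shot has MORE DoF (and a wider view).
print(f"{dof_mm(100, 2.8, 0.030, s):.0f} mm (FF) vs {dof_mm(100, 2.8, 0.020, s):.0f} mm (crop)")
# 2. Same framing, same f-stop (150 mm FF vs 100 mm crop): the FF shot has LESS DoF.
print(f"{dof_mm(150, 2.8, 0.030, s):.0f} mm (FF) vs {dof_mm(100, 2.8, 0.020, s):.0f} mm (crop)")
# 3. Same framing, same 50 mm aperture diameter (150 mm f/3 vs 100 mm f/2): equal DoF.
print(f"{dof_mm(150, 3.0, 0.030, s):.0f} mm (FF) vs {dof_mm(100, 2.0, 0.020, s):.0f} mm (crop)")
```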
 
BBViet wrote:.

I have always thought that using the same lens at the same aperture and focusing at the same distance would yield the same dof no matter what sensor was put behind the lens.
Your assumption is wrong. You can take a single image file, and the Depth of Field will vary depending on how you crop it, what size you print it, and even the viewing distance.

Depth of Field calculators assume that you won't crop, and make assumptions about typical print sizes and viewing distances.

Consider a full frame and a 1.5 crop body. Assume same subject distance for all shots, and an uncropped 8x12 print from the image file.

If both cameras have the same focal length and f/stop (perhaps 100mm at f/2.8) then the print from the full frame body will have more depth of field. The print from the full frame will show a wider field of view.

If you put a 150mm lens on the full frame, and a 100 mm on the crop body, again both at f/2.8, then the print from the full frame body will have less depth of field. Both prints will show the same field of view.

If you put a 150mm lens on the full frame, and a 100mm lens on the crop, and set the same aperture diameter (say 50mm, which is 100mm at f/2 and 150mm at f/3), then the prints from both cameras will have the same depth of field, field of view, and approximately the same overall image noise level (assuming both cameras use modern sensors of similar technology).
And the same shutter speed.
In terms of camera settings, if the field of view and aperture diameter are the same, then the depth of field will be the same. (Note that cropping changes the field of view).
Good post - all right.
 
Please look at the photos I posted and the accompanying description of how they were taken.
The EXIF has the information of the lens setting as the OP asked.

"same lens at the same aperture and focusing at the same distance would yield the same dof no matter what sensor was put behind the lens."
If it was your intent to demonstrate that, you did not succeed. The image from the D5100 has less depth of field, because it was enlarged more to be displayed at the same size on the screen, just as it would have been had same-size prints been made. That's why the sensor size matters when determining depth of field, just as film size did in the old days.
BBViet wrote:
I went to http://www.cambridgeincolour.com/tutorials/dof-calculator.htm to try the dof calculator and what perplexed me was that by changing the format from 1.5 crop to full frame and keeping all other parameters the same I get a deeper dof.

I have always thought that using the same lens at the same aperture and focusing at the same distance would yield the same dof no matter what sensor was put behind the lens.
@Great Bustard - you seem to be ignoring the OP's point (bolded above).
Actually, the first scenario given by GB is exactly what the OP asked: same focal length and shooting distance, relative aperture (aka f-number); with the printed output sized proportional to the sensor size -- in that case the OP's understanding would be confirmed; depth of field would be the same.
The OP stated same aperture, not relative aperture. Moreover, the OP did not mention prints.
But depth of field calculators DO account for prints (or for screen viewing; it makes no difference), which is the explanation for why the OP's assumption was incorrect. GB explained that. Depth of field cannot be understood or evaluated without reference to the viewing conditions for the final image. Even when the shooting conditions are the same, a difference in viewing conditions (which includes the size of the output image and the distance from which it is viewed) will result in a difference in depth of field.

Your misunderstanding about aperture and relative aperture has been addressed elsewhere.
However, if the print size in both cases is the same -- which is how photos would be typically viewed and how they would be properly compared; and which is the assumption built into the DoF calculator -- then the depth of field yielded by the smaller sensor would be shallower.

GB is pointing out that the evaluation of depth of field requires accounting for all of the variables, including not just the shooting parameters, but all of the steps to producing a final, viewable image.
This sounds like a matter of interpretation and opinion.

Let's agree to disagree.
Um, so you are saying that evaluation of depth of field should not account for all the variables that influence it? Well, you are entitled to your opinion, but that seems like a difficult one to support.
Can anyone help me understand this?
The easiest way to help you understand what's going on is to tell you that if you took a photo of the same scene from the same position with the same focal length and relative aperture using cameras with different sensor sizes,
I agree, except for the relative aperture, because that deviates from the OP's issue.
The relative aperture is the f-number. Although the OP used the vernacular "aperture", it is best to be clear whether one means the actual aperture size, or that size relative to the focal length, which is what is more commonly meant. GB was being clear about that.
I go by what the OP wrote, not what I think he meant. Since the OP specified the same lens at the same aperture, how would the FL change? Why introduce "noise" in the form of relative aperture to FL?
Because, as explained above and elsewhere depth of field is not, and cannot, be evaluated by stopping at the capture stage. Specifically, you cannot complete a depth of field calculation without using the "circle of confusion" parameter, which includes the ratio of enlargement needed from the sensor size to the print or screen size. If either the print/screen size or the sensor size is different, the CoC will be different, and the resulting depth of field will be different.
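As a sketch of that last point, the CoC used in the calculation can be thought of as a fixed blur criterion in the final print divided by the sensor-to-print enlargement; the 0.2 mm print criterion and the 8"x12" print below are illustrative assumptions, not values from the thread.

```python
PRINT_BLUR_MM = 0.2       # blur judged acceptable in the print at normal viewing distance (assumed)
PRINT_HEIGHT_MM = 203.0   # short side of an 8"x12" print (assumed)

def coc_on_sensor_mm(sensor_height_mm):
    enlargement = PRINT_HEIGHT_MM / sensor_height_mm
    return PRINT_BLUR_MM / enlargement

print(f"FF    (24.0 mm tall): CoC ~{coc_on_sensor_mm(24.0):.3f} mm")   # ~0.024 mm
print(f"APS-C (15.7 mm tall): CoC ~{coc_on_sensor_mm(15.7):.3f} mm")   # ~0.015 mm
```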
printed the photos out at a size proportional to the sensor size (e.g. 1.5x at 12x18 inches and FF at 18x27 inches), and cropped all the photos to the same framing, then the DOFs would be the same.
Why all the fuss about printing when printing will introduce another unwanted element? Just doing a side-by-side (split screen) of unretouched photos is easier. The two photos I posted above will do just as well. Those are unretouched and not cropped; the file size was reduced to meet DPR's upload requirement.
Assuming you mean your earlier post, it isn't clear what point you are making with those. But if images are taken with different sensor sizes under identical shooting conditions (including focal length, relative aperture, and subject distance) and are then displayed at the same size and viewing distance, the one from the smaller sensor will exhibit less depth of field. Is that what you were illustrating?
Again, pictures are worth a thousand words. The two photos I posted are the originals, with no post-processing, so as to show the effect on the DoF of using A LENS on cameras with different sensor sizes. You don't have to use the photos I posted. Do your own experiment and decide for yourself.
Yes, as noted above, your pictures demonstrate that when all else is the same except for sensor size, the image from the smaller sensor has less depth of field. If that is what you were trying to demonstrate, you succeeded. Otherwise, not so much.
Alternatively, photos of the same scene taken from the same position with the same framing and aperture diameter (e.g. the aperture diameter for 100mm f/2 is 100mm / 2 = 50mm) displayed at the same size and viewed from the same distance will have the same DOF for all systems.
Aperture diameter is irrelevant for three reasons: (1) it is almost physically impossible to measure the aperture diameter; (2) the lens already has a marked aperture ring; (3) changing the aperture setting is not in line with the OP.
I posted your suggestion at least two hours ahead of your posting. Look above your post.
Well, here you are kind of off the rails. The actual aperture diameter (not the relative aperture) is what directly determines depth of field. As GB points out, if those are normalized (and everything else, including framing, is the same), then depth of field will be the same. Understanding that is crucial. Incidentally, one need not measure the actual aperture diameter; it is easy and convenient to determine it from the known focal length and relative aperture, as GB shows.
You keep on insisting "relative aperture". It is the same lens, hence, the physical FL has not changed. Please do not change the conditions specified by the OP.
Sigh. As GB was originally explaining, if the pictures are taken from the same distance and have the same framing, then the focal length cannot be the same. If that's the case, if the relative aperture (f-number) is kept the same, then the pictures will not have the same depth of field. But if the actual aperture is kept the same, they will, if they are viewed under the same conditions. We make a distinction between the actual aperture and relative aperture because they are not the same thing, and the difference makes a difference in things like depth of field. When someone simply says "aperture" it isn't clear which is meant. When discussing things like this, it is essential to be clear about what is being communicated, which is why we keep insisting on "relative aperture" if that is what we mean.
For example, photos of a scene from the same position taken at 100mm f/4 on 1.5x and 150mm f/6 on FF will have the same DOF if displayed at the same size.
Again, using two different apertures and FLs does not address the spirit of the OP.
The OP wanted to better understand how the various parameters work interdependently to influence depth of field, including cases where the depth of field is the same (as he expected) and why they would be different, as he discovered from the DoF calculator. GB's discussion is 100% on point.
Perhaps the OP wanted to know everything about those conditions you stated
BUT the OP did not post what you are suggesting.
It would be ideal for the OP to clarify his intent, otherwise, this loop will never end even if the cows come home.
The OP's subject line was "Help me understand these DoF calculators", which is exactly what GB and other posters are helping him with.

Dave
 
