Doug Kerr
Forum Pro
As camera sensors evolve, we understandably run into discussions about how the resolution capability of sensors "matches" the resolution capability of typical lenses. As this occurs, however, we often run into two misconceptions which can warp the conclusions people reach:
1. If we have a sensor whose resolution is, say, 50 line pairs per millimeter (50 lp/mm), then if we have a lens with a resolution of 50 lp/mm, that will allow us to take full advantage of the sensor resolution (or sometimes it is said the other way up). But this is not quite how it works.
2. If we have a sensor whose pixel pitch is 100 pixels per millimeter (px/mm), then its resolution capability is 50 lp/mm. But this is not quite how that works, either.
Let me give a brief clarification of both of these matters.
*************
1. Lens, sensor, and system resolution.
When we say that a lens has a resolution of x lp/mm, and are being "scientific" about it, we usually mean that at a spatial frequency of x lp/mm, the modulation transfer function (MTF) of the lens has dropped to some arbitrary fraction of the MTF at "low" spatial frequencies. There is not a consistent fraction always used as this criterion. Let's assume for the moment that we use 0.2 - that is, the spatial frequency at which the MTF drops to 0.2 times its value at a "low" spatial frequency (for some particular spot on the image, perhaps the center) is considered to be the "limit of usable resolution".
Can we see this from the MTF curves published by the lens manufacturer? No. The type of MTF curve they usually publish shows the MTF at different distances from the center of the image but for only two spatial frequencies, one "low" and one "fairly high". So we can't really see how the MTF varies with frequency.
Now, we may wish to reckon sensor resolution the same way. That is, the spatial frequency at which the response of the sensor (its MTF) has dropped to 0.2 its value at "low" spatial frequency is considered to be the resolution capability of the sensor.
Now suppose we have the tidy-sounding situation in which (a) the lens has a resolution of 50 lp/mm and (b) the sensor has a resolution of 50 lp/mm (to just pick an arbitrary number). Great! Now our whole system (camera) will have a resolution of 50 lp/mm, right? Neither the lens nor the sensor is hobbling the achievement of the other, right? No.
We must consider the fact that, in this situation, the MTF of the entire "system" (that is, the "chain" through the lens and the sensor) at any frequency is essentially the product of the MTFs of the two components.
In fact, if at a spatial frequency of 50 lp/mm, the MTF of the lens is 0.2 its value at "low" spatial frequencies, and the MTF of the sensor is 0.2 its value at "low" spatial frequencies, then the MTF of the whole chain is only 0.04 (0.2 x 0.2) its value at low spatial frequencies, a response that falls far short of our criterion for the limit of usable resolution for the entire "chain".
So at what spatial frequency does the whole chain exhibit the MTF we consider to be the limit of usable resolution?
Well, let's just suppose that at a spatial frequency of 35 lp/mm:
- The lens has an MTF of 0.45 its value at low spatial frequency, and
- The sensor has an MTF of 0.45 its value at low spatial frequency.
Then the MTF of the entire chain would be about 0.2 (0.45 x 0.45) its value at low spatial frequency. Thus we would consider the resolution limit of the entire chain to be at this spatial frequency, 35 lp/mm, not at 50 lp/mm.
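If you want to see the arithmetic worked out, here is a minimal sketch in Python. It assumes, purely for illustration, that both the lens and the sensor have Gaussian-shaped MTF curves, each calibrated to fall to 0.2 of its low-frequency value at 50 lp/mm; real MTF curves have other shapes, so the exact crossover frequency will differ, but the "product of the components" behavior is the same.

import math

def gaussian_mtf(f, f_limit, criterion=0.2):
    # Illustrative MTF model: 1.0 at zero frequency, falling to
    # `criterion` at `f_limit` (lp/mm).
    a = -math.log(criterion) / f_limit ** 2
    return math.exp(-a * f * f)

def system_mtf(f):
    # The MTF of the whole chain is the product of the component MTFs.
    return gaussian_mtf(f, 50.0) * gaussian_mtf(f, 50.0)

print(gaussian_mtf(50, 50))   # 0.2  - each component at its own limit
print(system_mtf(50))         # 0.04 - the chain, far below the criterion

# Find the frequency at which the chain itself falls to 0.2
f = 0.0
while system_mtf(f) > 0.2:
    f += 0.1
print(round(f, 1))            # about 35.4 lp/mm, not 50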
Can we figure this out from the MTF curves of the lens and the sensor? No. The MTF curve of the lens does not show us how the MTF varies with spatial frequency (except at two arbitrary frequencies), and we don't usually receive MTF curves for the sensor.
*************
2. Resolution of the sensor
We have a tendency to think that if our sensor has 2400 pixel rows across its height, its resolution is 1200 line pairs per picture height in the vertical direction. And if the sensor height were 15 mm, we would equate that to a resolution of 80 lp/mm. But that is not so.
Consider a test target of horizontal lines, alternating black and white, imaged such that on the sensor there are 80 line pairs per mm - that is, 160 lines per mm, one line for each row of pixels. Will the sensor capture its image? Well, if the image happens to fall so that each line is centered on a row of pixel detectors, yes. If it happens to fall so that the lines straddle the boundaries between rows of pixel detectors, no.
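For anyone who wants to see that alignment effect in numbers, here is a small illustrative simulation (it simply models each pixel row as averaging the light falling across its pitch, with no lens blur):

import numpy as np

def sample_target(phase, n_rows=8, oversample=1000):
    # Black/white line target at the "geometric" resolution: each line is
    # exactly one pixel row wide.  `phase` shifts the target relative to
    # the pixel grid, in units of the row pitch.
    readings = []
    for r in range(n_rows):
        x = np.linspace(r, r + 1, oversample, endpoint=False) + phase
        target = (np.floor(x) % 2 == 0).astype(float)   # 1 = white, 0 = black
        readings.append(round(float(target.mean()), 2))  # the row's output
    return readings

print(sample_target(phase=0.0))   # 1.0, 0.0, 1.0, ... : the lines are resolved
print(sample_target(phase=0.5))   # 0.5 everywhere: no contrast remains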
Of course, in real life, we are not usually concerned with images that are a series of equally spaced black and white lines. But this concept still applies.
Years ago, in connection with the development of television (scanned-image transmission), Kell investigated this situation and determined that, averaged over different types of material, with the orientation and alignment of the subject varying randomly, the actual usable resolution of the system would typically be about 75% of the resolution implied by the scanning-line pitch (which plays the same role that pixel pitch does in our digital cameras).
Thus, our camera with 2400 rows of pixels across the height of the sensor might in fact develop an actual resolution of 1800 lines per picture height (900 lp/ph).
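In round numbers, for the hypothetical sensor used above (2400 rows over a 15 mm height), assuming the typical Kell factor of 0.75:

rows = 2400           # pixel rows across the sensor height
height_mm = 15.0      # sensor height
kell = 0.75           # typical Kell factor

geometric_lp_mm = rows / 2 / height_mm      # 80 lp/mm "geometric" resolution
actual_lines_ph = rows * kell               # 1800 lines per picture height
actual_lp_ph = actual_lines_ph / 2          # 900 lp/ph
actual_lp_mm = actual_lp_ph / height_mm     # 60 lp/mm

print(geometric_lp_mm, actual_lines_ph, actual_lp_ph, actual_lp_mm)
# 80.0 1800.0 900.0 60.0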
*************
We find both of these issues at work when we examine the resolution figures reported by, for example, DPR in their camera reviews. For the EOS 20D, DPR reports a resolution in the vertical direction of 1650 lines per picture height (54.6 lp/mm). The "geometric" resolution (that is, based on pixel pitch) is 2336 lines per picture height (77.3 lp/mm). The actual resolution is thus about 70% of the "geometric" resolution, reflecting the impact of the Kell effect as well as the influence of the lens used for the testing.
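The same arithmetic applied to DPR's published figures (the sensor height of about 15.1 mm used here is simply the value implied by DPR's own lp/mm conversions, not a specification I have verified):

measured_lines = 1650     # lines per picture height, vertical, as reported by DPR
geometric_lines = 2336    # pixel rows in the vertical direction
height_mm = 15.1          # approximate sensor height implied by the lp/mm figures

print(round(measured_lines / geometric_lines, 2))   # 0.71 - about 70%
print(round(measured_lines / 2 / height_mm, 1))     # 54.6 lp/mm
print(round(geometric_lines / 2 / height_mm, 1))    # 77.4 lp/mm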
Best regards,
Doug