Variation Facts and Fallacies
What is normal variation?
When we at Lensrentals started using computerized analysis (we use the Imatest package) to assess the MTF of large numbers of lenses, it became obvious that there is sharpness variation among copies. The example below shows the groupings for several different 100mm lenses: six samples of the Carl Zeiss Makro-Planar T* 2/100, eight samples of the Canon EF 100mm f/2.8 Macro USM, and twenty-four samples of the Canon EF 100mm f/2.8L Macro IS USM.
|This graph plots center sharpness against a sharpness figure for the whole lens (measured in line pairs per picture height for unsharpened RAW images)|
Two things should be apparent from this chart. First, there is clearly variation between the different copies of each lens. On average the Canon 100mm f/2.8L IS lenses are the sharpest, but some copies are sharper than others, and some copies of the other two lenses are sharper than some copies of the Canon. You can see how this might lead two different reviewers to hold slightly different opinions on which 100mm macro lens is the best.
Second, a truly 'bad lens' is an outlier, and you can see one far down on the lower left. The difference between a soft or bad copy and the main group is very large; the copy-to-copy variation among the other lenses is comparatively minor. To give an idea of how bad that bad copy is: our techs could identify it by looking at JPEGs at 50% on a computer screen, but they'd be unlikely to spot it in a web-sized image.
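Picking the bad copy out of a batch of sharpness scores is, in effect, simple outlier detection. As a rough illustration only (not Lensrentals' actual method), here is a sketch in Python using a robust median-based z-score; the scores and the threshold are hypothetical:

```python
from statistics import median

def flag_soft_copies(scores, threshold=3.0):
    """Flag lenses whose sharpness score falls far below the group.

    Uses a robust z-score based on the median absolute deviation,
    so the bad copy itself doesn't drag down the baseline it is
    being compared against.
    """
    med = median(scores)
    mad = median(abs(s - med) for s in scores)
    if mad == 0:
        return []
    # 1.4826 scales MAD to be comparable to a standard deviation
    return [i for i, s in enumerate(scores)
            if (med - s) / (1.4826 * mad) > threshold]

# Hypothetical center-sharpness scores (lp/ph); one clear soft copy.
scores = [740, 755, 760, 748, 752, 735, 540, 758]
print(flag_soft_copies(scores))  # flags the soft copy at index 6
```

The point mirrors the chart above: the good copies differ from each other by a few percent, while the bad copy sits many deviations below the group and is trivial to detect.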
A second point about copy-to-copy variation must be made: the data above were for all of those lenses on one single camera body. As I've already explained, when you change the body the results change - the overall pattern will look the same, but each lens result will be slightly different. The sharpest lens on test camera A may well not be the sharpest on test camera B.
The example below shows two things. The green triangles represent multiple different samples of a lens shot on the same camera (notice that the range on both axes has been changed to illustrate the difference more clearly). We then took one of those lenses (circled) and shot it on 11 other camera bodies; its performance is represented by the red boxes and blue diamonds. As you can see, there is significant variation in the results from the different cameras.
|A single lens (circled) was re-tested on eleven camera bodies (red boxes and blue diamonds) from two different serial-number series. All were brand new from the box.|
As our name suggests, Lensrentals is a lens rental house, so I'd like to clarify one point: the variations I've been talking about don't occur because we're looking at used lenses. When we first started testing lenses, we carefully compared brand-new lenses with our stock lenses to make sure our quality assurance was keeping the stock lenses in good shape.
We buy new lenses in significant quantities, so we usually start testing a given lens by examining a dozen new copies right out of the box, and then move on to our rental copies. The graph below compares a group of new-out-of-the-box lenses with a group from rental stock as an example. In the graph immediately above, however, all of the camera bodies were new stock.
We’ve only found one lens so far that did show a difference with age, but results from that aren’t used in any of these illustrations (I have discussed this in detail in this article).
To give some idea of how small the difference between various lenses really is, we can examine the variation of a single camera's autofocus with a single lens. If you shoot dozens of images and depend on autofocus to be accurate, you probably never notice, unless you look very carefully, that the camera focuses slightly differently each time you autofocus on the same subject. That's why, for testing purposes, the data points I've shown were obtained using the best possible live-view manual focus for each lens.
But let’s look at the graph below, which consists of multiple autofocus shots made with one lens on one camera. First, I let the camera autofocus the lens once, turned off autofocus, and took 4 consecutive shots, represented by the blue diamonds in the graph. In theory they should be identical, since nothing was changed. In reality, they are just a little different, which demonstrates the amount of shot-to-shot variation in the testing method - very little, in short.
Without changing anything, I then moved the focus ring to one extreme or the other, let the camera autofocus again, took a single shot, and repeated that process six times. Those six shots are the red boxes. You can see the camera autofocuses the shot a bit differently each time (and this is with center focus point on a star chart: a near ideal situation). Finally, I focused the lens using live-view magnified focusing four consecutive times (represented here by the green triangles).
|A single lens shot repeatedly with no changes (blue), manually focused 4 separate times (green), and autofocused 6 times (red).|
The scale in this series is changed again to illustrate the point clearly. It should be obvious that the variation of the camera's autofocus system on repeated shots with a single lens is about 1/3 as great as the variation between different lenses. Autofocus microadjustment would improve the average AF performance to nearly match the live-view focusing, but it wouldn't change the shot-to-shot variation: it would shift the red cluster towards the green results, but the cluster would remain just as spread out.
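The comparison above boils down to comparing the spread (standard deviation) of each group of shots. Here is a minimal sketch with hypothetical sharpness readings; only the relative spreads matter, not the numbers themselves:

```python
from statistics import mean, pstdev

# Hypothetical sharpness readings (lp/ph) mimicking the three groups:
shot_to_shot = [760, 758, 762, 759]             # blue: focus untouched
autofocus = [742, 755, 730, 748, 736, 751]      # red: AF refocused each time
live_view = [761, 759, 763, 760]                # green: magnified live-view focus

for name, data in [("shot-to-shot", shot_to_shot),
                   ("autofocus", autofocus),
                   ("live view", live_view)]:
    print(f"{name:>12}: mean {mean(data):.1f}, spread {pstdev(data):.1f}")
```

Microadjustment would raise the mean of the autofocus group toward the live-view group, but its spread, the shot-to-shot variation, would stay the same.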
So How Big is This Variation in Real Life?
That’s really the point of all this. The variation I’ve described is easy to detect using modern computerized optical analysis. And we’ve seen that the variation between a truly bad lens and the group of acceptable lenses is very large. But how big is the variation between the different acceptable copies?
The best answer I can give is that it might be detectable by pixel-peepers but, except in rare cases, not by working photographers. Lensrentals has six full-time pixel-peepers on staff (the inspection technicians), armed with a large array of well-lit charts to analyze sharpness, aberration, back- and front-focus, you name it. They can pick out a bad lens fairly easily, but they can't tell the difference between the best and worst images in the groups above.
But there’s a more scientific way to look at things than 'our techs said so.' The Subjective Quality Factor (SQF) is a measurement developed by Ed Granger and K.N. Cupery in the 1970s for Kodak and used by Popular Photography for their lens reviews. Basically, SQF applies a mathematical formula to the MTF data from the lens (which we get from these tests) to predict with good accuracy how sharp a print will be perceived to be at various sizes and viewing distances.
I’m not going into detail about SQF (for a more thorough discussion, see Bob Atkins' excellent article or the references below). The important part is that several experts have shown an SQF difference of less than 5 for a reasonably sized print is basically not detectable by human vision.
It’s a simple matter to have the computer calculate the SQF from the data we’ve already obtained in our testing. We arbitrarily calculate it for 8 X 10 inch print size, but the SQF difference would be the same for any reasonably sized print.
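For readers curious what that calculation looks like: a common simplification of Granger's formula averages the MTF over log spatial frequency within the range the eye weights most heavily. The sketch below is an illustration only; the frequency limits and the sample MTF curves are assumptions, not our test data:

```python
import math

def sqf(mtf_curve, f_lo=0.5, f_hi=2.0, steps=200):
    """Approximate the Subjective Quality Factor.

    A common simplification of Granger's formula: average the MTF
    over log spatial frequency between f_lo and f_hi cycles/mm on
    the print (roughly 3-12 cycles/degree at a normal viewing
    distance), scaled to 0-100. mtf_curve maps cycles/mm -> MTF (0..1).
    """
    log_lo, log_hi = math.log(f_lo), math.log(f_hi)
    total = 0.0
    for i in range(steps):
        # midpoint rule in log-frequency space
        u = log_lo + (i + 0.5) * (log_hi - log_lo) / steps
        total += mtf_curve(math.exp(u))
    return 100.0 * total / steps

# Hypothetical MTF curves for a "best" and a "worst" acceptable copy.
best = lambda f: math.exp(-0.25 * f)
worst = lambda f: math.exp(-0.30 * f)
print(round(sqf(best) - sqf(worst), 1))  # a difference well under 5
```

Because the weighting happens in log-frequency space, the same SQF difference applies across reasonably sized prints, which is why the arbitrary choice of an 8 X 10 inch print doesn't affect the conclusion.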
The graph below is for the same lens that I used for the autofocus example above, plus I added several other copies of that lens tested on the same camera. Then I had the computer calculate the SQF data for the best and worst copies.
|SQF for 8 X 10 inch print taken from selected MTF points.|
As you can see, the variation in SQF is less than 5, meaning that, theoretically, you wouldn't be able to detect the difference in a print. The difference is real and can be detected by Imatest, or by a careful pixel-peeper armed with a few test charts, a large monitor, and too much time on their hands. But it wouldn't be significant enough to make an obvious difference in a print.
So What’s the Point of All This?
The main points are fairly straightforward:
1) Every lens and every camera exhibits slight variations relative to its twins that are detectable, but rarely significant.
2) Variations that wouldn’t make the slightest difference in a print may seem quite different when the numbers are presented in a lens review. And, just because one copy of lens X is sharper than one copy of lens Y, doesn’t mean they all are, or that they all will be in your camera.
3) Occasionally, an acceptable lens and an acceptable camera combine their variations in a way that makes them unacceptable together. The lens may be fine with a different camera, and the camera fine with a different copy of the lens.
4) Really bad, soft, out-of-acceptable-range lenses do occur. They are fairly rare, though, and easy to detect.
5) Camera autofocus is more variable and less accurate than you think.*
* Before you go all Major League Fanboy about the superiority of your camera’s autofocus system: autofocus variation exists in every camera from every brand we’ve tested. Want to prove it? Put a wide aperture prime on your camera, mount it to a tripod, and focus on something in the middle distance. Now move the focus ring to infinity, let the camera autofocus and note exactly where it ends up on the distance scale. Now turn the focus ring to near focus, let the camera autofocus on the subject at middle distance, and note the number on the distance scale. They will be slightly different.
When you buy a lens, and assuming your camera allows you to, you should microfocus adjust it. If you do it properly, using a sensible focus distance, it really does make a difference. Then do some very basic tests to make sure it functions properly, and go take some pictures. If you like the pictures it makes, then keep it.
Oddly enough, the conclusion I’ve reached from several years of dedicated pixel-peeping and lens analysis is this: Trying to find exactly the sharpest copy of the sharpest lens is a fool’s errand: you’ll be looking for something that doesn’t exist.
Roger Cicala, Lensrentals.com
For more of Roger's thoughts on lens and camera variation - and much, much more, visit his blog.
Feb 5, 2014