> I'm afraid we're comparing apples and pineapples. Using flash, levels adjustments, or high in-camera sharpening gives misleading results, as I'll show here.

But underexposure may also give misleading results. For example, I can count more individual strands of hair over the model's left eyelid in your second photo (with flash) than in your first one (natural light), and I think it's largely because the exposure is better in the second shot. (I'm assuming that one image was made soon after the other and that the hair was really the same.) So I would say that the flash brought out the sharpness your lens was capable of producing and that the quality of the lens had been "masked" by underexposure with the settings used to make the natural light image.
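To make that "masking" point concrete, here is a minimal sketch of the effect (my own illustration, not anything measured from the photos discussed here). It scores a synthetic image with the variance-of-the-Laplacian focus metric and shows the score collapsing as identical detail is underexposed; NumPy/SciPy and the synthetic scene are assumptions for the demo only.

import numpy as np
from scipy.ndimage import laplace

def sharpness(img):
    """Variance of the Laplacian: a common single-number sharpness proxy."""
    return laplace(img.astype(float)).var()

rng = np.random.default_rng(0)
# Synthetic "scene": a soft gradient, one hard edge, and fine texture in [0, 1].
scene = np.tile(np.linspace(0.2, 0.7, 256), (256, 1))
scene[:, 128:] += 0.15
scene = np.clip(scene + 0.03 * rng.standard_normal(scene.shape), 0, 1)

for stops in (0, -1, -2, -3):
    # Underexpose the identical scene by whole stops, then quantize
    # to 8 bits as an in-camera JPEG would.
    img8 = np.round(np.clip(scene * 2.0**stops, 0, 1) * 255)
    print(f"{stops:+d} stops: sharpness = {sharpness(img8):.1f}")

The detail in the scene never changes, but because the Laplacian is linear in intensity the score drops by roughly a factor of four per stop, which is exactly the kind of thing that can make a well-exposed flash shot look like it came from a better lens.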
On the other hand, flash could also enhance the impression of "sharpness" (even apart from any effect on overall exposure) by creating small shadows in a portrait, for example along skin lines or pores, especially if the flash is off-camera. So I think you're right that it isn't completely fair to compare flash and natural light images when judging lens sharpness.
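The same metric also rewards those micro-shadows. In the sketch below (again mine; the unsharp mask is just a crude stand-in for raking, off-camera light, and the flat "skin" texture is invented for illustration), boosting local contrast raises the sharpness score with no change to the optics at all.

import numpy as np
from scipy.ndimage import gaussian_filter, laplace

def sharpness(img):
    return laplace(img.astype(float)).var()

def add_micro_shadows(img, amount=1.0, radius=2.0):
    # Unsharp mask: exaggerate local contrast, much as raking light
    # deepens the tiny shadows along skin lines and pores.
    return np.clip(img + amount * (img - gaussian_filter(img, sigma=radius)), 0, 1)

rng = np.random.default_rng(1)
skin = np.clip(0.6 + 0.03 * rng.standard_normal((256, 256)), 0, 1)  # flat-lit texture
print(f"flat light:   {sharpness(skin * 255):.1f}")
print(f"raking light: {sharpness(add_micro_shadows(skin) * 255):.1f}")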
But then what about the quality and direction of natural light? Direct sunlight from an angle also creates shadows, so should comparison pictures be limited to those made on overcast days? Or maybe the light should just be "standardized" with respect to quality and direction? But then, as more conditions are imposed, "real world" tests become more and more controlled, and we might as well go back to photographing two-dimensional test patterns in a laboratory.
Of course the reality is that many factors besides the particular lens can influence image sharpness, as others have pointed out. We haven't even mentioned the camera: for example, I think unsharpened raw images from a D70 often look sharper than those from a D100.
--
Jim Kaye
PBase supporter