7. “By 2002 or 2003, digital cameras will overtake film in
resolution.” Hiroshi Ono
Well, that depends on what kind of film you're talking about. If you
mean the cheap low-res 110-type film, then I agree. But to compete with
FujiChrome Velvia will take another decade - AT LEAST.
Velvia may have the advantage over the films shown in this D# article, but
the article still gives an intuitive glimpse into the differences between digi and chemi.
http://www.d-sharp.net/miscall/DC/compair/d1/
-iNova
These pix make film look like crap, I will admit. I do not know what the
problem is - is it their camera support (did they use the tripod?),
focusing and depth of field (was it adequately consistent?), or scanning.
I would say these pix do not do an ASA 200 film a whole lot of justice,
and I am convinced a caring and competent operator should get sharper
results from that film.
It's like owning a very sharp lens: if it isn't focused right, it won't
deliver its full sharpness, and then the technique, not the lens, is to
blame for the less-than-thrilling results.
To me the lower shot looks slightly but clearly out of focus. In addition,
the film shots are somewhat larger than the digital ones (which is
understandable given the different intrinsic resolutions of the different
digitizing methods), but the size difference should nonetheless be taken
into account.
For another look at film resolution with a rather better scanner, see
http://www.users.uswest.net/~rnclark/scandetail.htm
FJM
The D# pictures don't show out of focus effects or camera movement. They were scanned by a Nikon LS-2000 if I'm reading the caption right. I've never encountered the beast but it does seem to resolve the grain. Look at the blue sky, for instance.
The grain of the 200 is larger and clumpier than the 100 and both show well. I think there wasn't a lot more in those 35mm camera images to be brought out.
Clark's stuff is very interesting. It sure proves that you can get more detail from a good scan, but my own experience with PhotoCDs is more like the D# results than like Clark's. We can't all own a drum scanner, so the practical equivalent is more like the $1000 film scanner.
If that affordable gizmo won't make our film efforts much better than a 3 megapixel digital camera, why bother? No processing flaws, emulsion dings, scratches, fingerprints, dust or spit on a digital picture (don't ya hate it when you were just trying to blow off that eyelash and...).
I believe there are errors in this:
"The human eye resolves about 6 to 8 lines per mm from a reading distance of about 10 inches. A line means light "up, down, up, down", not just two adjacent
pixels. That is 12 to 16 pixels/mm or 300 to 400 pixels/inch at about 10 inches. These are image pixels, not printer pixels. An ink jet printer needs several
times more printer pixels (perhaps only a couple more with ink blending technology)."
A "line pair" is one black, one white. One black "line", one white "line". (Digital cameras are often expressed in ppi as if they were equivalent but they're not, by half.)
An old rule of thumb I've had for years is this: the human eye resolves 50 line pairs per degree at 20/20 maximum, and that's about 250 pixels per inch at ten inches in front of your eye. I'm sure 400 pixels crammed into that inch will look fine; just don't expect to actually READ the fine print. My own vision, corrected, bears this out. Still, some people may be able to resolve detail this small, but how often do you see the dots of a 300-dot screen in a fine litho at 10 inches away?
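For what it's worth, here's the bare unit conversion behind the quoted 300-400 figure - a minimal Python sketch, assuming 2 pixels per line pair (the doubling we're both describing) and 25.4 mm to the inch. Which acuity number you feed it is the part under dispute, so it's just a parameter here.

    MM_PER_INCH = 25.4

    def print_ppi(line_pairs_per_mm):
        # Pixels per inch needed on the page to carry a given on-page
        # detail frequency, at 2 pixels per line pair.
        pixels_per_mm = 2 * line_pairs_per_mm
        return pixels_per_mm * MM_PER_INCH

    for lp in (6, 8):
        print("%d lp/mm -> %.0f ppi" % (lp, print_ppi(lp)))
    # 6 lp/mm -> 305 ppi, 8 lp/mm -> 406 ppi, i.e. the quoted 300-400 range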
He's right about the 80 line pairs per mm of film. Another rule of thumb I've been carrying says Ektachrome 100 has about 50 line pairs per mm, tops. Film is measured right out to the limits of visible change, at the point where an MTF curve hits zero. A better measure might be at some specified contrast minimum, like 25% or so, wherever your eye starts to give up on fine detail. Digital imaging systems hit a hard wall at the pixel size but do better than film systems up to that wall. It's why digital IMAX computer graphics can get away with being rendered at 'merely' 3072 pixels wide, and they are.
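To put those lp/mm rules of thumb in pixel terms, here's a rough back-of-envelope sketch, assuming a standard 36 x 24 mm frame and, again, 2 pixels per line pair; the 50 and 80 lp/mm figures are the rules of thumb above, not measurements of mine.

    FRAME_W_MM, FRAME_H_MM = 36.0, 24.0   # standard 35mm frame

    def frame_pixels(lp_per_mm):
        # Pixel dimensions implied by an on-film resolution figure,
        # at 2 pixels per line pair.
        w = int(round(2 * lp_per_mm * FRAME_W_MM))
        h = int(round(2 * lp_per_mm * FRAME_H_MM))
        return w, h

    for lp in (50, 80):
        w, h = frame_pixels(lp)
        print("%d lp/mm -> %d x %d px (about %.0f megapixels)"
              % (lp, w, h, w * h / 1e6))
    # 50 lp/mm -> 3600 x 2400 (~9 MP); 80 lp/mm -> 5760 x 3840 (~22 MP)

Of course that assumes detail right out at the MTF-zero limit survives the whole chain, which, as above, it mostly doesn't by the time it reaches an affordable scan.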
Clark makes reference to what I call the 300 ppi myth. Simply stated, it says that you need 300 pixels per inch on the paper for a picture to look photographic. It sure will, but so will one at 2/3 that pixel count, and some at even less, depending on the subject, lighting, photographic intent, etc. You need the 300 dots on the screen to take the dot detail out of the visible range; you don't need to go beyond the visible to have a good photo. Or an excellent one, either.
The DOT SCREEN on a super-fine lithograph is only 300 dots per inch. Since we can't see those dots on that screen at all with our eyes, do we still need to put dot-sized resolvable detail into that screen to make an image look high quality? Sure, it WILL look high quality, don't get me wrong; it's just overkill. That last 2% of perceived "quality" just costs your budget 66% more storage space and handling difficulty at every step.
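To make the overkill arithmetic concrete, here's a back-of-envelope sketch, assuming a hypothetical 8 x 10 inch print and uncompressed 24-bit pixels; the exact percentage depends on where you put the 'looks photographic' baseline.

    BYTES_PER_PIXEL = 3   # uncompressed 24-bit RGB

    def print_file_mb(width_in, height_in, ppi):
        # Uncompressed size of an image sized for a given print and ppi.
        pixels = (width_in * ppi) * (height_in * ppi)
        return pixels * BYTES_PER_PIXEL / 1e6

    low, high = print_file_mb(8, 10, 240), print_file_mb(8, 10, 300)
    print("240 ppi: %.1f MB, 300 ppi: %.1f MB (%.0f%% more)"
          % (low, high, (high / low - 1) * 100))
    # Storage grows with the square of the ppi, so a modest bump in ppi
    # costs a disproportionate amount of space at every step.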
Film is far from dead but the 35mm variety is being encroached upon with great rapidity.
-iNova