starting the squabble anew, film vs. digital

One problem with film is that its actual resolution is indeterminate. When the photosensitive crystals are applied to the paper they are not lined up in neat rows as in a CCD; the crystals are randomly distributed. The result of a random distribution is clumpiness: some areas have extra crystals and other areas have fewer or none. This effect is clearly visible when looking at a photo of a blue sky. Some of my 35mm prints have really blotchy skies.
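
(Aside: a minimal sketch of why purely random grain placement looks clumpy, assuming grains simply land independently and uniformly; the grain and patch counts below are made-up illustration values, not film data.)

```python
# Sketch: random placement produces visible density fluctuations.
# Assumes grains land independently and uniformly (a Poisson-like process).
import random

random.seed(0)
GRAINS = 100_000                      # grains scattered over a unit square
PATCHES = 10                          # 10 x 10 grid of equal patches
counts = [[0] * PATCHES for _ in range(PATCHES)]

for _ in range(GRAINS):
    x, y = random.random(), random.random()
    counts[int(y * PATCHES)][int(x * PATCHES)] += 1

mean = GRAINS / (PATCHES * PATCHES)   # 1000 grains per patch on average
lo = min(min(row) for row in counts)
hi = max(max(row) for row in counts)
print(f"mean per patch: {mean:.0f}, observed range: {lo}..{hi}")
# Patch counts typically spread over roughly +/- 3 * sqrt(1000), about 95 grains,
# so patch-to-patch density varies by several percent: the "blotchy sky".
```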

I suppose the actual resolution of film would be the resolution at which this clumpiness is averaged out and no longer visible. In a picture of just a blue sky the actual pixel equivalent may be only about 1M. It would be better in areas of colours other than blue and with higher detail.

Perhaps the real question to ask is what is the real pixel resolution of
35mm?
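
(Aside: for anyone who wants the back-of-envelope arithmetic behind that question, here is a rough sketch. The resolvable line-pairs-per-mm figure depends heavily on film stock, lens and subject contrast, so the 25/50/80 lp/mm values below are only illustrative assumptions, not measurements.)

```python
# Back-of-envelope "pixel equivalent" of a 35mm frame.
# Assumes a nominal 36 x 24 mm frame and that the film/lens combination
# resolves R line pairs per mm; Nyquist needs about 2 pixels per line pair.
FRAME_W_MM, FRAME_H_MM = 36, 24

def pixel_equivalent(lp_per_mm):
    px_w = 2 * lp_per_mm * FRAME_W_MM
    px_h = 2 * lp_per_mm * FRAME_H_MM
    return px_w * px_h

for lp in (25, 50, 80):               # illustrative resolving-power guesses
    print(f"{lp:2d} lp/mm -> about {pixel_equivalent(lp) / 1e6:.1f} megapixels")
```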
And how much longer will we care?
Well put. From Francis C.F.P.'s test shots it seems that 35mm is perhaps
25-35% higher in its detail count on fine line material. Not enough to
care. A good digital image will pound the koolaid out of a marginal 35mm shot, which is as it should be.

-iNova
 
Well, Joe. That got everyone quiet.
Cute, but no cigar. The Playboy Centerfold has been digitally
manipulated for a looooong time. Maybe not at the film plane, but in the
Photoshop.

But is the real question about a 3-page gatefold image? Count the
number of those that appear in print every month worldwide. Couple of
hundred, maybe? Let's be generous and say a thousand.

Now count the number of photographs reproduced in all publications in the
same time period.

Now count the number of those that are digital.

Bye, bye film. You had a great run. I don't expect to see you around
much after 2005.

-iNova
Cool... I'm glad SOMEONE got my original point.
 
My question:

How is a Nikon D-1 (w/o any lenses) cheaper than the average good medium format film camera? Not to mention a D-1, a Nikon 990 PLUS a Sony F505?

Mike

Paul Caldwell wrote:
. . .

I have taken way too many shots from either a D1 or 990 or Sony 505 up to 11 x 14 . . . I stopped at medium format and feel that it is still the best solution, but still more than I want to spend.
 
Yes Peter,

If your final output is going to be digital then of course you are wasting a step to have film as an intermediate. I have to admit that I cannot differentiate a quality scan from quality digital camera output on my monitor. Let us see the hard copies (and I am not talking hard copy of a scanned negative output onto an inkjet).

I love digital cameras, and I hold techies in the highest regard. However, I believe my own eye over some technical explanation for why the flat monochromatic digital sky is actually superior to the subtle detailed (even if not 100% accurate) sky that shows up in my prints and slides.

While I seldom print larger than 8X10, I often crop before printing that size. Digital sky starts to show its heritage there.

P.S. Is that APS film? I don't think they make Velvia in APS.
I kind of get a kick out of all the techie explanations for why I should throw my film camera away. Rather than reading Discovery, why not perform a simple test? Get a good digital camera and a good film camera (with good film) and take a picture of the sky. Print both images at 8X10. Which has more subtle variation?

I love my S-10, and ALWAYS have it with me. I even take it when I go out with the express purpose of shooting film. I have shot almost 3,000 MB worth of digital pics in the last year (and about 150 rolls of film).

Nobody is more eager than I am for digital to match film (in some ways it is already better). When a D-1 costs a thousand dollars, I will likely put my film camera away, as the D-1 appears to have the amount of control I desire, and the amount of quality my eye finds pleasing.
Which is film and which is digital:

[sample images not reproduced in this archive]

You have ten seconds.

Is that your final answer?

If you got it right, you win.

-iNova

Hint: The clear sky. Look for clarity in the sky.
Now what would film grain look like in a clear sky?

Hint 2: Three of them are film.

Hint 3: Run the cursor over the image and the data line at the bottom of your browser may contain another clue.
 
From the top, #1, #2, and #4 have the look of film grain in the sky (~ 20 sec.). Were we supposed to vote?

Joe Kurkjian
 
Of course, we're only supposed to be marveling at how the digital skies are ever so much more grain-free than your unspecified and unknown APS film choice(s) and magnifications, but I also couldn't help but notice a bit more saturation in the film renditions - for example, detectable in the diagonally woody (?) stuff above the "KENWOOD". But since we're playing YOUR game, it's not known if that's actually closer to truth or not. But then, no photography - film or digital - is truth, only a representation. It's all what the photographer would rather see.

Also detectable in the three film versions is greater modeling and resolution in the gridded slatting below the "KENWOOD". Although, I admit, the apparent greater resolution could be simply because the film versions are enlarged somewhat - perhaps to emphasize the overall grain. The modeling may not be due to the greater display size of the film, however; there's not even a hint of modeling in the flat digital renditions of the same area.

And do my eyes deceive me, or is the D1 really getting the left face of the gridded region below the "KENWOOD" subtly WRONG? Perhaps due to the CCD pixels beating asynchronously against the grid. But then I'm looking at all this on a laptop LCD, not the best way to critically view anything. But, oh so digital...

Like I seem to recall saying, seems like ages ago, both film AND digital have strengths and weaknesses. Deal with it.
 
Well, how long anyone cares is entirely up to them - I'd rather be taking
pictures, and using whatever medium my mood requires at the moment (and
whatever medium consumer society allows to be available to me).

On a couple of your other points...

Randomness in an imaging material is not necessarily a bad thing. It can allow a lot more realism (and be a lot easier for the eye and mind to tune out, if visible) than a nicely organized Cartesian substructure.

As far as the pixel resolution of 35mm film, if you are honestly
interested, and you don't mind doing a bit of open-minded reading, you
can find one assessment here:

http://www.luminous-landscape.com/pixels_vs_film.htm
Interesting article. Like you said, it is a matter of what you like to see.

With regard to randomness: human perception, on the most basic level, does not like randomness. The human visual circuit is specifically wired to reject randomness and produce order. This must be so for us to have survived to our present state. In fact, human perception is so keen on imparting its own opinion on reality that it is capable of adding, deleting, or rearranging the images of objects that our eyes receive. In real time, our heads contain more powerful photo editing software than every copy of PC-based photo editing software combined.

In World War II there were ships painted with jagged lines and highly contrasting colours. Easy to see on the high seas... No. The subconscious image processing would eliminate the ship from conscious awareness because the image did not make sense. How could such a jagged image possibly be present on a rolling seascape? A.k.a. the 'somebody else's problem field' of The Hitchhiker's Guide to the Galaxy.

If human perception is so resistant to inconsistency, then why would randomness be perceptually better? If you are just going to tune it out, then why have it there? Surely in some instances an image would look better with graininess - historically true in many instances. Wouldn't you rather have the ability to decide when that was appropriate?

With film I am stuck with what I get. With digital I can use the "SIMULATE CLUMPY RANDOMNESS WITH UNCERTAIN RESOLUTION AND EDGES" filter if I like. And I get much better colour with my own printer than Kodak or Fuji ever gave me in film.

Randomness on the quantum-mechanical level is a necessary part of the nature of the universe, but on the human scale the mind does not perceive things that way. Why would I try to force it to? Why not work along with the mind's perceptual tendencies and achieve greater clarity in imaging?
 
I won't say I entirely disagree with you, I do not at all. And as the link you read stated, with respect to toleration of film graininess "your mileage may vary". But I also know that if a person shoots with Fuji 100 ASA Provia 100F slide film, they will not see any discernible grain - not even at large projections. That is what I shoot with - when I am shooting film. And one small advantage of using film is that some VERY fundamental picture taking characteristics can be changed by using the contents of a different yellow, or green, or ... box.

But I don't really want to fall into the trap of saying film is better than digital or vice-versa. The honest truth is (though I might get some small enjoyment out of playing "devil's advocate") that each has some advantages over the other. However, what I heartily object to is someone stating that either film or digital is categorically better than the other. Ultimate image quality is made up of MANY factors: color purity, lack (or presence) of grain, color contour, color differentiation, resolution, contrast, latitude (shadow detail, highlight detail), saturation, lens optical factors, amount of flare, amount of undesired color effects, ... If digital eventually exceeds film in all ways, and it may well, it won't get there if people only desire it to excel in one attribute. The thing about light is it's made up of a spectrum of colors. Photography is still made up of a spectrum of technologies.

If people see digital as already being better than film - well, perhaps for their own uses and from their own personal experience, it is. And that's fine. But some people who state that should know better; they have the credentials to understand that digital's only better in some areas. And from MY experience, digital's not even close to replacing my slide projector. Not even close. Pixels aren't there yet, digital projection technology isn't there yet. Grain? I'm not even sure if I know the meaning of the word - certainly don't see any with Provia 100F or Velvia. And film IS a moving target, it keeps getting better and better.

Okay, one final word on randomness vs. regularity underlying an imaging medium. Inkjet printers use (a very fine, calculated) randomness. Newspapers use a very regular half-tone. That's the kind of thing I had in mind. If film grain appeared in a regular matrix, it would be ugly indeed. You likely know what I mean, but here's my best shot at an example:

http://www.sirius.com/~johnkyrk/conservatory.html

Shoot digital, I do. Shoot it all you want; that's okay. But it's a big world and film still has a place in the picture. Film may eventually die of neglect. But when that day comes, the world might truly be a poorer place. And if that day comes before digital has figured out how to exceed it in ALL areas...
 
It probably will be quite some time before digital exceeds film in all areas - especially, as you say, when compared to ASA 100 slide film. Projected slides offer the best viewing image compared to any other method of viewing today (IMO). For everyday use most are shooting 400 nowadays, which does have more noticeable graininess. Being able to set my own printing parameters is what I think sets digital way ahead of film - unless you are a professional with your own complete colour darkroom.

I may be a little bit zealous about digital right now because my digital camera so exceeded my expectations. Compared to my own prints, commercial film prints just don't cut it - and my printer (a BJC-6000) is not even one that is favoured for photo printing by most (apparently).

Perhaps film in the right hands with the right equipment is superior to digital for now, and probably for years to come. My feeling is that digital is a better concept for image recording, with the potential to see things as we never have before.
 
Interesting observations. But I can take all the same arguments as proof that randomness is better. Since the mind rejects randomness, it acts as a filter to remove noise that doesn't belong there. For example, to represent 50% grey, is a checkerboard better, or a random 50% dither pattern? One can argue that the dither pattern is better because it doesn't stand out and catch the eye as much.

I don't think there is a right answer. As long as you can see the dots, some people prefer regular patterns and some prefer dithering. It shouldn't be long before the dots are small enough that it won't matter any more.

gordon
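
(Aside: to make the checkerboard-vs-dither comparison concrete, here is a tiny sketch that renders a 50% grey patch both ways. It is only an illustration of the two patterns, not a claim about how any particular printer or film behaves.)

```python
# Render a 50% grey patch two ways: an ordered checkerboard and a random
# 50% dither. Both average to the same grey; only the noise pattern differs.
import random

random.seed(1)
SIZE = 16

checker = [[(x + y) % 2 for x in range(SIZE)] for y in range(SIZE)]
dither = [[1 if random.random() < 0.5 else 0 for _ in range(SIZE)]
          for _ in range(SIZE)]

def show(grid, title):
    print(title)
    for row in grid:
        print("".join("#" if v else "." for v in row))
    print()

show(checker, "ordered (checkerboard):")
show(dither, "random 50% dither:")
```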
 
Perhaps it depends how you feel about photography. Is it a record of a scene or an artistic rendition? Obviously photography has always been a bit of both. A good photographer has to be part scientist and part artist.

I don't think it is a stretch to say that the even distribution of pixels in digital makes it more of a record than an art on the small scale of pixels. This does not mean that the picture as a whole is not art - just on the level of recording the smallest bits of information on a scene. The choice to create art on the small scale can be made afterwards, and at the same time the record can remain intact.

Film on the small scale is pure art. There is no such thing as accurate placement of information, because the colour/light-sensitive crystals are randomly distributed. Colour depth is virtually analog because each crystal contains so many molecules. Crystal distribution is not even close to analog; rather, it is randomly placed crystal/pixels. This has worked because the individual crystals are too small to reveal their presence to the casual observer. What we usually perceive as graininess in film is actually clumps of crystals; the actual crystals are much smaller.

Film is often referred to as an analog medium, which is cited as the reason it is superior to digital. But what is analog? It is "of, relating to, or being a mechanism in which data is represented by continuously variable physical quantities", according to Merriam-Webster. In the real world such a thing does not exist. No matter how many levels of colour a crystal can represent, there are only so many molecules, and therefore there is a discrete and set number of levels of intensity that can be represented. Even a potentiometer in electronics is not fully analog - as it is adjusted, the device favours certain resistance values due to bumps in the contact surface. Sometimes you just can't tune it to the value you need because the wiper wants to settle in a valley just above or just below the required value. And this is the classic 'analog' device in electronics.

I think the real key to analog is: are there too many levels of variation for a human to perceive? In film the colour depth is clearly beyond human perception. It could be argued that digital too is analog, because the differences between levels of colour are imperceptible - at least to normal people. When I print a digital camera photo, people do not know it is from digital until I tell them. If someone has to be told to know it is digital, then it is as good as analog. And since there is really no such thing as analog, as-good-as-analog is analog. (Say that five times fast.)

This is an exciting time for digital - as exciting as the early days of film I am sure. With film I am seeing a peak. With digital I am seeing expanding possibilities and the ability to do properly myself what commercial film developers have failed to do for me in the past.
 
David S wrote:
First of all, "crystal" has no "h" in it.
This is an exciting time for digital - as exciting as the early days of
film I am sure. With film I am seeing a peak. With digital I am seeing
expanding possibilities and the ability to do properly myself what
commercial film developers have failed to do for me in the past.
While I agree that digital has expanding possibilities (due partly to its low current point, it's hard to go anywhere but up), I would strongly disagree that film has reached its peak. The latest issue of Discover has an article that talks about a new method of improving the resolution (or speed) of film by a factor of 5, with the potential of further improvements to come. Imagine film with the grain of current ASA 50 film and the speed of current ASA 400 film.

I agree that digital can give great convenience, and is sufficient for many things, but for my own purposes it just hasn't gotten there yet.

As for doing things properly myself, the only digital images I've seen that were indistinguishable from chemical prints WERE chemical prints, exposed from a digital file by LEDs onto photographic paper (and the machine is NOT consumer-level). Any inkjet/dyesub print that I've seen betrayed itself quite easily by a simple examination of light-coloured areas.

Chris
 
I once had a boss who said, "You know, I never met a programmer who could spell worth a damn, and you can't spell with the best of them."

I am a computer geek; I just didn't take the time to type it in a word processor and then cut/paste.
Spelling misyakes bother me a little ;-))
 
According to the article, after development has taken place, each crystal is either black (for b&w; 'full' primary color for color) or clear. One state or the other - it's binary. To make a halftone with a binary system, you have to have a matrix; 3x3 is about as small as people want to use (9 bits give 512 levels of gray).
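
(Aside, offered tentatively: 2^9 = 512 is the number of distinct on/off patterns a 3x3 binary cell can take, but patterns with the same number of dark crystals render as the same grey, so such a cell distinguishes only 10 grey levels. A quick check of that counting:)

```python
# Count what a 3x3 cell of binary (dark/clear) crystals can encode.
from math import comb

CELL = 3 * 3
patterns = 2 ** CELL                  # 512 distinct on/off arrangements
grey_levels = CELL + 1                # 0..9 dark crystals -> 10 visible greys
per_level = [comb(CELL, k) for k in range(CELL + 1)]

print(patterns, grey_levels)          # 512 10
print(per_level)                      # [1, 9, 36, 84, 126, 126, 84, 36, 9, 1]
```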
They both got strengths, they both got weaknesses, baby cakes. Deal with
it.
If you have any doubts about 3 MP cameras being equal to 35mm film, read Discovery mag, August 2000, 'The Chemistry of Photography', pages 24-27. Film is not analog, it's binary digital!! It takes at least a 3x3 matrix of crystals to give a 'gray scale'. That's a 9-to-1 division of the number of crystals to give a 512-level gray scale. Cells (pixels) in CCDs and CMOS sensors are analog, with more possible bits per cell (the A-D converter resolution).

The article is wrong about the 'best' digital cameras, but we can forgive them this time.

Yes, it is possible to capture a bright reflection off a thin wire that would show up on 35mm film, but you could not tell much from it because you would not have enough information about the wire, only that it was there. In the digital world, you wouldn't see the wire until the number of pixels went up 2 or 3 fold, but when you did, you would know more than just that it was there.

Yes, it will be another few years before digital surpasses film completely, and by a wide enough margin that film is left to history buffs, but it will come, and sooner rather than later. We are standing on the edge now!!!
The 3x3 matrix of crystals to give a gray scale? Oh no, don't confuse the halftone technology used in OFFSET printing & inkjet printing with the layered emulsion technology of film. There are a lot of books on both film & digital technology in the public library; read them in depth before drawing your own conclusions.

Francis C.F.P.
 
Having just received a new pair of glasses and worn them for several hours after wearing contacts for many years, I noticed all kinds of light distortions that I never remember having before. Purple wires/tree limbs/anything small is a problem with the lens, and it gets worse the farther from the center of the lens you go. It will happen with film as well as digital.

(the world is very warped with glasses and -6 diopter corrections)
With a slightly larger wire, or a tree limb, the 35mm camera would show
the actual color of the wire or limb.

With a digital camera, it'll probably be purple.

That's my biggest complaint. I've now tried 3 digital cameras (Epson
3000z, Olympus 2500L, Nikon 990), and I can't seem to take outdoor photos
that include treetops against a bright sky, without seeing blue/purple
leaves and limbs. Same thing for powerlines against a bright sky -- lots
of purple.

I don't plan on taking photos of thin wires or powerlines, so I couldn't care less about them. However, I do care about taking photos of landscapes, and the quality of the existing digicams is not up to my expectations.

When digital cameras get around this blue/purple problem, give me the same exposure/focus capabilities as 35mm cameras at a reasonable price, and let me print an 11x14 or larger print, then I'll be happy....

The 3.3MP images are detailed enough for me, with the exception of the
blue/purple fringing... That's my biggest complaint, and they all seem to
have this problem now.
 
David,

I guess I'm a confused person. I see photography as both an artistic rendition and a record of a scene. But if I have to choose, I'll give up some truth for a better picture (i.e. digital darkroom).

To add to the confusion, I also 'see' and analyze a picture in both the spatial and frequency domains. Given a flat field, 50% grey for example, an even distribution of pixels is actually not as good a record as randomly dithered dots at the same resolution. In the frequency domain, the evenly distributed dots have all the energy (noise) at one specific frequency. Your eye is very sensitive to that. A dithered pattern has the energy evenly distributed over a wide spectrum, but the magnitude of each peak is much lower.

In the spatial domain, it's a matter of taste. Some people prefer the look of uniform dots, others dithered dots. But natural scenes seldom contain uniform patterns at the maximum resolution, and the eyes do pick out repeating patterns that don't belong quite easily. So I'll argue that random dots are actually more accurate. A good example would be to take a photo and convert it to a bitmap in diffused dither vs. halftone. You may argue about which one looks better, but I think dither looks more natural. Evenly distributed dots and randomly dithered dots both contain noise; neither is more accurate, it's just a game of how to distribute the noise.

Sorry about so much geek talk. Hopefully the resolution will be high enough soon that none of this matters.

gordon
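
(Aside: anyone who wants to see the frequency-domain point for themselves can try something like the rough sketch below. It assumes NumPy is available and uses a single 1-D row for simplicity; it is illustrative only.)

```python
# Compare the spectra of an ordered 50% pattern and a random 50% dither
# along a single row. The ordered pattern piles its AC energy into one
# spike (the Nyquist bin); the dither spreads it thinly across frequencies.
import numpy as np

N = 256
rng = np.random.default_rng(0)

ordered = np.tile([0.0, 1.0], N // 2)             # 0,1,0,1,... alternating row
dithered = (rng.random(N) < 0.5).astype(float)    # random 50% dither

for name, row in (("ordered", ordered), ("dithered", dithered)):
    spectrum = np.abs(np.fft.rfft(row - row.mean())) ** 2
    share = spectrum.max() / spectrum.sum()
    print(f"{name:8s}: largest single bin holds {share:.0%} of the AC energy")
# Expected flavour: ~100% for the ordered row, only a few percent for the dither.
```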
 
Chris,

Firstly, keep the pedantry to yourself. All of us here have forgiven so many misspelled words and poorly assembled sentences from everyone, because the content is what we are here for. This is a cheap shot, Chris, and not worthy of this place.

But maybe you intended this in a more light-hearted fashion than it reads to me. If so, I resolve not to be offended and to forgive.

Anyway...

Wow, a fivefold increase in resolution would be incredible. Of course it would not likely have happened now if digital were not competing. The big advantage of film here is that you can improve resolution by using different film without having to replace the entire camera every time, as with digital (to the limits of the lens, of course).

When I said digital prints were indistinguishable from photographic prints, I hope I said, or at least hinted, to the casual observer. The majority of the people reading this can see the difference, but the people you took your photos for - your family, friends, most clients - cannot tell. The big problem is of course the light-coloured areas, which can look awful.
 
David,

I guess I'm a confused person. I see photography as both an artistic
rendition and a record of a scene. But if I have to choose, I'll give up
some truth for a better picture (i.e. digital darkroom).
Hopefully our confusion level is being reduced by all of this.

Yes, you can alter the truth in the digital darkroom to create a better picture. But I want the raw data to be as close to the truth as possible, and to decide afterwards whether alteration is necessary and what alteration to perform. With film the choice is taken away to some extent: there is dithering whether you like it or not.
To add to the confusion, I also 'see' and analyze a picture in both the spatial and frequency domains. Given a flat field, 50% grey for example, an even distribution of pixels is actually not as good a record as randomly dithered dots at the same resolution. In the frequency domain, the evenly distributed dots have all the energy (noise) at one specific frequency. Your eye is very sensitive to that. A dithered pattern has the energy evenly distributed over a wide spectrum, but the magnitude of each peak is much lower.
Interesting observations.

With a uniform 50% grey displayed on my monitor there is no visible pattern unless I put my nose to the screen. What I see is a uniform field of grey with no visible pattern whatsoever.

When I print out that uniform grey, the printer creates it through a pattern of dots, which is basically a dithered pattern. With a uniform grey, the digital printer would print the exact same uniform dithered pattern onto an 8X10 whether I sent it a million pixels or 2 pixels. If I attempted to print two grey crystal/pixels of film I would get a grey blotch. On a uniform grey field the digital printer behaves more fractally and produces the same uniform output.
In the spatial domain, it's a matter of taste. Some people prefer the look of uniform dots, others dithered dots. But natural scenes seldom contain uniform patterns at the maximum resolution, and the eyes do pick out repeating patterns that don't belong quite easily. So I'll argue that random dots are actually more accurate.
I do not see how random can be more accurate. Looks like an oxymoron to me. Maybe you can explain this more fully?
A good example would be to take a photo and convert it to a bitmap in diffused dither vs. halftone. You may argue about which one looks better, but I think dither looks more natural. Evenly distributed dots and randomly dithered dots both contain noise; neither is more accurate, it's just a game of how to distribute the noise.
Both dither and pixel matrix can look pretty awful if you look too close. Looking too close is either a curse on us or a blessing for future photographers.

Again, I just feel that a major advantage of digital is the ability to introduce these effects afterwards if we so choose. The crystal distribution is just not controllable, and its effect is present whether we like it or not.

I also feel that the resolution of film is overstated, for example by calling it analog, which it is not - closer than digital for sure, but not analog. Comparisons are judged by the number of crystals in the emulsion per unit area. But a single crystal cannot represent the full range of colour of a particular location any more than a single bit can represent the full depth of colour. A mass of different-coloured crystals has to come into play to represent the colour depth that can exist in just one 24-bit+ digital pixel.

The problem we are having here is that the resolution of digital cameras is clearly quantified, while the randomness of crystal distribution makes film resolution not as easily determined - just as the weatherman cannot determine tomorrow's weather.
Sorry about so much geek talk. Hopefully the resolution will be high enough soon that none of this matters.
Geeks all :o)
 
Thanks for keeping this a civilized, open discussion. I think we disagree partly due to different backgrounds: I am looking at this from a DSP standpoint. I've been working on various digital video and audio systems, and I analyze picture quality in both the time and frequency domains. It may help to see my point if you can also think of it from the DSP angle. That said, I can certainly appreciate your view; after all, we are looking at photos, and subjective evaluation is the ultimate judge. I think I agree with you on most points - I too want the best raw image to be captured and believe in the digital darkroom. I am not a defender of film over digital, nor vinyl over CD.
With a uniform 50% grey displayed on my monitor there is no visible
pattern unless I put my nose to the screen. What I see is a uniform
field of grey with no visible pattern whatsoever.
This is a case where the resolution is high enough that the dots are merged into a flat field. At this resolution, if the dots are the same size but randomly distributed, you also would see a uniform field of grey.
I do not see how random can be more accurate. Looks like an oxymoron to me. Maybe you can explain this more fully?
The only point I disagree on is the statement that a uniform pattern is better than a random distribution. My point is that a random distribution can also be quantified and analysed with a mathematical model. And if the resolution is comparable, the noise magnitude is also comparable, and the only difference is the distribution of noise. You may have a subjective preference for the type of noise you get, but neither is more "accurate". In retrospect, the dithered bitmap example is not an accurate one because, in film, it's the sensors that are randomly distributed, not the pixel content.

Sorry for another long post; I wish I could explain things a little better.
Both dither and pixel matrix can look pretty awful if you look too close.
Looking too close is either a curse on us or a blessing for future
photographers.
That's the point I want to make: both patterns look awful if you look too close, i.e. both contain noise if you zoom in far enough. Neither one is more accurate than the other.
Again, I just feel that a major advantage of digital is the ability to introduce these effects afterwards if we so choose. The crystal distribution is just not controllable, and its effect is present whether we like it or not.

I also feel that the resolution of film is overstated, for example by calling it analog, which it is not - closer than digital for sure, but not analog. Comparisons are judged by the number of crystals in the emulsion per unit area. But a single crystal cannot represent the full range of colour of a particular location any more than a single bit can represent the full depth of colour. A mass of different-coloured crystals has to come into play to represent the colour depth that can exist in just one 24-bit+ digital pixel.
While crystal distribution is random, it can be modeled mathematically and analysed just like any other digital system. I hope the 14M-pixel type of number is taking that into account. I am not a film professional, so I wouldn't even attempt to guess the resolution of film.

Here's another counterintuitive concept in signal processing: pixel accuracy can be translated into spatial resolution. That's what halftoning counts on, where an 8-bit pixel can be mapped into an NxN box of finer dots. The more bits we can extract from a crystal, the more effective resolution we can get out of the picture.

gordon
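
(Aside: a small sketch of the depth-for-resolution trade gordon describes. The 4x4 Bayer-style threshold matrix is just one common choice, used here purely for illustration.)

```python
# Map an 8-bit grey value onto a 4x4 cell of binary dots using an ordered
# (Bayer-style) threshold matrix: pixel depth is traded for finer dots.
BAYER_4X4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def halftone_cell(value):
    """Return a 4x4 pattern of 0/1 dots approximating an 8-bit grey value."""
    level = value * 16 // 256                 # how many of the 16 dots to turn on
    return [[1 if t < level else 0 for t in row] for row in BAYER_4X4]

for v in (0, 64, 128, 192, 255):
    print(f"grey {v:3d}:")
    for row in halftone_cell(v):
        print("  " + "".join("#" if d else "." for d in row))
```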
 
Well GC my brain hurts.

It is good to encounter such a unique and relevant perspective on this subject.

My only point to make at this time is that I do not believe a crystal, no matter what colour depth it carries, is the equivalent of a digital pixel. Firstly, the crystal only carries one colour, while a digital pixel carries several - up to 4 in the case of CMYK, or 3 for RGB.

If people are rating film resolution based on crystal count then the estimate is too high. If three crystal colour types are present then the actual count must be divided by three at least.

The problem with random systems is that the model is never accurate. The atmosphere is a random system that has eluded all attempts at accurate modeling - do we really know for sure if it will rain tomorrow?

Random itself is an interesting term - all it means is that we do not have enough information to determine the outcome. Some things have moved from the realm of random to quantifiable. Even randomness itself has a kind of wavefunction but predicting randomness can only be achieved in general terms. We may know that the water will flow downhill but we cannot predict the order in which the molecules will reach the bottom.

From the perspective of entropy: the higher the randomness, the lower the level of order. Lower order means less information in the system. The information in a digital image is easily understood, but film contains less information than the sum of its crystals - the laws of entropy dictate that it must. Although film may have a higher 'real' resolution now, you cannot equate a single crystal to a single digital pixel. It takes a group of crystals to equal the information in a pixel.

I am sure someone has a formula for calculating the level of order in a random field. This is probably the only way to truly compare the resolutions of film and digital.
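
(Aside: the standard formula here is Shannon's entropy, H = -sum p_i log2 p_i, which measures disorder in bits per sample. A minimal sketch with made-up sample values, just to show what such a measurement looks like - it is not a film-vs-digital resolution comparison.)

```python
# Shannon's entropy, H = -sum(p * log2 p), measured in bits per sample.
# Note it counts grain noise as "surprise" too: in a grainy field, part of
# the recorded variation describes the grain rather than the scene.
import random
from collections import Counter
from math import log2

def entropy_bits(samples):
    n = len(samples)
    return sum(-(c / n) * log2(c / n) for c in Counter(samples).values())

random.seed(0)
flat = [128] * 10_000                                            # perfectly even grey
grainy = [random.choice((96, 128, 160)) for _ in range(10_000)]  # randomly grainy grey

print(f"flat field  : {entropy_bits(flat):.2f} bits/sample")     # 0.00
print(f"grainy field: {entropy_bits(grainy):.2f} bits/sample")   # ~1.58
```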

Personal preferences for the appearance of the image, however, will probably never be quantified, which is as it should be.
 
