What is a digital photo?

Not sure what the point of this thread really is, but if you go back to basics, to where the image is captured, then film grain is digital
There is NOTHING digital about film grain.
Photon hits silver compound, it changes.

Photon does not hit silver compound, does not change.

Seems binary to me and that's a bit digital.
Nothing is ever digitally encoded or digitally processed.
Yes, it's very digitally encoded, just not on a medium you can easily transfer to the media we're so familiar with today.
You have a weird concept of digital encoding.
and a sensor is analog in nature.
Light is analog, and the sensor in the early stages of processing converts it to digital with the ADC.
The pixels are little analog wells of electrons, and need an analog-to-digital converter to give a measure the computer can recognise. So it starts out analog.
No argument there. Light is analog and that is what it starts out being.
From that point forward the path could be digital or analog or a mixture of both, but it all starts quite the reverse of what many think.
The processing is all digital between the ADC and the DAC to display the image.
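Just to make that chain concrete, here's a toy sketch in Python of the quantization step (the full-well capacity and bit depth are made-up numbers, not any real sensor's):

    # Toy sketch of the chain above: an analog "well" of electrons is
    # quantized by the ADC into a digital number. Values are illustrative.
    FULL_WELL = 50_000      # hypothetical full-well capacity, in electrons
    ADC_BITS = 12           # hypothetical 12-bit ADC
    LEVELS = 2 ** ADC_BITS  # 4096 discrete output codes

    def adc(electrons: float) -> int:
        """Quantize an analog electron count to a digital number."""
        electrons = max(0.0, min(electrons, FULL_WELL))  # clip at well capacity
        return round(electrons / FULL_WELL * (LEVELS - 1))

    print(adc(0.0))       # 0    - no light, lowest code
    print(adc(25_000.0))  # 2048 - half-full well, mid-scale code
    print(adc(60_000.0))  # 4095 - an overfull well clips to the top code

After that quantization, everything downstream is just numbers.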
The only part I consider digital is the digit I use to press the shutter button, all the rest doesn't matter how it was done. :-)
 
Just going with the flow... er, the DAC.
 
The DAC is for decoding.
 
Maybe, I guess. It's for converting to analogue. Decoding / encoding is more about things done in the digital domain - from one code to another. We use codecs to compress and decompress digital video signals so they can be transferred faster, but everything stays in the digital domain. Converting digital to analogue involves convolution perhaps and some degree of smoothing, but doesn't necessarily need "decoding". At least that's how I've always seen it.
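For what it's worth, a toy sketch of that idea: samples held at their level, then smoothed, and nothing you'd call decoding anywhere (the sample values and the crude filter are invented for illustration):

    # Toy digital-to-analogue reconstruction: hold each sample steady
    # (zero-order hold), then smooth with a short moving average standing
    # in for the analog reconstruction filter. No decoding involved.
    samples = [0, 4, 7, 7, 3, 0]   # hypothetical digital codes
    HOLD = 4                       # output points per sample

    # Zero-order hold: repeat each code, like a DAC holding its output level.
    held = [s for s in samples for _ in range(HOLD)]

    # Crude smoothing: 3-point moving average over the held signal.
    def smooth(sig):
        out = []
        for i in range(len(sig)):
            window = sig[max(0, i - 1):i + 2]
            out.append(sum(window) / len(window))
        return out

    print(held)
    print([round(v, 2) for v in smooth(held)])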

But nonetheless, try to go with the flow, man, it's just a loose association. Lighten up. You're too afraid of getting your fake Rolex ripped off your wrist by mythological criminals.
 
Decoding is essential to go from 1's and 0's to some analog level.
 
First, let's start with some definitions and history.
It's a common misconception that dictionaries actually define the meanings of words: they don't. What they do is give what their lexicographers have understood to be the established meaning (or meanings) of words.

But a major problem, especially with technical words, is that the lexicographers don't really understand what they are dealing with. That is, conveniently for this discussion, illustrated by your first definition.
According to the Oxford Dictionary:

Photograph – A picture made using a camera, in which an image is focused on to light-sensitive material and then made visible and permanent by chemical treatment, or stored digitally.
The phrase I've underlined ("made visible and permanent by chemical treatment") describes how a picture is made from a film photo but has nothing to do with a digital photo - a digital photo has electronic and computer treatment, not chemical treatment.
In other words, the definition is simply and straightforwardly wrong. As soon as the definition is wrong any conclusion based solely on it must also be wrong.
Photography – The art or practice of taking and processing photographs.

This art/practice has about two centuries of history, if we do not count camera obscura imaging, which is many centuries old
… and, of course, we should not count the camera obscura because it didn't conform to the correct part of your first definition "in which an image is focused on to light-sensitive material".
since Niépce managed to fix an image that was captured with a camera. Niépce's associate Daguerre went on to develop the daguerreotype process, the first publicly announced viable photographic process.

Now, here are the scientific definitions of analog and digital signals:
  • Analog signal is a continuous signal which represents physical measurements.
  • Digital signals are discrete time signals generated by digital modulation.
Again, I don't think these stand up as valid definitions, although they are fair descriptions of what happens.
Now, according to the definitions presented above, my old film negative or print, scanned and digitally presented for a printout or as a digital file for dpreview or another digital media site, is a digital photo.
As above - if the definition is wrong so is a conclusion drawn from it.

--
Gerry
___________________________________________
First camera 1953, first Pentax 1985, first DSLR 2006
http://www.pbase.com/gerrywinterbourne
[email protected]
 
Decoding is essential to go from 1's and 0's to some analog level.
Not exactly. Going from simple digital 1's and 0's to analogue is just conversion. You need no decoding to do that.

Going from an encoded signal such as encryption or compression to a simple set of 1's and 0's would be decoding. So a DAC might have some decoding depending on the input. But the DAC is ultimately a black box for digital to analogue conversion - digital in, analogue out. Is there some decoding before the final 1's and 0's? Maybe. See the difference?
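A toy way to picture the difference (the run-length scheme is just an invented stand-in for real compression or encryption):

    # Conversion: plain digital codes map straight to analogue levels.
    V_REF = 3.3   # hypothetical reference voltage
    BITS = 8

    def convert(code: int) -> float:
        """Plain digital-to-analogue conversion: scale a code to a voltage."""
        return code / (2 ** BITS - 1) * V_REF

    # Decoding: an *encoded* stream (a made-up run-length scheme here,
    # standing in for compression or encryption) must first be expanded
    # back into plain codes before any conversion can happen.
    def decode_rle(pairs):
        """Expand (value, count) pairs back into plain codes."""
        return [value for value, count in pairs for _ in range(count)]

    encoded = [(0, 2), (255, 3)]          # made-up encoded stream
    codes = decode_rle(encoded)           # decoding: back to plain codes
    volts = [convert(c) for c in codes]   # conversion: codes to levels
    print(volts)                          # [0.0, 0.0, 3.3, 3.3, 3.3]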

I thought I told you to lighten up? Your stress level is palpable.
 
With comments like that you are the one stressed out. You are trying too hard to make sense and coming up with word salad.
 
Nah, I'm having fun the whole time.
Seems you've run out of words.
 
You have, too. You are just changing the order and combinations.
 
I think you misunderstood what it says.
No I haven't.

In both types of photography the first stage is that light hits and acts on a sensitive medium and alters it according to the strength and wavelength of the light. That change is to the chemical molecules in the case of film or to the charge on the sensels in the case of digital. In neither case does this create a picture.
Chemical treatment is what is done with the negative on a film camera.
The second stage is to create the picture by "treating" the changes in some way. In the case of film that is, as you say, by chemical means. In the case of digital it is by electronic means. It's that stage of electronic treatment that the "definition" misses.
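To make that middle stage concrete, a toy sketch of the "electronic treatment" (real cameras also demosaic, white-balance and so on; the raw scale and gamma here are only illustrative):

    # Toy "electronic treatment": turn raw sensel readings into viewable
    # 8-bit pixel values. A real pipeline does far more; numbers invented.
    RAW_MAX = 4095   # hypothetical 12-bit raw scale
    GAMMA = 2.2      # a typical display gamma, used illustratively

    def treat(raw):
        """Normalize raw values and apply a gamma curve for display."""
        return [round((r / RAW_MAX) ** (1 / GAMMA) * 255) for r in raw]

    raw_row = [0, 512, 2048, 4095]   # made-up sensel readings
    print(treat(raw_row))            # [0, 99, 186, 255]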
Stored digitally is what is done in the digital camera when the captured analog image is digitized and stored on the memory card.
The final stage is that once the picture has been created it must be stored if it is to have anything more than a transient existence. For film this requires a second stage of chemical treatment (fixing, without which the picture soon disappears) after which storage is on paper or similar medium. The "definition" omits this stage for film.

For digital the storage is electronic in a file that must be retrieved and processed again for viewing.
It is just a very simplistic overview.
I don't know if you chose the word "simplistic" deliberately: I hope so. It is not a synonym for "simple". Simplistic means oversimplification to the point of being wrong, which is the point I'm making.

In other words, the definition is simply and straightforwardly wrong. As soon as the definition is wrong any conclusion based solely on it must also be wrong.
 
Let me compare it to recording and storage of music.

In the days before digital technology, music was recorded analog, processed with analog equipment and stored on analog media (records).

There was an abbreviation for this chain: AAA

Later, digital technology appeared at different stages of the making of music products.

Depending on the technology used at each stage, it was marked with an "A" for analog or a "D" for digital.

The first CDs we could buy may have been AAD: recording was analog, processing was analog and storage was digital.

If you buy a CD from the Beatles nowadays the first step will still be analog - at the time the Beatles were recording their music there was no digital technology available. But the music will have been processed digitally, and your MP3 or CD is digital - so we end up with ADD.

Most music produced nowadays will be DDD.

And if you buy a vinyl record produced these days you may have DDA.

In this way you could also classify your photos: taken on film but digitally processed and stored, they would be ADD.

And if the final storage of the photo were a print, you would end up with ADA.

Pretty simple.
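If it helps, the labelling is mechanical enough to write down (a throwaway helper, nothing standard about it beyond the SPARS-style letters):

    # Throwaway helper: label a recording/processing/storage chain the
    # way CDs were labelled with SPARS-style AAD/ADD/DDD codes.
    def chain_code(recording, processing, storage):
        """Each stage is 'analog' or 'digital'; returns e.g. 'ADD'."""
        stages = (recording, processing, storage)
        return "".join("D" if s == "digital" else "A" for s in stages)

    print(chain_code("analog", "analog", "digital"))   # AAD - early CDs
    print(chain_code("analog", "digital", "digital"))  # ADD - Beatles on CD
    print(chain_code("analog", "digital", "analog"))   # ADA - film scan, printed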

Best regards

Holger
 
As above - if the definition is wrong so is a conclusion drawn from it.
Yes, don't draw technical conclusions from a general dictionary. It is unrealistic to expect one to be technically rigorous, especially regarding changing technology. Even Wikipedia is not always to be trusted for accuracy.
 
Now, here are the scientific definition...
Whenever I see the word "scientific" used, I know it's not coming from an actual scientist.
of analog and digital signals:
  • Analog signal is a continuous signal which represents physical measurements.
Not necessarily a physical measurement. For example, music synthesizers generate an audio signal that is created without any physical measurement.
  • Digital signals are discrete time signals generated by digital modulation.
Not necessarily "time signals". And "time" in what sense? Measuring time? Data encoded as time intervals? And "modulation" of what? Your definition is a jumble of words that themselves are open to wide interpretation.
 
Could have been fun if you could relax.
 
