Why is digital high ISO difficult to achieve?

glaopt1

I was reading in an astronomy text recently that CCD sensors are 100 times more sensitive to light than silver halide film. When CCD sensors were fitted to astronomical telescopes it increased their sensitivity to light 100 fold. If that is the case, why is it difficult to achieve high ISO ratings in digital cameras?
 
I was reading in an astronomy text recently that CCD sensors are 100 times more sensitive to light than silver halide film
I have some problem with this notion. What is the "sensitivity" of a sensel on the sensor? The sensel has to capture and retain the light in such a way that thousands of different light intensities can be retrieved. How would this be compared to film?
When CCD sensors were fitted to astronomical telescopes it increased their sensitivity to light 100 fold
This is plain nonsense. If you attach a camera to a huge telescope, then you have a "lens" with f/0.001 (or whatever), but that does not change the sensor's "sensitivity".
If that is the case, why is it difficult to achieve high ISO ratings in digital cameras?
Is 3200 not high? What about 12800? With what do you compare these, when saying that they are not "high"?

--
Gabor

http://www.panopeeper.com/panorama/pano.htm
 
Most (if not all) scientific astronomy sensors are cryogenically cooled. The cooling keeps the noise down. You also have to remember that astronomy exposures often run into hours not fractions of a second. So sensitivity becomes a very relative term.

--

The greatest of mankind's criminals are those who delude themselves into thinking they have done 'the right thing.'
  • Rayna Butler
 
Is 3200 not high? What about 12800? With what do you compare these, when saying that they are not "high"?
And doesn't sensitivity go like this: ISO 1600 is 16 times more sensitive than ISO 100? So ISO 2500 is 100 times more sensitive than ISO 25 (a very common film speed in the past). There's your 100 fold. :)
 
Is 3200 not high? What about 12800? With what do you compare these, when saying that they are not "high"?
And doesn't sensitivity go like this: ISO 1600 is 16 times more sensitive than ISO 100? So ISO 2500 is 100 times more sensitive than ISO 25 (a very common film speed in the past). There's your 100 fold. :)
Saying "100 fold" in this context could mean the sensitivity was doubled one hundred times, not that it was one hundred times more sensitive. Photographic exposure is conveniently noted in doublings or halvings (1/100 lets in twice as much light as 1/200; f/8 lets in half as much light as f/5.6; ISO 400 is twice as sensitive as ISO 200). Every "stop" is a doubling or halving.

Saying Device A is 100-fold as sensitive as Device B might mean that, if the native ISO of Device B is 100, the native ISO of Device A is 100 x 100 = 10,000 (one hundred times as sensitive), or, if "100 fold" means doubled one hundred times, 100 x 2^100, which is roughly 1.3 x 10^32.
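
As a quick sanity check on that arithmetic (the ISO 100 baseline is just an illustrative assumption), a couple of lines of Python:

base_iso = 100

# Reading 1: "one hundred times as sensitive"
print(base_iso * 100)     # 10000

# Reading 2: "doubled one hundred times", i.e. one hundred stops
print(base_iso * 2**100)  # 126765060022822940149670320537600, about 1.3e32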

I don't know about the tech behind serious scientific astrophotography, but that's a massive difference if true!
I was reading in an astronomy text recently that CCD sensors are 100 times more sensitive to light than silver halide film. When CCD sensors were fitted to astronomical telescopes it increased their sensitivity to light 100 fold.
Perhaps the OP could give us more information? The above could be interpreted in two different ways.

--
Decentralize and Repeal.
http://www.flickr.com/photos/10454889@N06/
 
I was reading in an astronomy text recently that CCD sensors are 100 times more sensitive to light than silver halide film. When CCD sensors were fitted to astronomical telescopes it increased their sensitivity to light 100 fold. If that is the case, why is it difficult to achieve high ISO ratings in digital cameras?
One word: noise

--mamallama
 
Most (if not all) scientific astronomy sensors are cryogenically cooled. The cooling keeps the noise down. You also have to remember that astronomy exposures often run into hours not fractions of a second. So sensitivity becomes a very relative term.
The other kicker is that with hour-long exposures, film speed drops a lot through reciprocity failure. Film likes to see its photons all in a bunch. If only a couple are coming through at a time, it forgets it ever saw the first ones. So a film with ISO 100 at normal sub-second exposure times could be ISO 6 for one-hour exposure times. CCDs, cryogenically cooled to reduce noise as noted above, remain linear for much longer times.
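
As a rough illustration in Python (the Schwarzschild exponent p = 0.8 below is an assumed, emulsion-dependent value, not a measurement):

nominal_iso = 100
p = 0.8        # assumed Schwarzschild exponent; it varies by emulsion
t = 3600.0     # a one-hour exposure, in seconds
t_ref = 1.0    # reference time at which the nominal ISO holds

# Under the Schwarzschild law the effective exposure grows as I * t**p rather
# than I * t, so the effective film speed falls as the exposure gets longer.
effective_iso = nominal_iso * (t / t_ref) ** (p - 1)
print(round(effective_iso, 1))   # roughly ISO 19 with these assumptions

With a smaller exponent (some emulsions are much worse) you get down to the ISO 6 figure quoted above; a cooled CCD has no equivalent penalty.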

--
Leonard Migliore
 
Most (if not all) scientific astronomy sensors are cryogenically cooled. The cooling keeps the noise down. You also have to remember that astronomy exposures often run into hours not fractions of a second. So sensitivity becomes a very relative term.
The other kicker is that with hour-long exposures, film speed drops a lot through reciprocity failure. Film likes to see its photons all in a bunch. If only a couple are coming through at a time, it forgets it ever saw the first ones. So a film with ISO 100 at normal sub-second exposure times could be ISO 6 for one-hour exposure times. CCDs, cryogenically cooled to reduce noise as noted above, remain linear for much longer times.

At normal light levels CCDs aren't 100 times as sensitive as film.

Greg
 
I was reading in an astronomy text recently that CCD sensors are 100 times more sensitive to light than silver halide film.
Simple definitions of ISO just multiply the number of photons detected by a constant number.

Good definitions of ISO include a noise consideration, such as a signal-to-noise ratio of 40:1 per viewing pixel at 18% exposure. Physics and math require that each pixel detect something like at least 1600 photons to achieve this noise level.
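
That 1600 figure falls straight out of photon (shot) noise statistics; the back-of-the-envelope version in Python:

import math

# For a Poisson-distributed photon count the SNR is sqrt(N), so hitting a
# target SNR of 40:1 takes N = 40**2 photons in the pixel.
target_snr = 40
print(target_snr ** 2)   # 1600

# Conversely, a pixel that only collects 100 photons can never beat
print(math.sqrt(100))    # an SNR of 10:1, whatever the ISO dial says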

Modern camera sensors approach the theoretical limits in detecting photons. That is, the scene and lens, not the sensor, determine how many photons are detected per second (i.e. for perfect sensors the observed signal-to-noise ratio does not depend on the sensor).

Since the lowest possible noise in a signal is set by the signal itself, multiplying the received signal by a constant factor does no good.

There are NO possible exact solutions to this problem. However, software can help a lot by making certain assumptions (much like our brain does); for example, a large area of smooth pale blue is probably the sky so ignore the noise even though it is there.

Dave
 
I was reading in an astronomy text recently that CCD sensors are 100 times more sensitive to light than silver halide film
I have some problem with this notion. What is the "sensitivity" of a sensel on the sensor? The sensel has to capture and retain the light in such a way that thousands of different light intensities can be retrieved. How would this be compared to film?
It's pretty simple to work this out: what is the dimmest object that can be detected for a given exposure time? CCDs can detect much fainter objects, therefore they are more sensitive.
When CCD sensors were fitted to astronomical telescopes it increased their sensitivity to light 100 fold
This is plain nonsense. If you attach a camera to a huge telescope, then you have a "lens" with f/0.001 (or whatever), but that does not change the sensor's "sensitivity".
Firstly, no astronomical telescope is very fast in photographic terms. An astronomer's idea of fast is about f/4, but then they have VERY long focal lengths. A 5200mm f/4 telescope counts as a very wide-field, very fast telescope. Secondly, it's not possible to create a lens faster than f/0.7 in air.
If that is the case, why is it difficult to achieve high ISO ratings in digital cameras?
Is 3200 not high? What about 12800? With what do you compare these, when saying that they are not "high"?
Astronomical CCD sensors don't have colour filter arrays and are cryogenically cooled, so they can detect much fainter objects. Also, ISO is about the amount of light needed to saturate a pixel, not the minimum amount of light that can be detected, so you're comparing the wrong properties.
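
For reference, the saturation-based speed in the ISO standard works roughly like this (the exposure figure below is a made-up example, in Python):

# Saturation-based speed from the ISO standard: S = 78 / H_sat, where H_sat is
# the exposure (in lux-seconds) at the sensor that just saturates a pixel.
h_sat = 0.39         # hypothetical saturating exposure, in lux-seconds
print(78 / h_sat)    # ISO 200 for this made-up sensor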
 
Secondly it's not possible to create a lens faster than f/0.7 in air.
Do you happen to know if there is a similar limit for reflecting telescopes? Most large telescopes use a mirror rather than a lens.

Regards,
Peter
 
Very interesting comments, thank you. So, if I understand this properly, CCDs ARE more sensitive to light than film but are limited by:

1. noise - some of which can be minimised by cooling the sensor - one of the reasons the astronomers immerse their instruments in liquid helium?

2. the exposure times involved - film needs a high concentration of photons in a short time for the chemical reaction to take place, whereas a CCD can effectively react to a sequence of photons recorded over a longer period of time

3. the filters used in cameras

Thanks for the useful advice and for the flixxy link
 
I was reading in an astronomy text recently that CCD sensors are 100 times more sensitive to light than silver halide film. When CCD sensors were fitted to astronomical telescopes it increased their sensitivity to light 100 fold. If that is the case, why is it difficult to achieve high ISO ratings in digital cameras?
I'd like to see a source for the 100 fold sensitivity increase. My understanding is that black and white photographic film has a typical quantum efficiency of somewhere around 2%, so it simply isn't possible to increase the sensitivity (at a constant SNR) by more than about 50x. The best digital sensors seem to top out at something like 80% QE, which represents about a 40x increase in sensitivity.

And that's for monochrome light. Color filter arrays cause an extra loss of sensitivity, so color sensors have typical QEs in the neighborhood of 30% - only about 4 stops better than film. Unsurprisingly, that's the same general sensitivity advantage color sensors have over color film. Color film frequently started at about ISO 50, with ISO 400 showing some obvious grain, and ISO 1600 and ISO 3200 being very grainy stuff that you'd only use when you desperately needed the speed. With the best 35mm full-frame digital sensors, results can be very nice up to about ISO 3200, with obvious grain showing up at 6400 and 12,800, and sensitivities up to 102,400 available when absolutely necessary.
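
To put the same numbers another way (taking the QE figures above as the rough estimates they are), a quick check in Python:

import math

film_qe = 0.02          # ~2% assumed for black-and-white film
mono_sensor_qe = 0.80   # ~80% for the best monochrome sensors
color_sensor_qe = 0.30  # ~30% once a color filter array is in front

print(mono_sensor_qe / film_qe)              # ~40x monochrome gain
print(math.log2(color_sensor_qe / film_qe))  # ~3.9, i.e. "about 4 stops" for color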

Edited to add: The biggest difference is that most cameras have taken advantage of the extra sensitivity to go to smaller sensor sizes instead of higher absolute sensitivity. Typical digital compacts use sensors that are only 3-6% of the area of a 35mm film frame, but they're still able to put out results that are competitive with 35mm film. It looks as though designers have shrunk the sensor to maintain "good enough" quality.
--

As with all creative work, the craft must be adequate for the demands of expression. I am disturbed when I find craft relegated to inferior consideration; I believe that the euphoric involvement with subject or self is not sufficient to justify the making and display of photographic images. --Ansel Adams
 
My mistake, I looked up the formula and you are correct (providing you are using glass). The PRACTICAL limit is about f/0.7. If you want to use exotic materials like diamond you can, IN THEORY, get down to about f/0.235, but that's for a simple lens without any apochromatic correction (i.e. lots of CA).
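
For the curious, the limit comes from the numerical aperture; a simplified sketch in Python (ignoring the real-world corrections that push the practical limit up to about f/0.7):

# The working f-number is roughly N = 1 / (2 * NA), and the numerical aperture
# can't exceed the refractive index of the medium the light converges in.
def min_f_number(refractive_index):
    return 1.0 / (2.0 * refractive_index)

print(min_f_number(1.0))    # 0.5  -> the theoretical floor in air
print(min_f_number(2.42))   # ~0.21, taking diamond's refractive index as 2.42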
 
I was reading in an astronomy text recently that CCD sensors are 100 times more sensitive to light than silver halide film. When CCD sensors were fitted to astronomical telescopes it increased their sensitivity to light 100 fold. If that is the case, why is it difficult to achieve high ISO ratings in digital cameras?
One word: noise
Actually, it is thermal noise and read noise that create the biggest obstacle to practical super-high ISOs. Photon noise is not very limiting in this regard. This can be hard to believe, because you can see photon noise even at high ISOs, but relative photon noise only doubles when you quarter the exposure, whereas relative thermal and read noise double when you merely halve the exposure. That difference becomes much greater when you get to ISOs in the hundreds of thousands or more. Shot noise also looks much better than read noise, and it has no problem becoming much less visible when you downsize or display an image smaller. It does not blanket the shadows the way read noise does.
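
A toy comparison in Python of how the two kinds of noise scale (the electron counts are made-up, illustrative values):

import math

full_exposure_electrons = 10000.0   # assumed signal at the full exposure
read_noise_electrons = 5.0          # assumed fixed read noise per frame

for fraction in (1.0, 0.5, 0.25):
    signal = full_exposure_electrons * fraction
    shot_noise = math.sqrt(signal)  # photon noise grows as the square root of the signal
    print(f"{fraction:>4}: shot {shot_noise / signal:.2%}, read {read_noise_electrons / signal:.2%} of the signal")

Relative shot noise needs a quartering of the exposure to double, while the fixed read noise doubles every time the exposure is halved, exactly as described above.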

When we finally get sensors that can just count photons, we'll be seeing shots of dogs running towards the camera under moonlight (if the AF can keep up).

--
John

 
The Federal spooks (CIA, FBI, etc.) have cameras with MPs in the 4000 to 5000 range. A shot taken at the inaugural was widely circulated via e-mail. A small dot of a face way back in the crowd could be enlarged many-many times without image degradation. With the amazing advances in technology, I believe we will be seeing consumer cameras with very-very high MP ratings very soon. I'll stick my neck out and say we will have cameras in the 100 to 200 MP range on the market within two years. And quality problems (noise, etc.) will be a thing of the past as well.
 
