Sensor dynamic range

Started Oct 8, 2011 | Discussions
Gomer Pyle
New Member | Posts: 20
Sensor dynamic range
Oct 8, 2011

I've been wondering how it is that there can be significant differences between cameras when it comes to dynamic range capability, yet it's not a figure commonly listed on spec sheets.

When comparing monitors or TV sets there is almost always a spec called Contrast Ratio. Am I just looking in all the wrong places, or is it simply not something that photographers are bothered with?

I have the Olympus XZ-1 and I find it very hard, if not impossible, to record daylight pictures without lots of clipping at both ends. Cars are especially challenging, as they will have either harsh reflections with clipped surface highlights or tires completely swallowed by shadow.

How can I determine what camera equipment will be best suited for HDR scenes, without using bracketing?

Daniel

Gomer Pyle's gear list:
Olympus XZ-1
bubblzzz
Regular Member | Posts: 458
Re: Sensor dynamic range
In reply to Gomer Pyle, Oct 8, 2011

This site may help you some.

http://www.dxomark.com/index.php/Cameras/Camera-Sensor-Ratings/(type)/usecase_landscape

AnandaSim
Forum Pro | Posts: 13,409
Re: Sensor dynamic range
In reply to Gomer Pyle, Oct 8, 2011

Gomer Pyle wrote:

I've been wondering how it is that there can be significant differences between cameras when it comes to dynamic range capability, yet it's not a figure commonly listed on spec sheets.

When comparing monitors or TV sets there is almost always a spec called Contrast Ratio. Am I just looking in all the wrong places, or is it simply not something that photographers are bothered with?

Interesting observation.

  • In the film days, DR was the responsibility of the film. Now that we shoot digital, and the camera and capture mechanism come as one package, it does make sense to document this aspect. Yes.

However,

  • We are very concerned about DR, but it is only one aspect of a camera.

  • The DR of the actual scene is frequently higher than the DR of the camera or of film, and we are trying to capture the scene. The art and skill of photography lies in taking a lower-DR device and using it to capture beauty. Compare that to the Contrast Ratio of a TV: it's fixed, and the TV is not a creative device. You just switch it on, sit back, and watch; you don't actually "work it". So, in perspective, we know that the camera's DR will always be lacking whatever we do. Although capturing the scene and working within its DR is vitally important, we put ourselves into the equation. Hence the saying "the picture is due to the photographer, not the equipment", even though one always tries to get better equipment.

  • The rendered DR of the image is at the end of a long chain from the source DR of the scene.

DR of the scene: changing the scene, the shooting direction, the time of day, the day, the season, or the weather are all variables you can work with.

Then the camera takes the photo and records it as RAW to memory. The RAW DR may already be slightly adjusted. Then the camera or the computer uses a JPEG rendering engine with a gamma transformation curve to map the tones from RAW to JPEG. It is not a one-for-one mapping.

We can choose the slope of the curve, we can choose roll offs in the highlights and the shadows.
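As a rough illustration of that RAW-to-JPEG tone mapping, here is a minimal sketch; the gamma exponent and shoulder position are illustrative assumptions, not any camera's actual rendering curve:

```python
# Sketch of mapping linear sensor values (0..1) to display tones.
# The gamma exponent and shoulder threshold are illustrative choices.

def tone_map(linear, gamma=1/2.2, shoulder=0.8):
    """Apply a gamma curve with a soft highlight roll-off."""
    v = linear ** gamma          # basic gamma transformation
    if v <= shoulder:
        return v
    # Compress values above the shoulder so highlights roll off
    # instead of clipping abruptly at 1.0.
    return shoulder + (1 - shoulder) * (1 - 2 ** (-(v - shoulder) / (1 - shoulder)))

print(tone_map(0.18))  # mid-grey is lifted to roughly 0.46
print(tone_map(1.0))   # pure white rolls off to 0.9, below clipping
```

The point is only that the mapping is nonlinear and adjustable, which is why the same RAW file can yield JPEGs with visibly different highlight and shadow behaviour.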

Subsequently, you can use further software manipulation.
See

http://www.dpreview.com/articles/8036606804/filters-after-the-fact-digital-split-nd-filters-versus-hdr

I have the Olympus XZ-1 and I find it very hard, if not impossible, to record daylight pictures without lots of clipping at both ends. Cars are especially challenging, as they will have either harsh reflections with clipped surface highlights or tires completely swallowed by shadow.

Yes. That is because the scene DR is often higher than the camera's.

If you move up to a bigger-sensor camera (the bigger the sensor, the better), your DR will improve by a couple of "stops".

How can I determine what camera equipment will be best suited for HDR scenes, without using bracketing?

Here:
http://www.dpreview.com/reviews/nikond3s/page19.asp

DPR reviews have sections on DR as tested by them; that link is from the Nikon D3s review.

AnandaSim's gear list:
Kodak EasyShare P880 Olympus E-510 Olympus E-620 Panasonic Lumix DMC-G2 Olympus PEN E-PM2 +16 more
mstecker
Contributing Member | Posts: 539
Re: Sensor dynamic range
In reply to Gomer Pyle, Oct 8, 2011

I believe the dynamic range of a CCD is the log ratio of well depth (full well capacity) to the readout noise in decibels. For example, a system with a well depth of 85,000 electrons and a readout noise of 12 electrons would have a dynamic range = 20 log (85,000/12), or 77dB. The higher the number the better the dynamic range.
http://www.ccd.com/ccd111.html
and
http://www.andor.com/learning/digital_cameras/?docid=321
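The formula above is easy to check numerically; the sketch below reproduces the 77 dB example and also expresses the same ratio in photographic stops:

```python
import math

def dynamic_range_db(full_well_e, read_noise_e):
    """Dynamic range in decibels: 20 * log10(full-well / read noise)."""
    return 20 * math.log10(full_well_e / read_noise_e)

def dynamic_range_stops(full_well_e, read_noise_e):
    """Same ratio expressed in photographic stops (factors of two)."""
    return math.log2(full_well_e / read_noise_e)

# Worked example from the post: 85,000 e- well depth, 12 e- read noise
print(round(dynamic_range_db(85_000, 12), 1))     # 77.0 dB
print(round(dynamic_range_stops(85_000, 12), 1))  # about 12.8 stops
```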

More simply, the dynamic range of a camera sensor is related to the size (not the number) of photosites (pixels). Most people talk about noise and pixel size, but pixel size is also related to DR. So when I look for a camera, the first things I check are pixel size and pixel density.

Full-well capacity and read-out noise are the numbers to look for. Large pixels (not a large number of pixels, i.e. megapixels) hold more electrons, so you get better sampling and more gradations. Astronomical CCDs almost always list these values. You could probably get this information by contacting the camera's manufacturer.
--
Mike
http://mstecker.com

Gomer Pyle
New Member | Posts: 20
Re: Sensor dynamic range
In reply to Gomer Pyle, Oct 9, 2011

Thanks everyone! It seems it takes some more digging into technical details than just looking up a number on a spec sheet.

Daniel

Gomer Pyle's gear list:
Olympus XZ-1
AnandaSim
Forum Pro | Posts: 13,409
Re: Sensor dynamic range
In reply to Gomer Pyle, Oct 9, 2011

LOL. Daniel, if DR could be boiled down to a number on a spec sheet, probably a third of the need for a human being to hold the camera would be dismissed.

Have fun....

Gomer Pyle wrote:

Thanks everyone! It seems it takes some more digging into technical details than just looking up a number on a spec sheet.

Daniel

AnandaSim's gear list:
Kodak EasyShare P880 Olympus E-510 Olympus E-620 Panasonic Lumix DMC-G2 Olympus PEN E-PM2 +16 more
Mark Scott Abeln
Veteran Member | Posts: 3,354
Re: Sensor dynamic range
In reply to Gomer Pyle, Oct 9, 2011

Gomer Pyle wrote:

How can I determine what camera equipment will be best suited for HDR scenes, without using bracketing?

If you are doing HDR, then the camera's dynamic range does not matter nearly as much, since you are greatly expanding it with multiple exposures and software.

What matters more is what software or technique you are using to blend your images.

Mark Scott Abeln's gear list:
Nikon D200 Nikon D7000 Nikon AF-S DX Nikkor 35mm f/1.8G Nikon AF Nikkor 50mm f/1.8D Rokinon 85mm F1.4
Camp Freddy
Senior Member | Posts: 1,385
Re: Sensor dynamic range
In reply to Gomer Pyle, Oct 10, 2011

The XZ-1 has only a little extra headroom in RAW in terms of DR, but for the best extended-DR image development you should autobracket. And why not? It costs nothing.

Even hand-held images can be aligned for merging as layers, and it is a very rewarding process to do manually in LR, PS, or Gimp.

(See below for an image where one fully manual exposure was too light and glarey, and lacking sharpness, while another was too dark: the light frame was layered semi-transparently onto the heavily sharpened darker one, so the glare and clipping on the lamps are reduced.)

Incidentally, you can argue that the Oly is no worse than any other quality compact at this sensor size, but the faster lens means much more light!

--

================================
Enjoying Photography like never before with the E-450!
Images, photo and gimp tips:
http://olympe450rants.blogspot.com/

NORWEGIAN WOOD GALLERY
http://fourthirds-user.com/galleries/showgallery.php/cat/888

Olympus' Own E450 Gallery http://asia.olympus-imaging.com/products/dslr/e450/sample/

"to be is to do" Descartes;
"to do is to be" Satre ;

............................"DoBeDoBeDo" Sinatra.
=============================

Rachotilko
Contributing Member | Posts: 529
Reviving an old thread but I need some theoretical explanation.
In reply to Gomer Pyle, Jul 3, 2012

So what is this DR thing? I don't get it and need a not-so-simplistic explanation. Please be patient and follow my thought experiment:

Think of a perfect camera: a noiseless one. It also takes RAWs at a high encoding bit depth (possibly floating-point encoded).

My question is: if such a camera existed (it does not, of course), it would necessarily have an extremely high DR, would it not?

Because whatever detail is present in the raw data could be recovered by a simple curve adjustment (i.e. a function in the value domain), or, if that is not sufficient (due to the contrast limitations of display media), by some tone-mapping (local contrast enhancement) algorithm.

So now I come to my question: why does the photographic community talk of DR separately from noise? As if it were some kind of independent characteristic of a sensing element, while it is IMHO completely determined by the device's S/N ratio.

Deleted1929
Forum Pro | Posts: 13,050
Re: Reviving an old thread but I need some theoretical explanation.
In reply to Rachotilko, Jul 3, 2012

Your ideal sensor doesn't and cannot exist. So let's completely ignore it.

First things first: light exists as discrete particles. We count them, and the number we count is what we call intensity (or brightness, in human terms). We don't use floating point because there's no such thing as half a photon.

Noise cannot be ignored in dynamic range, because the point where the average noise equals or exceeds the signal level ("the brightness level of a given pixel", for simplicity's sake) defines the lowest level we can distinguish. That's the base of your dynamic range.

Noise can never be zero in a physical system. Even if, for practical purposes, we make it really small, we are still limited by the zero count, so we have a definite lower limit we can't improve on.

There's a different limit when the pixels get really bright. Beyond a certain point the sensor gets saturated and can't measure any more light (photons) hitting it. At the risk of insulting you, it's like running out of fingers to count on. With a couple of complications that aren't really important here, that's our upper limit for dynamic range.

No matter how clever the engineers get, they can't invent something that can count to infinity. So there must be an upper limit.

Finally, humans see things differently from this counting of photons. Adding another EV of dynamic range means the sensor has to be able to count twice as many photons as it did before. That's a big ask for an engineer, and every improvement makes the next one much, much harder to get.

In fact most of the dynamic range improvements have come from reducing noise levels. But, as I've explained, that's an improvement you can tap once but not twice, so we've pretty much hit the limit on that front.

So we can't get a better lower limit and it's very, very hard to extend the upper limit.

So that, rather simplified, is your dynamic range issue and why you can't get arbitrary amounts of it.

--

StephenG

Rachotilko
Contributing Member | Posts: 529
Re: Reviving an old thread but I need some theoretical explanation.
In reply to Deleted1929, Jul 3, 2012

Hi, thanks for the reply.

I know quantum phenomena limit the SNR.

It was a thought experiment only, and it served the purpose of defining the DR enlargement problem wholly in terms of SNR and nothing else.

In that respect I don't agree with you that the DR's "upper limit" is a difficult problem to solve. It is easily remedied by underexposing. That aggravates the lower-limit part of the DR problem, for sure.

But this was the point of my post: the community of DPR users often thinks of DR in terms of "blown highlights", when IMHO it is wholly determined by SNR.

But maybe I am wrong.

sherwoodpete
Veteran Member | Posts: 7,758
Re: Reviving an old thread but I need some theoretical explanation.
In reply to Rachotilko, Jul 3, 2012

Rachotilko wrote:

Hi, thanks for the reply.

I know quantum phenomena limit the SNR.

It was a thought experiment only, and it served the purpose of defining the DR enlargement problem wholly in terms of SNR and nothing else.

In that respect I don't agree with you that the DR's "upper limit" is a difficult problem to solve. It is easily remedied by underexposing. That aggravates the lower-limit part of the DR problem, for sure.

But that isn't solving the upper limit problem. It's merely reducing the strength of the signal so that it never reaches the upper limit.

But this was the point of my post: the community of DPR users often thinks of DR in terms of "blown highlights", when IMHO it is wholly determined by SNR.

Well, if it's determined by SNR, then reducing the signal is hardly a beneficial approach.

At the lower end of the range, various components contribute to the noise. One of them is shot noise, which arises from the random way photons arrive at the sensor. When the signal is very low (deep shadow) it becomes increasingly important, and there is no engineering solution, as it is the light itself that gives rise to the noise.
http://en.wikipedia.org/wiki/Shot_noise

That's not to say that sensors can't be improved to some extent, but the more the sensor is improved, the greater the relevance of shot noise at the darkest end of the range.

Regards,
Peter

apaflo
Veteran Member | Posts: 3,854
Re: Reviving an old thread but I need some theoretical explanation.
In reply to Rachotilko, Jul 3, 2012

Rachotilko wrote:

It was a thought experiment only, and it served the purpose of defining the DR enlargement problem wholly in terms of SNR and nothing else.

In that respect I don't agree with you that the DR's "upper limit" is a difficult problem to solve. It is easily remedied by underexposing. That aggravates the lower-limit part of the DR problem, for sure.

The upper limit is actually a bit different than has been described. It is limited by the quantum efficiency of the sensor. Consider that the sensor in a Nikon D3S can count about 58% of all photons that strike it. Gaining one more stop at the top would require counting twice as many photons, which obviously cannot happen (it would need more than 100% efficiency). So the best that can be done on the high end is not quite a full stop; improvement necessarily must come from reducing read noise.
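The "not quite a full stop" figure can be checked: a perfect counter (100% QE) would collect only 1/0.58 ≈ 1.7 times as many photons as a 58%-efficient sensor, less than the factor of two a full stop requires. A quick sketch:

```python
import math

qe = 0.58  # quantum efficiency quoted above for the D3S sensor

# Headroom available from QE improvement alone: the ratio of a
# perfect photon counter (QE = 1.0) to this sensor, in stops.
max_gain_stops = math.log2(1 / qe)
print(round(max_gain_stops, 2))  # about 0.79 stops: "not quite a full stop"
```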

The next consideration is the distinction between SNR and Dynamic Range. An analog-to-digital converter with N-bit depth can record at most 1.76 + (N × 6.02) dB of dynamic range. That figure is actually the SNR resulting from the noise of quantization distortion. But the actual SNR of an ADC also includes many other noise sources (the reference voltage, clock jitter, etc.), which may be higher or lower than the quantization distortion. Hence a 14-bit ADC can have, at the absolute most, a dynamic range of about 86 dB, but that will only occur if the SNR is otherwise greater than 89 dB.
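The bit-depth formula can be checked numerically; using the standard 6.02 dB-per-bit figure, a 14-bit converter tops out just above 86 dB. A minimal sketch:

```python
def adc_dynamic_range_db(bits):
    """Ideal SNR of an N-bit quantizer, quantization noise only:
    1.76 + 6.02 * N dB."""
    return 1.76 + 6.02 * bits

print(round(adc_dynamic_range_db(14), 1))  # 86.0 dB for a 14-bit ADC
print(round(adc_dynamic_range_db(12), 1))  # 74.0 dB for a 12-bit ADC
```

Real converters fall short of these ceilings because of the other noise sources mentioned above.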

Note that higher-speed ADCs have lower SNRs, hence for a camera such as the D4, which is intended to have a high frame rate, it is more difficult to achieve the same DR as a camera with a slower frame rate. That is exacerbated in cameras like the D800, which have a lot more data to process too. That is why the D800 has a much slower frame rate than the D4, and why the D4 has a much lower resolution.

But this was the point of my post: the community of DPR users often thinks of DR in terms of "blown highlights", when IMHO it is wholly determined by SNR.

But maybe I am wrong.

It's the same thing, actually! With a scene DR that exceeds the camera's DR, either the shadows will be blocked or the highlights will be clipped (which is effectively a form of quantization noise). The shadows are blocked by read noise, which may well include quantization distortion and other noise sources. (Photon shot noise limits the SNR at higher signal values, not in the shadows, so it is not part of the equation, other than setting a dividing line, as exposure increases, where the image goes from being read-noise limited to photon-noise limited.)

Great Bustard
Forum Pro | Posts: 20,714
Let's get this cleared up.
In reply to Rachotilko, Jul 4, 2012

Rachotilko wrote:

So what is this DR thing? I don't get it and need a not-so-simplistic explanation. Please be patient and follow my thought experiment:

Think of a perfect camera: a noiseless one. It also takes RAWs at a high encoding bit depth (possibly floating-point encoded).

My question is: if such a camera existed (it does not, of course), it would necessarily have an extremely high DR, would it not?

Because whatever detail is present in the raw data could be recovered by a simple curve adjustment (i.e. a function in the value domain), or, if that is not sufficient (due to the contrast limitations of display media), by some tone-mapping (local contrast enhancement) algorithm.

So now I come to my question: why does the photographic community talk of DR separately from noise? As if it were some kind of independent characteristic of a sensing element, while it is IMHO completely determined by the device's S/N ratio.

The DR (dynamic range) is the number of stops from the noise floor to the saturation limit. So, let's talk about that perfect sensor of yours.

A perfect sensor would be a photon counter. That is, it would count every photon that landed on it -- no more, no less. The DR would be infinite, because the noise floor would be zero.

However, as we know, nothing's perfect, and every now and then, the sensor might record a photon that didn't land on it or fail to record a photon that did land on it. Enter the noise floor, and, consequently, a finite DR.

So, let's say the standard deviation of the mean number of electrons being released from a pixel is 4 electrons, and that the pixel can release at most 20000 electrons. This gives us a DR of log2 (20000 / 4) = 12.3 stops / pixel.

But a per-pixel measure of DR doesn't mean much in a comparative sense unless the photos are made from the same number of pixels.

To that end, let's consider two sensors: Sensor A (40 MP) and Sensor B (10 MP), both with the same pixels, but Sensor A has four times as many.

A more realistic comparison is to compare the DR over the same area of the photo. I like to use the DR / μphoto measure, where a μphoto is one-millionth of a photo (40 pixels for Sensor A, 10 pixels for Sensor B).

As mentioned above, noise is a standard deviation, which is the square root of the variance. For random non-correlated phenomena, the variance of the sum is the sum of the variances, which means the standard deviation (noise) of the sum is the square root of the sum of the squares of the standard deviations.

Confused? Sure. Let me work out an example, in the hopes it makes more sense.

The read noise is 4 electrons. This means that the read noise for 40 pixels is sqrt (4² + 4² + ... + 4²) = sqrt (40 x 4²) = 4 sqrt 40 = 25.3 electrons. Likewise, the read noise for 10 pixels is 4 sqrt 10 = 12.6 electrons.

The saturation, however, adds linearly. Thus, the saturation for 40 pixels is 40 x 20000 electrons = 800000 electrons, and the saturation for 10 pixels is 10 x 20000 electrons = 200000 electrons.

Hence, Sensor A (40 MP) will have a DR of log2 (800000 / 25.3) = 14.9 stops / μphoto and Sensor B (10 MP) will have a DR of log2 (200000 / 12.6) = 13.9 stops / μphoto.
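The worked example above (read noise 4 electrons, full well 20000 electrons per pixel) can be reproduced in a few lines: noise adds in quadrature over a patch of pixels, while saturation adds linearly. A minimal sketch:

```python
import math

READ_NOISE = 4      # electrons per pixel, as in the example
FULL_WELL = 20_000  # electrons per pixel

def dr_stops(n_pixels):
    """DR over an n-pixel patch: noise adds in quadrature (read_noise *
    sqrt(n)), saturation adds linearly (full_well * n)."""
    noise = READ_NOISE * math.sqrt(n_pixels)
    saturation = FULL_WELL * n_pixels
    return math.log2(saturation / noise)

print(round(dr_stops(1), 1))   # 12.3 stops per pixel
print(round(dr_stops(40), 1))  # 14.9 stops per "micro-photo", Sensor A
print(round(dr_stops(10), 1))  # 13.9 stops per "micro-photo", Sensor B
```

Note that the gap between the 40-pixel and 10-pixel patches is exactly one stop, which is the quadrupling rule stated later in the post.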

In general, for every quadrupling of pixels, you will get a one stop increase in DR / area, just as for every quadrupling of sensor area, you will get a one stop decrease in photon noise.

This brings up the next point: photon noise is not included in the DR calculation, except inasmuch as one can use the NSR (noise-to-signal ratio) as a basis for the noise floor in the DR calculation.

If we choose a 100% NSR for the noise floor, for example, the noise floor is 4.5 electrons / pixel as opposed to 4 electrons / pixel. We can choose different NSRs to get different noise floors, and thus different measures for DR.
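The 4.5 electrons/pixel figure can be reproduced: at a 100% NSR floor, the noise floor is the signal S at which S equals the total of read noise and photon shot noise added in quadrature, which gives a quadratic in S. A minimal sketch:

```python
import math

read_noise = 4  # electrons, as in the example above

# 100% NSR floor: find S where S = sqrt(read_noise**2 + S)
# (shot noise variance equals the signal S for photon counting).
# Squaring gives S**2 - S - read_noise**2 = 0; take the positive root.
s = (1 + math.sqrt(1 + 4 * read_noise**2)) / 2
print(round(s, 2))  # about 4.53 electrons per pixel
```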

But the fact of the matter is that if we have two sensors, both with the same number of pixels, but one has pixels with a read noise of 4 electrons and saturation of 20000 electrons, and the other with a read noise of 8 electrons and a saturation of 40000 electrons, the DR will be the same, but the photos will have different characteristics, because photon noise is not included in DR measure, except, perhaps, to set the noise floor.

So, what have we learned? First off, DR / pixel is not as useful a measure as DR / area, and, as a corollary, every quadrupling of pixels results in a one stop increase in DR / area (in stark contrast to all those who claim that more pixels result in less DR).

Secondly, the choice of the noise floor is arbitrary (as is the area of the photo over which we measure the DR), so it is important to clarify what noise floor we are using (and why we are using it) as well as what area of the photo we are measuring the DR over.

Lastly, two systems with the same DR will not necessarily have the same characteristics since photon noise is not taken into account in the DR measurement except in choosing a noise floor other than the read noise.

I hope this helps. But if you have any questions, please don't hesitate to ask.

Rachotilko
Contributing Member | Posts: 529
Thanks everybody!
In reply to Rachotilko, Jul 4, 2012

The wealth of information you've provided will take some time to digest, which I will have to find over the weekends to come.

Anyway, I really appreciate the use of rigorous speech when talking about technology, so I am grateful to you. I know the artists here don't appreciate the "gearheadish" terminology, but I think DPR is a place to be somewhat serious sometimes too.

Again, thank you. I will return here soon.

AlanPezz
Regular Member | Posts: 138
Re: Thanks everybody!
In reply to Rachotilko, Jul 4, 2012

Large-sensor cameras have more dynamic range than it is possible to render into a print without a lot of human intervention. This is illustrated here:

http://www.squidoo.com/a-misconception-about-exposure

Great Bustard
Forum Pro | Posts: 20,714
Hmm.
In reply to AlanPezz, Jul 5, 2012

AlanPezz wrote:

Large-sensor cameras have more dynamic range than it is possible to render into a print without a lot of human intervention. This is illustrated here:

http://www.squidoo.com/a-misconception-about-exposure

The S100 sensor (4.6x crop) has 11.5 stops of DR at base ISO:

http://www.sensorgen.info/CanonPowershot_S100.html

so is it safe to say that the S100 is an example of a "large sensor camera"?
