Pixel-size, noise and DR.

Started May 12, 2009 | Discussions
Steen Bay Veteran Member • Posts: 6,974
Pixel-size, noise and DR.

Many here seem to think that smaller pixels are a good thing (and I tend to agree..), because for example a 25mp APS-C sensor would give more detail with the same noise, if we look at the whole image (although the per-pixel noise would be higher).

That sounds reasonable, but what about Dynamic Range? Isn't it unavoidable that smaller pixels, with smaller 'wells', will clip earlier, and therefore cause loss of highlight detail? Isn't that the same as reduced DR? ..and aren't noise and DR like two sides of the same coin? If the image (or sensor) has less Dynamic Range, isn't that the same thing as saying that the image has more noise?

OP Steen Bay Veteran Member • Posts: 6,974
Re: Pixel-size, noise and DR.

Steen Bay wrote:

[snip]

Or maybe I've misunderstood.. maybe the smaller wells are just as 'deep' as the bigger wells, and can handle the same exposure before clipping?

Chris59 Forum Pro • Posts: 14,862
Re: Pixel-size, noise and DR.

For what it's worth, I agree with your premise about smaller pixels, but cannot see that smaller electron wells will necessarily mean more limited dynamic range.

I'm not an electronics expert but even if there was a corresponding reduction in dynamic range as pixel size gets smaller, there are also fewer photons to deal with, so dynamic range need not be compromised.

One can also look at the problem of DR from the other side of the coin and ask why it is that much larger pixels as typically found in FF DSLRs yield such a small increase in DR. I think that whatever the answer is, we are still only "tinkering" at the edges with new DSLR models yielding a third stop here or half a stop there and that a real increase in DR requires a change in the technology employed.

My favourite idea for increasing DR is one proposed by Samsung. As well as recording the amount of light that strikes it, each pixel is also timed by a very accurate clock to see how long it takes to saturate. This gives you an image in which each "blown" pixel can be assigned a brightness level according to the time it took to blow.

If you think about it, you can expose for the shadows (reducing noise) and allow the highlights to "blow out", knowing that full details (to the bit limits of the camera) can be recovered.

The theory, at least to my mind, is elegant and simple and would work extremely well. The difficulty, of course, will be in implementing the technology, and I haven't heard or read anything more about it. Still, it boggles the mind that with a simple "sidestepping" solution to the problem of getting more DR from small pixels, cameras could produce clean, noise-free images with HDR-like dynamic range.
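Under that scheme the recovery step is just arithmetic: if the clock says a pixel filled its well partway through the exposure, the true brightness is the well capacity divided by the saturation time. A minimal sketch (the full-well figure is illustrative, not Samsung's):

```python
def read_pixel(photon_rate, full_well=50_000, exposure=1.0):
    """One pixel under the time-to-saturation scheme (all figures illustrative).

    If the well fills before the exposure ends, brightness is recovered
    from the clocked saturation time instead of the clipped count.
    """
    electrons = photon_rate * exposure
    if electrons < full_well:
        return electrons / exposure        # normal readout
    t_sat = full_well / photon_rate        # time at which the well overflowed
    return full_well / t_sat               # rate reconstructed from the clock

# A pixel 100x over saturation is still measured correctly:
assert read_pixel(40_000) == 40_000
assert read_pixel(5_000_000) == 5_000_000
```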

 Chris59's gear list:
Samsung NX1 Samsung 16-50mm F2.0-2.8
richardplondon Forum Pro • Posts: 10,487
Re: Pixel-size, noise and DR.

Steen Bay wrote:

Or maybe I've misunderstood.. maybe the smaller wells are just as
'deep' as the bigger wells, and can handle the same exposure before
clipping?

AFAICT, they don't need to have the same capacity, because they are only covering a small area. If a lot of 2-inch-high containers are put out in the rain, and an inch of rain falls, the small ones will contain only a little water, and the large ones will contain a lot of water, but they will all be half full. If two inches of rain fall, they will all (in principle) overflow together. The rain-measuring capacity is due to the height of the containers, not the width.

However, some of the small containers (especially) will trap more or less rain than others, so there will be a little variation. This means that many will still not quite be full when 2.1" of rain fall, and a few will not quite be full when 2.2" of rain fall. So averaging these small containers together, we may be able to tell the difference between 2.1" of rain and 2.2" of rain. We can't judge that from the large containers, which all overflowed at 2" of rain.
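The averaging argument can be simulated. Here each 2-inch container gets a small random capacity spread (the 0.1" spread is an arbitrary assumption), and the fraction still unfilled distinguishes rainfalls that a single fixed container would clip identically:

```python
import random

random.seed(0)

def fraction_unfilled(rainfall, n_buckets=100_000, capacity=2.0, spread=0.1):
    """Fraction of buckets that have NOT overflowed after `rainfall` inches.

    Each bucket's effective capacity varies a little, so near the clipping
    point some buckets survive and the ensemble average still carries signal.
    """
    overflowed = sum(
        rainfall >= capacity + random.uniform(-spread, spread)
        for _ in range(n_buckets)
    )
    return 1 - overflowed / n_buckets

# A single 2"-capacity bucket clips identically at 2.02" and 2.08" of rain,
# but the population of varied buckets still tells them apart:
assert fraction_unfilled(2.02) > fraction_unfilled(2.08)
```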

RP

 richardplondon's gear list:
Panasonic LX100 Pentax K-5 Sigma 10-20mm F4-5.6 EX DC HSM Pentax smc DA 21mm F3.2 AL Limited Pentax smc DA 70mm F2.4 AL Limited +7 more
ejmartin Veteran Member • Posts: 6,274
DR is scale dependent

Steen Bay wrote:

Many here seem to think that smaller pixels are a good thing (and I
tend to agree..), because for example a 25mp APS-C sensor would give
more detail with the same noise, if we look at the whole image
(although the per-pixel noise would be higher).

That sounds reasonable, but what about Dynamic Range? Isn't it
unavoidable that smaller pixels, with smaller 'wells', will clip
earlier, and therefore cause loss of highlight detail? Isn't that the
same as reduced DR? ..and aren't noise and DR like two sides of the
same coin? If the image (or sensor) has less Dynamic Range, isn't
that the same thing as saying that the image has more noise?

DR of a pixel must be properly scaled to determine the captured DR of the image. DR has two components, signal level at saturation, and noise with no signal (or if one prefers, signal level at minimum acceptable S/N ratio). The ratio of these two is the dynamic range.

Each of the two components of dynamic range is affected by pixel size. Smaller pixels have smaller well capacity, but then, they make up less of the image so they don't need to have the same capacity. For DR at base ISO, the important quantity is the saturation density of electrons (electrons/unit area at saturation) and that turns out to be largely independent of pixel size, from digicams to FF DSLR's.

The other component of DR is shadow noise. Noise should also be scaled per unit area, so that one is discussing the noise of a fixed portion of the image independent of how many pixels it contains. There is no clear trend of shadow noise per area in current cameras. It seems to increase as pixel size decreases at high ISO, where the noise is predominantly due to the sensor at the lowest signal levels; it seems to decrease as pixel size decreases at low ISO, because the noise there comes predominantly from electronics downstream of the sensor, which are helped by the lower per-pixel signals they have to handle when the pixels are smaller. For instance, at ISO 100 the read noise per area of the Canon 40D is 10% less than that of the Canon 1D3, as measured by the std dev (caveat: the 40D has more pattern noise, which is not accurately measured by the std dev), while at ISO 1600 it is about 30-35% higher.

So, while DR of a pixel goes down as pixel size decreases (largely due to the decreased well capacity), DR per area and therefore DR of the image capture is largely independent of pixel size, particularly at low ISO.
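That scaling can be made concrete. In this sketch, one large pixel is compared with four small ones holding the same total charge; read noise per unit area is assumed equal (matching the observation above), and independent noise adds in quadrature over the patch. All the numbers are illustrative:

```python
import math

def pixel_dr_stops(full_well, read_noise):
    """Engineering DR of a single pixel, in stops."""
    return math.log2(full_well / read_noise)

def area_dr_stops(full_well, read_noise, pixels_per_patch):
    """DR of a fixed image area covered by `pixels_per_patch` pixels.

    Signal adds linearly over the patch; independent read noise adds
    in quadrature.
    """
    signal = full_well * pixels_per_patch
    noise = read_noise * math.sqrt(pixels_per_patch)
    return math.log2(signal / noise)

# One big pixel vs four small ones with the same saturation density and
# (as observed empirically) the same read noise per unit area:
big, small = (40_000, 2.0, 1), (10_000, 1.0, 4)

assert pixel_dr_stops(10_000, 1.0) < pixel_dr_stops(40_000, 2.0)  # per-pixel DR drops
assert abs(area_dr_stops(*big) - area_dr_stops(*small)) < 1e-9    # per-area DR identical
```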

OP Steen Bay Veteran Member • Posts: 6,974
Clarkvision.com?

ejmartin wrote:

[snip]

It sounds perfectly reasonable when you say that "The saturation density.. ..turns out to be largely independent of pixel size", but I knew that my misunderstanding came from somewhere, and I actually managed to find the source! Near the top of the link below is an illustration of 'Photon rain' into buckets. The small bucket has much lower sides than the big bucket, and will therefore overflow long before the bigger bucket! But that's wrong?

http://www.clarkvision.com/imagedetail/does.pixel.size.matter/

ejmartin Veteran Member • Posts: 6,274
Re: Clarkvision.com?

Steen Bay wrote:

ejmartin wrote:

[snip]

It sounds perfectly reasonable when you say that "The saturation
density.. ..turns out to be largely independent of pixel size", but I
knew that my misunderstanding came from somewhere, and I actually
managed to find the source! Near the top of the link below is an
illustration of 'Photon rain' into buckets. The small bucket has much
lower sides than the big bucket, and will therefore overflow long
before the bigger bucket! But that's wrong?

http://www.clarkvision.com/imagedetail/does.pixel.size.matter/

Yes, Roger's graphic is misleading. In an accurate analogy, the bucket would be just as tall, but with a narrower cross-sectional area. Then a large number of tall narrow buckets would have the same holding capacity as one big bucket of the same height.

The rest of that page is mostly a comparison between large and small sensors rather than large and small pixels. As usual with Roger, you need to read carefully; the data doesn't always support the conclusion being drawn, or people are inclined to misinterpret what he is saying on the basis of internet-myth-generated preconceptions.

To pick two currently popular cameras, the Panasonic LX3 (2µ pixels) has a saturation density over twice that of the Nikon D3 (8.5µ pixels). In part that is because the LX3 base ISO is 80, as compared to the D3's base ISO of 200. But it shows that there need not be much correlation between pixel size, and the number of electrons per unit area at saturation of the sensor.
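The saturation-density comparison is a one-liner: electrons at full well divided by pixel area. The full-well figures below are assumed for illustration only, chosen to be roughly consistent with the ratio quoted above:

```python
def sat_density(full_well_e, pitch_um):
    """Electrons per square micron at sensor saturation."""
    return full_well_e / pitch_um ** 2

# Assumed full-well values (not measured here, purely illustrative):
d3_density = sat_density(65_000, 8.5)   # Nikon D3, 8.5µ pixels
lx3_density = sat_density(9_000, 2.0)   # Panasonic LX3, 2µ pixels

assert lx3_density > 2 * d3_density     # over twice, per the comparison above
```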

kwik Forum Member • Posts: 84
DR and the human eye

Sorry to sidetrack the discussion, but do you happen to know what the DR of the human eye is under various circumstances and how these figures are derived? Please refer to this post and the three responses to it:

http://forums.dpreview.com/forums/read.asp?forum=1022&message=31827941

richardplondon Forum Pro • Posts: 10,487
Re: DR and the human eye

kwik wrote:

Sorry to sidetrack the discussion, but do you happen to know what the
DR of the human eye is under various circumstances and how these
figures are derived?

It is human nature to first look for the extreme answer, and it often happens in this area.

Sure: we can squint at the sun for a very brief moment with our eyelids nearly crammed shut, putting up with the big multicoloured afterimages; and several hours later, we can hang around for ten minutes in the dark while our eyes adapt, and then we can pick out a faint planet in the night sky.

But this is our visual system's capability, not its dynamic range, IMO. A digital camera can also photograph the sun for a brief moment, and next it can make a long exposure in almost perfect darkness - and with zero adaptation time between. We don't consider that to be dynamic range, we consider it to be adjustability, because the two captures were not simultaneous.

The same is strictly true for our eyes and brain, but because our visual experiences are always inseparably grouped - cumulative - we think about something seen more or less at once, when really we have scanned it for a while [using many different and complex processes, including adaptation, edge detection, daring inference and sheer invention - for example, filling in our "blind spot" and reconciling binocular vision into a single impression. There's a lot we will report having seen, which experiment shows we simply haven't.]

Furthermore, we can't directly compare the subjective aspects of looking at a compressed brightness representation (a print, say) with a full brightness range scene in reality, without also considering the effects of that reduced brightness range. If we could duplicate that for our eyes - a low contrast filter - then we could look at the bright sun and the deep shadows all at the same time; but we can't. So that's not a fair comparison.

So about all we can do is to look at the information the human eye can gather during a mere second or two (allowing "local scanning" adaptation but not "overall scene" adaptation), from a single extremely contrasty scene, and judge how well a picture from a digital camera viewed with a similar luminance range compares in information content, when also viewed for the same period. Most monitors cannot do this, and (for sure) a print cannot, unless we subjectively and imaginatively "enter its world" in the viewing, which has no counterpart in viewing direct reality.

But there are specialist displays now that show something remarkably like looking at the world - a dazzling sun is dazzling to look at, deep shadows are dim and obscure, when we view this display in all the same flawed and problematic ways that we view reality. A brief impression is easily represented within our usual digital camera DR - with blown sky, filled shadows, and all: after all, the sky is a mere dazzle when we see it peripherally as we look at the land; and while we look at the clouds instead, the land is a dark blur. All cameras can do somewhat better than that.

True, a longer, more exploratory and forensic exploration will always reveal when something is a photograph and not a live scene, even in ideal viewing circumstances; but to expect everything from a photograph that we do from a real scene is to miss the point of photography IMO - to stray into the realms of simulation, not representation.

RP

 richardplondon's gear list:
Panasonic LX100 Pentax K-5 Sigma 10-20mm F4-5.6 EX DC HSM Pentax smc DA 21mm F3.2 AL Limited Pentax smc DA 70mm F2.4 AL Limited +7 more
OP Steen Bay Veteran Member • Posts: 6,974
Free lunch?

ejmartin wrote:

[snip]

So, sensor size matters, but pixel size (largely) doesn't, when we're talking noise and DR, and since smaller pixels give us more resolution, it seems that a 'free lunch' actually does exist!

richardplondon Forum Pro • Posts: 10,487
Re: Free lunch?

Steen Bay wrote:

So, sensor size matters, but pixel size (largely) doesn't, when we're
talking noise and DR, and since smaller pixels gives us more
resolution, it seems that a 'free lunch' actually does exist!

Bigger processing load, more data to buffer, more storage, possibly more waste heat to deal with? Greater reject rate in the sensor manufacture? TANSTAAFL.

It has taken a considerable engineering effort to get us where we are now, and this will get harder and harder. Results need to justify this effort, and consumer demand needs to pay for it.

RP

 richardplondon's gear list:
Panasonic LX100 Pentax K-5 Sigma 10-20mm F4-5.6 EX DC HSM Pentax smc DA 21mm F3.2 AL Limited Pentax smc DA 70mm F2.4 AL Limited +7 more
OP Steen Bay Veteran Member • Posts: 6,974
DR and digital cameras.

I suppose one could also argue that my 40D has 33 stops of DR, namely from 30 sec, f/1.0, ISO 3200 to 1/8000 sec, f/32, ISO 100!
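That 33-stop figure decomposes as 18 stops of shutter speed, 10 stops of aperture (exposure goes as 1/N²), and 5 stops of ISO; a quick check:

```python
import math

# Shutter 30 s .. 1/8000 s, aperture f/1.0 .. f/32, ISO 100 .. 3200:
shutter_stops = math.log2(30 / (1 / 8000))   # ~17.9, i.e. 18 stops
aperture_stops = 2 * math.log2(32 / 1.0)     # 10 stops (exposure goes as 1/N**2)
iso_stops = math.log2(3200 / 100)            # 5 stops

total = round(shutter_stops) + round(aperture_stops) + round(iso_stops)
assert total == 33
```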

kwik wrote:

Sorry to sidetrack the discussion, but do you happen to know what the
DR of the human eye is under various circumstances and how these
figures are derived? Please refer to this post and the three
responses to it:

http://forums.dpreview.com/forums/read.asp?forum=1022&message=31827941

bobn2 Forum Pro • Posts: 51,546
Re: DR and digital cameras.

Steen Bay wrote:

I suppose one could also argue that my 40D has 33 stops of DR, namely
from 30 sec, f/1,0, iso3200 to 1/8000 sec, f/32, iso100!

It's also very hard to make any assessment of the quality of image produced by the eye, and whether highlights are blown, etc. Given that the image you perceive depends a great deal on processing - stacking and stitching multiple images in real time - most of this DR would seem to be in the signal processing.

It would be physically difficult for the eye to have the DR proposed by some pundits. The question then is the 'read noise' of the eye and its photon saturation density. Even assuming 100% photon collection efficiency (and I'm sure it's far from that; the rods and cones don't cover an enormous area of the retina), the 40 stops range proposed by some would require a stupendous saturation density and deep sub-photon read noise. My guess would be that the eye performs worse on both fronts than modern silicon sensors.
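A back-of-envelope check makes the point. With read noise of about one detected photon and an assumed 2µ receptor pitch (both figures are guesses for illustration), a 40-stop range implies a saturation density many orders of magnitude beyond any silicon sensor:

```python
# All figures here are assumptions for a back-of-envelope check.
STOPS = 40                 # claimed DR of the eye
READ_NOISE = 1.0           # photons; roughly the floor for any detector
CONE_PITCH_UM = 2.0        # guessed foveal receptor pitch

photons_at_saturation = READ_NOISE * 2 ** STOPS           # ~1.1e12 photons
density = photons_at_saturation / CONE_PITCH_UM ** 2      # photons per square micron

# Silicon sensors saturate around a few thousand electrons per square micron;
# the claim needs roughly eight orders of magnitude more.
assert density > 1e11
```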
--
Bob

Lee Jay Forum Pro • Posts: 51,895
Re: Free lunch?

Steen Bay wrote:

So, sensor size matters, but pixel size (largely) doesn't, when we're
talking noise and DR, and since smaller pixels gives us more
resolution, it seems that a 'free lunch' actually does exist!

That's right, except there are secondary effects that do cause a loss of DR in smaller pixels. They don't have an inherent reduction of DR for the reasons explained (Qsat goes down as fast as photon flux), but those secondary effects do cause a small reduction as size goes down (Qsat goes down a little faster than photon flux).
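One way to picture those secondary effects: if Qsat scales as pitch to a power slightly above 2 (the 2.2 exponent here is purely illustrative, not a measured value), saturation density per area is no longer pitch-independent, and halving the pitch costs a fraction of a stop:

```python
import math

def sat_density(pitch_um, exponent, k=1000.0):
    """Saturation density per unit area if Qsat ~ pitch**exponent."""
    return k * pitch_um ** (exponent - 2)

# Pure 2D scaling (exponent 2): density, and hence DR per area, is
# pitch-independent.
assert sat_density(2.0, 2.0) == sat_density(4.0, 2.0)

# With secondary effects (exponent 2.2, purely illustrative), halving the
# pitch costs a fraction of a stop of highlight headroom:
loss_stops = math.log2(sat_density(4.0, 2.2) / sat_density(2.0, 2.2))
assert 0.1 < loss_stops < 0.3
```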


Lee Jay
(see profile for equipment)

 Lee Jay's gear list:
Canon IXUS 310 HS Canon PowerShot SX260 HS Canon EOS 5D Canon EOS 20D Canon EOS 550D +22 more
bobn2 Forum Pro • Posts: 51,546
Re: Free lunch?

ljfinger wrote:

Steen Bay wrote:

So, sensor size matters, but pixel size (largely) doesn't, when we're
talking noise and DR, and since smaller pixels gives us more
resolution, it seems that a 'free lunch' actually does exist!

That's right, except there are secondary effects that do cause a loss
of DR in smaller pixels. They don't have an inherent reduction of DR
for the reasons explained (Qsat goes down as fast as photon flux),
but those secondary effects do cause a small reduction as size goes
down (Qsat goes down a little faster than photon flux).

I'd be interested to see the source of that - a few of us have been discussing it for a while. In practice, so far as I can see, Qsat has a lot of design in it, and it is controlled in real sensors to adjust operational parameters. This case would seem to suggest that the link is not as hard as you suggest: http://forums.dpreview.com/forums/read.asp?forum=1018&message=31843673

The other question is the exact scale dependence of read noise. There still seems to be room for discussion there.


Bob

Lee Jay Forum Pro • Posts: 51,895
Re: Free lunch?

bobn2 wrote:

The other question is the exact scale dependence of read noise. There
still seems to be room for discussion there.

That topic is so dependent on the details of the design that I don't believe any generalities can be drawn.


Lee Jay
(see profile for equipment)

 Lee Jay's gear list:
Canon IXUS 310 HS Canon PowerShot SX260 HS Canon EOS 5D Canon EOS 20D Canon EOS 550D +22 more
bobn2 Forum Pro • Posts: 51,546
Re: Free lunch?

ljfinger wrote:

bobn2 wrote:

The other question is the exact scale dependence of read noise. There
still seems to be room for discussion there.

That topic is so dependent on the details of the design that I don't
believe any generalities can be drawn.

but they can on saturation density?
--
Bob

bobn2 Forum Pro • Posts: 51,546
Re: Clarkvision.com?

Part of the problem is that Roger says things that are actually profoundly misleading. Two quotes from that page:

'Because good digital cameras are photon noise limited, the larger pixels will always have higher signal-to-noise ratios unless someone finds a way around the laws of physics, which is highly unlikely.'

That's where all these spurious quotes of the 'laws of physics' come from! Well, it's true that larger pixels have higher signal-to-noise ratios, but that doesn't mean that images produced from them have higher SNRs at any given spatial frequency. You'd think that, as a physicist, he'd understand that.

'Image detail can be blurred by diffraction. Diffraction is more of an issue with smaller pixels, so again cameras with larger pixels will perform better, giving sharper images.'
Which is just plain wrong.
--
Bob

Lee Jay Forum Pro • Posts: 51,895
Re: Free lunch?

bobn2 wrote:

but they can on saturation density?

I think so - similar process technologies and materials used. Unless you know of some SiC or Diamond sensors.


Lee Jay
(see profile for equipment)

 Lee Jay's gear list:
Canon IXUS 310 HS Canon PowerShot SX260 HS Canon EOS 5D Canon EOS 20D Canon EOS 550D +22 more
bobn2 Forum Pro • Posts: 51,546
Re: Free lunch?

ljfinger wrote:

bobn2 wrote:

but they can on saturation density?

I think so - similar process technologies and materials used. Unless
you know of some SiC or Diamond sensors.

The same applies to read noise (or at least the sensel component of it).

However, I don't think I agree with your proposition, if that's what it's based on. If you're talking about saturation photoelectron density under pure two-dimensional scaling, then it should be pretty invariant. Under three-dimensional scaling it will increase as the scale is reduced, due to thinner layers. If you're talking about design variations, all bets are off.


Bob
