Relative Pixel Density - don't expect miracles from 40D

Then you should be comparing equal physical size crops, to isolate that difference.
Are you saying a larger sensor with the same pixel count but lower
pixel density gives a better IQ than a smaller one due to the sensor
size difference instead of the pixel size difference, assuming they
share the same modern technology?
Well, they are both varying in that case. The bottom line is that there are more photons collected for the image in the large sensor, and that would be true even if you increased the pixel density of the larger sensor to include more pixels.
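The photon-counting point can be sketched numerically. This is a minimal toy model (the photon counts are made-up), where "viewing at the same size" is approximated by summing the small pixels back into the big pixel's area; shot noise then depends only on total photons captured, not on how they were divided among pixels:

```python
import numpy as np

rng = np.random.default_rng(0)
mean_total = 1_000_000  # mean photons falling on a fixed patch of sensor (made-up)

# Case A: the patch is one big pixel
big = rng.poisson(mean_total, size=10_000)

# Case B: the same patch split into 100 small pixels, summed for same-size viewing
small = rng.poisson(mean_total / 100, size=(10_000, 100)).sum(axis=1)

def snr(x):
    return x.mean() / x.std()

print(round(snr(big)), round(snr(small)))  # both ~ sqrt(1e6) = 1000
```

Both cases give the same image-level signal-to-noise ratio, because a sum of independent Poisson captures is itself Poisson with the same total mean.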

In the specific case of 20D vs 1Dmk2, the 1Dmk2 could conceivably do much better, shot-noise-wise, than it does, but it is compromised for whatever reasons, partly because of all the support transistors at each photosite. A simple CCD sensor the same size could potentially collect up to twice as many photons as the 1Dmk2 does, for lower shot noise.
If not that, why is it so important to isolate these things (pixel
density in a given sensor size vs. pixel density in different sized
sensors)?
So that you know what the real factors are. There is a lot of pixel-centric talk going on, which leads to false conclusions about the trade-offs involved in pixel density. There are also lots of horror stories about small sensors lowering quantum efficiency, etc., but for the most part they aren't coming true. The 2-micron compact camera sensors are capturing photons more efficiently, per unit of area, than many DSLRs are, including big-pixel DSLRs. This is exactly the opposite of what the horror stories would suggest.

The only disadvantage to the capture per unit of area in something like an FZ50, compared to a 1Dmk2, is that the Panasonic CCD sensor does not have provisions for optimized readout at high ISOs, so your ISO 1600 is going to have a noise floor about 1/2 stop higher. At ISO 100, however, it is a stop lower for the Panasonic.
I'm not sure if any real life examples of a given sensor size with
equal technology - including equal noise reduction - but different
pixel density can be found.
That's why you have to compare crops of large sensors with big pixels to equal area crops or full sensors of compact cameras. There is no other way to see the potential of small pixels, in real-world full-sensor examples.

--
John

 
Ok, but what about Dynamic Range? Wouldn't a 100 MP FF-sensor have a
lousy DR compared to a 10 MP FF-sensor?
From a pixel-centric viewpoint, yes, the smaller pixels would have more noise from shot noise alone, and might have more read noise due to readout speed concerns, but what is the purpose of the pixel-centric viewpoint? It is only meaningful if you intend to display the 100MP image 3.16x as wide in the viewer's field of view. If you plan to view both images at the same size in the field of view, then the dynamic range will not vary because of shot noise (if the total image photon capture does not change), and how it may vary because of read noise depends on the methods and circuitry used for readout; parallelized readout could actually give lower image read noise for the 100MP image, and read noise is usually the main limiter of DR.

I'm just talking about DR as affected by the noise floor there; there is also the issue of clipping at max values; having more shot noise per pixel means that statistically, a small number of pixels will not clip even when the average signal is well above the clipping point, and the closer to the actual clipping point for the clipped average value, the more pixels will fail to clip, giving a gentle roll-off near clipping that squeezes in a bit of extra highlights.
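The highlight roll-off can be sketched with a toy Poisson clipping model (the full-well value here is made-up and deliberately small, so shot noise is a large fraction of the signal and the effect is easy to see):

```python
import numpy as np

rng = np.random.default_rng(1)
full_well = 1000  # e- per pixel (made-up; small on purpose)

recorded = {}
for true_mean in (900, 1000, 1050, 1200):
    # Poisson photon capture, hard-clipped at the full well
    x = np.minimum(rng.poisson(true_mean, 100_000), full_well)
    recorded[true_mean] = (x.mean(), (x == full_well).mean())
    print(true_mean, recorded[true_mean])
```

Even with the average signal above the clipping point (true_mean = 1050), a fraction of pixels still lands below full well, so the recorded mean stays just under the clip value instead of slamming into it: that is the gentle shoulder, and it is relatively stronger for smaller (noisier) pixels.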

--
John

 
So, noise is actually a good thing! :) Seriously, wouldn't the 100 MP sensor have several stops lower DR than the 10 MP sensor, and isn't that a serious problem?
 
No, the sensor would not have significantly less DR. Only pixels, taken out of context (the image), would.

In fact, DR of the image can actually increase if you can keep read noise from getting much higher per pixel.
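One way to see this is with illustrative numbers (all made-up, not measurements of any real sensor). Binning k small pixels adds signal linearly but read noise only in quadrature, so image-level DR beats the big pixel whenever per-pixel read noise falls below read_big / sqrt(k):

```python
import math

# Hypothetical numbers for illustration only
full_well_big = 60000.0   # e- in one 10 MP-class pixel
read_big      = 8.0       # e- read noise per big pixel

k = 10                          # 10x the pixel count on the same area (100 MP vs 10 MP)
full_well_small = full_well_big / k   # well capacity scales with pixel area
read_small      = 2.0           # assumed achievable per small pixel (below 8/sqrt(10))

# Pixel-centric DR, in stops
dr_pixel_big   = math.log2(full_well_big / read_big)      # ~12.9
dr_pixel_small = math.log2(full_well_small / read_small)  # ~11.6

# Image-level DR: binning k pixels adds signal k-fold, read noise only sqrt(k)-fold
dr_image_small = math.log2(k * full_well_small / (math.sqrt(k) * read_small))  # ~13.2
```

So the individual small pixel loses over a stop of DR, yet the assembled image gains a third of a stop over the big-pixel sensor, under these assumed read-noise numbers.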

--
John

 
I hope you are right, but it sounds almost too good to be true, doesn't it? Infinite resolution, better DR and lower noise... Well, time will tell, I suppose :)
 
Well, it's not that good. Resolution will always be limited by the optics. Trillions of pixels require more data storage, even if they don't have any extra photons to record. There are points of diminishing returns, but I don't think they are very close with current DSLRs.
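As a rough storage sanity check on the "trillions of pixels" point (assuming a typical 14-bit raw sample and no compression):

```python
pixels = 1_000_000_000_000   # one trillion pixels
bits_per_sample = 14         # a typical raw bit depth (assumption)

terabytes = pixels * bits_per_sample / 8 / 1e12
print(terabytes)  # 1.75 TB per uncompressed raw frame
```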

--
John

 
Well, they are both varying in that case. The bottom line is that
there are more photons collected for the image in the large sensor,
and that would be true even if you increased the pixel density of the
larger sensor to include more pixels.
I'd think we should isolate resolution and IQ (= noise in this context) from each other here. The larger amount of photons in the larger sensor case can be converted to a better resolution (sticking with the pixel density of the smaller sensor while increasing pixel count) OR better IQ (lowering pixel density while maintaining the pixel count). Both can't be achieved at the same time.
That's why you have to compare crops of large sensors with big pixels
to equal area crops or full sensors of compact cameras. There is no
other way to see the potential of small pixels, in real-world
full-sensor examples.
If you go the other way round: take a no-crop DSLR image vs. several zoomed-in P&S images composed, both in low light / high ISO. Presume the resolution is sufficient for your needs in both cases. Which one would you pick? I'd suggest you can see the difference: clean vivid colours, deep blacks and crisp whites in the DSLR image vs. worse IQ in the P&S case... The final composed sensor sizes are the same but the high pixel density of the P&S has led to worse IQ.

Timo
 
I'd think we should isolate resolution and IQ (= noise in this
context) from each other here. The larger amount of photons in the
larger sensor case can be converted to a better resolution (sticking
with the pixel density of the smaller sensor while increasing pixel
count) OR better IQ (lowering pixel density while maintaining the
pixel count).
That's not IQ. That's PQ (Pixel Quality).
Both can't be achieved at the same time.
Pixel density is adverse to PQ, all other things being equal. It is not adverse to IQ or SQ (Subject Quality).
If you go the other way round: take a no-crop DSLR image vs. several
zoomed-in P&S images composed, both in low light / high ISO. Presume
the resolution is sufficient for your needs in both cases. Which one
would you pick?
I do not understand what your scenario is here. The language is not precise enough; I can only guess what you're asking. If you are asking which would be better, a full DSLR image or several compact-sensor images stitched together to simulate the same size sensor as the DSLR, then that is the same thing that I suggested, only taken to a larger scale. The stitched compacts will give much better resolution, similar or even better image shot noise, and anywhere from slightly more image read noise at ISO 1600 (about 1/2 stop compared to some Canon DSLRs) to much less image read noise for the stitch (compared to a high-noise DSLR like the D2X).
I'd suggest you can see the difference: clean vivid
colours, deep blacks and crisp whites in the DSLR image vs. worse IQ
in the P&S case... The final composed sensor sizes are the same but
the high pixel density of the P&S has led to worse IQ.
Imagine a real, high-detail analog image on a focal plane. Imagine now tiles of different sizes placed over the image, each tile taking the average value of the area under it and being uniformly colored that average color. The larger tiles are going to have a larger percentage of the area they cover represented falsely by the single value than the smaller tiles will. In smooth bokeh areas, the individual tile noise will be higher for the smaller tiles, but the larger size of the larger tiles makes their lesser noise just as visible.

You can simulate this yourself in Photoshop: take a 400x400-pixel 128/128/128 grey image and add chromatic Gaussian noise to it. Make three copies, and pixelate the copies to 2x2, 3x3, and 4x4 tiles. Look at all 4 at 100% on the screen, up close. Now step back and look from across the room; you might only see noise in the bigger tiles. If you think the tiles are too sharp, then use downsampling and upsampling to achieve a smoother pixelation.
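The same experiment can be sketched in numpy instead of Photoshop (same 400x400 grey-plus-Gaussian-noise recipe; per-tile noise falls roughly as 1/t while each tile covers t^2 the area, which is why the coarser tiles stay visible from across the room):

```python
import numpy as np

rng = np.random.default_rng(2)
# 400x400 mid-grey image plus Gaussian noise, as in the Photoshop recipe
img = 128.0 + rng.normal(0.0, 10.0, size=(400, 400))

for t in (1, 2, 4):
    # "Pixelate": replace each t x t tile with its average value
    tiles = img.reshape(400 // t, t, 400 // t, t).mean(axis=(1, 3))
    print(t, round(tiles.std(), 2))  # per-tile noise drops ~1/t as tile size grows
```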

--
John

 
Three issues:
- 400 x 400 pixels instead of 8+ MP
- grey color (= average noise color)
- looking at a monitor image instead of a print.

This thread is full now. I think I'll run some experiments and maybe come back.

Btw, your understanding of my awkward English was right.

Timo
 
