More megapixels, better photos: Fact or fiction?

Started Feb 7, 2008 | Discussions thread
sugar Senior Member • Posts: 2,136
I think you are misinformed

You talk only about statistics and forget about the electronics.

1. More pixels mean a smaller 'net' light-gathering surface (because of the per-pixel circuitry) at the same process technology.

2. You assume that the signal-to-noise ratio per pixel is independent of pixel size, which is wrong. Google 'noise floor' and you will learn more about it. If we kept making the pixels smaller, we would reach a point where pictures would be a mere 'image' of noise.
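The noise-floor point can be sketched numerically. In the toy model below (editor's illustration; the 10,000-photon signal and the 5 e- read noise are made-up numbers), a fixed read noise is added per pixel, so dividing the same patch of sensor into more, smaller pixels lowers the aggregate SNR of that patch:

```python
import math

def area_snr(photons_per_area, pixels_per_area, read_noise):
    """SNR for a fixed patch of sensor. Shot noise depends only on the
    total light, but each pixel contributes its own read-noise floor,
    so more (smaller) pixels mean more read noise per unit area."""
    signal = photons_per_area
    shot_var = photons_per_area            # Poisson: variance equals mean
    read_var = pixels_per_area * read_noise ** 2
    return signal / math.sqrt(shot_var + read_var)

# Same patch of sensor, same light, hypothetical 5 e- read noise per pixel:
for pixels in (1, 4, 16):
    print(pixels, round(area_snr(10_000, pixels, 5.0), 1))
# -> 1 99.9 / 4 99.5 / 16 98.1 : the SNR of the patch drops as it is
#    carved into more pixels, each adding its own read noise.
```

With zero read noise the three values would be identical, which is exactly the regime the quoted post below assumes.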

The article is flawed because it lets the camera makers get away with their lousy excuses. Posed the right question, camera makers would have to admit that picture quality would have improved even more if they had kept the pixel count constant and merely integrated the most modern technology. But who would buy an 8 MP camera as an upgrade for an 8 MP camera? No one, because most consumers aren't informed and don't understand what picture quality really means...

ejmartin wrote:

I see a lot of posts decrying the megapixel race. Much of this is
due to pixel-based analysis. As pixel size decreases, the number of
photons being sampled by a pixel decreases, and so the sample
variance goes up. It's basic statistics -- the larger the sample
size, the smaller the sampling error; the smaller the sample size,
the larger the sampling error (that's why accurate polls, drug
trials, etc use as large a sample as possible).

So of course, increasing the megapixels decreases the pixel size,
which decreases the number of photons counted by a pixel for a
given amount of ambient light, which increases the noise per pixel.
But it's not as though some photons were lost (assuming fill factor
remains constant) -- it's just that a given area of sensor is
occupied by more pixels, and so the sampling of photons is divided up
into smaller parcels. While the individual parcels have more
variance, since a fixed area of sensor has the same number of photons
falling on it, a fixed area of the image still has the same number of
photons and thus the same sample variance and hence the same noise on
a per area basis. The only thing that more pixels brings is more
resolution, and that's a good thing.
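The per-area claim above can be sanity-checked with a quick simulation (editor's sketch, not part of the original post; shot noise is approximated as Gaussian with variance equal to the mean, and the photon counts are arbitrary):

```python
import math
import random

random.seed(0)
mean_photons = 10_000     # mean photon count for one fixed patch of sensor
trials = 20_000

def std(xs):
    m = sum(xs) / len(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))

# One big pixel covering the patch (shot noise ~ Gaussian, variance = mean):
big = [random.gauss(mean_photons, math.sqrt(mean_photons))
       for _ in range(trials)]

# Sixteen small pixels covering the same patch, each seeing 1/16 of the
# light, then summed back into one per-area value:
small = [sum(random.gauss(mean_photons / 16, math.sqrt(mean_photons / 16))
             for _ in range(16))
         for _ in range(trials)]

# Both standard deviations come out near sqrt(10_000) = 100: the noise
# per fixed area is the same however the area is parceled into pixels.
print(round(std(big)), round(std(small)))
```

Nothing here depends on the pixel count; only the partitioning of the same photon stream changes, which is the post's point.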

The problem here is that the conventional wisdom is wrong -- most
people look at noise by viewing their images at 100% on a monitor,
and this is using the pixel as the basis of comparison. So yes,
increasing the pixel count and viewing the image at 100%, you are
blowing the image up more and looking at a finer scale which has more
noise. And test protocols such as here at DPR and elsewhere use the
noise per pixel as the basis of comparison, which biases the analysis
against smaller pixels from the beginning. A proper measure would
account for pixel size by measuring noise per area and not noise per
pixel, since this is what someone looking at the entire image will
see -- the eye takes in the image on a per area basis, not a per
pixel basis.
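The proposed normalization can be made concrete (editor's sketch; the photon totals and megapixel counts are illustrative): per-pixel noise grows as pixels shrink, but rescaled to a fixed output area it is identical across sensors.

```python
import math

def per_pixel_noise(photons_per_pixel):
    """Relative shot-noise standard deviation for one pixel."""
    return math.sqrt(photons_per_pixel) / photons_per_pixel

# Hypothetical sensors with the same area and the same total light,
# divided into different pixel counts:
total_photons = 4_000_000_000
for megapixels in (4, 8, 16):
    n = total_photons / (megapixels * 1_000_000)   # photons per pixel
    pixel_noise = per_pixel_noise(n)               # what a 100% crop shows
    # Normalizing to a fixed output area: averaging k pixels cuts the
    # noise by sqrt(k), so per *image area* every sensor measures alike.
    area_noise = pixel_noise / math.sqrt(megapixels / 4)
    print(megapixels, round(pixel_noise, 4), round(area_noise, 4))
# -> the per-pixel column grows with megapixels (what 100% viewing and
#    per-pixel test protocols report), while the per-area column is
#    constant at about 0.0316 for all three sensors.
```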
--
emil
http://theory.uchicago.edu/~ejm/pix/20d/


I can crop at the long end myself if I want to

http://supermasj.zenfolio.com/
