What the imager has

Started 9 months ago | Discussions thread
DickLyon
New Member | Posts: 7
Re: What the imager has
In reply to Kendall Helmstetter Gelner, 9 months ago

Kendall, long time...

You're right that there won't be much aliasing. A lot of people seem to have the idea that aliasing has something to do with different sampling positions or density, as in Bayer. But that's not the key issue. The problem with Bayer is that the red plane (for example) can never have more than 25% effective fill factor, because the sampling aperture is only half the size, in each direction, of the sample spacing. If you take the Fourier transform of that half-size aperture, you'll find it doesn't do much smoothing, so the response is still much too high well past the Nyquist frequency. That's why it needs an anti-aliasing filter to do extra blurring. But if the AA filter is strong enough to remove all the aliasing in red, it also throws away the extra resolution that having twice as many green samples is supposed to give. It's a tough tradeoff.
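To put numbers on that, here's a small numpy sketch (my own illustration, with photosite pitch as the unit of distance, not anything from a real sensor spec): the MTF of a box aperture of width w is |sinc(f·w)|, and for the Bayer red plane the aperture is half the sample spacing in each direction.

```python
import numpy as np

# MTF of a box (rectangular) aperture of width w: |sinc(f * w)|,
# using numpy's normalized sinc(x) = sin(pi x)/(pi x).
def aperture_mtf(f, w):
    return abs(np.sinc(f * w))

# Bayer red plane, in units of the photosite pitch:
# sample spacing p = 2 photosites, aperture width w = 1 photosite,
# i.e. only 25% fill factor of the red sampling cell.
p, w = 2.0, 1.0
f_nyquist = 1 / (2 * p)   # 0.25 cycles/photosite: red-plane Nyquist
f_sample  = 1 / p         # 0.5  cycles/photosite: red sampling frequency

print(aperture_mtf(f_nyquist, w))  # ~0.90: barely attenuated at Nyquist
print(aperture_mtf(f_sample, w))   # ~0.64: still strong at the sampling
                                   # frequency, where detail aliases to DC
```

So the half-size aperture alone leaves ~64% response at the very frequency that folds down to DC, which is why the separate AA filter has to do the rest of the blurring.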

In the Foveon sensor, the reason no AA filter is needed is not because of where the samples are, or what the different spatial sampling densities are. It's because each sample is taken through an aperture of nearly 100% fill factor, that is, as wide each way as the sample pitch. The Fourier transform of this aperture has a null at the spatial frequencies that would alias to low frequencies; this combined with a tiny bit more blur from the lens PSF is plenty to keep aliasing to a usually invisible level, while keeping the image sharp and high-res.
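The same little sinc calculation shows the null: when the aperture width equals the sample pitch (100% fill factor), the aperture MTF goes to exactly zero at f = 1/pitch, the frequency that would alias down to DC, while still passing most of the response at Nyquist.

```python
import numpy as np

# With ~100% fill factor, aperture width = sample pitch p, so the
# box-aperture MTF |sinc(f * p)| has a null exactly at f = 1/p --
# the spatial frequency that would otherwise alias to low frequencies.
p = 1.0                               # sample pitch (arbitrary units)
mtf = lambda f: abs(np.sinc(f * p))   # box-aperture MTF

print(mtf(1.0 / p))   # 0.0  -> null right at the sampling frequency
print(mtf(0.5 / p))   # ~0.64 -> still plenty of response at Nyquist
```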

In the 1:1:4 arrangement, each sample layer has this property, but at different rates -- very unlike the Bayer's red and blue planes.  The large area of the lower-level pixels is the ideal anti-aliasing filter for those layers; the top layer is not compromised by the extra spatial blurring in the lower layers, so it provides the extra high frequencies needed to make a full-res image.

Another good way to think of the lower levels is that they get the same four samples as the top level, and then "aggregate" or "pool" four samples into one. This is easy to simulate by processing a full-res RGB image in Photoshop or whatever.
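That pooling simulation is just a 2x2 block average applied to the lower layers while the top layer is left at full resolution. A toy numpy version (my own sketch of the idea, not Foveon's actual processing) looks like this:

```python
import numpy as np

# "Pool" four samples into one: 2x2 block averages, replicated back up
# to full size so the pooled plane aligns with the full-res top layer.
def pool_2x2(plane):
    h, w = plane.shape
    blocks = plane.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return np.repeat(np.repeat(blocks, 2, axis=0), 2, axis=1)

rng = np.random.default_rng(0)
top = rng.random((4, 4))             # top layer: kept at full resolution
low = pool_2x2(rng.random((4, 4)))   # lower layer: 4 samples aggregated
# every pixel in a 2x2 block of `low` now shares the pooled value
```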

The pooling of 4 into 1 is done most efficiently in the domain of collected photo-electrons, before converting to a voltage in the readout transistor. The result is the same read noise, but four times as much signal, so up to about a 4X better signal-to-noise ratio where read noise dominates. Plus with fewer plugs, transistors, wires, etc. to service the lower levels, the pixel fill factor is closer to 100% with easier microlenses, and the readout rate doesn't have to be as high. Wins all around -- except for the chroma resolution.
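Here's a toy noise budget that makes the charge-domain advantage concrete (the electron counts are assumed for illustration, not measured from any sensor): pooling the charge costs one read-noise hit, while summing four separate readouts digitally costs sqrt(4) = 2 of them.

```python
import numpy as np

# Toy numbers for a dim exposure (assumed, not from the post):
sig  = 10.0   # photo-electrons collected per small pixel
read = 5.0    # read noise in electrons RMS, per readout

# SNR with shot noise and read noise added in quadrature:
def snr(signal, shot, read_total):
    return signal / np.hypot(shot, read_total)

snr_one     = snr(sig, np.sqrt(sig), read)              # one small pixel
snr_charge  = snr(4 * sig, np.sqrt(4 * sig), read)      # pool charge, 1 read
snr_digital = snr(4 * sig, np.sqrt(4 * sig), 2 * read)  # 4 reads, sum after

print(snr_one, snr_digital, snr_charge)
# charge-domain pooling wins; as the signal gets smaller, its advantage
# over a single small pixel approaches the full 4x
```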

The main claim of Bryce Bayer, and the fact that most TV formats and image and video compression algorithms rely on, is that the visual system doesn't care nearly as much about chroma resolution as about luma resolution. Unfortunately, trying to exploit that factor with a one-layer mosaic sensor has these awkward aliasing problems. Doing it with the Foveon 1:1:4 arrangement works better, requiring no AA filter, no filtering compromises. So, yes, the chroma resolution is less than the luma resolution, but you'd be hard pressed to see that in images.

If you throw out the extra luma resolution and just make 5 MP images from this new camera, you'll still have super-sharp super-clean versions of what the old DP2 or SD15 could do. Now imagine adding 2X resolution in each dimension, but with extra luma detail only, like in a typical JPEG encoding that encodes chroma at half the sample rate of luma. Whose eyes are going to be good enough to even tell that the chroma is less sharp than the luma? It's not impossible, but hard.
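That half-res-chroma JPEG analogy can be sketched in numpy too (a simplified illustration using the standard full-range JPEG RGB-to-YCbCr matrix, not the actual codec): luma keeps every sample, and only the chroma planes get the 2x2 averaging.

```python
import numpy as np

# Standard full-range (JFIF/BT.601) RGB -> YCbCr conversion.
def rgb_to_ycbcr(rgb):
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  =  0.299    * r + 0.587    * g + 0.114    * b
    cb = -0.168736 * r - 0.331264 * g + 0.5      * b
    cr =  0.5      * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

# 4:2:0-style chroma subsampling: 2x2 box average, replicated back up.
def subsample(c):
    h, w = c.shape
    m = c.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return np.repeat(np.repeat(m, 2, axis=0), 2, axis=1)

rgb = np.random.default_rng(1).random((8, 8, 3))
y, cb, cr = rgb_to_ycbcr(rgb)
cb420, cr420 = subsample(cb), subsample(cr)
# y keeps full-resolution detail; only cb/cr lose resolution
```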

Speaking of stories from the old days, Foveon's first version of Sigma Photo Pro had a minor bug in the JPEG output, as you probably recall: our calls to the jpeg-6b library defaulted to encoding with half-res chroma. It took a while, but a user did eventually find an image where he could tell something was not perfect, by comparing to TIFF output, and another user told us how to fix it, so we did. If we could have gotten that original level of JPEG quality from the SD9 with 5 million instead of 10 million pixel sensors and data values, and could have gotten cleaner color as a result, would that have been a problem? I don't think so; except for marketing, and they had enough problems already. Same way with Sigma's new one, I expect; if 30 M values gives an image that will be virtually indistinguishable from what could be done with 60 M, but with cleaner color, will someone complain?

Probably so.

So, it's complicated.  Yes, reduced chroma resolution is a compromise; but a very good one, well matched to human perception -- not at all like the aliasing-versus-resolution compromise that the mosaic-with-AA-filter approach has to face.

Dick

disclaimer: I've been away from this technology too long to have any inside knowledge.  And give my apologies to Laurence for my too many words.
