Ron Parr
Guest
There are many misconceptions about Bayer interpolation. I dispel many of these in my FAQ (listed below). However, it's worth mentioning a few here:
1. Bayer interpolation gets the luminance (the B&W part) of the image right, and just interpolates the color. This is wrong. Every pixel interpolates 2 out of 3 of the RGB values. Errors in any one of these will cause errors in luminance (though errors in green are worse). If you figure out how much of the luminance signal a Bayer sensor is actually getting (by projecting into the luminance dimension), you find that the sensor is capturing about 40% of the luminance information; a quick back-of-the-envelope version of this calculation follows the list.
2. The types of patterns that cause trouble for Bayer sensors don't occur in real life. Would that this were so! Here's an amazing example I came across recently on pbase, with my thanks and apologies to the original photographer (D30, I think):
http://www.pbase.com/image/1121096
Admittedly, this is a worst-case kind of example, but it is real, and there's no reason to think our images aren't riddled with lots of smaller patches of errors like this in areas of fine detail, as well as subtler errors that would only be noticeable in comparison to the correct image.
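For anyone who wants to see where a number in that 40% ballpark can come from, here's a quick back-of-the-envelope sketch. It simply weights each Bayer sample site by its channel's standard (Rec. 601) luma coefficient; a more careful analysis would project the full sampling pattern onto the luminance axis, so take the exact figure with a grain of salt.

# Back-of-the-envelope estimate of how much of the luminance signal a Bayer
# sensor measures directly. Assumptions: Rec. 601 luma weights
# (Y = 0.299 R + 0.587 G + 0.114 B) and the usual Bayer layout, in which half
# the photosites are green and a quarter each are red and blue.
luma_weight   = {"R": 0.299, "G": 0.587, "B": 0.114}
site_fraction = {"R": 0.25,  "G": 0.50,  "B": 0.25}

# Each pixel measures exactly one channel, so the directly captured share of
# luminance is that channel's fraction of sites times its luma weight.
captured = sum(site_fraction[c] * luma_weight[c] for c in "RGB")
print(f"directly sampled share of luminance: {captured:.1%}")   # about 39.7%

The remaining ~60% of the luminance signal has to be guessed by interpolation, which is exactly why errors in the interpolated channels show up as luminance errors and not just color errors.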
If you want to see some examples of what's going on with Bayer interpolation (and an education on interpolation methods), I suggest you download this paper:
http://www4.ncsu.edu:8030/~rramana/Research/demosaicking4.pdf
Even if you don't follow the math, skip forward to the figures, where you can see some wonderful examples of the types of errors Bayer interpolation can make. Figure 18 is particularly striking.
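If you'd rather poke at this yourself than just look at the paper's figures, here is a minimal simulation of the same kind of experiment. It photographs a neutral-gray stripe target through a simulated RGGB mosaic and reconstructs it with plain bilinear interpolation (the simplest demosaicking method, not what any particular camera or the paper uses); the stripe pattern and image size are just illustrative choices.

import numpy as np
from scipy.ndimage import convolve

H = W = 64

# Neutral gray target: vertical stripes two pixels wide, identical in R, G
# and B.  This is fine enough detail to confuse the sparsely sampled red and
# blue channels.
stripes = ((np.arange(W) // 2) % 2).astype(float)
truth = np.stack([np.tile(stripes, (H, 1))] * 3, axis=-1)   # shape (H, W, 3)

# RGGB mosaic: each pixel records exactly one of the three channels.
rows, cols = np.indices((H, W))
masks = {
    "R": ((rows % 2 == 0) & (cols % 2 == 0)).astype(float),
    "G": ((rows % 2) != (cols % 2)).astype(float),
    "B": ((rows % 2 == 1) & (cols % 2 == 1)).astype(float),
}
mosaic = sum(truth[..., i] * masks[c] for i, c in enumerate("RGB"))

# Bilinear demosaic: fill each channel's missing pixels with a weighted
# average of its measured neighbors (dividing by the convolved mask handles
# the borders), and keep the measured samples untouched.
kernel = np.array([[1.0, 2.0, 1.0],
                   [2.0, 4.0, 2.0],
                   [1.0, 2.0, 1.0]])

def interpolate(channel):
    mask = masks[channel]
    est = (convolve(mosaic * mask, kernel, mode="constant")
           / convolve(mask, kernel, mode="constant"))
    return np.where(mask == 1.0, mosaic, est)

recon = np.stack([interpolate(c) for c in "RGB"], axis=-1)

# The target is gray everywhere, so any disagreement between R, G and B in
# the reconstruction is a color artifact, and any luminance deviation is
# detail the sensor never measured and had to guess.
def luma(img):
    return img @ np.array([0.299, 0.587, 0.114])

print("max color error:    ", np.abs(recon - recon.mean(axis=-1, keepdims=True)).max())
print("max luminance error:", np.abs(luma(recon) - luma(truth)).max())

On a target like this, the bilinear result shows both color fringing and sizable luminance errors; smarter algorithms reduce these numbers, but on patterns near the channels' sampling limits they can't eliminate them.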
--Ron Parr
FAQ: http://www.cs.duke.edu/~parr/photography/faq.html
Gallery: http://www.pbase.com/parr/