Purple flare: Causes and remedies

Started Mar 1, 2013 | Discussions thread
Anders W
In reply to Anders W, Mar 2, 2013

OK. Enough guessing and time to spell out my little theory. My idea is that the tendency for the flare to go purple results from an interaction effect between two factors: color-channel pollution and color-channel imbalance. In other words, some of both must be present for the purple color to result. Either alone will not do.

Color-channel pollution

By color-channel pollution I mean a phenomenon such that photons that have passed the color filter of a certain pixel actually end up being registered by the photodiode of a neighboring pixel of another color. In other words, a photon with a wavelength corresponding to one of the three primary colors (red, green, and blue) ends up being recorded by the sensor as belonging to another color, e.g., a green photon being recorded as a red or blue one.

Can such things happen? Well, apparently they can. According to the brochure describing the new Leica M (p. 22), from which I borrow the drawing shown below of the sensor architecture of a "standard CMOS sensor" (not the one used by the Leica M), "rays of incoming light at large angles of incidence can fail to reach the photodiode of the corresponding pixel and reach only the adjacent pixel". One possibility is the one shown in the drawing, where a photon passes directly through the filter of one pixel and then on to the photodiode of another. Another possibility, not shown in the drawing, is that a photon, after having passed the filter of one pixel, is reflected in such a way as to end up at the photodiode of another pixel.

Note that the brochure speaks of "large angles of incidence" (i.e., angles that are large relative to a line perpendicular to the sensor plane and thus small relative to the sensor plane itself) as a precondition for what I call color-channel pollution. Sensors in Leica M cameras must be specifically designed to handle such large angles of incidence because the lenses of the M system are, for historical reasons, not designed to approach the ideal of telecentricity. In other systems, especially those designed from the outset for digital sensors, like Micro Four Thirds, the problem is instead avoided by designing the lenses so that large angles of incidence do not occur.

Now, this holds for light that is meant to end up on the sensor. It does not apply to light that is not intended to reach the sensor, such as flare. Such light may well arrive at a large enough angle of incidence for color-channel pollution to occur.

While some of the things I describe above are difficult to test directly, there is a simple test that anyone with access to an E-M5 can easily perform. Remove the lens and tilt the camera so as to vary the angle at which light from some suitable source, e.g., a spotlight in your apartment, strikes the naked sensor. If you do so and inspect the resulting live-view image via the EVF or LCD, you can easily see that the color shifts increasingly towards purple as the angle of incidence grows.

Color-channel imbalance

While white is an equal mix of red, green, and blue, this does not mean that what we eventually see as white or gray in our images corresponds to an equal mix of the three colors at the sensor level. First, any sensor with a conventional Bayer filter has two green pixels for every red and blue one. Second, the number of photons recorded by a green pixel when the sensor is exposed to ordinary white/gray daylight is approximately twice as large as the number recorded by a red or blue one. In order for the white/gray light to eventually appear white/gray rather than intensely green, the red and blue channels must be amplified in the course of image processing so as to reach parity with the dominant green channel.
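For concreteness, here is a tiny Python sketch of that white-balance step. The photon counts are made-up illustrative numbers, chosen only to reflect the roughly 2:1 green dominance described above:

```python
# Hypothetical per-pixel photon counts for a gray patch under daylight:
# the green pixels collect roughly twice as many photons as red or blue.
raw = {"R": 1000, "G": 2000, "B": 1000}

# White-balance gains amplify each channel to parity with green.
gains = {ch: raw["G"] / count for ch, count in raw.items()}
balanced = {ch: count * gains[ch] for ch, count in raw.items()}

print(gains)      # {'R': 2.0, 'G': 1.0, 'B': 2.0}
print(balanced)   # all channels equal -> rendered as neutral gray
```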

Had the channels been evenly balanced on the sensor level, color-channel pollution would not lead to a color shift. The stream of pollutive photons from green to red/blue would be as large as the stream in the opposite direction. Consequently, the two pollutive streams would cancel out and we would still see white/gray. In view of the color-channel imbalance that actually exists on the sensor level, however, this is not the case. The pollutive stream from green to red/blue will ordinarily be roughly twice as large as the stream in the opposite direction, thus giving rise to a shift towards purple/magenta.
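To make this interaction concrete, here is a minimal numeric sketch in Python. The leakage fraction `p` and the neighbor split are my own simplifying assumptions (in a Bayer mosaic, the four nearest neighbors of a red or blue pixel are all green, while a green pixel's are two red and two blue), not measured values; the point is only that pollution produces a purple cast when, and only when, the channels start out imbalanced:

```python
def rendered_rgb(g, r, b, p):
    """Per-pixel channel signals after leakage and white balance.

    g, r, b: photons per pixel of each color under white/gray light.
    p: fraction of each pixel's photons leaking into neighbors (assumed).
       Green leakage splits evenly between red and blue; red/blue leakage
       goes entirely to green, per the Bayer neighborhood noted above.
    """
    # Totals over one 2x2 Bayer cell (two green pixels, one red, one blue)
    g_total = 2 * g * (1 - p) + p * r + p * b
    r_total = r * (1 - p) + p * g
    b_total = b * (1 - p) + p * g
    g_avg = g_total / 2                       # back to a per-pixel figure
    # White-balance gains calibrated on unpolluted (p = 0) light
    gain_r, gain_b = g / r, g / b
    return (r_total * gain_r, g_avg, b_total * gain_b)

# Imbalanced sensor (green collects twice as much), 20% leakage:
print(tuple(round(x, 3) for x in rendered_rgb(2, 1, 1, 0.2)))
# -> (2.4, 1.8, 2.4): red/blue end up above green, i.e., a purple cast.

# Hypothetically balanced sensor, same leakage:
print(tuple(round(x, 3) for x in rendered_rgb(1, 1, 1, 0.2)))
# -> (1.0, 1.0, 1.0): the two pollutive streams cancel, image stays neutral.
```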

The purple-filter experiment

As I pointed out at the outset, my little theory is that both color-channel pollution and color-channel imbalance are required for the flare to go purple. It is difficult to experiment directly with a given sensor's tendency to pollute when exposed to light with a large angle of incidence. All we can easily do in this regard is to compare across cameras with different sensors, as I and others have already done in prior threads, e.g., by comparing the Olympus E-M5 (where the flare is very prone to go purple) with the Panasonic G1 (which shows no purple tendencies whatsoever).

One other thing we can do is to study the effect of reducing the imbalance at the sensor level between the green channel and the red and blue ones. If the theory is correct, reducing that imbalance should reduce the purpleness of the flare. This is the idea behind my purple-filter experiment. The direct effect of the filter is to reduce the imbalance between green on the one hand and red/blue on the other; as predicted by the theory, this in turn reduces the purpleness of the flare.
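The same toy model can sketch the predicted filter effect. Here `t_g` is a hypothetical green-transmission factor for the purple filter and `p` a hypothetical leakage fraction; both numbers are assumptions for illustration, not measurements:

```python
def purple_cast(t_g, p=0.2):
    """Red/green ratio after white balance; 1.0 means a neutral rendering.

    t_g: assumed fraction of green light the purple filter lets through.
    p:   assumed fraction of each pixel's photons leaking to neighbors.
    """
    g, r, b = 2.0 * t_g, 1.0, 1.0            # filter cuts only the green channel
    r_raw = r * (1 - p) + p * g               # red receives green leakage
    g_raw = (2 * g * (1 - p) + p * (r + b)) / 2
    # White balance calibrated for the filtered, unpolluted light:
    return (r_raw * g / r) / g_raw

print(round(purple_cast(t_g=1.0), 3))   # -> 1.333: no filter, clear purple cast
print(round(purple_cast(t_g=0.5), 3))   # -> 1.0: green halved, cast disappears
```

In this crude model, a filter that exactly cancels the 2:1 green dominance removes the cast entirely; a weaker filter would merely shrink it, which is the qualitative prediction the experiment tests.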
