Purple flare: Causes and remedies

Started Mar 1, 2013 | Discussions thread
Cani Regular Member • Posts: 386
Re: Hmmm... CC experiment implies non-linear process...

kenw wrote:

What I see:

First image (no filters of any kind) - There is obvious white flaring across the top of the image and there is also a rather obvious more localized purple flare.

Second image (polarizer) - The white flaring remains, but the purple flare is essentially gone. Most notably, the purple flare hasn't turned white - it has disappeared completely - that is to say if we made monochrome versions of the two images they would look different with the second image having no sign of the extra brightening in the region of the purple flare in the first image.

Third image (CC filters and WB adjust) - In this image the purple flare from the first image has now turned white: there is clearly additional flaring where the purple was in the first image, but now it is white instead of purple. This is distinct from the second image, in which the flare associated with the purple regions has been completely eliminated.

To summarize what I see:

First image, general flare plus a purple "streak". Second image, general flare unchanged and purple "streak" completely gone. Third image, general flare unchanged and purple "streak" has turned white.

Thanks for having detailed what you see. I fully agree with your observations.

To summarize what I conclude from the tests:

First image: There is a strange purple streak; it is probably an optical reflection from some place (most flare is, of course). Why is it purple, though? Was the reflection itself purple (e.g. like we might see from some optical coatings), or is there something "funny" going on?

Second image: Demonstrates that the purple flare is from some reflection source, because Anders could eliminate it with a polarizer (as opposed to a scattering source, which wouldn't be polarized). Not really a surprise, but a useful illustration that there is a reflection someplace. Still left unanswered: is this reflection purple, or is it white and then changed to purple by some other process further down the chain?

There remains some ambiguity around "reflection source". The strong light source is indirect, so we are dealing with reflected or diffused light entering the lens. Besides, reflections can also occur inside the optics and become a "reflection source". IMO the first experiment, i.e., the second image, shows either or both of: (i) a large portion of the flare plus the purple streak comes from polarized (non-direct, thus reflected/diffused) light, and (ii) if this (non-direct, thus reflected/diffused) incoming light is polarized in a certain way, its internal reflections produce neither flare nor purple cast.

Third image: This is the image that proves the reflection itself is not purple. If it were truly purple, then the CC+WB trick wouldn't make it become the same color as the rest of the originally white flare. This test strongly implies that the offending reflection is likely white (or close to it) and is changed to appear purple in the final image by the sensor reflections past the CFA from off-axis light that Anders illustrated. The implication is that this part of the flare, while actually white, originates from a point off the optical axis of the lens, and that is why it turns purple when measured by the sensor. The rest of the flare that appears white in the first image is coming from close to the lens axis and so is measured as white.

Same here: the "outside" reflection cannot be purple, but what about the internal reflection (you give the example of optical coatings that could produce such a reflection, or diffraction)? However, the fact that most Panasonic bodies are unaffected by the purple flare should eliminate the possibility of the internal reflection being purple.
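The CC+WB logic above can be sketched numerically. A minimal toy model, assuming invented filter and crosstalk numbers (none of these are measured camera data): a CC filter and its inverse WB gains are both per-channel (diagonal) scalings, so they cancel around any process that is also per-channel, but they do NOT cancel around a process that mixes channels after the filter, which is what post-CFA crosstalk would do. That non-commutation is why the streak's rendered color can change under CC+WB even though the pair is an identity for normally sensed light.

```python
# Toy model: why CC filter + inverse WB is an identity for normal light
# but NOT for light that is channel-mixed after the CFA.
# All numbers below are invented purely for illustration.

def apply(mat, rgb):
    """Multiply a 3x3 matrix by an RGB triple."""
    return [sum(mat[i][j] * rgb[j] for j in range(3)) for i in range(3)]

def scale(gains, rgb):
    """Per-channel (diagonal) scaling, as a CC filter or WB gain does."""
    return [g * c for g, c in zip(gains, rgb)]

white = [1.0, 1.0, 1.0]

# Hypothetical post-CFA crosstalk for off-axis light: some green signal
# leaks into the red and blue wells (off-diagonal terms), so white light
# reads purple-ish (R and B end up above G).
crosstalk = [
    [0.9, 0.2, 0.0],
    [0.0, 0.6, 0.0],
    [0.0, 0.2, 0.9],
]

cc = [0.5, 1.0, 0.8]        # arbitrary CC filter transmissions
wb = [1 / g for g in cc]    # WB gains chosen to exactly undo the CC filter

# No filter: white light through crosstalk -> R and B above G (purple-ish).
direct = apply(crosstalk, white)

# CC filter, then crosstalk, then WB: the diagonal scalings commute with a
# diagonal process but not with this mixing matrix, so the same white
# light renders a different colour than `direct`.
filtered = scale(wb, apply(crosstalk, scale(cc, white)))

print(direct)    # R and B roughly equal, G depressed
print(filtered)  # different channel ratios from `direct`
```

If `crosstalk` were diagonal (a truly purple reflection, i.e. a per-channel effect in the light itself), `direct` and `filtered` would be identical, and the streak would have stayed purple in the third image - which is exactly the linear behaviour the test ruled out.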

Now I still do not get it. I do not get the asymmetric impact of the color-channel imbalance.

Let's assume 12 pixels: 6 are green, 3 are blue, 3 are red. Let's assume color-channel pollution: they all fail to capture incoming light, which instead goes directly to an adjacent pixel (one of the 4 pixels with which they share a side, not a corner). So the 6 green units go to 3 blue + 3 red (statistically), the 3 red go to 3 green pixels, and the 3 blue go to 3 green pixels. We end up with exactly what we started with. Why then should the imbalance matter?

Of course, if the light can strike a pixel adjacent by the corner, it is completely different.

But then it's a question of calculating the chances that the photon ends up in this or that adjacent pixel given the incidence angle, to determine how the imbalance will play out (I think).
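The counting argument above can be checked mechanically. Here is a toy expected-value tally on a wrap-around 4x4 RGGB tile (the same 2:1:1 green:red:blue ratio as the 12-pixel example); the equal-quarters leakage to each neighbour is an assumption for illustration, not a measured model of the sensor.

```python
# Toy expected-value tally of Bayer channel leakage: one unit of light per
# pixel, split equally among its four side-adjacent or four corner-adjacent
# neighbours (toroidal wrap so every pixel has exactly four of each).
# The equal-split assumption is illustrative only.

BAYER = [list("RGRG"), list("GBGB"), list("RGRG"), list("GBGB")]
N = 4

def leak_totals(offsets):
    """Expected light per destination channel when each pixel's unit of
    light is split equally over the neighbours given by `offsets`."""
    totals = {"R": 0.0, "G": 0.0, "B": 0.0}
    for y in range(N):
        for x in range(N):
            for dy, dx in offsets:
                dest = BAYER[(y + dy) % N][(x + dx) % N]
                totals[dest] += 1.0 / len(offsets)
    return totals

SIDES = [(-1, 0), (1, 0), (0, -1), (0, 1)]
CORNERS = [(-1, -1), (-1, 1), (1, -1), (1, 1)]

# Side-adjacent: G gives to R and B, R and B give back to G, and the
# 8:4:4 channel totals come out exactly as they started.
print(leak_totals(SIDES))    # {'R': 4.0, 'G': 8.0, 'B': 4.0}

# Corner-adjacent: every R sensel's light lands on B sensels and vice
# versa, while G light stays on G. The counts are still 8:4:4.
print(leak_totals(CORNERS))  # {'R': 4.0, 'G': 8.0, 'B': 4.0}
```

Interestingly, under this equal-split assumption even corner leakage preserves the per-channel totals - what changes is the content: corner leakage swaps red and blue light directly (invisible for white light, where R = B, but a different colour error for anything else), whereas side leakage only exchanges light between green and the other two channels. So the asymmetry, if any, has to come from the angle-dependent probabilities, not from the pixel counts alone.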
