Purple flare: Causes and remedies

Started Mar 1, 2013 | Discussions
TORN
Contributing Member • Posts: 713
Adding another aspect
In reply to Anders W, Mar 2, 2013

Very interesting read. As you might know, I am on the fence about leaving MFT because of this issue with the 7-14, my main lens.

I wonder if lenses play their part in this too. At least it might help to exclude this factor to get a more solid base for the theory. So my question is precisely this: does Olympus do anything with their lenses to lessen the purple flare effect that Panasonic does not do with its lens range? Of course lenses render and flare differently, but maybe some insight might be gained if we compare similar focal-length lenses from both brands on the E-M5 and look at how they handle the purple. The 7-14 vs 12, 20 vs 17, and 45 vs 45 come to my mind.

Unfortunately I do not have any pair of those and my E-M5 is in for repair but this could be an interesting test.

JamieTux
Veteran Member • Posts: 3,777
Re: Purple flare: Causes and remedies
In reply to Dr_Jon, Mar 2, 2013

From my own experience years ago on the Sony a900, assuming that the cause is the same as the focussed green spots I got on that camera...

A smaller aperture will increase the strength of the coloured flare.

JamieTux's gear list:
Panasonic Lumix DMC-GF2 Panasonic Lumix DMC-GX7 Panasonic Lumix DMC-GH4 Olympus M.Zuiko Digital ED 9-18mm 1:4.0-5.6 Panasonic Lumix G 14mm F2.5 ASPH +12 more
kenw
Veteran Member • Posts: 4,239
Re: Hmmm... CC experiment implies non-linear process...
In reply to Cani, Mar 2, 2013

Cani wrote:

It is not clear to me why you think the result implies non-linearity. Could you elaborate on this?

It is a little obtuse, and Anders has already explained quite well what the most likely non-linear process involved is (color channel mixing).  But to answer your question, the clue is that no linear process can change the frequency (i.e. wavelength, color) of an input.  It can only change the magnitude of a frequency already present.  What Anders cleverly did was apply two offsetting linear filtering processes - one at the input with his CC filters and one at the output with the WB change.  These filtering processes are linear - they only change the magnitude of the wavelengths (frequencies) of light in the optical path.  They can't make, say, red light change into blue light; they can only attenuate (the CC filter) or amplify (the WB change) different wavelengths.  For a linear process (one in which superposition holds), doing what Anders did should result in no color changes anywhere in the scene.  As an example, if you take a photo with a WB-altering filter in front of the lens (say a daylight-to-tungsten filter) and then apply the opposite WB correction in post, you expect exactly the same scene rendition as if you had shot without the filter and WB correction.  If that didn't occur, you know the process is not linear.  But as Anders has demonstrated, and this is where my reference was a bit obtuse, non-linear does not necessarily mean clipping or a non-linear tone curve.  At the heart of it, and the basis of my observation, was the fact that measured wavelengths of light changed relative to each other in a non-linear way.

Do you mean that for both versions of the second picture the results seem to violate superposition, but cannot and thus it's an artefact from an origin that remains to be determined? Or do you believe superposition can be violated?

Well, superposition certainly can be violated by any measurement system - something just needs to clip or be non-linear in some other way.  Or it can mix wavelengths of light as Anders has illustrated so nicely.  My point was that since we were seeing superposition violated then the root cause had to be at the sensor (just past the CFA it turns out) and not in the optics or AA filter.  Although of course in order for this sensor issue to appear you need off axis light and that light appears because of the lens...
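The CC + WB cancellation logic can be sketched numerically. This is a toy model, not the actual sensor physics: the leak fractions and the 3x3 matrices are illustrative assumptions. The point is that a pure filter is a diagonal matrix (it only scales each channel), so the input CC filter and the output WB gain cancel exactly; a sensor matrix with off-diagonal terms (channel mixing) breaks that cancellation, which is exactly what the test detects.

```python
import numpy as np

# Scene: white flare, equal energy reaching the sensor in R, G, B
scene = np.array([1.0, 1.0, 1.0])

# Input CC filter: attenuate green by half (diagonal = pure wavelength scaling)
cc = np.diag([1.0, 0.5, 1.0])
# Output WB change: boost green x2 to exactly offset the CC filter
wb = np.diag([1.0, 2.0, 1.0])

def shoot(camera, use_cc=False):
    """Simulate one exposure; with use_cc, shoot through the CC filter
    and apply the offsetting WB correction afterwards."""
    x = cc @ scene if use_cc else scene
    raw = camera @ x
    return wb @ raw if use_cc else raw

# "Well-behaved" sensor: each channel measures only its own color
clean = np.eye(3)

# Sensor with channel pollution (assumed 10% leak): part of the light that
# passed a green filter element lands on the neighbouring R and B photodiodes
polluted = np.array([[1.0, 0.1, 0.0],
                     [0.0, 0.8, 0.0],
                     [0.0, 0.1, 1.0]])

print(shoot(clean), shoot(clean, use_cc=True))        # identical: filters cancel
print(shoot(polluted), shoot(polluted, use_cc=True))  # differ: mixing exposed
```

With the clean sensor the two shots are identical, as superposition demands; with the polluted sensor the plain shot comes out purple (R and B above G) and the CC+WB shot comes out visibly different, and less purple, mirroring Anders' result.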


Ken W
See profile for equipment list

kenw's gear list:
Sony RX100 Panasonic Lumix DMC-G1 Olympus OM-D E-M5 Panasonic Lumix DMC-GM1 Panasonic Lumix G Vario 14-45mm F3.5-5.6 ASPH OIS +26 more
JamieTux
Veteran Member • Posts: 3,777
My findings are the same as Surefoot's and I think this is sensor reflections
In reply to Anders W, Mar 2, 2013

The reason being that they are clearly focussed and detailed and always purple - using my GF2 they are just normal flare.

When digital SLRs first came out you had either green or purple elements like this quite a lot when using non-digital optimised lenses - I think Sigma actually digitally optimised their lenses by adding a coating to their rear element just to cut this kind of thing down.

JamieTux's gear list:
Panasonic Lumix DMC-GF2 Panasonic Lumix DMC-GX7 Panasonic Lumix DMC-GH4 Olympus M.Zuiko Digital ED 9-18mm 1:4.0-5.6 Panasonic Lumix G 14mm F2.5 ASPH +12 more
kenw
Veteran Member • Posts: 4,239
Excellent!
In reply to Anders W, Mar 2, 2013

Very well thought out and demonstrated Anders!

I now completely understand why you tried your CC filter experiment and why it worked.  Nicely done.  Yes, there is non-linearity in there - not the way most people probably think about non-linearity (clipping) but exactly the way someone who does signal processing (me) thinks about non-linearity (frequency mixing).

I can't think of a better explanation than the one you offered.  I might also add, for others' clarification, that the same reason Anders describes color imbalance resulting in purple/magenta coloration in this case is also the reason shadow noise is magenta in high-ISO shots.  And, as a very far aside, it is the same reason a camera converted to IR with a long-wavelength filter (830nm) produces magenta images.

I can see another excellent application of your very clever CC testing technique - purple fringing.  In almost all cases I've seen it is fairly clear most purple fringing is just longitudinal CA.  There are, however, a few times I've seen things that might not be that and of course there are many pet theories as to what causes it beyond longitudinal CA (some clearly wrong, such as blooming, and others more believable and closely related to what you demonstrate here).  Your CC test would of course also sort out those cases in which purple fringe is from the expected CA and which cases something more interesting at the sensor/microlens/CFA is occurring.

Oh, one last thought and aside.  I haven't thought about it long enough, so I could be wrong, but given we need off axis light to cause the purple and that light is coming from a reflection somewhere in the lens it seems to me that the reflection causing it could only come from a reflection that occurs between the aperture stop and the sensor (i.e. reflections in front of the aperture stop can't turn purple as it isn't possible for them to be off axis).

Again great work!


Ken W
See profile for equipment list

kenw's gear list:
Sony RX100 Panasonic Lumix DMC-G1 Olympus OM-D E-M5 Panasonic Lumix DMC-GM1 Panasonic Lumix G Vario 14-45mm F3.5-5.6 ASPH OIS +26 more
Cani
Regular Member • Posts: 344
Re: Hmmm... CC experiment implies non-linear process...
In reply to kenw, Mar 2, 2013

kenw wrote:

Cani wrote:

It is not clear to me why you think the result implies non-linearity. Could you elaborate on this?

It is a little obtuse, and Anders has already explained quite well what the most likely non-linear process involved is (color channel mixing). But to answer your question, the clue is that no linear process can change the frequency (i.e. wavelength, color) of an input. It can only change the magnitude of a frequency already present. What Anders cleverly did was apply two offsetting linear filtering processes - one at the input with his CC filters and one at the output with the WB change. These filtering processes are linear - they only change the magnitude of the wavelengths (frequencies) of light in the optical path. They can't make, say, red light change into blue light; they can only attenuate (the CC filter) or amplify (the WB change) different wavelengths. For a linear process (one in which superposition holds), doing what Anders did should result in no color changes anywhere in the scene. As an example, if you take a photo with a WB-altering filter in front of the lens (say a daylight-to-tungsten filter) and then apply the opposite WB correction in post, you expect exactly the same scene rendition as if you had shot without the filter and WB correction. If that didn't occur, you know the process is not linear. But as Anders has demonstrated, and this is where my reference was a bit obtuse, non-linear does not necessarily mean clipping or a non-linear tone curve. At the heart of it, and the basis of my observation, was the fact that measured wavelengths of light changed relative to each other in a non-linear way.

Thanks kenw. This I understood, at least until "But as Anders...".

My question was in relation to your perception of the changes in the picture, especially the colors. I was not sure what you and others observed, or how you observed it. I am still not sure, except that the highlights kind of get "whiter", i.e., blue and red seem attenuated in the highlights only and enhanced in the rest of the picture. This matches Anders' speculation about the causes: attenuation of the incoming green light results in the blue & red pixels (where flare shows) being struck with less light, thus the lower response.

I suppose it was obvious but...

Do you mean that for both versions of the second picture the results seem to violate superposition, but cannot and thus it's an artefact from an origin that remains to be determined? Or do you believe superposition can be violated?

Well, superposition certainly can be violated by any measurement system - something just needs to clip or be non-linear in some other way. Or it can mix wavelengths of light as Anders has illustrated so nicely. My point was that since we were seeing superposition violated then the root cause had to be at the sensor (just past the CFA it turns out) and not in the optics or AA filter. Although of course in order for this sensor issue to appear you need off axis light and that light appears because of the lens...

OK.

Cani's gear list:
Panasonic Lumix DMC-GH1 Olympus PEN E-P5 Panasonic Lumix G Vario 14-45mm F3.5-5.6 ASPH OIS Panasonic Lumix G 20mm F1.7 ASPH Panasonic Leica DG Macro-Elmarit 45mm F2.8 ASPH OIS +8 more
TrapperJohn
Forum Pro • Posts: 10,401
A very effective, albeit less practical, way to eliminate purple spots
In reply to Anders W, Mar 2, 2013

Anders, your very detailed approach got me looking back at my E-M5 and E-P1 shots that have slightly out-of-frame light sources. Bright ones, like... the sun. Especially with the 7-14, which sucks in everything, there is often a light source just out of the visible frame.

Purple spots? None. I have never seen one, not even with the 7-14, not even with the sun just out of frame.

However... I typically have 4/3 HG and SHG ZD lenses mounted, or my beloved 4/3 PL25 1.4. The 7-14 I have is the ZD 7-14, quite a bit larger than the Panny 7-14.

All of the high grade ZD and PL 4/3 lenses have a telecentric light path, which is to say they straighten the light out so it hits the sensor at a perpendicular angle. This was done to eliminate edge problems on the earlier deep well sensors, and is probably the reason the better ZD lenses are renowned for their uniform sharpness, edge to edge, even on an ultrawide like the 7-14. To a degree, the telecentric light path accounts for their larger size.

M43 uses a shallow-well sensor that isn't so fussy about how light hits it. Consequently, M43 lenses are not telecentric; it wasn't needed, and it wasn't desirable from a lens-size perspective. In fact, the shallow-well sensor was key to M43's short registration distance.

My guess is - the telecentric light path also suppresses reflections from out of frame light sources - the straight path knocks out reflections by getting them to hit the out of frame area near the sensor straight on, rather than at an angle where they can reflect off of internal parts.

My guess also is that one doesn't see purple spots as much on Panasonic bodies because Panasonic tends to apply more PP and correction automatically in body. That's a tradeoff - you get more consistent photos and you can cut lens development cost by correcting in body rather than optically, but you can also lose detail if you lean on that too heavily - depending on the body to correct CA rather than eliminating it optically will cost some detail.

So there is a way to eliminate purple spots optically, before they ever hit the sensor: use telecentric 4/3 PL and ZD lenses. Not necessarily a practical solution for everyone, but it does work.

kenw
Veteran Member • Posts: 4,239
Re: Hmmm... CC experiment implies non-linear process...
In reply to Cani, Mar 2, 2013

Cani wrote:

My question was in relation to your perception of the changes in the picture, especially the colors. I was not sure what you and others observed, or how you observed it. I am still not sure, except that the highlights kind of get "whiter", i.e., blue and red seem attenuated in the highlights only and enhanced in the rest of the picture. This matches Anders' speculation about the causes: attenuation of the incoming green light results in the blue & red pixels (where flare shows) being struck with less light, thus the lower response.

I suppose it was obvious but...

Ah, sorry I missed the point of your question!

What I see:

First image (no filters of any kind) - There is obvious white flaring across the top of the image and there is also a rather obvious more localized purple flare.

Second image (polarizer) - The white flaring remains, but the purple flare is essentially gone. Most notably, the purple flare hasn't turned white - it has disappeared completely - that is to say if we made monochrome versions of the two images they would look different with the second image having no sign of the extra brightening in the region of the purple flare in the first image.

Third image (CC filters and WB adjust) - In this image the purple flare in the first image has now turned white, there is clearly additional flaring where the purple exists in the first image but now it is white instead of purple. This is distinct from the second image in which the flare associated with the purple regions has been completely eliminated.

To summarize what I see:

First image, general flare plus a purple "streak". Second image, general flare unchanged and purple "streak" completely gone. Third image, general flare unchanged and purple "streak" has turned white.

To summarize what I conclude from the tests:

First image: There is a strange purple streak; it is probably an optical reflection from some place (most flare is, of course). Why is it purple, though? Was the reflection itself purple (e.g. like we might see from some optical coatings) or is there something "funny" going on?

Second image: Demonstrates that the purple flare is from some reflection source, because Anders could eliminate it with a polarizer (as opposed to a scattering source, which wouldn't be polarized). Not really a surprise, but a useful illustration that there is a reflection someplace. Still left unanswered: is this reflection purple, or is it white and then changed to purple by some other process further down the chain?

Third image: This is the image that proves the reflection itself is not purple. If it was truly purple then the CC+WB trick wouldn't make it become the same color as the rest of the originally white flare. This test strongly implies that the offending reflection is likely white (or close to it) and being changed to appearing purple in the final image because of the sensor reflections past the CFA from off axis light that Anders illustrated.  The implication is this part of the flare while actually white is originating from a point off the optical axis of the lens and that is why it turns purple when measured by the sensor.  The rest of the flare that appears white in the first image is coming from close to the lens axis and so is measured as white.


Ken W
See profile for equipment list

kenw's gear list:
Sony RX100 Panasonic Lumix DMC-G1 Olympus OM-D E-M5 Panasonic Lumix DMC-GM1 Panasonic Lumix G Vario 14-45mm F3.5-5.6 ASPH OIS +26 more
Anders W
Forum Pro • Posts: 17,376
Re: Hmmm... CC experiment implies non-linear process...
In reply to kenw, Mar 2, 2013

kenw wrote:

Cani wrote:

My question was in relation to your perception of the changes in the picture, especially the colors. I was not sure what you and others observed, or how you observed it. I am still not sure, except that the highlights kind of get "whiter", i.e., blue and red seem attenuated in the highlights only and enhanced in the rest of the picture. This matches Anders' speculation about the causes: attenuation of the incoming green light results in the blue & red pixels (where flare shows) being struck with less light, thus the lower response.

I suppose it was obvious but...

Ah, sorry I missed the point of your question!

What I see:

First image (no filters of any kind) - There is obvious white flaring across the top of the image and there is also a rather obvious more localized purple flare.

Second image (polarizer) - The white flaring remains, but the purple flare is essentially gone. Most notably, the purple flare hasn't turned white - it has disappeared completely - that is to say if we made monochrome versions of the two images they would look different with the second image having no sign of the extra brightening in the region of the purple flare in the first image.

Third image (CC filters and WB adjust) - In this image the purple flare in the first image has now turned white, there is clearly additional flaring where the purple exists in the first image but now it is white instead of purple. This is distinct from the second image in which the flare associated with the purple regions has been completely eliminated.

To summarize what I see:

First image, general flare plus a purple "streak". Second image, general flare unchanged and purple "streak" completely gone. Third image, general flare unchanged and purple "streak" has turned white.

To summarize what I conclude from the tests:

First image: There is a strange purple streak; it is probably an optical reflection from some place (most flare is, of course). Why is it purple, though? Was the reflection itself purple (e.g. like we might see from some optical coatings) or is there something "funny" going on?

Second image: Demonstrates that the purple flare is from some reflection source, because Anders could eliminate it with a polarizer (as opposed to a scattering source, which wouldn't be polarized). Not really a surprise, but a useful illustration that there is a reflection someplace. Still left unanswered: is this reflection purple, or is it white and then changed to purple by some other process further down the chain?

Third image: This is the image that proves the reflection itself is not purple. If it was truly purple then the CC+WB trick wouldn't make it become the same color as the rest of the originally white flare. This test strongly implies that the offending reflection is likely white (or close to it) and being changed to appearing purple in the final image because of the sensor reflections past the CFA from off axis light that Anders illustrated. The implication is this part of the flare while actually white is originating from a point off the optical axis of the lens and that is why it turns purple when measured by the sensor. The rest of the flare that appears white in the first image is coming from close to the lens axis and so is measured as white.

Thanks Ken! You said it all at least as well as I could.

Anders W's gear list:
Panasonic Lumix DMC-G1 Olympus OM-D E-M5 Olympus E-M1 Panasonic Lumix G Vario 14-45mm F3.5-5.6 ASPH OIS Panasonic Lumix G Vario 7-14mm F4 ASPH +21 more
Anders W
Forum Pro • Posts: 17,376
Re: Excellent!
In reply to kenw, Mar 2, 2013

kenw wrote:

Very well thought out and demonstrated Anders!

Thanks. I am glad the theory passed at least first scrutiny.

I now completely understand why you tried your CC filter experiment and why it worked. Nicely done. Yes, there is non-linearity in there - not the way most people probably think about non-linearity (clipping) but exactly the way someone who does signal processing (me) thinks about non-linearity (frequency mixing).

I can't think of a better explanation than the one you offered. I might also add, for others' clarification, that the same reason Anders describes color imbalance resulting in purple/magenta coloration in this case is also the reason shadow noise is magenta in high-ISO shots. And, as a very far aside, it is the same reason a camera converted to IR with a long-wavelength filter (830nm) produces magenta images.

Yes. One might add here, again for clarification to others, that while the color-channel imbalance is an essential ingredient in the purple flare as well as the purple shadow noise we might see in high-ISO shots (and low-ISO shots if the shadows are strongly pushed in PP), the other essential ingredient differs: what I call color-channel pollution in the former case and clipping of the read-noise distribution in the latter.
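The shadow-noise case (clipping of the read-noise distribution plus the larger red and blue WB gains) can be illustrated with a quick sketch. The noise level and the WB gains below are made-up but typical-looking numbers, not measurements from any actual camera:

```python
import numpy as np

rng = np.random.default_rng(0)

# Deep shadow: the true signal is ~0, so read noise dominates the recorded value
read_noise = rng.normal(0.0, 2.0, size=100_000)

# Clipping at the black level discards the negative excursions,
# which shifts the mean of the recorded noise above zero
clipped = np.clip(read_noise, 0.0, None)

# Assumed daylight-ish WB gains: red and blue are amplified more than green
wb_gains = {'R': 2.0, 'G': 1.0, 'B': 1.8}

# Average recorded shadow level per channel after white balance
shadow = {ch: g * clipped.mean() for ch, g in wb_gains.items()}
print(shadow)  # R and B end up above G, so the shadows take on a magenta tint
```

Without the clipping the noise would average to zero in every channel and the unequal gains would not matter; it is the combination of the two ingredients, exactly as described above, that produces the cast.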

I can see another excellent application of your very clever CC testing technique - purple fringing. In almost all cases I've seen it is fairly clear most purple fringing is just longitudinal CA. There are, however, a few times I've seen things that might not be that and of course there are many pet theories as to what causes it beyond longitudinal CA (some clearly wrong, such as blooming, and others more believable and closely related to what you demonstrate here). Your CC test would of course also sort out those cases in which purple fringe is from the expected CA and which cases something more interesting at the sensor/microlens/CFA is occurring.

Yes, you are right about that. The CC technique might be useful in settling some old disputes about lens or sensor as the cause of fringing of this or that kind. If it's the lens, altering the channel balance should have no impact. If it's the sensor, it should.

Oh, one last thought and aside. I haven't thought about it long enough, so I could be wrong, but given we need off axis light to cause the purple and that light is coming from a reflection somewhere in the lens it seems to me that the reflection causing it could only come from a reflection that occurs between the aperture stop and the sensor (i.e. reflections in front of the aperture stop can't turn purple as it isn't possible for them to be off axis).

Not sure I follow you entirely here. You may well be right but could you elaborate a bit on your idea that a reflection prior to the aperture stop could not pass the stop and subsequent lens elements to finally arrive at the sensor at an angle of incidence greater than that possible for anything the lens actually images?

Finally, do you have any further ideas about purple flare in the diagonal versus horizontal/vertical direction? An idea that has occurred to me is that the horizontal and vertical but not diagonal pattern shown by my second series of test shots might be due to the pollution being possible only between pixels that are in the same row or column but blocked diagonally. On the other hand, we have plenty of examples of purple blobs that do not take on this pattern but I cannot at present rule out the possibility that they could look the way they look even if diagonal pollution wouldn't be possible. Optical pathways are obviously pretty tricky to sort out in cases like these.

At any rate, I will try to see what happens if I try to form and then get rid of (by means of the polarizer) purple streaks like those in the first series of test shots but with the camera tilted so that the streaks become diagonal rather than vertical.

Anders W's gear list:
Panasonic Lumix DMC-G1 Olympus OM-D E-M5 Olympus E-M1 Panasonic Lumix G Vario 14-45mm F3.5-5.6 ASPH OIS Panasonic Lumix G Vario 7-14mm F4 ASPH +21 more
Anders W
Forum Pro • Posts: 17,376
Re: How about purple orbs ?
In reply to Surefoot, Mar 2, 2013

Surefoot wrote:

I have a multi colored spot on the GF1 too but it is less saturated, and more transparent. Also the GF1 doesnt show any kind of clipped orb like the one i see on the OMD.

More example of clipped color orbs (first one is a crop, others have to be seen in original size):

Never had such a problem with the GF1, here are examples taken exactly at the same spot with same conditions (and the same lens, still 7-14mm):

As you can see the flares are still there, but no orbs...

Thanks. Yes, I would be inclined to say that the intense purple blobs we see in the E-M5 shots but not those from the GF1 are due to the same phenomenon as the one discussed in the rest of this thread.

Anders W's gear list:
Panasonic Lumix DMC-G1 Olympus OM-D E-M5 Olympus E-M1 Panasonic Lumix G Vario 14-45mm F3.5-5.6 ASPH OIS Panasonic Lumix G Vario 7-14mm F4 ASPH +21 more
Anders W
Forum Pro • Posts: 17,376
Re: Adding another aspect
In reply to TORN, Mar 2, 2013

TORN wrote:

Very interesting read. As you might know, I am on the fence about leaving MFT because of this issue with the 7-14, my main lens.

I wonder if lenses play their part in this too. At least it might help to exclude this factor to get a more solid base for the theory. So my question is precisely this: does Olympus do anything with their lenses to lessen the purple flare effect that Panasonic does not do with its lens range? Of course lenses render and flare differently, but maybe some insight might be gained if we compare similar focal-length lenses from both brands on the E-M5 and look at how they handle the purple. The 7-14 vs 12, 20 vs 17, and 45 vs 45 come to my mind.

Unfortunately I do not have any pair of those and my E-M5 is in for repair but this could be an interesting test.

I think lenses play their part in this only in the sense that some of them are more prone to flare in general and/or to the kind of flare that leads to large angles of incidence when it hits the sensor (and therefore becomes purple). And no, I don't think Olympus does anything special with their lenses to prevent this. Note that the lens I used for my experiments in this thread was the Olympus 45/1.8.

I think it likely that the problems we now see with some lenses more than others, the 7-14 in particular (although there have been threads about similar problems with the 9-18), are due to oversights of one kind or another: certain things weren't tested carefully enough when the sensor was developed, and/or there would have been serious trade-offs with other design objectives had they tried to do something about it.

Anders W's gear list:
Panasonic Lumix DMC-G1 Olympus OM-D E-M5 Olympus E-M1 Panasonic Lumix G Vario 14-45mm F3.5-5.6 ASPH OIS Panasonic Lumix G Vario 7-14mm F4 ASPH +21 more
Cani
Regular Member • Posts: 344
Re: Hmmm... CC experiment implies non-linear process...
In reply to kenw, Mar 2, 2013

kenw wrote:

What I see:

First image (no filters of any kind) - There is obvious white flaring across the top of the image and there is also a rather obvious more localized purple flare.

Second image (polarizer) - The white flaring remains, but the purple flare is essentially gone. Most notably, the purple flare hasn't turned white - it has disappeared completely - that is to say if we made monochrome versions of the two images they would look different with the second image having no sign of the extra brightening in the region of the purple flare in the first image.

Third image (CC filters and WB adjust) - In this image the purple flare in the first image has now turned white, there is clearly additional flaring where the purple exists in the first image but now it is white instead of purple. This is distinct from the second image in which the flare associated with the purple regions has been completely eliminated.

To summarize what I see:

First image, general flare plus a purple "streak". Second image, general flare unchanged and purple "streak" completely gone. Third image, general flare unchanged and purple "streak" has turned white.

Thanks for having detailed what you see. I fully agree with your observations.

To summarize what I conclude from the tests:

First image: There is a strange purple streak; it is probably an optical reflection from some place (most flare is, of course). Why is it purple, though? Was the reflection itself purple (e.g. like we might see from some optical coatings) or is there something "funny" going on?

Second image: Demonstrates that the purple flare is from some reflection source, because Anders could eliminate it with a polarizer (as opposed to a scattering source, which wouldn't be polarized). Not really a surprise, but a useful illustration that there is a reflection someplace. Still left unanswered: is this reflection purple, or is it white and then changed to purple by some other process further down the chain?

There remains some ambiguity around "reflection source". The strong light source is indirect, so we are dealing with reflected or diffused light entering the lens. Besides, reflections can occur inside the optics and become a "reflection source". IMO the first experiment, i.e., the second image, shows either or both of: (i) a large portion of the flare plus the purple streak comes from polarized (non-direct, thus reflected/diffused) light, and (ii) if you polarize this (non-direct, thus reflected/diffused) incoming light in a certain way, when it reflects internally it produces neither flare nor purple cast.

Third image: This is the image that proves the reflection itself is not purple. If it was truly purple then the CC+WB trick wouldn't make it become the same color as the rest of the originally white flare. This test strongly implies that the offending reflection is likely white (or close to it) and being changed to appearing purple in the final image because of the sensor reflections past the CFA from off axis light that Anders illustrated. The implication is this part of the flare while actually white is originating from a point off the optical axis of the lens and that is why it turns purple when measured by the sensor. The rest of the flare that appears white in the first image is coming from close to the lens axis and so is measured as white.

Same here: the "outside" reflection cannot be purple, but what about the internal reflection (you give the example of optical coatings that could produce such a reflection, or diffraction)? However, the fact that most Panasonic bodies are unaffected by the purple flare should eliminate the possibility of the internal reflection being purple.

Now I still do not get it. I do not get the asymmetric impact of the color-channel imbalance.

Let's assume 12 pixels: 6 green, 3 blue, 3 red. Let's assume color-channel pollution: each pixel fails to capture incoming light that instead goes directly to an adjacent pixel (one of the 4 pixels with which it shares a side, not a corner). So the 6 green leak into 3 blue + 3 red (statistically). The 3 red leak into 3 green pixels. The 3 blue leak into 3 green pixels. So we end up exactly with what we started with. Why should the imbalance matter then?

Of course, if the light can strike a pixel adjacent by the corner, it is completely different.

But then it's a question of calculating the chances that a photon ends up in this or that adjacent pixel given the incidence angle, to determine how the imbalance plays out (I think).
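For what it's worth, the tally above can be written out in a few lines of Python (toy numbers: every pixel is assumed to start with the same signal s and to leak the same amount p sideways, which is the hidden assumption here):

```python
# Tally of the 12-pixel thought experiment: 6 green, 3 red, 3 blue pixels,
# each starting with the same signal s and leaking p to side-adjacent pixels.
# Greens feed red/blue evenly; red and blue pixels feed only greens.
s, p = 100.0, 10.0
greens, reds, blues = 6, 3, 3

green_total = greens * (s - p) + (reds + blues) * p  # all red/blue leakage goes to green
red_total = reds * (s - p) + greens * p / 2          # half the green leakage
blue_total = blues * (s - p) + greens * p / 2        # the other half

print(green_total, red_total, blue_total)  # 600.0 300.0 300.0 -- same totals as before the leak
```

With equal per-pixel signals the books do balance, so the puzzle is real; the question is what happens when each green pixel starts with more photons than the others.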

 Cani's gear list:Cani's gear list
Panasonic Lumix DMC-GH1 Olympus PEN E-P5 Panasonic Lumix G Vario 14-45mm F3.5-5.6 ASPH OIS Panasonic Lumix G 20mm F1.7 ASPH Panasonic Leica DG Macro-Elmarit 45mm F2.8 ASPH OIS +8 more
Reply   Reply with quote   Complain
technic
Veteran MemberPosts: 7,697Gear list
Like?
Re: Purple flare: Causes and remedies
In reply to Sanpaku, Mar 2, 2013

Sanpaku wrote:

The EM-5 sensor, as exceptional as it may be, has a coating on its attached hot mirror that reflects purple light off axis:

Many digital cameras of this type have such a hot mirror; often there is a combination of an absorption filter (cyan color) and a reflection (interference) filter that will produce the magenta/purple off-axis glow. So the question would then be why the E-M5 has this problem and not the many other cameras with similar hot mirrors. IMHO the hot mirror is an unlikely cause.

 technic's gear list:technic's gear list
Canon EOS 450D Canon EF 50mm f/1.8 II Canon EF 200mm f/2.8L II USM Canon EF 300mm f/4.0L IS USM Canon EF-S 15-85mm f/3.5-5.6 IS USM +4 more
Reply   Reply with quote   Complain
Anders W
Forum ProPosts: 17,376Gear list
Like?
Re: My findings are the same as Surefoot's and I think this is sensor reflections
In reply to JamieTux, Mar 2, 2013

JamieTux wrote:

The reason being that they are clearly focussed and detailed and always purple - using my GF2 they are just normal flare.

When digital SLRs first came out you had either green or purple elements like this quite a lot when using non-digital optimised lenses - I think Sigma actually digitally optimised their lenses by adding a coating to their rear element just to cut this kind of thing down.

Yes, but I think what you are talking about in the second paragraph is a partly different phenomenon, i.e., the digital-sensor-reflection effect, where the light is reflected from the sensor to the rear elements of the lens and then back again. See here:

http://thesybersite.com/minolta/sensor-reflection/

Meanwhile, it has become routine procedure to coat lens elements on both sides and I find no reason to think that this particular effect is of much importance with newly designed, native MFT lenses.

 Anders W's gear list:Anders W's gear list
Panasonic Lumix DMC-G1 Olympus OM-D E-M5 Olympus E-M1 Panasonic Lumix G Vario 14-45mm F3.5-5.6 ASPH OIS Panasonic Lumix G Vario 7-14mm F4 ASPH +21 more
Reply   Reply with quote   Complain
Anders W
Forum ProPosts: 17,376Gear list
Like?
Re: Hmmm... CC experiment implies non-linear process...
In reply to Cani, Mar 2, 2013

Cani wrote:

kenw wrote:

What I see:

First image (no filters of any kind) - There is obvious white flaring across the top of the image and there is also a rather obvious more localized purple flare.

Second image (polarizer) - The white flaring remains, but the purple flare is essentially gone. Most notably, the purple flare hasn't turned white - it has disappeared completely - that is to say if we made monochrome versions of the two images they would look different with the second image having no sign of the extra brightening in the region of the purple flare in the first image.

Third image (CC filters and WB adjust) - In this image the purple flare in the first image has now turned white, there is clearly additional flaring where the purple exists in the first image but now it is white instead of purple. This is distinct from the second image in which the flare associated with the purple regions has been completely eliminated.

To summarize what I see:

First image, general flare plus a purple "streak". Second image, general flare unchanged and purple "streak" completely gone. Third image, general flare unchanged and purple "streak" has turned white.

Thanks for having detailed what you see. I fully agree with your observations.

To summarize what I conclude from the tests:

First image: There is a strange purple streak; it is probably an optical reflection from some place (most flare is, of course). Why is it purple, though? Was the reflection itself purple (e.g., like we might see from some optical coatings) or is there something "funny" going on?

Second image: Demonstrates that the purple flare is from some reflection source because Anders could eliminate it with a polarizer (as opposed to a scattering source, which wouldn't be polarized). Not really a surprise, but a useful illustration that there is a reflection someplace. Still left unanswered: is this reflection purple, or is it white and then changed to purple by some other process further down the chain?

There remains some ambiguity around "reflection source". The strong source of light is indirect, so we are dealing with reflected or diffused light entering the lens.

No. The light from the halogen spots hits the lens directly, which it can certainly do even though the spots are outside the frame imaged by the lens.

Besides, reflections can occur inside the optics and become a "reflection source". IMO the first experiment, i.e., the second image, shows either or both of (i) a large portion of the flare plus the purple streak comes from polarized (non-direct, thus reflected/diffused) light and (ii) if you polarize this (non-direct, thus reflected/diffused) incoming light in a certain way, it produces neither flare nor purple cast when it reflects internally.

The reason why the polarizer reduces the purple in the second image is not that it blocks light that has already been polarized by prior reflection. It reduces the purple by polarizing the direct incoming light in such a way that certain reflections that would normally have taken place later along the optical path no longer take place. This is the reason why a circular polarizer doesn't work in this case. The circular polarizer "unpolarizes" the light before passing it on to the lens and therefore has no impact on reflections further down the optical path.
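As a rough numerical sketch of why the orientation of linear polarization matters for a later reflection, here are the Fresnel reflectances for a single air-glass surface (an assumed index of 1.5 and a single interface; the real lens stack is of course far more complex):

```python
import math

def fresnel_R(n1, n2, theta_i):
    """Power reflectances (Rs, Rp) for s- and p-polarized light at a dielectric interface."""
    theta_t = math.asin(n1 * math.sin(theta_i) / n2)  # Snell's law
    rs = (n1 * math.cos(theta_i) - n2 * math.cos(theta_t)) / \
         (n1 * math.cos(theta_i) + n2 * math.cos(theta_t))
    rp = (n2 * math.cos(theta_i) - n1 * math.cos(theta_t)) / \
         (n2 * math.cos(theta_i) + n1 * math.cos(theta_t))
    return rs ** 2, rp ** 2

Rs, Rp = fresnel_R(1.0, 1.5, math.radians(55))  # near Brewster's angle for glass

# Circularly polarized (or unpolarized) light: fixed 50/50 split between
# s and p, so the reflection is stuck at the average and cannot be suppressed.
print((Rs + Rp) / 2)
# Linearly polarized in the plane of incidence: only Rp remains, which is
# nearly zero at this angle, so that particular reflection largely disappears.
print(Rp)
```

Rotating a linear polarizer changes how the incoming light splits into s and p components at a given internal surface, which would explain why the purple comes and goes with polarizer orientation while a circular polarizer has no effect.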

Third image: This is the image that proves the reflection itself is not purple. If it were truly purple, the CC+WB trick wouldn't make it the same color as the rest of the originally white flare. This test strongly implies that the offending reflection is likely white (or close to it) and is changed to appear purple in the final image by the sensor reflections past the CFA from off-axis light that Anders illustrated. The implication is that this part of the flare, while actually white, originates from a point off the optical axis of the lens, and that is why it turns purple when measured by the sensor. The rest of the flare that appears white in the first image comes from close to the lens axis and so is measured as white.

Same here, the "outside" reflection cannot be purple, but what about the internal reflection (you give the example of optical coatings that could produce such a reflection, or diffraction?)? However, the fact that most Panasonic bodies are unaffected by the purple flare should eliminate the possibility of the internal reflection being purple.

Now I still do not get it. I do not get the asymmetric impact of the color-channel imbalance.

Let's assume 12 pixels: 6 are green, 3 are blue, 3 are red. Let's assume color-channel pollution: they all fail to capture incoming light that goes directly to an adjacent pixel (one of the 4 pixels with which they share a side, not a corner). So the 6 green --> 3 blue + 3 red (statistically). The 3 red --> 3 green pixels. The 3 blue --> 3 green pixels. So we end up exactly with what we started with. Why then should the imbalance matter?

Of course, if the light can strike a pixel adjacent by the corner, it is completely different.

But then it's a question of calculating the chances that a photon ends up in this or that adjacent pixel given the incidence angle, to determine how the imbalance plays out (I think).

The imbalance occurs because the green channel is so much stronger than red and blue. For a white target in normal daylight, roughly twice as many photons manage to pass the green filter as the red or the blue filter. When some of these photons "escape", the pollutive stream from green to red/blue becomes roughly twice as strong as the pollutive stream in the opposite direction. On top of that, you have the impact of the "amplification" applied in the course of image processing to the red and blue channels to help them reach parity with green in spite of the significantly smaller number of photons they can be expected to capture. In practice, this means that even a fairly limited amount of pollution can generate a rather strong color shift towards purple.
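A toy Python version of this arithmetic (all numbers made up: 2000 vs. 1000 photons per pixel and a hypothetical 10% leak to side-adjacent pixels) shows the shift:

```python
# Per-pixel raw signals for a white daylight patch: each green pixel is
# assumed to record roughly twice the photons of a red or blue pixel.
g_raw, r_raw, b_raw = 2000.0, 1000.0, 1000.0
leak = 0.10  # hypothetical fraction of photons landing on a side-adjacent pixel

# Per RGGB cell (2 greens, 1 red, 1 blue): every side-neighbor of a red or
# blue pixel is green, while green leakage splits evenly between red and blue.
g = g_raw * (1 - leak) + leak * (r_raw + b_raw) / 2  # 1900.0
r = r_raw * (1 - leak) + leak * g_raw                # 1100.0
b = b_raw * (1 - leak) + leak * g_raw                # 1100.0

# White balance still amplifies red and blue by ~2x to reach parity with green:
r_wb, g_wb, b_wb = 2 * r, g, 2 * b
print(r_wb, g_wb, b_wb)  # 2200.0 1900.0 2200.0 -> red and blue exceed green: purple
```

Without the leak the same white balance would give 2000/2000/2000, i.e., white; with it, a mere 10% of pollution already pushes red and blue about 15% above green.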

 Anders W's gear list:Anders W's gear list
Panasonic Lumix DMC-G1 Olympus OM-D E-M5 Olympus E-M1 Panasonic Lumix G Vario 14-45mm F3.5-5.6 ASPH OIS Panasonic Lumix G Vario 7-14mm F4 ASPH +21 more
Reply   Reply with quote   Complain
Anders W
Forum ProPosts: 17,376Gear list
Like?
Re: A very effective, albeit less practical way, to eliminate purple spots.
In reply to TrapperJohn, Mar 2, 2013

TrapperJohn wrote:

Anders, your very detailed approach got me to looking back at my EM5 and EP1 shots that have slightly out of frame light sources. Bright ones, like... the sun. Especially with the 7-14, which sucks in everything, there is often a light source just out of the visible frame.

Purple spots? None. I have never seen one, not even with the 7-14, not even with the sun just out of frame.

However... I typically have 4/3 HG and SHG ZD lenses mounted, or my beloved 4/3 PL25 1.4. The 7-14 I have is the ZD 7-14, quite a bit larger than the Panny 7-14.

All of the high grade ZD and PL 4/3 lenses have a telecentric light path, which is to say they straighten the light out so it hits the sensor at a perpendicular angle. This was done to eliminate edge problems on the earlier deep well sensors, and is probably the reason the better ZD lenses are renowned for their uniform sharpness, edge to edge, even on an ultrawide like the 7-14. To a degree, the telecentric light path accounts for their larger size.

M43 uses a shallow well sensor that isn't so fussy about how light hits it. Consequently, M43 lenses are not telecentric, it wasn't needed, and wasn't desirable from a lens size perspective. In fact, the shallow well sensor was key to M43's short registration distance.

My guess is - the telecentric light path also suppresses reflections from out of frame light sources - the straight path knocks out reflections by getting them to hit the out of frame area near the sensor straight on, rather than at an angle where they can reflect off of internal parts.

My guess also is that one doesn't see purple spots as much on Panasonic bodies because Panasonic tends to apply more PP and correction automatically in body. That's a tradeoff - you get more consistent photos and you can cut lens development cost by correcting in body rather than optically, but you can also lose detail if you lean on that too heavily - depending on the body to correct CA rather than eliminating it optically will cost some detail.

So there is a way to eliminate purple spots optically, before they ever hit the sensor: use telecentric 4/3 PL and ZD lenses. Not necessarily a practical solution for everyone, but it does work.

Hi John,

I am not sure what you mean by "shallow well sensor" here. The sensors used by MFT cameras are to my knowledge no different in any such regard from those used by FT cameras. Further, I think not only FT but also MFT lenses are designed to approach the objective of telecentricity, since that is generally an advantage for digital sensors. That said, it is of course possible that some FT lenses, e.g., the ZD 7-14, are, due to the amount and kind of flare they tend to produce, less likely to generate purple blobs or streaks on cameras like the E-M5 than, for example, the Panasonic 7-14.

As to in-camera software correction, the only difference is that Panasonic bodies auto-correct lateral CA on Panasonic lenses whereas Olympus bodies don't. This should be completely inconsequential for the kind of purple flare we are dealing with in this thread.

 Anders W's gear list:Anders W's gear list
Panasonic Lumix DMC-G1 Olympus OM-D E-M5 Olympus E-M1 Panasonic Lumix G Vario 14-45mm F3.5-5.6 ASPH OIS Panasonic Lumix G Vario 7-14mm F4 ASPH +21 more
Reply   Reply with quote   Complain
Cani
Regular MemberPosts: 344Gear list
Like?
Re: Color-channel imbalance
In reply to Anders W, Mar 2, 2013

Now I still do not get it. I do not get the asymmetric impact of the color-channel imbalance.

Let's assume 12 pixels: 6 are green, 3 are blue, 3 are red. Let's assume color-channel pollution: they all fail to capture incoming light that goes directly to an adjacent pixel (one of the 4 pixels with which they share a side, not a corner). So the 6 green --> 3 blue + 3 red (statistically). The 3 red --> 3 green pixels. The 3 blue --> 3 green pixels. So we end up exactly with what we started with. Why then should the imbalance matter?

Of course, if the light can strike a pixel adjacent by the corner, it is completely different.

But then it's a question of calculating the chances that a photon ends up in this or that adjacent pixel given the incidence angle, to determine how the imbalance plays out (I think).

The imbalance occurs because the green channel is so much stronger than red and blue. For a white target in normal daylight, roughly twice as many photons manage to pass the green filter as the red or the blue filter. When some of these photons "escape", the pollutive stream from green to red/blue becomes roughly twice as strong as the pollutive stream in the opposite direction. On top of that, you have the impact of the "amplification" applied in the course of image processing to the red and blue channels to help them reach parity with green in spite of the significantly smaller number of photons they can be expected to capture. In practice, this means that even a fairly limited amount of pollution can generate a rather strong color shift towards purple.

Hum...

You said in your explanation post : "Second, the number of photons recorded by the green pixels when the sensor is exposed to ordinary white/gray daylight is approximately twice as large as that recorded by a red or blue one."

I had thought you meant it as a consequence of having twice as many green pixels. Actually you say that not only do we have as many green pixels as red & blue pixels together, but each of them individually receives more light, I assume due to (i) the spectrum of typical incoming light (sunlight or bulb/tungsten) and (ii) the wavelength bandwidth of each channel (I could not find any info on how it is implemented in the Bayer array).

Thanks, it makes sense to me.

 Cani's gear list:Cani's gear list
Panasonic Lumix DMC-GH1 Olympus PEN E-P5 Panasonic Lumix G Vario 14-45mm F3.5-5.6 ASPH OIS Panasonic Lumix G 20mm F1.7 ASPH Panasonic Leica DG Macro-Elmarit 45mm F2.8 ASPH OIS +8 more
Reply   Reply with quote   Complain
Anders W
Forum ProPosts: 17,376Gear list
Like?
Re: Color-channel imbalance
In reply to Cani, Mar 2, 2013

Cani wrote:

Now I still do not get it. I do not get the asymmetric impact of the color-channel imbalance.

Let's assume 12 pixels: 6 are green, 3 are blue, 3 are red. Let's assume color-channel pollution: they all fail to capture incoming light that goes directly to an adjacent pixel (one of the 4 pixels with which they share a side, not a corner). So the 6 green --> 3 blue + 3 red (statistically). The 3 red --> 3 green pixels. The 3 blue --> 3 green pixels. So we end up exactly with what we started with. Why then should the imbalance matter?

Of course, if the light can strike a pixel adjacent by the corner, it is completely different.

But then it's a question of calculating the chances that a photon ends up in this or that adjacent pixel given the incidence angle, to determine how the imbalance plays out (I think).

The imbalance occurs because the green channel is so much stronger than red and blue. For a white target in normal daylight, roughly twice as many photons manage to pass the green filter as the red or the blue filter. When some of these photons "escape", the pollutive stream from green to red/blue becomes roughly twice as strong as the pollutive stream in the opposite direction. On top of that, you have the impact of the "amplification" applied in the course of image processing to the red and blue channels to help them reach parity with green in spite of the significantly smaller number of photons they can be expected to capture. In practice, this means that even a fairly limited amount of pollution can generate a rather strong color shift towards purple.

Hum...

You said in your explanation post : "Second, the number of photons recorded by the green pixels when the sensor is exposed to ordinary white/gray daylight is approximately twice as large as that recorded by a red or blue one."

I had thought you meant it as a consequence of having twice as many green pixels.

OK. I see now that I should have said "the number of photons recorded by each green pixel" for greater clarity.

Actually you say that not only do we have as many green pixels as red & blue pixels together, but each of them individually receives more light, I assume due to (i) the spectrum of typical incoming light (sunlight or bulb/tungsten) and (ii) the wavelength bandwidth of each channel (I could not find any info on how it is implemented in the Bayer array).

You can find some more information on the sensitivity of the individual channels if you look at the "color response" section of the DxOMark sensor tests.

 Anders W's gear list:Anders W's gear list
Panasonic Lumix DMC-G1 Olympus OM-D E-M5 Olympus E-M1 Panasonic Lumix G Vario 14-45mm F3.5-5.6 ASPH OIS Panasonic Lumix G Vario 7-14mm F4 ASPH +21 more
Reply   Reply with quote   Complain
kenw
Veteran MemberPosts: 4,239Gear list
Like?
Re: Excellent!
In reply to Anders W, Mar 2, 2013

Anders W wrote:

kenw wrote:

Oh, one last thought and aside. I haven't thought about it long enough, so I could be wrong, but given we need off axis light to cause the purple and that light is coming from a reflection somewhere in the lens it seems to me that the reflection causing it could only come from a reflection that occurs between the aperture stop and the sensor (i.e. reflections in front of the aperture stop can't turn purple as it isn't possible for them to be off axis).

Not sure I follow you entirely here. You may well be right but could you elaborate a bit on your idea that a reflection prior to the aperture stop could not pass the stop and subsequent lens elements to finally arrive at the sensor at an angle of incidence greater than that possible for anything the lens actually images?

Well, I should really add an important qualification to that now that I think about it.  I was considering cases more like the 7-14, or the 45/1.8 stopped down significantly.  In those cases the exit pupil is rather small compared to the rear element of the lens.

All image-forming light, from the perspective of the sensor, originates from the exit pupil - i.e. the image of the aperture stop.  I assume by design light originating from the exit pupil won't cause "purple effects"; the lenses are semi-telecentric for this reason.  All reflections prior to the aperture stop can of course only be seen through the exit pupil and thus wouldn't cause "purple effects" either.  (Please also see an important caveat in the answer to your second question.)

Reflections past the aperture stop, however, could appear at any angle from the sensor within the confines of the rear element.  That is to say the sensor would see the exit pupil and then well off axis from the exit pupil a reflection from elements on the sensor side of the aperture stop.

It is hard to say for sure, but in the "purple orbs" examples I think most of those blobs show aperture blades, which means they are reflections past the aperture stop and jibe with my half-baked theory...

Finally, do you have any further ideas about purple flare in the diagonal versus horizontal/vertical direction? An idea that has occurred to me is that the horizontal and vertical but not diagonal pattern shown by my second series of test shots might be due to the pollution being possible only between pixels that are in the same row or column but blocked diagonally. On the other hand, we have plenty of examples of purple blobs that do not take on this pattern but I cannot at present rule out the possibility that they could look the way they look even if diagonal pollution wouldn't be possible. Optical pathways are obviously pretty tricky to sort out in cases like these.

So I wonder if in your two examples (light just outside FoV and light dead center) we are in both cases seeing purple because of color channel pollution and color channel imbalance but that the optical cause of the pollution is different in the two cases.  That's sort of why I was curious about a purple diagonal flare from an outside the FoV source.  If you can generate (and correct) that it would seem to imply the horizontal/vertical pattern you see in the centered light example is from a different optical effect or reflection (and because of the vertical/horizontal effect something more related to the sensor or AA filter as the source of the polluting light).

-- hide signature --

Ken W
See profile for equipment list

 kenw's gear list:kenw's gear list
Sony RX100 Panasonic Lumix DMC-G1 Olympus OM-D E-M5 Panasonic Lumix DMC-GM1 Panasonic Lumix G Vario 14-45mm F3.5-5.6 ASPH OIS +26 more
Reply   Reply with quote   Complain