Purple flare: Causes and remedies (continued)

(Continued from the original post)

My issue with the orbs is they don't show up at all on the GF1, not even gray. This is what leads me to think the sensor is sensitive to something else in the light spectrum (not purple specifically; I mean whatever is causing the light bleeding) that is not handled properly by internal lens coatings, as in the end there should be no such reflections at all. I totally agree with your conclusions, and I think this will be a tricky issue for manufacturers to solve, if they even admit the problem exists at all (because so far, they have been in total denial).

In the meantime, we'll have to find a solution especially for the 7-14, since other lenses might accept a polarizer, which mitigates the problem, but the 7-14 doesn't. How about a rear-mounted polarizer? One would have to cut up a cheap filter to experiment with it.
 
D#mn forum software, I just lost a whole post because it didn't save it when it told me that the limit of the thread was reached.

Two things I noticed when looking over my images:

1) Underexposure seems to affect the purple horizontal+vertical streaks of in-frame light sources more than those of outside/edge light sources.

2) Aperture does not affect the form/boundaries of in-frame light sources, but seems to affect their "sharpness/focus". In the following image, the big white center flare seems to be a reflection off the sensor onto the rear glass and back onto the sensor; its form follows the form of the aperture blades. The purple streaks only get more "in focus" and follow the form of the sunstar a bit more.

It should also be noted that the purple streaks are not exactly horizontal + vertical, but spread in a slight arc towards the diagonal (15° maybe?). The more the light source gets in frame/on the edge, the more the streaks seem to form beams instead of a rather blurry arc. Aperture again plays a role in the "blurriness" of these streaks, but not in their overall form and spread.

[Images: in-frame light source at different apertures, showing the white center flare and the purple streaks]
 
Surefoot wrote:

(Continued from the original post)

My issue with the orbs is they don't show up at all on the GF1, not even gray.
But they do show up with the GF1 too. Your own series of examples here

http://forums.dpreview.com/forums/post/50959063

shows that. With the E-M5 and the GF1 alike, you see a series of more or less circular reflections. Some of those reflections are intensely purple on the E-M5 but have some other, less deviant but nevertheless deviant, color with the GF1. That's all.
This is what leads me to think the sensor is sensitive to something else in the light spectrum (not purple specifically; I mean whatever is causing the light bleeding) that is not handled properly by internal lens coatings, as in the end there should be no such reflections at all.
Ideally, lenses shouldn't reflect. But there are no ideal lenses, so reflections are something we have to live with. However, sensors shouldn't show color-channel pollution, and some sensors don't. So let's hope that the problem in this regard with the latest MFT sensor generation is solved in the next.
I totally agree with your conclusions, and I think this will be a tricky issue for manufacturers to solve, if they even admit the problem exists at all (because so far, they have been in total denial).
I'd say they have remained pretty silent, except Apple, whose comments (as I pointed out in my OP in the previous thread) merely say what we already knew but do nothing to explain why the reflections suddenly go purple.
In the meantime, we'll have to find a solution especially for the 7-14, since other lenses might accept a polarizer, which mitigates the problem, but the 7-14 doesn't. How about a rear-mounted polarizer? One would have to cut up a cheap filter to experiment with it.
Other difficulties aside, it would be pretty tricky to adjust a rear-mounted polarizer for proper effect, and I think it likely that it would cancel only some rather than all of the purpleness in many scenes (as exemplified by my second set of test shots).

Although I am certainly concerned about the purple-flare problem, and would certainly prefer to be without it, I am not alarmed by it. I happily shot with the 7-14 on the E-M5 during summer vacations and didn't notice anything unusual at all. When the first threads about the purple-flare problem appeared on the forum, I went through all the images I had shot with the 7-14 but could find only a single scene (of a quite large number shot) where there was a trace of purple flare. Testing sessions aside, I have now run into the problem spontaneously on three occasions (three scenes) after more than half a year's shooting. In none of these cases was the problem very serious or the shot ruined, and by now I have a fairly good idea of how to fix the issue in PP without a whole lot of trouble.
 
Timur Born wrote:

D#mn forum software, I just lost a whole post because it didn't save it when it told me that the limit of the thread was reached.

Two things I noticed when looking over my images:

1) Underexposure seems to affect the purple horizontal+vertical streaks of in-frame light sources more than those of outside/edge light sources.
Affect how? And does the difference remain when you push the underexposed shot in PP so that it reaches the same brightness as a correctly exposed shot?
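(For anyone who wants to make that comparison rigorously, here is a minimal sketch of a brightness-matched push, assuming the rawpy/LibRaw bindings; the filename and EV value are hypothetical:)

```python
# Minimal sketch (not the PP actually used in this thread): develop a raw
# file linearly and push it by +2 EV so an underexposed frame can be
# compared with a correctly exposed one at equal brightness.
import rawpy
import numpy as np

EV = 2  # stops of underexposure to compensate

with rawpy.imread("under_2ev.ORF") as raw:
    # gamma=(1, 1) and no_auto_bright keep the development linear
    lin = raw.postprocess(gamma=(1, 1), no_auto_bright=True, output_bps=16)

pushed = np.clip(lin.astype(np.float64) * 2**EV, 0, 65535).astype(np.uint16)
# 'pushed' now matches the nominal brightness of the reference frame, so any
# remaining difference in the purple streaks is not just an exposure effect.
```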
2) Aperture does not affect the form/boundaries of in-frame light sources, but seems to affect their "sharpness/focus". In the following image, the big white center flare seems to be a reflection off the sensor onto the rear glass and back onto the sensor; its form follows the form of the aperture blades.
Reflections often follow the shape of the aperture blades. What makes you think that this has anything to do with reflections from the sensor to the rear of the lens and then back to the sensor again? It was hardly unusual to see flare of this kind/shape back in the film days, long before there was any shiny sensor for the light to reflect off.
The purple streaks only get more "in focus" and follow the form of the sunstar a bit more.

It should also be noted that the purple streaks are not exactly horizontal + vertical, but spread in a slight arc towards the diagonal (15° maybe?). The more the light source gets in frame/on the edge, the more the streaks seem to form beams instead of a rather blurry arc. Aperture again plays a role in the "blurriness" of these streaks, but not in their overall form and spread.
 
... was that the problem did not exist at all (some of us tried to contact them, with sample photos). They just suggested we use Zuiko lenses instead of Panasonic, hinting that either they did not understand the problem at all or they ignored it deliberately. Another answer was to contact Panasonic and return our 7-14mm.

As for the frequency of this problem, it's on the rare-occurrence side, but having the sun in the frame at 7mm can happen rather frequently, and as you know the best photos are made when the sun is low, which furthers the chances of producing the dreaded purple orbs (which cannot be handled in PP, as they destroy image information).

I'll be very curious to see what happens on the next generation of Olympus bodies (in 2014, I guess).

I'll also try to make new test photos with a tripod and the GF1 as reference; I really doubt there are such intense orbs (even gray-colored)...
 
Anders W wrote:

The original thread on this subject

http://forums.dpreview.com/forums/thread/3391811


has now expired. Since there appear to be things left to discuss, I started this new thread as a continuation of the first.
I don't have a 4/3rds camera, but I do get flaring on lenses when pointing them into the sun or strong light, or even sometimes when the source is just outside the frame. First, I make sure the lens hood is attached; in your case, the hood is permanent, like the one on my Nikon 14-24/2.8. Then, looking through the viewfinder, if I see flare I change my angle or shield the lens. If my hand appears in the picture, I edit it out. It is the fastest, easiest, least expensive way of solving the problem. If there is another lens of a similar focal length that does not exhibit the problem, or exhibits much less of it, then buy it, or do your homework first; most reviews report how well a lens handles flare.

No lens is perfect; designers seem to balance defects against the goals of the lens. One may have more CA, vignetting, flare, barrel distortion, or other types of distortion.

To me, your flare problem is the easiest to rid yourself of compared to some of the other lens issues. Shield your lens from the light source with your hand or a piece of paper. If you are shooting with the sun in the picture and you have issues, look for a lens that handles this well and buy it.

Good luck to you. I notice this problem on rare occasions with some lenses, but I just shade the lens with my hand, even hand-held, so I don't worry about it. If the camera is on a tripod, it is even easier.
 
Anders W wrote:

Not sure how you reason when you say that this demonstrates a vertical and horizontal constraint of hundreds of pixels. Could you please elaborate a bit?

Just like you, I think of the pollutive streams as local (adjacent pixels). But as far as I can see, this wouldn't prevent the streaks from extending vertically over a pretty long distance while being blocked diagonally, if pollution in that direction is impossible.
(quoted from this post: http://forums.dpreview.com/forums/post/50963888 )

My thinking (possibly flawed) is that everything in the optical system up to the sensor is radially symmetric (the only small exception being the AA filter). Every optical flare I've seen is also radially symmetric. This is a key point: a flare covering a large fraction of the image should be radially symmetric if it originated before the surface of the sensor (the only non-radially symmetric element in the optical system). Most flares do have a significant extent (hundreds of pixels or larger) across the image; they aren't point sources.

Now, your initial example flare and your subsequent diagonal test, at least, seem to show a non-radially symmetric flare of significant extent - i.e. one extending hundreds of pixels in a non-radially symmetric direction. To me that implies that the extension over hundreds of pixels happened after striking the sensor surface. I can't immediately think of an obvious and plausible mechanism for that. Traveling under the CFA for hundreds of pixels seems unlikely; any multi-reflection path would seem far too attenuated to pull that off. Perhaps the micro-lens surface or sensor metalization can act like a diffraction grating and send light back up to the filter stack to be reflected back down (at an angle such that we get color-channel pollution)?

Put another way, I expect any reflected ray to remain coplanar with all its reflected paths when passing through the lens and any flat surface. I also expect it to be coplanar with the optical axis of the lens. The horizontal/vertical flare propagation implies that at some point the light ray jumps out of that plane, makes a sudden "turn" so to speak. The only place where that could happen would seem to be a non-smooth surface like the sensor itself.


Again, there is always the underlying caveat that the AA filter is, strictly speaking, not radially symmetric, and it's also a candidate for being the source of any non-radially symmetric reflection. That said, I've never seen an example of that, while I have seen many examples of sensor reflections (e.g. red-dot disease).
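To make the coplanarity argument concrete, here is a tiny numeric check, a sketch under idealized flat-mirror assumptions only, not a statement about any real camera:

```python
# Toy check of the coplanarity argument: reflect a ray off an ideal flat
# surface and verify it stays in the plane spanned by the incident ray and
# the optical axis (its dot product with the plane normal stays zero).
import numpy as np

def reflect(d, n):
    """Reflect direction d off a surface with normal n."""
    n = n / np.linalg.norm(n)
    return d - 2 * np.dot(d, n) * n

axis = np.array([0.0, 0.0, 1.0])          # optical axis
ray = np.array([0.3, 0.0, 1.0])           # incident ray in the x-z plane
plane_normal = np.cross(ray, axis)        # normal of the ray/axis plane
refl = reflect(ray, np.array([0.0, 0.0, -1.0]))  # flat filter-stack surface
print(np.dot(refl, plane_normal))         # 0.0: the ray is still coplanar
```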

Am I making some sense?
 
Surefoot wrote:

... was that the problem did not exist at all (some of us tried to contact them, with sample photos). They just suggested we use Zuiko lenses instead of Panasonic, hinting that either they did not understand the problem at all or they ignored it deliberately. Another answer was to contact Panasonic and return our 7-14mm.

As for the frequency of this problem, it's on the rare-occurrence side, but having the sun in the frame at 7mm can happen rather frequently, and as you know the best photos are made when the sun is low, which furthers the chances of producing the dreaded purple orbs (which cannot be handled in PP, as they destroy image information).

I'll be very curious to see what happens on the next generation of Olympus bodies (in 2014, I guess).

I'll also try to make new test photos with a tripod and the GF1 as reference; I really doubt there are such intense orbs (even gray-colored)...
Your major problem is with orbs. Mine is with shapes closer to rectangles. I mainly see the issue in indoor shots with small lights that are very bright relative to the scene, or with sunlight coming through a (usually small) window or other small opening. See the sample photo I posted.
 
Surefoot wrote:

... was that the problem did not exist at all (some of us tried to contact them, with sample photos). They just suggested we use Zuiko lenses instead of Panasonic, hinting that either they did not understand the problem at all or they ignored it deliberately. Another answer was to contact Panasonic and return our 7-14mm.
OK. Then I understand what you mean by denial. And I am not, unfortunately, particularly surprised at the response you got. When will manufacturers start to learn how to handle these things the way they should?
As for the frequency of this problem, it's on the rare-occurrence side, but having the sun in the frame at 7mm can happen rather frequently, and as you know the best photos are made when the sun is low, which furthers the chances of producing the dreaded purple orbs (which cannot be handled in PP, as they destroy image information).
What PP techniques have you tried? Shots like those you showed would require quite a bit of work in PP even if there were no purple, because the non-purple orbs are disturbing too, and there are quite a few of them. Personally, I try to avoid arrays of orbs like those you got by shielding the lens with my hand when I see the reflections appear in the EVF (I have learned to look out for them in certain cases). A gray card or something might serve as a suitable hand extension for shielding purposes.
I'll be very curious to see what happens on the next generation of Olympus bodies (in 2014, I guess).
Well, at least we can hope that the camera/sensor designers are now thoroughly aware of the problem (or they are less smart than I hope/think they are).
I'll also try to make new test photos with a tripod and the GF1 as reference; I really doubt there are such intense orbs (even gray-colored)...
The orbs will surely appear to be as "intense" on the GF1. But the reason you see the purple ones as "intense" is simply that the color is so far off. There is nothing remarkable about them aside from that. Try a comparison with the GF1 with both images converted to black and white and you'll see.
 
Surefoot wrote:

As for the frequency of this problem, it's on the rare-occurrence side, but having the sun in the frame at 7mm can happen rather frequently, and as you know the best photos are made when the sun is low, which furthers the chances of producing the dreaded purple orbs (which cannot be handled in PP, as they destroy image information).
Far from an ideal solution, but workable for static subjects, is a technique I occasionally use for "normal" flare with UWA lenses and the sun. It works best tripod-mounted but can be done hand-held with just a little care as well.

Compose normally and shoot (with whatever resulting objectionable flare). Take two more shots, one panned significantly to the right and one to the left, but still overlapping with at least half the original image. In PP, create a panoramic alignment but keep the images on layers (don't blend or flatten yet). Using layer masks, you can now paint out any objectionable flare from the first image with the overlapping portions of the last two images. Those last two images may also have flare, but it will be on a different axis than the first, i.e. the flares won't overlap.
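For the compositing step, here is a rough sketch of the same idea in Python/OpenCV rather than Photoshop layers; the filenames and the hand-painted flare mask are hypothetical, and it assumes mostly-overlapping shots from the same position:

```python
# Sketch: replace a flared region of the center shot with aligned pixels
# from a panned shot. Alignment uses ORB feature matches + a homography.
import cv2
import numpy as np

def align(src, dst):
    """Warp src into dst's frame via feature matching and a homography."""
    orb = cv2.ORB_create(4000)
    k1, d1 = orb.detectAndCompute(src, None)
    k2, d2 = orb.detectAndCompute(dst, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(d1, d2), key=lambda m: m.distance)[:500]
    pts1 = np.float32([k1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([k2[m.trainIdx].pt for m in matches])
    H, _ = cv2.findHomography(pts1, pts2, cv2.RANSAC, 3.0)
    return cv2.warpPerspective(src, H, (dst.shape[1], dst.shape[0]))

base = cv2.imread("center.jpg")                # the shot with the bad flare
donor = align(cv2.imread("panned_left.jpg"), base)
mask = cv2.imread("flare_mask.png", cv2.IMREAD_GRAYSCALE)  # white = replace
m = cv2.merge([mask] * 3).astype(np.float64) / 255.0
out = (base * (1 - m) + donor * m).astype(np.uint8)
cv2.imwrite("deflared.jpg", out)
```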

There are of course practical limitations to this. It's easy to remember to do when a shot is going to include flare for sure, but not practical at all for "surprise" flares, most often from interior lighting or night shots. It can work well with the sun, though. I've used it where an otherwise "pleasant" looking flare pattern was disrupted by a particularly ugly bit of flare I wanted to get rid of.


And obviously a sensor that doesn't exacerbate the problem would be preferable!
 
Anders W wrote:

Affect how? And does the difference remain when you push the underexposed shot in PP so that it reaches the same brightness as a correctly exposed shot?
I don't see any purple in the in-frame shot, but I see very distinctive purple in the edge shot. The PP question isn't easily answered, because the shadow noise is pinkish and the light source is bluish, so the gradations between light/flare and noise become purple on their own. Still, the differences between the in-frame and edge images are considerable, even after heavy PP that very much pronounces anything purple.

[Images: the in-frame shot and the edge shot]
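(For reproducibility, here is one possible way to "pronounce anything purple" in software; a crude sketch with Pillow/NumPy, not the actual PP used for the shots above, and the filename is hypothetical:)

```python
# Crude purple-emphasis map: treat pixels where both red and blue clearly
# exceed green as "purple" and amplify them so faint casts become obvious.
import numpy as np
from PIL import Image

def purple_map(path, gain=4.0):
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.float64) / 255.0
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    purpleness = np.clip((np.minimum(r, b) - g) * gain, 0.0, 1.0)
    return Image.fromarray((purpleness * 255).astype(np.uint8))

purple_map("edge_shot.jpg").save("edge_purpleness.png")
```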

Reflections often follow the shape of the aperture blades. What makes you think that this has anything to do with reflections from the sensor to the rear of the lens and then back to the sensor again? It was hardly unusual to see flare of this kind/shape back in the film days, long before there was any shiny sensor for the light to reflect off.
Just my expectation that the lens wouldn't mirror back such a huge blob of light right in the center, so I thought the very last "hard wall" would be responsible. I didn't give it too much thought, because the relevant information was that the aperture only affects the purple streaks' "sharpness/blurriness", but neither their form nor their spread.
 
Timur Born wrote:
Anders W wrote:

Affect how? And does the difference remain when you push the underexposed shot in PP so that it reaches the same brightness as a correctly exposed shot?
I don't see any purple in the in-frame shot, but I see very distinctive purple in the edge shot. The PP question isn't easily answered, because the shadow noise is pinkish and the light source is bluish, so the gradations between light/flare and noise become purple on their own. Still, the differences between the in-frame and edge images are considerable, even after heavy PP that very much pronounces anything purple.
I'd rather say that the purple is generally more intense in the out-of-frame than in the in-frame shot, no matter which exposure you use. The same tendency is visible in the two series of test shots I showed in the previous thread.

[Images: the in-frame shot and the edge shot]
Reflections often follow the shape of the aperture blades. What makes you think that this has anything to do with reflections from the sensor to the rear of the lens and then back to the sensor again? It was hardly unusual to see flare of this kind/shape back in the film days, long before there was any shiny sensor for the light to reflect off.
Just my expectation that the lens wouldn't mirror back such a huge blob of light right in the center, so I thought the very last "hard wall" would be responsible. I didn't give it too much thought, because the relevant information was that the aperture only affects the purple streaks' "sharpness/blurriness", but neither their form nor their spread.
 
 
(from the previous thread)

> Yes, a hood surely would help against outside light sources. What remains are those light sources that are within the frame or on the edge of the frame, either partly inside or right on the edge. The latter seems to be the worst if you look at these three images.

> Interestingly, it seems that the more widely spread purple tint of the "outside the frame" light source gets more focused into horizontal+vertical streaks when the light source enters the frame.

You're seeing multiple flares combined, "ordinary" ones and "patterned" ones. A bright background image also diminishes the flare effect. It is best observed on a totally black background with a single small spotlight.

Try a similar sample with a smaller light source. Then you're more likely to isolate the diffraction (interference) grid pattern. Otherwise, a large light source may totally cover the most characteristic part of the diffraction pattern.
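A quick illustration of why a periodic structure produces a grid pattern; a toy NumPy model (idealized, not a simulation of any real sensor):

```python
# The far-field diffraction pattern of a periodic reflector is essentially
# its Fourier transform: a regular lattice of bright orders.
import numpy as np

n, period = 512, 8
grating = np.zeros((n, n))
grating[::period, ::period] = 1.0     # idealized periodic micro-structure
pattern = np.abs(np.fft.fftshift(np.fft.fft2(grating))) ** 2
# 'pattern' is a grid of sharp peaks; a small light source keeps the peaks
# separated, while a large source smears them together and hides the grid.
```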
 
Found a publicly available 2005 analysis of cross-talk (see p.4)

http://oatao.univ-toulouse.fr/304/1/estribeau_304.pdf

It appears that in the tested CMOS pixel array (monochrome, 13µm pitch) the response is far lower for diagonal pixels (but not only for those!?) and increases significantly with the wavelength.
 
kenw wrote:
Anders W wrote:

Not sure how you reason when you say that this demonstrates a vertical and horizontal constraint of hundreds of pixels. Could you please elaborate a bit?

Just like you, I think of the pollutive streams as local (adjacent pixels). But as far as I can see, this wouldn't prevent the streaks from extending vertically over a pretty long distance while being blocked diagonally, if pollution in that direction is impossible.
(quoted from this post: http://forums.dpreview.com/forums/post/50963888 )

My thinking (possibly flawed) is that everything in the optical system up to the sensor is radially symmetric (the only small exception being the AA filter). Every optical flare I've seen is also radially symmetric. This is a key point: a flare covering a large fraction of the image should be radially symmetric if it originated before the surface of the sensor (the only non-radially symmetric element in the optical system). Most flares do have a significant extent (hundreds of pixels or larger) across the image; they aren't point sources.

Now, your initial example flare and your subsequent diagonal test, at least, seem to show a non-radially symmetric flare of significant extent - i.e. one extending hundreds of pixels in a non-radially symmetric direction. To me that implies that the extension over hundreds of pixels happened after striking the sensor surface. I can't immediately think of an obvious and plausible mechanism for that. Traveling under the CFA for hundreds of pixels seems unlikely; any multi-reflection path would seem far too attenuated to pull that off. Perhaps the micro-lens surface or sensor metalization can act like a diffraction grating and send light back up to the filter stack to be reflected back down (at an angle such that we get color-channel pollution)?

Put another way, I expect any reflected ray to remain coplanar with all its reflected paths when passing through the lens and any flat surface. I also expect it to be coplanar with the optical axis of the lens. The horizontal/vertical flare propagation implies that at some point the light ray jumps out of that plane, makes a sudden "turn" so to speak. The only place where that could happen would seem to be a non-smooth surface like the sensor itself.

Again, there is always the underlying caveat that the AA filter is, strictly speaking, not radially symmetric, and it's also a candidate for being the source of any non-radially symmetric reflection. That said, I've never seen an example of that, while I have seen many examples of sensor reflections (e.g. red-dot disease).

Am I making some sense?
The way I am thinking about it is as follows: Let's assume you are right that the flare as such is radially symmetric. However, only some of the rays making up that radially symmetric flare strike the sensor in such a way that pollution can occur, i.e., the rays come in at a sufficiently large angle of incidence and also happen to strike the sensor more or less vertically or horizontally rather than diagonally (if we provisionally accept that diagonal pollution is not possible). Consequently, what we see as purple flare (although we know that it is a sensor artifact) may be radially asymmetric although the flare as such isn't. Does that make any sense?

What I still cannot fully explain (I merely have some loose ideas) is why we can sometimes see distinct non-diagonal patterns, like the test images I showed in the previous threads, but also nicely round purple blobs like those exemplified by Surefoot here:

http://forums.dpreview.com/forums/post/50958291
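One way to visualize the reasoning above: start from a radially symmetric flare and keep only the rays arriving near the row/column axes. A toy NumPy sketch, purely illustrative:

```python
# Toy model: a radially symmetric ring flare, masked to rays arriving
# within ~15 degrees of the sensor's row/column axes, where pollution is
# assumed possible.
import numpy as np

y, x = np.mgrid[-200:201, -200:201]
r = np.hypot(x, y)
theta = np.arctan2(y, x)
flare = np.exp(-(((r - 120) / 20) ** 2))          # symmetric ring flare
near_axis = np.minimum(np.abs(np.sin(theta)), np.abs(np.cos(theta))) < 0.26
pollution = flare * near_axis
# 'pollution' shows horizontal/vertical arc segments carved out of a
# radially symmetric flare: radially asymmetric even though the
# underlying flare is not.
```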
 
Cani wrote:

Found a publicly available 2005 analysis of cross-talk (see p.4)

http://oatao.univ-toulouse.fr/304/1/estribeau_304.pdf

It appears that in the tested CMOS pixel array (monochrome, 13µm pitch) the response is far lower for diagonal pixels (but not only for those!?) and increases significantly with the wavelength.
Thanks! That might prove helpful.
 
A little more info. I just played with my 7-14 and some ceiling fixtures. One particular type of purple flare, the kind that occurs right at the image edge when the light source is just on the border of the frame (like Anders demonstrated in his very first post), appears to be the result of a reflection in the camera body, not the lens.

The first thing to note about this type of flare: it is extremely angle-sensitive. It only appears over a very narrow range of angles and has an obvious peak in effect at a particular angle.


Here is the test, easy to perform in live view. Turn on the graduated cross grid to make it easy to see the center and axes and to measure distances. Place a light source, and some other reference point, at about the right distance to create the flare. From the same shooting position, generate the flare with the camera in both the horizontal and vertical orientations. You will notice:

- The flare occurs at exactly the same point just outside the frame despite the fact that the aspect ratio is 4:3 and not square. That is, the angle at which it occurs changes between the horizontal and vertical orientations (see the quick arithmetic after this list).


- This is made more obvious by having a reference point near the center of the image; you can clearly see that in the horizontal and vertical cases the angle from the optical axis that causes the flare changes.

- Look at the lens from the rear: it is perfectly radially symmetric and couldn't do this.

- Look inside the camera: there are baffles with sharp edges, and the sensor frame itself, which have roughly a 4:3 aspect ratio.

- Light reflected from these points would be steeply off-axis and could turn purple by the mechanism Anders has described and demonstrated.
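Quick arithmetic behind the first observation (a sketch assuming a 17.3 × 13 mm Four Thirds sensor and the 7 mm rectilinear focal length, both of which are assumptions about the setup):

```python
# The half-angle to the long edge differs from that to the short edge, so a
# flare pinned to a point just outside the frame sits at different off-axis
# angles in the horizontal and vertical orientations.
import math

f = 7.0                                  # focal length, mm
half_long, half_short = 17.3 / 2, 13.0 / 2
print(math.degrees(math.atan2(half_long, f)))   # ~51.0 deg (long side)
print(math.degrees(math.atan2(half_short, f)))  # ~42.9 deg (short side)
```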


From those observations I conclude:

- This particular purple flare, in the 7-14 at least but I suspect also in Anders's 45/1.8 shots, occurs when the projected image of the light source strikes the edge of a baffle, or perhaps some other frame edge in the sensor and filter stack.

- This reflection/purple flare is a characteristic of the body and sensor, not strictly of the lens. Different lenses probably just have a better or worse chance of getting light into this region of the camera body at an angle that causes the reflection.


- This particular purple flare (baffle/edge reflection) is only one type of reflection that can cause color-channel pollution. Others here are demonstrating other effects, I think (e.g. light near the center of the frame, purple orbs, other purple reflections that preserve image shape).

Sorry I can't contribute much more right now, my time for testing things is extremely limited these days...
 
I really must apologize for suggesting a test to try without doing it myself; it's just not possible right now.

But after Ken's (the other Ken's) nice demonstration of the vertical/horizontal constraint on color-channel pollution, the following occurs to me as a test that would hopefully demonstrate it clearly.

Get nearly spectrally pure red and blue sources (e.g. LEDs). Illuminate the bare sensor at a steep angle with each, in both the horizontal and vertical orientations. Examine the RAW histograms, keeping G1 and G2 as separate histograms.

What we would expect to see is:

- In horizontal the red source pollutes G1 and the blue source pollutes G2.

- In vertical the red source pollutes G2 and the blue source pollutes G1.

The key here is to have sources that contain far less green light to begin with than will end up polluting the green channels, so that we can clearly see the G1 and G2 pollution effects. Hence the suggestion of an LED source rather than filtered light (although a really good filter would work too).

And of course the numbering of G1 and G2 above is arbitrary; the point is that red and blue will affect them differently in a given orientation and will "swap" what they affect in the opposite orientation. This would illustrate that polluting light is clearly constrained to rows and columns, and it would also very clearly show color pollution.
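For reading out G1 and G2 separately, here is a minimal sketch assuming the rawpy (LibRaw) bindings, whose 'RGBG' color indices distinguish the two greens; the filename is hypothetical:

```python
# Compare the two green channels of a raw file; a large G1/G2 asymmetry
# that swaps when the camera is rotated 90 degrees would directly support
# the row/column pollution hypothesis.
import rawpy
import numpy as np

with rawpy.imread("red_led_horizontal.ORF") as raw:
    data = raw.raw_image_visible.astype(np.float64)
    colors = raw.raw_colors_visible          # 0=R, 1=G1, 2=B, 3=G2
    black = np.mean(raw.black_level_per_channel)
    g1 = data[colors == 1] - black
    g2 = data[colors == 3] - black
    print("G1 mean: %.1f   G2 mean: %.1f" % (g1.mean(), g2.mean()))
```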
 
