Purple flare: Causes and remedies (continued)

Anders W wrote:

The way I am thinking about it is as follows: Let's assume you are right that the flare as such is radially symmetric. However, only some of the rays making up that radially symmetric flare strike the sensor in such a way that pollution can occur, i.e., the rays come in at a sufficiently large angle of incidence and also happen to strike the sensor more or less vertically or horizontally rather than diagonally (if we provisionally accept that diagonal pollution is not possible). Consequently, what we see as purple flare (although we know that it is a sensor artifact) may be radially asymmetric although the flare as such isn't. Does that make any sense?
Perfect sense; that was what I finally concluded once I saw Ken's nice demonstration. I hadn't thought of it that way before his clear demonstration of "flare" on the diagonal not turning purple.
What I still cannot fully explain (I merely have some loose ideas) is why we can sometimes see distinct non-diagonal patterns, like the test images I showed in the previous threads, but also nicely round purple blobs like those exemplified by Surefoot here:

http://forums.dpreview.com/forums/post/50958291
Me neither. And I am also wary of concluding that just because something is purple it is color channel pollution. Given your demonstrations so far, it clearly needs to be high on the list of possible causes, but it won't always be the cause. That said, what Surefoot has shown with the GF1/E-M5 comparisons makes me think it is color channel pollution, even if it isn't apparent to me how.
 
kenw wrote:

A little more info. I just played with my 7-14 and some ceiling fixtures. One particular type of purple flare, the kind that occurs right at the image edge when the light source is just on the border of the frame (like Anders demonstrated in his very first post) appears to be the result of a reflection in the camera body - not the lens.

First thing to note about this type of flare: it is extremely angle sensitive. It only appears over a very narrow range of angles and has an obvious peak in effect at a particular angle.
Yes, I have noticed that too, and it is the same with the 45/1.8 that I used so nothing peculiar to the 7-14.
Here is the test, easy to perform in live view. Turn on the graduated cross grid to make it easy to see the center and axes and to measure distances. Have a light source and some other reference point placed at about the right distance to create the flare. From the same shooting position, generate the flare with the camera in both the horizontal and vertical orientations. You will notice:

- The flare occurs at exactly the same point just outside the frame despite the fact that the aspect ratio is 4:3 and not square. That is, the angle at which it occurs changes between the horizontal and vertical orientations (see the sketch after this list).

- This is made more obvious by having a reference point near the center of the image: you can clearly see that between the horizontal and vertical cases the angle from the optical axis that causes the flare changes.

- Look at the lens from the rear: it is perfectly radially symmetric and couldn't do this.

- Look inside the camera: there are baffles with sharp edges, and the sensor frame itself, which are about 4:3 in aspect ratio.

- Light reflected from these points would be steeply off axis and could turn purple by the mechanism Anders has described and demonstrated.
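A quick numeric check of that angle difference, as a minimal sketch (assuming a Four Thirds 17.3 × 13.0 mm sensor and the 7-14 at its 7 mm end; the real flange and baffle geometry will of course differ):

```python
# Half-angle of view at the frame edge for a Four Thirds sensor at 7 mm.
# A source just outside the long edge sits at a different angle from the
# optical axis than one just outside the short edge, so a reflection tied
# to a fixed frame position fires at different angles in the two
# orientations, as observed.
import math

focal_mm = 7.0
half_width_mm, half_height_mm = 17.3 / 2, 13.0 / 2

for label, half_mm in (("long edge", half_width_mm), ("short edge", half_height_mm)):
    angle = math.degrees(math.atan(half_mm / focal_mm))
    print(f"{label}: {angle:.1f} deg off axis")  # ~51.0 vs ~42.9 deg
```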

From those observations I conclude:

- This particular purple flare, in the 7-14 at least but I suspect also in Anders' 45/1.8 shots, occurs when the projected image of the light source strikes the edge of a baffle or perhaps some other frame edge in the sensor and filter stack.

- This reflection/purple flare is a characteristic of the body and sensor, not strictly of the lens. Different lenses probably differ in how likely they are to send light into this region of the camera body at an angle that causes the reflection.

- This particular purple flare (baffle/edge reflection) is only one type of reflection that could cause color channel pollution. Others are demonstrating other effects, I think (e.g., light near the center of the frame, purple orbs, other purple reflections that preserve image shape).

Sorry I can't contribute much more right now; my time for testing things is extremely limited these days...
A clever test along with good analysis, Ken. Playing with the fact that the image formed by the lens is circular while the sensor and its surroundings are rectangular does help distinguish one from the other.
 
kenw wrote:

I really apologize for suggesting a test to try without doing it myself; it's just not possible right now.

But after Ken's (the other Ken's) nice demonstration of the vertical/horizontal constraint on color channel pollution, this occurred to me as a test that would hopefully demonstrate it clearly.

Get nearly spectrally pure red and blue sources (e.g., LEDs). Illuminate the bare sensor at a steep angle with each, in both the horizontal and vertical orientations. Examine the RAW histograms, keeping G1 and G2 as separate histograms.

What we would expect to see is:

- In horizontal the red source pollutes G1 and the blue source pollutes G2.

- In vertical the red source pollutes G2 and the blue source pollutes G1.

The key here is to have sources that contain far less green light to begin with than the pollution will add to the green channels, so that we can clearly see the G1 and G2 pollution effects. Hence the suggestion of an LED source rather than filtered light (although a really good filter would work too).

And of course the numbering of G1 and G2 is arbitrary above; the point is that red and blue will affect them differently in a given orientation and will "swap" what they affect in the opposite orientation. This would illustrate that polluting light is clearly constrained to rows and columns, and it would also very clearly show color pollution.
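For the histogram step, something like the following minimal sketch should do, assuming a Python setup with rawpy and numpy (the file name is purely illustrative):

```python
# Split a Bayer RAW into its R, G1, B, G2 planes and summarize each one,
# so pollution of one green plane by a red or blue source shows directly.
import numpy as np
import rawpy

raw = rawpy.imread("red_led_horizontal.orf")  # hypothetical test shot
mosaic = raw.raw_image_visible.astype(np.float64)
plane_of = raw.raw_colors_visible  # 0=R, 1=G1, 2=B, 3=G2 on an RGBG sensor

for idx, name in enumerate(("R", "G1", "B", "G2")):
    plane = mosaic[plane_of == idx]
    print(f"{name}: mean={plane.mean():8.1f}  99th pct={np.percentile(plane, 99):8.0f}")
```

With a nearly pure red source, one green mean should sit clearly above the other, and the two should swap when the camera is rotated 90 degrees.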
Thanks. Good idea. If suitable LEDs are not available, I guess illumination from a computer screen showing a pure blue or pure red image should work too.
 
kenw wrote:
Anders W wrote:

The way I am thinking about it is as follows: Let's assume you are right that the flare as such is radially symmetric. However, only some of the rays making up that radially symmetric flare strike the sensor in such a way that pollution can occur, i.e., the rays come in at a sufficiently large angle of incidence and also happen to strike the sensor more or less vertically or horizontally rather than diagonally (if we provisionally accept that diagonal pollution is not possible). Consequently, what we see as purple flare (although we know that it is a sensor artifact) may be radially asymmetric although the flare as such isn't. Does that make any sense?
Perfect sense; that was what I finally concluded once I saw Ken's nice demonstration. I hadn't thought of it that way before his clear demonstration of "flare" on the diagonal not turning purple.
What I still cannot fully explain (I merely have some loose ideas) is why we can sometimes see distinct non-diagonal patterns, like the test images I showed in the previous threads, but also nicely round purple blobs like those exemplified by Surefoot here:

http://forums.dpreview.com/forums/post/50958291
Me neither. And I am also wary of concluding that just because something is purple it is color channel pollution. Given your demonstrations so far, it clearly needs to be high on the list of possible causes, but it won't always be the cause. That said, what Surefoot has shown with the GF1/E-M5 comparisons makes me think it is color channel pollution, even if it isn't apparent to me how.
I'll see what I can do to test that by means of purple filtering. Even if the 7-14 doesn't take filters, it should be possible to hold the filter in front of it so that it covers the circular purple blob I can hopefully create.

One other thing that I thought I should mention just in case it might give you or someone else some useful ideas:

I had a short exchange with Bob (bobn2) about sensor tech via PM recently. One interesting thing that he mentioned is that recent sensors have two microlenses rather than a single one on top of each pixel, which in Bob's view is a major part of the explanation for the transition from quantum efficiency figures of 40+ percent to 50+ percent. Here's what he said (I hope he doesn't mind me quoting it since it didn't seem to be a secret of any kind):

"I think the thing that has done it [the increase to 50+ percent] is the regular use of 2 element microlenses, a bit of retrofocus increases the f-number. Nikon started on the D3 and refined on the S, and the newer Exmors seem to have a 2 element structure as well. I guess the Toshiba too."

"There is the plastic layer on top, then a silicon nitride lens embedded in the silicon structure, under the CFA. I would be pretty sure that the MFT sensor has it, I would think the pixel design is Sony standard."
 
Anders W wrote:

What I still cannot fully explain (I merely have some loose ideas) is why we can sometimes see distinct non-diagonal patterns, like the test images I showed in the previous threads, but also nicely round purple blobs like those exemplified by Surefoot here:
You're seeing several different flare things. There's a multitude of different ones showing up at different angles. Some have the aperture's shape, others are fuzzier; I guess some come from rear and others from front lens elements...

Check this cellphone video: some "normal" flares, some diffraction-related (huge reddish blobs)

 
OK. Ken (W) was right to warn against generalizing too broadly and too quickly. Here is a series of shots with the 7-14, first on the E-M5 without a filter, then on the E-M5 with a purple filter (CC30M + CC70M + 80A) held over the light source, and finally with the G1 without a filter. As you can see, there is a big blue circular reflection in the first shot, which if anything becomes more intense with filtering but, as expected from previous comparisons I have made, is only weakly colored with the G1.

E-M5, no filter

E-M5, purple filter on light source

G1, no filter

The blue blob we can see in the E-M5 shots strongly reminds me of the ones we can see in the very first examples of "purple flare" with the 7-14 on the E-M5 that I remember seeing, i.e., those in the OP of this thread:

http://forums.dpreview.com/forums/post/42396372
 
kenw wrote:
Anders W wrote:

The way I am thinking about it is as follows: Let's assume you are right that the flare as such is radially symmetric. However, only some of the rays making up that radially symmetric flare strike the sensor in such a way that pollution can occur, i.e., the rays come in at a sufficiently large angle of incidence and also happen to strike the sensor more or less vertically or horizontally rather than diagonally (if we provisionally accept that diagonal pollution is not possible). Consequently, what we see as purple flare (although we know that it is a sensor artifact) may be radially asymmetric although the flare as such isn't. Does that make any sense?
Perfect sense; that was what I finally concluded once I saw Ken's nice demonstration. I hadn't thought of it that way before his clear demonstration of "flare" on the diagonal not turning purple.
What I still cannot fully explain (I merely have some loose ideas) is why we can sometimes see distinct non-diagonal patterns, like the test images I showed in the previous threads, but also nicely round purple blobs like those exemplified by Surefoot here:

http://forums.dpreview.com/forums/post/50958291
Me neither. And I am also wary of concluding that just because something is purple it is color channel pollution. Given your demonstrations so far, it clearly needs to be high on the list of possible causes, but it won't always be the cause. That said, what Surefoot has shown with the GF1/E-M5 comparisons makes me think it is color channel pollution, even if it isn't apparent to me how.
Have a look here:

http://forums.dpreview.com/forums/post/50966147
 
Anders W wrote:
kenw wrote:

Now at least your initial example flare and your subsequent diagonal test seem to show a non-radially symmetric flare of significant extent, i.e., extending hundreds of pixels in a non-radially symmetric direction. To me that implies that the extension over hundreds of pixels happened after striking the sensor surface. I can't immediately think of an obvious and plausible mechanism for that. Traveling under the CFA for hundreds of pixels seems unlikely; any multi-reflection path would seem far too attenuated to pull that off. Perhaps the micro-lens surface or sensor metallization can act like a diffraction grating and send light back up to the filter stack to be reflected back down (at an angle such that we get color channel pollution)?

Put another way, I expect any reflected ray to remain coplanar with all its reflected paths when passing through the lens and any flat surface. I also expect it to be coplanar with the optical axis of the lens. The horizontal/vertical flare propagation implies that at some point the light ray jumps out of that plane, makes a sudden "turn" so to speak. The only place that seems able to happen is at a non-smooth surface like the sensor itself.

Again, there is always the underlying caveat that the AA filter is strictly not radially symmetric, and it's also a candidate for being the source of any non-radially symmetric reflection. That said, I've never seen an example of that, while I have seen many examples of sensor reflections (e.g., red-dot disease).

Am I making some sense?
The way I am thinking about it is as follows: Let's assume you are right that the flare as such is radially symmetric. However, only some of the rays making up that radially symmetric flare strike the sensor in such a way that pollution can occur, i.e., the rays come in at a sufficiently large angle of incidence and also happen to strike the sensor more or less vertically or horizontally rather than diagonally (if we provisionally accept that diagonal pollution is not possible). Consequently, what we see as purple flare (although we know that it is a sensor artifact) may be radially asymmetric although the flare as such isn't.
This would imply that there are two mechanisms:
  1. Flare coming in at a flat angle reaches the photodiode if it strikes the sensor either horizontally or vertically, and is reflected somewhere else when striking diagonally. This is needed to explain why in the CC-filtered case the flare is still horizontal and vertical.
  2. If there is a channel imbalance (as there is with white light), the detected flare turns purple.
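One way to see mechanism 2 numerically, as a toy sketch with made-up but plausible numbers (on most sensors the white-balance gains for R and B are greater than for G):

```python
# Pollution that adds roughly equal raw signal to every channel gets
# amplified by the R and B white-balance gains, so the polluted region
# ends up with R and B above G, i.e., it drifts toward purple/magenta.
import numpy as np

wb_gains = np.array([2.0, 1.0, 1.6])      # illustrative R, G, B gains
scene = np.array([500.0, 1000.0, 625.0])  # raw RGB of a neutral grey patch
pollution = 300.0                         # equal raw offset in each channel

print(scene * wb_gains)                # [1000. 1000. 1000.] -> neutral
print((scene + pollution) * wb_gains)  # [1600. 1300. 1480.] -> purple cast
```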
 
Richard wrote:
Anders W wrote:

The original thread on this subject

http://forums.dpreview.com/forums/thread/3391811

has now expired. Since there appear to be things left to discuss, I started this new thread as a continuation of the first.
I don't have a Four Thirds camera, but I do get flare with lenses when pointing them into the sun or a strong light, sometimes even one just outside the frame. First, I make sure the lens hood is attached; in your case the hood is permanent, like the one on my Nikon 14-24/2.8. Then, looking through the viewfinder, if I see flare I change my angle or shield the lens. If my hand appears in the picture, I edit it out. It is the fastest, easiest, least expensive way of solving the problem. If there is another lens of a similar focal length that does not exhibit the problem, or shows much less of it, then buy it, or do your homework first; most reviews report how well a lens handles flare.

No lens is perfect; they all seem to balance defects against the goal of the lens. One may have more CA, vignetting, flare, barrel distortion, or other types of distortion.

To me, your flare problem is the easiest to get rid of compared to some of the other lens issues. Shield your lens from the light source with your hand or a piece of paper. If you are shooting with the sun in the picture and you have issues, look for a lens that handles this well and buy it.

Good luck to you. I notice this problem on rare occasions with some lenses, but I just shield with my hand, even handheld, so I don't worry about it. If the camera is on a tripod, it is even easier.
What you just said here is well known and widely practised, and therefore uninteresting to the point of boring. What Anders et al have been doing is intelectual work backed by well planned and reasoned experiments exploring the possible reasons of an observed phenomenon previously poorly explained, and thus very interesting. Hence the popularity of these threads.

Hacked and intuitive practical techniques might well work better than what organized research could turn up in the short term, but ultimately almost all elegant and reliable solutions are the results of the latter, especially as the human intellect further develops. In addition, to some people at least, the process of intellectual work is rewarding in itself. These threads are my exhibit A for this claim, if you need convincing.
 
Notice the purple orb?

This is using the M.Zuiko 12mm f/2. This is just to confirm it's not specific to Panasonic lenses.


It's much more difficult to reproduce; I suppose the huge front lens on the 7-14mm doesn't help it against flare...
 
noirdesir wrote:
Anders W wrote:
kenw wrote:

Now at least your initial example flare and your subsequent diagonal test seem to show a non-radially symmetric flare of significant extent, i.e., extending hundreds of pixels in a non-radially symmetric direction. To me that implies that the extension over hundreds of pixels happened after striking the sensor surface. I can't immediately think of an obvious and plausible mechanism for that. Traveling under the CFA for hundreds of pixels seems unlikely; any multi-reflection path would seem far too attenuated to pull that off. Perhaps the micro-lens surface or sensor metallization can act like a diffraction grating and send light back up to the filter stack to be reflected back down (at an angle such that we get color channel pollution)?

Put another way, I expect any reflected ray to remain coplanar with all its reflected paths when passing through the lens and any flat surface. I also expect it to be coplanar with the optical axis of the lens. The horizontal/vertical flare propagation implies that at some point the light ray jumps out of that plane, makes a sudden "turn" so to speak. The only place that seems able to happen is at a non-smooth surface like the sensor itself.

Again, there is always the underlying caveat that the AA filter is strictly not radially symmetric, and it's also a candidate for being the source of any non-radially symmetric reflection. That said, I've never seen an example of that, while I have seen many examples of sensor reflections (e.g., red-dot disease).

Am I making some sense?
The way I am thinking about it is as follows: Let's assume you are right that the flare as such is radially symmetric. However, only some of the rays making up that radially symmetric flare strike the sensor in such a way that pollution can occur, i.e., the rays come in at a sufficiently large angle of incidence and also happen to strike the sensor more or less vertically or horizontally rather than diagonally (if we provisionally accept that diagonal pollution is not possible). Consequently, what we see as purple flare (although we know that it is a sensor artifact) may be radially asymmetric although the flare as such isn't.
This would imply that there are two mechanisms:
  1. Flare coming in at a flat angle reaches the photodiode if it strikes the sensor either horizontally or vertically, and is reflected somewhere else when striking diagonally. This is needed to explain why in the CC-filtered case the flare is still horizontal and vertical.
  2. If there is a channel imbalance (as there is with white light), the detected flare turns purple.
Yes, those two are essentially the mechanisms I suggested in the prior thread (color-channel pollution and color-channel imbalance), supplemented by the hypothesis that pollution can only occur vertically or horizontally, not diagonally. In mechanism one, though, the diagonal light need not be reflected somewhere else but may simply be absorbed before hitting any photodiode.

As you can see in the post to which I link below, however,

http://forums.dpreview.com/forums/post/50966147

the round purple or bluish reflections shown there appear to be due to other mechanisms.
 
Anders W wrote:

Yes, those two are essentially the mechanisms I suggested in the prior thread (color-channel pollution and color-channel imbalance), supplemented by the hypothesis that pollution can only occur vertically or horizontally, not diagonally.
A square diffraction pattern may have diagonal components.

From http://en.wikipedia.org/wiki/Diffraction

But if the light source is relatively big, the image gets smeared and the central blob may cover them up, leaving just a smeared cross. However, try to tell the regular on-axis flares (regardless of tint) apart from the patterned flare, which is not on the axis. Please do test with small light sources. Notice that the diagonal dots are prominent in the already mentioned cellphone video, and thanks to the camera movement it is easy to tell the patterned flares from the rest (which might not be as clear from a single still image).
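A minimal numerical sketch of that pattern (Fraunhofer approximation of a square aperture via an FFT; all sizes are arbitrary):

```python
# The far-field intensity of a square aperture is a separable sinc^2
# pattern: strong lobes along the two axes and weaker ones on the
# diagonals, matching the cross-with-diagonal-dots appearance.
import numpy as np

n, half = 512, 16
aperture = np.zeros((n, n))
aperture[n//2 - half:n//2 + half, n//2 - half:n//2 + half] = 1.0

intensity = np.abs(np.fft.fftshift(np.fft.fft2(aperture))) ** 2
intensity /= intensity.max()

# First sidelobe on the axis vs on the diagonal; the diagonal one is the
# product of two axial sidelobes, hence much fainter but nonzero.
lobe = 24  # near the first sidelobe peak for this aperture size
print(f"axial:    {intensity[n//2, n//2 + lobe]:.2e}")
print(f"diagonal: {intensity[n//2 + lobe, n//2 + lobe]:.2e}")
```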


Mind you, this looks very different with different lenses.
 
Anders W wrote:
noirdesir wrote:

This would imply that there are two mechanisms:
  1. Flare coming in at a flat angle reaches the photodiode if it strikes the sensor either horizontally or vertically, and is reflected somewhere else when striking diagonally. This is needed to explain why in the CC-filtered case the flare is still horizontal and vertical.
  2. If there is a channel imbalance (as there is with white light), the detected flare turns purple.
Yes, those two are essentially the mechanisms I suggested in the prior thread (color-channel pollution and color-channel imbalance), supplemented by the hypothesis that pollution can only occur vertically or horizontally, not diagonally.
I think mechanism (1) goes beyond colour channel pollution in that it is not just light jumping from one pixel to another after passing the CFA, but that the AA filter, microlenses, and CFA filters 'block' diagonal light at shallow angles while letting aligned light at shallow angles through.
In mechanism one, though, the diagonal light need not be reflected somewhere else but may simply be absorbed before hitting any photodiode.
Which I meant to be included in "reflected somewhere else", i.e., reflected in a direction that leads it not to hit the photodiode, whether by hitting surfaces where it is absorbed or by being scattered enough to become undetectable.
As you can see in the post to which I link below, however,

http://forums.dpreview.com/forums/post/50966147


the round purple or bluish reflections shown there appear to be due to other mechanisms.
Yes, that looks like yet another mechanism.
 
tt321 wrote:
What you just said here is well known and widely practised, and therefore uninteresting to the point of boring. What Anders et al have been doing is intelectual work backed by well planned and reasoned experiments exploring the possible reasons of an observed phenomenon previously poorly explained, and thus very interesting. Hence the popularity of these threads.

Hacked and intuitive practical techniques might well work better than what organized research could turn up in the short term, but ultimately almost all elegant and reliable solutions are the results of the latter, especially as the human intellect further develops. In addition, to some people at least, the process of intellectual work is rewarding in itself. These threads are my exhibit A for this claim, if you need convincing.
It is a forum. I am entitled to my opinion and to post it here regardless of how simple the solution is. Pseudo-intellectuals like yourself who cannot even spell "practiced" or "intellectual" are invited to give input too, but in my experience, rules like "keep it simple, stupid" dominate over the elegant and reliable solutions you are suggesting, which, coming from pseudo-intellectuals, are usually not simple, elegant, or reliable. It is just a bunch of people arguing or trying to prove they have some technical prowess, when most can see they don't.

I guess, after seeing in some of the later posts that this is a problem, the issue is with the camera and not the lens. I am just glad that I use Nikon and Canon, which do not have these issues, or if they do, they are so negligible they are not worth talking about. But you can carry on with your pseudo-intellectual discussions without me. Good luck.
 
I see the orb. This photo and the previous post (from Anders) just encourage me to continue a practice I learned many decades ago: be careful when you point your camera at a very bright spot, especially the sun. It can cause flare and damage things.
 
Has interpixel capacitance been considered?

"In [CMOS] arrays, small amounts of stray capacitance can couple pixels to neighboring pixels and influence the voltage read for that pixel. This coupling is interpixel capacitance. [...]

"Interpixel capacitance creates two effects. The first and most obvious is that cross talk is generated—a strong signal in one pixel creates a weak signal in neighboring pixels. This observed cross talk may easily be mistaken for a more common cross talk, diffusion cross talk, which occurs when photocarriers generated within one pixel diffuse to adjacent pixels. A second effect naturally exists as well. The signal appearing in those neighboring pixels is a signal that should have appeared in the central pixel had there been no interpixel capacitance. The signal in the central pixel is therefore attenuated. This attenuation may also be mistaken for attenuation resulting from diffusion. [...]

"Interpixel capacitance is expected to become more significant with modern arrays. As detector array designers continue to strive for the simultaneous qualities of high pixel density requiring small distances between pixel centers, high quantum efficiency, low diffusion cross talk, and low latent images requiring 100% fill factor—small gaps between pixel implants, and high sensitivity low capacitance multiplexer nodes, the stray capacitance to neighboring pixels will be more pronounced. Stray capacitance to a detector node is the result of the presence of conductors adjacent to the detector node. Detector nodes must be conductive to accumulate charge. Thus, the nearest conductors adjacent to the pixels in the lowest capacitance detector arrays will be the neighboring pixels." (Moore et al. 2006 http://ridl.cfd.rit.edu/products/publications/intepixel paper downloaded from SPIE.pdf )
 
