AA filters

You can't make any generalizations, especially without referring to a particular camera. Whether it's rare or not depends on many factors.
I can, and I have. One only has to look at the photos of cameras without blur filters.
That's an invalid comparison. You're also assuming there is a single AA filter for all cameras, and calling it a blur filter indicates you don't understand what it's actually for and why it's needed.
Then you also disagree with Alphoid? Because he seems to think AA filtering is one specific thing for every camera, to the point that he can claim it is noticeable at certain viewing sizes and not noticeable at others. Either an AA filter affects a photo in exactly one way (depending on the camera), or it doesn't.

No it isn't. What's increased are alias artifacts, or false detail that wasn't actually there.

What you're claiming is mathematically impossible and no amount of your arguing is going to change math & physics.
Right, so the manufacturers like Leica are simply fooling photographers and what photographers are seeing is a figment of their imagination.
Leica does not have the ability to change math and physics.

The fact is that by omitting an anti-alias filter, there will be more aliasing, which is why it's called an anti-alias filter. It's basic signal theory.
Well, there is obviously a flip side to that, which you have deemed a negative and yet affects the entire image.

We don't see individual pixels in most print sizes and yet they affect image quality.
As a whole they do. Not individually, and you're moving the goalposts. The issue is alias artifacts, which an AA filter reduces and/or eliminates.
Comments like that are further proof you don't understand anything about signal theory or aliasing.
I understand what my eyes can see. Perhaps you should tell Leica, and the other camera manufacturers, they don't understand either.
You don't understand what your eyes see or you wouldn't be making absurd comments as you have in this thread and others.
AA filters do not reduce resolution. They bandlimit what can't be resolved, which is a requirement for a properly implemented sampling system. Otherwise you get aliasing artifacts.

Since no perfect AA filter exists, there are tradeoffs to be made. It's just the way it is. We live in an imperfect world.
First you say "AA filters do not reduce resolution" and then you say "no perfect AA filter exists, there are tradeoffs to be made?"
If you understood signal theory, you'd not be asking that.
I think this argument can go round and round, because it's not really about AA filters and signal theory. It's about standards (as much of what is on DPR is about). Nobody is saying AA filters are not needed; we are saying that the current resolution ceiling imposed by an artificial LPF is too low. We would rather deal with occasional moire than deal with less resolution all the time.

I would rather have that aliasing destruction come from a natural barrier, such as diffraction, which would also allow me, in theory, the most resolution I can get. What we have here are two groups of people who are willing to accept AA filtering, but one side says I want more resolution before we hit that limit, and the other side says current resolution is good enough.

You guys are in the "good enough" camp. You are ok with using artificial filters which limit your resolution to today's standards. We want more, we want better. Better is possible, and in time it will arrive. All I can say is it's a good thing the world's engineers are not all in the "good enough" camp, otherwise we might still be using manual focus film cameras, or worse yet, not have cameras at all...
 
Ontario Gone wrote:
That link is a technical description, one I have already read, and one that doesn't relay how it changes the appearance within an image.
If you'd like to see it, replicate it in Photoshop:
  • Take a high resolution image. That's what you're taking a photo of.
  • Make four images, each shifted by one pixel (original, up, left, and up+left). Average them. That's the output of the OLPF, as per Wikipedia's description.
  • Downsample by a factor of 2. That's approximately what's at your sensor.
  • Downsample by a factor of 4. That's what shows up on your monitor at full size when viewing on a 4k monitor and shooting with a 36MP sensor.
Or if you prefer, just look at the D800 and D810 images at any size other than 100% crop.
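If you'd rather script it than click through Photoshop, here's a rough NumPy/PIL sketch of those same steps ('photo.png' is a placeholder filename, and the edge handling is deliberately crude):

```python
# Minimal simulation of the four-way shift-and-average OLPF described above.
import numpy as np
from PIL import Image

img = np.asarray(Image.open('photo.png').convert('L'), dtype=np.float64)

# Average the original with copies shifted by one pixel: up, left, up+left.
# np.roll wraps around at the borders, which is fine away from the edges.
olpf = (img
        + np.roll(img, -1, axis=0)
        + np.roll(img, -1, axis=1)
        + np.roll(img, -1, axis=(0, 1))) / 4.0

# Naive downsample by 2: roughly what lands on the sensor.
sensor = olpf[::2, ::2]

Image.fromarray(np.clip(sensor, 0, 255).astype(np.uint8)).save('olpf_sim.png')
```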
That sounds like averaging; the same can be done with median NR or downsampling.
All blurring is some form of averaging. It's how you weight points when you average. AA filters average over a narrow radius. Gaussian ones average over a very large radius, with the center weighted heavily.

Median is very different from averaging, in the case of image processing.
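A tiny demo of that difference (the values are made up, just to show the behavior at an edge):

```python
# A mean filter blends values across an edge; a median snaps to one side.
import numpy as np

row = np.array([0, 0, 0, 100, 100, 100], dtype=float)
window = row[1:4]                       # [0, 0, 100], straddling the edge

print(window.mean())      # 33.3... -> averaging blurs the edge
print(np.median(window))  # 0.0     -> median preserves it
```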
Well, they will have to work around CA and SA as well at those super fast apertures. Not to mention, this does not solve anything for people who don't want to shoot at f/0.5 DOF. You do know what DOF is, right? If you do, then you will know diffraction will still be an issue for somebody who NEEDS a shot at f/16, which means, poof!!, there is their AA filter: it's called diffraction.
That's a hard engineering problem. On resolution vs. DoF, you're limited by Heisenberg. That doesn't mean it's unsolvable. The Lytro, 70D, and other lightfield cameras, for example, have split pixels, which let you shoot wide aperture while independently sampling different parts of the aperture. This means you can correct for both DoF and many optical aberrations in PP. Unfortunately, it brings back diffraction limits as well.
But let's look at your premise. Lens engineers are going to create super fast aperture lenses,
That's actually not my premise. My premise is I cannot predict the future, and camera engineers are good at solving hard problems. 10 years ago, I wouldn't have predicted a lens like the one in the G7X or RX100III would be out in a decade. It's quite compact, and has the performance of a dSLR kit lens in a much smaller form factor.

I don't know which will happen first. To date, progress on sensors has been faster, so that's more likely overall, but there are good reasons to believe progress on sensor resolution might slow. I can come up with many half-baked ideas -- from Fresnel-like optics, to sensor arrays, to lightfield microlenses. I can tell you most won't work. I can also tell you some will. Which will win? I'm not arrogant enough to predict.
with the sole purpose of resolving more detail due to less diffraction, all part of some elaborate plan so that same minute detail can then be snuffed out by a low pass filter??
It's called oversampling. It is quite important in preserving fidelity. Why do you think we sample audio at 96kHz, when we can't hear above 20kHz?
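A quick way to see the folding that oversampling headroom protects against (a NumPy sketch, with 44.1 kHz assumed for illustration):

```python
# A 30 kHz tone sampled at 44.1 kHz is indistinguishable from a 14.1 kHz
# alias: past Nyquist (fs/2), frequencies fold back into the audible band.
import numpy as np

fs = 44_100
t = np.arange(0, 0.01, 1 / fs)

tone  = np.sin(2 * np.pi * 30_000 * t)          # above Nyquist
alias = np.sin(2 * np.pi * (30_000 - fs) * t)   # folded-down "ghost"

print(np.allclose(tone, alias))  # True: the sampled data is identical
```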
 
Ontario Gone wrote:
That link is a technical description, one I have already read, and one that doesn't relay how it changes the appearance within an image.
If you'd like to see it, replicate it in Photoshop:
  • Take a high resolution image. That's what you're taking a photo of.
  • Make four images, each shifted by one pixel (original, up, left, and up+left). Average them. That's the output of the OLPF, as per Wikipedia's description.
  • Downsample by a factor of 2. That's approximately what's at your sensor.
  • Downsample by a factor of 4. That's what shows up on your monitor at full size when viewing on a 4k monitor and shooting with a 36MP sensor.
Or if you prefer, just look at the D800 and D810 images at any size other than 100% crop.
That sounds like averaging; the same can be done with median NR or downsampling.
All blurring is some form of averaging. It's how you weight points when you average. AA filters average over a narrow radius. Gaussian ones average over a very large radius, with the center weighted heavily.
Well, I regularly use a 1 pixel radius or smaller when sharpening, sometimes as small as 0.2 if I'm processing inside PS with good IQ. Here is an example of a shot with thin DOF, but you can clearly see it in the eyelashes and in the part of the flower that's in focus. Between the two, even viewed at only 50% and with only 100 sharpening with zero "detail", the difference is there.

Save the originals to your desktop and toggle back and forth, you don't need to view at 100% to see the difference. This is also with only 100 sharpening and no detail slider, and with only 1 pixel radius.

[two attached image crops for comparison]

It's important to remember that the extent of sharpening one can push is limited by how much artifacting is acceptable, but if 50% is the expected viewing size, one can push my numbers much further (not to mention these were ISO 500). All this is on top of the fact that removing AA won't increase noise, so as long as there is no moire, there is no downside to removing the LPF.
But let's look at your premise. Lens engineers are going to create super fast aperture lenses,
That's actually not my premise. My premise is I cannot predict the future, and camera engineers are good at solving hard problems. 10 years ago, I wouldn't have predicted a lens like the one in the G7X or RX100III would be out in a decade. It's quite compact, and has the performance of a dSLR kit lens in a much smaller form factor.
There is nothing mystical about those lenses. They, like smaller formats such as MFT, are smaller because their aperture is smaller. The F-numbers deceive, because the sensors are small. It's still the same physics that ruled ten years ago.
I don't know which will happen first. To date, progress on sensors has been faster, so that's more likely overall, but there are good reasons to believe progress on sensor resolution might slow. I can come up with many half-baked ideas -- from Fresnel-like optics, to sensor arrays, to lightfield microlenses. I can tell you most won't work. I can also tell you some will. Which will win? I'm not arrogant enough to predict.
Actually, you can't tell us some will work, because, as you admitted a paragraph ago, you cannot predict the future. Luckily some of us can, and we are telling you that sooner rather than later, diffraction will be the new AA filter, and we will get as much resolution as it will allow at each aperture, with the exception of what aberrations block out at f/1-ish stops. And all this will be achieved without any more aliasing.

--
"Run to the light, Carol Anne. Run as fast as you can!"
 
This would be ideal. The brilliant thing about the K3 approach is that by controlling the pattern of motion, Pentax could design whatever OLPF they want. I doubt they did this -- they're probably just vibrating it -- but with a little bit of thought, they could make as close to an ideal OLPF as they want.
I'm sure Nikon just throws random pieces of glass into a plastic tube and calls it a lens, too.
 
I think this argument can go round and round, because it's not really about AA filters and signal theory.
Of course it's about signal theory. That defines how digital cameras work.
It's about standards (as much of what is on DPR is about). Nobody is saying AA filters are not needed,
Some people do.
we are saying that the current resolution ceiling imposed by an artificial LPF is too low. We would rather deal with occasional moire than deal with less resolution all the time.
It's not the AA filter that's limiting it. It's the sensor. All the AA filter does is band-limit what is beyond the ability of the sensor to properly resolve and would otherwise cause alias artifacts. The problem is that after sampling, you can't remove aliasing because it's indistinguishable from real data.

The solution is a higher resolution sensor, or oversampling. The tradeoff is that smaller pixels have more noise and there is a lot more data to move and write to a memory card. Those problems are solvable, and eventually will be.
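A toy sketch of why that works (the stripe frequency and 4x factor are arbitrary choices): decimate a fine pattern directly and a coarse false pattern appears; average blocks first, a crude stand-in for oversampling plus a digital low-pass, and it is strongly attenuated:

```python
import numpy as np

n = 512
x = np.arange(n)
stripes = np.tile(np.sin(2 * np.pi * 0.45 * x), (n, 1))  # fine, real detail

naive = stripes[::4, ::4]  # skip pixels: the detail aliases to a low frequency
boxed = stripes.reshape(n // 4, 4, n // 4, 4).mean(axis=(1, 3))  # prefilter

# Both rows show energy at the aliased frequency, but the box prefilter
# attenuates the false pattern by roughly 7x. A box is an imperfect
# low-pass; a better digital filter would suppress it further.
print(np.abs(np.fft.rfft(naive[0])).max())
print(np.abs(np.fft.rfft(boxed[0])).max())
```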
 
Hasselblad, Leica, Pentax, Nikon, Sony, Olympus, Fuji, Phase One, all wrong. Some guy on the internet correct?
Actually, he is correct and his reasoning was sound. Apart from the medium format cameras, all the brands above used to use AA filters and some/most/all still do. The AA filter has its use. But market forces may demand superior micro contrast for pixel peepers instead of what would be best for the vast majority of customers. What cameras have is not about what is best, but about what sells.
The market has spoken and the AA filter is as gone as the bustle, at least for those for whom detail makes a difference, and good riddance to it. Like getting a new pair of glasses.
It would be more productive if you'd argue with evidence and logic instead of listing camera brands. As it was, your reply to the OP was quite rude.
 
Awesome service to the thread, good sir. I think it's pretty clear that while the difference isn't world changing, the "non AA" detail is noticeably clearer, especially on the green dots in the lower right side. I'm guessing this won't be enough though; the skeptic squad will surely find fault with either your samples or the K3 itself.

This would be ideal. The brilliant thing about the K3 approach is that by controlling the pattern of motion, Pentax could design whatever OLPF they want. I doubt they did this -- they're probably just vibrating it -- but with a little bit of thought, they could make as close to an ideal OLPF as they want.
I'm sure Nikon just throws random pieces of glass into a plastic tube and calls it a lens, too.
ROFL haha yea that sounds about right for one of the largest camera companies in the world :-D
 
I think this argument can go round and round, because it's not really about AA filters and signal theory.
Of course it's about signal theory. That defines how digital cameras work.
I don't like playing semantics.

It's about standards (as much of what is on DPR is about). Nobody is saying AA filters are not needed,
Some people do.
No, if they are educated in the matter they realize there is no getting away from AA filters, eventually. Trust me, everybody in this thread would prefer to NOT have moire and false color, they just dislike fuzzy images even more (not to mention moire is relatively rare in the average photo). Eventually people like me will get our way, we will get images that are as sharp or sharper than what a D7100 or K5IIs offers, without moire, since pixel frequency will continue to increase and eventually surpass that of diffraction at any usable aperture.

So I must disagree, none of us want to go without AA filters, we just happen to dislike softness more than we dislike the occasional false color. I would rather push resolving power to the natural limits ASAP.

we are saying that the current resolution ceiling imposed by an artificial LPF is too low. We would rather deal with occasional moire than deal with less resolution all the time.
It's not the AA filter that's limiting it. It's the sensor.
Again semantics. That's like saying it's pixel size that determines pixel level diffraction, instead of aperture. It's both, depending on your POV. They both matter. Yes, as I mentioned many times this thread, once sensors are dense enough, the problem solves itself.

All the AA filter does is band-limit what is beyond the ability of the sensor to properly resolve and would otherwise cause alias artifacts. The problem is that after sampling, you can't remove aliasing because it's indistinguishable from real data.
Yes. Which is why we wish we could have our cake and eat it too. Eventually, we will.

The solution is a higher resolution sensor, or oversampling. The tradeoff is that smaller pixels have more noise and there is a lot more data to move and write to a memory card. Those problems are solvable, and eventually will be.
More pixels do NOT have more noise; this has been discussed to death. Look at any comparison, A77 vs A58, for example. Even though the sensor in the A58 has fewer pixels and is a newer sensor, the two score nearly identically throughout the entire ISO range (with the A77 slightly ahead at base ISO).

Want another? Try the D810 vs D4s. The D810 (having more than 2x as many pixels) is as good or better through the majority of the ISO range, only losing ground at ISO 12k through 51k.

Another example is the NEX7 vs NEX6. The NEX7 (a camera about a year older with 8mp more) tested dead even with the 6 through ISO 3200, and then actually took the lead with higher ISO.

The problem is people want to judge noise based on pixels, when that's not fair to the denser sensor. Of course 100 pixels from the NEX7 will be noisier than 100 pixels from a NEX6, because the former are represented by fewer photons, since that portion of the frame comes from a smaller portion of the sensor. DXO, for example, doesn't test like that; they normalize for sensor area, which is why their tests prove pixel counts do not matter.
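The arithmetic behind that normalization is easy to sketch (shot noise only, read noise deliberately ignored; the one-big-vs-four-small pixel split is assumed purely for illustration):

```python
import numpy as np

photons = 40_000                 # photons falling on one patch of the sensor

# One big pixel vs. four small pixels covering the same patch.
big   = photons
small = photons / 4

print(big / np.sqrt(big))        # 200.0  per-pixel SNR, big pixel
print(small / np.sqrt(small))    # 100.0  per-pixel SNR, small pixel (worse)

# Sum the four small pixels back over the same area: all photons recovered,
# so the area-normalized SNR matches the big pixel exactly.
total = 4 * small
print(total / np.sqrt(total))    # 200.0
```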

Is there slightly more read noise with more pixels? Technically yes, but it obviously makes no noticeable difference. Or do you think manufacturers purposely sabotage lower pixel count sensors to give denser sensors a phony advantage?
 
Trust me, everybody in this thread would prefer to NOT have moire and false color, they just dislike fuzzy images even more (not to mention moire is relatively rare in the average photo).
Speak for yourself. Cameras that lack AA filters produce artifacts and nastiness that I dislike far, far, far more than slightly (barely) softer shots, which I can sharpen up satisfactorily anyway.

No AA filter = no sale for me.
 
Trust me, everybody in this thread would prefer to NOT have moire and false color, they just dislike fuzzy images even more (not to mention moire is relatively rare in the average photo).
Speak for yourself. Cameras that lack AA filters produce artifacts and nastiness that I dislike far, far, far more than slightly (barely) softer shots, which I can sharpen up satisfactorily anyway.

No AA filter = no sale for me.
Yes, I was speaking for those who prefer to buy cameras without AA filters. You are obviously not in that camp. I mean, ask around, it's not like I'm making it up. None of US want moire or false color, but we are willing to risk that rather rare occurrence in order to get sharper images. I owned a K5IIs, shot over 20k frames with it, and I can say from experience that moire and false color were not that common. And this is coming from a pixel peeper.

--
"Run to the light, Carol Anne. Run as fast as you can!"
 
Trust me, everybody in this thread would prefer to NOT have moire and false color, they just dislike fuzzy images even more (not to mention moire is relatively rare in the average photo).
Speak for yourself. Cameras that lack AA filters produce artifacts and nastiness that I dislike far, far, far more than slightly (barely) softer shots, which I can sharpen up satisfactorily anyway.

No AA filter = no sale for me.
Yes, I was speaking for those who prefer to buy cameras without AA filters. You are obviously not in that camp. I mean, ask around, it's not like I'm making it up. None of US want moire or false color, but we are willing to risk that rather rare occurrence in order to get sharper images.
It's not rare. It's nearly every shot.
I owned a K5IIs, shot over 20k frames with it, and I can say from experience that moire and false color were not that common. And this is coming from a pixel peeper.
You just aren't sensitive to this particular type of artifact. Some aren't. CA doesn't particularly bother me. But aliasing artifacts do.
 
Awesome service to the thread, good sir. I think it's pretty clear that while the difference isn't world changing, the "non AA" detail is noticeably clearer, especially on the green dots in the lower right side. I'm guessing this won't be enough though; the skeptic squad will surely find fault with either your samples or the K3 itself.
I think what it shows is that the whole debate is rather a tempest in a teapot. Make a print of those three files, and I strongly doubt that there will be any visible difference from one to the other.
 
This would be ideal. The brilliant thing about the K3 approach is that by controlling the pattern of motion, Pentax could design whatever OLPF they want. I doubt they did this -- they're probably just vibrating it -- but with a little bit of thought, they could make as close to an ideal OLPF as they want.
I'm sure Nikon just throws random pieces of glass into a plastic tube and calls it a lens, too.
It's not an insult to Pentax. It's just how product development works. The concept of an AA filter from IBIS, which you can turn on and off, is brilliant. "Just vibrating it" would do very well.

In this case, "a little thought" typically means a 6 month to 2 year delay to market.

I mean, it:
  1. Saves money. It's lower cost than making a physical filter.
  2. Is already better than the competing brands, which either include a fixed AA filter or omit it entirely.
It's possible they sat on it until they had a perfect product, but I'm assuming they're not dumb. A rational organization would ship it, since it's already better than the competition, and improve from there in the next version.
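For what it's worth, the "design whatever OLPF they want" idea is easy to sketch: the effective blur kernel of a moving sensor is just a histogram of where it dwelt during the exposure. The circular path below is purely an assumption for illustration:

```python
import numpy as np

# Sample a circular dither path, 0.5 px radius, over the exposure.
theta = np.linspace(0, 2 * np.pi, 10_000)
dx, dy = 0.5 * np.cos(theta), 0.5 * np.sin(theta)

# Dwell-time histogram on a subpixel grid = the blur kernel this motion
# applies. A different path (figure-eight, Lissajous, ...) gives a
# different kernel, which is what makes the approach programmable.
kernel, _, _ = np.histogram2d(dx, dy, bins=9, range=[[-1, 1], [-1, 1]])
print(np.round(kernel / kernel.sum(), 3))
```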
 
Ontario Gone wrote:
Well, I regularly use a 1 pixel radius or smaller when sharpening,
sometimes as small as 0.2 if I'm processing inside PS with good IQ.
Of course. A Gaussian blur always has infinite radius. Sharpening is typically designed to deconvolve a Gaussian blur, simply because it is the most common type, and it too has a large radius. The 'radius' is where the level of blur falls off below some threshold (probably one std. dev. of the Gaussian, but I actually don't know). It has an effect outside of that radius.

Let me repeat this again: An AA filter does not perform a Gaussian blur. A Gaussian blur, and its inverse, affects the whole image. The effect of an AA filter is localized. It's fundamentally different. Playing around with Gaussians, or the inverse with the sharpen tool, will not help you understand AA filters.
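The difference in footprint is easy to see numerically (the 2x2 kernel below follows the four-dot description quoted earlier; the Gaussian's sigma is arbitrary):

```python
import numpy as np

# Four-dot OLPF: equal weights on a 2x2 footprint, exactly zero elsewhere.
olpf = np.full((2, 2), 0.25)

# 1-D Gaussian: weight falls off but never reaches zero at any radius.
r = np.arange(-5, 6)
gauss = np.exp(-r**2 / 2.0)
gauss /= gauss.sum()

print(gauss[5])   # ~0.40 at the center
print(gauss[0])   # ~1.5e-06 five pixels out: tiny, but still nonzero
```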

I gave you exact instructions for replicating an AA filter. Why not follow those?
All this is on top of the fact that removing AA won't increase noise
It actually will. If there is any texture beyond Nyquist, which there almost always is, you will have noise, especially chroma noise, aliased in. It's typically not a big increase, but depending on what you're shooting, it could be. Whether or not it matters is a different story.
so as long as there is no moire, there is no downside to removing the LPF.
Moire is just the most visible type of aliasing. There are many artifacts it can introduce. False texture. False detail. Noise. Etc.
There is nothing mystical about those lenses.
There is something pretty amazing about them, actually. As you increase aperture, optical aberrations go up. As you add zoom, optical aberrations go up. The ability to make sharp zooms with apertures like f/1.8 or f/1.8-2.8 is actually pretty amazing, at least relative to where optics was a decade ago. And we've now seen them from Sony, Canon, Panasonic, and Sigma.
 
Hasselblad, Leica, Pentax, Nikon, Sony, Olympus, Fuji, Phase One, all wrong. Some guy on the internet correct?
Yes, about natural or realistic detail.

No, about sales potential.

Omitting an AA filter does not come without a cost until pixel densities get much higher than they are now.
 
Ontario Gone wrote:
Well, I regularly use a 1 pixel radius or smaller when sharpening,
sometimes as small as 0.2 if I'm processing inside PS with good IQ.
Of course. A Gaussian blur always has infinite radius. Sharpening is typically designed to deconvolve a Gaussian blur, simply because it is the most common type, and it too has a large radius. The 'radius' is where the level of blur falls off below some threshold (probably one std. dev. of the Gaussian, but I actually don't know). It has an effect outside of that radius.
Exactly, it's the threshold at which the sharpening effect stops working. This is the exact opposite of an LPF, which has a threshold at which it stops "softening" based on wavelength. For the record, I use smart sharpen in PS, which has a Gaussian blur correction as well as a lens correction (I always use lens, it works better).

So at least the sharpening I do in PS, which is typically at a 0.2 pixel radius, is not the same as Gaussian reduction. I don't use Gaussian, I'm pickier than that; I want more control than that. Leaving out the LPF means that the tiny portion of the sharpest wavelengths is what will be changed, only that won't increase noise the way PP sharpening does.

Let me repeat this again: An AA filter does not perform a Gaussian blur. A Gaussian blur, and its inverse, affects the whole image. The effect of an AA filter is localized. It's fundamentally different. Playing around with Gaussians, or the inverse with the sharpen tool, will not help you understand AA filters.
Just for fun I will repeat. I don't use Gaussian deblur, I use Lens smart sharpen with a 0.2 pixel radius, which does not affect the entire image either. Yet, even when viewed at full size, I can clearly see a change in the sharpest areas of the shot. If it's the eyes of a portrait, at full screen, I can see a difference when I toggle it back and forth.

so as long as there is no moire, there is no downside to removing the LPF.
Moire is just the most visible type of aliasing. There are many artifacts it can introduce. False texture. False detail. Noise. Etc.
"Can" is the key word. But let's not forget, I'm not in support of aliasing artifacts; if I had my way, they would all be gone. I just happen to prefer sharpness even if it means occasional moire. In the end, I will be happy because I will have both aliasing prevention and sharp images.
 
I think this argument can go round and round, because it's not really about AA filters and signal theory.
Of course it's about signal theory. That defines how digital cameras work.
I don't like playing semantics.
Neither do I.
It's about standards (as much of what is on DPR is about). Nobody is saying AA filters are not needed,
Some people do.
No, if they are educated in the matter they realize there is no getting away from AA filters, eventually.
Agreed, but not everyone is educated enough in the matter to understand why it's needed. Just look at Sigma users for a prime example, and not just for sampling theory either.
Trust me, everybody in this thread would prefer to NOT have moire and false color, they just dislike fuzzy images even more (not to mention moire is relatively rare in the average photo). Eventually people like me will get our way, we will get images that are as sharp or sharper than what a D7100 or K5IIs offers, without moire, since pixel frequency will continue to increase and eventually surpass that of diffraction at any usable aperture.
If you're getting fuzzy images, you're doing something wrong.
So I must disagree, none of us want to go without AA filters, we just happen to dislike softness more than we dislike the occasional false color. I would rather push resolving power to the natural limits ASAP.
What you're saying is you'd make different tradeoffs than the camera designers. That's fine, but realize that comes at a price.

The solution is not to skip the AA filter, but to raise the sampling rate. We're starting to get to that point, but not quite there yet.
we are saying that the current resolution ceiling imposed by an artificial LPF is too low. We would rather deal with occasional moire than deal with less resolution all the time.
It's not the AA filter that's limiting it. It's the sensor.
Again semantics. That's like saying it's pixel size that determines pixel level diffraction, instead of aperture. It's both, depending on your POV. They both matter. Yes, as I mentioned many times this thread, once sensors are dense enough, the problem solves itself.
It's not semantics at all. The sensor has a limit to what it can resolve and the AA filter bandlimits above it because to not do that would cause aliasing. At some point, that limit will be high enough so that the bandlimiting happens elsewhere.
All the AA filter does is band-limit what is beyond the ability of the sensor to properly resolve and would otherwise cause alias artifacts. The problem is that after sampling, you can't remove aliasing because it's indistinguishable from real data.
Yes. Which is why we wish we could have our cake and eat it too. Eventually, we will.
Hopefully.
The solution is a higher resolution sensor, or oversampling. The tradeoff is that smaller pixels have more noise and there is a lot more data to move and write to a memory card. Those problems are solvable, and eventually will be.
More pixels do NOT have more noise; this has been discussed to death. Look at any comparison, A77 vs A58, for example. Even though the sensor in the A58 has fewer pixels and is a newer sensor, the two score nearly identically throughout the entire ISO range (with the A77 slightly ahead at base ISO).
More pixels most certainly do have more noise. It's basic physics and there's no getting around that.

What people end up comparing is not just pixel size but different sensor technologies. Smaller pixels in a lower noise sensor may have less noise than bigger pixels in a higher noise sensor because of a different generation of sensor. There is also the rest of the electronics which adds its own noise too (although not typically significant).
 
Hasselblad, Leica, Pentax, Nikon, Sony, Olympus, Fuji, Phase One, all wrong. Some guy on the internet correct?
Yes, about natural or realistic detail.
The vast majority of photographers, professional or otherwise, disagree with you.
No, about sales potential.

Omitting an AA filter does not come without a cost until pixel densities get much higher than they are now.

--
John
http://www.pbase.com/image/55384958.jpg
 
