Somewhat surprised that aliasing is almost never discussed

Erik Kaffehr

Hi,

I was shooting a lot of medium format a few years ago and aliasing was often a problem.

For the last few months I have been shooting primes on my Sony A7rII, and now a Sony A7rIV, and with primes aliasing is a daily problem.

My understanding is that sharp lenses, large pixels, undersize pixel apertures and no OLP filtering yield aliased images.



[image]

But, I seldom see aliasing discussed. So, it seems that few photographers are affected.

That makes me curious...

Best regards

Erik

--
Erik Kaffehr
Website: http://echophoto.dnsalias.net
Magic tends to disappear in controlled experiments…
Gallery: http://echophoto.smugmug.com
Articles: http://echophoto.dnsalias.net/ekr/index.php/photoarticles
 
Your images clearly show colored aliasing, no question.

Maybe, many photographers solve the issue by ignoring it.

With cameras without AA filters, we may control aliasing by stopping down. Diffraction blur contributes a factor to the system Spatial Frequency Response (SFR) with a strict cutoff frequency: f_c = 1/(lambda * N), where lambda is the wavelength and N the f-number. So by choosing the right stopped-down aperture, you can completely eliminate aliasing from detail above the Nyquist frequency without introducing unnecessary degradation of the system MTF.

For the green Bayer channel, the horizontal or vertical Nyquist frequency is at (2 * pixel_pitch)^-1, i.e. 125 c/mm for 4 µm pixels.

For red and blue it is half as much, but blue requires more stopping down than red, since the required f-number scales as 1/lambda: for 4 µm pixels and 450 nm blue light, aliasing is finally gone at about f/32. With some luck it may become non-disturbing with much less blur.

For the 45-degree direction, the Nyquist frequency becomes the same for all three Bayer channels: (2 * sqrt(2) * pixel_pitch)^-1, roughly 90 c/mm for 4 µm pixels.

The sensor Nyquist frequency goes up for other directions.
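As a sanity check on the numbers above, here is a minimal Python sketch of the rule N >= 1/(lambda * f_Nyquist). The 4 µm pitch and 450 nm blue light come from the text; the 530 nm green wavelength is my illustrative assumption.

```python
# Smallest f-number N that pushes the diffraction cutoff f_c = 1/(lambda*N)
# down to a channel's Nyquist frequency, i.e. N = 1/(lambda * f_nyq).

pitch_mm = 0.004                      # 4 µm pixel pitch, in mm

# Horizontal/vertical Nyquist frequencies (cycles/mm) per Bayer channel
f_nyq_green = 1 / (2 * pitch_mm)      # 125 c/mm
f_nyq_red_blue = f_nyq_green / 2      # 62.5 c/mm

def alias_free_fnumber(wavelength_mm, f_nyq):
    """f-number at which the diffraction cutoff just reaches f_nyq."""
    return 1 / (wavelength_mm * f_nyq)

# Blue light on the sparser red/blue grid is the binding constraint:
n_blue = alias_free_fnumber(450e-6, f_nyq_red_blue)   # ~35.6, i.e. about f/32
n_green = alias_free_fnumber(530e-6, f_nyq_green)     # ~15.1, about f/16
print(f"blue: f/{n_blue:.1f}, green: f/{n_green:.1f}")
```

So the green channel alone would be alias-free around f/16; it is the blue channel on the coarser red/blue grid that forces roughly f/32, matching the figure above.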
 
It is a real problem, but there is not much one can do about it. The E 16-55mm f/2.8, which I wanted to use on the a7r3 because it is small, light, and sharp, produces so much aliasing that I stopped using it on that camera. At normal viewing distances and magnifications it's okay, but it is there.
 
Your images clearly show colored aliasing, no question.

Maybe, many photographers solve the issue by ignoring it.

With cameras without AA filters, we may control aliasing by stopping down. Diffraction blur contributes a factor to the system Spatial Frequency Response (SFR) with a strict cutoff frequency: f_c = 1/(lambda * N), where lambda is the wavelength and N the f-number. So by choosing the right stopped-down aperture, you can completely eliminate aliasing from detail above the Nyquist frequency without introducing unnecessary degradation of the system MTF.

For the green Bayer channel, the horizontal or vertical Nyquist frequency is at (2 * pixel_pitch)^-1, i.e. 125 c/mm for 4 µm pixels.

For red and blue it is half as much, but blue requires more stopping down than red, since the required f-number scales as 1/lambda: for 4 µm pixels and 450 nm blue light, aliasing is finally gone at about f/32. With some luck it may become non-disturbing with much less blur.

For the 45-degree direction, the Nyquist frequency becomes the same for all three Bayer channels: (2 * sqrt(2) * pixel_pitch)^-1, roughly 90 c/mm for 4 µm pixels.

The sensor Nyquist frequency goes up for other directions.
Actually, the G Nyquist region is a square rotated by 45 degrees, while the R and B ones are horizontally/vertically oriented squares that can be inscribed in the G "rhombus"; so this would revise the figures you mention above.

In addition to that, the convolution by a pixel (or whatever the light-sensitive area is) introduces attenuation away from low frequencies, which has the same orientation for each of the three channels.

As far as the OP's question goes: aliasing has been discussed a lot in this forum; in the other forums, not so much, where it is usually replaced by discussions about moiré.
 
Your images clearly show colored aliasing, no question.

Maybe, many photographers solve the issue by ignoring it.
I would guess that once you have seen it, it is difficult to un-see it.
With cameras without AA filters, we may control aliasing by stopping down. Diffraction blur contributes a factor to the system Spatial Frequency Response (SFR) with a strict cutoff frequency: f_c = 1/(lambda * N), where lambda is the wavelength and N the f-number. So by choosing the right stopped-down aperture, you can completely eliminate aliasing from detail above the Nyquist frequency without introducing unnecessary degradation of the system MTF.

For the green Bayer channel, the horizontal or vertical Nyquist frequency is at (2 * pixel_pitch)^-1, i.e. 125 c/mm for 4 µm pixels.

For red and blue it is half as much, but blue requires more stopping down than red, since the required f-number scales as 1/lambda: for 4 µm pixels and 450 nm blue light, aliasing is finally gone at about f/32. With some luck it may become non-disturbing with much less blur.

For the 45-degree direction, the Nyquist frequency becomes the same for all three Bayer channels: (2 * sqrt(2) * pixel_pitch)^-1, roughly 90 c/mm for 4 µm pixels.

The sensor Nyquist frequency goes up for other directions.
Thanks for the detailed analysis!

Best regards

Erik
 
My understanding is that sharp lenses, large pixels, undersize pixel apertures and no OLP filtering yield aliased images.
Also downsampling images causes aliasing in monochrome detail, but it is mainly the color aliasing that people object to. "Get a monochrome camera" they say, or "get a Foveon camera," since they mistakenly claim that those do not alias. They do, but just not the particularly noxious color aliasing that's due to the Bayer filter.
But, I seldom see aliasing discussed. So, it seems that few photographers are affected.
Sometimes a scientifically-minded photographer brings up aliasing as a problem, and they often end up being criticized by others who want sharpness at any cost, any detail even though it may be false detail. Consider also the use of AI software which actually does generate false detail on purpose, but hopefully plausible-looking detail.

You can maybe guess why: imagine spending a fortune on a camera and lens, and then zooming way in, and then only seeing blurry mush. Sharp detail is far more satisfying, even if it looks a bit rough.
That makes me curious...
I think one explanation is that many photographers really don't have aliasing in any significant amount. Stuff that is significantly out of focus, which will often be the majority of an image, won't alias. Camera shake, misfocus, motion blur, noise, and lousy optics also contribute.

In my experience, I never saw aliasing with my old, small-sensor point-and-shoots, and only saw significant aliasing with my DSLRs, but only then when using good optics and good technique. It was my tripod shots, well-exposed, base ISO, good focus, and deep depth of field that really showed aliasing, and even then only typically with architectural subjects with regular patterns and clear, high-contrast edges.
 
My understanding is that sharp lenses, large pixels, undersize pixel apertures and no OLP filtering yield aliased images.
Also downsampling images causes aliasing in monochrome detail, but it is mainly the color aliasing that people object to. "Get a monochrome camera" they say, or "get a Foveon camera," since they mistakenly claim that those do not alias. They do, but just not the particularly noxious color aliasing that's due to the Bayer filter.
The art of downsizing includes blurring the image before resampling...
But, I seldom see aliasing discussed. So, it seems that few photographers are affected.
Sometimes a scientifically-minded photographer brings up aliasing as a problem, and they often end up being criticized by others who want sharpness at any cost, any detail even though it may be false detail. Consider also the use of AI software which actually does generate false detail on purpose, but hopefully plausible-looking detail.

You can maybe guess why: imagine spending a fortune on a camera and lens, and then zooming way in, and then only seeing blurry mush. Sharp detail is far more satisfying, even if it looks a bit rough.
That makes me curious...
I think one explanation is that many photographers really don't have aliasing in any significant amount. Stuff that is significantly out of focus, which will often be the majority of an image, won't alias. Camera shake, misfocus, motion blur, noise, and lousy optics also contribute.
That thought has occurred to me. I mostly shoot on tripod with IS hopefully disabled and try to focus with magnified LV at shooting aperture. On MFD I always focus using a 3X magnifier, but not stopped down.
In my experience, I never saw aliasing with my old, small-sensor point-and-shoots, and only saw significant aliasing with my DSLRs, but only then when using good optics and good technique. It was my tripod shots, well-exposed, base ISO, good focus, and deep depth of field that really showed aliasing, and even then only typically with architectural subjects with regular patterns and clear, high-contrast edges.
I see aliasing on a lot of MFD shots. With my Canon mount zooms on the A7rII it was rare. Switching to 'good' primes on the A7rII it is much more frequent.

The jury is still out on the A7rIV, but I don't think that going from 4.5 micron to 3.8 micron pixels is a lot of relief.

Just to say, thanks for chiming in! I always enjoy your writing!

Best regards

Erik

 
Your images clearly show colored aliasing, no question.

Maybe, many photographers solve the issue by ignoring it.
I would guess that once you have seen it, it is difficult to un-see it.
Agreed. It almost feels like they don't see it because they don't expect it?

I also didn't expect the issue to be as big as it proved to be with my 24mp full frame camera. Coming from Foveon, where aliasing is just a charming sign of proper focus, the color artifacts from aliasing in a Bayer camera without AA filter really stand out, whether it's in grass or hair or man-made structures.

EDIT: To be honest, if people don't complain about the sharpening halos from Apple and Samsung, then I guess it's optimistic to think they'll see aliasing ;-)
 
Hi,

I was shooting a lot of medium format a few years ago and aliasing was often a problem.

For the last few months I have been shooting primes on my Sony A7rII, and now a Sony A7rIV, and with primes aliasing is a daily problem.

My understanding is that sharp lenses, large pixels, undersize pixel apertures and no OLP filtering yield aliased images.

[image]

But, I seldom see aliasing discussed. So, it seems that few photographers are affected.

That makes me curious...

Best regards

Erik
I agree, I think Bayer pattern colour aliasing is obnoxious.

That is one of the reasons I use a Fuji X-Trans sensor. Not a perfect solution, but better than a weak or no AA filter, and less prone to moiré patterns.

--
"A designer knows he has achieved perfection not when there is nothing left to add, but when there is nothing left to take away." Antoine de Saint-Exupery
 
Your images clearly show colored aliasing, no question.

Maybe, many photographers solve the issue by ignoring it.

With cameras without AA filters, we may control aliasing by stopping down. Diffraction blur contributes a factor to the system Spatial Frequency Response (SFR) with a strict cutoff frequency: f_c = 1/(lambda * N), where lambda is the wavelength and N the f-number. So by choosing the right stopped-down aperture, you can completely eliminate aliasing from detail above the Nyquist frequency without introducing unnecessary degradation of the system MTF.

For the green Bayer channel, the horizontal or vertical Nyquist frequency is at (2 * pixel_pitch)^-1, i.e. 125 c/mm for 4 µm pixels.

For red and blue it is half as much, but blue requires more stopping down than red, since the required f-number scales as 1/lambda: for 4 µm pixels and 450 nm blue light, aliasing is finally gone at about f/32. With some luck it may become non-disturbing with much less blur.

For the 45-degree direction, the Nyquist frequency becomes the same for all three Bayer channels: (2 * sqrt(2) * pixel_pitch)^-1, roughly 90 c/mm for 4 µm pixels.

The sensor Nyquist frequency goes up for other directions.
Actually, the G Nyquist region is a square rotated by 45 degrees, while the R and B ones are horizontally/vertically oriented squares that can be inscribed in the G "rhombus"; so this would revise the figures you mention above.
No, no revision is needed. It is the distance between the straight lines that connect the centers of the pixels that matters.

For Nyquist in the x-direction you look for lines in the y-direction. For R or B, these are spaced 2x the pixel pitch apart. For G, there are also the G2 pixels, displaced one row upwards, which produce y-lines only 1 pixel pitch apart.

For 45-degree lines it comes to the same sqrt(2) * pixel pitch: a displacement along the line for R or B, no displacement for G.
In addition to that, the convolution by a pixel (or whatever the light-sensitive area is) introduces attenuation away from low frequencies, which has the same orientation for each of the three channels.
Indeed, the pixel shape contributes its own SFR factor. For a square pixel it is the famous sinc(...) * sinc(...) factor. This does not produce a strict cutoff against aliasing.

Also, the birefringent beam-splitter AA filters do not produce a strict cutoff for total elimination of moiré.
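The absence of a strict cutoff in the pixel-aperture factor is easy to check numerically. A small Python sketch (the pitch and fill factors are illustrative assumptions):

```python
import math

# MTF of a square pixel aperture of width a at spatial frequency f is
# |sinc(a*f)| with sinc(x) = sin(pi*x)/(pi*x): it decays, but has only
# isolated zeros, so it never provides a strict cutoff against aliasing.

def pixel_aperture_mtf(f_cycles_per_mm, aperture_mm):
    x = aperture_mm * f_cycles_per_mm
    if x == 0:
        return 1.0
    return abs(math.sin(math.pi * x) / (math.pi * x))

pitch = 0.004                 # 4 µm pitch, in mm
nyquist = 1 / (2 * pitch)     # 125 c/mm

# 100% fill factor (aperture = pitch): MTF at Nyquist is 2/pi ~ 0.64,
# so plenty of contrast survives around Nyquist and can alias.
print(pixel_aperture_mtf(nyquist, pitch))        # ~0.637
# A smaller active area (50% linear fill) attenuates even less:
print(pixel_aperture_mtf(nyquist, pitch / 2))    # ~0.900
```

This is also why, as noted elsewhere in the thread, reducing the active area of the pixel tends to increase aliasing.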
 
Your images clearly show colored aliasing, no question.

Maybe, many photographers solve the issue by ignoring it.

With cameras without AA filters, we may control aliasing by stopping down. Diffraction blur contributes a factor to the system Spatial Frequency Response (SFR) with a strict cutoff frequency: f_c = 1/(lambda * N), where lambda is the wavelength and N the f-number. So by choosing the right stopped-down aperture, you can completely eliminate aliasing from detail above the Nyquist frequency without introducing unnecessary degradation of the system MTF.

For the green Bayer channel, the horizontal or vertical Nyquist frequency is at (2 * pixel_pitch)^-1, i.e. 125 c/mm for 4 µm pixels.

For red and blue it is half as much, but blue requires more stopping down than red, since the required f-number scales as 1/lambda: for 4 µm pixels and 450 nm blue light, aliasing is finally gone at about f/32. With some luck it may become non-disturbing with much less blur.

For the 45-degree direction, the Nyquist frequency becomes the same for all three Bayer channels: (2 * sqrt(2) * pixel_pitch)^-1, roughly 90 c/mm for 4 µm pixels.

The sensor Nyquist frequency goes up for other directions.
Actually, the G Nyquist region is a square rotated by 45 degrees, while the R and B ones are horizontally/vertically oriented squares that can be inscribed in the G "rhombus"; so this would revise the figures you mention above.
NO. No revision is needed. It is the distance between the straight lines, that connect the centers of the pixels, that matter.
I misread what you said above. The "rhombus" treatment leads to the same conclusion: the same diagonal resolution, with G having twice the horizontal and vertical resolution.

The distance between lines is not the primary object. The theory looks for a group of shifts that generates the grid; the optimal sampling configuration is then the dual grid, etc. In this case the G grid is just a tilted square one, so nothing more is really needed.
 
The art of downsizing includes blurring the image before resampling...
Yes, I noticed problems with downsampling soon after I upgraded lenses and technique!

Nicolas Robidoux, who used to be a frequent contributor on dpreview, did a lot of academic research in this area, which proved to be very useful.

For example, here is his discussion of then state-of-the-art image resizing techniques using ImageMagick:

https://legacy.imagemagick.org/Usage/filter/nicolas/
I see aliasing on a lot of MFD shots. With my Canon mount zooms on the A7rII it was rare. Switching to 'good' primes on the A7rII it is much more frequent.

Jury is out on the A7rIV, but I don't think that going from 4.5 microns to 3.8 microns is a lot of relief.
I use a D750 and just obtained an old 24-85 mm kit lens for it; it is sharp enough to alias more often than my 'antique' lens collection, so I've been doing a lot more of the blur-then-downsample technique.

There was recently a discussion of a new patent for a pixel design that incorporates something like a prism, so you get three color channels plus infrared for every pixel: no color aliasing along with higher quantum efficiency, with IR thrown in for free.
Just to say, thanks for chiming in! I always enjoy your writing!
You're welcome, thanks!
 
The art of downsizing includes blurring the image before resampling...
Yes, I noticed problems with downsampling soon after I upgraded lenses and technique!

Nicolas Robidoux, who used to be a frequent contributor on dpreview, did a lot of academic research in this area, which proved to be very useful.

For example, here is his discussion of then state-of-the-art image resizing techniques, using ImageMagick:

https://legacy.imagemagick.org/Usage/filter/nicolas/
I have linked to this site before as well, but one has to be careful: many of his algorithms are for computer graphics, not for properly sampled images.
 
With cameras without AA filters, we may control aliasing by stopping down. Diffraction blur contributes a factor to the system Spatial Frequency Response (SFR) with a strict cutoff frequency: f_c = 1/(lambda * N), where lambda is the wavelength and N the f-number. So by choosing the right stopped-down aperture, you can completely eliminate aliasing from detail above the Nyquist frequency without introducing unnecessary degradation of the system MTF.

For the green Bayer channel, the horizontal or vertical Nyquist frequency is at (2 * pixel_pitch)^-1, i.e. 125 c/mm for 4 µm pixels.

For red and blue it is half as much, but blue requires more stopping down than red, since the required f-number scales as 1/lambda: for 4 µm pixels and 450 nm blue light, aliasing is finally gone at about f/32. With some luck it may become non-disturbing with much less blur.

For the 45-degree direction, the Nyquist frequency becomes the same for all three Bayer channels: (2 * sqrt(2) * pixel_pitch)^-1, roughly 90 c/mm for 4 µm pixels.

The sensor Nyquist frequency goes up for other directions.
Thanks for the detailed analysis!
I had actually not encountered the argument that moiré can be completely eliminated by stopping down in previous forum discussions or articles.

Of course, ~f/32 is not appealing to sharpness enthusiasts.

But one might explore pushing the above thoughts one step further, as an exercise in photographic science: when oversampling the picture, the alias-killing stop applies to the small oversampling pixel. When the subject allows image stitching, you can get by without such a highly oversampling camera.

Example with stitching: the goal is a zero-alias image with f/8 sharpness from 4 µm pixels. Use a 4x4 stitch of images shot at 4x the original focal length on the original sensor at f/32. (Note that the physical aperture diameter of the long lens at f/32 equals that of the original lens at the desired f/8.) By the previous argument, the stitched long-lens image has no aliasing at f/32. The stitched image can then be downsized to the originally wanted size, using a sharp digital filter to avoid aliasing being introduced by the downsizing.

I just checked: Photoshop CS6 does not do such filtering on downsizing.
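As a sketch of why the downsizing filter matters, here is a minimal Python/NumPy comparison (the grating frequency and 4x factor are illustrative assumptions) of naive decimation against averaging each block before resampling:

```python
import numpy as np

# To downsize by an integer factor k: naive decimation just keeps every k-th
# pixel, folding detail above the new Nyquist back in as aliasing. Averaging
# each k x k block first (a simple box prefilter) suppresses most of it.

def decimate_naive(img, k):
    return img[::k, ::k]

def decimate_box(img, k):
    h, w = img.shape
    img = img[: h - h % k, : w - w % k]            # crop to a multiple of k
    return img.reshape(h // k, k, w // k, k).mean(axis=(1, 3))

# A vertical grating near the original Nyquist limit (0.45 cycles/pixel):
x = np.arange(256)
img = np.tile(0.5 + 0.5 * np.cos(np.pi * 0.9 * x), (256, 1))

# Naive 4x decimation turns the grating into a strong low-frequency alias;
# box filtering first leaves only a weak residue.
print(decimate_naive(img, 4).std())   # ~0.35
print(decimate_box(img, 4).std())     # ~0.05
```

A box filter is itself only an approximate anti-alias filter (the same sinc issue as the pixel aperture); real resizers use better prefilters such as windowed-sinc kernels, but the principle of band-limiting before resampling is the same.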

 
My understanding is that sharp lenses, large pixels, undersize pixel apertures and no OLP filtering yield aliased images.
Also downsampling images causes aliasing in monochrome detail, but it is mainly the color aliasing that people object to. "Get a monochrome camera" they say, or "get a Foveon camera," since they mistakenly claim that those do not alias. They do, but just not the particularly noxious color aliasing that's due to the Bayer filter.
But, I seldom see aliasing discussed. So, it seems that few photographers are affected.
Sometimes a scientifically-minded photographer brings up aliasing as a problem, and they often end up being criticized by others who want sharpness at any cost, any detail even though it may be false detail. Consider also the use of AI software which actually does generate false detail on purpose, but hopefully plausible-looking detail.
I got interested in the issue when I realized that:
  • OLP filters are quite complex and come with a cost.
  • Large pixels and no OLP tend to alias.
But it seemed to me that photographers like aliased images as long as the aliasing does not add obvious false color.
You can maybe guess why: imagine spending a fortune on a camera and lens, and then zooming way in, and then only seeing blurry mush. Sharp detail is far more satisfying, even if it looks a bit rough.
Yes, if the sensor resolution matches the resolution of the lens, the result will be soft.
That makes me curious...
I think one explanation is that many photographers really don't have aliasing in any significant amount. Stuff that is significantly out of focus, which will often be the majority of an image, won't alias. Camera shake, misfocus, motion blur, noise, and lousy optics also contribute.
Yes, I think all those factors are in play. I would also think that we often don't look that closely at images.



This was an interesting case: I wanted to check the poster at the center, which had heavy monochrome aliasing.



Closer crop; note that there is also aliasing on the window blinds.

I didn't want that poster in my image, so I was looking at ways to remove it; no easy task.

So, I checked printing it at A2 size, which is my normal print size. At that size it was very visible.

As I recall, that image was shot with my Canon TS-E 24mm f/3.5L II, not the sharpest lens I have ever had. But I also get a similar, though different, aliasing pattern in shots with my Canon 16-35mm f/4L.

I would think that the ideal solution may be smaller pixels, with an OLP filter optimized for that pixel size.

This was a 42 MP image. I would assume that a 100 MP image would have far fewer issues, but it might look pretty soft.

On the other hand, it may be that increasing the pixel count would allow us to use simpler demosaicing algorithms. So I think that going up in resolution may be beneficial.

I would also say that this sample proves that even mediocre lenses combined with high-resolution sensors can yield moiré.
Best regards

Erik

 
I don't know why you think pictures without aliasing will look soft. Generally, higher resolution means maintaining the MTF up to a higher spatial frequency.

I don't find aliasing as such very objectionable. The problem is that the three colour channels are sampled out of phase; this would give colour effects even if there were no aliasing at all.

Here is an image which is only 15 Megapixels. If you view it at 200%, you can see aliasing -- for instance on the whitish blind of the third shop from the right. But there's no colour effect, because the channels are in phase.



[image]

On still subjects, pixel shift can be used to bring the channels into phase -- but still subjects are rare. Or in good light, you can use a Foveon sensor as here.

Don Cox
 
When I shoot legislators and lobbyists, glen plaid, sharkskin, and the weave of certain neckties are a huge problem. But since beautiful silk ties and elegant wool suits have gone out of fashion, it doesn't seem to be an issue anymore.

Could the sensor makers be doing something clever to prevent it?
 
Images like this make me wonder why Sony left out the OLPF on the a7c:

https://www.dpreview.com/forums/post/64526621

At first, one doesn't notice. But looking at the image at 100% it is obvious, and I can't unsee it. The RX1R II has an adjustable OLPF. Does that work? And if it works, why wasn't it part of the a7r3, or any of the 24MP FF cameras?
 
When I shoot legislators and lobbyists, glen plaid, sharkskin, and the weave of certain neckties are a huge problem. But since beautiful silk ties and elegant wool suits have gone out of fashion, it doesn't seem to be an issue anymore.

Could the sensor makers be doing something clever to prevent it?
Hi,

It is about reducing fine-detail contrast at the pixel level; camera makers can do two things:
  • Reduce pixel size
  • Add an OLP filter
A correctly sampled image will look soft at the pixel level.

Reducing the active area of the pixel will also increase aliasing.

Best regards

Erik
 
