Why do we have aliasing, and does it really matter?

Erik Kaffehr

Most MFD cameras will alias images, but it seems that aliasing matters little to many photographers. Here I explain how it works.

Let's start with an example:

This was not shot on medium format; it was shot with a Voigtlander 65/2 APO Lanthar on a Sony A7rIV. The image on the left has a lot of color aliasing while the image on the right has very little. The images should be viewed at actual pixels; see the link below.

The difference between the two images is the aperture used. The lens used here is sharpest at f/2.8 or f/4. Shooting at f/11 we see significant diffraction that acts as an OLP (Optical Low Pass) filter.
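A quick way to see why f/11 acts like a low-pass filter is to compare the ideal-lens diffraction MTF at the sensor's Nyquist frequency for the two apertures. This is a back-of-envelope sketch using the standard diffraction-MTF formula for a circular aperture; the 550 nm wavelength and 3.76 micron pitch are my own illustrative assumptions, not figures from the post:

```python
import math

wavelength_mm = 550e-6    # green light, 550 nm expressed in mm
pixel_pitch_mm = 3.76e-3  # roughly an A7rIV-class pixel pitch
nyquist = 1.0 / (2.0 * pixel_pitch_mm)  # cycles/mm the sensor can sample

def diffraction_mtf(freq, f_number):
    """MTF of an ideal (aberration-free) lens with a circular aperture."""
    cutoff = 1.0 / (wavelength_mm * f_number)  # diffraction cutoff, cycles/mm
    s = freq / cutoff
    if s >= 1.0:
        return 0.0
    return (2.0 / math.pi) * (math.acos(s) - s * math.sqrt(1.0 - s * s))

for N in (4, 11):
    print(f"f/{N}: MTF at Nyquist ({nyquist:.0f} cy/mm) = "
          f"{diffraction_mtf(nyquist, N):.2f}")
```

With these numbers the ideal-lens MTF at Nyquist comes out around 0.63 at f/4 but only about 0.10 at f/11, so detail near Nyquist is strongly attenuated by diffraction before it can alias.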

To best understand the difference, we can look at MTF plots for both.

The blue line is the MTF at f/4 and the red line is the MTF at f/11.

Now, note the vertical line at 3168 cy/PH. That line is the Nyquist limit. Any signal above Nyquist cannot be correctly resolved. According to sampling theory, those signals are folded back into low-frequency artifacts, also called aliases.
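The folding is easy to demonstrate numerically: a sinusoid above Nyquist produces exactly the same samples as its folded-back low-frequency alias. A minimal pure-Python sketch (the sample rate and frequencies are my own illustrative numbers):

```python
import math

fs = 100.0              # samples per unit length
f_signal = 70.0         # above Nyquist (fs/2 = 50)
f_alias = fs - f_signal # folds back to 30 cycles

# Sample both frequencies at the same sample positions.
hi = [math.cos(2 * math.pi * f_signal * n / fs) for n in range(100)]
lo = [math.cos(2 * math.pi * f_alias * n / fs) for n in range(100)]

# The two sampled waveforms are indistinguishable:
print(all(abs(a - b) < 1e-9 for a, b in zip(hi, lo)))  # True
```

Once sampled, no processing can tell the 70-cycle detail from the 30-cycle alias; the information is gone.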

Something like this: note that there is a significant area under the folded-back blue curve, but not much under the red one.

Now, most digital cameras use a color filter array in front of the sensor to detect color information.
  • Most cameras use the 'Bayer' pattern.
  • Some Fujifilm cameras use a pattern called X-Trans.
  • Foveon cameras don't use a color filter array at all; they use a stacked sensor design and derive color information from the way light absorption in the layers varies with wavelength.
Bayer sensors have an 'R', 'G', 'B', 'G2' pattern of filters, each 'supercell' having a red, a blue, and two green pixels.

This shows the G and G2 channels. Note the aliases around 130 lp/mm, and note that the G1 and G2 aliases are pretty nearly inverted.

It is more visible here.

What we can see here is that undersampled detail is turned into aliases, and these aliases show up differently at different pixel positions.

If we include red and blue pixels, we get a color pattern, known as moiré.
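The inverted G1/G2 aliases can be sketched in one dimension. The two green channels sample the same row pattern, but offset by one pixel; for detail at the pixel-level Nyquist frequency (0.5 cycles/pixel) the two sets of samples come out with opposite sign. A toy illustration (positions and frequency are my own choices):

```python
import math

f = 0.5                    # detail at the pixel-level Nyquist: 0.5 cycles/pixel
g1_cols = range(0, 16, 2)  # G1 sites: every other pixel, even columns
g2_cols = range(1, 16, 2)  # G2 sites: every other pixel, offset by one

g1 = [math.cos(2 * math.pi * f * x) for x in g1_cols]
g2 = [math.cos(2 * math.pi * f * x) for x in g2_cols]

print(all(abs(v - 1.0) < 1e-9 for v in g1))  # True: G1 sees a flat bright field
print(all(abs(v + 1.0) < 1e-9 for v in g2))  # True: G2 sees a flat dark field
```

Each green channel reports a flat field, but with opposite brightness: an inverted pair of aliases. Doing the same with the red and blue sites, which sit at yet other offsets, gives aliases of yet other values, and the demosaicker turns those disagreements into the colored moiré patterns shown below.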

Now, how does this show in an image?

Image shot on a Phase One P45+ (6.8 micron pixel pitch, 55% area fill factor).

We have extensive aliasing artifacts on the reeds.


The images here were shot on:

A) Sony A7rII (4.5 micron pixels) with a sharp lens

B) Sony A7rII (4.5 micron pixels) with a good quality standard zoom

C) Sony A7II (24 MP, 6 micron pixels, with an OLP filter) with the sharp lens

D) Sony A7rIV (61 MP, 3.8 micron pixels, without an OLP filter)

E) Phase One P45+ (39 MP, 6.8 micron pixels, low fill factor)

All viewers placed D first and E last. All viewers preferred B to A.

Why do I often see aliasing artifacts in my images?
  • I often use good lenses with medium apertures.
  • I try to focus accurately.
  • And I almost always use a tripod.
  • Many of my subjects have areas with fine detail.
Best regards

Erik

--
Erik Kaffehr
Website: http://echophoto.dnsalias.net
Magic tends to disappear in controlled experiments…
Gallery: http://echophoto.smugmug.com
Articles: http://echophoto.dnsalias.net/ekr/index.php/photoarticles
 
Great examples, thanks.

In the first example, stopping down to f/11 looks like an almost perfect low-pass filter. The wider-aperture version barely resolves any more detail before aliasing kicks in. The softer edges look like they'd respond well to sharpening.
 
Another series of images with a similar conclusion:

https://blog.kasson.com/gfx-100/a-visual-look-at-gfx-100-diffraction-blur/

However, there are better AA filters than diffraction.
 
However, there are better AA filters than diffraction.
In your opinion, are the standard birefringence filters ideal?
I think you need a real frequency (j*omega) axis zero like you get with those filters, but I think most AA filters are insufficiently aggressive for a Bayer-CFA sensor.
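To illustrate the "real frequency axis zero" point: a two-spot birefringent filter averages two copies of the image displaced by a distance d, so its MTF is |cos(pi*f*d)|, which has an exact null at f = 1/(2d). A small sketch with my own illustrative numbers, taking d equal to one pixel pitch so the null lands on Nyquist:

```python
import math

pitch = 1.0             # pixel pitch (arbitrary units)
d = pitch               # classic 2-spot design: displacement of one pixel pitch
nyquist = 0.5 / pitch   # cycles per pixel

def two_spot_mtf(f, d):
    """MTF of averaging two image copies displaced by d: |cos(pi * f * d)|."""
    return abs(math.cos(math.pi * f * d))

print(round(two_spot_mtf(nyquist, d), 6))      # 0.0: exact null at Nyquist
print(round(two_spot_mtf(nyquist / 2, d), 6))  # 0.707107 at half Nyquist
```

No amount of stopping down gives you that hard null; diffraction only rolls the MTF off gradually.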
 
https://blog.kasson.com/gfx-100/a-visual-look-at-gfx-100-diffraction-blur/

However, there are better AA filters than diffraction.
Yeah I remember that. But for people who are fine with stopping down when they feel they need to DoF for an image, if it works it works :)
 
However, there are better AA filters than diffraction.
What about the Pentax sensor shaker - how well does that work?
 
What about the Pentax sensor shaker - how well does that work?
Never used it. No zeros, though.

--
https://blog.kasson.com
 
 
So... does it ?
 
So... does it ?
I think Erik is providing you with information so you can make up your own mind.

 
 
This was the first shot I took with my Hasselblad V/P45+ combo at an industry museum; it shows significant color aliasing in several areas.



The next subject I shot was focused on the engine in front. Here we have classic moiré on the coil spring.



The same composition, now focused on the engine at center. The exhaust shows some odd color that I don't think exists in the real world. I think we have pixel-sized specular reflections where the demosaic algorithm cannot interpolate correct color.

These examples are from the P45+ back, which combines large pixels with an undersized pixel aperture.

I have also shot with my Sony A7rIV at the same location and it can also have aliasing effects.

Best regards

Erik

 
So... does it ?
Hi,

I think it matters.

On the other hand, in many cases it is not very obvious and many images don't have detail that is prone to aliasing. So, it is subject dependent.

Striving for optimal sharpness also takes us into aliasing territory.

Best regards

Erik

 
Hi Erik, thank you for these clarifications and your technical explanations. That's what I understood too (the first time I've understood, more or less, what aliasing is!).

Best,

Rachel
 
So even with the best combo you can still run into problems. So please don't blame the GFX 50S all the time. At least for moiré, I think angles play some role in how the reflected light is captured. Could be wrong.

 
Good lesson Erik, and you've done it many times before. Appreciate it.

Do you remember when I first bought the original GFX 100 back the month it came out? I went up on my condo roof and set up a tripod. I had been shooting the 50r since it came out so had both cameras.

Everybody was begging me for some simple comparison shots with the 50 and 100, so I shot the same scene without shifting the tripod and used the same lens on both cameras and fired away at a busy city scape at F5.6 through 16. The scene had lots of tall buildings, streets, cars and people far away - lots of detail and lots of vertical and horizontal lines. I posted tons of comparison shots with full-size high-quality jpegs shot at the exact same settings at the exact same scene.

The GFX 100 shots were more detailed and of course had better image fidelity at full res, but the 50r shots looked really good in comparison and were better than any FF shots could be.

The problem with the 50r shots both you and Jim immediately pointed out. There was aliasing all along the distant window lines and some of the angular edges.

The 100 shots? No visible aliasing.

I haven't seen any since with the 100, 100s or 100II and I pixel peep on great monitors tens of thousands of shots. I just don't see it but may be missing something.

My 50r has been converted to IR and I then convert to B&W, so I don't see it with the 50r anymore.

So, to conclude, I couldn't care less about aliasing, but I'm glad you guys taught me what it is.

Why do your Sony cameras have so much aliasing? It doesn't bother me, and I'm glad Fuji GFX 100 cameras made it where people don't (or at least I don't) care about aliasing anymore.

Thanks.
 
Every camera that is shot well with a lens that out-resolves the sensor will exhibit some form of aliasing if there is detail finer than the sensor can sample. It's not always easy to spot, but it will be there. The only ways to stop it are to increase the sensor sampling rate or reduce the resolution of the optical image. Obviously the 100 MP sensor has more resolution, so it will show aliasing less often. The 50r also has the shrunken pixel aperture, which encourages aliasing.

Erik is using some pretty amazing lenses - isn't that Voigtlander measured as one of the sharpest ever lenses?

F/11 seems to stop it by reducing sharpness and detail.

--
Photo of the day: https://whisperingcat.co.uk/wp/photo-of-the-day/
Website: http://www.whisperingcat.co.uk/ (2022 - website rebuilt, updated and back in action)
DPReview gallery: https://www.dpreview.com/galleries/0286305481
Flickr: http://www.flickr.com/photos/davidmillier/ (very old!)
 
Hi Greg,

My take has always been that the GFX system begged for the 102 MP sensor.

I am pretty sure that Fujifilm had Sony's sensor roadmap, including the 102 MP sensor, when embarking on the GFX line. The lens line was designed for 100 MP and up, I am pretty sure.

Ming Thein was working for Hasselblad for a while and he wrote some articles on his blog that Hasselblad also designed their system for the 102 MP sensor.

The GFX 50 had undersized microlenses, and that contributed to aliasing on the GFX 50. My guess is that it was not really a GFX 50 feature, more a feature of the sensor. The X1D images showed similar aliasing to the GFX 50S in DPReview's test images, or even worse.

The GFX 100 can also have aliasing; Jim's testing shows that. But it obviously has much less of that issue than the GFX 50 models. Objectively speaking, that is a good thing.

My A7rIV can show aliasing, as demonstrated in my testing. The main reason I used A7rIV for this demo was that it was easy to find a good series of test data in my archives.

Why the DPReview studio image for the GFX 100 doesn't show aliasing may have a couple of explanations.
  • It may be that the sensor outresolves the subject.
  • It could be that DPReview didn't achieve perfect focus.
  • Or it could be camera vibration, or a non-flat plane of focus.
  • Or something else.
I think DPReview does us a good service by sharing studio test images. But it is not necessarily the case that their technique is always perfect. That said, they put some extra effort into testing the GFX 100.

I may mention that I got 20 raw images from Rishi Sanyal from that test, which I ran through my MTF evaluation, and they had little variation.

The best approach in my view is to use a high resolution sensor combined with a well designed OLP filter in front of it. Sony doesn't have one on the A7r models, while Canon has improved their design on the EOS R5.

The way I see it, better sensors call for better lenses, and those better lenses in turn need better sensors.

Some guys use adapted lenses from the film era and some guys use some lenses designed for line scanners for getting optimal quality for macro work.

There are also the guys and possibly dolls buying the Rodenstock 23 mm at over 10k$US. The race for performance never ends.

Best regards

Erik
 
Disagree for the 100. I know you are shooting a 50, which is great!

For me, it ain't there if I don't see it on my Dell 6K monitor. Now, I'm talking about me. You guys go ahead and worry about what you feel you gotta worry about and shoot how you gotta shoot. It is good to know what aliasing is.

But seriously, my friends, all you guys who have never in your life fired a shot from a GFX 100 are harping on something that doesn't impact anything or anyone with a GFX 100 camera. The GFX 50 has some aliasing that you might notice or care about.

My GFX 100s is off for conversion to IR, so I'm about to be finished with the wonderful GFX 50 line, as Fuji is.

It is an interesting technical discussion that has no impact on how anyone shoots with a GFX 100 series camera.

--
Greg Johnson, San Antonio, Texas
https://www.flickr.com/photos/139148982@N02/albums
 