Can pixel shift increase resolution?

For me, reduced noise is a more significant benefit than reduced aliasing.
In that case you can also just take multiple captures and combine them later, no need for the hassle of setting up pixel shift mode. There can be other advantages to this approach, see below.
I do not use pixel-shift where it is a hassle (e.g., Fuji, Sony). Combining captures later is a hassle, IMO.

However, it does not need to be a hassle (see smartphones). With the Olympus OM-1, I press a button to turn it on, take the picture, and about 5 seconds later the merged high-res shot is saved to the card.
Handheld pixel shift, aka multiframe superresolution, is a whole other animal, since hand vibration can give you subpixel information.

https://arxiv.org/abs/1905.03277
Yes, this is what piccure or photoacute do, and it can be worthwhile, though computationally expensive. They never seem to have gained real traction.
Thank you for the link to the interesting article.

Could you elaborate on why handheld pixel shift is something completely different? E.g., the Olympus tripod pixel-shift shifts eight times in one-micron increments (source DPR). Therefore, the pixel-shift movements may not be in whole pixel increments.
The short answer is that you are effectively shrinking pixel pitch, though pixel aperture remains unchanged. Smaller pitch = higher effective Nyquist frequency, and potentially higher resolution and lower aliasing depending on scene.
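To put rough numbers on that, here is a quick sketch (the 4 µm pitch and the half-pixel shift are assumed values for illustration, not tied to any particular camera in this thread):

```python
# Effect of sub-pixel shifting on sampling pitch and Nyquist frequency.
# Assumed values for illustration only.
pitch_um = 4.0                                # single-shot sampling pitch, microns
nyquist_single = 1 / (2 * pitch_um / 1000)    # lp/mm = 1 / (2 * pitch in mm)

# A half-pixel shift halves the effective sampling pitch, doubling the
# effective Nyquist frequency; the pixel aperture (and its low-pass
# filtering) is unchanged.
pitch_shifted_um = pitch_um / 2
nyquist_shifted = 1 / (2 * pitch_shifted_um / 1000)

print(f"single shot      : {nyquist_single:.0f} lp/mm")    # 125 lp/mm
print(f"half-pixel shift : {nyquist_shifted:.0f} lp/mm")   # 250 lp/mm
```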

The old Olympus E-M5II had an 8-capture shift mode that also sampled the image in between pixels, effectively halving the sampling pitch. It did not last long though, so I assume there were diminishing returns:
What did you mean by "it did not last long"? At least the M1 Mark III and the Fuji GFX 50S II still shift by less than a full pixel (source: Wiki). The Sony a7rIV also shifts by half a pixel.
https://www.strollswithmydog.com/olympus-e-m5-ii-high-res-40mp-shot-mode/

The slanted edge method does not pick up the additional resolution because it already supersamples the edge itself; the filtering action of pixel aperture, which it does pick up, remains unchanged between single shot and shift. Note how Ken takes me to task on this very issue in the comments; I was just starting to work through the MTF framework at the time, so there are some inaccuracies in the article. The slight improvement shown is indeed due to external factors.
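For anyone curious why the method is blind to the finer sampling grid, here is a minimal sketch of the supersampling step (my own illustration, not the code behind the linked article; the projection through the ROI centre and the 4x binning factor are assumptions):

```python
import numpy as np

def slanted_edge_mtf(roi, normal_angle_deg, bins_per_pixel=4):
    """
    Minimal slanted-edge sketch: because the edge is tilted with respect to
    the pixel grid, pixel centres fall at many different sub-pixel phases
    relative to the edge, so projecting them onto the edge normal and binning
    at a fraction of a pixel yields a super-sampled edge spread function
    regardless of the capture's sampling pitch.
    """
    h, w = roi.shape
    yy, xx = np.mgrid[0:h, 0:w]
    theta = np.deg2rad(normal_angle_deg)
    # Signed distance of each pixel centre from a line through the ROI centre,
    # measured along the edge normal (edge assumed to pass near the centre).
    dist = (xx - w / 2) * np.cos(theta) + (yy - h / 2) * np.sin(theta)

    # Bin the projected distances at 1/bins_per_pixel of a pixel and average.
    idx = np.round(dist * bins_per_pixel).astype(int)
    idx -= idx.min()
    sums = np.bincount(idx.ravel(), weights=roi.ravel().astype(float))
    counts = np.bincount(idx.ravel())
    esf = sums / np.maximum(counts, 1)               # super-sampled ESF

    lsf = np.diff(esf) * np.hanning(esf.size - 1)    # windowed line spread function
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]
    # ESF sample spacing is 1/bins_per_pixel pixels, so frequencies come out
    # directly in cycles/pixel.
    freq = np.fft.rfftfreq(lsf.size, d=1.0 / bins_per_pixel)
    return freq, mtf
```

Because the super-sampling comes from the edge geometry rather than the sensor grid, a pixel-shifted capture analysed this way reports essentially the same curve as a single shot; only the pixel-aperture filtering, which is unchanged, shows up.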

Jack
 
Don't take it personally JACS, I am not trying to mislead anyone, just provide a counterexample to Bob's as mentioned, in good fun. The point being that the advantage provided by pixel shift as implemented (4 images) depends on the scene and the direction in which frequencies are stressed. Other than that I agree with Jim's assessment: biggest practical advantage comes from reduced aliasing and related effects.

I'll provide image info tomorrow to give someone else a chance to play.

Jack
 
Last edited:
The old Olympus E-M5II had an 8-capture shift mode that also sampled the image in between pixels, effectively halving the sampling pitch. It did not last long though, so I assume there were diminishing returns:
What did you mean by "it did not last long"? At least the M1 Mark III and the Fuji GFX 50S II still shift by less than a full pixel (source: Wiki). The Sony a7rIV also shifts by half a pixel.
Huh, I thought everybody had fallen back to 4-image shifts like the example I provided. I stand corrected. Implications of half pixel shifts are discussed in the theory section of the earlier E-M5II link.
 
Last edited:
Jack Hogan said:
Don't take it personally JACS, I am not trying to mislead anyone, just provide a counterexample to Bob's as mentioned, in good fun. The point being that the advantage provided by pixel shift as implemented (4 images) depends on the scene and the direction in which frequencies are stressed. Other than that I agree with Jim's assessment: biggest practical advantage comes from reduced aliasing and related effects.

I'll provide image info tomorrow to give someone else a chance to play.

Jack
I do not take it personally.

I played with the A7IV RAW file (no pixel shift). I chose the WB so that, in this rendering, all RAW channels would have the same mean. Then I just plotted the RAW in B&W, no demosaicing at all. Here is the result:



It is rendered in high contrast on purpose. You can see the wavy aliasing on the top. IMO, it is in the data, and that's it. Patterns on the cloths and on the faces look good, actually better than the demosaiced versions I have seen. So, this would be the honest representation of a B&W image if we knew it was B&W.
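A minimal sketch of that kind of rendering using rawpy (the file name, the grey-world-style equal-mean scaling and the percentile clipping are my assumptions about how to reproduce it, not JACS' actual code):

```python
import numpy as np
import rawpy
import matplotlib.pyplot as plt

# Render a Bayer raw as B&W with no demosaicing: scale each CFA channel so
# all four have the same mean (a crude grey-world white balance), then show
# the mosaic directly as a grayscale image.
with rawpy.imread("studio_scene.ARW") as raw:          # assumed file name
    mosaic = raw.raw_image_visible.astype(np.float64)
    mosaic -= raw.black_level_per_channel[0]           # assume equal black levels
    cfa = raw.raw_colors_visible.copy()                # 0=R, 1=G, 2=B, 3=G2

target = mosaic.mean()
for ch in range(4):
    mask = cfa == ch
    if mask.any():
        mosaic[mask] *= target / mosaic[mask].mean()   # equalise channel means

# Deliberately high-contrast rendering: stretch between percentiles and clip.
lo, hi = np.percentile(mosaic, (0.5, 99.5))
bw = np.clip((mosaic - lo) / (hi - lo), 0, 1)
plt.imshow(bw, cmap="gray")
plt.show()
```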

Now, if you pixel peep the top, you can see a lot of pixelation. This could be due to poor WB but it is not that. There is real color info there. CA?
 
Last edited:
As a fun counterexample, which of these two unsharpened images from raw files captured with the same camera, lens and setup shows higher resolution and less aliasing? One is pixel shift, the other not.

469f11616e9b4f3391c8c3e510f92d45.jpg.png

For those of you who figured it out, how hard did you have to look and what gave it away?

Jack
The clearer delineation of vertical stripes in the right image makes me think it's pixel-shifted, though I'm unsure why this image has a different white balance from the left one.
 
The reduction of aliasing and color artefacts means you improve SNR; this in turn means you can sharpen and manipulate your image more.

From a marketing standpoint, as the image has 4x the pixel resolution, it is sold as a high-resolution shot. This is correct from a pixel standpoint, and it would be too hard to explain the SNR story.

So if you wanted the best image for a landscape shot, this is a good way to achieve it.

Today most cameras do not resolve the Nyquist limit of the sensor anyway due to limitations in the lens
 
Interceptor121 wrote: ...

Today most cameras do not resolve the Nyquist limit of the sensor anyway due to limitations in the lens
Can you elaborate on what you mean by this? Many current sensors do not sport anti-aliasing filters so there is usually still a ton of energy above the monochrome Nyquist frequency.
 
Interceptor121 wrote: ...

Today most cameras do not resolve the Nyquist limit of the sensor anyway due to limitations in the lens
Can you elaborate on what you mean by this? Many current sensors do not sport anti-aliasing filters so there is usually still a ton of energy above the monochrome Nyquist frequency.
The lp/mm that you measure with the lens on is lower than the theoretical limit set by the pixel pitch, because lenses have aberrations and are not perfect.
 
As a fun counterexample, which of these two unsharpened images from raw files captured with the same camera, lens and setup shows higher resolution and less aliasing? One is pixel shift, the other not.

469f11616e9b4f3391c8c3e510f92d45.jpg.png

For those of you who figured it out, how hard did you have to look and what gave it away?

Jack
The clearer delineation of vertical stripes in the right image makes me think it's pixel-shifted, though I'm unsure why this image has a different white balance from the left one.
Good call shutterbugnx. My bad about the slight difference in WB.

The images come from a Pentax K-1 and were featured in a related thread a few years ago. The left one is DPR's studio scene captured in single shot mode and processed similarly to JACS' above. The right one is the result of a 4-image pixel-shifted capture with the same setup. You can find all the info on them by reading through the short thread (they are the two rightmost images there).

https://www.dpreview.com/forums/post/57788120

Nick does a nice job of tearing the single shot apart.

Jack
 
Last edited:
Interceptor121 wrote: ...

Today most cameras do not resolve the Nyquist limit of the sensor anyway due to limitations in the lens
Can you elaborate on what you mean by this? Many current sensors do not sport anti-aliasing filters so there is usually still a ton of energy above the monochrome Nyquist frequency.
The lp/mm that you measure with the lens on is lower than the theoretical limit set by the pixel pitch, because lenses have aberrations and are not perfect.
What do you make of this then?

9b09bb25fa0241d79b559dcb87c5740a.jpg.png

I guess it depends on the definition of resolution.

Jack
 
Last edited:
I played with the A7IV RAW file (no pixel shift). I chose the WB so that, in this rendering, all RAW channels would have the same mean. Then I just plotted the RAW in B&W, no demosaicing at all. Here is the result: {...}

It is rendered in high contrast on purpose. You can see the wavy aliasing on the top. IMO, it is in the data, and that's it. Patterns on the cloths and on the faces look good, actually better than the demosaiced versions I have seen. So, this would be the honest representation of a B&W image if we knew it was B&W.
Right, now if you put this image next to an identical pixel-shifted raw capture rendered the same way you would have the comparison I provided.

Bob's example and my counterexample go to the heart of the question of how much a CFA degrades captured information in its various guises (resolution in this thread) vs a full res sensor.

To state the obvious, all other things being equal (including the number of captures), the answer depends on the scene, including its chromaticity and the direction of the detail. It's not all or nothing; it's a continuum with many peaks and valleys throughout the image, from full degradation (effectively half the sampling frequency) to virtually nothing.

Bob showed an example of high degradation; I (and you) showed one of a minimum. There are minima in color parts of an image as well, as one can glean from the earlier Bayer-CFA-Effect-On-Sharpness link. Jim's practical idea of assuming an average effective Bayer Nyquist of 2/3 of monochrome's is one way to simplify the concept down to a scalar. But it is really a vector or an array.
Now, if you pixel peep the top, you can see a lot of pixelation. This could be due to poor WB but it is not that. There is real color info there. CA?
Could be if there is fast changing detail, though I have seen the same effect in uniform gradients with mixed light sources. I don't know how well controlled DPR's scene lighting is.

Jack
 
Last edited:
Interceptor121 wrote: ...

Today most cameras do not resolve the Nyquist limit of the sensor anyway due to limitations in the lens
Can you elaborate on what you mean by this? Many current sensors do not sport anti-aliasing filters so there is usually still a ton of energy above the monochrome Nyquist frequency.
The lp/mm that you measure with the lens on is lower than the theoretical limit set by the pixel pitch, because lenses have aberrations and are not perfect.
What do you make of this then?

9b09bb25fa0241d79b559dcb87c5740a.jpg.png

I guess it depends on the definition of resolution.

Jack
The theoretical resolution in lp/ph is half the pixel resolution, so 2752 lp/ph. The modelled 2498 represents a correction factor of 0.9, vs a more traditional 0.7 for a front-illuminated sensor, and the camera is actually working at 92%, so it is not yet resolving 100% of the power because of the limitations of the system, which is what I said.

So it depends on your 'modeled' figure and how you arrived at it, but it still does not do 2752 line pairs.
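For reference, the arithmetic behind those figures (the 5504-pixel image height is inferred from the 2752 lp/ph quoted above):

```python
# Nyquist in line pairs per picture height is half the pixel count on the
# short side; compare the modelled figure quoted above against it.
pixels_high = 5504                      # inferred from the 2752 lp/ph figure
nyquist_lpph = pixels_high / 2          # 2752 lp/ph
modelled_lpph = 2498                    # figure quoted in the post

print(nyquist_lpph)                     # 2752.0
print(modelled_lpph / nyquist_lpph)     # ~0.91, i.e. the "factor of 0.9" claim
```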

With microlenses, stacked construction, etc., the boundaries are being pushed further, no doubt.

This camera does not have a high-resolution mode, I believe, so we cannot see whether there would be an SNR improvement on the already very strong performance.



--

instagram http://instagram.com/interceptor121
My flickr sets http://www.flickr.com/photos/interceptor121/
Youtube channel http://www.youtube.com/interceptor121
Underwater Photo and Video Blog http://interceptor121.com
Deer Photography workshops https://interceptor121.com/2021/09/26/2021-22-deer-photography-workshops-in-woburn/
 
I guess it depends on what you mean by 'resolving the Nyquist limit'; I don't know if I would use the system MTF50 frequency as a gauge for that.

Even so, a system with MTF50 beyond 0.5 c/p would show horrible aliasing and would not be desirable imho - that's why there used to be AA filters and I wish there still were. Perhaps better ones, like Canon's Hi-Res.

One of my biggest complaints about the Z7 above is aliasing: paraphrasing Jim K, at first you don't notice it and your life is bliss; but once you start seeing it, it becomes hard to ignore and soon you see it everywhere. In a natural scene it may present like an oversharpened image, though with no sharpening applied.

Jack
 
Last edited:
The reduction of aliasing and color artefacts means you improve SNR; this in turn means you can sharpen and manipulate your image more.

From a marketing standpoint, as the image has 4x the pixel resolution, it is sold as a high-resolution shot. This is correct from a pixel standpoint, and it would be too hard to explain the SNR story.

So if you wanted the best image for a landscape shot, this is a good way to achieve it.

Today most cameras do not resolve the Nyquist limit of the sensor anyway due to limitations in the lens
Actually, they resolve well over the Nyquist limit, which is why a lot of modern cameras have aliasing issues, even in luminance.

All cameras resolve well over the Nyquist limit of the red and blue channels, which is an even bigger issue in terms of colour aliasing.
 
Even so, a system with MTF50 beyond 0.5 c/p would show horrible aliasing and would not be desirable imho - that's why there used to be AA filters and I wish there still were.
Amen to that.
One of my biggest complaints about the Z7 above is aliasing: paraphrasing Jim K, at first you don't notice it and your life is bliss; but once you start seeing it, it becomes hard to ignore and soon you see it everywhere. In a natural scene it may present like an oversharpened image, though with no sharpening applied.
I believe Canon still routinely use AA filters, for which I give them credit.

Personally, I much prefer a sharpened, less-aliased image to a less-sharpened, aliased one. I have far more control over sharpening artefacts than aliasing.
 
I guess it depends on what you mean by 'resolving the Nyquist limit'; I don't know if I would use the system MTF50 frequency as a gauge for that.

Even so, a system with MTF50 beyond 0.5 c/p would show horrible aliasing and would not be desirable imho - that's why there used to be AA filters and I wish there still were. Perhaps better ones, like Canon's Hi-Res.

One of my biggest complaints about the Z7 above is aliasing: paraphrasing Jim K, at first you don't notice it and your life is bliss; but once you start seeing it, it becomes hard to ignore and soon you see it everywhere. In a natural scene it may present like an oversharpened image, though with no sharpening applied.

Jack
MTF measurement has its own challenges; aliasing, for me, does not mean it resolves it.

Your sample, though, was under the limit.
 
I guess it depends on what you mean by 'resolving the Nyquist limit'; I don't know if I would use the system MTF50 frequency as a gauge for that.

Even so, a system with MTF50 beyond 0.5 c/p would show horrible aliasing and would not be desirable imho - that's why there used to be AA filters and I wish there still were. Perhaps better ones, like Canon's Hi-Res.

One of my biggest complaints about the Z7 above is aliasing: paraphrasing Jim K, at first you don't notice it and your life is bliss; but once you start seeing it, it becomes hard to ignore and soon you see it everywhere. In a natural scene it may present like an oversharpened image, though with no sharpening applied.

Jack
MTF measurement has its own challenges; aliasing, for me, does not mean it resolves it.
Aliasing is a sensor issue, not a lens issue.
Your sample, though, was under the limit.
Which limit? Nyquist is 0.5 cy/px and aliasing of colour detail occurs at 0.25 cy/px.

Do you mean MTF50? That's an entirely arbitrary measurement. The normal 'limit' of visible detail is usually regarded as MTF10, but this too is rather ambiguous.
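To illustrate how much the number depends on the chosen threshold, here is a toy example with an assumed Gaussian-ish system MTF (illustrative shape, not measured data):

```python
import numpy as np

# Toy system MTF: Gaussian-like falloff chosen so MTF50 lands near 0.25 cy/px.
f = np.linspace(0, 0.5, 501)                 # spatial frequency, cycles/pixel
mtf = np.exp(-(f / 0.30) ** 2)               # assumed shape, illustration only

mtf50 = f[np.argmin(np.abs(mtf - 0.5))]      # frequency where contrast = 50%
mtf10 = f[np.argmin(np.abs(mtf - 0.1))]      # frequency where contrast = 10%

print(f"MTF50 ~ {mtf50:.2f} cy/px, MTF10 ~ {mtf10:.2f} cy/px")
# Same curve, very different 'resolution' numbers depending on the threshold.
```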
 
Today most cameras do not resolve the Nyquist limit of the sensor anyway due to limitations in the lens
I can't tell precisely what you mean by that sentence.

If you mean that real lenses can't deliver an MTF of nearly unity at the Nyquist frequency of most sensors, then I agree, but don't see the relevance.

If you mean that real lenses don't have sufficient contrast at the Nyquist frequency of most sensors to cause visible aliasing with high-frequency subjects, then I disagree strongly.

To talk about the system I'm most familiar with, all of the Fujifilm GF lenses are capable of aliasing at some f-stops on axis with the GFX 100x.

One way to think of lens and sensor resolution is to consider the balance between the two:
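As a minimal illustrative sketch of that balance, one can multiply a diffraction-limited lens MTF by the pixel-aperture MTF (the f-number, wavelength, pitch and 100% fill factor below are assumed values, not from the post):

```python
import numpy as np

def diffraction_mtf(f, wavelength_mm, f_number):
    """MTF of an aberration-free lens with a circular aperture."""
    fc = 1.0 / (wavelength_mm * f_number)          # diffraction cutoff, cy/mm
    x = np.clip(f / fc, 0.0, 1.0)
    return (2 / np.pi) * (np.arccos(x) - x * np.sqrt(1 - x * x))

def pixel_aperture_mtf(f, pitch_mm):
    """MTF of a square pixel aperture at 100% fill factor (|sinc|)."""
    return np.abs(np.sinc(f * pitch_mm))           # np.sinc is sin(pi x)/(pi x)

pitch_mm = 0.004                                   # assumed 4 um pixels
f = np.linspace(0, 250, 501)                       # spatial frequency, cy/mm
system = diffraction_mtf(f, 550e-6, 5.6) * pixel_aperture_mtf(f, pitch_mm)

nyquist = 1 / (2 * pitch_mm)                       # 125 cy/mm for 4 um pitch
print(f"system MTF at Nyquist: {np.interp(nyquist, f, system):.2f}")
# Anything left at or above Nyquist is energy available to alias.
```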

 
Last edited:
I played with the A7IV RAW file (no pixel shift). I chose the WB so that, in this rendering, all RAW channels would have the same mean. Then I just plotted the RAW in B&W, no demosaicing at all. Here is the result: {...}

It is rendered in high contrast on purpose. You can see the wavy aliasing on the top. IMO, it is in the data, and that's it. Patterns on the cloths and on the faces look good, actually better than the demosaiced versions I have seen. So, this would be the honest representation of a B&W image if we knew it was B&W.
Right, now if you put this image next to an identical pixel-shifted raw capture rendered the same way you would have the comparison I provided.

Bob's example and my counterexample go to the heart of the question of how much a CFA degrades captured information in its various guises (resolution in this thread) vs a full res sensor.

To state the obvious, all other things being equal (including the number of captures), the answer depends on the scene, including its chromaticity and the direction of the detail. It's not all or nothing; it's a continuum with many peaks and valleys throughout the image, from full degradation (effectively half the sampling frequency) to virtually nothing.
This is true only if you know the chromaticity in advance, and we don't.
Bob showed an example of high degradation; I (and you) showed one of a minimum. There are minima in color parts of an image as well, as one can glean from the earlier Bayer-CFA-Effect-On-Sharpness link. Jim's practical idea of assuming an average effective Bayer Nyquist of 2/3 of monochrome's is one way to simplify the concept down to a scalar. But it is really a vector or an array.
Now, if you pixel peep the top, you can see a lot of pixelation. This could be due to poor WB but it is not that. There is real color info there. CA?
Could be if there is fast changing detail, though I have seen the same effect in uniform gradients with mixed light sources. I don't know how well controlled DPR's scene lighting is.

Jack
 
