Calling an optical engineer: What pixel shifts can and can't do (maybe)

I asked whether pixel shifting, in ideal conditions (no movement of the camera body or subject), could increase resolution, increase the range of contrast, and decrease noise.

This is what I think I found on Wikipedia.

It has a long article with details of high-resolution (pixel-shift) shooting on different cameras.

As far as I understood it, pixel shifting does reduce noise and improve color resolution.

I don't know if that includes b&w images taken with a sensor that also records color. I suppose so, because aren't black and white just the absence of color and the mix of all colors?

And sensor shift may improve the resolution of the sensor, especially if it is subpixel shifting. I think that means shifting by a distance smaller than one pixel, but I may be wrong.

Not many cameras do subpixel shifting.

What is plain is the increased size of a high-resolution image, but the exact relation between the increase in size and the improvement in resolution is not given.

And maybe there are effects on acutance.

I welcome anyone who can clarify or correct what I said. I will not be offended.
 
I asked whether pixel shifting, in ideal conditions (no movement of the camera body or subject), could increase resolution, increase the range of contrast, and decrease noise.

This is what I think I found on Wikipedia.

It has a long article with details of high-resolution (pixel-shift) shooting on different cameras.

As far as I understood it, pixel shifting does reduce noise and improve color resolution.
That is correct. I use an Olympus E-M1 Mark III, which does pixel shifting even with hand-held shots (known as HHHR, or hand-held high resolution). I don't use it much myself because it is slower and I don't really need the extra resolution.

However, I have tried it and it really does work. It increases resolution, decreases noise, increases dynamic range and reduces aliasing artefacts such as moiré.
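
The noise part is easy to picture: merging the shifted frames is essentially averaging several exposures of the same scene, and averaging N frames knocks random noise down by roughly the square root of N. A toy simulation with made-up numbers (nothing to do with any camera's actual firmware):

import numpy as np

rng = np.random.default_rng(0)

scene = np.full((100, 100), 0.5)   # flat grey test "scene"
noise_sigma = 0.05                 # per-frame noise level (made up)
n_frames = 8                       # e.g. an 8-shot pixel-shift mode

frames = [scene + rng.normal(0, noise_sigma, scene.shape) for _ in range(n_frames)]
merged = np.mean(frames, axis=0)

print(f"single frame noise: {np.std(frames[0] - scene):.4f}")
print(f"merged frame noise: {np.std(merged - scene):.4f}")   # roughly 1/sqrt(8) of the single-frame value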
I don't know if that includes b&w images taken with a sensor that also records color. I suppose so, because aren't black and white just the absence of color and the mix of all colors?
Yes, it works equally well for b&w, because the shots are taken as colour images and then converted to b&w (which is simply a weighted average of the three primary colours red, green and blue).
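
For what it's worth, this is the kind of weighted average I mean. The weights below are the common Rec. 709 luma weights; other converters use different weights, so treat them as an example rather than something every camera does:

import numpy as np

def to_bw(rgb):
    """Convert an H x W x 3 RGB image to b&w as a weighted average.

    Rec. 709 luma weights are used here as an example; any set of
    weights that sums to 1 gives some flavour of b&w conversion.
    """
    weights = np.array([0.2126, 0.7152, 0.0722])
    return rgb @ weights

# tiny demo: a 1 x 2 image with one pure red pixel and one pure green pixel
demo = np.array([[[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]])
print(to_bw(demo))   # the green pixel comes out much brighter than the red one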
And sensor shift may improve the resolution of the sensor, especially if it is subpixel shifting. I think that means shifting by a distance smaller than one pixel, but I may be wrong.

Not many cameras do subpixel shifting.
All cameras that I am aware of do subpixel shifting if they have a high resolution mode.
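
A toy way to picture what a subpixel (here, half-pixel) shift buys, ignoring colour and assuming perfect registration (which a hand-held shot never quite gives you): four exposures shifted by half a pixel in x and/or y can be interleaved onto a grid with twice the sample density in each direction. This is only a sketch of the idea, not any manufacturer's actual merge algorithm:

import numpy as np

def interleave_half_pixel(f00, f10, f01, f11):
    """Interleave four half-pixel-shifted frames onto a 2x denser grid.

    f00: no shift; f10: +1/2 pixel in x; f01: +1/2 pixel in y;
    f11: both. All frames are H x W; the result is 2H x 2W.
    """
    h, w = f00.shape
    out = np.zeros((2 * h, 2 * w), dtype=f00.dtype)
    out[0::2, 0::2] = f00   # original sample positions
    out[0::2, 1::2] = f10   # samples offset half a pixel in x
    out[1::2, 0::2] = f01   # samples offset half a pixel in y
    out[1::2, 1::2] = f11   # samples offset in both directions
    return out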
What is plain is the increased size of a high-resolution image, but the exact relation between the increase in size and the improvement in resolution is not given.
There are no guarantees on the magnitude of the improvements obtained because that will depend a great deal on the quality of the lens being used and the aperture.
And maybe there are effects on acutance.

I welcome anyone who can clarify or correct what I said. I will not be offended.
 
Pixel shifting is always subpixel shifting. To shift a complete pixel does not help, obviously. But pixels are made of three sensor elements for color which makes this discussion very technical and complicated.

I wonder if the technique is really better than increasing the resolution by AI.
 
Pixel shifting is always subpixel shifting. To shift a complete pixel does not help, obviously. But pixels are made of three sensor elements for color which makes this discussion very technical and complicated.

I wonder if the technique is really better than increasing the resolution by AI.
It depends on what you want. AI is adding fictitious data to make the image look sharper; pixel shifting is adding real data.

Of course, many people will be quite happy with AI because they are already happy with the sharper images obtained by many modern cameras that leave out the AA (anti-aliasing) filter. In these sharper images, most of the extra sharpness is essentially fictitious because it is produced by aliasing artefacts.
 
Pixel shifting is always subpixel shifting. To shift a complete pixel does not help, obviously.
Pentax, and perhaps others, use complete-pixel shifting to remove Bayer interpolation. It doesn't technically increase resolution, but it does remove the detrimental demosaicing process.
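
To make that concrete, here is a simplified sketch of the idea (not Pentax's actual pipeline, and it ignores borders, motion and white balance): with an RGGB Bayer layout, four exposures shifted by exactly one photosite let every pixel position be sampled once through a red filter, once through a blue filter and twice through a green filter, so the merge can fill in full RGB without interpolating from neighbours:

import numpy as np

# Bayer colour index for an RGGB layout: 0 = R, 1 = G, 2 = B
RGGB = np.array([[0, 1],
                 [1, 2]])

def merge_one_pixel_shift(frames, shifts):
    """Merge four one-pixel-shifted Bayer exposures into full RGB.

    frames: four H x W raw mosaics, already aligned to the scene.
    shifts: the (dy, dx) sensor offset of each frame, e.g.
            [(0, 0), (0, 1), (1, 1), (1, 0)].
    With those offsets every pixel has been sampled through R, B and
    twice through G, so no demosaicing is needed; greens are averaged.
    """
    h, w = frames[0].shape
    rgb = np.zeros((h, w, 3))
    count = np.zeros((h, w, 3))
    rows = np.arange(h)[:, None]
    cols = np.arange(w)[None, :]
    for frame, (dy, dx) in zip(frames, shifts):
        colour = RGGB[(rows + dy) % 2, (cols + dx) % 2]   # which filter saw each pixel
        for ch in range(3):
            mask = colour == ch
            rgb[..., ch][mask] += frame[mask]
            count[..., ch][mask] += 1
    return rgb / count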

But pixels are made of three sensor elements for color which makes this discussion very technical and complicated.

I wonder if the technique is really better than increasing the resolution by AI.
Both are inferior to stitching.
 
