SamuelChia
Hi everyone,
I was hoping the brilliant minds I've come to respect over the years of lurking on DPR without posting much could help shed some light on this, and confirm or disprove my conclusions if I'm wrong.
I'm familiar with Jim Kasson's (controversial) assertion that pixel shift (16-shot, as on the Fuji GFX and Sony bodies) does not increase resolution, specifically spatial resolution. I don't have a problem with that. The question I needed answered was: how does pixel shift compare to stitching?
For a given camera, say the Fujifilm GFX 100S that I own:
- Pixel shift 16-shot (4x the native pixel count, so 404 megapixels)
vs
- Multi-row 3x3 stitch with 50% overlap; the resulting stitch would also end up at 404 megapixels (the quick arithmetic sketch below shows why the two come out the same).
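To make that equivalence concrete, here is a minimal sketch of the arithmetic. It assumes the GFX 100S native frame of 11648 x 8736 pixels and idealised 50% overlaps; a real stitch loses a little to projection and cropping, so the exact figure will vary:

```python
# Rough megapixel arithmetic: 16-shot pixel shift vs. a 3x3 stitch.
# Assumes the GFX 100S native frame (11648 x 8736 px) and idealised
# 50% overlaps; real stitches lose a little to projection/cropping.

W, H = 11648, 8736  # native frame in pixels (~102 MP)

# 16-shot pixel shift: half-pixel offsets double the sampling grid
# in each direction, so the output is 2W x 2H.
shift_w, shift_h = 2 * W, 2 * H

# 3x3 stitch at 50% overlap: each additional frame in a row/column
# advances by half a frame, so three frames span W + 0.5W + 0.5W = 2W.
frames, overlap = 3, 0.5
stitch_w = W * (1 + (frames - 1) * (1 - overlap))
stitch_h = H * (1 + (frames - 1) * (1 - overlap))

print(f"pixel shift: {shift_w} x {shift_h} = {shift_w * shift_h / 1e6:.0f} MP")
print(f"3x3 stitch:  {stitch_w:.0f} x {stitch_h:.0f} = {stitch_w * stitch_h / 1e6:.0f} MP")
# Both come out at 2W x 2H, i.e. 4x the native pixel count (~400 MP).
```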
Questions:
1) If the subject of interest occupies the same pixel dimensions in both cases (to achieve this, you would double the magnification when capturing the frames for stitching), all else being equal, which would have higher (spatial) resolution?
2) A subtly different and more complex question: which would look better perceptually/subjectively, given that the stitch is made of single frames that may still show aliasing, while the pixel shift would be virtually free of it (if done properly), even though both are identical in pixel dimensions/resolution?
3) Does pixel shift affect the diffraction-limited aperture? I may not have phrased this question quite as precisely as I mean it, so bear with me. For a given set of criteria used to determine the diffraction-limited aperture of a sensor, does 16-shot pixel shift (1/2-pixel sampling offsets) mean you need half the f-number (e.g. f/4 instead of f/8)? Or is the diffraction-limited aperture still the same as for a single frame, because the sensor is the same and the pixel aperture/pitch remains unchanged even when pixel shifting? (One common criterion is sketched just below.)
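For question 3, here is a minimal sketch of one common, admittedly crude criterion, under stated assumptions: a GFX 100S pixel pitch of roughly 3.76 um, green light at 550 nm, and "diffraction limited" taken to mean the Airy disk first-null diameter (2.44 λN) spans two sampling pitches. Under those assumptions, halving the sampling pitch halves the f-number at which the criterion trips, even though the physical pixel aperture is unchanged:

```python
# One crude "diffraction-limited aperture" criterion: the f-number at
# which the Airy disk first-null diameter (2.44 * lambda * N) equals
# two sampling pitches. Assumed values: ~3.76 um pitch (GFX 100S) and
# 550 nm light; MTF-based or 3-pitch criteria shift the absolute numbers.

WAVELENGTH_UM = 0.550  # green light, micrometres
PITCH_UM = 3.76        # approximate GFX 100S pixel pitch, micrometres

def diffraction_limited_f(sampling_pitch_um: float) -> float:
    """f-number where the Airy first null spans two sampling pitches."""
    return 2 * sampling_pitch_um / (2.44 * WAVELENGTH_UM)

print(f"single frame  (3.76 um sampling): f/{diffraction_limited_f(PITCH_UM):.1f}")
print(f"16-shot shift (1.88 um sampling): f/{diffraction_limited_f(PITCH_UM / 2):.1f}")
# Roughly f/5.6 vs f/2.8: the criterion halves with the sampling pitch,
# but the physical pixel aperture (and its per-pixel blur) is unchanged,
# which is exactly the tension the question is asking about.
```

This only shows where a pitch-based rule of thumb lands; it doesn't by itself settle whether the resulting images look different.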
I've published an article about this on my site with many example image crop comparisons of real-world images (I don't know how to generate artificial, idealised simulations), and came to some conclusions, which may be right or wrong. I've not seen comparisons of this type anywhere else despite searching quite a bit, so if you're interested, here it is.
Thanks for reading this and much appreciation to anyone who has some insight to share.

[Attached comparison crops; setup details for these comparisons are in the article linked above.]
