
k1 pixel shift - can you see the different?

Started 4 months ago | Discussions thread
JeremieB Senior Member • Posts: 2,041
Re: k1 pixel shift - can you see the different?

James O'Neill wrote:

JeremieB wrote:

James O'Neill wrote:

JeremieB wrote:

Pixel shift of course may have an effect on details, but that depends a lot on overall sharpness of the scene (lens vs sensor etc).

But pixel shift has a definitive effect on reducing noise, as you take the mean of 4 captures instead of just 1.

I have never understood HOW this works. I've done tests to show that it DOES work.

If you have PS off, a pixel might be Red, Green or Blue. The other colours are made up from the average of the neighbours of that colour.

But with PS on there is a single reading for Red and Blue and two for Green at each photosite. There should be less averaging with PS. (Unless each image is demosaiced and the 4 are averaged, but that would reduce resolution.)
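A minimal sketch of that recombination (my own illustration, not any camera's actual pipeline), assuming an RGGB Bayer layout and exact one-pixel shifts covering the 2x2 cycle: after four exposures, every photosite has one R reading, one B reading and two G readings, so the full-colour pixel is built with no borrowing from neighbours at all.

```python
import numpy as np

rng = np.random.default_rng(0)
H, W = 4, 4
scene = rng.uniform(0.2, 0.8, size=(H, W, 3))  # "true" RGB values per site

# Assumed RGGB Bayer layout: which channel the photosite at (y, x) reads.
def cfa_channel(y, x):
    if y % 2 == 0:
        return 0 if x % 2 == 0 else 1  # R / G row
    return 1 if x % 2 == 0 else 2      # G / B row

# Four exposures with the sensor shifted by one pixel each time,
# covering the full 2x2 Bayer cycle.
shifts = [(0, 0), (0, 1), (1, 0), (1, 1)]

recon = np.zeros_like(scene)
counts = np.zeros((H, W, 3))
for dy, dx in shifts:
    for y in range(H):
        for x in range(W):
            c = cfa_channel(y + dy, x + dx)   # filter now over this scene point
            recon[y, x, c] += scene[y, x, c]  # noiseless reading, for simplicity
            counts[y, x, c] += 1

recon /= counts
# Every site got exactly 1 R, 2 G and 1 B reading - no interpolation needed.
print(counts[0, 0])               # [1. 2. 1.]
print(np.allclose(recon, scene))  # True
```

With noise added to each reading, the two green samples per site would already be averaged, which is where part of the noise benefit comes from.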

There's more light captured, so there should be less noise, but I can't get my head round the mechanism, unless it's the extra data playing a role in NR.

Take 1 picture of size m x n and resize it by half to get a picture of size m/2 x n/2 (removing high frequencies before resizing, of course, to avoid aliasing).

The new picture will have a better S/N ratio; it's a sort of pixel binning in fact (half as many pixels in each dimension, with 4 times more "light" per pixel). That's the averaging I'm talking about. Averaging being a blur, there's less visible noise and a better S/N. It's also exactly the same effect as when we capture a huge number of photos in astro and then stack them to reduce noise: averaged, the same signal remains the same signal, and only the noise gets step by step lower / closer to the "true" signal (well, random noise like shot noise does; noise that is not random is not affected).
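A quick numerical check of both claims, using a toy example of my own (a flat 100-unit signal with Gaussian noise of an assumed sigma = 10): averaging 4 captures, or 2x2 binning within a single capture, each cuts the noise standard deviation roughly in half (the square root of 4).

```python
import numpy as np

rng = np.random.default_rng(1)
signal = np.full((512, 512), 100.0)  # flat "true" signal
sigma = 10.0                         # assumed Gaussian noise level

# Stacking: average 4 independent noisy captures of the same scene.
frames = [signal + rng.normal(0.0, sigma, signal.shape) for _ in range(4)]
stacked = np.mean(frames, axis=0)
print(np.std(frames[0] - signal))  # ~10  (single frame)
print(np.std(stacked - signal))    # ~5   (sigma / sqrt(4))

# Binning: the m x n -> m/2 x n/2 resize, done here as 2x2 block means.
binned = frames[0].reshape(256, 2, 256, 2).mean(axis=(1, 3))
print(np.std(binned - 100.0))      # ~5   (same factor of 2)
```

Both routes trade something for the noise reduction: binning gives up resolution, stacking gives up time (and needs a static scene), which is exactly the trade pixel shift tries to avoid on the resolution side.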

That's all good.

PS works the same, except it starts from 4 images (which can be considered the same as 1 image of twice the size in each dimension; that's what others do with super resolution).

Argh. Typing up why that's not right, I might have understood why it is. I need to go away and do the sums on what happens if you project squares 1, 2, 3 & 4 pixels wide, of white or (e.g.) red light, offset left/right by 1/4 pixel and up/down by 1/4 pixel.

Possibly my head will explode before I get to the answer.

OK, you're right, it's not exactly the same as the same image taken by a sensor with 4 times more pixels; or let's say it depends on how much the sensor moves between each shot.

BUT what matters for the effect on noise is that each group of 4 pixels (if I take the same-coloured pixel from each shot and put them side by side), once averaged together, represents the average of the signal under it - so, less noise.

Of course, if I move the sensor by more than 1 pixel between shots, then the 4 resulting pixels (4 R, 4 G x 2, 4 B) will no longer be the average of the underlying signal.
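The caveat can be shown with a small Monte-Carlo sketch of my own (hypothetical numbers: a true value of 120, noise sigma = 8). When the four samples see the same scene point, their mean is unbiased and has half the noise; when the shifts are large enough that they land on different scene values (a local gradient here), the mean is still low-noise but no longer the value under the original site.

```python
import numpy as np

rng = np.random.default_rng(2)
sigma, trials = 8.0, 100_000  # assumed noise level, Monte-Carlo repeats

# Shifts within one Bayer cycle: all four samples see the SAME scene value.
true_val = 120.0
mean_same = np.mean(true_val + rng.normal(0.0, sigma, (4, trials)), axis=0)
print(np.mean(mean_same))   # ~120 (unbiased)
print(np.std(mean_same))    # ~4   (sigma / sqrt(4))

# Shifts larger than a pixel: the four samples land on DIFFERENT scene
# values, e.g. a local gradient. The mean is still low-noise, but it is
# the average of the neighbourhood, not the value under the original site.
gradient = np.array([110.0, 120.0, 130.0, 140.0])[:, None]
mean_mixed = np.mean(gradient + rng.normal(0.0, sigma, (4, trials)), axis=0)
print(np.mean(mean_mixed))  # ~125, not 120
```

In other words, misregistered averaging still smooths noise, but what it converges to is a blurred neighbourhood average rather than the per-site signal.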
