From Photoclubalpha "on-sensor PDAF zone is sufficiently large to lower maximum resolution"

Started Dec 27, 2012 | Discussions thread
Rehabdoc Contributing Member • Posts: 919
Re: From Photoclubalpha "on-sensor PDAF zone is sufficiently large to lower maximum resolution"

Photozopia wrote:

Jefftan wrote:

"I've already observed that the perceived maximum resolution (microcontrast and detail sharpness) of the A99 does seem to be lower than the D600, despite Sony's zonally graded low-pass filter which I can confirm does improve the performance to the edges and corners for wider angle lenses. The on-sensor PDAF zone is sufficiently populated and large to have some effect."

This is from a review of the A99, which has 102 on-sensor PDAF focus points.

I don't care about the A99, but I'm afraid the NEX-5R and NEX-6, with 99-point on-sensor PDAF, are just the same.

I know most people who have a shiny new camera don't want to believe this, but I believe that if proper testing is done, the maximum resolution of the NEX-5R will also prove lower than that of the NEX-5N.

It has to be, because active imaging area is lost to the AF sites. The real question is how much.

That I can't answer unless someone does proper testing.

Please don't attack me or Photoclubalpha unless you have done proper testing and confirmed this is not the case.

The resolution reduction may be insignificant, and I sincerely hope it is.

I'd suggest that the combined AF pixel count is no more than about 15,000 pixels (if recent review data from a couple of sources is correct) - so no more than a pinprick on a 16.1-megapixel chip.

There's no SLT mirror on the NEX either ... unlike the A99 etc.

If I were buying an SLT model, I'd be more worried about the roughly 30% light-gathering hit that fixed-mirror SLT designs inflict affecting image quality than about a handful of AF pixels.
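For scale, the two figures quoted above can be put side by side. This is just back-of-the-envelope arithmetic using the 15,000 AF-pixel estimate and the ~30% SLT light-loss figure from the posts; neither number is independently verified here:

```python
import math

af_pixels = 15_000           # estimated on-sensor PDAF sites (figure from the post above)
total_pixels = 16_100_000    # 16.1 MP sensor

fraction = af_pixels / total_pixels
# Express a 30% light loss in EV (stops): log2 of the transmission ratio.
ev_loss = math.log2(1 / (1 - 0.30))

print(f"PDAF sites are {100 * fraction:.3f}% of the sensor")  # ~0.093%
print(f"A 30% light loss is about {ev_loss:.2f} EV")          # ~0.51 EV
```

So by these numbers the AF sites claim under a tenth of a percent of the pixel grid, while the mirror costs roughly half a stop of light.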

Shouldn't the missing pixels be accurately reflected in the RAW file structure, and in how it is processed?

We could probably probe precisely where the PDAF pixels are by making dummy RAW files with specific pixels set to known values and seeing how the RAW is processed. The areas where the RAW processor extrapolates new pixel values into the output, and how large those "expanded" areas are, would reflect where the PDAF pixels occupy real estate. Or, of course, we could rip a sensor apart and look at it under a microscope.

Then predicting the amount of image quality loss should be pretty easy.
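The interpolation side of that idea can be sketched in a few lines. Everything here is hypothetical: the PDAF site layout (every 8th pixel on every 4th row) is invented for illustration, since the real coordinates are exactly what the dummy-RAW probing above would have to discover. The sketch masks those sites on a synthetic monochrome frame, fills them from horizontal neighbours the way a simple in-camera interpolator might, and measures the error introduced:

```python
# Hypothetical sketch -- the PDAF site layout below is made up for
# illustration, not taken from any real Sony sensor.
import numpy as np

H, W = 64, 64
rng = np.random.default_rng(0)
scene = rng.uniform(0.0, 1.0, size=(H, W))  # idealized sensor readout

# Assumed PDAF layout: every 8th pixel on every 4th row.
pdaf_mask = np.zeros((H, W), dtype=bool)
pdaf_mask[::4, ::8] = True

# The camera must synthesize values at masked sites; here we average
# the left and right neighbours (clamping at the frame edges).
interpolated = scene.copy()
for y, x in zip(*np.nonzero(pdaf_mask)):
    left = scene[y, x - 1] if x > 0 else scene[y, x + 1]
    right = scene[y, x + 1] if x < W - 1 else scene[y, x - 1]
    interpolated[y, x] = 0.5 * (left + right)

err = np.abs(interpolated - scene)[pdaf_mask]
print(f"PDAF sites: {pdaf_mask.sum()} of {H * W} "
      f"({100 * pdaf_mask.mean():.2f}% of pixels)")
print(f"Mean absolute interpolation error at PDAF sites: {err.mean():.3f}")
```

On real hardware the error depends on scene detail at the AF sites, which is why fine, high-contrast test charts are where any resolution loss would show up first.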
