DXO's Perceptual Megapixels

Started Nov 21, 2013 | Discussions thread
Jack Hogan Veteran Member • Posts: 6,487
Re: See SQF relationship

The_Suede wrote:

Jack Hogan wrote:

The_Suede wrote:

According to DxO, both the D600/610 and the D800 reach 19 P-Mpix when used with the Nikkor AF-S 85/1.8.

According to me, the D800 resolves quite a bit more than the D610. Test images taken from the DPR test-scene, with the named lens.

Good post, thanks The_Suede. I heartily agree with the need to bring the underlying MTFs back to the surface and show them.

FWIW, MTF Mapper shows an average MTF50 of 47.1 vs 45.4 lp/mm for the D800 and D610 respectively - measured off the slanted edges in the respective ISO 100 NEFs from DPR's new studio scene, with all the caveats of the case (proper focus peaking being an important one).

OFF TOPIC: There is a bit of an anomaly I've been meaning to ask you about, though. Assuming MTF Mapper is doing its job properly (it seems to give consistent results in controlled situations), I've noticed that with recent cameras (say, those that came out within the last couple of years) there is sometimes quite a large difference between the horizontal and vertical MTF50 readings. For instance, while the D800's are 46.2 and 47.9 lp/mm H and V respectively, the D610's are 38.0 and 52.8 lp/mm - a huge difference. The worst offenders seem to be late-model Sony sensors, with the D610, NEX-6, A57 and K500 showing differences of 20% or more.
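(For anyone curious how such MTF50 numbers come about, the edge-based idea can be sketched in a few lines of numpy. This is my own toy version on a synthetic Gaussian-blurred edge - not MTF Mapper's actual algorithm, which does proper slanted-edge supersampling - and it reports cycles/pixel rather than lp/mm.)

```python
import numpy as np
from math import erf

def mtf50(edge_profile):
    """Estimate MTF50 (cycles/pixel) from an edge-spread function (ESF).

    Crude stand-in for a slanted-edge analysis: differentiate the ESF to
    get the line-spread function (LSF), FFT it, and find where the
    normalized modulation drops to 0.5.
    """
    lsf = np.diff(edge_profile)
    lsf = lsf * np.hanning(len(lsf))          # window to reduce leakage
    mtf = np.abs(np.fft.rfft(lsf))
    mtf = mtf / mtf[0]                        # normalize DC to 1.0
    freqs = np.fft.rfftfreq(len(lsf), d=1.0)  # cycles per pixel
    idx = np.argmax(mtf < 0.5)                # first bin below 0.5
    f0, f1 = freqs[idx - 1], freqs[idx]
    m0, m1 = mtf[idx - 1], mtf[idx]
    return f0 + (0.5 - m0) * (f1 - f0) / (m1 - m0)  # linear interpolation

# Synthetic edge blurred by a Gaussian PSF (sigma in pixels).
x = np.arange(-32, 32)
sigma = 1.0
esf = np.array([0.5 * (1 + erf(xi / (sigma * np.sqrt(2)))) for xi in x])
print(f"MTF50 = {mtf50(esf):.3f} cycles/pixel")
```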

Could this have something to do with the slanted edges having ended up under a portion of the lens that shows some astigmatism? Or could it be due to differences in the alignment or directional strength of the AA filter in the newer cameras? Or something else?


Most cameras I've torn down have the same plate thickness for both AA layers, so what is probably happening (assuming the angular alignment of the cut of the birefringent crystal is the same in both layers...) is that the phase plate placed between the two layers doesn't return the polarization properly to the original circular state.

The first layer takes an incoming ray of random polarization and splits it into two parallel rays at the plate exit. What is important at that point is that ONE of the rays is made of mostly n° polarized light and the other of mostly (n+90)° polarized light.

Sending that straight into a new AA plate would only shift them relative to each other again without splitting them individually, resulting in a diagonal displacement of two rays. What you want is to split the two rays into four.

To make the second layer work correctly you must make both of the rays circularly polarized again, otherwise the second layer has "nothing to work with", so to speak. If the QWP only produces an elliptical polarization, then the AA strength is diminished by the ratio of the h/v polarization components; i.e. if the resulting polarization after the QWP is elliptical with 70% along the fast axis (f′) and 30% along the slow axis (s′), the secondary image intensity is reduced in the same proportion.
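(The intensity argument can be illustrated with a toy Jones-calculus model - my own sketch with idealized geometry, not a model of any particular filter stack. A linearly polarized ray from the first layer passes a waveplate with its fast axis at 45°; the second layer then splits it according to the resulting polarization state.)

```python
import numpy as np

def second_layer_split(retardance):
    """Intensity split (primary, secondary) produced by the second AA
    layer for a linearly polarized input ray, after a waveplate with its
    fast axis at 45 degrees and the given retardance in radians.
    A 0.5/0.5 split means full AA strength."""
    c = s = 1 / np.sqrt(2)
    rot = np.array([[c, -s], [s, c]])
    plate = np.diag([1, np.exp(1j * retardance)])
    wave = rot @ plate @ rot.T            # Jones matrix of the waveplate
    e_out = wave @ np.array([1.0, 0.0])   # linear input from first layer
    return np.abs(e_out[0])**2, np.abs(e_out[1])**2

print(second_layer_split(np.pi / 2))        # ideal QWP -> 0.5 / 0.5
print(second_layer_split(0.7 * np.pi / 2))  # detuned -> uneven split
```

A detuned plate sends unequal intensities into the two output rays, which is exactly the weakened secondary image described above.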

As you say, many cameras show this behavior.

And since a QWP is wavelength dependent, with retardance Γ = (2π·Δn·L)/λ, it only acts as a true quarter-wave plate where Γ is an odd multiple of π/2, i.e. where Δn·L = (2m+1)·λ/4. Most QWPs in common photographic use are higher-order plates, meaning that they swing from no effect to full effect several times across the working range, from 400 nm to 700 nm light. If you separate the images into R, G and B, you'll find that the MTF50 delta between the axes isn't the same for long, medium and short wavelengths... And if you use a monochromator to test at several different but closely spaced wavelengths, you'll find that the h/v differences switch between "a lot" and "less" quite often.

Thank you for the explanation, The_Suede. What would you consider a 'normal' difference for today's DSLRs?

There are ways to make [fairly] achromatic QWPs over the visible band by laminating two different crystals, most often quartz and magnesium fluoride, but that's probably considered too expensive by the manufacturers.

To prove that it isn't MTF Mapper doing "something strange", just do a lossless 90° rotation of the images before feeding them into the program. You'll get the same results (but with the relationship reversed, of course).

I did try the rotation and the result is *exactly* the same. I think the reading off the horizontal slanted edge of the D610 NEF is an outlier, with a number of things (some of which you've mentioned above) probably aligning the wrong way. Such are the limitations of using a single edge from a target not designed for spatial resolution measurements, so take my results with a grain of salt.
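(The rotation check is easy to reproduce on synthetic data. Here's a toy version using a crude gradient-based sharpness proxy instead of a real MTF50 measurement; the H and V numbers swap under np.rot90, as they should.)

```python
import numpy as np

def directional_sharpness(img):
    """Crude proxy for horizontal/vertical detail: mean squared finite
    difference along each image axis."""
    gx = np.diff(img, axis=1)   # horizontal gradients
    gy = np.diff(img, axis=0)   # vertical gradients
    return (gx ** 2).mean(), (gy ** 2).mean()

rng = np.random.default_rng(0)
img = rng.standard_normal((256, 256))
# Blur only along the horizontal axis -> weaker horizontal detail.
kernel = np.ones(5) / 5
img = np.apply_along_axis(
    lambda row: np.convolve(row, kernel, mode="same"), 1, img)

h, v = directional_sharpness(img)
h_rot, v_rot = directional_sharpness(np.rot90(img))
print(h, v)          # h < v: horizontal detail was smoothed away
print(h_rot, v_rot)  # the two readings swap after the 90-degree rotation
```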

An interesting fact from this limited sample (DPR's new scene raw files): the Nikon bodies with Exmor sensors consistently show slightly higher 'vertical' readings than 'horizontal' ones, while the Sony bodies consistently show the opposite.

Or you could shoot a dark-square target from a tripod and then, without changing focus, just rotate the camera 90° sideways about the optical axis on the tripod mount.

I still have some loose plates from various cameras lying around, but testing them thoroughly would mean sending them off to a third party, which I'm not that interested in paying for.

Definitely not worth it given the amateurish nature of the data that generated the question.

