Would you still buy the 5D Mark III?

Started Aug 20, 2013 | Discussions thread
RicksAstro
Veteran Member • Posts: 3,409
Re: Parallel
In reply to Muresan Bogdan, Aug 22, 2013

Muresan Bogdan wrote:

RicksAstro wrote:

Muresan Bogdan wrote:

Well you got it all wrong here. As somebody said, all pixels are not equal. So if the pixel density is very high (a crop body with 24MP, for example) you have a very big drop in sharpness (it's plain physics, mostly the diffraction phenomenon). So you might end up with that 300-pixel bird looking mushy, while on a FF body you get a 200-pixel bird that is very sharp, so it might take a resize to 300 pixels and still look better than your APS-C. Add noise into the equation, and a crop from a FF sensor, upscaled, might look better than the APS-C equivalent without upscaling.

Given similar technologies, you will never get a drop in sharpness with a higher pixel density when the subject is viewed at the same magnification. The higher pixel density will always yield the same or, more likely, more real subject detail, which means it will always have the same or more "reach". This includes diffraction... at the same subject magnification, a denser sensor will always give you the same or more detail, even when you're in the diffraction "zone".

Well, APS-C and FF are not similar technologies. The pixel pitch does matter for diffraction. So for the 7D, for example (18MP only), diffraction kicks in at f/5.6. For the 5D Mk III, only at f/8. So shooting at f/5.6, the APS-C will lose some detail: a one-pixel detail will spread through diffraction and contaminate the neighboring pixels. So a denser sensor will only give more information, not more detail. If you are under the diffraction limit, that is true: the denser sensor will give more detail. But for a 24MP APS-C, the diffraction limit drops to about f/4.5. Here is a very nice simulator that might help people understand how diffraction works. What you will find quoted as the "limit" of diffraction (f/16 or f/13) is actually the point where diffraction is very visible and you have a significant drop in resolution. But it actually starts much lower than that; from f/5.6 to f/13 it affects only pixel-sized details, and most of the time you don't see it.

http://www.cambridgeincolour.com/tutorials/diffraction-photography.htm

By similar technology, I mean a similar state of the art (or generation) of sensor design, regardless of sensor size.

Diffraction has absolutely nothing to do with pixel size... it's an optical property. Smaller pixels will show diffraction sooner at 100% only because you're viewing at a higher magnification. But I said "same magnification", and your example involved resizing to the same magnification.
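
To put rough numbers on that, here's a quick sketch. Assumptions are mine, not anyone's spec sheet: green light at ~550 nm, the standard 2.44·λ·N formula for the Airy disk's first-null diameter, and approximate pixel pitches of 6.25 µm (5D Mark III) and 4.3 µm (7D):

```python
# Rough numbers, assuming green light (~550 nm) and the standard Airy-disk
# formula: first-null diameter = 2.44 * wavelength * f-number.
# Pixel pitches below are approximate, for illustration.
WAVELENGTH_UM = 0.550
PITCHES_UM = {"5D Mark III (~6.25 um)": 6.25, "7D (~4.3 um)": 4.3}

def airy_diameter_um(f_number):
    """Diameter of the Airy disk's first null, in microns."""
    return 2.44 * WAVELENGTH_UM * f_number

for f in (4.0, 5.6, 8.0, 11.0, 16.0):
    d = airy_diameter_um(f)
    spans = ", ".join(f"{d / p:.1f} px on {name}" for name, p in PITCHES_UM.items())
    print(f"f/{f:g}: disk = {d:4.1f} um -> {spans}")
```

The disk is the same physical size on both sensors at any given aperture; the 7D's smaller pixels just hit the "disk spans ~2 pixels" point around f/5.6 while the 5D III gets there around f/8, which is all those per-camera "diffraction limits" are really saying.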

It's been proven over and over that, unless you are at extreme apertures, the denser sensor will show more detail than the less dense one. As diffraction goes up, that difference gets smaller and smaller (approaching zero at the extremes).

So in your example, when you resize that FF 200-pixel bird up to 300 (or shrink the 300 down to 200), with a similar generation of sensors the lower-density sensor will never resolve more detail... it can't, since its sampling is coarser.
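
Here's a toy illustration of that sampling argument (a minimal numpy sketch; the synthetic one-dimensional "subject" and the 200/300 pixel counts are invented just to echo the bird example):

```python
import numpy as np

def scene(x):
    # Synthetic "feather detail": spatial frequency sweeps from 10 up to
    # ~130 cycles across the frame (derivative of 10x + 60x^2 is 10 + 120x).
    return np.sin(2 * np.pi * (10 * x + 60 * x**2))

grid_300 = np.linspace(0, 1, 300)   # subject spans 300 px on the denser sensor
grid_200 = np.linspace(0, 1, 200)   # subject spans 200 px on the sparser sensor

dense = scene(grid_300)
sparse = scene(grid_200)

# Bring both to the same magnification: upscale the 200 px capture to 300 px.
sparse_up = np.interp(grid_300, grid_200, sparse)

truth = scene(grid_300)
print("300 px capture,  RMS error: %.3f" % np.sqrt(np.mean((dense - truth) ** 2)))
print("200 px upscaled, RMS error: %.3f" % np.sqrt(np.mean((sparse_up - truth) ** 2)))
```

With this idealized point sampling, the 300-pixel capture reproduces the detail essentially exactly, while the upscaled 200-pixel capture cannot: the finest detail (~130 cycles) sits above the 200-sample Nyquist limit (~100 cycles), so it was never recorded, and no amount of resizing brings it back.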

The "diffraction limit" you mention shows the resolution of a great lens dropping at, say f4.5 if you graph the resolution at all apertures.   But if you overlaid the actual resolution resolved by a less dense sensor, the graphs will always be lower for all apertures (lower resolution) and will peak later than f4.5 only because the sensor is "wasting" the resolution at apertures larger than that.

If you've ever tried a D800 compared to almost any other camera, it becomes very plain. Yes, only great lenses take full advantage of the density, but even crappy lenses are improved (when results are viewed at the same magnification).
