Just pulled the trigger on M6II + 32 f1.4

Started 3 months ago | Discussions thread
nnowak Veteran Member • Posts: 7,768
Re: Just pulled the trigger on M6II + 32 f1.4

MAC wrote:

nnowak wrote:

MAC wrote:

MAC wrote:

EDWARD ARTISTE wrote:

Just got a replacement m6mkii, keeping my fingers crossed. I encourage you to test the hell out of it.

ok, I'll keep testing during the return period

getting my bearings:

The camera generates HUGE files - enough resolution for prints close to 16x24 inches

The DLA (Diffraction Limited Aperture) for this camera is just F5.2, but I will use up to F8 when I need DoF

I really dislike that term. The word "limit" implies a hard boundary where everything falls apart once crossed. Diffraction Visible Aperture would be far more appropriate.

agree

Diffraction is an optical phenomenon that occurs with every lens at every aperture. The only thing that changes as sensors increase in pixel density is the ability to record that diffraction. Even wide open, your 32mm f/1.4 has diffraction, but your M6 II lacks the resolution to record it.

Also, when Angle of View (AoV) and Depth of Field (DoF) are identical, diffraction levels will be the same across sensor sizes. For example, shooting with your 32mm at f8 on your M6 II will produce the same levels of diffraction as shooting with a 50mm lens at f13 on your RP.
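
A quick numeric sketch of that equivalence. The only input is the usual 1.6x Canon crop factor; the 50mm and f/13 figures above are just the rounded results.

# Sketch: map an APS-C focal length / f-number to its full-frame equivalent.
# Assumes the standard Canon crop factor of 1.6; results are then rounded
# to the nearest real lens (50mm) and 1/3-stop aperture (f/13).
CROP = 1.6

apsc_focal_mm = 32
apsc_f_number = 8.0

ff_focal_mm = apsc_focal_mm * CROP    # 51.2 -> a 50mm lens in practice
ff_f_number = apsc_f_number * CROP    # 12.8 -> f/13 in 1/3-stop terms

print(f"Full-frame match: ~{ff_focal_mm:.0f}mm at ~f/{ff_f_number:.1f}")
# prints: Full-frame match: ~51mm at ~f/12.8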

I think I understand the 1.6 crop factor multiple (32 x1.6 = 51)

and 8 x 1.6 = 13

but isn't it also multivariable depending on the sensor; i.e., the m6II has a DLA of 5.2 and the RP has a DLA of 9.3

No. Diffraction is solely a lens function and is independent of the image sensor size and pixel density.

If you take 5.2 x 1.6 you get 8.3, which would be the FF DLA if the RP had a same-generation sensor analogous to the M6 II's. But in this case, doesn't the RP have a less stressful DLA of 9.3...

It is not about the sensor generation, but the number of pixels.  If the RP also had a 32.5MP sensor like the M6 II, the DLA numbers would be a closer match.
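
Here is a rough sketch of where those DLA figures come from. It uses one common convention - the f-number where the Airy-disk radius, 1.22 x wavelength x N, matches the pixel pitch - with an assumed ~0.51 um wavelength and approximate sensor specs, so treat the output as ballpark; published DLA tables use their own thresholds.

# Sketch: estimate DLA as the f-number where the Airy-disk radius
# (1.22 * wavelength * N) roughly equals the pixel pitch.
# Assumptions: wavelength ~0.51 um, approximate sensor widths and pixel counts.
WAVELENGTH_UM = 0.51

def approx_dla(sensor_width_mm, pixels_wide):
    pitch_um = sensor_width_mm * 1000.0 / pixels_wide   # approximate pixel pitch
    return pitch_um / (1.22 * WAVELENGTH_UM)

print(f"M6 II (22.3mm / 6960px): ~f/{approx_dla(22.3, 6960):.1f}")   # ~f/5.1
print(f"RP    (35.9mm / 6240px): ~f/{approx_dla(35.9, 6240):.1f}")   # ~f/9.2
# close to the quoted f/5.2 and f/9.3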

So I'm thinking the RP at F13 would not be equivalent, but would be under less diffraction stress than the M6 II at F8. Maybe I got this wrong.

You are wrong on this part.  Once you pass the DLA, diffraction is being fully resolved by the image sensor and the effects follow equivalence.
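
A sketch of what "follow equivalence" means in numbers: compare the Airy-disk diameter as a fraction of the frame height for the two setups. The ~0.51 um wavelength and the sensor heights are approximate assumptions.

# Sketch: diffraction blur (Airy-disk diameter, 2.44 * wavelength * N)
# as a fraction of sensor height. Matching fractions mean the blur is the
# same share of the final print, whatever the sensor size.
WAVELENGTH_UM = 0.51

def blur_fraction(f_number, sensor_height_mm):
    airy_diameter_um = 2.44 * WAVELENGTH_UM * f_number
    return airy_diameter_um / (sensor_height_mm * 1000.0)

print(f"M6 II at f/8:  {blur_fraction(8, 14.9):.2e}")   # ~6.7e-04 of frame height
print(f"RP at f/13:    {blur_fraction(13, 24.0):.2e}")  # ~6.7e-04 of frame height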

But in any case, one needs to think about stopping down - it is analogous to squinting one's eyes, and how you see less sharpness when you squint too much...

Imagine the hair on a model during a portrait shoot.  The lens used is irrelevant.  If you had a one pixel camera, everything would just be solid gray.  Go up to a 100 pixel camera and you can start to discern facial features.  Go all of the way up to 32 million pixels in the M6 II, and you can count the individual eyelashes.  All of those eyelashes were still in existence with the lower pixel count cameras, but those cameras lacked the ability to resolve them.

Diffraction works the same way.  Diffraction is present at every aperture with every lens.  As you stop down, the diffraction pattern gets wider.  As sensors of a given size get more pixels, the size of those pixels gets smaller.  DLA is simply the point where pixels of a given size can finally resolve the diffraction.  Diffraction is still occurring at brighter apertures than the DLA, but the pixels are too large to see it.
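
Rough numbers for that, using the M6 II's approximate ~3.2 um pixel pitch and an assumed ~0.51 um wavelength, plus the rough rule that the blur becomes visible once the Airy disk spans about two pixels:

# Sketch: Airy-disk diameter vs. pixel pitch on the M6 II at a few apertures.
# Assumptions: ~3.2 um pixel pitch, ~0.51 um wavelength, and the rough rule
# that the blur becomes visible once the disk spans about two pixels.
PITCH_UM = 3.2
WAVELENGTH_UM = 0.51

for n in (1.4, 2.8, 5.6, 8, 11, 16):
    airy_um = 2.44 * WAVELENGTH_UM * n
    note = "sensor starts to resolve it" if airy_um > 2 * PITCH_UM else "hidden below pixel level"
    print(f"f/{n:<4}  Airy ~{airy_um:4.1f} um  ({note})")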

In simplest terms, you can think of your image sensor as a ruler.  As the pixels on your sensor get smaller, you have a ruler with finer gradations.  The object being measured (analogous to the projected image) does not change, but the ruler with finer gradations can more accurately measure the object.
