Do we still need a TC?

Started 9 months ago | Discussions thread
James O'Neill
Veteran MemberPosts: 3,928
Re: Basic error
In reply to paulkienitz, 9 months ago

paulkienitz wrote:

MightyMike wrote:

I'd suggest that many lenses still out resolve even high pixel count sensors by a good margin.

This is getting harder to believe. Today's crop-format sensors have a photosite spacing between 3.5 and 4 microns. I don't think there are a ton of lenses out there that can resolve finer than that. For one thing, the diffraction blur circle at f4 is already over 5 microns in theory, and f2.8 lenses are not usually sharper than f4 ones. In practice it seems possible to extract smaller detail than this, but it's an uphill fight.


"A good photograph is knowing where to stand." -- Ansel

That's not how it works though. 6000 pixels * 4 microns = 24mm ... OK.

But do this thought experiment.

You scan a light beam one pixel wide, with perfect edges that go instantly from white to black, up and down a sensor with no Bayer filter. The light is just bright enough to give maximum exposure level in the pixels it hits.
Your pattern of black and white stripes one pixel wide is reported by the sensor's pixels as 100%, 0%, 100%, 0% ... Or is it? Suppose the alignment of the pattern was off by half a pixel: now the lines would read 50%, 50%, 50%, 50%. The average case is 75%, 25%, 75%, 25%, an MTF of 50%.
The maximum number of line pairs we can record is half the number of pixels (people quote Nyquist here: a cycle is one light/dark pair, and we need two samples per cycle, one to record light and one to record dark).
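This sampling experiment is easy to simulate. Here's a minimal Python sketch (the helper name and the sub-sampling resolution are my own choices, nothing standard): it averages a square wave with a 2-pixel period over each pixel aperture at a given alignment offset.

```python
# Sketch of the thought experiment: a monochrome sensor sampling
# one-pixel-wide black/white stripes at different alignments.

N_SUB = 1000  # sub-samples per pixel aperture (fine enough for this demo)

def pixel_values(phase, n_pixels=8):
    """Average a square wave (period = 2 pixels) over each pixel aperture.
    `phase` is the pattern offset in pixels: 0.0 = perfectly aligned,
    0.5 = shifted by half a pixel."""
    values = []
    for p in range(n_pixels):
        total = 0.0
        for s in range(N_SUB):
            x = p + (s + 0.5) / N_SUB + phase
            total += 1.0 if (x % 2.0) < 1.0 else 0.0  # white half of the cycle
        values.append(total / N_SUB)
    return values
```

With phase 0 the readout alternates 100%, 0%, 100%, 0%; with a half-pixel offset every pixel reads 50% and the stripes vanish, just as described.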

Think of MTF as measuring "how much light strays into the 'dark' pixels". When the answer is "too much" you can't resolve light from dark. If the stripes are 2 pixels wide the average error is still 1/4 pixel, but it's now 1/8 of a stripe, so the light divides 12.5%/87.5% and MTF is 75%. If the stripes are 4 pixels wide the error is 1/16 of a stripe, so the light divides 6.25%/93.75% and MTF is 87.5%, and so on.
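The pattern in those numbers reduces to MTF = 1 - 0.5/w for stripes w pixels wide, taking the 1/4-pixel average misalignment as a fraction of the stripe. A tiny sketch (the function name is mine):

```python
def average_mtf(stripe_width_px):
    """Average-case MTF for stripes `stripe_width_px` pixels wide, assuming
    a mean misalignment of 1/4 pixel.  The 'dark' pixel picks up a fraction
    f = 0.25 / width of the light, the 'bright' pixel loses it, so the
    contrast that survives is (1 - f) - f."""
    f = 0.25 / stripe_width_px
    return (1.0 - f) - f

# 1-pixel stripes give 0.5, 2-pixel give 0.75, 4-pixel give 0.875,
# matching the figures worked out above.
```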

Now imagine that the pattern you are laying down isn't black/white/black/white but black-grey-white-grey-black-grey-white. When you get close to the limit of lens resolution it doesn't go suddenly from perfect black/white to grey mush. Instead the light spread at the transitions becomes such that the grey edges get closer and closer together, until the MTF of the pattern reaches the point where you say "there's still a lighter and a darker, but the lines aren't properly resolved". If the lines are only just resolved by the lens, then when you add the effect of the sensor the result drops below the minimum MTF needed to call it "resolved". If one component massively out-resolves the other, then at close to the maximum resolution of the other its MTF is nearly 100%.

So to get an end product that you can call "resolved", the number of lines has to be inside the maximum resolution of both sensor and lens, so that when you add the edge blur of both together the result is <= the maximum allowable edge blur for that number of lines to count as resolved. Since blur is proportional to 1/resolution, we get this equation I learnt in the film days: 1/output = 1/lens + 1/medium.
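That film-era rule of thumb is trivial to compute with. A small sketch (the function name is mine; units are arbitrary but must match, e.g. line pairs per mm):

```python
def system_resolution(lens_res, medium_res):
    """Combine lens and sensor/film resolution using the film-era
    rule of thumb 1/output = 1/lens + 1/medium."""
    return 1.0 / (1.0 / lens_res + 1.0 / medium_res)

# Two components of 100 lp/mm each combine to only 50 lp/mm;
# even a 400 lp/mm lens on a 100 lp/mm medium yields just 80 lp/mm,
# which is why "the lens out-resolves the sensor" still leaves
# the sensor limiting the final result.
```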

OK

Now put the Bayer filter back on. Now, however you line up your one-pixel-wide beam, the sensor records x% on Red & Green1 and (100-x)% on Blue & Green2. So all your red pixels have the same value, all your blue pixels have the same value, and you can't resolve the lines any more... so your lines must be at least 2 pixels wide. (If you think of your 'white beam' as a red, a green and a blue beam, you have to record a red line, a blue line and a green line, and red and blue pixels appear in only 50% of the rows/columns.)
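To see why each colour channel goes flat, here is a hedged sketch of one-pixel-wide vertical white stripes falling on an RGGB mosaic (the layout convention and helper name are my assumptions):

```python
def bayer_channel_samples(stripe_offset, n_cols=8):
    """One-pixel-wide vertical white stripes on an RGGB mosaic.
    Columns alternate (R,G) and (G,B); `stripe_offset` (0 or 1)
    picks which set of columns the stripes illuminate."""
    lit = lambda col: 1 if col % 2 == stripe_offset else 0
    red  = [lit(c) for c in range(0, n_cols, 2)]  # red sits on even columns
    blue = [lit(c) for c in range(1, n_cols, 2)]  # blue sits on odd columns
    return red, blue

# Whatever the offset, every red sample is identical and every blue sample
# is identical: neither channel records any stripe structure at all.
```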
