How many lenses can match 21MP?

Sorry, I was unclear; what I meant was: for a given lens, if you use a smaller sensor with higher pixel pitch, you might have an overall better result.

Also MTF being a measure of contrast in b+w is (in digital) influenced by demosaicing, and can give different results depending on the RAW converter used. It would be better to use the pure luminance data for any measurement.

Also the AA filter should be discounted, so MTF is practically meaningless as an absolute measurement being relative to photosite pitch/demosaicing/AA.
So if you have more lines in the center, you might have a better MTF
for those, and you don't need the sides and the corners....
To understand this concept you have to let go of this notion of
'lines' and 'resolution'.

MTF shows normalised contrast at a spatial frequency. It's a powerful
analytical technique that allows the performance of separate and
combined elements of an imaging system to be considered both
separately and in combination. Unfortunately, ideas like MTF50
somewhat distort the nature of what is going on, and present data as
if there are hard limits on performance where there are none.

The photozone graphs are a bit of a nightmare as they show SYSTEM MTF50
presented as LENS MTF.
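Since the thread keeps circling this point, here is a minimal numerical sketch of how component MTFs cascade multiplicatively into a system MTF. The Gaussian lens model, the 6.4 µm pitch and the sinc pixel-aperture model are illustrative assumptions, not measurements of any real body or lens.

```python
import numpy as np

f = np.linspace(0.01, 120.0, 2000)            # spatial frequency, lp/mm

# Assumed component responses (illustrative models only):
lens_mtf = np.exp(-(f / 60.0) ** 2)           # Gaussian stand-in for lens blur
pitch_mm = 0.0064                              # 6.4 um photosite pitch
sensor_mtf = np.abs(np.sinc(f * pitch_mm))     # pixel-aperture (sinc) model

# MTFs of cascaded stages multiply; the system is worse than any single stage.
system_mtf = lens_mtf * sensor_mtf

def mtf50(freq, mtf):
    """First frequency at which the MTF drops below 0.5."""
    return freq[np.argmax(mtf < 0.5)]

print(f"lens   MTF50 ~ {mtf50(f, lens_mtf):.0f} lp/mm")
print(f"sensor MTF50 ~ {mtf50(f, sensor_mtf):.0f} lp/mm")
print(f"system MTF50 ~ {mtf50(f, system_mtf):.0f} lp/mm")
```

This is why quoting a single MTF50 for "the lens" from a system measurement is misleading: the lowest number belongs to the whole chain, not the glass.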

--
mumbo jumbo
--
 
If you crop the distortion in the corners you will have 10mps clean
like from an E-410/E-510/E-3
Yeah but I did not only mean the corners. That would be a different discussion. Question is if you took two shots of the same scene with a 1Ds Mark II and a 1Ds Mark III, would you see more detail resolved in the latter shot (at least in the centre), and if so, with which lens? I bet there are some lenses which, if mounted on a 1Ds Mark II, will not resolve any more detail than when mounted on a 5D, but the better lenses can keep up with the increased sensor resolution, at least in the center. Of these, how many - and which ones - are likely to keep up with sensor resolution when it is raised to over 21 megapixels in the 1Ds Mark III?
 
There is not a clear step:
this one yes,
this one no.

All lenses will perform (even a cheap 500/8); you have to decide what your standard is. All lenses will perform better on a 20mp sensor than on a 16mp one of the same size.
 
You can go up to a gigapixel, if you like, and there will always be an improvement.

You can use the same old Zeiss on a Hasselblad 39mp if you prefer, and you will have a better result than on the 1DsIII; that's all.
 
Wonderful explanation, but my question still stands.

Because I am asking for actual numbers:

Does anyone have numbers for bodies and lenses that I can compare to each other and say "This lens outresolves this sensor, but this other sensor is too good for this other lens" ?

Getting a lot of theory in this thread, but no data!

MTF data is great, but no one is giving corresponding sensor "MTF" data in the same terms.
 
You can use any lens on any sensor.

The lens can be lensbaby, 500/8 mirror, EF or Zeiss 6x6.

At the same sensor size, more mps will give a better result: 1DsIII 21mp better than 1DsII 16mp, better than 5D 13mp, with all lenses.
 
If you crop the distortion in the corners you will have 10mps clean
like from an E-410/E-510/E-3
Yeah but I did not only mean the corners. That would be a different
discussion. Question is if you took two shots of the same scene with
a 1Ds Mark II and a 1Ds Mark III, would you see more detail resolved
in the latter shot (at least in the centre), and if so, with which
lens? I bet there are some lenses which, if mounted on a 1Ds Mark II,
will not resolve any more detail than when mounted on a 5D, but the
better lenses can keep up with the increased sensor resolution, at
least in the center. Of these, how many - and which ones - are likely
to keep up with sensor resolution when it is raised to over 21
megapixels in the 1Ds Mark III?
Instead of concentrating on any new beaut digital sensor (with all its weird add-on problems like a Bayer pattern array, microlenses and an AA filter), imagine you are using a perfect sensor, or even a very slow B&W film, which is kinda close.

Now, compare lenses from a pinhole, through a simple meniscus to a doublet, then a Tessar design and so on up (via something like Alpa lenses and their apochromatic CA correction) to a non refractive imager such as a paraboloidal mirror (on centre only) or some catadioptric system such as a Schmidt Cassegrain.

You will see steadily increasing resolution and a lessening of optically induced defects such as chromatic aberration, spherical aberration, coma, ghosting (lens coatings and element numbers come in here), plus better overall contrast, yada yada... until you reach Dawes' limit for the resolving power of a perfect optical system of that aperture (at least on axis), which is set by physics, i.e. the wave nature of light at the frequencies we are using.
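For the on-axis physical ceiling mentioned above, the incoherent diffraction MTF of an ideal lens falls to zero at 1/(λN). A quick sketch; 550 nm (green) is an assumed wavelength:

```python
wavelength_mm = 550e-6                      # 550 nm, assumed mid-visible light

def diffraction_cutoff_lpmm(f_number, lam_mm=wavelength_mm):
    """Frequency (lp/mm) where a perfect lens's incoherent MTF reaches zero."""
    return 1.0 / (lam_mm * f_number)

for n in (2.8, 8.0, 22.0):
    print(f"f/{n:g}: cutoff ~ {diffraction_cutoff_lpmm(n):.0f} lp/mm")
```

Even a perfect f/8 lens carries no contrast at all past roughly 227 lp/mm; real, aberrated lenses run out much earlier.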

What I'm getting at here is that a lens that behaves well using film will (angular problems with microlenses aside) also perform well on a sensor. The characteristics of any particular lens should be similar using digital, no matter what its pixel count is (above a certain threshold in practice... maybe 3-5 megapixels).

An old third-party prime, say, will still be visibly lower in contrast and resolution compared to a Summicron or Planar. Same goes for older and/or cheaper zooms.

So one can guess pretty well just what all those optical systems will do with either film, or a 5, 10, 20 or whatever megapixel sensor.

So, just judge a lens's ability with a near perfect sensor (slow B&W film) and you have your answer. If it stinks with film it will stink with digital too. If it performs very well with film, a 20 megapixel full frame 35mm sensor with an AA filter will still be a doddle for any decent lens, resolution wise.
 
You could always try LEARNING. The http://www is full of information on how to interpret MTF graphs, why does someone have to hold your hand?

-------------------------------------------------------------------------------------------------
This is an arrogant response!
I wonder how many MTF grafts you made for yourself.

How many other peoples grafts have you poured over, because you were too lazy or ignorant to do for yourself?
 
This may be an ESL issue (not trying to be funny) but this error would impede any searching you do for answers.

Graft
http://www.wordreference.com/definition/graft

Graph
http://www.wordreference.com/definition/graph
You could always try LEARNING. The http://www is full of information on how
to interpret MTF graphs, why does someone have to hold your hand?

-------------------------------------------------------------------------------------------------
This is an arrogant response!
I wonder how many MTF grafts you made for yourself.
How many other peoples grafts have you poured over, because you were
too lazy or ignorant to do for yourself?
 
You could always try LEARNING. The http://www is full of information on how
to interpret MTF graphs, why does someone have to hold your hand?

-------------------------------------------------------------------------------------------------
This is an arrogant response!
OK! I love exclamation marks!
I wonder how many MTF grafts you made for yourself.
How many other peoples grafts have you poured over, because you were
too lazy or ignorant to do for yourself?
What? Why would I want to make MTF measurements? I have done EXACTLY this at university, but that was part of my degree and I have no use for making such things these days.

--
--
mumbo jumbo
 
Sorry, I was unclear; what I meant was: for a given lens, if you use a
smaller sensor with higher pixel pitch, you might have an overall
better result.
Different, not necessarily better.
Also MTF being a measure of contrast in b+w is (in digital)
influenced by demosaicing, and can give different results depending
on the RAW converter used. It would be better to use the pure
luminance data for any measurement.
...how would you derive this pure luma data? By demosaicing, of course.
Also the AA filter should be discounted, so MTF is practically
meaningless as an absolute measurement being relative to photosite
pitch/demosaicing/AA.
Absolutely not. The AA filter is an integral part of SYSTEM MTF.

The lens has a (range of) characteristic MTF(s), the sensor has a characteristic MTF, and the image processing chain - while complex - has one likewise. If you measure them as a system, you cannot disentangle the results.
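A tiny numerical illustration of why the disentangling fails in practice: dividing a measured system MTF by an assumed sensor MTF is ill-conditioned wherever the sensor response is small, so a fixed measurement error blows up at high frequencies. All curves are toy models (Gaussian lens, sinc pixel aperture, 6.4 µm pitch), not real data.

```python
import numpy as np

pitch_mm = 0.0064                              # assumed 6.4 um photosite pitch
f = np.array([20.0, 80.0, 150.0])              # probe frequencies, lp/mm

lens_true = np.exp(-(f / 70.0) ** 2)           # toy lens MTF
sensor_true = np.abs(np.sinc(f * pitch_mm))    # toy sensor MTF
system_meas = lens_true * sensor_true + 0.002  # same small error at each point

# Naive "deconvolution": divide the system measurement by the sensor model.
lens_recovered = system_meas / sensor_true
rel_err = np.abs(lens_recovered - lens_true) / lens_true

for fi, e in zip(f, rel_err):
    print(f"{fi:5.0f} lp/mm: recovered lens MTF off by {e:.0%}")
```

The same 0.002 error is negligible at low frequencies and catastrophic near the sensor's response floor.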
So if you have more lines in the center, you might have a better MTF
for those, and you don't need the sides and the corners....
To understand this concept you have to let go of this notion of
'lines' and 'resolution'.

MTF shows normalised contrast at a spatial frequency. It's a powerful
analytical technique that allows the performance of separate and
combined elements of an imaging system to be considered both
separately and in combination. Unfortunately, ideas like MTF50
somewhat distort the nature of what is going on, and present data as
if there are hard limits on performance where there are none.

The photozone graphs are a bit of a nightmare as they show SYSTEM MTF50
presented as LENS MTF.
--
mumbo jumbo
 
Wonderful explanation, but my question still stands.

Because I am asking for actual numbers:
They don't exist; you need a tighter definition.
Does anyone have numbers for bodies and lenses that I can compare
to each other and say "This lens outresolves this sensor, but this
other sensor is too good for this other lens" ?
LENSES DON'T WORK LIKE THAT.
Getting a lot of theory in this thread, but no data!
This is why it's important to understand what MTF tells you, because there's no such number without extra qualification.
MTF data is great, but no one is giving corresponding sensor "MTF"
data in the same terms.
Good point, but there might be some nonlinear analogue or digital processing on-chip that REALLY screws up such an analysis.

--
mumbo jumbo
 
Sorry, I was unclear; what I meant was: for a given lens, if you use a
smaller sensor with higher pixel pitch, you might have an overall
better result.
Different, not necessarily better.
Also MTF being a measure of contrast in b+w is (in digital)
influenced by demosaicing, and can give different results depending
on the RAW converter used. It would be better to use the pure
luminance data for any measurement.
...how would you derive this pure luma data? By demosaicing, of course.
From some cameras (E-10) it was possible to obtain an uncompressed raw made of 3 images, one for each of R, G and B.
Also the AA filter should be discounted, so MTF is practically
meaningless as an absolute measurement being relative to photosite
pitch/demosaicing/AA.
Absolutely not. The AA filter is an integral part of SYSTEM MTF.

The lens has a (range of) characteristic MTF(s), the sensor has a
characteristic MTF, and the image processing chain - while complex -
has one likewise. If you measure them as a system, you cannot
disentangle the results.
Exactly, each step has its own low-pass filter.
So if you have more lines in the center, you might have a better MTF
for those, and you don't need the sides and the corners....
To understand this concept you have to let go of this notion of
'lines' and 'resolution'.

MTF shows normalised contrast at a spatial frequency. It's a powerful
analytical technique that allows the performance of separate and
combined elements of an imaging system to be considered both
separately and in combination. Unfortunately, ideas like MTF50
somewhat distort the nature of what is going on, and present data as
if there are hard limits on performance where there are none.

The photozone graphs are a bit of a nightmare as they show SYSTEM MTF50
presented as LENS MTF.
--
mumbo jumbo
--
 
Also MTF being a measure of contrast in b+w is (in digital)
influenced by demosaicing, and can give different results depending
on the RAW converter used. It would be better to use the pure
luminance data for any measurement.
...how would you derive this pure luma data? By demosaicing, of course.
From some cameras (E-10) it was possible to obtain an uncompressed
raw made of 3 images, one for each of R, G and B.
You would still need to combine those data sets to produce your luma response. How you combine is not obvious without knowing the sensitivity characteristics of the sensor.
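A sketch of what "combining" means here. The Rec. 709 luma weights below are a common broadcast convention used as a stand-in; for a real MTF measurement the proper weights would come from the sensor's own spectral sensitivities, which is exactly the missing information.

```python
import numpy as np

REC709 = (0.2126, 0.7152, 0.0722)   # assumed stand-in weights, not sensor data

def luma(r, g, b, weights=REC709):
    """Weighted combination of three colour planes into one luminance plane."""
    wr, wg, wb = weights
    return wr * r + wg * g + wb * b

r = np.full((2, 2), 0.8)
g = np.full((2, 2), 0.5)
b = np.full((2, 2), 0.2)
print(luma(r, g, b))                # one luma plane from three RGB planes
```

Pick a different weight set and every "pure luminance" MTF number shifts, which is the point being made above.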

--
--
mumbo jumbo
 
In any case, I agree that we are talking about a system response.

Another fact to be considered is that the advantage of having 21mp vs 16mp might be clear if the camera is fixed on a tripod, but if handheld, there will be motion blur.

If resized to the same print size, the 21mp will still hold an advantage over the 16mp, but if a reviewer compares both of them at 100%, the 21mp image might show more motion blur, because at the same angular velocity the image shifts from photosite to photosite earlier.

So a superficial reviewer, looking at the 100% image, might not realize the improvement.

At the end of the day, the final image size and use is part of the system.
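The photosite-shift point above in rough numbers. The focal length, shake rate and shutter speed are invented for illustration, and the ~7.2 µm / ~6.4 µm pitches are approximate figures for 16mp and 21mp full-frame sensors.

```python
import math

focal_mm = 50.0                 # assumed lens
shake_deg_per_s = 0.5           # assumed angular shake while handheld
shutter_s = 1.0 / 60.0          # assumed shutter speed

# Blur streak length on the image plane (small-angle approximation), in um:
blur_um = focal_mm * math.radians(shake_deg_per_s) * shutter_s * 1000.0

for body, pitch_um in (("16mp body (~7.2 um pitch)", 7.2),
                       ("21mp body (~6.4 um pitch)", 6.4)):
    print(f"{body}: blur spans ~{blur_um / pitch_um:.2f} photosites")
```

Same physical blur, more photosites crossed on the finer pitch: hence worse-looking 100% crops from the higher-resolution sensor, and an unchanged print at equal size.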
 
I'm glad there are people left here who have already clarified the issue. This "the great sensors of today are outresolving lenses" business is such nonsense, incredible. And all the internet gurus like Reichmann chime in.

The sensor or film makes an "analog" (optical) copy of the lens' aerial image. I'm sure most people here still remember analog music cassettes. Is it worth copying a bad tape with a good recorder? Yes, because otherwise it will get much worse yet another time.

So why is everyone these days thinking it's suddenly possible to make an identical copy optically/magnetically/chemically/...?? Did the laws of physics change?

Good lenses have much higher resolution than any sensor and most film. They must, because something copied with its own resolution ends up at roughly HALF that resolution.

This is also why digital can beat film in terms of resolution in practical use in spite of even normal color film having a higher resolution than any sensor used in today's cameras: one step of generation loss (enlargement or scanning) is missing. When you do some calculations with the formula found here:
http://www.stockphotoonline.com/C01_NotesOnPhotography/DigitalPhotography.htm

you will also notice something interesting: even drum scanners (11000 DPI and more) introduce too much resolution loss to surpass current sensors with the end result.

AFAIK, good lenses usually have 200-300 lp/mm at reasonable contrast values in the center, which is still 4x higher than current sensors (=system resolution > 80% of sensor resolution).
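That 80% figure can be checked with the usual reciprocal rule for combining resolutions (the same family of formula as on the page linked above): 1/r_system = 1/r_lens + 1/r_sensor. The 60 lp/mm sensor figure is just an illustrative placeholder.

```python
def system_res(r_lens, r_sensor):
    """Combined resolution by the reciprocal (1/r_sys = 1/r1 + 1/r2) rule."""
    return 1.0 / (1.0 / r_lens + 1.0 / r_sensor)

r_sensor = 60.0                  # lp/mm, illustrative sensor figure
r_lens = 4.0 * r_sensor          # "4x higher", per the claim above
ratio = system_res(r_lens, r_sensor) / r_sensor
print(f"system resolution = {ratio:.0%} of sensor resolution")
```

With the lens 4x better than the sensor, the system lands at exactly 4/5 of the sensor's resolution, matching the >80% claim.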
 
I know how to interpret MTF charts. And those are all over the place.

No one in this thread provides examples or discussion of actual SENSOR mtf.

This is what I am asking for.
 
maximum sensor MTF:
vertical photosites x horizontal photosites

but:
how strong is the AA filter?
how is demosaicing done in Bayer?
is the b+w image demosaiced, or corrected RGB luminance?
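The only hard ceiling in that list is the Nyquist limit set by photosite pitch; a minimal sketch, with pitches that are approximate figures for these bodies:

```python
def nyquist_lpmm(pitch_um):
    """Nyquist limit in line pairs per mm for a given photosite pitch."""
    return 1000.0 / (2.0 * pitch_um)

for body, pitch_um in (("1Ds Mark II  (~7.2 um)", 7.2),
                       ("1Ds Mark III (~6.4 um)", 6.4)):
    print(f"{body}: Nyquist ~ {nyquist_lpmm(pitch_um):.0f} lp/mm")
```

The AA filter and demosaicing then pull the usable limit below these numbers, which is exactly why "sensor MTF" needs all three qualifications in the list above.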
 
Match is a relative term. I believe the MTF of CMOS or CCD sensors will be the limiting factor in determining the image quality of an imaging system. I have worked with solid-state imaging sensors in medical imaging costing hundreds of thousands of dollars. Although they have replaced film for many applications, film is still the gold standard, and the MTFs of good (a relative term) lenses still exceed the MTFs of film.
 
