The Myth of Equivalent Aperture and Other Overly Simplistic Camera Formulae

I would swear this is the same thread as the heated front-page discussion some months ago, with the same amateur nuclear scientists' points and counterpoints.

Or maybe it is the Matrix having a hiccup.
This same topic has appeared repeatedly on all the forums for the last 2-3 years. It seems there are always people who can't grasp this simple concept. Worse yet, some of them always want to start a big debate based on that ignorance, and amazingly they never learn, so the arguments continue in perpetuity.

Equivalence seems to hit some deep psychological issues, and that's the reason for the ever-continuing resistance on the part of some people.
I think you're over-analyzing it. Equivalence die-hards are just wedded to FF superiority. It's not that complicated.
It's not that one format is superior to another; it's that if the output image is all that matters, then you need equivalence to get it. A 35/1,4 lens on APS-C simply won't give the same output as a 50/1,4 on FF.

1. The DOF is different at the widest apertures.

2. Even when stopped down to match the 35/1,4 on APS-C, the FF lens occludes part of the light with its aperture blades, so the OOF areas will be affected.

Equivalent lenses on equivalent systems give equivalent output. But certain equivalents that exist for larger sensors simply don't exist for smaller ones. It's not about one being better than another; it's about understanding the stipulations.

Then, when that is understood, you can, if you want to, argue about noise, and dynamic range, etc.

Finally, after understanding that calling a 35/1,4 on APS-C equal to a FF 50/1,4 isn't correct, you need to realise that there are NO inherent size advantages to APS-C for camera bodies that could fit FF sensors. The size advantages for APS-C currently exist because certain APS-C cameras, such as Sony's and Fujifilm's, compete against dSLRs that are horribly bloated in size; they are 2x-3x the volume of their film-era analogues from the same companies.

But equivalence exists. It exists when looking at FF from medium format, and the same goes for LF. It always has and always will. And it is the only way of looking levelly across different systems, unless the aim is to compare the systems AND the limitations imposed by the size of their sensors.
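For anyone who wants the conversion arithmetic spelled out, here is a minimal Python sketch (the function name and the 1.5x crop factor are illustrative assumptions, not anything official):

def ff_equivalent(focal_mm, f_number, iso, crop=1.5):
    # Same field of view, same DOF (same entrance pupil), same total light,
    # same output brightness at the same shutter speed
    return focal_mm * crop, f_number * crop, iso * crop ** 2

# APS-C 35/1,4 at ISO 100 is roughly a 52.5/2,1 at ISO 225 on FF
print(ff_equivalent(35, 1.4, 100))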
 
cesjr wrote:

Dude, you're hardly on a roll. You're dug in, and your arguments are not convincing a lot of folks. You can blame that on them, sure. But maybe it's your arguments? Shudder, can't be, I guess
You seem like the kind of "dude" who would benefit from watching this video, seeing as you seem to totally ignore Fraulain's excellent explanations (which are much better written and scientifically backed up than my own).


You could also read this but I suspect it isn't going to "take."

http://www.josephjamesphotography.com/equivalence/

The main thing to take away from that website is this statement: "The same total light will result in the same noise for equally efficient sensors (regardless of pixel count and regardless of the ISO setting)."
I read that long-winded piece and watched that video. It's just the same old "FF has more light" argument, which is just plain stupid. Sure, there's more light coming in, but it's just feeding more pixels and more resolution. The same amount of light is falling on each pixel. I cannot believe people are even sucked into this argument. It makes no sense whatsoever.
The light IS the information. The more you get, the more signal you get to analyse vs. the noise, and therefore the more accurate the result (the stronger the signal is compared to the noise).
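To make the "more light means a stronger signal vs. the noise" point concrete, here is a tiny Python simulation, a sketch only, assuming pure Poisson shot noise and made-up photon counts:

import numpy as np

rng = np.random.default_rng(0)

def snr(mean_photons, trials=100_000):
    # Photon arrivals are Poisson-distributed; SNR = mean / standard deviation
    counts = rng.poisson(mean_photons, trials)
    return counts.mean() / counts.std()

print(round(snr(1_000), 1))   # ~31.6, i.e. sqrt(1000)
print(round(snr(4_000), 1))   # ~63.2: four times the light, twice the SNR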

Questions for you. Please answer each one individually.

1. Do you think the people who have posted here who are obviously more scientifically educated than you and me (by their explanations and links) regarding light and physics know less than you and me with regard to sensor signal-to-noise ratio (I'm referring to the posts of Fraulain and Usee)?

2. Are you one of those people who feels there is no advantage of FF over APS-C, but thinks that the m43 format is hopelessly noisy?

3. Do you think there is no advantage whatsoever of a larger sensor over a smaller one? Do you think a cell phone sensor is as good as an APS-C sensor?

4. Since all the FF sensors out there right now outperform all the APS-C sensors in low light, do you think there is a conspiracy among sensor manufacturers to artificially hold back APS-C sensors and make them worse than FF sensors?
 
There is also the problem of Fuji's APS-C sensor pixels having to collect more electrons per unit area for a similar total quantity of light, which is another place to get tripped up.
I don't understand the above.
OK. Let me see if I can explain my logic with a couple thoughts regarding sensors and lenses and you can correct my missteps.

I understand that if you laid two sensors down on the ground, with no lens, the larger sensor would gather more light and have a better signal-to-noise ratio. In this case they would both be getting the same amount of light per pixel, but one would just have more photosites. The signal to noise would be better for the larger sensor.

However, we put sensors behind lenses in an attempt to capture a specific field of view.

So, if we take two sensors, A with 4 times the area of B, and put them behind two lenses that concentrate the same total amount of light on both, wouldn't each pixel on the smaller sensor (assuming equal pixel counts) get 4 times the amount of light it would on the larger sensor, and therefore wouldn't it have to store 4 times the charge? Wouldn't the limit on how much charge a pixel can store cause the smaller sensor to saturate faster and therefore have less dynamic range?

I know we've been talking about signal-to-noise ratio, but people also care about the dynamic range of the sensor, which is affected by how bright an image the sensor can still measure before saturating, and which is therefore hugely dependent on sensor size.
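A toy model of that saturation argument, with invented per-pixel numbers (no real sensor is this simple, so treat it as a sketch):

import math

full_well_e = 30_000   # hypothetical electrons a pixel can hold before clipping
read_noise_e = 3.0     # hypothetical read-noise floor in electrons

# Engineering dynamic range in stops: log2(full well / read noise)
print(round(math.log2(full_well_e / read_noise_e), 1))   # ~13.3 stops

# Concentrating the same total light onto 1/4 the area gives each
# (equal-count) pixel 4x the electrons, so it clips 2 stops sooner
print(math.log2(4))   # 2.0 stops of highlight headroom lost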
 
Jeez. Noise does not increase from cropping. All cropping does is cut down the FOV. The same noise is there. I can't believe anyone is really arguing this. It's absurd.
But that same noise is magnified, so you see more of it. That is why people shouldn't pixel-peep at images they will never view at that size. Look at the D800 images I posted of the books. The FF D800 image, downsized to the same size as the D700 image, has less apparent noise. It has less apparent noise because the random noise has been averaged down with the downsizing.
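The "random noise has been averaged down" effect is easy to verify numerically. A sketch with synthetic data, where a flat grey frame plus Gaussian noise stands in for a real photo:

import numpy as np

rng = np.random.default_rng(1)
img = 100 + rng.normal(0, 10, size=(1024, 1024))   # flat grey, noise sigma = 10

# Downsize 2x per side by averaging 2x2 blocks
small = img.reshape(512, 2, 512, 2).mean(axis=(1, 3))

print(round(img.std(), 1))     # ~10.0
print(round(small.std(), 1))   # ~5.0: averaging 4 pixels halves the noise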
 
I have a question: if we just cover the FF sensor inside the camera down to APS-C size, will the sensor performance drop to APS-C sensor performance?
Yes. It doesn't matter how you crop; cropping in the computer after the exposure has the same effect.

It is (almost) all about how much light is collected. If you crop, you throw away part of the light.
So, basically, if I print out a FF picture (say 30x20) and I then physically CUT 2 inches off each border, what remains is now worse because I threw away part of the light?

are you kidding me?
No. You misunderstood me. Maybe I should have been clearer.

If we consider the signal-to-noise ratio, then cropping reduces it. If you crop away half of the image, your SNR is reduced by a factor of 1.414.

I didn't mean that if I crop part of the image out, the rest is somehow also influenced. That would be a silly idea :)

What I meant is simply this:

You take two images with your camera (the same camera) - one image with a 50mm lens, and another with a 100mm lens. Now if you crop the image taken with the 50mm lens to match the field of view of the 100mm lens, the output image has about a stop lower signal-to-noise ratio than the image taken with the 100mm lens. This is true regardless of how the crop is performed, whether on the sensor (in this case a Four Thirds crop) or in post-processing.
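The arithmetic behind the 50mm-vs-100mm example, as a sketch assuming pure shot noise (so whole-image SNR scales with the square root of the total light kept):

import math

def crop_snr_ratio(focal_before, focal_after):
    crop = focal_after / focal_before     # linear crop to match the tighter FOV
    light_kept = 1 / crop ** 2            # area, and therefore total light, retained
    return light_kept, math.sqrt(light_kept)

# 50mm cropped to the 100mm field of view: 1/4 the light, SNR halved
print(crop_snr_ratio(50, 100))   # (0.25, 0.5)

Halving the SNR is the "about a stop lower" in the post above, if you count a doubling of SNR as one stop.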
Here, I did this very experiment, shot using a constant light source:

[Attached image: 014_8827 as Smart Object-1.jpg]

[Attached image: 014_8826 as Smart Object-1.jpg]


If there were no SNR difference when cropping an image, then there would be no way one could shoot at an F-stop that restricted the intensity of light by 2 stops and still get the same visual amount of noise between the 2 images.

--
The Camera is only a tool, photography is deciding how to use it.
The hardest part about capturing wildlife is not the photographing portion; it’s getting them to sign a model release
Is this supposed to show something? I don't see a difference, but admittedly these are low-res JPEGs.
And understanding why you don't see a difference is the purpose of this test. One photo is taken with a light intensity 2 stops lower, while they both show the same amount of noise. If cropping had no effect on SNR, then this test would have shown otherwise.

Yeah, I'd like some proof of the exposures and some better way to compare the noise. Are these even high-ISO shots?
All the information is there

One image is shot at ISO 800 and the other at an ISO 4 times higher, ISO 3200.

I don't think it's a very scientific noise comparison. Plus, what you need to show is an actual degradation from cropping. That's your claim (which is quite silly, really).




When viewed at the same output size, the cropped image will show more noise, as you are enlarging the noise along with the rest of the image.

These are one image; one version is cropped and the other is not, and both are viewed at the same output resolution. I have used an out-of-focus blue wall to eliminate FOV differences so we can just deal with SNR.

Or let's break it down mathematically: what is going to show more noise, 1 data point (pixel) with the same SNR among 36,000,000 data points, or 1 data point (pixel) among 16,000,000 data points? <--- This would be a cropped image with a crop factor of 1.5.
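One way to put numbers on "viewed at the same output size", using an idealized resample-as-averaging model (real resamplers differ, but the square-root behaviour is the same; the sigma is arbitrary):

import math

def output_noise(src_pixels, out_pixels, per_pixel_sigma=10.0):
    # Each output pixel averages src/out source pixels; noise shrinks by the sqrt of that
    return per_pixel_sigma / math.sqrt(src_pixels / out_pixels)

print(round(output_noise(36_000_000, 4_000_000), 2))   # uncropped 36MP frame -> ~3.33
print(round(output_noise(16_000_000, 4_000_000), 2))   # 1.5x crop (16MP) -> 5.0, visibly noisier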

So is one of these at higher magnification?
 
cesjr wrote:

Dude, you're hardly on a roll. You're dug in, and your arguments are not convincing a lot of folks. You can blame that on them, sure. But maybe it's your arguments? Shudder, can't be, I guess
You seem like the kind of "dude" who would benefit from watching this video, seeing as you seem to totally ignore Fraulain's excellent explanations (which are much better written and scientifically backed up than my own).


You could also read this but I suspect it isn't going to "take."

http://www.josephjamesphotography.com/equivalence/

The main thing to take away from that website is this statement: "The same total light will result in the same noise for equally efficient sensors (regardless of pixel count and regardless of the ISO setting)."
I read that long-winded piece and watched that video. It's just the same old "FF has more light" argument, which is just plain stupid. Sure, there's more light coming in, but it's just feeding more pixels and more resolution. The same amount of light is falling on each pixel. I cannot believe people are even sucked into this argument. It makes no sense whatsoever.
The light IS the information. The more you get, the more signal you get to analyse vs. the noise, and therefore the more accurate the result (the stronger the signal is compared to the noise).

Questions for you. Please answer each one individually.

1. Do you think the people who have posted here who are obviously more scientifically educated than you and me (by their explanations and links) regarding light and physics know less than you and me with regard to sensor signal-to-noise ratio (I'm referring to the posts of Fraulain and Usee)?

2. Are you one of those people who feels there is no advantage of FF over APS-C, but thinks that the m43 format is hopelessly noisy?

3. Do you think there is no advantage whatsoever of a larger sensor over a smaller one? Do you think a cell phone sensor is as good as an APS-C sensor?

4. Since all the FF sensors out there right now outperform all the APS-C sensors in low light, do you think there is a conspiracy among sensor manufacturers to artificially hold back APS-C sensors and make them worse than FF sensors?
1. It's a big stretch to argue that cropping increases noise in the uncropped portion. Take a picture full frame using a Nikon. Then put the camera in DX mode. The portions of the image that overlap have the same amount of noise. You can't make it noisier by cutting out part of the frame; it's the same pixels capturing the light. And no, you can't start blowing up the smaller photo to the same size as the FF. You're increasing magnification, which will always degrade image quality. Yes, FF has more resolution and FOV; I'm not disputing that, or the resulting image-quality increase over a crop frame blown up to the same physical size.

2. No, I think FF cameras today are generally less noisy than APS-C, and the same for APS-C vs m43, but it's not inherent in the frame size. It's a function of lower-noise pixels, and currently it's much more practical to get those on a bigger sensor if you don't want to drive down the megapixels too much (as necessary to sell the camera). But this could change with improved sensors in the future.

3. See the answer to question 2. Unless you want a super-low-res cell phone camera that nobody will buy, you're going to have to pack in more, smaller pixels and take a hit at higher ISO.

4. It's not a conspiracy - there's a real trade-off between resolution and high-ISO performance with current sensor tech. But it's also possible that APS-C sensors are held back by cost concerns. I do think Canon has held back APS-C. Sony has an incentive to do the same, to sell more FF at a higher price. But it's mainly a technical trade-off.
 
As during the Christmas Truce of 1914 in WWI, when the British and German soldiers temporarily put down their weapons to play football and sing carols, someone should call a day-long truce on this thread, and all involved should spend the day taking the best photographs possible with whatever camera they have, no matter the sensor type or size, and just enjoy the act of taking pictures, which, I believe, is why we all got into this thing called photography in the first place.

This thread has legs. I'm sure it'll still be around after a day.
 
I'm just curious whether increasing the number of sensor pixels will improve the SNR (nothing to do with the size of the sensor). So, vary the number of pixels only and fix all other settings and parameters to see any difference. The best approach is to use one camera system and one lens only; the only thing varied is the number of sensor pixels. We just cover the FF sensor down to, say, 1" size first and take a photo. Then increase the number of active sensor pixels by widening the opening in the cover and take another photo. Repeat until the cover is fully open (FF size). Now crop the full-size image down to the size of the image taken by the covered sensor. Is that clear?
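That experiment is awkward to do physically, but under a shot-noise-only assumption it can be simulated in a few lines (a sketch; the photon count per pixel is invented):

import numpy as np

rng = np.random.default_rng(2)

def total_snr(n_pixels, photons_per_pixel=1_000):
    # Evenly lit frame: whole-image SNR grows with the total light collected
    total = rng.poisson(photons_per_pixel, n_pixels).sum()
    return total / np.sqrt(total)

# Progressively uncovering the sensor: more pixels, same light per pixel
for n in (4_000_000, 9_000_000, 16_000_000):
    print(n, round(total_snr(n)))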
 
There is also the problem of Fuji's APS-C sensor pixels having to collect more electrons per unit area for a similar total quantity of light, which is another place to get tripped up.
I don't understand the above.
OK. Let me see if I can explain my logic with a couple thoughts regarding sensors and lenses and you can correct my missteps.

I understand that if you laid two sensors down on the ground, with no lens, the larger sensor would gather more light and have a better signal-to-noise ratio. In this case they would both be getting the same amount of light per pixel, but one would just have more photosites. The signal to noise would be better for the larger sensor.
Well, I'd rather use a lens, stopped down well, and a defocused, evenly lit, flat, detail-less subject, to minimize sources of problems.
However, we put sensors behind lenses in an attempt to capture a specific field of view.
Sure - we would use different lenses for different formats to get the same field of view.

But this is quite irrelevant to the topic, including what you wrote below :) Even in the situation you described above, what you wrote below would be just as true.
So, if we take two sensors, A with 4 times the area of B, and put them behind two lenses that concentrate the same total amount of light on both, wouldn't each pixel on the smaller sensor (assuming equal pixel counts) get 4 times the amount of light it would on the larger sensor, and therefore wouldn't it have to store 4 times the charge?
Yes.
Wouldn't the limit on how much charge a pixel can store cause the smaller sensor to saturate faster and therefore have less dynamic range?
Yes.
I know we've been talking about signal-to-noise ratio, but people also care about the dynamic range of the sensor, which is affected by how bright an image the sensor can still measure before saturating, and which is therefore hugely dependent on sensor size.
What you said is right.

Both SNR and DR are influenced by the total amount of light the sensor can collect without saturating.

For idealized sensors, if you double the sensor size, you increase the SNR by a factor of 1.414 and the DR by a stop.
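Both numbers fall out of the square-root law; a one-function sketch, with the APS-C crop factor added for comparison:

import math

def idealized_gain(area_ratio):
    # Returns (SNR multiplier, DR gain in stops) for an idealized sensor
    return math.sqrt(area_ratio), math.log2(area_ratio)

print(idealized_gain(2.0))       # doubled area: (1.414..., 1.0 stop)
print(idealized_gain(1.5 ** 2))  # FF vs APS-C: (1.5, ~1.17 stops)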
 
I have a question: if we just cover the FF sensor inside the camera down to APS-C size, will the sensor performance drop to APS-C sensor performance?
Yes. It doesn't matter how you crop; cropping in the computer after the exposure has the same effect.

It is (almost) all about how much light is collected. If you crop, you throw away part of the light.
So, basically, if I print out a FF picture (say 30x20) and I then physically CUT 2 inches off each border, what remains is now worse because I threw away part of the light?

are you kidding me?
No. You misunderstood me. Maybe I should have been clearer.

If we consider the signal-to-noise ratio, then cropping reduces it. If you crop away half of the image, your SNR is reduced by a factor of 1.414.

I didn't mean that if I crop part of the image out, the rest is somehow also influenced. That would be a silly idea :)

What I meant is simply this:

You take two images with your camera (the same camera) - one image with a 50mm lens, and another with a 100mm lens. Now if you crop the image taken with the 50mm lens to match the field of view of the 100mm lens, the output image has about a stop lower signal-to-noise ratio than the image taken with the 100mm lens. This is true regardless of how the crop is performed, whether on the sensor (in this case a Four Thirds crop) or in post-processing.
To achieve this in post-processing you are doing TWO things: you are cropping AND resizing the image.

If you just crop, there is no way the original pixels will be altered.
 
Both SNR and DR are influenced by the total amount of light the sensor can collect without saturating.

For idealized sensors, if you double the sensor size, you increase the SNR by a factor of 1.414 and the DR by a stop.
Finally we get to a pure number I can understand.

This counters one thing people are saying about "equivalence."

Almost every website that talks about equivalence says you can put a faster lens on a smaller sensor so that you get the same total amount of light on the image circle, and then the two will be equivalent from a noise standpoint.

But they still won't be equivalent, because the smaller sensor will get the same amount of light per unit time but still capture less light, because its exposure will necessarily be shorter (half as long, in fact).

So an APS-C-sized sensor can never hope to perform as well as a FF sensor, even when "equivalent" lenses are placed in front of them (everything else being equal).

Is this logic correct?
 
Both SNR and DR are influenced by the total amount of light the sensor can collect without saturating.

For idealized sensors, if you double the sensor size, you increase the SNR by a factor of 1.414 and the DR by a stop.
Finally we get to a pure number I can understand.

This counters one thing people are saying about "equivalence."

Almost every website that talks about equivalence says you can put a faster lens on a smaller sensor so that you get the same total amount of light on the image circle, and then the two will be equivalent from a noise standpoint.

But they still won't be equivalent, because the smaller sensor will get the same amount of light per unit time but still capture less light, because its exposure will necessarily be shorter (half as long, in fact).

So an APS-C-sized sensor can never hope to perform as well as a FF sensor, even when "equivalent" lenses are placed in front of them (everything else being equal).

Is this logic correct?
Unfortunately, no. It is only correct when comparing the same intensity of light hitting a sensor. That is why equivalence is important. If you have the same 35/1,4 lens on both sensors, the FF sensor will accept more light and receive all the advantages that light brings.

And when equivalence is not taken into account, that same 35/1,4 will capture a much larger view on FF than on APS-C. Of course, it gives a different FOV and DOF, as well as a different total amount of light, than the smaller APS-C setup.

But when used with equivalence in mind, the differences will be minimised by the same ratios. Things never 100% even out, but theoretically they could. A 53/F2,1 lens could be released for FF, which would rival the APS-C 35/1,4. Given the same sensor technology (year, builder, silicon, pressing techniques), the two systems would produce the same output. And that's that.

But such equivalents do not exist. F2,0-and-slower FF normal lenses have long since gone away. In their stead are 1,8 normal lenses, which, on the larger sensor, are equivalently faster still.

As long as we accept that APS-C has tradeoffs in a number of areas, we can brag that it is smaller. But as soon as we expect the same output/DOF/FOV, we lose on every front. The maths simply don't support it. Neither does the output.
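The 53/F2,1 claim can be sanity-checked by comparing entrance pupil diameters, since for a matched field of view the pupil drives both DOF and total light; a quick sketch:

def entrance_pupil_mm(focal_mm, f_number):
    return focal_mm / f_number

print(entrance_pupil_mm(35, 1.4))    # APS-C 35/1,4 -> 25.0 mm
print(entrance_pupil_mm(52.5, 2.1))  # FF 52.5/2,1 -> 25.0 mm: same pupil, same DOF, same total light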
 
It doesn't give more intensity of light per equal-sized area of the sensor. That's why a 1.4 lens is a 1.4 lens on any system. Now, whether it projects a big enough image circle is another matter. More light gets through an FF 1.4 lens than through a lens designed for crop at 1.4; the intensity (or exposure value) remains the same. It's just that the extra light in the FF lens is projecting a bigger image circle, so it's wasted information on the smaller sensor.

ISO performance and IQ are getting so good on smaller sensors that FF will probably become a niche product in the not-too-distant future. The main benefit of FF for me is really for bokeh whores ;) and that's probably 95% of us on here. But in reality, APS-C and even m43 deliver enough separation with the right lenses.

If you can't make an excellent print with today's APS-C, then it's really not your sensor that's not working hard enough...
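For what it's worth, both sides of this exchange fit the same arithmetic: the f-number fixes the intensity (light per unit area), while the sensor area decides how much of the image circle is actually kept. A sketch with nominal sensor dimensions:

def total_light(intensity, width_mm, height_mm):
    # Same exposure (intensity), different collecting area
    return intensity * width_mm * height_mm

print(total_light(1.0, 36.0, 24.0))   # FF:    864 units at f/1.4
print(total_light(1.0, 23.6, 15.6))   # APS-C: ~368 units at the same f/1.4 and shutter speed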
 
It doesn't give more intensity of light per equal-sized area of the sensor. That's why a 1.4 lens is a 1.4 lens on any system. Now, whether it projects a big enough image circle is another matter. More light gets through an FF 1.4 lens than through a lens designed for crop at 1.4; the intensity (or exposure value) remains the same. It's just that the extra light in the FF lens is projecting a bigger image circle, so it's wasted information on the smaller sensor.

ISO performance and IQ are getting so good on smaller sensors that FF will probably become a niche product in the not-too-distant future. The main benefit of FF for me is really for bokeh whores ;) and that's probably 95% of us on here. But in reality, APS-C and even m43 deliver enough separation with the right lenses.

If you can't make an excellent print with today's APS-C, then it's really not your sensor that's not working hard enough...
An equivalent lens isn't the same F-stop. It is a full F-stop faster. Therefore it does yield more light per pixel...twice the light.
 
Can you explain how a 35mm 1.8 FF lens is 1.8 regardless of whether it's used on FF or APS-C?

And can you explain how a 35mm 1.8 APS-C-designed lens that does actually fit an FF (like the Nikon) is still a 1.8 lens on FF and APS-C? With a little vignette ;)

I think many of you are being confused by all the talk of DOF equivalence, which is sensor-based, and exposure value, which is lens-based.
 
Both SNR and DR are influenced by the total amount of light the sensor can collect without saturating.

For idealized sensors, if you double the sensor size, you increase the SNR by a factor of 1.414 and the DR by a stop.
Finally we get to a pure number I can understand.

This counters one thing people are saying about "equivalence."

Almost every website that talks about equivalence says you can put a faster lens on a smaller sensor so that you get the same total amount of light on the image circle, and then the two will be equivalent from a noise standpoint.
Also equivalent from a DOF point of view - a 50mm f/2 on APS-C and a 75mm f/3 on FF have the potential to create the same image regarding DOF and FOV as well as noise. The shutter speed will then be the deciding factor, and just as you said, the FF will be able to take advantage of a slower shutter speed (unless there is some external need to use a fast shutter speed).
But they still won't be equivalent, because the smaller sensor will get the same amount of light per unit time but still capture less light, because its exposure will necessarily be shorter (half as long, in fact).

So an APS-C-sized sensor can never hope to perform as well as a FF sensor, even when "equivalent" lenses are placed in front of them (everything else being equal).

Is this logic correct?
Yes. (*)

(*) There is one situation where a sensor will perform equally well, whether it is APS-C or FF, and that's when you need both a certain fixed DOF and a fixed shutter speed (assuming the combination does not saturate the smaller sensor). Of course, in such a situation the FF doesn't use its full potential even though the APS-C does.
 
Both SNR and DR are influenced by the total amount of light the sensor can collect without saturating.

For idealized sensors, if you double the sensor size, you increase the SNR by a factor of 1.414 and the DR by a stop.
Finally we get to a pure number I can understand.

This counters one thing people are saying about "equivalence."

Almost every website that talks about equivalence says you can put a faster lens on a smaller sensor so that you get the same total amount of light on the image circle, and then the two will be equivalent from a noise standpoint.
Also equivalent from a DOF point of view - a 50mm f/2 on APS-C and a 75mm f/3 on FF have the potential to create the same image regarding DOF and FOV as well as noise. The shutter speed will then be the deciding factor, and just as you said, the FF will be able to take advantage of a slower shutter speed (unless there is some external need to use a fast shutter speed).
But they still won't be equivalent, because the smaller sensor will get the same amount of light per unit time but still capture less light, because its exposure will necessarily be shorter (half as long, in fact).

So an APS-C-sized sensor can never hope to perform as well as a FF sensor, even when "equivalent" lenses are placed in front of them (everything else being equal).

Is this logic correct?
Yes. (*)

(*) There is one situation where a sensor will perform equally well, whether it is APS-C or FF, and that's when you need both a certain fixed DOF and a fixed shutter speed (assuming the combination does not saturate the smaller sensor). Of course, in such a situation the FF doesn't use its full potential even though the APS-C does.
Thank you for this contribution and all your others. I think I'm finally up to speed.

I can interpret your answer to mean: if you don't need the dynamic range because the subject doesn't call for it (a low-dynamic-range subject), or because you don't need it in your output (you don't care about the too-dark or too-light areas, or will use exposure stacking), then an APS-C sensor can yield nearly equivalent results with an equivalent lens (a lens faster and shorter by the crop factor), as long as you expose the same (overexpose the APS-C by the corresponding amount and then correct in post if you want the same final brightness).

This is actually a technique I've heard suggested for getting the least noise in dark areas: expose for the highlights you need to keep, but no less, then selectively bring the highlights back down in post.
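That "expose for the highlights, then pull down" technique (often called exposing to the right) can be illustrated with a toy shot-noise-plus-read-noise model; all numbers are invented for the sketch:

import numpy as np

rng = np.random.default_rng(3)
read_noise_e = 5.0   # hypothetical read-noise floor in electrons

def shadow_snr(shadow_electrons, trials=100_000):
    # A dim shadow region: Poisson shot noise plus Gaussian read noise
    samples = rng.poisson(shadow_electrons, trials) + rng.normal(0, read_noise_e, trials)
    return shadow_electrons / samples.std()

print(round(shadow_snr(20), 1))   # base exposure: shadows sit near the read-noise floor
print(round(shadow_snr(40), 1))   # +1 stop: cleaner shadows

Pulling the brighter capture back down in post scales signal and noise together, so the improved shadow SNR survives the correction.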
 
