sina_hml: I know it's an old article but I hope someone can explain some of my questions. Here is how I understand it: I have a 5D and a 450D and a 50mm 1.8. Both cameras have identical flange distances, so the lens produces exactly the same image at the sensor plane. The 450D captures a smaller part of this image. I think everyone agrees with me so far. The part I don't understand is why some insist that the picture the 450D sees is darker than what the 5D sees. It's a smaller amount of the total light entering the lens, but it is also used to illuminate a smaller area. I assume that the amount of light each pixel (photo cell, etc.) receives is the same between the cameras.
Reasonable *for a bridge camera* is probably fairer than 'almost unusable' - I take it back.
Modern small sensors tend to perform better (in proportion to their sensor size) than large sensors, as a *very* rough rule-of-thumb, which is part of why I shouldn't use 'equivalent ISO' as anything other than a *very* rough guide.
At F2.8 and ISO 100 the FZ200 is creating its image from the same amount of light as a full frame camera at F15.6 and ISO 3086. However, the performance won't necessarily be identical.
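The arithmetic behind those equivalent figures can be sketched in a few lines. This is only an illustration: the crop factor of 5.56 for a 1/2.3"-type sensor is an approximation (exact values vary with the quoted sensor dimensions), and as the article notes, equivalent ISO is only a rough guide to performance.

```python
# Sketch: full-frame equivalent aperture and ISO from a crop factor.
# Equivalent aperture scales linearly with the crop factor (same
# entrance pupil diameter); equivalent ISO scales with its square
# (same total light over a sensor area that differs by crop^2).

def full_frame_equivalent(f_number: float, iso: float, crop_factor: float):
    return f_number * crop_factor, iso * crop_factor ** 2

# Approximate crop factor for a 1/2.3"-type sensor (an assumption)
eq_f, eq_iso = full_frame_equivalent(2.8, 100, 5.56)
print(f"F{eq_f:.1f} at ISO {eq_iso:.0f}")  # roughly F15.6 and ISO ~3090
```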
hookah: ok, I think I understand a little... because I noticed that on 1/2.3" compacts, trying for the equivalent of the "sunny-16 rule", which is f8 @ 1/400-500, can give a lot more noise than using f5 @ 1/800-1000... so if compacts were given an f2 lens, ISO 50 or 80, and raw... then 1/2.3" compacts would produce even more superb images, competing with APS-C... and there would be no fuss with focusing because everything is in focus on a compact (for the most part)... and it'll fit in your pocket...
I'm afraid not. If you were getting as much light onto a 1/2.3-type sensor by having a really wide aperture, then you'd have the same shallow depth-of-field as an APS-C sensor.
No format has an inherent depth-of-field advantage over any other. Only when you can't stop a larger format lens down any further or can't open the smaller sensor's lens up any further is there any difference.
larryr: If I understand the article correctly, total light gives an idea of the quality of the images so a camera with a smaller sensor might do as well or perhaps better than one with a larger sensor IF it has a faster lens (AND using a slower shutter speed is not an option, forcing the camera with the larger sensor to use a higher ISO)? Pixel density is not important? This allows one to understand the relative importance of lens speed vs. sensor size (in terms of total light and IQ).
This changes the common wisdom that a small sensor camera performs pretty well in daylight but not so well in low light? But the daylight situation has not changed, and the low light situation has improved only where the camera has a faster lens and a slower shutter speed can not be used (allowing it to use a lower ISO?)
On cameras with smaller sensors does the adjusted aperture refer to the actual size of the aperture? In the cropped example the adjusted aperture does refer to the used part of the aperture.
This is only true if you compare images at the pixel level.
Even with pixels of the same size, [as in this example](http://www.dpreview.com/articles/5365920428/let-me-try-to-address-that), the larger sensor still looks better, when compared at a common output size.
I'm not sure you can generalise that the noise floor (the point at which electronic read noise contributes in addition to the shot noise) is lower for large pixels.
WesternSage: Hey! Could you add the Pentax 645z to the mix?
The 645Z doesn't have the same-sized pixels, so isn't relevant to the point being made here.
This example was simply an attempt to address the people arguing that, if an APS-C crop of a FF image had the same pixel-level noise as a native APS-C image, then the FF image is just as noisy.
The differences between pixel sizes will be addressed in part 2 of the noise article.
You're right, I didn't express all my assumptions there. You do need a comparable lens to exist. In that case, where a full frame lens with an equivalent field of view exists, the difference is due to the total light.
Note that the article makes clear that equivalent ISOs are not absolute, and serve only as a guide to performance.
However, why would 600mm F15.6 ISO 3086 (if you were confident that equivalent ISOs were going to give you a *close-enough* answer) make you conclude that the camera is unusable in low light?
Thinking in terms of crops may be easy for you, but being able to plot equivalent aperture graphs makes it easy to express the capabilities of otherwise comparable cameras with different sensor sizes.
Klorenzo: Hi Richard. I think this "equivalent aperture" idea is causing only harm and confusion. Try this: take three cameras with different sized sensors and shoot a moving subject with the same aperture number, shutter speed, ISO and equivalent focal length. You'll get three "identical" pictures with the same "brightness" and the same amount of motion blur. Do you agree?
Yes, DoF will be different in a precise amount and you can calculate the "equivalent DoF aperture". And yes, the noise will be different but it vastly depends on the actual sensor you are using: try a 5D classic vs a Sony A7S. Do you agree?
So the "equivalent aperture" is NOT equivalent with respect to exposure and motion blur; it is equivalent for DoF and, with a big approximation, for noise. What is more important? Exposure and image content, or noise and out-of-focus areas?
In what way is this "equivalent aperture" concept better than saying that there is a two-stop difference in DoF and less noise?
I don't agree the images will be identical, since you then spend time explaining the ways in which they're different.
Please note that the article doesn't at any point suggest you should ever use equivalence to work out exposure - it is only suggested as a way of comparing two systems. This is particularly relevant when considering buying a camera in a part of the market where multiple sensor sizes are available (enthusiast compacts, for instance, or APS-C and Micro Four Thirds ILCs). Understanding the effect of sensor size on FoV, DoF and total light (and hence, ~noise) is useful.
However, if you did, for whatever reason, choose to shoot two cameras at equivalent focal lengths, apertures and ISOs (and therefore, the same shutter speed), you'd get the same brightness and motion blur **and** the same depth-of-field. Not necessarily the same noise (as this can depend on the sensors, as stated in the article).
Sucama: Can DPR show/add Exposure Latitude and ISO-invariance tests for the D7100, to compare with the D7200?
We don't currently have a D7100 but I agree, we should try to get one in to expand those tests.
TMW: Interesting. Pixel size does not matter, if I understand correctly. How about the Sony a7S?
At *very* high ISOs (where read noise starts to play a role), there's a difference, at most settings there's no difference.
Part 2 of the noise article should spell out why this is.
DuncanDovovan: You mention pixel size is the same for a FZ sensor and the APS-C sensor. But I assume the FZ sensor has more pixels than the APS-C sensor, right?
Otherwise a better S/N ratio would not be possible?
The point is that pixel count doesn't make that much difference.
Viewed at the same size, if the larger sensor had the same **number** of pixels, it would perform better. If the larger sensor had more of the **same sized** pixels, it would also perform better (and by a *similar* degree).
The better S/N ratio comes from the larger sensor area, not the pixel performance.
@nigelht - there will be some differences in performance because of sensor differences but they're likely to be fairly small.
Pixel size has very little role to play (have a look at [this, where the full frame and APS-C camera have the same size pixels](http://www.dpreview.com/articles/5365920428/let-me-try-to-address-that) but the full frame sensor still performs better, thanks to simply having more sensor).
I'll address the reasons for this in part 2 of the noise articles we're publishing (should be later this week).
Mssimo: Typo, label on sample images should be D7000 not D7100.
You're right, I'd got the first label wrong then painstakingly built that error into the rest of the cells.
There's no mix-up. These are all 100% crops from larger images (click on the crop to download each one). The whole frame D810 is shot with an 85mm lens to give very nearly the same field of view as the APS-C examples that are shot from essentially the same distance with a 50mm lens.
Eric Nepean: I think the conclusions may be correct, but I think the analysis should proceed as follows: Consider an APS-C sensor and a 35mm FF sensor, each with the same number of pixels and aspect ratio, each having a lens providing the same FoV, with the same aperture f-ratio, focused on an object plane some fixed long distance away (let's say 10m). The same (corresponding) pixel on both sensors is mapped to the same area on the object plane, which radiates identical light intensity toward both lenses. However, the focal length of the lens on the APS-C sensor is shorter than that of the lens on the FF sensor, to maintain the same FoV. With the same f-ratio, the entrance pupil of the APS-C lens is smaller than that of the FF lens, and thus intercepts less of the light radiating from the pixel-mapped area on the object plane. Consequently, less light reaches the pixel on the APS-C sensor than reaches the pixel on the FF sensor with the same view.
You mean like [these examples](http://www.dpreview.com/articles/2666934640/what-is-equivalence-and-why-should-i-care/4), where we look at four cameras, each with similar pixel counts (16-18MP, I seem to remember). Then compare them at the same f-number, then at the same aperture diameter (same equivalent aperture).
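Eric's entrance-pupil reasoning can be put into numbers. This is a sketch with illustrative focal lengths (75mm on full frame vs 50mm on a 1.5x APS-C sensor, chosen only as an example of an equal-FoV pair): the entrance pupil diameter is focal length divided by f-number, and the light collected from a given subject area scales with the pupil's area.

```python
# Entrance pupil diameter = focal length / f-number.
# Same FoV, same f-number, different formats -> different pupil sizes.

def entrance_pupil_mm(focal_length_mm: float, f_number: float) -> float:
    return focal_length_mm / f_number

ff_pupil = entrance_pupil_mm(75, 2.8)    # full frame lens (example FoV)
apsc_pupil = entrance_pupil_mm(50, 2.8)  # same FoV on a 1.5x crop sensor

# Light gathered scales with pupil area, i.e. diameter squared:
light_ratio = (ff_pupil / apsc_pupil) ** 2
print(light_ratio)  # 2.25, i.e. the crop factor squared
```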
Eric Nepean: The diagram and the discussion show the cone of all point light sources which can be "seen" by all the pixels on the sensor. But what the discussion and analysis are about is the cone of light from one or more point light sources which gets focused onto one pixel.
The hope was that the diagram shows the effect on the whole image. The first set of crops shows what happens if you look at the pixel level but the second set shows the effect of considering the whole image.
I'll have a look back through the text and see if I can clarify this.
ftphoto: This is a very interesting article, but I feel is incomplete as it does not address how the sensor / system uses the light available across the Dynamic Range. Let me try and explain:
12 Bit with 4,096 Levels from Black (Left) to White (Right) - think Histogram
The first stop on the white side uses 2,048 levels, leaving 2,048.
The next stop uses 1,024, leaving 1,024.
The next stop uses 512, leaving 512.
The next stop uses 256, leaving 256.
The next stop uses 128, leaving 128.
The next stop uses 64, leaving 64.
The next stop uses 32, leaving 32.
As you can see, as we move from brighter to darker there are fewer and fewer levels available to make up the shadows, creating more noise in the shadow areas of an image. That is why ETTR is so important and effective.
I would like to see this addressed more accurately than how I have explained it. It is the same for all sensors though larger sensors and higher pixel counts provide more data to use therefore less noise.
I'll need to check on the precise reasoning, but I'm assured that this *isn't* why ETTRing is effective.
I think it's because, as with the test tube example in this article, a larger signal will have larger noise variation in absolute terms, and so all the additional bits you're devoting to highlights just end up describing this noise in more detail (which you tend not to be able to see, anyway).
While it's true that linear Raw devotes half the Raw file to the first stop of light, this is something of a red herring when it comes to noise, despite the long-standing articles drawing attention to it.
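The halving-per-stop arithmetic that ftphoto describes (with the figures corrected to exact powers of two) can be generated mechanically. This only illustrates how a linear encoding allocates levels; per the reply above, it is not itself the reason ETTR helps.

```python
# Levels per stop in a linear 12-bit raw file: each stop down from
# clipping gets half the remaining levels.

remaining = 4096  # 12-bit
per_stop_counts = []
for stop in range(1, 8):
    half = remaining // 2
    per_stop_counts.append(half)
    remaining -= half
    print(f"Stop {stop} below clipping: {half} levels ({remaining} remaining)")
```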
mostlyboringphotog: Mr. Butler, por favor, if you would run a test shot with a Nikon 1 with their adapter for an FX lens, and the same FL/f-stop/SS shot with a CX lens. FWIW, you can make me shut up :-) Regards,
On the basis the shot noise isn't coming from the sensor - it's a reflection of how much light was captured - then, strictly speaking, it's not the sensors themselves that make images from small sensors noisier.
And yes, in situations where you can provide a small sensor with more light (without overexposing), then you can get the same noise performance. However, this tends to be difficult, because, if a small sensor can cope with this extra exposure, then you can probably do the same for the large sensor, too...
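The shot-noise point above can be demonstrated with a toy simulation: photon arrival is a Poisson process, so noise grows as the square root of the signal, and capturing four times the light (for instance, a sensor with four times the area at the same exposure) only doubles the signal-to-noise ratio. The Poisson generator here is Knuth's simple method, fine for the modest photon counts used in this illustration.

```python
import math
import random
import statistics

random.seed(42)

def poisson(mean: float) -> int:
    # Knuth's method: multiply uniforms until the product drops below e^-mean
    limit = math.exp(-mean)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= random.random()
    return k - 1

def snr(mean_photons: float, trials: int = 4000) -> float:
    samples = [poisson(mean_photons) for _ in range(trials)]
    return statistics.mean(samples) / statistics.stdev(samples)

low, high = snr(100), snr(400)
print(round(low, 1), round(high, 1), round(high / low, 2))
# SNR roughly 10 vs roughly 20: 4x the light, ~2x the SNR
```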
mpgxsvcd: Shot noise was what I was missing before. I couldn't understand your conclusion in previous articles because I didn't believe shot noise was as significant as it actually is.
Now I see how much shot noise contributes with these samples, and what you are saying becomes much clearer, or noisier, depending on how you look at it.
Thanks again for these wonderful articles. They are really well written and obviously full of Rishi's technical knowledge. It is truly a great pairing.
Please keep posting articles like this and the review videos. That has really increased the value of reading Dpreview over the last few months.
Thanks. And yes, Rishi's contribution is significant and his expansion of our raw testing has given us a lot to discuss. I want to make sure as many of our readers as possible are able to understand why we're doing these tests and what they show.
mostlyboringphotog: "Then, in addition to this, we've shot the D810 in crop mode using the same lens as was used on the D7000." Does this mean a DX lens was used on the D810 for crop mode? If so, even a larger sensor with lower SNR from a crop lens will be as noisy as the crop sensor.
"Note that the full frame sensor performs better than the APS-C sensor, ..." It seems the FF sensor performs better only when the illuminating lens provides higher SNR output.
It's solely about the ability to *use* more of the lens's output.
A full frame lens was used on both. But it makes no difference. An APS-C sensor (or the APS-C crop of a full frame sensor) doesn't know whether it is seeing an APS-C lens or the central section of a full frame lens.
Please look at the diagrams at the top of the page again and read the text. An APS-C sensor only sees the middle section of a full frame lens (which is why you get a narrower field of view for any focal length). An APS-C-specific lens cuts out that additional light *that the APS-C sensor couldn't see anyway*.
In neither of these circumstances is more light available to the smaller sensor; otherwise the Nikon FX 35mm F1.8 and the DX version would behave differently, which they don't. (Both behave equivalently to a 52.5mm F2.7 on full frame.)
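Those quoted equivalents follow directly from Nikon's 1.5x DX crop factor, as this trivial check shows:

```python
# A 35mm F1.8 behind a 1.5x (Nikon DX) sensor, expressed as its
# full-frame equivalent: focal length and f-number both scale by the crop.
crop = 1.5
eq_focal = 35 * crop
eq_aperture = round(1.8 * crop, 2)
print(eq_focal, eq_aperture)  # 52.5 and 2.7
```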
Karroly: "As a result, when you shoot two different sized sensors with the same shutter speed, f-number and ISO, the camera with the smaller sensor has to produce the same final image brightness.. from less total light."
I am sorry, I do not agree at all.
Please, let me put this another way. If I use an F2.8 FF lens on an APS-C body, it is still an F2.8 lens, and the picture taken (at the same aperture/speed/ISO AND PIXEL SIZE) is just a crop of the FF image. The APS-C-sized area of the FF sensor gets the same amount of light (either total or per unit area) as the APS-C sensor, and thus the signal-to-noise ratio is the same...
What happens when you compare a full frame image, an APS-C crop from the same camera and an actual APS-C sensor?
[Have a look here](http://www.dpreview.com/articles/5365920428/let-me-try-to-address-that).
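The point of that comparison can also be stated numerically: at the same f-number and shutter speed, total light scales with sensor area, so the APS-C crop matches a native APS-C sensor while the whole full frame image is built from more light. Sensor dimensions below are approximate (36x24mm full frame vs 23.6x15.6mm Nikon DX).

```python
import math

ff_area = 36.0 * 24.0        # full frame, mm^2
apsc_area = 23.6 * 15.6      # APS-C (Nikon DX, approximate), mm^2

ratio = ff_area / apsc_area
print(round(ratio, 2), "x the total light,",
      round(math.log2(ratio), 2), "stops")
```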
lacikuss: I like what I see; however, using the excellent Sigma 18-35 f/1.8 as the only lens for the sample pictures, instead of the kit zoom as you do with every other camera, is just unfair.
We will be shooting with a variety of lenses before we draw any conclusions.