lacikuss: I like what I see, however using the excellent Sigma 18-35 f/1.8 as the only lens with the sample pictures instead of the kit zoom as you do with every other camera is just unfair.
We will be extending the gallery with images shot with other lenses. I wasn't aware Barney had also been shooting with the 18-35mm or I'd have grabbed something different.
mostlyboringphotog: Maybe I'm just misreading it, but much of the noise discussion becomes a question of whether a larger sensor is less noisy. As shot noise is an attribute of the photons, and as the SNR is a function of the square root of the number of photons (regardless of whether the photons are captured or not), the shot noise SNR is then a property of the lens and the size of its image circle, not the size of the sensor. For example, if one uses the same DX lens on FF and APS-C, the photon shot noise SNR should be the same; however, it's counterintuitive to think that, conversely, an FX lens on FF and APS-C should also have the same SNR. So I'm very curious whether the example in the article used the same lens for the larger sensor and for the smaller sensor.
It's neither my theory nor have I stated any rules.
I can't run out and shoot the images right now, but I hope to include some examples in the next part of the article.
How big a difference this makes will depend on what tone you're looking at: if your signal is high enough, then you might well be able to halve it (reduce it by 1EV) and still have a high-enough SNR that the difference is negligible.
However, with a dark tone made up from very little signal, halving it will do your SNR no good at all (or, conversely, doubling the signal could be really beneficial).
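To put rough numbers on that (an illustrative sketch of my own, not a measurement: it simply applies the Poisson statistics behind shot noise, where SNR = √signal):

```python
import math

def shot_noise_snr(mean_photons):
    # For Poisson-distributed photon arrivals the standard deviation
    # is sqrt(mean), so SNR = mean / sqrt(mean) = sqrt(mean).
    return math.sqrt(mean_photons)

# Bright tone: halving 10,000 photons still leaves a very high SNR.
print(shot_noise_snr(10000), shot_noise_snr(5000))   # 100 vs ~71

# Dark tone: the same 1EV cut takes an already-low SNR lower still,
# down to a level where the noise is clearly visible.
print(shot_noise_snr(16), shot_noise_snr(8))         # 4 vs ~2.8
```

In both cases halving costs the same *fraction* of SNR (1/√2), but only in the dark tone does the result drop to where the noise becomes objectionable.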
DuncanDovovan: Given the examples that the author uses, I assume he is addressing normal photography and not say astrophotography or night exposures.
Does the author really believe that, while keeping the ISO and aperture the same, exposing a photo 1 stop longer (assuming 2 stops would introduce clipping of highlights) and correcting this down in post-processing will actually lead to a *noticeable* (not theoretical) improvement in noise for normal portrait/landscape photography?
I'd like to see proof of that instead of theoretical talk.
In theory it is easy to understand that he is right, but will the difference be noticeable to the human eye?
With the same aperture diameter (not f-number), you're spreading the same light over a larger area. (As per your projector example).
With the same F-number, you have four times the light with the 85mm lens, and are spreading it over a larger area (such that each unit area gets the same amount of light).
The format the lens was designed for makes no difference at all (so long as it at least covers the format you're working on). A 50mm F2.8 is still a 50mm F2.8.
However, the question is whether the sensor is able to see all of the light (or the sensor-shaped area of it).
[The diagram at the top of this page](http://www.dpreview.com/articles/2666934640/what-is-equivalence-and-why-should-i-care/2) *should* make it clear:
An APS-C/DX sensor doesn't 'see' all the light projected by a full frame/FX lens, so the smaller sensor can't tell whether it's behind an FX or a DX lens.
And, if this is going where I think it is:
The APS-C image would have the same SNR as an APS-C crop from the full frame sensor.
However, if you used all of the full frame sensor (which would mean changing lenses, if you wanted to shoot the same scene), then the full frame image would have higher SNR and so would look cleaner when viewed at the same size.
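That claim is easy to check numerically. Here's a toy simulation (my own illustration, assuming idealized, identical pixels and pure shot noise; I model full frame as 2x the linear size of APS-C for simplicity, rather than the real ~1.5x):

```python
import numpy as np

rng = np.random.default_rng(0)
photons = 1000                      # same exposure: light per unit area

ff = rng.poisson(photons, size=(200, 200))
crop = ff[:100, :100]               # APS-C-sized crop of the FF sensor
apsc = rng.poisson(photons, size=(100, 100))

def snr(img):
    return img.mean() / img.std()

# Pixel level: the crop and the APS-C image are statistically identical.
print(snr(crop), snr(apsc))         # both ~ sqrt(1000) = 31.6

# Viewed at the same output size, the FF image is downsampled 2x: each
# output pixel averages 4 sensor pixels, i.e. is built from 4x the light.
ff_small = ff.reshape(100, 2, 100, 2).mean(axis=(1, 3))
print(snr(ff_small) / snr(apsc))    # ~2: about 1 stop cleaner
```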
BJN: And "we're be curious" to see you test the low-light capabilities of the new D7200.
Is that a reference to a typo in the text somewhere? If so, could you let me know where, so I can correct it?
Musicjohn: In my opinion the writer of this article is missing the real story here. The suggestion that aperture and shutter speed have an influence on the amount of noise is not correct. Taking pictures at the suggested shutter speed / aperture combination will show the same noise levels, even if you were to change one of the parameters (so the other changes accordingly). If I change my aperture from f/3.2 to f/8 and the shutter speed changes accordingly (at the same ISO setting), I will not have two different images with two different noise levels. However, when using EV compensation and actually over-exposing, I might achieve a cleaner image. However, in a case where you would have to raise the ISO setting in order to make exposing to the right possible, your brighter picture may well show a lot more noise than the picture taken with the suggested shutter speed and lower ISO. So, to conclude, it is all about exposure, not about the shutter speed and aperture used.
And what if the 'suggested combination' isn't optimal?
It's a change in the overall exposure (defined by the combination of f-number and shutter speed) that changes the noise.
The section on sensor sizes specifies the lenses and the apertures used and makes clear that they offer the same field of view (and hence the same scene).
A 42.5mm F2.8 lens will condense the same total amount of light onto a sensor 1/4 as large as an 85mm **F5.6** lens would onto a full frame sensor. (Same field of view, same aperture diameter).
By contrast, an 85mm **F2.8** lens will project the same *intensity* of light onto the large sensor, so 4 times the total amount of light.
Pixel size makes very little difference (and those differences will be discussed in part 2). The end result will be much more closely tied to sensor size and the total light available than to pixel size.
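For anyone who wants to check the arithmetic behind the 85mm/42.5mm comparison above, the aperture diameter is simply focal length divided by f-number (a quick illustrative calculation, not from the article):

```python
def aperture_diameter_mm(focal_length_mm, f_number):
    # Entrance pupil (aperture) diameter = focal length / f-number
    return focal_length_mm / f_number

# Same aperture diameter, hence the same total light (and depth of field):
print(aperture_diameter_mm(85, 5.6))    # ~15.2 mm (full frame)
print(aperture_diameter_mm(42.5, 2.8))  # ~15.2 mm (Four Thirds)

# Same f-number instead: same light per unit area, but the full frame
# sensor has 4x the area, so it collects 4x the total light.
print(aperture_diameter_mm(85, 2.8))    # ~30.4 mm
```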
birdbrain: I'm confused. Using the 'rain' analogy: if we have a deluge (low f-number) and we hold our test tubes under this deluge, the test tubes will collect a certain amount of water. If it's a light rain shower (high f-number) and we now hold our test tubes under this light shower for a correspondingly longer time, we can collect the same number of raindrops, can't we?
Sorry if I'm being a bear of little brain about this :)
Absolutely. The intensity of the rain in this metaphor is the brightness of the scene. There's nothing representing aperture (so it's fair to assume that the light/rain is unrestricted: the aperture is wide-open).
But yes, in low light/light rain you can keep your aperture open and keep the test tubes exposed for longer. In bright weather/heavy rain, you would need to shorten your exposure or introduce something to block some of the rain (an aperture).
The point is simply that light arrives at your sensor just as rain drops arrive in a test tube: as separate packets, randomly over time.
Wiscflank: I like the explanation of shot noise. Very interesting. However, the connection to the sensor size rather than the pixels size is not obvious. Everything holds together, but the conclusion. Smaller sensors are noisier, but if we agree on the fact that the pixels are independent from one another, meaning the light received by one photocell does no impact its neighbors, the logical conclusion is to state that the size of the pixels is important, not the size of the whole sensor.
My understanding is that, precisely because the shot noise is uncorrelated between pixels, the SNR of one large pixel can be recreated by combining the results from any number of smaller pixels that cover the same area (ignoring, for now, read noise, which I'll cover in part 2).
At this point, if you make the (I think) reasonable assumption that people don't choose their viewing or print size based on the sensor format or pixel count it was captured on, then high pixel-count images will be downsampled to the same size as low pixel-count ones, with no difference in shot noise between the two.
At which point it's the total amount of light making up the image that matters most, and this is where the size of the sensor can enter into play.
As I say, there'll be a more rigorous look at this in part 2.
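A quick simulation of that first point (an illustrative sketch under the stated assumptions: uncorrelated shot noise, no read noise):

```python
import numpy as np

rng = np.random.default_rng(1)
mean_photons = 100                  # per small pixel; shot noise only

# One large pixel covering the same area as 4 small pixels collects
# 4x the photons that each small pixel does.
large = rng.poisson(4 * mean_photons, size=100_000)
small = rng.poisson(mean_photons, size=(100_000, 4))
binned = small.sum(axis=1)          # combine the 4 small pixels

def snr(x):
    return x.mean() / x.std()

print(snr(large), snr(binned))      # both ~ sqrt(400) = 20
```

The combined small pixels reproduce the large pixel's SNR, which is why the total light over the image area, rather than the pixel size, dominates.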
sina_hml: I know it's an old article but I hope someone can explain some of my questions. Here is how I understand it: I have a 5D and a 450D and a 50mm 1.8. Both cameras have identical flange distances, so the lens is producing the exact same image at the sensor plane. The 450D captures a smaller part of this image. I think everyone agrees with me so far. The part I don't understand is why some insist that the picture the 450D sees is darker than what the 5D sees? It is a smaller amount of the total light entering the lens, but it is also used to illuminate a smaller area. I assume that the amount of light each pixel (photocell, etc.) receives is the same between the cameras.
esasjl - pixel size makes very little difference in most cases. It's the total area (and hence total light) going into making up the image that accounts for the differences between sensor sizes.
I don't know what you mean by 'the top figure,' so I'm not sure how to respond.
@sina_hml - If the pixels are the same size, then an APS-C crop from the full frame sensor will be identical to an APS-C image, taken with the same settings.
However as soon as you use all of the full frame sensor, its extra light capturing area will put it ahead.
mosc: DPR, can you discuss the ETTR process you use as it relates to the in-camera single-shot dynamic range modes found on modern cameras (not multi-shot HDR gimmicks)? It seems like they do many of the same things, and the complexity of Raw processing and manually determining exposure is far less necessary if the camera's programmers already juggle much of this automatically. Perhaps these modes are not fully developed in your eyes yet and need significant improvement?
Wouldn't it be as simple as giving users "1 EV pull", "2 EV pull" and "3 EV pull" metering modes? It seems that if the camera can accurately meter for mid-tones, then in most cases it's a simple ISO change, paired with basic scaling in the JPEG creation, to give these outputs.
Complexity-wise it doesn't seem more difficult than multi-shot HDR modes, except that internally the camera adjusts the brightness of each frame to match and writes them out independently.
The point is that no camera I can think of gives you a meaningful way of telling when your Raw file is starting to clip, so at best you only have approximate tools for exposing to the right.
Fujifilm's DR modes give you three choices of how much of the sensor's DR to devote to highlights, by increasingly exposing to the left. So you at least have three options for where in your Raw file to put middle grey (it would be four if you could shoot Raw at ISO 100), but you don't have any means of telling which one is optimal for the conditions you're shooting in.
There's no technical reason why you couldn't design a camera meter that obsessed about Raw highlight clipping, rather than JPEG middle grey, but it would mean abandoning the ISO standard if you went the whole hog and separated optimal exposure from the 'correct' output brightness.
mpgxsvcd: If my scene doesn’t have any highlights that are blown at the normal/auto exposure value then is it possible that I would actually have to increase the exposure compensation in order to ETTR? In other words are there ever times when ETTR doesn’t involve decreasing the exposure compensation and is it still recommended to ETTR in those scenarios?
You can always darken the data back to the correct brightness later, but every tone in your image will be made up from more light (signal), so will have a better SNR and will be cleaner.
Fujifilm's DR 100%, 200% and 400% modes sort of do this: they provide different exposure/amplification combinations (and we'll look at this in more detail in part 3).
However, these are still based on JPEG middle-grey metering, just offering three different choices of how much DR is provided above middle grey: they don't provide tools to help you assess how well ETTRed you are. They provide more options but still essentially leave it to guesswork.
yural: Brilliant article, thanks Richard. I have one question regarding the impact of sensor size. Let's assume we stay with the exposure consideration (raindrop model) only, and consider a full frame and a cropped DSLR with the same lens. Let's take the same picture with the full frame camera in crop mode, and with the cropped camera. (I assume the exposure should be roughly equal.) Would we expect the same noise then? Similarly, a full frame picture with the previous exposure versus an image from the cropped camera: what about the noise in the cropped area only? Appreciate all your replies!
Assuming the same sized and similarly performing pixels, the cropped area of the full frame sensor and the APS-C sensor would look identical.
However, as soon as you use the full area of the larger sensor, you capture more signal and hence get a cleaner image.
For example, [compare the Nikon D810 and D7200 in our studio scene](http://www.dpreview.com/reviews/image-comparison/fullscreen?attr18=daylight&attr13_0=nikon_d7200&attr13_1=nikon_d810&attr13_2=nikon_d7200&attr13_3=nikon_d810&attr15_0=raw&attr15_1=raw&attr15_2=raw&attr15_3=raw&attr16_0=3200&attr16_1=3200&attr16_2=3200&attr16_3=3200&normalization=full&widget=1&x=0.11482042414920245&y=0.5061312152671813).
The pixel-level noise is essentially the same. However, as soon as you click on the 'Web' button (top right) to view them at the same size, the additional light captured by the D810 comes into play and the image looks cleaner.
Roland Boyer: Sorry for my bad English, I am French. In my opinion, you are wrong when saying: "a full frame camera shot at 85mm F5.6 and a Four Thirds camera at 42.5mm F2.8 will have the same angle of view and the same aperture size (15.2mm diameter) and hence will be exposed to the same amount of light if exposed for the same amount of time." OK for the same angle of view, but I disagree about the "same amount of light," because on the full frame camera the light has to travel twice as far to focus at 85mm, and so decreases by a factor of 4 compared to focusing at 42.5mm! That is why we need a bigger diameter for the same F-stop on full frame... What changes is the size of the pixels (or the number of "silver grains" on film)! So you're right only about the light passing through the lens, not the light on the sensor or film!
We'll come to the effect of pixel size in part two (since the difference comes down to read noise, not shot noise).
But the most important factor is the light passing through the lens, because most sensors of any given age are (*broadly*) similarly good at making use of that light.
F-number tells you light per unit area (cancelling out the effect of sensor size), which obscures the reason why bigger sensors tend to be better (more light).
eddychan008: An 85mm F5.6 lens and a 42.5mm F2.8 lens have the same aperture diameter... that's right, because this is how the aperture value is defined.
An 85mm lens at F2.8 may capture 4 times as much light as a 42.5mm at F2.8, but at the same time this light is distributed over 4 times as big an area, so each pixel of each respective sensor still gets the same amount of light if exposed for the same shutter speed, same ISO and same "aperture".
Isn't that the same as saying the smaller sensor has to bring the brightness level up with less total light captured?
85mm **F5.6** (FF) and 42.5mm F2.8 (1/4 the size): same *total* amount of light.
85mm **F2.8** (FF) and 42.5mm F2.8 (1/4 the size): same light per unit area (which is what F-number, rather than aperture diameter, gives you), but with four times the area, so 4x more light.
The fact that it was captured over four times the area becomes irrelevant as soon as you compare the images at the same size.
klausious: Also, in the case of a zoom lens with a fixed maximum aperture, for example the Canon 24-70mm f/2.8L used on a full frame: when keeping the maximum aperture the same (at f/2.8) while changing the focal length, will it change the aperture diameter as well, and affect the amount of light going through the aperture? I did the maths, and at 24mm f/2.8 the aperture is opened to an 8.57mm diameter, while at 70mm f/2.8 the aperture is opened to 25mm. Will zooming in and out while keeping the f-number the same change how much light my sensor receives? I have this question because I shoot contemporary dance concerts, and more often than not choreographers love moody pieces, so the lights are extremely low. I can't compensate with shutter speed because I need to capture the dancers' movements. Also, I hate noisy photos, so I try not to boost ISO too much; the one thing left is aperture, so I want to know if a zoom lens is appropriate in this situation, where the aperture diameter changes while I'm zooming in and out.
As the article says, equivalent aperture is only relevant when you're comparing across sensor sizes.
If you're mounting the f/2.8 lens on a single camera then the effect of f/2.8 is the same, regardless of focal length. The f-number ends up dictating light per-unit-area (illuminance), and, since you're not changing the amount of area that you're capturing that light on (you're keeping the sensor size the same), the effect is the same.
It's important to bear in mind that the ISO setting doesn't make that much difference to the noise. [It's usually the shutter speed and aperture you use](http://bit.ly/shotnoise).
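To make that concrete (a quick illustrative calculation, my own sketch rather than anything from the article): the pupil diameter does change across the zoom range, but the f-number fixes the light per unit area, so the exposure doesn't.

```python
def entrance_pupil_mm(focal_mm, f_number):
    # Entrance pupil (aperture) diameter = focal length / f-number
    return focal_mm / f_number

def relative_illuminance(focal_mm, f_number):
    # Light per unit area scales as (pupil / focal)^2 = 1 / f_number^2,
    # so it depends only on the f-number, not the focal length.
    return (entrance_pupil_mm(focal_mm, f_number) / focal_mm) ** 2

for focal in (24, 50, 70):
    print(f"{focal}mm F2.8: pupil {entrance_pupil_mm(focal, 2.8):.2f}mm, "
          f"relative illuminance {relative_illuminance(focal, 2.8):.4f}")
# The pupil grows from ~8.57mm to 25mm, but the illuminance, and hence
# the exposure at any given shutter speed, stays constant.
```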
klausious: So I read the articles and have a few questions. Let's say I'm using a 50mm at f/2 on an APS-C camera, which means the aperture is opened to a 25mm diameter. Then I switch to a full frame camera using a wider 35mm lens; if I want to achieve an aperture diameter of 25mm, my lens has to be opened to f/1.4 (35/25 = 1.4). My questions are:
1. Does the depth of field remain the same between the two settings above?
2. Also, because the full frame will capture more light, does it mean that having the aperture open at 25mm will make a brighter image on a full frame compared to the cropped sensor?
3. If yes to question 2, and I want to capture the same amount of light as with the setting I set on the APS-C, should I go with a smaller aperture than 25mm on the full frame? And if so, how can I calculate the correct aperture diameter?
Should I assume you've got your 35mm and 50mm back-to-front? (Did you intend to have the same field-of-view with both cameras?). I'm going to make the assumption that the difference between a 35mm F1.4 lens and a mathematically ideal 33mm F1.3 lens is negligible.
1) An APS-C camera with 35mm F1.4 and a full frame camera with 50mm F2 shot from the same position and focused on the same thing will have essentially the same depth-of-field (just as the comparable examples in the article have).
2) In this instance, if you had the same shutter speed, the full frame camera would receive less light per unit area but the same total light (because you're capturing it over more area).
3) I don't think I understand your question here.
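For questions 1 and 2, the arithmetic works out like this (an illustrative sketch of my own; the 1.5x crop factor is a round approximation that varies by manufacturer):

```python
def pupil_mm(focal_mm, f_number):
    # Entrance pupil (aperture) diameter
    return focal_mm / f_number

crop = 1.5   # approximate APS-C crop factor

apsc_pupil = pupil_mm(35, 1.4)      # 25.0 mm
ff_pupil = pupil_mm(50, 2.0)        # 25.0 mm
print(apsc_pupil, ff_pupil)         # same pupil -> comparable total light
                                    # and depth of field

# Per unit area the full frame sensor gets less light (F2 vs F1.4), but
# it has crop^2 = 2.25x the area. Relative total light ~ area / f_number^2:
ff_total = crop**2 / 2.0**2
apsc_total = 1.0 / 1.4**2
print(ff_total, apsc_total)         # ~0.56 vs ~0.51: close but not exact,
# which is why the mathematically ideal pairing would be ~33mm F1.3.
```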
Karroly: "As a result, when you shoot two different sized sensors with the same shutter speed, f-number and ISO, the camera with the smaller sensor has to produce the same final image brightness.. from less total light."
I am sorry, I do not agree at all.
Please, let me put this another way. If I use an F2.8 FF lens on an APS-C body, it is still an F2.8 lens, and the picture taken (at the same aperture/speed/ISO AND PIXEL SIZE) is just a crop of the FF sensor. The APS-C area of the FF sensor gets the same amount of light (either total or per unit area) as the APS-C sensor, and thus the signal-to-noise ratio is the same...
Karroly - microlens design tends to be very good in terms of minimising the effect of gaps between pixels. Then, for very small pixels, technologies such as back-side illumination (BSI) help maximise the photosensitive area of each pixel.