Less Dynamic Range When You Shoot FF Camera in Super-35 Mode?

TheOwl360

Do full-frame cameras in the Sony lineup lose dynamic range when they are shot in super-35 mode?

If so, why?

Thanks in advance for any constructive feedback.
 
Could anyone point me to an actual example of losing dynamic range when in crop mode? I've seen similar posts claiming that DOF, light gathering, noise, etc. change, and I feel like it's all just misconceptions. Actual examples, or any sort of realistic measurement, would obviously help dispel or prove these points.
It’s a matter of geometry and basic physics. As various people have pointed out, the OP didn’t explain why crop mode is being used, so we don’t know which two photographic situations are being compared when we talk about losing dynamic range.

I have an A7R4 with APS-C mode assigned to a button. Instead of a 61 MP FF sensor, that gives an APS-C one of around 27 MP.

You could compare the A7R4 to an A6600 and see roughly the difference in DR. The most important factor is that the sensor’s ability to capture photons as electrons scales with area; the per-pixel limit is called the full-well capacity.

As a thought experiment, take a Tamron 28-75/2.8 and shoot at 42mm f/4.2 and ISO 100 in FF mode, then at 28mm f/2.8 and ISO 100 in APS-C mode. The two images will be very similar, including DoF, field of view, etc. With the shutter metered to the same brightness, the FF image will capture over twice the light of the APS-C one. That allows you to raise the shadows more.
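That equivalence arithmetic can be sketched in a few lines of Python. The 1.5x Sony crop factor and the metered-shutter assumption are the only inputs added here; the lens settings come from the thought experiment above:

```python
# Sketch of the FF vs. APS-C equivalence arithmetic (assumed 1.5x Sony crop).

CROP = 1.5

def ff_equivalent(focal_mm, f_number, crop=CROP):
    """FF focal length / f-number giving the same FoV and DoF as the
    APS-C settings (e.g. 28mm f/2.8 APS-C -> 42mm f/4.2 FF)."""
    return focal_mm * crop, round(f_number * crop, 2)

def total_light_ratio(crop=CROP):
    """With the shutter metered to equal brightness at equal ISO, the FF
    frame collects crop**2 times more total light than the crop frame."""
    return crop ** 2

print(ff_equivalent(28, 2.8))  # -> (42.0, 4.2)
print(total_light_ratio())     # -> 2.25
```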

Hope that helps.

Andrew
 
Do full-frame cameras in the Sony lineup lose dynamic range when they are shot in super-35 mode?

If so, why?

Thanks in advance for any constructive feedback.
I'm surprised that so many responses indicate that you will not lose dynamic range in crop (Super 35) mode. I'm far from an expert on the topic, but just from a logical point of view:

What is the difference between FF and APS-C sensors: area

What is the difference in dynamic range between FF and APS-C sensors: roughly 1 EV

What happens in crop mode: the sensor area is reduced to that of an APS-C sensor

So my only conclusion is: YES, you will lose dynamic range in Super-35 mode. It's the same as if you pulled the A6400 out of the drawer and used it. Measurements from photonstophotos.net confirm that.

[photonstophotos.net dynamic range chart]
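For what it's worth, a shot-noise-only back-of-the-envelope (an assumption on my part; it ignores read noise and measurement conventions, which is why measured gaps can approach the ~1 EV mentioned above) puts the purely area-driven difference at about 0.6 stops:

```python
import math

# Shot-noise-only estimate of the DR cost of cropping FF to APS-C.
# Assumptions: 1.5x crop, SNR limited purely by photon statistics.
area_ratio = 1.5 ** 2              # FF area / APS-C area = 2.25
snr_gain = math.sqrt(area_ratio)   # shot-noise SNR scales with sqrt(light)
stops_lost = math.log2(snr_gain)   # convert the SNR ratio to stops
print(round(stops_lost, 2))        # -> 0.58
```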
 
I'm surprised that so many responses indicate that you will not lose dynamic range in crop (Super 35) mode. I'm far from an expert on the topic, but just from a logical point of view:

What is the difference between FF and APS-C sensors: area

What is the difference in dynamic range between FF and APS-C sensors: roughly 1 EV

What happens in crop mode: the sensor area is reduced to that of an APS-C sensor

So my only conclusion is: YES, you will lose dynamic range in Super-35 mode. It's the same as if you pulled the A6400 out of the drawer and used it. Measurements from photonstophotos.net confirm that.

[photonstophotos.net dynamic range chart]
People confuse what dynamic range actually is. In my opinion it has nothing to do with noise, but with capturing the extremes of light. For the same-FoV image the FF has less noise, but the testers use noise as the way of measuring dynamic range, which is wrong IMO. I shot a pro shoot with my A6300 last week for kicks and didn't notice any drop in DR at all. In fact the images were spectacular even compared to my A7R2.

Ds

--
The confusion starts when the scientists can't agree amongst themselves. Henry F
 
People confuse what dynamic range actually is. In my opinion it has nothing to do with noise, but with capturing the extremes of light.
Agree
For the same-FoV image the FF has less noise, but the testers use noise as the way of measuring dynamic range, which is wrong IMO. I shot a pro shoot with my A6300 last week for kicks and didn't notice any drop in DR at all.
DR is not easy to "see" in an image or footage without an extreme situation and a direct comparison (the FF/APS-C difference is relatively small). Did you make such a comparison?
In fact the images were spectacular even compared to my A7R2.

Ds
In the end I'm not sure whether you agree with me or not :-)
--
The confusion starts when the scientists can't agree amongst themselves. Henry F
 
People confuse what dynamic range actually is. In my opinion it has nothing to do with noise, but with capturing the extremes of light.
Agree
For the same-FoV image the FF has less noise, but the testers use noise as the way of measuring dynamic range, which is wrong IMO. I shot a pro shoot with my A6300 last week for kicks and didn't notice any drop in DR at all.
DR is not easy to "see" in an image or footage without an extreme situation and a direct comparison (the FF/APS-C difference is relatively small). Did you make such a comparison?
That's not how I look at DR. I look at colour-gradation quality irrespective of extreme light; veiling glare stops you shooting past 10 stops anyway, and that's only because you can push the image in post. No camera can shoot past 9 stops out of camera, from my tests using a grey card.
In fact the images were spectacular even compared to my A7R2.

Ds
In the end I'm not sure whether you agree with me or not :-)
I'm a pixel-size guy; the larger the better :-) And the pixels on both my cameras are roughly the same size. For shooting low-light video and pixel-binning the FF sensor, the A7R2 beyond ISO 6400 is a clear winner.

Ds

--
The confusion starts when the scientists can't agree amongst themselves. Henry F
 
Could anyone point me to an actual example of losing dynamic range when in crop mode? I've seen similar posts claiming that DOF, light gathering, noise, etc. change, and I feel like it's all just misconceptions. Actual examples, or any sort of realistic measurement, would obviously help dispel or prove these points.
I would love to see evidence of this as well
 
I would love to see evidence of this as well
Did you guys actually read what others posted in this forum? Aren't the graphs from the measurements at Photons to Photos enough? Or did you just decide to ignore them because they didn't fit your opinion on the topic? Just asking...
 
Did you guys actually read what others posted in this forum? Aren't the graphs from the measurements at Photons to Photos enough? Or did you just decide to ignore them because they didn't fit your opinion on the topic? Just asking...
The argument is that, due to methodology (shooting the same-size target at a different distance or with a different lens), the method forces that to be the result. The flip-side argument is that if you literally take the same shot that you would have gotten with the full sensor and crop away the edges, then each pixel responds exactly as it otherwise would.

I'm unclear which is right, but more than a graph is needed. Good images with graduated hues in sequence might help shed light on this. (Pun intended.)
 
The argument is that, due to methodology (shooting the same-size target at a different distance or with a different lens), the method forces that to be the result. The flip-side argument is that if you literally take the same shot that you would have gotten with the full sensor and crop away the edges, then each pixel responds exactly as it otherwise would.

I'm unclear which is right, but more than a graph is needed. Good images with graduated hues in sequence might help shed light on this. (Pun intended.)
I've been through this one. My confusion arose from wondering how a sensor with 14-bit output (per pixel) could have more than 15 stops of dynamic range. It simply can't! It turns out that DR is calculated by down-sampling the image to 8 MP or so, printing it, then measuring DR from the print. Somehow...

Personally, while this may be "traditional", I still think it's a complete crock. If a sensor is specced at 50 megapixels, I expect to get 50 million+ pixels from it. If it is specced as having 15 stops of dynamic range, I expect each pixel to have that range. That is, I should be able to point the camera at a scene with a 15-stop range of illumination and capture it all in a single shot - but I can't. I'll either block some shadows or burn some highlights, or both.

Anyway, as I understand it now, an 8 MP image downsampled from full frame has more DR than an 8 MP image downsampled from the APS-C crop (of the same sensor). I think that's where the assertion that "FF has more DR than crop" comes from...

Personally, I find pixel-level specs more useful (and understandable).
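A toy simulation of why downsampling to a common output size raises measured DR: averaging neighbouring pixels cuts the noise by roughly the square root of the number averaged. The synthetic Gaussian "pixels" and the numbers here are my own; this is a sketch, not any lab's actual protocol:

```python
import random
import statistics

# Toy demo: averaging n noisy pixels cuts noise by ~sqrt(n), which is why
# an image downsampled to a common size measures more DR than a crop.
random.seed(0)
pixels = [random.gauss(100, 10) for _ in range(20000)]  # synthetic sensor data
single_sd = statistics.stdev(pixels)                    # per-pixel noise
# Bin 4 neighbouring pixels into one output pixel (a 2x2 downsample).
binned = [statistics.mean(pixels[i:i + 4]) for i in range(0, len(pixels), 4)]
binned_sd = statistics.stdev(binned)
print(round(single_sd / binned_sd, 1))  # close to 2.0, i.e. sqrt(4)
```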
 
The argument is that, due to methodology (shooting the same-size target at a different distance or with a different lens), the method forces that to be the result. The flip-side argument is that if you literally take the same shot that you would have gotten with the full sensor and crop away the edges, then each pixel responds exactly as it otherwise would.

I'm unclear which is right, but more than a graph is needed. Good images with graduated hues in sequence might help shed light on this. (Pun intended.)
There is no contradiction between the two ways of looking at this; both are valid views. They answer different questions, though.

How much do pixels change if I crop away the outside of an image? Not at all. Same amount of light per pixel, same changes from pixel to pixel.

How much does using a cropped sensor area impact my image IQ when changing to an "equivalent" lens? Very little. Same amount of light, same DOF, roughly the same noise, but less resolution.

How much does using a cropped sensor area impact my image IQ when using the same lens and stepping back to compensate? A lot. Perspective is changed, DOF is changed, the amount of light is changed.

A visualization of crop equivalence (if that is your question): https://www.dpreview.com/articles/2666934640/what-is-equivalence-and-why-should-i-care
 
Do full-frame cameras in the Sony lineup lose dynamic range when they are shot in super-35 mode?

If so, why?

Thanks in advance for any constructive feedback.
Yes - forget the per-pixel dynamic range many are talking about in this thread. That's irrelevant.

Instead, think about a given output, like an 8x10 photo or a 40x30 wall hanging. A general rule of thumb, given similar sensor technology between cameras, is that the more photons you get to play with between taking the picture and producing the output, the more dynamic range and less noise you will have.

This should be intuitively obvious to anyone who actually shoots with different-format cameras.

Since in this case we are talking about the same sensor in the same camera, just throwing a lot of light and pixels away, of course we are going to have more noise, which of course lowers your dynamic range because you've raised your noise floor.

If anyone is still struggling with this, just try it with your own camera. As you crop more and more and get down to the 100 percent view, you've got a lot of noise. The more you back away in your raw processor, the less noise is visible in the output.

Yes, the bright end isn't going to get noisier and shrink the dynamic range - that's a hard cut-off. But the effective noise floor goes up as you crop away part of the frame and blow the rest up to fill the space you could have used your whole sensor for.

If you still don't believe me, take a shot and crop out 3/4 of the picture, then take the same shot but move the camera closer so the framing matches your 3/4 crop. Blow them up to the same output size. What do you have with the 1/4 crop of your original? A much noisier picture, of course.
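The arithmetic behind that experiment can be sketched assuming photon shot noise dominates (the photon count below is made up; only the ratio matters):

```python
import math

# Arithmetic for the crop experiment: keep 1/4 of the frame area, enlarge to
# the same output size, and each output pixel is built from 1/4 the photons.
photons_full = 4000                        # per output pixel, full frame (assumed)
photons_crop = photons_full * (1 / 4)      # per output pixel, 1/4-area crop
snr_full = photons_full / math.sqrt(photons_full)   # shot-noise SNR = sqrt(N)
snr_crop = photons_crop / math.sqrt(photons_crop)
print(round(snr_full / snr_crop, 1))  # -> 2.0, i.e. one stop worse SNR
```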
 
These conversations always descend into a bun fight. I'm pretty sure it's because of one very important assumption about what "use crop mode" means (or "super-35 mode", to use the OP's phrasing).

For those, like me, who say it makes no difference to DR, we stand in the same place when we take the crop shot as we did when we took the FF shot. Our image is literally a crop from the middle of the original full frame. DR is the same.

For others, who say DR is reduced, I think they assume we back off some distance to include the exact same FoV in the crop view as we had for the FF shot. In that case, there are fewer pixels covering the same scene, so overall DR does drop in that definition.

In summary.

Stand still - crop shot has same DR as the same area cropped from the FF shot

Move back and reframe the same - crop shot has lower overall DR than the FF shot

Is it that? (He asks, expecting a ton of incoming opprobrium for trying to be helpful :-) ).
When cropping, the image is magnified, so noise is magnified and looks worse for a given ISO. With a worse signal-to-noise ratio, dynamic range is reduced. You're effectively using an APS-C sensor, which has worse noise and DR for essentially the same reasons. If you crop to 2x, you'll see roughly equivalent noise and DR to a Micro Four Thirds camera.

It has nothing to do with where you stand, FOV, or anything like that.
 
People confuse what dynamic range actually is. In my opinion it has nothing to do with noise, but with capturing the extremes of light.
There are well-defined, industry-standard ways to measure dynamic range. Signal-to-noise is key - there is a noise floor, a point at which there is too much noise in the shadows to create usable detail, and this is an important aspect of what dynamic range represents: data captured below the noise floor isn't considered usable information.

You can make up your own definition of dynamic range, but it's sort of like inventing your own temperature scale or calling blue orange and red green. You're welcome to do it, but people might be confused or have a hard time having a conversation with you until you explain yourself.
For the same-FoV image the FF has less noise, but the testers use noise as the way of measuring dynamic range, which is wrong IMO. I shot a pro shoot with my A6300 last week for kicks and didn't notice any drop in DR at all. In fact the images were spectacular even compared to my A7R2.

Ds
How much you'll notice the difference in DR from one camera to another depends on the situation. If it's overcast and you're not doing much post processing, most cameras made in the last 20 years can handle it pretty well. If there is high contrast lighting and you're recovering highlights and pushing up shadows, you'll generally notice the extra stop or so of dynamic range.
 
I've been through this one. My confusion arose from wondering how a sensor with 14-bit output (per pixel) could have more than 15 stops of dynamic range. It simply can't! It turns out that DR is calculated by down-sampling the image to 8 MP or so, printing it, then measuring DR from the print. Somehow...
You can represent 16 stops of dynamic range in an 8-bit image, and you can represent 8 stops of dynamic range in a 16-bit image. There is no direct relationship between 1 stop of dynamic range and 1 bit of precision for a digital image format, though the more stops the camera can record, generally the more beneficial it is to use a high bit-depth format, otherwise the gradation or granularity of the captured data is reduced.

The sensor is capable of recording a certain range of exposure values (overall dynamic range). This data is converted into a digital image format after capture via an ADC, or analog to digital converter. The ADC samples the sensor data at a given bit depth, and this is what is stored in the raw file.


Think of it like the sample rate of a digital audio file. More samples gives you a more accurate waveform, but a higher sample rate doesn't necessarily increase the dynamic range of the sound. The same is true with photography. High bit-depth image formats can record the data with more precision, which is helpful when you're capturing and manipulating a larger data set.

There are various situations which can reduce the effective dynamic range of a given camera. Increasing the ISO increases noise and results in less dynamic range (worse signal to noise ratio). Cropping the image has a similar effect. Shooting in modes with reduced bit-depth can reduce DR as well, such as burst modes, lossless compression, video in various flavors, etc.
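To make the "bits are precision, not range" point concrete, here is a toy 8-bit log encoder. It is entirely my own sketch, not any camera's actual transfer curve; it just shows that 16 stops of linear scene values can be squeezed into 8-bit codes, coarsely:

```python
import math

# Toy 8-bit log encoder: maps 16 stops of linear scene values into 8-bit
# codes. Not a real camera curve - just to show range != bit depth.
STOPS = 16

def encode8(linear):
    """linear is in (0, 1], where 1.0 is the clipping point."""
    stops_below_white = max(0.0, min(STOPS, -math.log2(linear)))
    return round(255 * (1 - stops_below_white / STOPS))

print(encode8(1.0))        # clipping point  -> 255
print(encode8(0.5))        # 1 stop down     -> 239
print(encode8(1 / 2**16))  # 16 stops down   -> 0
```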
Personally, while this may be "traditional", I still think it's a complete crock. If a sensor is specced at 50 megapixels, I expect to get 50 million+ pixels from it. If it is specced as having 15 stops of dynamic range, I expect each pixel to have that range. That is, I should be able to point the camera at a scene with a 15-stop range of illumination and capture it all in a single shot - but I can't. I'll either block some shadows or burn some highlights, or both.
Single-pixel performance is a somewhat dubious concept. For instance, if we want to know how noisy the sensor output from a given camera is, we need to look at a group of pixels at the bare minimum. If we want to know how much noise a photograph has, especially in relation to the same scene captured by another camera, we need to look at the entire image. If we want to know how much noise an image has perceptively, we also need to take into consideration the size of the image and how far the viewer is from it, as noise will be more visible when viewing large images close up than when viewing small images from far away. This is especially relevant when comparing cameras with different pixel pitches, as viewing the image zoomed to the pixel level means different magnifications for cameras with different numbers of megapixels.
Anyway, as I understand it now, an 8 MP image downsampled from full frame has more DR than an 8 MP image downsampled from the APS-C crop (of the same sensor). I think that's where the assertion that "FF has more DR than crop" comes from...

Personally, I find pixel-level specs more useful (and understandable).
 
One last thing, when discussing dynamic range it's important to keep in mind that highlight capture is binary, and shadow capture is more of a transitional effect.

When it comes to highlights, you should get a clean capture as long as the exposure value doesn't exceed the well capacity of the individual sensel. You can think of this like filling up a bucket of water. As long as your bucket (sensel well) is big enough, you're fine, but if the volume of water (exposure value) is too big, it won't fit in your bucket, and some water will spill over - or in our case the brightest highlights will clip.

Shadow data is more complex. There is a hard floor or clipping point to shadow data; however, there is a range of information recorded near the shadow floor. The quality of this data is determined by the signal-to-noise ratio. Generally, larger and/or more efficient sensors are capable of capturing more signal in the shadows, so they're capable of producing images with more dynamic range.

When it comes to cropping an image, we magnify the noise, which increases the noise levels in the shadows, thus reducing the signal-to-noise ratio and overall dynamic range.

That said, cropping does not decrease sensel well capacity, so it won't cause your highlights to clip sooner. But the effect on the overall noise of the image reduces dynamic range from the low end.

Back to the bucket example, think of cropping like filling the lower section of your bucket up with sand. The bucket is still capable of holding the same volume of material, but the bottom is no longer useful.
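Putting rough numbers on the bucket analogy (the well capacity and shadow-floor values below are made up; the 1.5x noise factor is just the linear crop factor):

```python
import math

# Numbers for the bucket analogy. Well capacity and shadow floor are made up;
# cropping leaves the clip point alone but raises the usable floor ~1.5x.
FULL_WELL = 65536             # electrons at clipping (assumed)
floor_ff = 4.0                # usable shadow floor, full frame (assumed)
floor_crop = floor_ff * 1.5   # noise magnified by the linear crop factor

dr_ff = math.log2(FULL_WELL / floor_ff)      # stops of usable range
dr_crop = math.log2(FULL_WELL / floor_crop)
print(round(dr_ff, 1), round(dr_crop, 1))    # -> 14.0 13.4
```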
 
People confuse what dynamic range actually is. In my opinion it has nothing to do with noise, but with capturing the extremes of light.
There are well-defined, industry-standard ways to measure dynamic range. Signal-to-noise is key - there is a noise floor, a point at which there is too much noise in the shadows to create usable detail, and this is an important aspect of what dynamic range represents: data captured below the noise floor isn't considered usable information.
There is no industry standard, period. The real-world standard is to take your camera outside, shoot a neutral grey card, and see how many stops of recorded detail there are. I bet you won't go past 9 :-)
You can make up your own definition of dynamic range, but it's sort of like inventing your own temperature scale or calling blue orange and red green. You're welcome to do it, but people might be confused or have a hard time having a conversation with you until you explain yourself.
For the same-FoV image the FF has less noise, but the testers use noise as the way of measuring dynamic range, which is wrong IMO. I shot a pro shoot with my A6300 last week for kicks and didn't notice any drop in DR at all. In fact the images were spectacular even compared to my A7R2.

Ds
How much you'll notice the difference in DR from one camera to another depends on the situation. If it's overcast and you're not doing much post processing, most cameras made in the last 20 years can handle it pretty well. If there is high contrast lighting and you're recovering highlights and pushing up shadows, you'll generally notice the extra stop or so of dynamic range.
Pixel size is pixel size, period. They produce the same voltage; they are not connected in series to the next.

Ds

--
The confusion starts when the scientists can't agree amongst themselves. Henry F
 
People confuse what dynamic range actually is. In my opinion it has nothing to do with noise, but with capturing the extremes of light. For the same-FoV image the FF has less noise, but the testers use noise as the way of measuring dynamic range, which is wrong IMO.
Looks like you don’t understand it after all. Noise is exactly the thing that prevents detail from being detected in dark areas of the scene. Not sure what you think it is, if not noise?
I shot a pro shoot with my A6300 last week for kicks and didn't notice any drop in DR at all. In fact the images were spectacular even compared to my A7R2.

Ds
Sure. That camera would have more than enough DR for the average “pro shoot”. But that doesn’t make your previous sentences right. ;-)

--
"A picture is a secret about a secret: the more it tells you, the less you know." —Diane Arbus
 
Sure. That camera would have more than enough DR for the average “pro shoot”. But that doesn’t make your previous sentences right. ;-)
How do you figure that? Black is black and white is white. Or am I missing something? I shoot black costumes on white backgrounds.

Ds

--
The confusion starts when the scientists can't agree amongst themselves. Henry F
 
How do you figure that? Black is black and white is white. Or am I missing something? I shoot black costumes on white backgrounds.
Even black velvet still has enough reflectance (about 1%) to be easily visible in an otherwise properly exposed shot. Similarly, non-overexposed whites retain at least some structure. So there should be no pure whites or pure blacks.

Of course this may not be visible on your computer display. Typical computer displays (and JPEGs in general) can only reproduce 8 bits or less. Since your a6300 captures more than 10 bits (at base ISO), you may not notice any impact on DR at all.
 
Even black velvet still has enough reflectance (about 1%) to be easily visible in an otherwise properly exposed shot. Similarly, non-overexposed whites retain at least some structure. So there should be no pure whites or pure blacks.

Of course this may not be visible on your computer display. Typical computer displays (and JPEGs in general) can only reproduce 8 bits or less. Since your a6300 captures more than 10 bits (at base ISO), you may not notice any impact on DR at all.
My monitor is 10-bit, but "bits" have nothing to do with DR.

Ds

--
The confusion starts when the scientists can't agree amongst themselves. Henry F
 
