Colour space settings

DavidMillier

Mike Johnston over on the TOP blog wrote in a post today:

"So why do we set our cameras to Adobe RGB if we're shooting raw, given that the raw file isn't tagged with a color space? Simply because it affects some of the camera settings, such as the histogram. The histogram and certain other affected settings will more closely represent the raw file if the camera is set to Adobe RGB. "

That is news to me, is it correct?
 
It is a huge oversimplification. For example, ARGB and sRGB have different tone curves, hence different x-axes for the histogram. Neither is the tone curve of the raw file, which is linear. The white points of the two spaces are both D65, so that's not a difference. In a one-channel histogram, the quantity being binned is luminance, which you can compute from either space.
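For reference, here are the two transfer functions involved (these are the published sRGB and Adobe RGB (1998) curves, not any particular camera's actual JPEG processing), which shows how far both sit from the linear raw data:

def srgb_encode(x):
    # sRGB transfer function: linear toe below 0.0031308, power segment above
    return 12.92 * x if x <= 0.0031308 else 1.055 * x ** (1 / 2.4) - 0.055

def adobe_rgb_encode(x):
    # Adobe RGB (1998) uses a pure power-law gamma of about 2.2 (563/256)
    return x ** (1 / 2.19921875)

# The raw file itself is linear, so its "encoded" value would just be x.
for linear in (0.01, 0.05, 0.18, 0.50, 1.00):
    print(f"linear {linear:.2f} -> sRGB {srgb_encode(linear):.3f}, "
          f"Adobe RGB {adobe_rgb_encode(linear):.3f}")

Mid-tones get pushed well up the scale by either curve, which is why neither in-camera histogram lines up with the linear raw values.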
 
I have heard that claim several times.

IMO, the histogram with Adobe RGB is still so far off from the raw histogram that any potential benefit is inconsequential.
 
IF you want the in-camera histo to get close to the raw histo, UniWB is the way to go. If you've done that, it doesn't make much difference whether you set the color space to ARGB or not. ARGB is a bit better because it doesn't clip as many colors.
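To illustrate the white-balance part of that (the multipliers and channel levels below are invented, and the real preview pipeline also applies a tone curve and colour matrix), here is a toy sketch of why the preview histogram clips before the raw data does, and why UniWB tracks the raw data more closely:

# Toy example: the preview is built from white-balanced data, so channels
# with large WB multipliers can clip in the preview while the raw data is fine.
raw_level = {"R": 0.55, "G": 0.80, "B": 0.60}   # hypothetical raw levels, as a fraction of full scale
daylight_wb = {"R": 2.0, "G": 1.0, "B": 1.5}    # assumed multipliers; real values are camera-specific
uniwb = {"R": 1.0, "G": 1.0, "B": 1.0}          # UniWB: multipliers forced to ~1.0

for label, wb in (("daylight WB", daylight_wb), ("UniWB", uniwb)):
    for ch, level in raw_level.items():
        scaled = level * wb[ch]
        note = "  <- clipped in the preview, not in the raw" if scaled > 1.0 else ""
        print(f"{label}: {ch} raw={level:.2f} preview={min(scaled, 1.0):.2f}{note}")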
 
What is the difference between the standard JPEG histogram and raw clipping? Is it practical to just choose a fixed amount of overexposure according to the histogram, or is it more variable?

I'm normally so nervous about clipping highlights that I underexpose the JPEG histogram by 2 stops (I exaggerate for drama, but there is a degree of truth). My A7r2 has an adjustable zebra indicator that I set to the max of 105%, but really it could safely go higher still.
 
What is the difference between the standard jpeg histogram and raw clipping?Is it practical to just choose a fixed amount of overexposure according to the histogram
You could do that if your subjects and your lighting were always the same.
or is it more variable?

I'm normally so nervous about clipping highlights I underexpose the jpeg histogram by 2 stops (I exaggerate for drama, but there is a degree of truth). My A7r2 has an adjustable zebras indicator that I set to the max 105% but really it could safely go higher still.
Get RawDigger and find out how much raw file underexposure your method is giving you.
 
I don't think RawDigger works on Linux, but darktable has a raw clipping indicator. I'm just not sure how to use it. It seems to work a bit like the normal over/underexposure indicators and shows areas that have clipped the "white point" (is that the sensor saturation point?). Useful for spotting overexposure, not so much underexposure.
 
If you don't have a way to look at raw histograms, you're not going to understand what you need to do with your camera to get ETTR raw files.
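On Linux, one way to look at the actual raw numbers is the rawpy library (a Python wrapper around LibRaw). A minimal sketch, assuming rawpy is installed; the file name is just a placeholder:

import math
import rawpy

path = "example.ARW"  # placeholder; use one of your own raw files
with rawpy.imread(path) as raw:
    data = raw.raw_image_visible.astype(float)
    colors = raw.raw_colors_visible    # per-pixel CFA channel index (typically 0=R, 1=G, 2=B, 3=G2; see raw.color_desc)
    white = raw.white_level            # LibRaw's idea of the clipping level
    blacks = raw.black_level_per_channel
    for ch, name in enumerate(("R", "G", "B", "G2")):
        values = data[colors == ch] - blacks[ch]
        peak = values.max()
        # Headroom in stops between the brightest recorded value and clipping
        headroom = math.log2((white - blacks[ch]) / peak) if peak > 0 else float("inf")
        print(f"{name}: peak {peak:.0f} of {white - blacks[ch]:.0f}, headroom {headroom:.2f} stops")

It is not RawDigger, but it is enough to see how far each channel sits below clipping.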
 
I've got RawDigger running under Wine, but I'm a bit baffled as to how to open a raw file, as it can't see my data disk where the files live. Working on it.

Edit: Ok, managed to figure out how to mount and access my photo drive and have loaded a random image which is previewing in the main window.

 
[Attached screenshot: RawDigger histogram of the loaded raw file]
 
Why does it say EV0 = 2048? Shouldn't it start from 0?

It's all a bit mysterious at first sight. I'm looking for documentation I can understand.

Cool, though :-)
The standard Windows installation includes a manual. If you don't have one, I'm pretty sure you can download one from their website.
 
Are the vertical bars 1 stop intervals?
Yes.
And where is the clipping point on the graph?
I don’t know your camera, but I’m assuming 14 bit precision, so about 16000 counts.
Ah right.

So red channel 2 stops under, green channels 1 stop under and blue channel about 1.5 stops.

Presumably the target is to have all channels as close to 16000 as possible without any exceeding that value?

In terms of learning to expose properly, I guess the approach might be to take some test shots, ETTR-ing using the JPEG histogram, then compare the JPEG histogram to the RawDigger one to work out the headroom available and how much it is safe to overexpose according to the JPEG histogram?

Assuming for didactic purposes, that this example is typical, I should be going at least a stop hotter.
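A quick back-of-the-envelope check of that reading (the clipping point and the per-channel peaks below are approximations chosen to match the figures quoted above, not measured values):

import math

clip = 16000                                # assumed 14-bit clipping level, per the earlier post
peaks = {"R": 4000, "G": 8000, "B": 5700}   # approximate histogram peaks, for illustration only
for ch, peak in peaks.items():
    print(f"{ch}: {math.log2(clip / peak):.1f} stops below clipping")
# -> R: 2.0, G: 1.0, B: ~1.5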
 
Found the manual, thanks. It says the EV0 point is set as 3 stops below clipping (see the quick arithmetic at the end of this post). That clears up that mystery.

I guess that means the perfect exposure yields +EV2.999... in all 4 channels.

I usually try for about 1/3 EV below the hottest channel of the jpeg histogram. I guess that means I have at least a stop of headroom unused with this strategy, possibly more.

And this means more noise than is necessary.

Something to test/check on all my cameras.
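That definition also clears up the earlier EV0 = 2048 question: if the clipping point is taken to be at (or very near) the top of the 14-bit scale, three stops down is a factor of eight:

full_scale = 2 ** 14        # 16384 for a 14-bit raw file
ev0 = full_scale // 2 ** 3  # three stops below clipping
print(ev0)                  # 2048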

 
Presumably the target is to have all channels as close to 16000 as possible without any exceeding that value?
If that's your goal you'll have to use filters in front of the lens. Most people just want the hottest channel close to clipping.
 
The target is to avoid clipping in relevant areas of the image. Sometimes it is only one channel that clips.

The histograms show you whether there is clipping. Light clipping may be OK, as post-processing programs can reconstruct some missing data using non-clipped channels (highlight recovery).

There is an "OvExp" check box in the main window. It displays which parts of the images have blown highlights.

The difference between raw and JPEG clipping is not constant and depends on the scene.

It is more important not to clip relevant parts than to have an optimally exposed image.
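To illustrate the idea behind highlight recovery (a deliberately crude sketch: the white-balance multipliers and pixel values are invented, and real converters do considerably more than this):

# For a roughly neutral highlight, the white-balanced channels should match:
# R*wbR ~ G*wbG ~ B*wbB. If one raw channel has clipped, the unclipped ones
# suggest what it "should" have been.
wb = {"R": 2.0, "G": 1.0, "B": 1.5}          # assumed multipliers, camera-specific
pixel = {"R": 0.60, "G": 1.00, "B": 0.75}    # normalised raw values; G is sitting at the clip level

est_from_r = pixel["R"] * wb["R"] / wb["G"]  # 1.20
est_from_b = pixel["B"] * wb["B"] / wb["G"]  # 1.125
# Taking the smaller estimate is one conservative choice.
print(f"G reconstructed as about {min(est_from_r, est_from_b):.2f} (recorded as clipped at 1.00)")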
 
I guess the thing to do is to bracket a bit and then use RawDigger to check the effect. Probably better than always underexposing by 2 stops 'just to be safe'.
 
