Working color space

Started Apr 26, 2012 | Discussions
10sj03 Regular Member • Posts: 388
Working color space

I think I am missing a basic concept about working color spaces. If I am editing my photos on a computer monitor that can only display sRGB, is there any advantage to selecting ProPhoto or even Adobe RGB as the working color space? If I cannot see the difference on the monitor, except that the larger color space does not show clipping on the color channels, what do I gain by using a larger working color space?

Thanks.

Joe

gollywop Veteran Member • Posts: 8,279
Re: Working color space

10sj03 wrote:

I think I am missing a basic concept about working color spaces. If I am editing my photos on a computer monitor that can only display sRGB, is there any advantage to selecting ProPhoto or even Adobe RGB as the working color space? If I cannot see the difference on the monitor, except that the larger color space does not show clipping on the color channels, what do I gain by using a larger working color space?

It depends on what you want to do with the finished image. If it is going to an inkjet printer, the printer likely has a color space larger than sRGB. In that case you gain if you are processing a raw file in 16 bits, because the larger space gives you greater latitude for tonal and color adjustments to your image.

If you are shooting and processing a JPEG, however, this is considerably less of a consideration, and you're likely just fine staying in sRGB, or Adobe RGB at most -- but the latter only if you've got your camera's color space set to Adobe RGB as well. If the camera's color space is set to sRGB, you'll never regain the broader colors present in Adobe RGB.

If you're processing for the web, there is a good case to be made for staying with sRGB all the way through. Crunching the broader spaces down into sRGB, which must be done as a final step when processing for the web, is often quite problematic.
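gollywop's point about 16-bit latitude can be illustrated with a toy sketch (a hypothetical NumPy example, not anything from this thread): a strong darken-then-brighten round trip destroys tonal levels when carried out in 8-bit integers, but survives at higher precision.

```python
import numpy as np

# A smooth 8-bit gradient, standing in for a clear-sky area of a photo.
grad8 = np.arange(256, dtype=np.uint8)

# Darken by 2 stops and brighten back, staying in 8-bit integers throughout.
dark8 = np.round(grad8.astype(np.float64) * 0.25).astype(np.uint8)
back8 = np.clip(dark8.astype(np.float64) * 4.0, 0, 255).round().astype(np.uint8)

# The same round trip in floating point (a stand-in for a 16-bit pipeline).
dark_f = grad8.astype(np.float64) * 0.25
back_f = np.clip(dark_f * 4.0, 0, 255).round().astype(np.uint8)

print(len(np.unique(back8)))   # far fewer distinct levels -> visible banding
print(len(np.unique(back_f)))  # all 256 levels survive
```

The few dozen surviving levels in the 8-bit version show up as banding in smooth gradients; a 16-bit (or float) working pipeline keeps all 256 output levels.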


gollywop

-----------

Simon Garrett Veteran Member • Posts: 5,853
Re: Working color space

gollywop wrote:

10sj03 wrote:

I think I am missing a basic concept about working color spaces. If I am editing my photos on a computer monitor that can only display sRGB, is there any advantage to selecting ProPhoto or even Adobe RGB as the working color space? If I cannot see the difference on the monitor, except that the larger color space does not show clipping on the color channels, what do I gain by using a larger working color space?

It depends on what you want to do with the finished image.

Quite, and I agree that if you know your only use for the image is to put it on the web, then you might as well do everything in sRGB, as gollywop says.

Using a larger working space keeps your options open. Lightroom, for example, always uses ProPhoto RGB as its working space for that reason; you don't get a choice.
--
Simon

Simon Garrett's gear list: Nikon D800
OP 10sj03 Regular Member • Posts: 388
Re: Working color space

gollywop wrote:

10sj03 wrote:

I think I am missing a basic concept about working color spaces. If I am editing my photos on a computer monitor that can only display sRGB, is there any advantage to selecting ProPhoto or even Adobe RGB as the working color space? If I cannot see the difference on the monitor, except that the larger color space does not show clipping on the color channels, what do I gain by using a larger working color space?

It depends on what you want to do with the finished image. If it is going to an inkjet printer, the printer likely has a color space larger than sRGB. In that case you gain if you are processing a raw file in 16 bits, because the larger space gives you greater latitude for tonal and color adjustments to your image.

If you are shooting and processing a JPEG, however, this is considerably less of a consideration, and you're likely just fine staying in sRGB, or Adobe RGB at most -- but the latter only if you've got your camera's color space set to Adobe RGB as well. If the camera's color space is set to sRGB, you'll never regain the broader colors present in Adobe RGB.

If you're processing for the web, there is a good case to be made for staying with sRGB all the way through. Crunching the broader spaces down into sRGB, which must be done as a final step when processing for the web, is often quite problematic.


gollywop

-----------

Thanks for your reply. I think I follow your logic, and it makes sense. When I attended a photography class a while ago, the instructor recommended that we set the camera to Adobe RGB and shoot raw. He also said to set Lightroom's working color space to ProPhoto. His view is to use the largest color space available.

My question is: if my monitor can only display sRGB, would that be the limiting factor, since I won't be able to edit what I can't see on screen due to the limited sRGB space?

As to what I want to do with the finished image -- both prints from a lab and the web. For the web I convert to sRGB as a final step.

Joe

joey_B Veteran Member • Posts: 3,080
Re: Working color space

10sj03 wrote:

Thanks for your reply. I think I follow your logic, and it makes sense. When I attended a photography class a while ago, the instructor recommended that we set the camera to Adobe RGB and shoot raw. He also said to set Lightroom's working color space to ProPhoto. His view is to use the largest color space available.

A good instructor would have told you 'why'... First, to clear up a misconception: sRGB doesn't 'fit' into CMYK (a printer's profile), but the two overlap a great deal. Some blues can be displayed but not printed; some lighter colors and yellows can be printed but not displayed. If you want complete control over all of those colors, you would use a working space that encompasses them all. In the RGB world that's ProPhoto, or you could work in a totally different color SYSTEM, like L*a*b*.

My question is: if my monitor can only display sRGB, would that be the limiting factor, since I won't be able to edit what I can't see on screen due to the limited sRGB space?

It will be limiting if you don't know what kind of clipping occurs when colors are not viewable. You also have to be very aware of how colors are 'translated' from one device (the camera) with sRGB, through ProPhoto, to a CMYK device (your printer). If you are editing colors that you can't see, you might be surprised by the printed output. But if you are aware of that, and use the soft-proofing function in Photoshop with the gamut mismatch warning on, you can get a good idea of where the colors you see don't match the colors you print. You can use that to your advantage.

For a beginner, it would make more sense to work in sRGB (because that is the de facto standard for unmanaged viewing for most people), and to get as good a translation to CMYK as possible. You cut out two additional translation steps that way. When you get further into photography, you can delve into color management and set up the system another way.

As to what I want to do with the finished image -- both prints from a lab and the web. For the web I convert to sRGB as a final step.

That is possible. Make sure to choose the rendering intent that suits you best. Especially with giant spaces like ProPhoto, the translation down to a smaller space like sRGB can become an issue. You can read up on this at http://www.cambridgeincolour.com
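The soft-proof gamut warning mentioned above can be sketched numerically. This is a hypothetical illustration of the gamut check only, not Photoshop's actual soft-proofing pipeline (which also involves tone curves and rendering intents): convert a linear Adobe RGB color through XYZ into linear sRGB using the standard published matrices, and flag any color whose sRGB components leave [0, 1].

```python
import numpy as np

# Standard linear-RGB -> XYZ (D65) matrices for Adobe RGB (1998) and sRGB.
ADOBE_TO_XYZ = np.array([[0.5767, 0.1856, 0.1882],
                         [0.2974, 0.6273, 0.0753],
                         [0.0270, 0.0707, 0.9911]])
SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])
XYZ_TO_SRGB = np.linalg.inv(SRGB_TO_XYZ)

def out_of_srgb_gamut(rgb_adobe, eps=1e-6):
    """True if a linear Adobe RGB color has no sRGB representation."""
    srgb = XYZ_TO_SRGB @ ADOBE_TO_XYZ @ np.asarray(rgb_adobe, dtype=float)
    return bool(np.any(srgb < -eps) or np.any(srgb > 1 + eps))

print(out_of_srgb_gamut([0.0, 1.0, 0.0]))  # saturated Adobe RGB green: True
print(out_of_srgb_gamut([0.5, 0.5, 0.5]))  # mid gray: False
```

Fully saturated Adobe RGB green maps to a negative sRGB red component, which is exactly the kind of pixel a gamut warning would highlight; neutral grays pass through unchanged.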

Chris Noble Senior Member • Posts: 2,476
Some misconceptions

10sj03 said:

When I attended a photography class a while ago, the instructor recommended us to set camera to Adobe RBG and shoot raw.

The camera gamut setting only affects the in-camera JPEG, not the Raw file.

He also said to set Lightroom working color space to ProPhoto. His views are to set to the largest color space available.

The main reason to use ProPhoto RGB as a working space is that it is 16-bit. Your Raw file is either 12 or 14 bits, and the output file (aRGB or sRGB) will be 8 bits. You want a higher working space resolution so you don't get quantization errors (= noise, color shifts and banding) as you process the image.

Chris Noble's gear list: Panasonic Lumix DMC-GM5, Panasonic Lumix DC-G9, Panasonic Lumix G Vario 7-14mm F4 ASPH, Panasonic Lumix G 20mm F1.7 ASPH, Panasonic Leica Summilux DG 25mm F1.4 +3 more
Simon Garrett Veteran Member • Posts: 5,853
Re: Some misconceptions

Chris Noble wrote:

The main reason to use ProPhoto RGB as a working space is that it is 16-bit. Your Raw file is either 12 or 14 bits, and the output file (aRGB or sRGB) will be 8 bits. You want a higher working space resolution so you don't get quantization errors (= noise, color shifts and banding) as you process the image.

Chris, I think I know what you mean, but I don't follow the words.

ProPhoto RGB colour space could be 8-bit, 16-bit or any other. However, with a very wide colour space, there's more danger of quantisation errors unless you use 16-bit or more. Hence, as you say, if you are using a wide colour space then you should work in 16-bit.

Jpeg is always 8-bit, and so ProPhoto RGB is not a good choice for jpeg, IMHO.

However:

He also said to set Lightroom working color space to ProPhoto. His views are to set to the largest color space available.

Lightroom doesn't give you the choice: it always uses 16-bit ProPhoto as its working space. You can choose in Photoshop, and I use ProPhoto RGB there simply because I use Lightroom; most times I edit in Photoshop I open it from Lightroom, where the image will already be in ProPhoto RGB.
--
Simon

Chris Noble Senior Member • Posts: 2,476
Some misconceptions on gamut vs. resolution

Simon said:
Chris, I think I know what you mean, but I don't follow the words.

I posted before my first cup of coffee -- sorry! I mixed comments on gamut with comments on resolution, which are two different and independent aspects of image coding.

ProPhoto RGB colour space could be 8-bit, 16-bit or any other.

Quite right. As could sRGB and aRGB.

However, with a very wide colour space, there's more danger of quantisation errors unless you use 16-bit or more. Hence, as you say, if you are using a wide colour space then you should work in 16-bit.

The higher working space resolution is more important than the wider gamut. The working color space can arguably have the same gamut as the output space (although a slightly wider one is probably useful), but it needs to have a much higher resolution (in this case 16 bits vs. 8 bits for the output) or else quantization error accumulates through the imaging pipeline.

So you could theoretically process an image intended for 8-bit sRGB output in 16-bit sRGB, as long as none of the pixels are right on the gamut edge. 16-bit ProPhoto RGB gives you the finer resolution that you always need, and a wider gamut than your output just in case you need that as well.

If you shoot JPEGs for the web that you don't plan to print, you should arguably shoot sRGB rather than aRGB, because the image is captured at the same bit depth as the output. So you get a lower output resolution when you convert aRGB to sRGB than you would if you had shot sRGB to begin with: the aRGB coding squeezes a wider gamut into the same 8 bits as the narrower sRGB, hence sacrificing color resolution.

joey_B Veteran Member • Posts: 3,080
Re: Some misconceptions

Chris Noble wrote:

The camera gamut setting only affects the in-camera JPEG, not the Raw file.

True, but then again, the raw file isn't a bitmap file; it has no need for a color space.

The main reason to use ProPhoto RGB as a working space is that it is 16-bit.

Untrue, in the first place because bit depth is a property of a file, not of a color space: any RGB space can handle 8-bit or 16-bit images. You can convert the raw file from your camera into either one using a raw developer. If you let your camera make the JPEG, it will be 8-bit only, because the JPEG format cannot hold 16 bits of information.

Your Raw file is either 12 or 14 bits, and the output file (aRGB or sRGB) will be 8 bits. You want a higher working space resolution so you don't get quantization errors (= noise, color shifts and banding) as you process the image.

'Resolution' isn't the word used for that; it's bit depth. It describes the number of values a piece of information can have. In an 8-bit image, every pixel has color values for 3 channels (R, G and B, hence the name), each of which can range from 0 to 255 (2^8 = 256 values). In a 16-bit image, each channel can range from 0 to 65,535 (2^16 = 65,536 values). When you go and edit such a picture it will be less prone to posterisation or banding.
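As a quick check on these numbers (the often-quoted 16.7 million is the total count of 8-bit RGB combinations, 256^3, not a per-channel range):

```python
# Per-channel levels at each bit depth, and total 8-bit RGB combinations.
levels_8 = 2 ** 8               # 256 values per channel
levels_16 = 2 ** 16             # 65,536 values per channel
total_colors_8 = levels_8 ** 3  # ~16.7 million distinct 8-bit RGB colors

print(levels_8, levels_16, total_colors_8)
```

So a 16-bit file offers 256 times finer steps per channel than an 8-bit one, which is the headroom that protects against posterisation during heavy edits.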

Chris Noble Senior Member • Posts: 2,476
Re: Some misconceptions

Joey, see my post above that explains color space vs resolution in more detail. "Bit depth" and "resolution" mean the same thing: how many quantization levels are available. "Gamut" and "color space" also mean the same thing: the range of colors that are included in the coding.

gollywop Veteran Member • Posts: 8,279
Re: Some misconceptions

Chris Noble wrote:

"Bit depth" and "resolution" mean the same thing: how many quantization levels are available.

There is a (somewhat perverted) sense in which this could be said to be true, since higher bit depth means there are more divisions between any two specific colors. However, the term resolution is not really employed in this context since there is nothing being resolved.

"Gamut" and "color space" also mean the same thing: the range of colors that are included in the coding.

You have correctly identified "gamut" above, but not color space. A color space entails a set of numbers on each color in the gamut and can entail a gamma as well. Two different color spaces could, in principle, have the same gamut, so the two terms do not, and cannot, mean the same thing.

You, of course, are free to use whatever terminology you wish for whatever concepts you wish. But you'll encounter communications and logical problems when you try to identify two terms that really refer to different things. That's precisely what's happening here now.


gollywop

-----------

Chris Noble Senior Member • Posts: 2,476
Re: Some misconceptions

Gollywop said:

There is a (somewhat perverted) sense in which this could be said to be true, since higher bit depth means there are more divisions between any two specific colors. However, the term resolution is not really employed in this context since there is nothing being resolved.

Well, as they say, perversion is in the eye of the beholder ;); what is being resolved here is the quantization steps in the transfer function between the input and output spaces. "Bit depth" and "resolution" are used interchangeably in lots of signal-processing software, not just photography. Another term for exactly the same thing is "dynamic range", which is sometimes expressed in EV, dB and, yes, in "ENOB" (equivalent number of bits). It's all digital signal processing, whether the files are photographs, music, radar signals...

You have correctly identified "gamut" above, but not color space. A color space entails a set of numbers on each color in the gamut and can entail a gamma as well. Two different color spaces could, in principle, have the same gamut, so the two terms do not, and cannot, mean the same thing.

I stand corrected on that one, thank you. The color space includes a range, which is the gamut, but also has other characteristics related to the profile. I was focusing on the difference between gamut and resolution and I should have been more precise.

joey_B Veteran Member • Posts: 3,080
Re: Some misconceptions

Chris Noble wrote:

... Another term for exactly the same thing is "dynamic range", which is sometimes expressed in EV, dB and, yes, in "ENOB" (equivalent number of bits). It's all digital signal processing, whether the files are photographs, music, radar signals...

And again you are wrong: in photography, the term dynamic range is used to describe the cut-off range of the sensor's photosites, the sensitivity of the sensor so to speak.

gollywop Veteran Member • Posts: 8,279
Re: Some misconceptions

Chris Noble wrote:

Gollywop said:

There is a (somewhat perverted) sense in which this could be said to be true, since higher bit depth means there are more divisions between any two specific colors. However, the term resolution is not really employed in this context since there is nothing being resolved.

Well, as they say, perversion is in the eye of the beholder ;); what is being resolved here is the quantization steps in the transfer function between the input and output spaces. "Bit depth" and "resolution" are used interchangeably in lots of signal-processing software, not just photography. Another term for exactly the same thing is "dynamic range", which is sometimes expressed in EV, dB and, yes, in "ENOB" (equivalent number of bits). It's all digital signal processing, whether the files are photographs, music, radar signals...

You're getting worser and worser. DR has nothing directly to do with the color bit depth. If I process an image with 16-bit color coding and change to 8-bits, keeping the color space the same, I do not lose DR.

ENOB, by the way, means Effective Number of bits.
--
gollywop


Chris Noble Senior Member • Posts: 2,476
Re: Some misconceptions

Joey said:

In photography, the term dynamic range is used to describe the cut-off range of the sensor's photosites, the sensitivity of the sensor so to speak.

Joey, no one decided to create photography-specific definitions for any of these terms or concepts. They have existed for a long time before the first digital camera.

Sensitivity and dynamic range are related but are not the same. There are plenty of references on the Web that can explain the differences to you.

Chris Noble Senior Member • Posts: 2,476
Re: Some misconceptions

See my answer to Joey above.

Chris Noble Senior Member • Posts: 2,476
Re: Some misconceptions

Gollywop said:

If I process an image with 16-bit color coding and change to 8-bits, keeping the color space the same, I do not lose DR.

You are confusing range and dynamic range. The dynamic range is the ratio of the range to the resolution. You are right that you don't change the range in going from 16 bits to 8, but you change the dynamic range from 65,536 values to 256 values.

gollywop Veteran Member • Posts: 8,279
You should have quit while you were behind. (nt)

gollywop

-----------

joey_B Veteran Member • Posts: 3,080
Re: Some misconceptions

So, considering this forum IS about photography, you are seriously stating that the sensel dimensions of a sensor (usually referred to as 'resolution') are the same as the theoretical amount of data each sensel could capture (bit depth), and the same as the real capacity of that sensel between the cut-off points (dynamic range)...

So I could sell you a camera with a sensor of 2 by 3 pixels and an almost non-existent dynamic range, as long as it is 12-bit capable on paper...

get real.

Chris Noble Senior Member • Posts: 2,476
Re: Some misconceptions

Joey said... whatever.

Joey, it appears that you understand less about this subject than your rattling of words initially suggests. You could start by studying the relationship between an analog sensor and the digital sampling of that sensor's output, and how the terms "signal to noise", "sensitivity" and "resolution" are related. Enjoy! And no need to respond.
