Slightly O/T but what is your workflow for color management?

Status
Not open for further replies.

Michael Berg
Hello all,

Is it just me, or is it more than a little difficult to get a clear and concise description of the differences between color spaces - and, more specifically, how and when to apply which color space?

Try a Google search on the topic. The first link that pops up is our old friend Ken Rockwell, who positively hates AdobeRGB because he clearly had some issues with it back in the day and has been soured on the whole idea ever since. Use sRGB "just to be on the safe side". Uh. Really?

Nearly all the links that you find will tell you something like this:
  • Don't use anything but sRGB "on the web".
  • AdobeRGB is mostly for print
  • Displays can't show anything but sRGB anyway
  • Browsers don't work with anything but sRGB
  • There are many more colors in the AdobeRGB color space than in sRGB
As far as I can tell, all of the above are complete nonsense. For example, there are precisely as many distinct color values in sRGB as there are in AdobeRGB - about 16.7 million. Both are encoded with 24-bit color, 8 bits per channel. You don't get more colors in AdobeRGB; the color "steps" are just larger and so "reach further" (go brighter/more saturated) - (255,0,0) is a much deeper red in AdobeRGB than in sRGB.
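To put a rough number on that, here is a small sketch (plain Python; the function names are mine, the matrices are the standard published D65 ones, and gamma handling is simplified to each space's nominal curve) that asks where sRGB's reddest red lands inside AdobeRGB:

```python
# Sketch: where does sRGB's reddest red (255,0,0) land inside AdobeRGB?
# Route: sRGB bytes -> linear -> XYZ -> linear AdobeRGB -> AdobeRGB bytes.

def srgb_decode(v):
    v /= 255.0
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def adobe_encode(v):
    v = max(0.0, min(1.0, v))
    return round(255 * v ** (1 / 2.19921875))  # AdobeRGB's nominal gamma

SRGB_TO_XYZ = [(0.4124564, 0.3575761, 0.1804375),
               (0.2126729, 0.7151522, 0.0721750),
               (0.0193339, 0.1191920, 0.9503041)]

XYZ_TO_ADOBE = [( 2.0413690, -0.5649464, -0.3446944),
                (-0.9692660,  1.8760108,  0.0415560),
                ( 0.0134474, -0.1183897,  1.0154096)]

def mat_vec(m, v):
    return [sum(a * b for a, b in zip(row, v)) for row in m]

def srgb_to_adobe(rgb):
    linear = [srgb_decode(c) for c in rgb]
    xyz = mat_vec(SRGB_TO_XYZ, linear)
    return [adobe_encode(c) for c in mat_vec(XYZ_TO_ADOBE, xyz)]

print(srgb_to_adobe([255, 0, 0]))  # roughly [219, 0, 0]
```

Pure sRGB red only reaches roughly value 219 on the AdobeRGB red axis; the headroom from there up to 255 is the extra saturation that sRGB simply cannot encode.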

And as far as monitors go - well, can you still buy a display that doesn't support P3, AdobeRGB or ProPhoto color spaces, or at least something a lot wider than sRGB? Even mobile devices like iPads are P3, as are phones that have OLED screens - and even your plain old iPhone. So no, it is not true that "sRGB works best on the web". It's true though that it doesn't work for anyone who has a wide gamut monitor.

If you edit your photos in sRGB on your wide gamut monitor, then your colors will look drastically oversaturated and you might end up processing them incorrectly. I wonder how many users out there are unknowingly facing this problem right now.

And browsers? Browsers like Chrome, Safari and Firefox are in fact color managed - have been for years. They pull out embedded ICC profiles and render the image according to the gamut of the user's display. Some conversions are possible, some are not, but the point is: if you have a wide gamut monitor and have configured your operating system to use this color space, then browsers will in fact try to render images according to their color space profiles.
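For the curious, the mechanics are roughly this: a JPEG carries its profile in APP2 segments tagged "ICC_PROFILE", and color managed software digs those bytes out. A minimal stdlib-only sketch (the function names and the toy profile bytes are mine; real profiles can span multiple APP2 chunks, which this ignores):

```python
# Sketch: embed a (fake) ICC profile into a toy JPEG-like byte stream the
# way real JPEGs do, then extract it again - the same lookup a color
# managed browser performs.
import struct

ICC_TAG = b"ICC_PROFILE\x00"

def embed_icc(profile):
    # SOI marker, then one APP2 segment: length, tag, chunk 1 of 1, payload
    payload = ICC_TAG + bytes([1, 1]) + profile
    return b"\xff\xd8\xff\xe2" + struct.pack(">H", len(payload) + 2) + payload

def extract_icc(jpeg):
    i = 2                                    # skip the SOI marker
    while i + 4 <= len(jpeg):
        if jpeg[i] != 0xFF:
            return None
        marker = jpeg[i + 1]
        seglen = struct.unpack(">H", jpeg[i + 2:i + 4])[0]
        data = jpeg[i + 4:i + 2 + seglen]
        if marker == 0xE2 and data.startswith(ICC_TAG):
            return data[len(ICC_TAG) + 2:]   # drop tag + chunk numbering
        i += 2 + seglen
    return None

fake_profile = b"stand-in-for-real-profile-bytes"
assert extract_icc(embed_icc(fake_profile)) == fake_profile
```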

Stating that AdobeRGB is mostly for prints - I don't even know where that could have come from. Maybe the first wide gamut devices were printers, but that was back in the 1990s. Today's displays - not just computer displays but TVs as well - are absolutely wider gamut than sRGB. So this statement simply isn't true. Yet it is repeated again and again in articles you find on the subject. Even recent articles - "Don't use AdobeRGB unless you intend to print".

But granted, this is a very difficult subject to understand. There are so many variables involved it's easy to see why people get confused:

- Your monitor may allow displaying images using sRGB or AdobeRGB (my Benq has a physical switch that will drop it down from AdobeRGB to sRGB).
- Your operating system may need to be told what color space your monitor is using. Fail to configure that and your raw processor may not show you the image you expect.
- Your image may come with an embedded sRGB or AdobeRGB ICC color profile
- Your audience may have monitors that only support sRGB

And the chances of getting it wrong are plentiful:

- Show an sRGB image on an sRGB monitor and it will look as you intended.
- Show an sRGB image on a wide gamut monitor and it will look oversaturated
- Show an AdobeRGB image on a wide gamut monitor and it will look as you intended
- Show an AdobeRGB image on an sRGB monitor and it will look washed out
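The oversaturated case can even be quantified. The sketch below (my own illustrative code; standard D65 matrices, ideal displays assumed) decodes an sRGB pixel the way an unmanaged AdobeRGB panel would, and expresses what actually shows up back in sRGB terms:

```python
# Sketch: an sRGB file shown raw on a wide gamut (AdobeRGB) panel. The panel
# applies its own primaries and gamma to the sRGB byte values, so we decode
# with AdobeRGB's curve, go to XYZ, and re-express the result in sRGB.
import colorsys

ADOBE_TO_XYZ = [(0.5767309, 0.1855540, 0.1881852),
                (0.2973769, 0.6273491, 0.0752741),
                (0.0270343, 0.0706872, 0.9911085)]

XYZ_TO_SRGB = [( 3.2404542, -1.5371385, -0.4985314),
               (-0.9692660,  1.8760108,  0.0415560),
               ( 0.0556434, -0.2040259,  1.0572252)]

def mat_vec(m, v):
    return [sum(a * b for a, b in zip(row, v)) for row in m]

def misread_as_adobe(srgb):
    linear = [(c / 255.0) ** 2.19921875 for c in srgb]  # panel's decode
    xyz = mat_vec(ADOBE_TO_XYZ, linear)
    out = []
    for c in mat_vec(XYZ_TO_SRGB, xyz):                 # back into sRGB terms
        c = max(0.0, min(1.0, c))
        c = 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055
        out.append(round(255 * c))
    return out

original = [120, 180, 90]                  # a soft green, as authored
shown = misread_as_adobe(original)         # what the mismatched panel shows
sat = lambda rgb: colorsys.rgb_to_hsv(*[c / 255 for c in rgb])[1]
print(shown, sat(original), sat(shown))    # saturation goes up
```

The hue stays greenish, but the HSV saturation of the displayed color comes out higher than what was authored - exactly the "oversaturated" look described above.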

If your friend has a wide gamut monitor and sees your sRGB photo, it will look oversaturated. If your friend has an sRGB monitor and sees your AdobeRGB photo, it will look washed out because the browser can't map between color spaces in that direction automatically (note: I did read that some automatic conversions are possible, from AdobeRGB to sRGB, but I'll have to dig that up again).

Right now I'm shooting in RAW + jpeg. I allow my camera to use AdobeRGB for the jpegs since my Benq is wide gamut and will show the images correctly, which helps me through the initial culling quicker. But when I need to upload something to Instagram or in fact share an image with anyone over the web, I find myself in two minds. I can either process the image to sRGB in case the end user has an sRGB display and risk having my image look completely oversaturated if they in fact have a wide gamut display, or I can process the image to AdobeRGB in case the end user has a wide gamut monitor and risk having my image look washed out if they only have an sRGB monitor. There is no single good choice, it seems. Either way is half right and half wrong.

I think the only way to really fix this is to have jpeg contain multiple versions of the same image. Or to have it contain some omni-colorspace that every other color space can be remapped from.

What's the point of this rant? Just venting some frustration I guess. And perhaps hoping to hear from anyone who might have some tips to share, some suggestions or just tell a bit about how they process images with respect to color spaces.

DPReview, if you're listening, this would be an excellent subject for a series of in-depth articles .. :-)
 
Start Here: https://www.drycreekphoto.com/Learn/color_management.htm

Then:
  • Calibrate your monitor using a colorimeter (e.g. a Spyder)
  • Use a color passport card (e.g. a ColorChecker Passport) as a reference when you shoot and build a profile from that
  • If you print, calibrate your printer as well
  • If you use a lab to print, get their calibration profile for your editing workflow.
  • Failure to do the above makes everything else guesswork and mostly a waste of time and effort.
sRGB is a lowest common denominator standard for color space - unless you have a client or special request to do otherwise, I'd stick to doing everything in sRGB for the least number of issues and maximum compatibility across a wide range of output sources.
 
Thanks for your suggestions.

Calibrating my monitor was never a big priority for me, since this is not at the heart of the problem. My monitor is fairly accurately calibrated out of the factory and I'm quite happy with it in that respect. Calibration is not the problem, nor is printing since I don't do that at all.

That last paragraph is precisely why I posted in the first place. This is the type of sweeping statements that you see repeated everywhere on the web, and it comes from basically an outdated perception of what the world actually looks like today.

If I process to sRGB, my image will not look right on a wide gamut monitor. It will look hopelessly oversaturated. Why is this somehow a good option when so many devices today are in fact wide gamut? iPads, phone displays, TVs - I bet most monitors being sold today are wide gamut. My photos will look oversaturated on an increasing number of devices.

There seems to be no right way to process images that will make them look "as intended" on any target device. Or just on most of them. I don't want my photos to look over saturated. I don't want them to look dull and under saturated. But it seems I have to choose one of those options.

--
/Mike
http://familienberg.zenfolio.com
 
Yes, a bit OT. You should post in the appropriate forum - I can't see the forum names as I'm typing this. Chris knows what he is talking about. Just look at how busy the guy is!!

Good luck to you, and I think that not calibrating your monitor is a mistake for someone who is so invested in getting to the bottom of the old sRGB vs AdobeRGB battle.

Again, take Chris’s advice.
 
Thanks for your suggestions.

Calibrating my monitor was never a big priority for me, since this is not at the heart of the problem. My monitor is fairly accurately calibrated out of the factory and I'm quite happy with it in that respect. Calibration is not the problem, nor is printing since I don't do that at all.
You still need to calibrate your monitor as a first step. Everything else is a waste of time unless you do this.
That last paragraph is precisely why I posted in the first place. This is the type of sweeping statements that you see repeated everywhere on the web, and it comes from basically an outdated perception of what the world actually looks like today.
No, it comes from 30 years experience, and a PhD in color image processing, and a pro shooting for clients, digital, web, print, publication.
If I process to sRGB, my image will not look right on a wide gamut monitor. It will look hopelessly oversaturated. Why is this somehow a good option when so many devices today are in fact wide gamut? iPads, phone displays, TVs - I bet most monitors being sold today are wide gamut. My photos will look oversaturated on an increasing number of devices.
I work in sRGB and my shots don't look over-saturated on iPads, iPhones, Mac Pro. They also look great in print in magazines. The magazine and printer both request I use sRGB btw.



Sounds like you either have incorrect ideas, or are doing something wrong in your colorspace workflow.
There seems to be no right way to process images that will make them look "as intended" on any target device. Or just on most of them. I don't want my photos to look over saturated. I don't want them to look dull and under saturated. But it seems I have to choose one of those options.
You appear to have already decided it's not going to work for you. You have to work through the process methodically, there are no shortcuts.

--
Your time is limited, so don't waste it arguing about camera features - go out and capture memories - Oh, and size does matter - shoot MF
 
Mike,

Interesting post. I don't think it is off-topic. But if you want some input from some true color fanatic printing pros and actual color space scientists, then post that on the MF Board. There are guys on that Board who spend a lifetime worrying over this stuff and are absolute experts at it. Our Mod over there (Chris) is a pro at this.

But they will all tell you to use sRGB, calibrate your monitor, use a good big 4K IPS pro monitor, use color card and gray cards while shooting, and to calibrate your printer too. There are all kinds of lab discussions to consider if you use one. They have profiles you can use in your work flow / post processing. It takes effort and work and knowledge to get it right if you are a pro selling prints.

Everything else you talked about is secondary to that. If you don't print it doesn't matter. If you do print you are wasting your time if you don't do the basics.

But I don't print and I worry about color too, especially nailing the WB.

I think the key for everything these days is to invest in a good, big (32 inch at least) 4K professional IPS high-quality monitor. They are affordable now and were out of reach for most photographers only two years ago.

Then get very good at LR or the program of your choice, and edit the color and tone mapping to your taste. Just don't overdo it.

But if you print? Game over. You gotta do all the stuff that you gotta do.
 
That last paragraph is precisely why I posted in the first place. This is the type of sweeping statements that you see repeated everywhere on the web, and it comes from basically an outdated perception of what the world actually looks like today.
Michael, have you lost your way?

;-)
 
Calibrating my monitor was never a big priority for me, since this is not at the heart of the problem. My monitor is fairly accurately calibrated out of the factory and I'm quite happy with it in that respect. Calibration is not the problem, nor is printing since I don't do that at all.
You still need to calibrate your monitor as a first step. Everything else is a waste of time unless you do this.
Why? It doesn't matter if my monitor is calibrated or not. The point is that the image I see on my monitor is not the image you see on your monitor, because we are using different color spaces.
That last paragraph is precisely why I posted in the first place. This is the type of sweeping statements that you see repeated everywhere on the web, and it comes from basically an outdated perception of what the world actually looks like today.
No, it comes from 30 years experience, and a PhD in color image processing, and a pro shooting for clients, digital, web, print, publication.
With all that knowledge you should know what an ICC profile is, and that different devices have different capabilities. It does sound like you are not aware of those facts. Some monitors display close to 100% of the AdobeRGB color space, while others can go as wide as 99% of P3.

If you own any of those monitors, then trying to show a jpeg image with an embedded sRGB ICC color profile will look wrong on your monitor.

Here, let me give you an example straight from my own screen:



[screenshot: the same photo processed to AdobeRGB (left) and sRGB (right), side by side on my monitor]


This is a shot captured with my Galaxy Note 9.

The image on the left is processed to AdobeRGB, and is being shown on my Benq SW2700PT monitor set to show the AdobeRGB color space.

The image on the right is the same image, processed in sRGB. Nothing was changed on those two images, except the output color profile.

The image on the left is how I processed the image and it's what I am happy with. And if I switch my monitor to show sRGB, the image on the right will render like that too.

You see the problem here. How do I know which kind of monitor the viewer has? If he has a wide gamut monitor, the sRGB image will look wrong (the right image). More colors is not better. It's not like louder rock music.
I work in sRGB and my shots don't look over-saturated on iPads, iPhones, Mac Pro. They also look great in print in magazines. The magazine and printer both request I use sRGB btw.
I get that for magazines they have a requirement for a specific ICC profile. They control the output media. You buy the magazine they printed so naturally it will look right. A magazine is not the same as a monitor.
Sounds like you either have incorrect ideas, or are doing something wrong in your colorspace workflow.
Perhaps you can enlighten me then. Explain what my wrong idea is. Are you saying that an image processed to sRGB will render exactly the same on a wide gamut monitor set to show AdobeRGB as it will on an sRGB monitor? Because then perhaps it's you who has some wrong ideas.
You appear to have already decided it's not going to work for you. You have to work through the process methodically, there are no shortcuts.
Shortcuts? I'm covering all the paths between AdobeRGB and sRGB on the source, intermediate and final rendering media. I'm not taking shortcuts.

--
/Mike
 
Interesting post. I don't think it is off-topic. But if you want some input from some true color fanatic printing pros and actual color space scientists, then post that on the MF Board. There are guys on that Board who spend a lifetime worrying over this stuff and are absolute experts at it. Our Mod over there (Chris) is a pro at this.
Hm, I might try that :-)
But they will all tell you to use sRGB, calibrate your monitor, use a good big 4K IPS pro monitor, use color card and gray cards while shooting, and to calibrate your printer too. There are all kinds of lab discussions to consider if you use one. They have profiles
It's funny how much emphasis is put on high quality "photo" monitors and their ability to show a wider gamut than just sRGB. That is a big reason why you pay a premium for these displays.

If the idea is to switch the monitor into AdobeRGB mode, then edit the photos, then process to sRGB, then switch the monitor back into sRGB mode, then it seems rather convoluted and not at all what the monitor manufacturer intended. If you spend the big bucks on a monitor like that, don't you want to be able to take advantage of the wider gamut when you just browse around in your own images?

Right now I am processing all my images to sRGB and AdobeRGB. I upload to places like Instagram and Eyeem in sRGB, not because it looks better that way but because I take the somewhat unsubstantiated advice of those who say that sRGB monitors are still more prevalent out there. But anyone viewing my sRGB image on their wide gamut monitor will see an image that is oversaturated, so when I view my own images, I use the AdobeRGB image set I also made. This means keeping two copies of each photo.
you can use in your work flow / post processing. It takes effort and work and knowledge to get it right if you are a pro selling prints.
It doesn't really take a lot of work. You work with the images in the exact same way, there are no differences there. Once you are done in Capture One, you just render the final output image to a desired color space. C1 can remap my image to render properly in that color space.

But it only works if the viewer has a monitor set to show that color space.
Everything else you talked about is secondary to that. If you don't print it doesn't matter. If you do print you are wasting your time if you don't do the basics.
I don't print so that's not a concern of mine.
I think the key for everything these days is to invest in a good, big (32 inch at least) 4K professional IPS high-quality monitor. They are affordable now and were out of reach for most photographers only two years ago.
Most of those pride themselves in offering a much wider gamut than sRGB. There just doesn't seem to be any way to take advantage of it.
Then get very good at LR or the program of your choice, and edit the color and tone mapping to your taste. Just don't overdo it.
Yes that's my point. I want to edit my photo the way I like it, and have it render on the viewers monitor the same way, regardless of which monitor he or she has.
 
Good walk through HERE
Yes, I read that already. Here's a quote:

If you notice dull images when you upload them online, you must remember that images have to be converted to the sRGB colorspace first. However, If you don’t, the web converts it for you, which is never ideal as you can see from the example above.

There are so many misconceptions and superficial glossing-overs going on here I don't even know where to start.

Firstly, if I upload an AdobeRGB image online, it will not look dull on my screen. It'll look just like I intended, because my monitor can show that color space perfectly.

Secondly, images "have to be converted to the sRGB colorspace first"? According to which law? What is the argument for doing that, other than to support users who have older monitors that don't display a wider gamut? And how do I support everyone else who does have a better monitor, at the same time? Like, say, myself? The article conveniently doesn't go into that.

Thirdly, what in the world do they mean by "the web converts it for you"? That makes no sense at all.

--
/Mike
http://familienberg.zenfolio.com
 
I think what you are saying is something like "color management is a mess and why hasn't this problem been solved yet"? Why do you have to do the conversions at all? Why aren't they automatic?

I've always been curious, too.

It seems to me that in the process of taking what looks good on my monitor and viewing it on a different display, it's all constants, no variables. Why hasn't this problem been solved?

For example, my camera and its settings are a constant--it's all right there in the metadata.

My monitor is a known thing, its characteristics are a "constant." The file type I save the image in is a known thing, well characterized. The display program is, again, a known thing, as is the display device.

There are no unknowns in this chain.

Perhaps there is something here to do with the human psychology of seeing that I am not appreciating - that at some point in the chain someone needs to adjust things "to taste" vs adjusting them mechanically?

Or is something missing in terms of industry cooperation or standards setting?

--Darin
 
That last paragraph is precisely why I posted in the first place. This is the type of sweeping statements that you see repeated everywhere on the web, and it comes from basically an outdated perception of what the world actually looks like today.
I can assure you that this is as true today as it possibly can be. In my last workplace we couldn't figure out for the life of us why the pictures looked different once uploaded to our website. Later on we found out that part of the problem was Adobe RGB. Somewhere in the upload process the picture got converted from Adobe RGB into sRGB.

Almost everything concerning the web is sRGB. With Firefox, for example, you have to change settings in about:config. If you don't do that, Firefox tries to display the picture at the color gamut of your display. If your display can only display sRGB, this won't be a problem. If your display can display Adobe RGB (or close to it), Firefox will try to convert sRGB images into Adobe RGB. I think you understand why this can be a problem.

Rule of thumb you can remember is:

  • For web use sRGB
  • For printing use Adobe RGB
Quick browser test

If every color looks the same you're fine. If they don't then well ... you haven't understood color management and should just stick to sRGB.


Edit

There is the whole problem with color managed programs vs. not color managed programs. Photoshop is color managed. The Windows 10 photo viewer app is not. So if I open the same picture in Photoshop and photo viewer and look at them side by side the picture can look different. And this is on my own computer with my own monitor.

Then there is the problem with metadata. The programs have to know in what color space the picture is. If they don't know they just assume one and try to convert the picture into that color space. Firefox for example tries to convert images into the gamut of your monitor (you can change that behavior in about:config).

Browser test

At one point you have to realize that even if you do 100% on your part you don't know how the picture is going to look to somebody else.

As far as color spaces are concerned:
sRGB is the standard because almost every panel nowadays can represent sRGB fairly well. So if you send me an sRGB file and I open it in a color managed (very important part) program, it should look roughly the same to me. If you send me an Adobe RGB file or a P3 file, it won't look the same on my end because my monitor can only display sRGB.

If you compare color spaces you realize that sRGB lies in the Adobe RGB color space. Adobe RGB can mainly display more cyan/green.
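That "mainly more cyan/green" point is easy to check from the published primaries: both spaces share the same red and blue primaries, only green moves. A quick sketch (my own illustrative code) comparing the areas of the two chromaticity triangles with the shoelace formula:

```python
# Sketch: compare the CIE xy chromaticity triangles of sRGB and AdobeRGB.
# The gamuts share R and B primaries; AdobeRGB's G primary sits further
# toward saturated green, which is where all the extra area comes from.
def triangle_area(pts):
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

SRGB  = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]   # R, G, B primaries
ADOBE = [(0.64, 0.33), (0.21, 0.71), (0.15, 0.06)]   # same R and B, greener G

ratio = triangle_area(ADOBE) / triangle_area(SRGB)
print(f"AdobeRGB covers about {ratio:.0%} of sRGB's chromaticity area")
```

The ratio comes out around 1.35, i.e. roughly a third more chromaticity area, all of it on the cyan/green side of the triangle.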

Long story short:
  • If your monitor can only display sRGB then stick to it
  • If your monitor can display Adobe RGB and you don't print stick to sRGB
  • If your monitor can display Adobe RGB and you print, then you can consider working in Adobe RGB
  • If you don't shoot video and aren't working with Apple products forget P3.
 
Good walk through HERE
Yes, I read that already. Here's a quote:

If you notice dull images when you upload them online, you must remember that images have to be converted to the sRGB colorspace first. However, If you don’t, the web converts it for you, which is never ideal as you can see from the example above.

There are so many misconceptions and superficial glossing-overs going on here I don't even know where to start.

Firstly, if I upload an AdobeRGB image online, it will not look dull on my screen. It'll look just like I intended, because my monitor can show that color space perfectly.
You save a version for web use in the sRGB color space from LR/PS; this does a quality conversion for you and avoids the issue.
Secondly, images "have to be converted to the sRGB colorspace first"? According to which law? What is the argument for doing that, other than to support users who have older monitors that don't display a wider gamut? And how do I support everyone else who does have a better monitor, at the same time? Like, say, myself? The article conveniently doesn't go into that.
sRGB will display on all systems as it's the accepted standard color space. AdobeRGB and other wide gamut monitors all have color spaces which include 100% of sRGB color space, so an image in sRGB space, correctly tagged, will display correctly on a wide gamut display.

Going the other way doesn't work, as your wide gamut space cannot be included within a display that only shows sRGB. In this case, you'd save a version of your image set to sRGB image space from LR/PS, and use that copy on an sRGB display, or web etc.
Thirdly, what in the world do they mean by "the web converts it for you"? That makes no sense at all.
Browsers recognize the color space tags in the image, and attempt to convert the color space for you. They are not always optimized for this. It is far better to save sRGB for browsers/web use, and this will be shown correctly in all standard browsers, on all screens, iPhones, iPads etc - as long as the original screen where you made your edits was calibrated.
 
Color management should work right out of the box, as long as you have:
  • an embedded document profile
  • a valid monitor profile at system level
  • a color managed application that converts from the first into the second
That's the definition of a color managed process. If you have those three, it has to work. There's nothing the user needs to do, no special settings.

Programs like Photoshop are designed from the ground up to work this way, the whole application revolves around it.

So which one of those three is failing for you?
  • Your monitor profile?
  • The document profile should be embedded.
  • The application's color management - which application were you using?
Again - use Proof Setup > Monitor RGB as a reference for what no color management looks like. An sRGB file will look oversaturated in that case. Anything that looks like that is wrong.
 
This answer on the page you linked has all of the data in it, you just needed to read it and act on it:

I've had exactly the same issue and it is possible to arrive at a correct and workable solution. There are a lot of misconceptions both in the question and the previous answers (and indeed, around colour management in general), so let me try to clear them up and provide you with an answer.

First, the misconceptions...

  1. Regular (non-wide) monitors do not "live in sRGB", nor do wide gamut displays "live in AdobeRGB". sRGB and AdobeRGB (together with ProPhoto RGB) are known as working profiles: they don't match any real world device, they just provide a standard set of measurements that all devices can be programmed to understand. Every monitor (and every printer) has its own profile, and indeed that profile may change over time as the chemicals in the display age. An individual display's profile may have a large degree of overlap with one of the standard working profiles, but it's incorrect to say it matches it exactly, or even fits completely within it. It's even less correct to say all displays of a certain type have profiles that fit within one of the standard working profiles.
  2. You should never set your display's profile to one of the working profiles (because that isn't its profile!). The correct solution is to use a calibration device to find out your monitor's correct profile, and use that.
  3. Browsers are not the only fruit: you want to make sure your display is profiled in such a way that other imaging apps (Photoshop, Lightroom, whatever) also display colours faithfully.
  4. There is something you can do about unmanaged images in browsers (a few other answers have touched on it). I'll come to it in detail below.
A rough guide to how profiles interact when you view an image on your monitor

In an ideal world, not one but two profiles will come into play when viewing an image. The first is the profile embedded in the image: let's call that the input profile. Remember that digital images are made up of pixels, each containing a combination of red, green and blue. So for a plain red square, every pixel is set to 100% red, 0% green, 0% blue. But what do we mean by 100% red? It's like seeing a sign on the road side saying "You may now drive at maximum speed". What maximum speed? As fast as the car will go? The sign doesn't say, so the actual speed is going to vary from car to car. What the input profile tells us is what that 100% value is relative to: for an image tagged with an embedded profile, your computer now knows that "100% red" means the maximum red value defined by that specific profile. (To complete the analogy, our road sign now says: "Maximum limit 70mph. You may now drive at the maximum.")

So, once an image is tagged with an embedded profile we know exactly what it is we need to display: exactly what shade of red, yellow, or whatever. The next question is: how do we display it? Look at the same image on a few different computers (or just your computer and your phone) and you'll see that no two displays render colour in exactly the same way. This is where we need to calibrate our monitor to produce a display profile - the output profile - that tells us exactly how this specific device renders colours. Now we've got both the pieces of information we need:

  • Input profile: What does this image mean when it says "red"?
  • Output profile: How do I get this hardware to display (as close as possible to) that shade of red?
And what if the image isn't tagged? For all but the most specialist of usage, it's safe to assume that an untagged image is using the sRGB profile.

And now to answer your question

The first step with any monitor - but especially important with a wide gamut display - is to correctly calibrate your monitor. This requires using a calibrator: a piece of hardware that sits over the screen and takes colour readings while displaying a range of test images, to determine what colours your monitor is actually displaying. For a wide gamut display you have to ensure you use a suitable calibrator: I use a Spyder Pro 3 and it works fine.

Once you've calibrated your monitor you should find that any colour-managed application is now displaying colours faithfully. Before calibration, my wide-gamut monitor displayed everything hyper-saturated: skin tones were tomato-red and both Photoshop and Lightroom were unusable. After calibration, they both looked perfect. So, use a colour-managed app to test your calibration.

And now onto the browsers! Firefox is the only browser that works well for me on a calibrated wide-gamut display. By default it uses the embedded colour profile in images to display them correctly, but untagged images still appear over-saturated. But don't worry, all is not lost!

  • Type about:config into your address bar.
  • Scroll down and look for gfx.color_management.mode.
  • Change the value to 1.
This causes Firefox to treat all untagged images as sRGB: exactly what we want to happen. It even works on icons in your bookmarks bar! Unfortunately it still doesn't work on flash video players though.
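If you would rather persist that preference than flip it by hand, the same setting can go in a user.js file in your Firefox profile directory (profile paths vary by OS; this has the same effect as changing it in about:config):

```js
// Treat all untagged images as sRGB instead of passing them through unmanaged
user_pref("gfx.color_management.mode", 1);
```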

Both Safari and Chrome are also colour managed but both have their drawbacks. Safari (on Windows at least) doesn't treat untagged images as sRGB; Chrome does, but its colour management is disabled by default and awkward to switch on.

At time of writing, Opera has no colour management at all and IE9 is just downright idiotic: it respects the input profile (the one embedded in the image) but ignores the display output profile! This makes IE9 as good as useless on a wide gamut display.

So: calibrate + use Firefox + set gfx.color_management.mode to 1 = you're good. :)
 
I agree, that the situation is a bit frustrating, with so many capable displays in the field and still most pictures in sRGB. Some thoughts from my side:

In actual photography use, I rarely encounter photos with a gamut larger than sRGB. In my own photos, it's some shots of the sea and a few flowers with intense red colours. Or after some heavy postprocessing. If those colors are correctly mapped into sRGB, it's not like everything gets ugly; you mainly notice it in a side-by-side comparison, but good pictures still look great.

The biggest point in favour of sRGB, in my opinion, is that it is the fallback for many systems. For example, if Windows doesn't know the gamut of the monitor, it assumes sRGB. If a picture has no profile embedded, most software assumes sRGB. And so on. I'm not sure about all the mobile devices. If I remember correctly, Android has no color management and displays all images as sRGB, even if the display would allow more, so an AdobeRGB image displayed on a P3-capable mobile display would still show muted colors. Only special video formats take advantage of the wider gamut.

In general, video formats are far more compatible by bundling resolution, color space, refresh rates etc. into designated standards.

In theory, you are right that there is a 50% chance of a color space mismatch and therefore false colors. In practice, I would say that with sRGB images at least 90% of users will see them more or less correctly. Many of those who have a wide gamut display probably also use color management (or at least should), so they also see them right. Those who use a wide gamut display without CM are probably used to oversaturated colors everywhere anyway.

I think everything other than sRGB makes sense only when used with color management.
 