Confused About Color Management with Hardware Calibrated Monitors

SCoombs

I am currently using an LG 24gq40w, the limitations of which seem to be causing me some problems for editing. As such, I am looking to replace it with a monitor which will be better for photo editing. This could be anything from an ASUS ProArt to something higher end.

Now essentially any monitor with reasonable specifications for photo work is hardware calibrated. This seems to be universally regarded as superior to calibrating with a software profile. That's great.

What has me extremely confused is the interaction between the software profiles and hardware calibration. Right now, I am using an ICC profile created with DisplayCal using my Calibrite colorimeter on my LG monitor.

What exactly am I supposed to do when I plug in a new, hardware calibrated monitor? Do I no longer worry about the profile and just leave it as it is? If so, why does it not mess up what I see on the monitor?

Or, do I instead activate a different profile? If so, which one?

Some of the hardware calibrated monitors seem to have software which is used to create an ICC profile on the computer designed for that monitor. Some have specific drivers you are supposed to install which may somehow get things working correctly. Some have instructions about setting up profiles in Windows. Yet others do not. You just plug them in and they are a monitor and that's that.

So how does this all work? How do I make sure that when I go ahead and use a decent hardware calibrated monitor that I actually get the correct colors/etc. that I am supposed to be getting with it?
 
Calibration Versus Profiling | Image Science

I own an Asus PA329C. It's the only one I own that offers a full calibration, i.e., its calibration software generates values that are written into a LUT (look-up table) in the monitor.

The software also generates an ICC profile. I'm not sure what Windows does if you have no ICC profile present.

Any monitor can be profiled. Ones that have a programmable LUT tend to be expensive. The PA329C was at the low end of such monitors, and I still saved money by buying it as a refurb.

It's possible to buy monitors that are hardware calibrated at the factory. I suppose that a good one won't place excessive demands on the ICC corrections when you profile it.
 
As far as I'm aware, the color management system still works in the same way, but it uses a different profile for each monitor preset plus one for the monitor's native color space. If you are using, say, an Adobe RGB preset, the monitor's hardware calibration adjusts the monitor to emulate the Adobe RGB color space, and the ICC profile you use is just a standard Adobe RGB profile. If you calibrate and profile on top of this, you will probably end up with a slightly different custom profile, but part of the rationale for such presets is to avoid the need for custom calibration and profiling.

If you set the monitor to its native color space, you will use either a profile supplied by the manufacturer or make your own custom profile.

How the right profile gets selected probably depends on the system. Some might be automatic; others require manual profile selection. On my MacBook Pro, for example, it's automatic.

Dave
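To put rough numbers on this, here is a minimal Python sketch (an illustration only, not from any vendor's software; the matrices are the published sRGB and Adobe RGB (1998) D65 conversions) of the remapping a color-managed application performs when the document is sRGB but the monitor preset emulates Adobe RGB:

```python
# Sketch: what a color management system does when the document color is
# sRGB but the monitor preset emulates Adobe RGB (1998). Matrices are
# the published sRGB->XYZ and XYZ->Adobe RGB conversions (D65).

SRGB_TO_XYZ = [
    (0.4124564, 0.3575761, 0.1804375),
    (0.2126729, 0.7151522, 0.0721750),
    (0.0193339, 0.1191920, 0.9503041),
]
XYZ_TO_ADOBE = [
    (2.0413690, -0.5649464, -0.3446944),
    (-0.9692660, 1.8760108, 0.0415560),
    (0.0134474, -0.1183897, 1.0154096),
]

def mat_vec(m, v):
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

def srgb_to_adobe_8bit(rgb_linear):
    """Convert linear sRGB to 8-bit Adobe RGB (gamma ~2.2)."""
    xyz = mat_vec(SRGB_TO_XYZ, rgb_linear)
    adobe_linear = mat_vec(XYZ_TO_ADOBE, xyz)
    out = []
    for v in adobe_linear:
        v = 0.0 if v < 1e-4 else v          # clamp numerical noise
        out.append(round(min(v, 1.0) ** (1 / 2.2) * 255))
    return out

# Fully saturated sRGB red, expressed in Adobe RGB numbers:
print(srgb_to_adobe_8bit([1.0, 0.0, 0.0]))   # -> [219, 0, 0]
```

Fully saturated sRGB red becomes roughly (219, 0, 0) in Adobe RGB numbers; send the unconverted (255, 0, 0) straight to an Adobe RGB preset and you get visibly oversaturated color, which is exactly the conversion a non-color-managed app skips.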
 
Now essentially any monitor with reasonable specifications for photo work is hardware calibrated. This seems to be universally regarded as superior to calibrating with a software profile. That's great.
I wonder whether you might be confusing two fundamentally different levels of hardware calibration.

Windows at least offers a process to 'calibrate' a monitor that relies on the user's eyes to judge how the monitor is displaying colors and tweak them. In the past I've seen similar suggestions that basically involved using one or more test images to adjust the monitor. This is not any type of hardware calibration; almost nobody thinks it's more than 'better than nothing'.

Then there are two types of hardware calibration, both of which typically use a colorimeter (hardware) but sometimes use other types of devices (like an X-Rite i1Studio spectrophotometer). Software can use the colorimeter to read various colors sent to the monitor to construct an ICC display profile, which you then select e.g. in a Windows control panel. IMO this can work quite well for many people and many purposes, and is what I'm using with my Dell U2415.

However, increasingly in recent years, more serious 'photo editing' monitors can use the same measurement hardware to interact more directly with the monitor and build tables in it that function rather like ICC display profiles, but are more complex. Obviously the computer also has to know what the monitor can (and cannot) display, and what signal to send to get a given color. This approach is likely capable of better accuracy, and is supported by all or at least most current serious 'photo editing' monitors.
What has me extremely confused is the interaction between the software profiles and hardware calibration. Right now, I am using an ICC profile created with DisplayCal using my Calibrite colorimeter on my LG monitor.

What exactly am I supposed to do when I plug in a new, hardware calibrated monitor? Do I no longer worry about the profile and just leave it as it is? If so, why does it not mess up what I see on the monitor?

Or, do I instead activate a different profile? If so, which one?
AFAIK, all of the monitors that support hardware calibration of the second type have dedicated software to manage it. So if you calibrate an Eizo, you use Eizo ColorNavigator software; if you calibrate a BenQ, you use Palette Master software. I'm sure other brands have their own software.
You just plug them in and they are a monitor and that's that.
Not for any really serious use, AFAIK.
So how does this all work? How do I make sure that when I go ahead and use a decent hardware calibrated monitor that I actually get the correct colors/etc. that I am supposed to be getting with it?
I have what is AFAIK supposed to be the current top-of-the-line Calibrite Display Plus HL colorimeter. I just use the Calibrite Profiler software to build a monitor profile for my Dell, and then make sure that in the Windows (11) Color Management settings, Device profile is set to the profile the Calibrite software just built. IMO this works pretty well.

If you get a more advanced monitor and want to load a LUT in the hardware or whatever, I think you just need to use as directed the monitor manufacturer's software.
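The profiling loop described above (software drives color patches, the colorimeter reads them back, and the software characterizes the display) can be sketched in a few lines. This is an illustration only: the 'colorimeter' here is simulated, the gamma value is made up, and a real display response is a full curve per channel rather than a single exponent.

```python
import math

# Sketch of colorimeter-based profiling: software shows a series of
# gray patches, a colorimeter measures the luminance of each, and the
# profiling software characterizes the display's response. Here the
# "measurements" are simulated for a display whose true response is
# gamma 2.35 (an assumed value for the example).

TRUE_GAMMA = 2.35

def measure_patch(level):
    """Stand-in for a colorimeter reading: relative luminance of a patch."""
    return (level / 255) ** TRUE_GAMMA

def fit_gamma(levels, readings):
    """Least-squares fit of gamma in log-log space: log Y = g * log x."""
    xs = [math.log(l / 255) for l in levels]
    ys = [math.log(r) for r in readings]
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

levels = [32, 64, 96, 128, 160, 192, 224]      # skip 0: log undefined
readings = [measure_patch(l) for l in levels]
print(round(fit_gamma(levels, readings), 2))   # -> 2.35
```

The ICC profile built this way records the measured behavior so color-managed applications can compensate for it; the hardware-LUT approach instead rewrites the monitor's own response so less compensation is needed on the computer side.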
 
I don't think I am confusing the different types of calibration you mention.

Let me try to ask again in a more specific way.

Most monitors that one would consider for photo editing these days are factory calibrated, Calman certified, etc.

Most people seem to say that they find these factory calibrations excellent. Some argue that they don't trust them and want to do it themselves.

Either way, a monitor being factory calibrated means that there have been specific adjustments made to it which are at least supposed to ensure accurate color representation.

Yet every computer has a color profile running for each monitor at all times, which modifies the way that colors are fed to the monitor.

So if I buy such a monitor which has been factory calibrated and want to see what the factory calibrated, "Calman certified" calibration looks like, what does this mean for Windows' color profile? In other words, when I plug the monitor in, what profile do I need to set in the operating system in order to see the monitor as the factory calibration says it should be?
 
A factory calibration is only valid at the time of manufacture. The calibration will need to be reviewed as the monitor ages.
 
I think I know what you mean. The default Windows profile is an sRGB one, which means that, in my case, on a wide-gamut monitor (set to its native space) colors are way too saturated. Color-managed applications use the display profile, so it is important that the profile matches the space selected on the monitor. This is why non-color-managed apps in Windows look oversaturated on wide-gamut displays.

This is how it works on my Eizo. Not sure how others work but should be the same I reckon.

The reason for the hardware calibration is to have the monitor handle the calibration in LUTs inside the monitor rather than having the profile fiddle with the GPU's output (which is not as consistent or technically accurate). But for many, the GPU route is more than enough. To be honest, I don't really notice a big difference.
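A quick illustration of the precision argument: a correction curve applied in an 8-bit GPU pipeline merges some of the 256 input levels (visible as banding), while the same curve applied in a higher-bit monitor LUT keeps them all distinct. The gamma values are assumptions chosen for the example (a panel with native gamma 2.4 being corrected to a 2.2 target).

```python
# Sketch of why a high-bit LUT inside the monitor is preferred over
# correction curves in the GPU's (typically 8-bit) output path. Assume
# a panel with native gamma 2.4 that calibration must bend to a 2.2
# target: every input level goes through out = in ** (2.2 / 2.4).

def correction(x):
    return x ** (2.2 / 2.4)

def distinct_levels(bits):
    """Apply the curve to all 256 inputs, quantized to the given depth."""
    scale = (1 << bits) - 1
    return len({round(correction(i / 255) * scale) for i in range(256)})

print(distinct_levels(8))    # fewer than 256: tonal levels merge (banding)
print(distinct_levels(10))   # all 256 inputs survive distinct
```

The same math is why vendors advertise 10-, 14- or 16-bit internal LUTs: the correction happens at higher precision than the 8-bit signal path, so no tonal steps are lost.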
 
I am not sure if this will help you, but this link gives a pretty good higher-level explanation of the different options:

https://www.benq.com/en-us/knowledg...are-calibration-and-hardware-calibration.html

I hope that helps; if not, just keep asking away.
 
I understand the difference between these three types of calibration.

What I don't understand is what I specifically do when plugging in a new monitor which has been factory calibrated OR one which has been hardware calibrated in order to actually get that calibration.

The computer always has software color profiles active. Even if I don't specifically do a software calibration myself, Windows will still apply some color profile to the monitor. There is no way to just get a "neutral" signal; it will always apply some profile.

So what software profile do I use if the calibration of my monitor is done in hardware? For example, even though it's not something I would ever use, this may help illustrate the point: I have read people talk about one advantage of hardware calibration being that you can move the monitor from one computer to another and it will look the same because the calibration is in the hardware. But if the two different computers are using two different software color profiles, then the monitor will NOT look the same between the two computers. Regardless of what the hardware calibration is, it is operating off of the data being fed to it through the software color profile on each computer.

So what do I set my profile to on the computer so that I actually get the benefit of the hardware calibration?
 
To elaborate further on my earlier post:

Hardware calibrated monitors come with several presets as well as a native setting. The presets are for standard color spaces such as sRGB, Adobe RGB, and P3. With these, the factory calibration adjusts the monitor's LUT values so that the presets make the monitor behave like the specified standard color spaces. So the ICC profile used by the computer just needs to present the data to the monitor in the specified color space. E.g., if you have Adobe RGB selected, data will be presented to the monitor in Adobe RGB values, so the ICC profile used to interface with the monitor is simply a standard Adobe RGB color space profile. Standard color space profiles can be used in the same way as monitor color space profiles.

Is that any clearer?

Dave
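A small sketch of why the standard profile suffices in that case: with the monitor emulating Adobe RGB and the OS assigned the standard Adobe RGB profile, the display transform collapses to a pass-through. The matrices below are the published Adobe RGB (1998) D65 conversions; chaining profile (RGB to XYZ) and display (XYZ to RGB) returns the input unchanged.

```python
# Sketch: when the monitor preset emulates a standard space and the OS
# profile is that same standard space, the display transform is an
# identity. Matrices are the published Adobe RGB (1998) D65 conversions.

ADOBE_TO_XYZ = [
    (0.5767309, 0.1855540, 0.1881852),
    (0.2973769, 0.6273491, 0.0752741),
    (0.0270343, 0.0706872, 0.9911085),
]
XYZ_TO_ADOBE = [
    (2.0413690, -0.5649464, -0.3446944),
    (-0.9692660, 1.8760108, 0.0415560),
    (0.0134474, -0.1183897, 1.0154096),
]

def mat_vec(m, v):
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

color = [0.25, 0.50, 0.75]          # arbitrary linear Adobe RGB color
round_trip = mat_vec(XYZ_TO_ADOBE, mat_vec(ADOBE_TO_XYZ, color))
print([round(c, 4) for c in round_trip])   # -> [0.25, 0.5, 0.75]
```

So with a well-calibrated Adobe RGB preset, the standard Adobe RGB profile does no visible work; a custom-measured profile only differs to the extent the preset deviates from the ideal space.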
 
I always find posts about color management a bit 'scary', in the sense that I am afraid of doing things wrong. So, I do nothing (I am not a professional user).

I have a factory calibrated BenQ PD3200U 4K 32" monitor, and Windows recognizes the monitor and my Nvidia GeForce RTX 2060. The calibration devices used in the factory are mostly a factor of 10 more expensive than typical home calibration devices. The only disadvantage is that degradation of the monitor is not corrected. But when it looks good to my eyes, it is good for me.

So I leave it as it is. For processing and editing I use DxO PhotoLab (PL8) Wide Gamut. PL can flag colors that fall outside the monitor's color space, and that is only rarely the case.
 
The issue is that "leaving it like it is" doesn't ensure you are getting the calibration they did on the monitor because your computer is always using some profile of its own.
 
Do I understand you to say that I should choose the srgb profile included with Windows if I want to use such a monitor in its srgb calibration?
 
I always find posts about color management a bit 'scary', in the sense that I am afraid of doing things wrong. So I do nothing (I am not a professional user).

I have a factory calibrated BenQ PD3200U 4K 32" monitor, and Windows recognizes both the monitor and my Nvidia GeForce RTX 2060. The calibration devices used in the factory are mostly a factor of 10 more expensive than typical home calibration devices. The only disadvantage is that degradation of the monitor over time is not corrected. But when it looks good to my eyes, it is good for me.

So I leave it as it is. For processing and editing I use DxO PhotoLab (PL8) Wide Gamut. PL can flag colors that fall outside the monitor's color space, and that is only rarely the case.
The issue is that "leaving it like it is" doesn't ensure you are getting the calibration they did on the monitor because your computer is always using some profile of its own.
Monitors usually have "drivers" available for download. (The quotation marks are because they aren't what I'd call drivers: what gets installed is an ICC profile and an .inf file, the latter identifying the monitor to Windows.)

I admit that I have no idea which monitor mode the ICC profile might be intended for. My primary monitor lacks a "custom" mode, so I set it to Rec. 2020, which is a wider gamut than the monitor is physically capable of.
 
I am not sure whether this will help you, but this link gives a pretty good high-level explanation of the different options:

https://www.benq.com/en-us/knowledg...are-calibration-and-hardware-calibration.html

I hope that helps; if not, just keep asking away.
I understand the difference between these three types of calibration.

What I don't understand is what I specifically do when plugging in a new monitor which has been factory calibrated OR one which has been hardware calibrated in order to actually get that calibration.

The computer always has a software color profile in play. Even if I don't do a software calibration myself, Windows will still apply some color profile to the monitor. There is no way to just get a "neutral" signal; some profile is always applied.

So what software profile do I use if the calibration of my monitor is done in hardware? An example may help illustrate the point, even though it's not something I would ever do: I have read people say that one advantage of hardware calibration is that you can move the monitor from one computer to another and it will look the same, because the calibration lives in the hardware. But if the two computers are using two different software color profiles, then the monitor will NOT look the same on both. Regardless of the hardware calibration, the monitor is operating on the data being fed to it through each computer's software color profile.

So what do I set my profile to on the computer so that I actually get the benefit of the hardware calibration?
To elaborate further on my earlier post :-

Hardware calibrated monitors come with several presets as well as a native setting. The presets are for standard color spaces such as sRGB, Adobe RGB and P3. For these, the factory calibration adjusts the monitor's LUT values so that each preset makes the monitor behave like the standard color space specified. The ICC profile used by the computer then just needs to present the data to the monitor in that color space. E.g., if you have Adobe RGB selected, data should be presented to the monitor as Adobe RGB values, so the ICC profile used to interface with the monitor is simply the standard Adobe RGB color space profile. Standard color space profiles can be used in the same way as monitor color space profiles.
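As a rough illustrative sketch (not any vendor's actual code; the function names are made up for this post, and the matrices are the standard D65 sRGB and Adobe RGB (1998) ones), this is effectively what a color-managed application does when the monitor preset and matching profile are Adobe RGB:

```python
# Sketch of the conversion a color-managed app performs when the monitor
# preset (and matching ICC profile) is Adobe RGB: sRGB document values go
# through CIE XYZ into Adobe RGB, and those numbers are sent to the monitor.

def srgb_to_linear(c):
    # sRGB piecewise transfer function (decode to linear light)
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

SRGB_TO_XYZ = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]

XYZ_TO_ADOBE = [
    [ 2.04159, -0.56501, -0.34473],
    [-0.96924,  1.87597,  0.04156],
    [ 0.01344, -0.11836,  1.01517],
]

def mat_vec(m, v):
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def srgb_to_adobe_rgb(rgb):
    """Convert an encoded sRGB triple (0..1) to encoded Adobe RGB (0..1)."""
    linear = [srgb_to_linear(c) for c in rgb]
    xyz = mat_vec(SRGB_TO_XYZ, linear)
    adobe_linear = [min(1.0, max(0.0, c)) for c in mat_vec(XYZ_TO_ADOBE, xyz)]
    # Adobe RGB uses a simple ~2.2 power-law encoding (2.2 used here for brevity)
    return [c ** (1 / 2.2) for c in adobe_linear]

# Pure sRGB red does NOT map to full red on the Adobe RGB preset; the profile
# "pulls it in" so the wider gamut doesn't display it oversaturated.
print(srgb_to_adobe_rgb([1.0, 0.0, 0.0]))   # roughly [0.86, 0.0, 0.0]
print(srgb_to_adobe_rgb([0.5, 0.5, 0.5]))   # neutral gray stays near 0.5
```

A non-color-managed application skips this conversion and sends the raw sRGB numbers straight to the wide-gamut preset, which is exactly why things then look oversaturated.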

Is that any clearer?

Dave
Do I understand you to say that I should choose the sRGB profile included with Windows if I want to use such a monitor in its sRGB calibration?
Yes
 
I too have a BenQ monitor, older than yours, and I regularly generate a new ICC profile using an i1Display Pro and the Calibrite PROFILER application (which replaces the old X-Rite i1Profiler application). I might agree with your comment "...when it looks good to my eyes, it is good for me...", but my interest in color management is in having a higher degree of color fidelity from my screen through Photoshop to my Canon printer, so that when I spend time and effort preparing an image for printing I am (hopefully) making edits that produce a better (if not the best) print. That is, while "looks good enough" may look OK to my eyes, it may not yield a very good print. At least that's my understanding.

I am following this thread's discussion with interest as I may eventually replace my BenQ monitor - which does not support hardware calibration - and so I'd like to better understand how the h/w calibration (profiling?) works compared to the X-Rite/calibrite way.

Peter

PS, my BenQ is the SW2700PT
 
My understanding is that full hardware calibration, which programs a LUT in the monitor, avoids the potential loss of dynamic range that can occur when the corrections applied through an ICC profile are too large. (The bit depth of the monitor's LUT is high.)

That may not be a major consideration for a monitor with a good calibration out of the box.

Regardless, I doubt that most users reprogram the LUT often. Creating a new profile to compensate for small drift in the monitor can be done more quickly, and is probably done more often.
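The precision point can be illustrated with a toy sketch (my own numbers, not any monitor's actual pipeline): apply a mild gamma trim, correcting a hypothetical native 2.4 response toward a 2.2 target, through a LUT of a given bit depth, and count how many distinct output codes the 256 possible 8-bit input levels survive as:

```python
# Count distinct output codes after pushing all 256 8-bit input levels
# through a gamma-trim correction quantized to a LUT of `bits` precision.
# Correction exponent 2.2/2.4 (~0.917): encoding with it makes a
# native-gamma-2.4 panel display an effective 2.2 response.

def surviving_levels(bits, gamma_native=2.4, gamma_target=2.2):
    exponent = gamma_target / gamma_native
    max_code = 2 ** bits - 1
    outputs = {
        round((i / 255) ** exponent * max_code)
        for i in range(256)
    }
    return len(outputs)

print(surviving_levels(8))    # fewer than 256: some levels merge (banding)
print(surviving_levels(14))   # 256: every input level stays distinct
```

An 8-bit LUT (the typical video-card path for a profile's correction curve) merges some adjacent levels, which can show up as banding; a high-bit-depth monitor LUT keeps all 256 input levels distinct even after the correction.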
 
Hi SCoombs

I appreciate the questions and your concerns (they are valid). Here is another video that discusses each of the issues (and more) regarding setting the OS profile for different types of monitors and situations.

Note that this is for a specific type of monitor and the software it uses for calibration, but the ideas apply to other monitors as well. You have not mentioned which monitor you are considering purchasing, but be sure that it comes with the information/support you need so you know what to do for that particular monitor.

The video is from several years ago, and since then BenQ has added ICC sync to update the OS profile depending on which preset you choose on the monitor (a nice feature).

Watch both videos, and I believe this will empower you to ask questions that the monitor manufacturer should be able to answer - if not, beware of purchasing that monitor.

I hope this helps some more.

Why and when to change OS ICC profile to match monitor

Syncing monitor to OS profiles feature
 
Then I will follow up with another question :-) If one wants to use a wide-gamut display set to Adobe RGB, for example, which profile should be used then? That is, if one decides not to calibrate and just run the Windows default profiles. I couldn't find any regular Adobe RGB profiles in the Color Management tab.

If I remember correctly, if you run a display in wide gamut mode, have not installed any profiles or calibrated, and just use the Windows default profile (sRGB), then everything is oversaturated - even in color managed applications such as LR/PS.

So the OP's question is very relevant, and I have seen this claim from different manufacturers as well: that a "major plus with a LUT based monitor is that you can calibrate it and then connect it to ANY computer", since the calibration is in the monitor. But in that case they forget to mention the little detail that you have to install profiles on the new target computer as well - at least if the calibrated monitor is wide gamut (most, if not all, hardware calibrated monitors are).

Color management, and the issues that might (will) occur along the way, is by far the most mysterious topic I have encountered since I got my first DSLR back in 2004. And to be honest, I don't quite understand why it has to be like that in 2024.
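That oversaturation is easy to quantify. As an illustration (my own sketch using the standard D65 sRGB and Adobe RGB (1998) matrices, with a simple 2.2 gamma approximating both transfer curves; the function names are made up), the snippet below takes one sRGB pixel and computes its CIE xy chromaticity twice: once decoded correctly as sRGB, and once as a wide-gamut Adobe RGB panel would render the raw numbers with no profile conversion. The unmanaged version lands noticeably farther from the D65 white point, i.e. it is more saturated:

```python
# How far from the D65 white point does one greenish pixel land when its
# sRGB numbers are (a) decoded correctly vs (b) fed raw to an Adobe RGB panel?

SRGB_TO_XYZ = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]
ADOBE_TO_XYZ = [
    [0.57667, 0.18556, 0.18823],
    [0.29734, 0.62736, 0.07529],
    [0.02703, 0.07069, 0.99134],
]
D65_XY = (0.3127, 0.3290)

def xy_distance_from_white(encoded_rgb, matrix, gamma=2.2):
    # decode with a simple power gamma, then project to xy chromaticity
    linear = [c ** gamma for c in encoded_rgb]
    xyz = [sum(matrix[i][j] * linear[j] for j in range(3)) for i in range(3)]
    total = sum(xyz)
    x, y = xyz[0] / total, xyz[1] / total
    return ((x - D65_XY[0]) ** 2 + (y - D65_XY[1]) ** 2) ** 0.5

pixel = [0.0, 200 / 255, 80 / 255]                        # a green in sRGB
intended = xy_distance_from_white(pixel, SRGB_TO_XYZ)     # correct decode
unmanaged = xy_distance_from_white(pixel, ADOBE_TO_XYZ)   # raw on wide gamut
print(intended, unmanaged)   # unmanaged is clearly larger: oversaturated
```

The gap comes almost entirely from the green primary, which is where Adobe RGB's gamut extends well beyond sRGB's; pure reds and blues shift much less because those primaries are nearly shared.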
 
If you are using a standard color preset you need to use the corresponding standard ICC profile, whether it be Adobe RGB, P3, sRGB or whatever. There should be a standard Adobe RGB profile somewhere on your OS; you may have to import it into your Color Management tab. Otherwise it can be downloaded from the Adobe website.

Dave
 