Where are gamma correction values stored?

polarbeer

Hi,

I have lately been trying to understand the meaning of gamma correction in the color management workflow. One thing I haven't found any good information on is where monitor gamma correction values are actually stored in Windows 7 (64-bit). I'm running a dual-monitor system (an LG LCD as the primary monitor and a Hitachi CM772 CRT as the secondary monitor). My computer has the following video card: Palit GeForce GTX260 - 896 MB.

I've read that some image files already contain gamma-corrected RGB values. Isn't this strange if every monitor is supposed to have its own gamma correction values? Doesn't this mean that an image file containing gamma-corrected RGB values gets corrected for gamma twice?

Feel free to explain this gamma thing as if I were four years old :). Maybe someone else will gain some understanding from this as well.

If this gamma correction can be found somewhere in Windows' control panels, in my video card's control panels, or as a file under Windows, I would really appreciate knowing where.

There even seems to be a gamma setting in my LCD's own control panel (a three-stage setting: -50 --> 0 --> +50), adjusted with the display's hardware buttons.

Even just the gamma correction part of the color management workflow seems like a mess, with similar controls scattered across different places (hardware, operating system, display driver, photo editing software, files that already contain corrected values, etc.).

-pb
 
Gamma basically is the curve shape between the two endpoints (black & white). To actually calculate a gamma curve, assume that the luminance covers a range of 0 (black) to 1.0 (white). Each intermediate value (between black & white) will be 0.xxx. The formula for calculating gamma is:

Lout = Lin^G, where:

Lin = Input Luminance signal
Lout = Output Luminance signal
G = Gamma
^ = raised to the power of

As an example, with a Gamma of 2, the Luminance transfer function would be a parabolic function (Y=X^2).
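To put the formula in code, here is a minimal Python sketch (using 2.2, the common display gamma discussed below):

```python
def apply_gamma(l_in, gamma):
    """Lout = Lin^G, with luminance normalized to the 0.0-1.0 range."""
    return l_in ** gamma

# A mid-grey input of 0.5 through a gamma 2.2 display comes out much darker:
print(apply_gamma(0.5, 2.2))                          # ~0.22
# Pre-correcting with the inverse exponent (1/2.2) cancels the display gamma:
print(apply_gamma(apply_gamma(0.5, 1 / 2.2), 2.2))    # ~0.5
```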

Gamma correction is normally done in a display system (either in the display itself or a combination of the display and graphics card). This is done to set the display system gamma to a standardized value. For example, sRGB is defined with a Gamma of 2.2. Gamma corrected images have been adjusted so that they display correctly on a display system of a known gamma (again, typically 2.2).

Hope this helps.
 
Thanks for your time, Bob. You explain gamma quite well. The best explanation I have found so far is here:

http://www.siggraph.org/education/materials/HyperGraph/gamma_correction/gamma_intro.html

http://www.siggraph.org/education/materials/HyperGraph/gamma_correction/gamma_definitions.html

The content on those pages might be quite old, as the writer talks about Photoshop 3.0. But I think the writer explains things quite well, and the gamma phenomenon itself hasn't changed over the years.

I would just like to know how gamma and gamma correction are handled in today's operating systems, hardware, software, drivers and image files. I think this is vital in order to really understand color management and get things right.

If I understand this right:

1. Gamma is a phenomenon of all monitors (CRT, LCD).

2. Gamma correction is applied in different places in your computer system so that the actual picture seen on your monitor isn't too bright or too dark. But what are these places, and how can I change these values if desired?

-pb
 
If I understand this right:

1. Gamma is a phenomenon of all monitors (CRT, LCD).
Well, it's a result of applying an offset to integer values for a standard across luminance values. The linearity of monitors isn't known by the computer, not even in Apple land.
2. Gamma correction is applied in different places in your computer system so that the actual picture seen on your monitor isn't too bright or too dark. But what are these places, and how can I change these values if desired?
You can set gamma:
  • Sometimes in the monitor's hardware settings
  • Usually in the graphics properties of the video driver. It can't be set as a simple "set gamma to 2.2 (or 1.8)", as the linearity of the monitor is not known by the graphics controller in the computer. So you need to use either a visual target or a hardware calibrator.
  • Some programs (games) set gamma programmatically, and if the game is buggy it doesn't reset properly until the machine is rebooted. This is a damned nuisance if you have kids who download nasty games.
  • Automatically, using a hardware calibration tool and software.
Most visual target adjustment tools (e.g. "Adobe Gamma") are based on setting gamma at one luminance level. For my own use, as a simple "reality check" on (hardware) calibration, I made the target below, at about 73% luminance - about the same as Adobe Gamma. This doesn't guarantee that gamma is okay at lower or higher luminance levels. For that you can battle endlessly with all sorts of visual tools "by eye", but it's better to get a hardware calibrator.

So, if gamma on your monitor is about right, then at a distance or with eyes squinted, the central square of the grey, red, green and blue rectangles should be almost indistinguishable from the squares on either side. They look perfect on my setup.

 
With most display systems, the gamma correction is applied in the video card (by downloading a custom LUT into the card). Some of the higher-end monitors (high end NEC & Eizo monitors) have the ability to apply a correction LUT internally inside the monitor. The advantage of this is that they are generally higher bit depth (typically 10 or 12 bits) than the correction applied via the display card (8-bits).

As freddyNZ mentioned, the correction LUT isn't automatically calculated, as it depends upon the individual characteristics of the display (CRT or LCD) being used. If you're using a hardware display calibrator, it starts off by loading a linear LUT into the video card and then proceeds to measure the characteristics of the display. It then calculates a correction LUT that will bring the display to a known gamma & color temperature and loads that into the video card (or monitor, if supported). After calibrating the display to a known set of values, it then proceeds to characterize the color characteristics of the display. This data is used to build the ICC profile for the display. Note that if you use a display that doesn't support an internal correction LUT (that's most of us), the correction LUT gets loaded into the video card when the OS boots up. It is usually done with a small LUT loader utility which runs at startup, reads the correction LUT from the default display profile and loads it into the video card.
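To make the LUT idea concrete, here is a rough Python sketch (not any calibrator's actual code) of how such a correction table could be computed, assuming for simplicity that the measured display follows a plain power-law gamma:

```python
def correction_lut(measured_gamma, target_gamma=2.2, size=256):
    """Build an 8-bit correction table that bends a display measured as a
    simple power-law gamma toward the target gamma (e.g. 2.2)."""
    # The LUT is applied before the display, so its exponent is the ratio
    # target/measured: (v ** (target/measured)) ** measured == v ** target.
    exponent = target_gamma / measured_gamma
    return [round(255 * (i / (size - 1)) ** exponent) for i in range(size)]

lut = correction_lut(measured_gamma=1.8)   # 1.8 is a hypothetical measurement
print(lut[186])   # 186 is remapped to ~173, so the 1.8 display lands near 50% luminance
```

A real calibrator works from measured points rather than a single fitted gamma, but the principle is the same.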
 

Thanks for your interesting post.

I looked at the "calibration pattern" you made in my Opera browser and in the GIMP photo editor (moving the application from one monitor to the other), and it seems that my old CRT is quite well calibrated, but the cheap LCD is showing the colors totally wrong. The middle rectangles seem to be much brighter than the surrounding rectangles.

Interesting that, for example, in the top row the surrounding rectangles are made of equally thin lines of black (0, 0, 0) and white (255, 255, 255). I would think that this would be seen as 50% luminance (= 128, 128, 128), but you mention it is 73% luminance (186, 186, 186 = the color used in the center rectangle). This must have something to do with human vision and how we sense colors and light?

-pb
 
With most display systems, the gamma correction is applied in the video card (by downloading a custom LUT into the card). Some of the higher-end monitors (high end NEC & Eizo monitors) have the ability to apply a correction LUT internally inside the monitor. The advantage of this is that they are generally higher bit depth (typically 10 or 12 bits) than the correction applied via the display card (8-bits).
Thanks again for sharing your knowledge on this important subject.

Judging by freddyNZ's calibration image, my LCD monitor is totally "screwed up".
As freddyNZ mentioned, the correction LUT isn't automatically calculated, as it depends upon the individual characteristics of the display (CRT or LCD) being used. If you're using a hardware display calibrator, it starts off by loading a linear LUT into the video card and then proceeds to measure the characteristics of the display. It then calculates a correction LUT that will bring the display to a known gamma & color temperature and loads that into the video card (or monitor, if supported). After calibrating the display to a known set of values, it then proceeds to characterize the color characteristics of the display. This data is used to build the ICC profile for the display. Note that if you use a display that doesn't support an internal correction LUT (that's most of us), the correction LUT gets loaded into the video card when the OS boots up. It is usually done with a small LUT loader utility which runs at startup, reads the correction LUT from the default display profile and loads it into the video card.
Do you happen to know if a correction LUT made by a hardware calibrator is stored inside the ICC profile you mention? Or are the monitor's ICC profile and the correction LUT two totally different things?

How about image files that already contain gamma-corrected values? In the Siggraph link I posted above I found the following info:

******************************

"When the data is saved, after being gamma corrected to 1.8, that gamma correction stays with the file. However, most file formats (GIF, JPEG) don't have anyway to tell a user the gamma correction that has already been applied to image data. Therefore, the user must guess and gamma correct until he is satisfied with how it looks. The Targa and PNG file formats do encode the exact gamma information, removing some of the guess work. The 3D modeling program, 3D Studio, actually takes advantage of this information!

Gamma correction, then, can be done on file data directly (the individual bits in the file are changed to reflect the correction). This is what is meant by the File Gamma or "gamma of a file." On the other hand gamma correction can be done as post processing on file data. In the latter case, the data in the file is unchanged, but between reading the file and displaying the data on your monitor, the data is gamma corrected for display purposes. Ideally, if one knows the File Gamma and their own System Gamma, they can determine the gamma correction needed (if any) to accurately display the file on their system."

******************************************

So are gamma-corrected values often stored in image files by image editing software? And do any programs really take this into consideration when showing such an image file, by adjusting the midtones of the image before sending it to the video card? Is this "file gamma" really a common problem in photo editing, or a very rare exception?
 
I know that monitor profiles built with the i1 Match profiling software do have the correction LUT stored in the monitor profile. I believe that the same is true with the Spyder profiling software as well. In fact, I have found that Windows 7 is able to read the correction LUT from the monitor profile and pass it to the video card (nVidia GeForce graphics card), so a LUT loader isn't necessary with Win7. However, I recently purchased a new Win7 laptop that uses the Intel i3 CPU with built-in graphics, and the LUT loader is required with that video, so apparently the need for a LUT loader depends upon the graphics engine being used (don't know about the ATI Radeon series).

I'm not aware of gamma values being stored in image files. However, as I mentioned earlier, if the image is in sRGB or Adobe RGB, the display system gamma is assumed to be 2.2, since that is what the color space specification calls for.
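As an aside, the calibration curves are typically stored in the profile's 'vcgt' (video card gamma table) tag. A quick Python sketch to check whether a given profile contains one (the profile name below is just an example; Windows keeps display profiles under C:\Windows\System32\spool\drivers\color):

```python
import struct

def icc_tag_signatures(path):
    """List the tag signatures in an ICC/ICM profile: a 128-byte header,
    a big-endian tag count, then 12 bytes per tag entry."""
    with open(path, "rb") as f:
        data = f.read()
    (count,) = struct.unpack_from(">I", data, 128)
    return [data[132 + 12 * i: 136 + 12 * i].decode("ascii", "replace")
            for i in range(count)]

# Example path only - substitute your own monitor profile:
profile = r"C:\Windows\System32\spool\drivers\color\MyMonitor.icm"
print("vcgt (calibration curves) present:", "vcgt" in icc_tag_signatures(profile))
```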
 
Interesting that, for example, in the top row the surrounding rectangles are made of equally thin lines of black (0, 0, 0) and white (255, 255, 255). I would think that this would be seen as 50% luminance (= 128, 128, 128), but you mention it is 73% luminance (186, 186, 186 = the color used in the center rectangle). This must have something to do with human vision and how we sense colors and light?
No, freddyNZ's understanding is/was a little muddled - the relative luminance of 186 is not 73%, it is actually 50%.

The recommended standard image/system gamma value is '2.2' - so 50% luminance is actually produced by the value of 186 (in an 8 bit, 0-255 system).

(186/255)^2.2 ≈ 0.5 (i.e. 50%)

...or to illustrate it in reverse...

0.5^(1/2.2) x 255 ≈ 186
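Both directions are easy to check in Python:

```python
# 8-bit value 186 decoded through a pure gamma 2.2 display response:
print((186 / 255) ** 2.2)                 # ~0.50, i.e. about 50% luminance
# And the other way: 50% luminance encoded for a gamma 2.2 system:
print(round(0.5 ** (1 / 2.2) * 255))      # 186
```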

It's important to note that the most common image formats have their numeric data values already encoded to suit a display system having a gamma 2.2 response characteristic.

Further corrections made in hardware or in profiling are often relatively 'fine adjustments', and may not necessarily be gamma curves at all - but their purpose is to make the display system conform closely to a gamma characteristic of 2.2.
 
Yes - my mistake: the "V" in HSV (73%) isn't lightness. Anyway.

With a correctly calibrated (gamma 2.2) display, I put my Spyder on colour targets of 100% white and 186/186/186 grey, and the luminance (in cd/m2) of the "186" grey is almost exactly half that of white (55 vs 110).
 
One additional point is that most games toss out the loaded LUT, and all current HD-DVD and Blu-ray playing software renders in such a way that the LUT does not get used at all; pretty much everything else makes use of it, though.

And only fully color-managed software makes use of the ICC profile's monitor characterization and can account for the monitor's actual primary color values and saturation response. Color-managed programs include Firefox (with an add-on), MS Photo Viewer (though NOT in full-screen/slideshow mode - for some reason it dumps color management then), Fast Pic Viewer, Photoshop, Adobe After Effects (though NOT Premiere Pro :( ), Photo Mechanic, and Safari to a limited degree.
 
One thing that has me curious is that technically sRGB is supposed to have an sRGB tone response curve (TRC), which is actually a little bit different from a power-function gamma 2.2. The tricky thing is that some profiling packages allow the sRGB TRC as an option and some don't, some wide-gamut monitors' sRGB modes are locked into the sRGB TRC and some are locked into 2.2 (and some can be toggled), and some stock ICC profiles for sRGB claim gamma 2.2 while others claim the sRGB TRC.

Then some programs can read an attached TRC and some can't. Even Photoshop may use the included gamma 2.2 sRGB ICC profile for its conversion logic, or, if provided with a different sRGB profile, it may use the sRGB TRC instead.

Seems like a disaster to me, since you never know who used the sRGB TRC and who used 2.2 on their monitor, whether a file is tagged one way or the other, or whether it was converted from, say, ProPhoto to sRGB using the sRGB TRC or using 2.2. And in some cases you are forcibly locked into one or the other on some displays, etc.

I have yet to find a straight answer.

I suspect it's just a total mess at this point.

More programs appear to be offering the sRGB TRC, and more wide-gamut monitors appear to be using the sRGB TRC instead of 2.2 as the native TRC for their sRGB mode. On the other hand, gamma 2.2 has been talked about so much, and I think the sRGB profile Photoshop uses by default assumes 2.2 rather than the sRGB TRC, so I suspect that using 2.2 for everything at this point matches the general intent in use more often than using the sRGB TRC. But whatever the case, it seems like a mess.
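To put numbers on the difference, here is a small Python comparison of the official piecewise sRGB curve against a plain 2.2 power function (standard constants; the sample code values are arbitrary):

```python
def srgb_to_linear(v):
    """Decode an sRGB-encoded value (0-1) using the official piecewise TRC."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def gamma22_to_linear(v):
    """Decode the same value assuming a plain 2.2 power function."""
    return v ** 2.2

for code in (10, 64, 128, 186, 250):
    v = code / 255
    print(code, round(srgb_to_linear(v), 4), round(gamma22_to_linear(v), 4))
# The two curves differ most in the deep shadows and agree closely in the midtones.
```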
 
Just a suggestion! Use shorter sentences when writing. It is much easier to follow your thinking when you use shorter sentences. Commas, periods and other punctuation marks are there for everybody :).

Thanks for your answer anyway!

-pb
 
So to make things CRYSTAL CLEAR:

Today's digital cameras encode gamma correction for a 2.2 display directly into their *.jpg files? So if a camera wants to represent some pixel at 50% luminance, it will record the pixel as the RGB value (186, 186, 186), not (128, 128, 128). The camera is thus assuming that all pictures are viewed on systems that have a TOTAL system gamma of 2.2?

So hardware calibrators are used to measure the monitor's gamma curve and add additional gamma so that the TOTAL system gamma equals 2.2?

For example, if a hardware calibrator measured that my primary LCD monitor has a gamma of 1.8, the calibrator would then insert a look-up table (LUT) in Windows to make my total system gamma equal 2.2. I.e. the LUT would contain an additional gamma of 1.22 (2.2/1.8 = 1.2222...).

((186/255)^1.8)^1.22 ≈ 0.5 = 50% luminance

And if a hardware calibrator measured that my secondary CRT monitor has a gamma of 2.7, the calibrator would then insert a LUT in Windows to make my total system gamma equal 2.2. I.e. the LUT would contain an additional gamma of 0.81 (2.2/2.7 = 0.8148...).

((186/255)^2.7)^0.81 ≈ 0.5 = 50% luminance

Until now I had thought that hardware calibrators etc. are used to add gamma correction to the system to overcome the gamma of the monitor. I.e. images would contain uncorrected values ((128, 128, 128) in this case), and the LUT would then contain a gamma correction of ^(1/2.2).

((128/255)^2.2)^(1/2.2) ≈ 0.5 = 50% luminance
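A quick check of those three calculations in Python:

```python
# LCD example: monitor gamma 1.8 plus a LUT gamma of 2.2/1.8
print(((186 / 255) ** 1.8) ** (2.2 / 1.8))    # ~0.50
# CRT example: monitor gamma 2.7 plus a LUT gamma of 2.2/2.7
print(((186 / 255) ** 2.7) ** (2.2 / 2.7))    # ~0.50
# The old assumption: uncorrected 128 through a 2.2 monitor and a 1/2.2 LUT
print(((128 / 255) ** 2.2) ** (1 / 2.2))      # ~0.50
```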

But how about image files that don't have gamma-corrected RGB values? Don't they appear too dark on ideally calibrated systems? Seems like a mess to me :P.

Not to mention if the gamma correction encoded into image files isn't always ^(1/2.2)...


-pb
 
Without checking all your maths in detail - yes, that is the general idea.

The sRGB standard, with its particular gamma encoding, was created in 1996 by Microsoft and HP (hence its predominance to this day).

The idea is/was that 'sRGB' encoded image data could be output directly to the most commonly used displays of the time, CRT monitors, without the need for any dedicated PC hardware or software manipulation, and yet appear broadly correct.

It's interesting to note that the 'sRGB' standard isn't actually a pure 2.2 gamma, it's just 'approximately gamma 2.2 overall'. It actually starts with a short linear section near black then the 'gamma' increases quickly from 1.0 towards 2.0 and ends at about 2.3 at the 'white' end. See Wikipedia http://en.wikipedia.org/wiki/SRGB
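For reference, the actual sRGB encoding in Python (standard published constants) - note that it puts 50% luminance at about code value 188 rather than 186, which is what 'approximately gamma 2.2 overall' means in practice:

```python
def linear_to_srgb(l):
    """Encode linear luminance (0-1) with the official sRGB curve:
    a linear segment near black, then an offset 1/2.4 power function."""
    return 12.92 * l if l <= 0.0031308 else 1.055 * l ** (1 / 2.4) - 0.055

# 50% linear luminance as an 8-bit sRGB value:
print(round(255 * linear_to_srgb(0.5)))    # ~188 (a pure 2.2 gamma would give 186)
```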

As for cameras, things are yet a bit more complicated - most cameras will additionally mix in their own proprietary custom 'Tone Curves' to make their images subjectively more appealing. Typically most cameras will add an 'S' shape tone curve to increase mid tone contrast and compress/roll-off shadows and highlights (emulating film response), and of course most cameras allow the user some control over this contrast adjustment.
 
Well thanks for your patience and help!

I'm definitely a slow learner! :D I just re-read all the posts, and Bob already writes in his first reply:

"Gamma correction is normally done in a display system (either in the display itself or a combination of the display and graphics card). This is done to set the display system gamma to a standardized value. For example, sRGB is defined with a Gamma of 2.2. Gamma corrected images have been adjusted so that they display correctly on a display system of a known gamma (again, typically 2.2)"

For some reason I had a totally wrong understanding of how this gamma vs. gamma correction pipeline works. So it took some posts before I realized that I had understood the whole system wrong in the first place. This basic misunderstanding made it very hard to understand your helpful texts.

Hopefully I can now more clearly spot illogical use of the terms "gamma" and "gamma correction" when surfing around. :)

Now I have to try to make my LG LCD monitor conform to a gamma 2.2 response curve so that I can see my sRGB JPEG files as they are meant to be seen. ;)

Thanks once again for everyone's friendly input!

-pb
 
The tricky thing is that only some profilers offer the actual sRGB TRC, and only some monitors have it as a preset (while a few actually force it).

And some profiles used by popular software appear to use the regular old 2.2 for conversions.

So to me it all seems like a mess: you never know who converted to sRGB using 2.2 or using the sRGB TRC, and who used a monitor set to 2.2 and who used one set to the sRGB TRC.
 
Hi Bob,

I followed your posts and others' about having Windows 7 load a custom-made profile via the Windows 7 profile tool, or "eye-o-meter".

I tried it versus Logo Calibrator from version 3.62, X-Rite's latest for Windows, and I saw no flash/profile being loaded after logging in, which I do see when Logo Calibrator loads the profile made with i1 Match.

I followed the instructions you all posted and it doesn't work on my version of Windows 7. In fact it goes totally against what Windows 7 Help says, which I quoted in a post. I don't doubt you got it to work; I'm just wondering why mine doesn't. Again, even the help file says to use the third-party software that came with your profiling hardware. I also checked Device Manager, since one page said its writer couldn't get Windows to see the device properly (the old yellow question marks); mine shows up fine in Device Manager.

It just seems counter to everything Windows/MS has written on the subject and what they post in their "communities", i.e. forums.

--
D700 paired with 24-70 f2.8 and 70-200vr f2.8
 
I recently bought a new laptop that's running Windows 7 Pro. Unlike my desktop computer that has an nVidia graphics card, the laptop uses the Intel built-in graphics (i3 CPU). While I didn't need to use the LUT loader on my desktop computer, I do need to use it with the laptop. Apparently, the graphics card plays a role in whether the LUT loader is required or not.

Bob
 
