Editing for SDR and HDR output

I bet money Jim is relatively silent while he uses his decades of scientific knowledge to get a grasp on this. Pretty soon, YouTubers will be explaining how best to edit in LR HDR, how best to export it, and how to view it on various types of HDR monitors.

We won't understand the science except in general terms (Jim and a few others here will), but we will learn how to make the best use of this tech.
 
You can have an SDR 10-bit image that shows more colors than an 8-bit JPEG
How are you defining "more colors"? More discrete colors that humans can discriminate from each other? How do you perform that calculation?
https://en.wikipedia.org/wiki/Color_depth
That doesn't answer the question.
Sorry, I wasn't trying to question your qualifications.

I was hoping that instead of leaving rhetorical questions that are hard for mere mortals to answer, you would instead respond by expanding on your points and teaching us.

As long as you can see two more colors on 10-bit rather than 8-bit, wouldn't his point technically be true even if you can't enumerate it?

If you can see banding on 8-bit, and can't on 10-bit, wouldn't that imply that 10-bit has more colors you can see? Shouldn't someone have tried to measure this?
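A quick back-of-the-envelope sketch (in Python, purely illustrative) of what the bit depths alone provide. Note that this counts encodable code values, which, as the discussion below makes clear, is not the same thing as counting colors a human can tell apart:

```python
# Distinct RGB code values at each bit depth.
# These are encodable values, NOT perceptually distinct colors.
for bits in (8, 10):
    levels = 2 ** bits        # levels per channel
    total = levels ** 3       # R x G x B combinations
    print(f"{bits}-bit RGB: {levels} levels/channel, {total:,} code values")

# 8-bit RGB:  256 levels/channel, 16,777,216 code values
# 10-bit RGB: 1024 levels/channel, 1,073,741,824 code values
```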
 
You can have an SDR 10-bit image that shows more colors than an 8-bit JPEG
How are you defining "more colors"? More discrete colors that humans can discriminate from each other? How do you perform that calculation?
https://en.wikipedia.org/wiki/Color_depth
That doesn't answer the question.
Sorry, I wasn't trying to question your qualifications.

I was hoping that instead of leaving rhetorical questions
It wasn't a rhetorical question.
that are hard for mere mortals to answer, you would instead respond by expanding on your points and teaching us.
I might be able to do that if the question were precisely stated.
As long as you can see two more colors on 10-bit rather than 8-bit
I don't understand on what basis you're making that statement.
wouldn't his point technically be true even if you can't enumerate it?

If you can see banding on 8-bit, and can't on 10-bit, wouldn't that imply that 10-bit has more colors you can see?
In the regions where that banding is visible.
Shouldn't someone have tried to measure this?
It will probably come as no surprise to you that there has been a great deal of work on this topic.
 
You can have an SDR 10-bit image that shows more colors than an 8-bit JPEG
How are you defining "more colors"? More discrete colors that humans can discriminate from each other? How do you perform that calculation?
I might be able to do that if the question were precisely stated.
So, the claim was that 10-bit images can show more colors than 8-bit images. Your question implies this isn't true. To a lay person, 10 > 8, so there must be more colors.

I fail to see how exactly the number of colors matters.
As long as you can see two more colors on 10-bit rather than 8-bit
I don't understand on what basis you're making that statement.
wouldn't his point technically be true even if you can't enumerate it?

If you can see banding on 8-bit, and can't on 10-bit, wouldn't that imply that 10-bit has more colors you can see?
In the regions where that banding is visible.
so there are more colors?
Shouldn't someone have tried to measure this?
It will probably come as no surprise to you that there has been a great deal of work on this topic.
So you think everyone in this forum should be knee-deep in the technical papers of your specialty? Care to share your knowledge on this topic?

So in general, interceptor121 is correct and I can take it that a 10-bit image contains more "colors" than an 8-bit image?

Sorry if I started us off on the wrong foot, but color science is clearly not my specialty and it feels bad to have an expert berate me for knowledge I have no reason to have instead of teaching me. I had hoped to learn something in this thread and partly commented so I could get notifications, but instead I get ridiculed for seemingly no reason.
 
Another thing: people get confused when Googling HDR. In photography, HDR (as opposed to TV or video) is 99% about blending images of various EVs, which we have been messing around with for 20 years.

What we are talking about here is very different. Google HDR and you will get 500 videos instructing you how to do a simple HDR blend in LR or PS.

I wish the two had different names.
 
Sorry, I wasn't trying to question your qualifications.
Here's what you said to me above, in reference to the Wikipedia link you posted:
oh boy, actual color science! not the fake "color science" term everyone on dpreview likes to misuse.
Right, as in "oh boy I get to discuss color science with an expert," not "the fake color science that's all over this forum."

It wasn't directed at you, sorry if I offended you.

edit: The "fake color science" I am referring to is the term people use all over DPReview (not this specific forum, that's why I said DPReview) to say why they like the output of one camera brand over another.
 
You can have an SDR 10-bit image that shows more colors than an 8-bit JPEG
How are you defining "more colors"? More discrete colors that humans can discriminate from each other? How do you perform that calculation?
I might be able to do that if the question were precisely stated.
So, the claim was that 10-bit images can show more colors than 8-bit images. Your question implies this isn't true.
Why does my question imply that isn't true?
To a lay person, 10 > 8, so there must be more colors.

I fail to see how exactly the number of colors matters.
Then why did you say, "You can have an SDR 10-bit image that shows more colors than an 8-bit JPEG"?
As long as you can see two more colors on 10-bit rather than 8-bit
I don't understand on what basis you're making that statement.
wouldn't his point technically be true even if you can't enumerate it?

If you can see banding on 8-bit, and can't on 10-bit, wouldn't that imply that 10-bit has more colors you can see?
In the regions where that banding is visible.
so there are more colors?
In some regions. It's not simple. It also depends on how you define colors, as I said above.
Shouldn't someone have tried to measure this?
It will probably come as no surprise to you that there has been a great deal of work on this topic.
So you think everyone in this forum should be knee-deep in the technical papers of your specialty? Care to share your knowledge on this topic?
Formulate a crisp question, and I'll do my best to answer it.
So in general, interceptor121 is correct and I can take it that a 10-bit image contains more "colors" than an 8-bit image?
The answer to that depends on the color counting process. And the gamut of the display. And the tone curve of the encoding. And the viewing conditions. And some other stuff.
Sorry if I started us off on the wrong foot, but color science is clearly not my specialty and it feels bad to have an expert berate me
I'm not berating you. I'm trying to get you to ask a clear question that I can answer.
for knowledge I have no reason to have instead of teaching me. I had hoped to learn something in this thread and partly commented so I could get notifications, but instead I get ridiculed for seemingly no reason.
I'm not ridiculing you. You said that the folks here on DPR did not understand actual color science, "not the fake 'color science' term everyone on dpreview likes to misuse."

I posted some evidence that I actually understand color science. Not sure why that's threatening.
 
Then why did you say, "You can have an SDR 10-bit image that shows more colors than an 8-bit JPEG"?
I am not Interceptor121, though. I view in flat mode, so I'm not actually sure what the rest of his post said.

So I jumped on the thread because, to me, it seems like a 10-bit image should have more colors than an 8-bit one, but your questions seem to imply that it doesn't necessarily.

So on a cursory Wikipedia review, it seemed like it was true that 10-bit should have more colors, at least some of which are discernible.

I was hoping you would expand on your questions about why this may not be true.
I'm not ridiculing you. You said that the folks here on DPR did not understand actual color science, "not the fake 'color science' term everyone on dpreview likes to misuse."

I posted some evidence that I actually understand color science. Not sure why that's threatening.
So in the end, my takeaway is that 10-bit images do have more discernible colors than 8-bit images.

In case you view on threaded view and not flat view, my reply to your other post:

Right, as in "oh boy I get to discuss color science with an expert," not "the fake color science that's all over this forum."

It wasn't directed at you, sorry if I offended you.

edit: The "fake color science" I am referring to is the term people use all over DPReview to say why they subjectively prefer the output of one camera brand over another.
 
Sorry, I wasn't trying to question your qualifications.
Here's what you said to me above, in reference to the Wikipedia link you posted:
oh boy, actual color science! not the fake "color science" term everyone on dpreview likes to misuse.
Right, as in "oh boy I get to discuss color science with an expert," not "the fake color science that's all over this forum."

It wasn't directed at you, sorry if I offended you.

edit: The "fake color science" I am referring to is the term people use all over DPReview (not this specific forum, that's why I said DPReview) to say why they like the output of one camera brand over another.
OK, sorry. I thought the antecedent of "actual color science" was the Wikipedia link you posted.

Let me take my best shot at answering a question that is similar to the one you're asking. It may even be the question you're asking, but I can't tell for sure.

For the purpose of enumerating colors, let us say that two colors are the same if a normal human can't tell the difference between them, and different if a normal human can. Let us further say that the color spaces under discussion here all have nonlinear tone curves. Those tone curves might be the same from one space to the next, or they might be different. Let us further say that we're talking about RGB additive color monitors, with physical primaries.

If we discuss humans observing the real world, humans can discriminate among about 10 million colors. Translating this to RGB monitors is tricky, because no commercial RGB monitor, no matter the precision, can display all the colors that people can see.

As the gamut of RGB monitors increases, more and more visible colors can be seen, but with modern monitors, the range of output spectra producible by the monitor exceeds the number of countable colors. For example, sRGB with 8-bit precision can produce almost 17 million different spectral outputs. However, not all of those spectral outputs are perceivable as different colors. Banding is hardly ever a problem with 8-bit sRGB. That's a big part of the basis for choosing it as a least common denominator.

However, as the primaries of the representation grow further apart, it takes greater precision and/or different tone curves to represent colors with no banding. For example, with PPRGB (ProPhoto RGB), 8-bit precision often results in banding, and the standard for precision in photo editing of PPRGB images is 15 bits plus one state. The same goes for CIEL*a*b*. Both of those have gamuts in excess of any commercial monitor's. In fact, PPRGB can encode values that do not correspond to any color humans can see.

The peak brightness of the display and the tone curve of the encoding and of the display also affect the precision necessary to avoid banding. If the black point remains constant, a brighter display will show more banding than a darker one.

So the answer to the 8-bit vs 10-bit precision question and the number of colors displayable is complicated.

With wide-gamut monitors with high brightness and low black points, two things need to change from what we're now calling SDR in order to avoid banding. First, the precision needs to increase. But going from 8-bit to 10-bit precision is not sufficient with gamuts like Rec 2020 at brightnesses on the order of 4000 cd/m^2. The tone curves must also be modified, which is what the HDR standards that I know about do.

Does that help?

Jim
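To put one number on the banding point, here is a minimal numpy sketch (an illustration under stated assumptions, not Jim's method). It looks only at the neutral axis, where, to a good approximation, the perceptual difference between adjacent gray code values reduces to a difference in CIE L*, and it assumes the sRGB transfer curve; steps much above roughly 1 L* are where banding can become visible:

```python
import numpy as np

def srgb_eotf(v):
    """sRGB transfer function: encoded value in [0, 1] -> linear luminance."""
    v = np.asarray(v, dtype=np.float64)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def lightness(y):
    """CIE L* from relative luminance Y (display white at Y = 1)."""
    eps = (6 / 29) ** 3
    f = np.where(y > eps, np.cbrt(y), y / (3 * (6 / 29) ** 2) + 4 / 29)
    return 116 * f - 16

for bits in (8, 10):
    codes = np.arange(2 ** bits) / (2 ** bits - 1)  # all gray code values
    L = lightness(srgb_eotf(codes))
    steps = np.diff(L)                              # L* jump between adjacent levels
    print(f"{bits}-bit gray axis: max step {steps.max():.3f} L*")
```

On this axis the 8-bit steps stay near or below about half an L* unit, consistent with the observation above that banding is rarely a problem in 8-bit sRGB; it is a different tone curve or a larger brightness budget that pushes the steps up.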
 
Armed with the above knowledge, you can see why the relationship between the number of observable colors and banding is shaky. Consider the sRGB primaries and the PPRGB ones, with the tone curves forced to be the same. It is more likely that you'll see banding in the PPRGB presentation, but the PPRGB encoding may actually be able to encode more visible colors -- to find out for sure and provide quantitative results would take me four or five hours of work.
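For anyone curious what that quantitative comparison might look like in outline, here is a rough, uncalibrated sketch of one possible counting procedure (my assumption about the approach, not Jim's actual method): sample each RGB cube with the same gamma-2.2 tone curve, convert to CIELAB, and count occupied unit cells, treating each 1-ΔE*ab cell as one "distinguishable color." The matrices and white points are the standard published sRGB (D65) and ProPhoto/ROMM (D50) values; the coarse 64-per-channel grid keeps the runtime down, so treat the outputs as relative rather than absolute:

```python
import numpy as np

# RGB -> XYZ matrices (standard published values).
M_SRGB = np.array([[0.4124, 0.3576, 0.1805],
                   [0.2126, 0.7152, 0.0722],
                   [0.0193, 0.1192, 0.9505]])           # D65 white
M_PPRGB = np.array([[0.7977, 0.1352, 0.0313],
                    [0.2880, 0.7119, 0.0001],
                    [0.0000, 0.0000, 0.8249]])          # D50 white
WHITES = {"sRGB": (0.9505, 1.0000, 1.0890),
          "ProPhoto": (0.9642, 1.0000, 0.8249)}

def lab_f(t):
    eps = (6 / 29) ** 3
    return np.where(t > eps, np.cbrt(t), t / (3 * (6 / 29) ** 2) + 4 / 29)

def count_distinct(M, white, n=64, gamma=2.2):
    """Count occupied 1-DeltaE unit cells in Lab for an n^3 grid of the RGB cube."""
    v = np.linspace(0, 1, n)
    rgb = np.stack(np.meshgrid(v, v, v, indexing="ij"), -1).reshape(-1, 3)
    xyz = (rgb ** gamma) @ M.T                  # same tone curve for both spaces
    f = lab_f(xyz / np.array(white))
    lab = np.stack([116 * f[:, 1] - 16,
                    500 * (f[:, 0] - f[:, 1]),
                    200 * (f[:, 1] - f[:, 2])], -1)
    cells = np.unique(np.floor(lab).astype(np.int32), axis=0)
    return len(cells)

for name, (M, w) in {"sRGB": (M_SRGB, WHITES["sRGB"]),
                     "ProPhoto": (M_PPRGB, WHITES["ProPhoto"])}.items():
    print(name, count_distinct(M, w), "occupied unit Lab cells (coarse estimate)")
```

This ignores viewing conditions and monitor gamut clipping entirely, but it illustrates the point above: with the tone curve held fixed, ProPhoto's wider primaries can occupy more visible-color cells even while their coarser spacing makes banding more likely.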
 
This post is about HDR output, as discussed here, not HDR merge, which involves merging several images into one.

Editing and sharing HDR output images is still in its infancy and is barely supported across various sharing tools.

I am sharing two images in two versions: one edited for SDR and one edited for HDR:

Zonerama HDR vs. SDR Album

If you have an HDR monitor (at least 1000 nits) and a browser that supports HDR, you will notice its advantages. HDR output is not about garish images that burn your retina but fine improvements that boost the joy of looking at a recorded scene.

The advantages of HDR
  • Less post-processing is needed (less masking and less fighting the highlights)
  • The washed-out highlights have better color and better detail.
  • The greyish sky of SDR gets its blues back in HDR :).
  • The colors in SDR are muted and require lots of work to make them pop.
The disadvantage of HDR output is the lack of support, but it is being worked on. Ideally, a single image file should contain both SDR and HDR versions, but I have not managed to do that yet.

Once mature, I believe HDR output will supersede SDR as the choice of display output format. I agree with Eric Chan: “A well-made print is a thing of beauty: a physical, tactile representation of a photograph that stands on its own.” However, an HDR output image shown on a proper display will be a serious alternative to print and not just a convenience.
I couldn't agree with you more. Most photographers don't seem to have heard of HDR stills (they assume we are talking about merging three or more photos to squeeze a high-dynamic-range scene into a regular-dynamic-range display or print). I wish there were a different name for it!

I've been using HDR in a serious way for the past week--I'm working on a slideshow project, the deadline is approaching, and, of course, I thought "why not convert all 350 images to HDR?" Of course! It will be easy! Ha!

For those unfamiliar, an HDR photo is a photo with up to four more stops of highlight "headroom" than a normal photo. Many displays cannot display this extra headroom correctly, but that is quickly changing. Your home TV probably already can, and if you are an Apple Mac, iPad, or iPhone user you've probably been viewing HDR images for years now (the iPhone default is to shoot HDR). If you have one of the new MacBook Pros, then you not only have an HDR screen but a wonderful, amazing HDR screen.
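(A quick sanity check on what "four more stops of headroom" means in luminance terms, assuming for illustration an SDR reference white of 100 nits, a value not from the post: each stop doubles the luminance, so four stops tops out at 16x.)

```python
sdr_white = 100  # assumed SDR reference white, in nits (cd/m^2); illustrative only
for stops in range(5):
    print(f"+{stops} stops of headroom -> {sdr_white * 2 ** stops} nits")
# +0 -> 100, +1 -> 200, +2 -> 400, +3 -> 800, +4 -> 1600 nits
```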

Adobe (in Photoshop and Lightroom) started supporting HDR stills late last year. I believe Chrome already does, and Safari (along with Messages) will start supporting HDR in a few weeks.

What about shooting HDR? In all probability you have been shooting HDR for years--a RAW file is all you need. (Note: just changing the settings on your TV to HDR and viewing a JPEG does not work--that's just the TV trying to fake it with a substandard file.) Your whole archive is waiting for you.

Try it out. Go into Lightroom or Photoshop with your favorite RAW file and hit that little "HDR" button and then notice that the histogram changes. Now it has two parts--the SDR part (the regular part) and the extended HDR part. Proceed as normal.

You'll quickly find that you are on unfamiliar ground a bit. Some photos look amazing and have been waiting for HDR all along. Some photos look crazy weird with super bright areas that you won't like. Other photos don't seem to change at all. There will be an adjustment period as you acclimatize to this new thing.

As an example: a few years ago in California, the sky was red from the smoke of the forest fires, so I went out shooting. The photos never really had the look I wanted (and saw with my eyes) of this dome of weird light above a red-darkened town, with the house and street lights still on. With HDR it looks perfect. I'm terribly excited.

For a while still these images will be hard to share and there will be lingering issues all through the image-making pipeline, so not everyone is going to jump in. But if you are aiming at a screen-based display as your final output then jumping in is highly recommended and, oh my god the images!
Darin, I somehow missed this great post you wrote in the long string. That helped. I knew most of that, but what a nice primer. My confusion now is how best to export, since I think a JPEG does not contain the HDR stuff you described above. Plus, if I re-edit my raw files in HDR mode in LR and get the extended histogram, does that now mess up the JPEG at export? I mean, if I edit with HDR, do I have to export it as something else (non-JPEG)? If I do, that is a problem, and I would not edit in HDR except just to see it myself on my own monitor.
 
There's a check box in the file settings of the LR export window for 'HDR Output,' and a drop-down box to select the P3 color space.

It can be confusing when you try to open the file after export. Some programs will show it in HDR and some won't.
 
I have had an HDR screen since the very beginning
That does not mean that you can view HDR mode images properly.
There is an overwhelming emphasis on brightness; however, what matters is contrast
What also matters is detail and color, which you lose in the highlights in SDR mode.
I much prefer my LG OLED TV with true black to the bright image of my MacBook Pro or my iPhones, or, even worse, some desktop monitors that are just basic LED

Ultimately, 10 stops of dynamic range are plenty, and many images do not even reach that
I have very rarely seen an image that has not improved by switching to HDR mode.
Sunset, sunrise, and backlit shots are examples where HDR can help, but your examples do not particularly benefit from it

Instagram started to support HDR because phones create HDR images

Phones will drive the adoption; we have had TVs and monitors for years, and nothing has happened to date

HDR is interesting because it can avoid editing, which is often done to rebalance dynamic range into a gamma you can display or print; this is the same principle as HLG broadcasting
Were it not for the occasional too-bright spots, HDR mode would require less editing. In particular, the problematic sky masks can often be avoided.

My advice to everyone is to try HDR mode out before forming an opinion on it. The follow-up discussions will be more productive after that experimentation.
I have done HDR video for six years and photos for two
You have much more experience with HDR output than me. Would you mind sharing what tools you used for two years to create HDR photos, what output format you used, and how you shared/viewed HDR photos?
your example images are not ideal to showcase the benefits
I am still struggling with HDR. What I see in Develop mode differs from what I see in the exported AVIF files. Do you have an example that is a better showcase? I consider images with overly bright highlights to be the negative aspect of HDR, the same as the garish look of early HDR merges.
With regard to colors in the highlights: there are not that many; the highlights in HDR are super-whites
In SDR, the highlights in the overcast sky are muted grey, while in HDR, the original blue comes through. That is visible in my posted images. I consider as highlights any brightness that falls in the HDR part of the tone curve.
Don't confuse the potential benefits of 10-bit color depth with the dynamic-range benefits of HDR
I am looking at the practical results. The problem is that the highlights seem to lose information in SDR (colors and details), which are apparent in HDR.
You can have an SDR 10-bit image that shows more colors than an 8-bit JPEG
The biggest issue I have had is portability, which is mostly why I don't use HDR for images

The easiest way to share images is to put them on a drive and open them in Chrome, which reads them

Instagram now supports HDR as well, but this is mostly optimised for acquisition on phones

For most practical purposes, the best combination is to use HDR gamma with the P3 gamut, because you can find a larger number of devices that support P3, especially Apple's

As a PC user, though, you run into issues, as your computer is likely using sRGB with HDR gamma for static pages, which means the colors go off-gamut and you get unexpected results when someone shares P3 HDR images

In terms of editing, the other problem is that you need to decide which display you are producing for, as not all of them support the full dynamic range of the image you may output

Finally, and more importantly, if you produce an HDR image, the rendering of the related SDR photo for those who don't see HDR is done by the viewer's display software, again with unexpected results

All the above considered, for photos I prefer to just output SDR images…
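Since HDR gamma and HLG keep coming up, here is a small sketch of why the tone curve matters as much as the bit count. It implements the standard BT.2100 PQ inverse transfer function (the constants are from the spec) and looks at how large the luminance step between adjacent 10-bit code values is at different brightness levels; a perceptually tuned curve keeps those steps roughly proportional to what the eye can detect:

```python
import numpy as np

# BT.2100 PQ constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(e):
    """PQ-encoded value in [0, 1] -> luminance in nits (peak 10,000)."""
    e = np.asarray(e, dtype=np.float64)
    p = e ** (1 / M2)
    return 10000 * (np.maximum(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

codes = np.arange(1024) / 1023          # all 10-bit PQ code values
nits = pq_eotf(codes)
steps = np.diff(nits)                   # nits between adjacent code values
for target in (1, 100, 1000, 4000):
    i = np.argmin(np.abs(nits - target))
    print(f"near {target:>4} nits: step to next code value = "
          f"{steps[min(i, len(steps) - 1)]:.4f} nits")
```

The absolute step grows with luminance, but as a fraction of the luminance it stays near what the eye can just detect, which is how a 10-bit PQ signal can span up to 10,000 nits without obvious banding where a plain gamma curve at the same precision could not.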
 
Thank you for the answers.

Adobe software allows for tuning the gain maps that are included when exporting JPEGs. Early experiments show promising results: HDR images look good on SDR devices. I still feel that one gets better SDR results if one develops for SDR alone.

I am mainly using sRGB at export because I intend to share HDRs via browsers. I also allow only two extra stops in the HDR edit, hoping that this will cause fewer issues with various monitors.

I tried using Instagram for HDR images, but it does not work. I guess my account has not yet been included in the early rollout.
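For context on how those JPEG gain maps work: conceptually, the file carries a base SDR rendition plus a map of per-pixel log2 gains, and the viewer applies only as much of the gain as its display's headroom allows. The sketch below is a simplified illustration of that idea (the function and parameter names are mine, not Adobe's API, and real gain maps add offsets and per-channel handling):

```python
import numpy as np

def apply_gain_map(sdr_linear, gain_log2, display_headroom_stops, max_stops=2.0):
    """Reconstruct an HDR rendition from an SDR base + gain map (simplified).

    sdr_linear: linear-light SDR image, floats in [0, 1]
    gain_log2:  per-pixel gain map in stops (log2), 0 = no boost
    display_headroom_stops: stops above SDR white the display can show
    max_stops: headroom the image was authored with (e.g. 2 stops)
    """
    # Scale the authored gain to what this display can actually reproduce.
    weight = np.clip(display_headroom_stops / max_stops, 0.0, 1.0)
    return sdr_linear * np.exp2(gain_log2 * weight)

# Toy example: a pixel at SDR white with a 2-stop authored boost.
sdr = np.array([1.0])
gain = np.array([2.0])                 # stops
print(apply_gain_map(sdr, gain, 0))    # SDR display: [1.0] (unchanged)
print(apply_gain_map(sdr, gain, 2))    # 2-stop display: [4.0] (fully boosted)
```

Under this scheme, capping the edit at two extra stops, as described above, means any display with at least two stops of headroom can show the full intent, while SDR displays simply get the base rendition.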
 
And there you have it: by using sRGB combined with HDR gamma, you get the brightness and keep the color correct, but on the other hand you provide limited benefit for viewing.

Lots of compromises, sometimes for a best-case one-stop difference.
 
You guys lost me a little bit there, but the browsers that currently support HDR are Opera, Chrome, and a few others, and Safari with HDR support is in beta and will be out next month. So within five weeks almost 95% of browsers will support HDR. As far as sharing directly via browsers goes, HDR is looking pretty good. Note: if you have to go through Squarespace or DPReview etc., that support is still to come, but direct coding of the image link will be fine--try the AVIF format (with HDR turned on when you save) rather than JPEG (and don't bother with JPEG XL--it appears to be a dead end).

--Darin
 
Browser support hasn't been an issue, provided the websites support the files themselves.

The challenge is that, while browsing is standardised on sRGB and standard gamma, HDR is device-dependent and not harmonised.

It is down to the operator to make it more compatible.
I have no trouble opening and viewing AVIF files in P3 on my Mac with either Opera or Chrome. Why would you want to use sRGB?
 
