Editing for SDR and HDR output

If you are correct about the loss of highlight colour in SDR, what is the cause? Is it really a lack of dynamic range or some other monitor flaw?

For example, if you display an image that is, say, two stops underexposed and has no true highlights on an SDR monitor, does it still lose colour in the brighter parts?
The answers are in the linked Adobe article. We are really lucky that Eric Chan wrote it.

Brighter highlights are an obvious benefit of HDR, but increased highlight color range is also important. When editing in SDR, very bright highlights tend to get washed out due to a limited color palette near white. When editing in HDR, however, the available color gamut expands significantly around the SDR white point (the center vertical line in the histogram). This means that even for low to moderate contrast photos, you may see a significant visual benefit to editing in HDR.

Consider that highlights that live on the edge of the SDR curve suddenly live in the middle of the combined SDR/HDR curve.
 
Chan doesn't provide any explanation as to why this happens, though, just the assertion it does. Is this the fault of the monitor hardware, some effect of the human visual system, a problem with file format, or what? Why don't prints have this issue? That suggests to me the issue is related to transmissive light viewing. I'd like to understand why it happens. Also, he's one of the LR gurus; you'd expect him to gush about and champion new LR features. Does this compromise his explanation?

There is a lack of systematic explanation here as to what technical issues HDR solves and how it solves them, which leaves me feeling sceptical. The tech world is full of breakthroughs, many of which don't pan out.
 
Greg,

This is your time. You are the DPreview Medium Format Board's prophet and evangelist of viewing images (especially medium format images) on beautiful screens.

HDR is what you have been waiting for.

Come back from Mexico, you are needed. Come back to your 32-inch screen. Hit the HDR button. We need you.
Darin,

I am an HDR novice, and my Dell 6K has decent but not top-end HDR capability. My IPS Black 6K monitor (like all IPS monitors) has a modest contrast ratio, which dulls the impact of HDR. Mini-LED or OLED monitors are much better at this than my IPS Black. Mini-LED is the way to go, but there's no 6K Mini-LED yet; OLED is for TVs.

Everyone knows that HDR is the future. HDR monitors at high res (4K and up) produce more vivid images and add luminance detail with HDR content. Bright highlights, like a sunset or headlights in a dark alley, are noticeably more detailed than in SDR. HDR is better than SDR, and that is a simple fact. Pretty soon, it won't matter. We will all have it.

I have been told for years here on DPR that a blown highlight is a blown highlight, but that must not be true now because when I review my GFX raw files with HDR turned on in LR, the "blown" highlights suddenly pop back with good detail that is sort of striking, and the file takes on a more luminous looking aspect that I can't describe well.

I read all the monitor reviews on Tom's and those guys are the best at commenting on monitors. They all agree that HDR is the future for any monitor and the content is coming fast. The thing is, we already have the content with our high-res GFX and Hassy raw files.
 
So I can edit the images in HDR in Lightroom, but when I bring one into Photoshop it comes over as a 32-bit file and none of my workflow works anymore: I have to convert it to 16-bit to make Topaz, etc., work.

When I convert it to 16-bit, it does "HDR toning" and the colors get all messed up.

What's the best way to go from lightroom to photoshop?
 
I have been told for years here on DPR that a blown highlight is a blown highlight, but that must not be true now because when I review my GFX raw files with HDR turned on in LR, the "blown" highlights suddenly pop back with good detail that is sort of striking, and the file takes on a more luminous looking aspect that I can't describe well.
If there's good detail, and it's not detail that was invented by the raw developer, the highlight wasn't blown in the first place.
 
You can have an SDR 10-bit image that shows more colors than an 8-bit JPEG.
How are you defining "more colors"? More discrete colors that humans can discriminate from each other? How do you perform that calculation?
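For the raw count of encodable values (which sidesteps the harder perceptual question of how many colors a human can actually discriminate), the arithmetic is simple:

```python
# Count of distinct RGB code values an N-bit-per-channel encoding can represent.
# This is encoding capacity only, not the number of humanly distinguishable colors.
def code_colors(bits_per_channel: int) -> int:
    return (2 ** bits_per_channel) ** 3

print(code_colors(8))   # → 16777216  (8-bit JPEG)
print(code_colors(10))  # → 1073741824  (10-bit SDR)
```

Answering the perceptual version of the question would require a color-difference metric such as CIEDE2000 plus a model of the display, which is a much harder calculation.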
 
Chan doesn't provide any explanation as to why this happens, though, just the assertion it does. Is this the fault of the monitor hardware, some effect of the human visual system, a problem with file format, or what? Why don't prints have this issue?
Prints do have this issue. Absent artificial brightening agents, the way to get the highest luminance in a print is to lay down no colorant at all. That produces low chromaticity.
 
I have been told for years here on DPR that a blown highlight is a blown highlight, but that must not be true now because when I review my GFX raw files with HDR turned on in LR, the "blown" highlights suddenly pop back with good detail that is sort of striking, and the file takes on a more luminous looking aspect that I can't describe well.
If there's good detail, and it's not detail that was invented by the raw developer, the highlight wasn't blown in the first place.
Maybe. But we had no way to know and didn't see it until now with HDR in LR. I don't know the tech, but you can see the result.

But I have a simple but probably stupid question. If I show the raw in HDR mode in LR and edit in HDR mode, then export a JPEG, will I see that in the JPEG, even on a non-HDR monitor? Or is an exported JPEG SDR, so it makes no difference and you lose what you saw in HDR?
 
I have had an HDR screen since the very beginning.
If your HDR screen is older it is not the same as what we have now.
There is an overwhelming emphasis on brightness; however, what matters is contrast.

I much prefer my LG OLED TV with true black to the bright image of my MacBook Pro or my iPhones or, even worse, some desktop monitors that are just basic LED.
Of course. We all do. But OLED is terrible for productivity on a desktop monitor: burn-in, poor Word and Excel performance, and trouble in many applications.

I saw an 85-inch mini-LED 4K TV in Costco last week for 3 grand! Amazing.
Ultimately, 10 stops of dynamic range are plenty, and many images do not even reach that.
It's not all about DR with HDR. Even images without much DR, shot in even light, look better.
Sunset, sunrise, and backlit shots are examples where HDR can help, but your examples do not particularly benefit from it.
Mine do. But I haven't figured out how to show you.
Instagram started to support HDR because phones create HDR images
Everything will someday. It will be omnipresent and ubiquitous. You won't be able to argue against it. 😁
Phones will drive adoption; we have had HDR TVs and monitors for years and nothing has happened to date.
Because of content. But we have the content right now with our GFX and Hassy Medium Format cameras. Don't fight progress.
HDR is interesting because it can avoid editing that is often done to rebalance dynamic range into a gamma you can display or print; these are the same principles as HLG broadcasting.
Yes.
 
Chan doesn't provide any explanation as to why this happens, though, just the assertion it does. Is this the fault of the monitor hardware, some effect of the human visual system, a problem with file format, or what? Why don't prints have this issue?
Prints do have this issue. Absent artificial brightening agents, the way to get the highest luminance in a print is to lay down no colorant at all. That produces low chromaticity.
Monitors will blow prints away in many aspects very soon. Well, they already have.

But I love prints, and we don't have many 6-foot monitors, but I see 6-foot prints in galleries a lot.
 
You can have an SDR 10-bit image that shows more colors than an 8-bit JPEG.
How are you defining "more colors"? More discrete colors that humans can discriminate from each other? How do you perform that calculation?
I could do that in seconds for you Jim, but I left my slide rule at home and I'm in MX. 😎
 
Greg,

This is your time. You are the DPreview Medium Format Board's prophet and evangelist of viewing images (especially medium format images) on beautiful screens.

HDR is what you have been waiting for.

Come back from Mexico, you are needed. Come back to your 32-inch screen. Hit the HDR button. We need you.
Darin,

I am an HDR novice, and my Dell 6K has decent but not top-end HDR capability. My IPS Black 6K monitor (like all IPS monitors) has a modest contrast ratio, which dulls the impact of HDR. Mini-LED or OLED monitors are much better at this than my IPS Black. Mini-LED is the way to go, but there's no 6K Mini-LED yet; OLED is for TVs.

Everyone knows that HDR is the future. HDR monitors at high res (4K and up) produce more vivid images and add luminance detail with HDR content. Bright highlights, like a sunset or headlights in a dark alley, are noticeably more detailed than in SDR. HDR is better than SDR, and that is a simple fact. Pretty soon, it won't matter. We will all have it.

I have been told for years here on DPR that a blown highlight is a blown highlight, but that must not be true now because when I review my GFX raw files with HDR turned on in LR, the "blown" highlights suddenly pop back with good detail that is sort of striking, and the file takes on a more luminous looking aspect that I can't describe well.
A blown highlight is still a blown highlight in HDR; in fact, blown highlights look terrible in HDR. Highlights that show no detail or color in SDR mode can suddenly come alive in HDR, but those highlights were not actually clipped, as you can verify with RawDigger.
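RawDigger inspects the raw values interactively; the underlying clipping check can be sketched in a few lines. This is an illustration on synthetic data, not RawDigger's actual method, and the white level here (16383, i.e. 14-bit) is just an example value:

```python
import numpy as np

def clipped_fraction(raw: np.ndarray, white_level: int, margin: int = 0) -> float:
    """Fraction of raw samples at or above the sensor's clipping point."""
    return float(np.mean(raw >= white_level - margin))

# Synthetic 14-bit raw data with a small saturated patch.
rng = np.random.default_rng(0)
raw = rng.integers(0, 15000, size=(100, 100))
raw[:5, :5] = 16383  # simulate a blown 5x5 patch at the white level
print(clipped_fraction(raw, white_level=16383))  # → 0.0025
```

If that fraction is (near) zero, the "blown" SDR highlight still contains real data, which is exactly what HDR rendering can reveal.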
I read all the monitor reviews on Tom's and those guys are the best at commenting on monitors. They all agree that HDR is the future for any monitor and the content is coming fast. The thing is, we already have the content with our high-res GFX and Hassy raw files.
 
I have been told for years here on DPR that a blown highlight is a blown highlight, but that must not be true now because when I review my GFX raw files with HDR turned on in LR, the "blown" highlights suddenly pop back with good detail that is sort of striking, and the file takes on a more luminous looking aspect that I can't describe well.
If there's good detail, and it's not detail that was invented by the raw developer, the highlight wasn't blown in the first place.
Maybe. But we had no way to know and didn't see it until now with HDR in LR. I don't know the tech, but you can see the result.
No maybe about it; clipped highlights look horrible in HDR.
But I have a simple but probably stupid question. If I show the raw in HDR mode in LR and edit in HDR mode, then export a JPEG, will I see that in the JPEG, even on a non-HDR monitor? Or is an exported JPEG SDR, so it makes no difference and you lose what you saw in HDR?
I read that an HDR mode JPEG can include a gain map which would allow the image to be displayed properly in SDR mode as well.
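That matches my understanding: a gain-map file carries an SDR base image plus a per-pixel gain toward the HDR rendition, and the viewer applies as much of the gain as the display's headroom allows. The sketch below is a simplified model of that idea, not the exact Adobe/ISO encoding, and the parameter names are made up for illustration:

```python
import numpy as np

def apply_gain_map(sdr_linear, log2_gain, headroom_stops, max_gain_stops):
    """Scale a linear SDR base toward HDR using a per-pixel log2 gain map.

    A display with no headroom (pure SDR) applies none of the gain; a
    display with full headroom applies all of it; anything in between
    gets a partial, smoothly scaled boost.
    """
    w = min(max(headroom_stops / max_gain_stops, 0.0), 1.0)
    return sdr_linear * (2.0 ** (log2_gain * w))

sdr = np.array([0.5, 0.9])    # linear SDR base values
gain = np.array([0.0, 2.0])   # shadows unchanged; highlight boosted 2 stops
print(apply_gain_map(sdr, gain, headroom_stops=2.0, max_gain_stops=2.0))  # → [0.5 3.6]
print(apply_gain_map(sdr, gain, headroom_stops=0.0, max_gain_stops=2.0))  # → [0.5 0.9]
```

On an SDR display you just see the base image unchanged, which is why a gain-map JPEG degrades gracefully.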
 
A blown highlight is still a blown highlight in HDR; in fact, blown highlights look terrible in HDR. Highlights that show no detail or color in SDR mode can suddenly come alive in HDR, but those highlights were not actually clipped, as you can verify with RawDigger.
Very interesting. Thx.
 
I have been told for years here on DPR that a blown highlight is a blown highlight, but that must not be true now because when I review my GFX raw files with HDR turned on in LR, the "blown" highlights suddenly pop back with good detail that is sort of striking, and the file takes on a more luminous looking aspect that I can't describe well.
If there's good detail, and it's not detail that was invented by the raw developer, the highlight wasn't blown in the first place.
Maybe. But we had no way to know and didn't see it until now with HDR in LR. I don't know the tech, but you can see the result.
No maybe about it; clipped highlights look horrible in HDR.
Thanks for the correction. I'm still learning HDR tech.
But I have a simple but probably stupid question. If I show the raw in HDR mode in LR and edit in HDR mode, then export a JPEG, will I see that in the JPEG, even on a non-HDR monitor? Or is an exported JPEG SDR, so it makes no difference and you lose what you saw in HDR?
I read that an HDR mode JPEG can include a gain map which would allow the image to be displayed properly in SDR mode as well.
If that is true, then I should be exporting every one of my jpegs in HDR mode.

I am in MX shooting IR on a bright white beach with bright white buildings. I just noticed this morning that my 4-year-old Dell XPS 15 4K laptop shows the highlights in these images way better in HDR mode in LR. I didn't know this screen was HDR. I guess I should be in HDR mode when I export the JPEGs to post here today.

--
Greg Johnson, San Antonio, Texas
https://www.flickr.com/photos/139148982@N02/albums
 
Chan doesn't provide any explanation as to why this happens, though, just the assertion it does.
He wrote that the color palette near white is limited in SDR. But in HDR, those highlights are moved to the middle, which allows for a much larger color palette. That sounds like a reasonable explanation to me, and it is confirmed with experiments.
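The "limited color palette near white" claim can be checked with a rough count: on a coarse grid over the sRGB cube, only a small share of color combinations produce high relative luminance, because all three channels must be near maximum at once. (The grid resolution and luminance threshold below are arbitrary choices for illustration.)

```python
import numpy as np

def srgb_to_linear(v):
    """sRGB decode (EOTF) for encoded values in [0, 1]."""
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

# Coarse grid over the sRGB cube: 64 encoded levels per channel.
levels = np.linspace(0.0, 1.0, 64)
r, g, b = np.meshgrid(levels, levels, levels, indexing="ij")
y = (0.2126 * srgb_to_linear(r)
     + 0.7152 * srgb_to_linear(g)
     + 0.0722 * srgb_to_linear(b))  # Rec. 709 relative luminance

near_white = float(np.mean(y > 0.8))  # share of colors brighter than Y = 0.8
print(f"share of sRGB colors with Y > 0.8: {near_white:.4f}")
```

Only a tiny sliver of the gamut sits near white, so bright SDR highlights have little chroma to work with; mapping them to the middle of a brighter HDR range restores a full palette around them.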
Is this the fault of the monitor hardware, some effect of the human visual system, a problem with file format, or what? Why don't prints have this issue?
Prints have the same issue with highlights.
That suggests to me the issue is related to transmissive light viewing. I'd like to understand why it happens. Also, he's one of the LR gurus; you'd expect him to gush about and champion new LR features. Does this compromise his explanation?
I had only skimmed his article before experimenting with HDR. After I made my observations, I went back to check for more details about what I was seeing. Eric's reputation is very good. Until I see proof otherwise, I assume that his articles are trustworthy.
There is a lack of systematic explanation here as to what technical issues HDR solves and how it solves them, which leaves me feeling sceptical. The tech world is full of breakthroughs, many of which don't pan out.
As I wrote, the proof is in the pudding.
 
You can have an SDR 10-bit image that shows more colors than an 8-bit JPEG.
How are you defining "more colors"? More discrete colors that humans can discriminate from each other? How do you perform that calculation?
https://en.wikipedia.org/wiki/Color_depth

oh boy, actual color science! not the fake "color science" term everyone on dpreview likes to misuse.
Most people on DPR like to bypass Jim on all color science questions and come straight to the real color pro scientist: me. I always tell them to just edit in three colors: yellow, blue, and red. LOL

Remember before HDMI when we had those cables with the colored YPbPr splits on them and separate sound cables? Those were the good old days of pre-HDMI TV and video and all of those proprietary chargers. We all had boxes of soon to be obsolete cables.
 
So why do SDR screens lose colour in the highlights? And what is technically different about an HDR screen that solves this problem? Does the human eye normally see more colour in bright areas than we can currently reproduce with SDR? Or is the HDR fix adding colour we wouldn't expect to see? If imaging does fail in this respect, why has nobody mentioned this failing to me once in the last 40 years before today?

Actually, thinking about it, I remember one hint from the Sigma forum, when someone claimed Foveon preserves highlight colours that Bayer sensors lose.
 
So why do SDR screens lose colour in the highlights? And what is technically different about an HDR screen that solves this problem? Does the human eye normally see more colour in bright areas than we can currently reproduce with SDR? Or is the HDR fix adding colour we wouldn't expect to see? If imaging does fail in this respect, why has nobody mentioned this failing to me once in the last 40 years before today?

Actually, thinking about it, I remember one hint from the Sigma forum, when someone claimed Foveon preserves highlight colours that Bayer sensors lose.
I don't understand it at all, David. But I am studying it. Plus, I just saw it on my IR B&W shots on a 4K laptop that I didn't know was HDR. Saw it just now on those beach IR shots in LR in HDR mode. I don't think I exported it on the JPEGs that I just posted. That part I don't understand.
 
