Editing for SDR and HDR output

SrMi

This post is about HDR output, as discussed here, not HDR merge, which involves merging several images into one.

Editing and sharing HDR output images is still in its infancy and is barely supported across various sharing tools.

I am sharing two images in two versions: one edited for SDR and one edited for HDR:

Zonerama HDR vs. SDR Album

If you have an HDR monitor (at least 1000 nits) and a browser that supports HDR, you will notice its advantages. HDR output is not about garish images that burn your retina but fine improvements that boost the joy of looking at a recorded scene.

The advantages of HDR:
  • Less post-processing is needed (less masking and less fighting the highlights)
  • The washed-out highlights have better color and better detail.
  • The greyish sky of SDR gets its blues back in HDR :).
  • The colors in SDR are muted and require lots of work to make them pop.
The disadvantage of HDR output is the lack of support, but it is being worked on. Ideally, a single image file should contain both SDR and HDR versions, but I have not managed to do that yet.
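
The single-file idea is exactly what gain-map formats aim at: store the SDR rendition plus a small map of per-pixel brightness ratios, and let capable displays reconstruct the HDR version. A minimal sketch of the concept in Python (the function names and the simple clipping scheme here are my own illustration, not any particular format's specification):

```python
import numpy as np

def make_gain_map(hdr, sdr, eps=1e-6):
    """Per-pixel log2 ratio between the HDR and SDR renditions (linear light)."""
    return np.log2((hdr + eps) / (sdr + eps))

def apply_gain_map(sdr, gain_map, headroom_stops=4.0, eps=1e-6):
    """Reconstruct the HDR rendition, limited to the display's available headroom.

    A display with no headroom (headroom_stops=0) just shows the SDR image,
    so the same file degrades gracefully on lesser screens.
    """
    gain = np.clip(gain_map, 0.0, headroom_stops)
    return (sdr + eps) * np.exp2(gain) - eps
```

An SDR-only viewer simply ignores the map, which is what makes the single-file approach backward compatible.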

Once mature, I believe HDR output will supersede SDR as the display output format of choice. I agree with Eric Chan: “A well-made print is a thing of beauty: a physical, tactile representation of a photograph that stands on its own.” However, an HDR image shown on a proper display will be a serious alternative to a print, not just a convenience.
 
"So what's this HDR about?", I thought. So I looked at my monitor and noticed it has this badge on it that says "HDR". I picked up the remote control - yes, my monitor has a remote control to operate its functions instead of those ludicrous fiddly buttons hidden underneath that so many have. I scrolled through the onscreen menus and found an option under the colour submenu called HDR. I enabled it. Immediately my screen went 10x brighter and vapourised my retinas. I soldiered on and examined the sample image. My first thought looking at the images was "these are hideously bright". My second thought was the "second image of each pair appears to be grossly oversaturated and psychedelic".

My conclusion was that this new HDR function has nothing to do with capturing and recording data from high subject brightness range scenes, i.e. extending the dynamic range of the capture device. Instead, it is about extending the brightness range of the output display device.

I think Ansel Adams proved that this is not a desirable goal. The absolute dynamic range of the output device is essentially irrelevant - what matters is the tonal relationships within whatever dynamic range your output device has. If you get those tonal relationships correct, the brain is fooled into perceiving a much more brilliant and wide-ranging image. Your perception manufactures a convincing impression of the brilliance of the image. You don't need a 1:10,000 output range to convincingly portray a 1:10,000 range in the original scene. A 1:50 matte print can do it very well, as long as the tonal range of the print is carefully chosen to trick the brain into recreating an impression of the original scene.
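
The non-linear compression being described can be made concrete with any of the standard global tone curves. Here is a sketch using the simple Reinhard operator (my choice of curve for illustration, not anything Adams used), which squeezes an unbounded scene range into 0..1 while keeping both ends of the range distinguishable:

```python
def reinhard(l):
    """Map scene-referred luminance (0..inf) into display range (0..1)."""
    return l / (1.0 + l)

# A scene spanning four orders of magnitude...
deep_shadow, bright_highlight = 0.001, 10.0
# ...lands in a far narrower output range, yet both ends stay separated,
# which is what lets the brain reconstruct the impression of the full range.
lo, hi = reinhard(deep_shadow), reinhard(bright_highlight)
```

The specific curve matters less than the principle: monotonic, non-linear compression that spends the output range where the eye needs it.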

What I hate about all monitors is that the brighter half of the display range is far too bright in relation to the darker half, resulting in a searing, painful, fatiguing experience (when I worked in an office, I sometimes wore sunglasses to avoid eyestrain and headaches from screen work). So when viewing images with extensive highlight regions, the absolute last thing I want is a mode that makes my backlight 10x brighter. I turned off the HDR function ASAP and went back to viewing a very carefully dimmed-down-as-far-as-it-can-be screen, with images portrayed as gently and inoffensively as possible by a harsh backlit transmissive device such as a screen. The ideal screen for me would be an e-ink e-reader lit by reflected light only.

--
2024: Awarded Royal Photographic Society LRPS Distinction
Photo of the day: https://whisperingcat.co.uk/wp/photo-of-the-day/
Website: http://www.whisperingcat.co.uk/
DPReview gallery: https://www.dpreview.com/galleries/0286305481
Flickr: http://www.flickr.com/photos/davidmillier/ (very old!)
 
You do not seem to have a proper display for showing HDR photos. You should not be turning HDR on and off on your display; rather, the display recognizes an HDR photo and renders its extended dynamic range automatically.

The best displays are the ones in MacBook Pros and XDR monitors; the latest iPhones and Android phones are good, too. There are others, but I have experience only with the ones mentioned.

The HDR photos are not much brighter, and the difference is sometimes quite subtle. If you see a huge difference in brightness, the problem is at your end.

Edit: Could you please share what kind of display you use to view HDR images?
 
I appreciate seeing this post! HDR is likely the most common format in photography when you consider that most modern phones shoot in HDR by default.

I share photos with family and friends via Google Photos. Anyone with a recent phone, a Mac laptop, or an iPad views them in HDR without even knowing it.

I've started editing my mirrorless photos in HDR in Lightroom when it's appropriate to the image, and then sharing them via Google Photos. Lightroom has made it so easy. I just have to be careful when I export to remember whether I'm exporting in HDR or sRGB.

The serious photography community seems very slow to embrace HDR, but for some photos it's great, since it gives more room for the highlights, as you mentioned.
 
I couldn't agree with you more. Most photographers don't seem to have heard of HDR stills (they assume we are talking about merging three or more photos to squeeze a high dynamic range scene into a regular dynamic range display or print). I wish there were a different name for it!

I've been using HDR in a serious way for the past week--I'm working on a slideshow project and the deadline is approaching and, of course, I thought "why not convert all 350 images to HDR?" Of course! It will be easy! Ha!

For those unfamiliar, an HDR photo is a photo with up to four more stops of highlight "headroom" than a normal photo. Many displays cannot display this extra headroom correctly, but that is quickly changing. Your home TV probably already can, and if you are an Apple Mac, iPad or iPhone user you've probably been viewing HDR images for years now (the iPhone default is to shoot HDR). If you have one of the new MacBook Pros then you have not only an HDR screen but a wonderful, amazing HDR screen.

Adobe (in Photoshop and Lightroom) started supporting HDR stills late last year. I believe Chrome already does and Safari (along with Messages) will start supporting HDR in a few weeks.

What about shooting HDR? In all probability you have been shooting HDR for years--a RAW file is all you need. (Note: just changing the settings on your TV to HDR and viewing a JPEG does not work--that's just the TV trying to fake it from a substandard file.) Your whole archive is waiting for you.

Try it out. Go into Lightroom or Photoshop with your favorite RAW file and hit that little "HDR" button and then notice that the histogram changes. Now it has two parts--the SDR part (the regular part) and the extended HDR part. Proceed as normal.
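
That two-part histogram has a simple interpretation in linear light: with SDR white pinned at 1.0, the "regular" part is everything at or below 1.0, and the extended part is whatever sits in the headroom above it (up to 4 stops, i.e. 16x). A sketch of the idea (the function is my own illustration, not Adobe's API):

```python
import numpy as np

def split_histogram(linear, headroom_stops=4):
    """Count pixels in the SDR part (<= 1.0 with SDR white at 1.0)
    and in the HDR headroom above it (up to 2**headroom_stops)."""
    sdr = np.count_nonzero(linear <= 1.0)
    hdr = np.count_nonzero((linear > 1.0) & (linear <= 2.0 ** headroom_stops))
    return sdr, hdr
```

An image whose pixels all sit at or below 1.0 would look identical in SDR and HDR; only pixels in the headroom make the two renditions diverge.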

You'll quickly find that you are on unfamiliar ground a bit. Some photos look amazing and have been waiting for HDR all along. Some photos look crazy weird with super bright areas that you won't like. Other photos don't seem to change at all. There will be an adjustment period as you acclimatize to this new thing.

As an example: A few years ago in California, the sky was red from the smoke of the forest fires, so I went out shooting. The photos never really had the look I wanted (and saw with my eyes) of this dome of weird light above a red-darkened town, with the house and street light still on. With HDR it looks perfect. I'm terribly excited.

For a while still these images will be hard to share and there will be lingering issues all through the image-making pipeline, so not everyone is going to jump in. But if you are aiming at a screen-based display as your final output then jumping in is highly recommended and, oh my god the images!

--Darin
 
Here are two links that will help people get up to speed on HDR still photography.

The first is from Apple's recent WWDC event in June; it explains the technology and the ideas behind it in super clear terms, then goes into how to implement those ideas in code. The first half of the video will be of interest to everyone, including non-programmers; the second half is for programmers only.

https://developer.apple.com/videos/play/wwdc2024/10177/

This second link is to the website of photographer Greg Benz, who has been working with HDR stills and related issues in a big way. Great site to explore.

https://gregbenzphotography.com/

--Darin
 
One additional thought--

In my mind, this technology marks the divergence of paper-based images and screen-based images.

In the past, especially in these parts, screen-based images were considered secondary to print-based images, even though only a small number of photographers made prints of their images.

But you can see the point--the screen image looked more or less like the print image and so one was just a version of the other.

With HDR the screen image is no longer a version of the paper image; it looks and feels like something altogether different...at least for some images. Of course, images with limited tonal range will look identical in SDR and HDR; only when images go beyond the SDR tonal range will there be any difference. But when there is a difference, that difference can be extraordinary.

--Darin
 
Is this the same HDR you are talking about?

HDR for me is all about the capture device. With a typical sensor, if the subject brightness range exceeds the dynamic range of the sensor, to avoid clipping the highlights you have to reduce the exposure. This causes the shadow regions to be underexposed and dark. If you later raise the levels of the shadows in post, you expose ugly noise and banding.

One way to solve this problem is some form of exposure blending, whereby you shoot at least one image carefully overexposed to ensure the shadows get adequate light, and one image underexposed to ensure the highlights are not clipped. You then blend those images in post so that your final image consists of a mix of the overexposed shadows and the underexposed highlights. This gives no highlight clipping, no shadow noise, maximum detail across the range, and lots of scope for editing the image to taste. I believe something like this is essentially what smartphones do in an automated way, and it is what I think of when I hear the term "HDR".
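
That blending idea can be sketched in a few lines. Assume two linear-light frames, the darker one shot N stops under the brighter one; the weight favours the bright frame everywhere except near its clipping point. (The weighting scheme here is my own toy illustration; real merge algorithms are considerably more careful about noise, alignment, and ghosting.)

```python
import numpy as np

def blend_exposures(bright, dark, stops_apart=2.0, clip=0.95):
    """Merge a well-exposed and an underexposed frame (both linear light)."""
    # Weight -> 1 in the shadows, -> 0 as the bright frame approaches clipping.
    w = np.clip((clip - bright) / clip, 0.0, 1.0)
    dark_matched = dark * (2.0 ** stops_apart)  # align the exposures first
    return w * bright + (1.0 - w) * dark_matched
```

In regions where the bright frame is clipped, the result comes entirely from the exposure-matched dark frame, which is where the recovered highlight detail lives.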

SrMi appears to be talking about something different, a hardware thing I'm not familiar with that has to do with something built into certain display monitors. From his description, it appears to rely on using a much brighter backlight. I'm not entirely sure what this achieves, other than perhaps making the shadows lighter so you can see detail in the shadows that would normally be lost in darkness.

I'm sure there must be more to it than this. Perhaps SrMi could explain the background more thoroughly, including details of the gear you need to make this work and how you prepare and view your HDR images, before we try to understand what benefits it brings and test out the sample images? At the moment I have no real idea what I'm looking at, except that it is not a pleasant experience.

 
See the links I posted to the video from Apple and from photographer Greg Benz: https://www.dpreview.com/forums/post/67846616

The HDR you describe is the "old" HDR--where you are trying to shrink a scene's brightness range to that of your screen (or your print). That is not this.

This "new" HDR starts with the idea that the screen you are probably looking at can display a much greater range of brightness than the older screens--a higher dynamic range, in so many words. At the same time, your RAW files have a much greater tonal range than can be displayed on the old screens, and so displaying them on old screens requires some sort of compression of their native tonal range. (Not a technical description!)

If a screen can display a high dynamic range and a RAW file has a high dynamic range--what's missing? The editing software and, in general, agreed upon standards for displaying these high dynamic range images. Enter Adobe, last fall, who updated Photoshop and Lightroom to display HDR stills. Enter Apple who updated Photos some time ago to display HDR stills (quite logically since the iPhone for, like, the past four versions, has been shooting HDR stills). Enter Google which updated Chrome recently to display HDR stills. Enter Apple again which will, after it gets out of Beta, support HDR stills in the new OS, including Safari and Messages.

To be clear, this "new" HDR is not the one you heard about before last year, where you merged multiple photos. Forget about that.

Also, this is not about making every pixel brighter and thus lightening the shadows. This is about taking advantage of the very high dynamic range of modern monitors and home screens for still photography (video has had HDR for forever now--still photography is at last catching up).
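
The video side gives a concrete sense of how HDR pixel values relate to absolute brightness: the PQ curve (SMPTE ST 2084, used by HDR10 and Dolby Vision video, and also by some HDR still formats) maps a 0..1 code value to luminance up to 10,000 nits:

```python
# SMPTE ST 2084 (PQ) EOTF constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(code):
    """Map a non-linear PQ code value (0..1) to luminance in cd/m^2 (nits)."""
    p = code ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)
```

The lower half of the code range covers only around 92 nits; the upper half carries all the highlight headroom. That is the point made above: HDR is about where the extra range goes, not about making the whole picture brighter.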

You cannot print HDR images without greatly shrinking their dynamic range since paper cannot really be made brighter to capture that extra four stops of brightness available with HDR stills. HDR is only for screen display.

You don't need a new camera since all of your RAW photos, going back to the beginning, will be able to take advantage of this extra brightness (assuming it works aesthetically for the image!) All of your iPhone stills are already HDR--when you look at a photo on your phone and it pauses for a half second and then sort of brightens/glows? That's the HDR kicking in.

The new HDR takes a minute to get your head around, but it is well worth it--check out the Apple video especially (the first half). It lays it all out very clearly.

--Darin
 
The use of language is important here, otherwise confusion sets in.

Subject brightness range = the brightness range in the original scene from dark to light expressed as a ratio. On a dull overcast day this can be very low, about 1:20. On a bright, harshly lit scene with deep shadow it can be very high, as much as 1:20,000.

Sensor/film exposure range (also commonly called sensor dynamic range) = the range between the highlight clipping point and the darkest recorded shadow in the raw file/negative. There are different ways of measuring this, depending on how tolerant you are of shadow noise. DxO, for example, uses what it calls engineering dynamic range, which includes a lot more noise than is useful for photographic purposes, which is why they can rate modern sensors at around 15 stops of dynamic range (about 1:30,000). Photons to Photos uses a less noisy range they call photographic dynamic range, which rates modern sensors at around 11-12 stops of PDR (1:2,000, if I have added up on my fingers correctly).

Output medium reflectance (print) or transmission contrast range (screen) = the measured ratio between the brightness of the paper base and the darkest tone the print can produce (Dmax). Screen contrast is measured in some way I don't understand. A typical matte finish printing paper or darkroom wet print has a reflectance range of about 1:50; a high-gloss print about 1:200.
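
Since all of these ranges are powers of two, converting between "stops" and contrast ratios is easy to check. A quick sketch confirming the finger arithmetic above:

```python
import math

def stops_to_ratio(stops):
    """A range of N stops is a 1:2**N contrast ratio."""
    return 2.0 ** stops

def ratio_to_stops(ratio):
    """Inverse: how many stops a given contrast ratio spans."""
    return math.log2(ratio)

# 11 stops of PDR is a 1:2048 range, so "about 1:2,000" checks out,
# and a 1:20,000 scene needs a little over 14 stops to record.
```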

I presume that the HDR monitors we are discussing have a contrast range that far exceeds that of typical everyday monitors, and that what is being discussed here is the benefit of using a screen capable of much higher contrast than traditional screens?

I'm unsure what the photographic benefits of this are, as Ansel Adams proved that 1:50 or less is perfectly capable of producing a fully plausible image of a very high subject brightness range, if skillfully produced. The trick is to present a tonal range that the eye/brain can expand into the perception of a full range. It's all about exactly how the brightnesses of the original scene are compressed in a non-linear fashion to create a tonality that tricks the brain into perceiving a full range of tones that isn't actually present in the output medium. You don't actually need an output device with a huge contrast range to do this. If you see an Ansel print, they seem somehow to glow with an imaginary inner light. It's a trick exploiting the human visual system, but it works, given sufficient skill. Using an output device with a huge contrast range seems to me not only unnecessary but akin to smashing yourself in the face with a sledgehammer to achieve the effect of seeing stars...
 
 
See the links I posted to the video from Apple and from photographer Greg Benz: https://www.dpreview.com/forums/post/67846616
I will look at this.
The HDR you describe is the "old" HDR--where you are trying to shrink a scene's brightness range to that of your screen (or your print). That is not this.
You cannot have any kind of true output HDR unless you have already captured the data from a high subject brightness range scene. Although you can, of course, expand a narrow subject brightness range to a greater output contrast range in post. We sometimes do this when we drag the ends of the histogram left and right to add contrast and punch to a flat capture.
This "new" HDR starts with the idea that the screen you are probably looking at can display a much greater range of brightness than the older screens--a higher dynamic range, in so many words. At the same time, your RAW files have a much greater tonal range than can be displayed on the old screens, and so displaying them on old screens requires some sort of compression of their native tonal range. (Not a technical description!)
I'd argue (as Ansel Adams did) that the absolute dynamic range of your output device doesn't matter at all. He said that old daguerreotypes he measured with his reflectance densitometer as having a reflectance range as low as 1:10 were more than capable of displaying perfectly plausible images of extremely high subject brightness range scenes. The trick is in the way you organise the non-linear compression of the tones. Do it right and it fools the eye into perceiving a much higher contrast range. The heavy lifting is done by the brain and how it interprets the tonality. Technically it is all about exactly how the negative/raw file tones are mapped to the output medium. Hence his invention of the zone system and sophisticated film development methods.
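Adams's point can be put in numbers. A simple global non-linear curve, such as the textbook Reinhard operator (my choice of example, not Adams's actual method), compresses an arbitrarily large scene luminance range into the 0-1 range of a low-contrast medium while keeping tonal separation everywhere:

```python
# Reinhard-style global tone curve: maps any non-negative scene
# luminance into [0, 1) while staying strictly monotonic, so tonal
# separation survives the compression.
def reinhard(lum):
    return lum / (1.0 + lum)

shadow = reinhard(0.001)     # deep shadow stays near 0
midtone = reinhard(1.0)      # middle grey lands at 0.5
highlight = reinhard(20.0)   # a 1:20,000 scene still fits under 1.0
```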
If a screen can display a high dynamic range and a RAW file has a high dynamic range--what's missing? The editing software and, in general, agreed upon standards for displaying these high dynamic range images. Enter Adobe, last fall, who updated Photoshop and Lightroom to display HDR stills. Enter Apple who updated Photos some time ago to display HDR stills (quite logically since the iPhone for, like, the past four versions, has been shooting HDR stills).
Sadly, I do not have an iPhone, nor do I use LR any more except for printing (ver 6.14 perceptual), nor do I use Chrome, nor do I use Apple OS.
Enter Google, which updated Chrome recently to display HDR stills. Enter Apple again, which will, after it gets out of beta, support HDR stills in the new OS, including Safari and Messages.

To be clear, this "new" HDR is not the one you heard about before last year, where you merged multiple photos. Forget about that.

Also, this is not about making every pixel brighter and thus lightening the shadows. This is about taking advantage of the very high dynamic range of modern monitors and home screens for still photography (video has had HDR for forever now--still photography is at last catching up).
My monitor does have an HDR badge on it, but my brief experiment of switching it from SDR to HDR mode resulted in a massive increase in the brightness of the backlight, exactly the opposite of my careful setup, which keeps my monitor as dim as possible. If this backlight boost is necessary for HDR, it's definitely not for me.
You cannot print HDR images without greatly shrinking their dynamic range since paper cannot really be made brighter to capture that extra four stops of brightness available with HDR stills. HDR is only for screen display.
This is missing the point, as Ansel explained. You don't need a high dynamic range output device to perceive high dynamic range; you just need appropriate tone mapping to fool the brain into internally creating the HDR perception.
You don't need a new camera, since all of your RAW photos, going back to the beginning, will be able to take advantage of this extra brightness (assuming it works aesthetically for the image!). All of your iPhone stills are already HDR--when you look at a photo on your phone and it pauses for a half second and then sort of brightens/glows? That's the HDR kicking in.
It sounds to me to be akin to turning the volume up to 11 in a bid to improve sound fidelity. You certainly get an instant rush, but it doesn't actually improve fidelity.
The new HDR takes a minute to get your head around but very worth it--check out the Apple video especially (the first half). It lays it all out very clearly.
I will, I like to understand new stuff even if I don't end up using it.

--
2024: Awarded Royal Photographic Society LRPS Distinction
Photo of the day: https://whisperingcat.co.uk/wp/photo-of-the-day/
Website: http://www.whisperingcat.co.uk/
DPReview gallery: https://www.dpreview.com/galleries/0286305481
Flickr: http://www.flickr.com/photos/davidmillier/ (very old!)
 
I appreciate seeing this post! HDR is likely the most common format in photography when you consider that most modern phones are shooting in HDR by default.

I share photos with family and friends via Google Photos. Anyone with a recent phone, a Mac laptop, or an iPad views them in HDR without even knowing it.

I've started editing my mirrorless photos in HDR in Lightroom when it's appropriate to the image, and then I share them via Google Photos. Lightroom has made it so easy. I just have to be careful when I export to remember whether I'm exporting in HDR or sRGB.

The serious photography community seems very slow to embrace HDR, but for some photos it's great, since it gives more room for the highlights, as you mentioned.
Is this the same HDR you are talking about?

HDR for me is all about the capture device. With a typical sensor, if the subject brightness range exceeds the dynamic range of the sensor, to avoid clipping the highlights you have to reduce the exposure. This causes the shadow regions to be underexposed and dark. If you later raise the levels of the shadows in post, you expose ugly noise and banding.

One way to solve this problem is some form of exposure blending whereby you shoot at least one image carefully overexposed to ensure the shadows get adequate light, and one image underexposed to ensure the highlights are not clipped. You then blend those images in post so that your final image consists of a mix of the overexposed shadows and the underexposed highlights. This gives no highlight clipping, no shadow noise and the maximum detail across the range and lots of scope for editing the image to taste. I believe something like this is essentially what smartphones do in an automated way, and this is what I think of when I hear the term "HDR".
As has been written, we are not talking about HDR that merges images but about HDR output, which can display two to four stops more DR than SDR output.
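For concreteness, here is the stop arithmetic, assuming an SDR reference white of about 100 nits (a common working value, not a number from this thread):

```python
import math

sdr_white = 100.0                   # nits; assumed SDR reference white
two_stops = sdr_white * 2 ** 2      # +2 stops -> 400.0 nits
four_stops = sdr_white * 2 ** 4     # +4 stops -> 1600.0 nits
# Headroom of a 1000-nit display over that SDR white, in stops:
headroom = math.log2(1000.0 / sdr_white)  # ~3.32 stops
```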
SrMi appears to be talking about something different, a hardware thing I'm not familiar with that has to do with something built into certain display monitors. From his description, it appears to rely on using a much brighter backlight. I'm not entirely sure what this achieves, other than perhaps making the shadows lighter so you can see detail in the shadows that would normally be lost in darkness.

I'm sure there must be more to it than this, perhaps SrMi could explain the background more thoroughly, including details of the gear you need to make this work, and how you prepare and view your HDR images before we try and understand what benefits it brings, and test out the sample images? At the moment I have no real idea what I'm looking at, except that it is not a pleasant experience.
Yes, we (except you) are talking about the same HDR output, as described here (highly recommended). This article answers all your questions.

High Dynamic Range Explained by Eric Chan (Adobe)

The sensor's dynamic range exceeds that of the print or the SDR-capable display device. HDR mode allows for 2 (my preference) or 4 more stops of DR to be properly displayed. This means that the post-processor does not need to "crush" and compress the highlights to display them as it does in SDR mode.

HDR output mode makes highlights alive, improving both color and visible detail. HDR mode is most obvious when comparing SDR with HDR images. In high-contrast situations, I typically lose all the blue and detail of the sky when processing in SDR. The original blue color and the sky's detail are resurrected when processing in HDR. HDR output mode is not so much about the brightness of highlights as it is about the quality of highlights. Processing for HDR has its challenges in making the highlights tastefully bright and not feel like knives piercing your retina.

By the way, HDR images must be processed in 32 bits, which limits the tools you can use in PS. As an output format, AVIF with the HDR P3 color space works best for me so far.
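For background on what such a file stores: HDR still formats like AVIF commonly use the SMPTE ST 2084 "PQ" transfer function in place of an SDR gamma curve. A sketch of its inverse EOTF, which maps absolute luminance in nits to a [0, 1] code value (constants from the ST 2084 spec):

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance -> [0, 1] code.
M1 = 2610 / 16384            # ~0.1593
M2 = 2523 / 4096 * 128       # ~78.84
C1 = 3424 / 4096             # ~0.8359
C2 = 2413 / 4096 * 32        # ~18.85
C3 = 2392 / 4096 * 32        # ~18.69

def pq_encode(nits):
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1.0 + C3 * y)) ** M2

# Diffuse white (~100 nits) lands near code 0.51, while the scale
# reaches 10,000 nits at code 1.0 -- enormous highlight headroom.
```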

An HDR display needs to support 1000 nits, and it displays SDR images just as a non-HDR display does.

I hope this answers your questions, though I would still refer to Eric Chan's article for the most thorough and complete discussion of HDR mode.
 
It sounds to me to be akin to turning the volume up to 11 in a bid to improve sound fidelity. You certainly get an instant rush, but it doesn't actually improve fidelity.
SDR mode is more like using an equalizer to cut much of the high frequencies, while HDR lets those higher frequencies pass through unaltered.

If you do not have a monitor that supports 1000 nits (regardless of badging), and if you do not use software that can handle HDR images, you will not be able to see the benefits.

In the proper environment, the SDR and HDR images I shared in the OP have the same overall brightness, with subtle differences: the HDR images have better colors and better detail.
 
One additional thought--

In my mind, this technology marks the divergence of paper-based images and screen-based images.

In the past, especially in these parts, screen-based images were considered secondary to print-based images, even though only a small number of photographers made prints of their images.

But you can see the point--the screen image looked more or less like the print image and so one was just a version of the other.

With HDR the screen image is no longer a version of the paper image; it looks and feels like something altogether different...at least for some images. Of course, images with limited tonal range will look identical in SDR and HDR; only if the images go beyond the SDR tonal range will there be any difference. But when there is a difference, that difference can be extraordinary.
The use of language is important here, otherwise confusion sets in.

Subject brightness range = the brightness range in the original scene from dark to light expressed as a ratio. On a dull overcast day this can be very low, about 1:20. On a bright, harshly lit scene with deep shadow it can be very high, as much as 1:20,000.

Sensor/film exposure range (also commonly called sensor dynamic range) = the range between the highlight clipping point and the darkest recorded shadow in the raw file/negative. There are different ways of measuring this depending on how tolerant you are of shadow noise. DXO, for example, uses what it calls engineering dynamic range, which includes a lot more noise than is useful for photographic purposes, which is why they can rate modern sensors at around 15 stops of dynamic range (about 1:30,000). Photons to Photos uses a less noisy range they call photographic dynamic range, which rates modern sensors at around 11-12 stops of PDR (1:2000, if I have added up on my fingers correctly).

Output medium reflectance (print) or transmission contrast range (screen) = the measured difference in brightness of the paper base to the darkest tone the print can produce (dMax). Screen contrast is measured in some way I don't understand. A typical matte finish printing paper or darkroom wet print has a reflectance range of about 1:50. A high gloss print about 1:200.
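Those ratios convert to photographic stops as log2 of the contrast ratio, which makes the comparison across media concrete (a quick check of the figures above):

```python
import math

def stops(contrast_ratio):
    """Express a contrast ratio (e.g. 50 for 1:50) in stops."""
    return math.log2(contrast_ratio)

matte = stops(50)       # ~5.6 stops for a matte print
gloss = stops(200)      # ~7.6 stops for a high-gloss print
sensor = stops(30000)   # ~14.9 stops, the "engineering DR" figure
```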

I presume that these HDR monitors we are discussing have a contrast range that far exceeds the normal contrast range of typical everyday monitors and what is being discussed here is the benefits of using a screen that is capable of much higher contrast than traditional screens?

I'm unsure what the photographic benefits of this are; as Ansel Adams proved, 1:50 or less is perfectly capable of producing a fully plausible image of a very high subject brightness range, if skillfully produced. The trick is to present a tonal range that the eye/brain can expand to the perception of a full range. It's all about exactly how the brightnesses of the original scene are compressed in a non-linear fashion to create a tonality that tricks the brain into perceiving a full range of tones that aren't actually present in the output medium. You don't actually need an output device with a huge contrast range to do this. If you see an Ansel print, they seem somehow to glow with an imaginary inner light. It's a trick exploiting the human visual system, but it works given sufficient skill. Using an output device with a huge contrast range seems to me to be not only unnecessary but akin to smashing yourself in the face with a sledgehammer to achieve the effect of seeing stars...
You mentioned you are a DXO user…here is an article from them on the topic (I think DXO now supports HDR stills?):

https://www.dxomark.com/dxomark-decodes-understanding-hdr-imaging/

—Darin
I didn't, I just mentioned DXOmark benchmark tests. I use darktable.
 
I watched the video.

What I learnt from it is that HDR screens have a much stronger backlight, which means that highlight tones can be displayed much brighter, preserving detail in bright specular highlights.

As I said before, this sounds like a disastrous move to me, as I already find the backlights on standard monitors too bright in the highlights, leading to viewing fatigue. And this is with the brightness setting on my monitor reduced to 18 and the contrast to 20. HDR sounds like the opposite of what I want to see in a monitor. Ideally, we would have monitors with dull highlight tones and reduced highlight brightness, like a print, but that still possessed clear highlight tonal separation in a compressed form, allowing the brain to subsequently expand this out to give a plausible full-range perception.

The following is just a pure guess on my part, because I know little about monitor tech, but I suspect the need for HDR monitors occurs because standard monitors do not have the same capability that prints do to compress the highlight tones non-linearly without losing the perception of tonal separation. And this means that the only way to create realistic looking highlight tonal separation on a monitor is to increase the highlight brightness dramatically.

I see monitors as a poor way to display quality images: all artificial punch and drama with no subtlety. It's not how I want to view images. Instead of pushing HDR monitor solutions, let's develop reflected-light e-ink to have the detail and tonal range of a print. Then we would have a monitor that would allow us to enjoy the relaxed, gentle experience of the reflected-light print with the convenience of being able to easily change the print on display. Backlit monitors are going in the wrong direction IMO.

 
I booted windows 10, downloaded and installed the latest chrome.

I went into display settings and turned on the HDR option. The screen immediately went massively brighter. I checked the monitor onscreen menu which confirmed that windows had engaged HDR mode without me having to explicitly set it in the OSD.

I went to your link and inspected the pairs of images. I'm assuming that the image that has the HDR on/off switch in the panel below is the HDR image.

Under these settings the images didn't actually look that much different. The HDR image looked better than it did under Ubuntu, so I think we can assume that Ubuntu does not currently support HDR properly. The most noticeable difference was that the HDR versions appeared to have somewhat boosted midtone and shadow regions, akin to applying a midtone-boost curve, so the shadows were lifted and the colours brightened. In all cases I felt that the standard image had a more natural and attractive tonal balance; lifting the shadows is not a good look, as shadows need to keep some solidity to avoid that classic tone-mapped look. Although the effect on these images was minor, it was still quite noticeable in side-by-side comparison.

Of course it is impossible to verify that the whole chain needed to display HDR images correctly is working in my set up. But the settings look right and it looks better than it did on my Linux box.

However, the HDR display setting produces a much brighter viewing experience than I am used to, more like a TV display than a monitor. I think I am quite sensitive to bright light, and I don't like my monitors set bright like that. I'm also used to keeping them dim for a better match with prints. Overall, the differences appear to be minor, and I presume you could re-edit the HDR version to darken the shadows/midtones a little to obtain a more naturalistic look. Then, of course, it wouldn't look any different from SDR, which makes me wonder what the point is of all the palaver! In any case, I like my screen set to a low brightness level for comfort and a better match with prints, and I'm not seeing any solutions to current monitor issues here, unfortunately.

I just don't see chasing monitor tech with ever more contrast capability as a solution for viewing still images. I see it as like surround sound and VR headsets: a solution for more immersive movie experiences, not for beautiful and realistic still image viewing. I think an e-ink-style reflected-light monitor would be the thing for still image viewing if the goal is to reproduce the look of prints, though I doubt there is any chance of that ever happening. Generally, I find viewing photos on a monitor parallels my experience with reading books on portable devices. A Kindle is so superior to an iPad for reading a book, even though for the first few seconds the deep contrast and super-sharp text looks wonderful. As soon as you start reading, rather than gawping, the e-ink is such a nice screen because it is not backlit and there is no bright background shining straight into your eyes. I've never liked shiny printer paper, and I've never liked shiny display screens.

 