HDR standards - DR vs DRR (dynamic range resolution)

quadrox

Definitions:
DR - dynamic range - the number of stops between the darkest and brightest value
DRR - dynamic range resolution - the smallest possible difference between two adjacent values
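
To make the two concrete, here is a minimal sketch (my own illustration, assuming an idealized linear 12-bit sensor with a noise floor of one code):

```python
import math

# Idealized linear 12-bit sensor: codes 0..4095, noise floor at 1 code.
max_code = 2**12 - 1   # brightest representable value
min_code = 1           # darkest distinguishable (non-zero) value

# DR: number of stops between darkest and brightest value
dr_stops = math.log2(max_code / min_code)   # ~12 stops

# DRR: smallest possible difference between values (1 code),
# expressed as a fraction of full scale
drr = 1 / max_code

print(f"DR  = {dr_stops:.1f} stops")
print(f"DRR = {drr:.5f} of full scale (1 code)")
```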

Topic: It makes sense to say, e.g., that a given sensor has 13 stops of dynamic range, and we know both the DR and the DRR - this works well for raw images (at least theoretically).

I have always assumed that, e.g., an sRGB JPEG image also has a well-defined DR and DRR, meaning an ideal monitor capable of displaying 8-bit sRGB images faithfully should always show the same number of stops between pure black and pure white, even if I increase the overall brightness of the image.

Now we are finally getting some image standards with higher bit depths, e.g. HDR10 and whatever else they are called. These sport 10 bits instead of only 8, but where do the extra bits go? Do they extend the DR while the DRR stays the same? Do they affect both? Is this even well defined?

I know that even before HDR standards we had 10-bit monitors, but as far as I know these did not feature expanded DR as such; instead they offered more DRR and an expanded range for colors. Now, if we want to combine this approach of expanded color spaces with the idea of expanding the DR, don't we need more than 10 bits? Why is this not a thing? Indeed, why is the industry (monitors and image standards) lagging so far behind our ability to capture images with high fidelity?
 
JPEG images are 8-bit sRGB so they can be displayed. They are not linear, either.
"JPEG" can be more or less anything. 8-10-12 bits. 1-3-7 color channels. Color definition embedded as meta-data.

The thing that most people think of as "JPEG" is actually JFIF, which restricts JPEG to 1 or 3 color channels, BT.601 YCbCr at 8 bpp with full-range 0...255 encoding, and 4:4:4 to 4:2:0 chroma subsampling (subject to my memory).
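
For reference, the full-range BT.601 RGB→YCbCr mapping that JFIF prescribes looks like this (a minimal sketch; the coefficients are the standard BT.601 ones, the function itself is my own illustration):

```python
def rgb_to_ycbcr_bt601_full(r, g, b):
    """Full-range (0..255) BT.601 RGB -> YCbCr, as used by JFIF."""
    y  =  0.299    * r + 0.587    * g + 0.114    * b
    cb = -0.168736 * r - 0.331264 * g + 0.5      * b + 128
    cr =  0.5      * r - 0.418688 * g - 0.081312 * b + 128
    return y, cb, cr

print(rgb_to_ycbcr_bt601_full(255, 0, 0))  # pure red -> (76.2, 84.9, 255.5)
```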

-h
 
If Eric Fossum ever designs a "JOT display" with miniature "pixels", each either generating a photon or not, what will its dynamic range be? The (substantial) ratio of the intensity/energy of a generated photon to no generated photon (assuming that he also invents a nano-material reflecting no stray light, and/or the display is used in an environment where stray light can be considered approximately zero - such as VR goggles)?

One might take the peak number of photons generated per unit of time (for all pixels being "on") divided by that for all pixels being "off" (ideally zero) and arrive at an infinite number. In some ways it might be useful to show "perfect black". But in practice, I think it is usually more useful to look at the range where one can show a useful, "perceptually linear" gradation. I.e., if the display is capable of showing 8 megapixels (to adopt DXO conventions, regardless of whether the physical panel is 12MP or 100MP), and each of those "1 in 8 million" pixels can increase brightness by some step (by generating one photon out of N possible ones), then that represents the smallest reproduced step. Assuming that the progression from there is linear-light (or perceptually linear), one can take this as the noise floor and define the dynamic range as a fraction of the maximum number of photons within that same region.
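
A back-of-envelope version of that argument (my own sketch; N is an assumed per-pixel photon budget, not a real device figure):

```python
import math

N = 10_000  # assumed maximum photons a pixel can emit per unit of time

# With the smallest reproducible step being a single photon, and taking
# one photon as the noise floor, the usable linear range is:
dr_stops = math.log2(N / 1)
print(f"{dr_stops:.1f} stops")  # ~13.3 stops for this assumed N
```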

There may be cases where my assumptions above fail. For reproducing a starry sky, perhaps being able to do a regular 100:1 range of brightnesses _and_ in addition having the possibility of "perfect black" is a perceptually and/or physically sensible capability.

-h
 
JPEG images are 8-bit sRGB so they can be displayed. They are not linear, either.
"JPEG" can be more or less anything: 8/10/12 bits, 1/3/7 color channels, color definition embedded as metadata.

The thing that most people think of as "JPEG" is actually JFIF, which restricts JPEG to 1 or 3 color channels, BT.601 YCbCr at 8 bpp with full-range 0...255 encoding, and 4:4:4 to 4:2:0 chroma subsampling (subject to my memory).

-h
Yes, I should have written 'the most common JPEG images, such as those produced by cameras and the large majority of editing programs, are'.

A simplification that nearly killed me

It is uncommon, though, to use that compression for 10 bits, and the compression efficiency is rather low by current standards.
 
The issue in the past was the display.

sRGB at the display level has a contrast ratio of 1000:1; this is related to the gamma encoding.

An HDR display has over 16 stops, depending on the standard, so it makes sense to use HEIF images, which support 10-bit encoding and can carry HDR images.

Your old JPEG images are tone mapped; they are still less than 10 stops on the display.
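
As a quick aside, converting between the contrast ratios and stops used in this thread is just a log2 (a trivial sketch of my own):

```python
import math

def ratio_to_stops(cr: float) -> float:
    return math.log2(cr)

def stops_to_ratio(stops: float) -> float:
    return 2.0 ** stops

print(ratio_to_stops(1000))  # ~9.97: a 1000:1 sRGB display is ~10 stops
print(stops_to_ratio(16))    # 65536.0: 16 stops is a 65536:1 contrast ratio
```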

HEIF is a great format; it was about time.
An old article, and television-centric, but I think it has some relevance. I don't recall the specifics, but I think that sRGB and BT.709 are similar in character, if not numerically.

http://poynton.ca/PDFs/GammaFAQ.pdf

"Projected cinema film, or a photographic reflection print, has a contrast ratio of about 80:1. Television assumes a contrast ratio, in your living room, of about 30:1. Typical office viewing conditions restrict the contrast ratio of a CRT display to about 5:1.

At a particular level of adaptation, human vision responds to about a hundred-to-one contrast ratio of intensity from white to black.

If you use nonlinear coding, then the 1.01 “delta” required at the black end of the scale applies as a ratio, not an absolute increment, and progresses like compound interest up to white. This results in about 460 codes, or about nine bits per component. Eight bits, nonlinearly coded according to Rec. 709, is sufficient for broadcast-quality digital television at a contrast ratio of about 50:1."
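
Poynton's "about 460 codes" figure is easy to reproduce (my own quick check of the compound-interest argument, not code from the article):

```python
import math

# A 1.01 ratio between adjacent codes, compounded from black up to
# white over a 100:1 contrast range, requires this many codes:
codes = math.log(100, 1.01)
print(round(codes))  # ~463, i.e. "about 460 codes, or about nine bits"
```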


I see displays marketed as 1000:1 CR, but in order for a user to really appreciate this, I think that one would need:

1) A room that does not wash out blacks with stray light (or a display technology that is highly non-reflective)

2) An image/video format that allows expressing imagery in such a way that the 1000:1 CR can be usefully excited by some input signal. I.e., the brightest and darkest values must be mapped to the input, and in between them a sufficient number of codes must be mapped (nonlinearly) to human perception so as not to have objectionable banding.

I have a nice OLED TV that features very deep blacks on a per-pixel level (due to each subpixel being a separate OLED that can be turned "off"). It does have limits in terms of maximum brightness, though, meaning that it is most useful in a semi-dark home cinema. And they had to include "white" (non-filtered) subpixels in order to achieve sufficient brightness, meaning that saturated color highlights are not possible.

Interestingly, OLEDs seem to use some sort of dither/PWM in order to produce different brightness levels. And while near-perfect black is doable (in somewhat unrealistic conditions), the next-lowest brightness seems to be limited by the duty cycle/spatial extent of their dithering setup.

-h
Viewing conditions are an essential part of contrast ratios. In fact, in a normal office environment, which needs to be sufficiently lit so you can see your keyboard and take notes, you have 300-600 lux; this will only make your screen look OK for word processing and presentations, not for looking at deep shadows.

However, when you look at grading, viewing conditions are much darker. There are now attempts to harmonise HDR displays, mostly, but not only, due to gaming requirements.

https://displayhdr.org/ - VESA is at the forefront and lists compatible devices.

This is my VESA DisplayHDR 400 monitor:


It does a decent job of showing a good contrast ratio as long as the room is dark.

The dynamic range that you perceive with a standard 8-bit sRGB JPEG is indeed quite good, but not as good as an HDR10 video.

Like you, I also have an OLED TV that can only output 600 nits; however, as OLEDs work very well with deep blacks, it is still an HDR screen that is fantastic in dark viewing conditions.

The viewing conditions issue is an interesting one for print, too. You need some light on the print to see it; however, that also destroys the shadows, which is why prints are not going to match a display in either the darks or the highlights.

Mobile devices with brightness over 500 nits are easy to find, so having some HDR that works on the highlights still produces more contrast than an SDR image, regardless of the viewing conditions damaging the darks in both camps.

I think there is a need for more bit depth, more contrast, and also for sequences of images; together with compression efficiency, these will drive new standards. But I also believe you could already have created a 10-bit JPEG with a different gamma and gamut before; yet this is a rarity, and it makes quite large files after all.
 
If ever Eric Fossum designs a "JOT display" with miniature "pixels" each generating either a photon, or not generating a photon, [...]
Per what unit of time? If it is small enough, one can argue that all displays do that already.
 
HLG is not supported by monitors for editing, so it remains a live broadcasting tool
Per their HLG FAQ (and another reference that I can’t seem to find right now), the BBC has found it a viable option to do display-referred editing on, say, a 600 cd/m² monitor, and then apply the appropriate inverse OOTF and OETF to get an HLG signal that displays satisfactorily on various monitors. There is no reason why the editing monitor would have to support HLG natively.
 
HLG is not supported by monitors for editing, so it remains a live broadcasting tool
Per their HLG FAQ (and another reference that I can’t seem to find right now), the BBC has found it a viable option to do display-referred editing on, say, a 600 cd/m² monitor, and then apply the appropriate inverse OOTF and OETF to get an HLG signal that displays satisfactorily on various monitors. There is no reason why the editing monitor would have to support HLG natively.
Neither the editing monitor nor, for that matter, any monitor other than some very expensive ones supports HLG at all.

When you play back HLG on a standard monitor, you still need the monitor to support the BT.2020 color space, and there are some white point issues.

So in practical terms it is not a viable option. I have shot HLG video since 2016; here is an example:


The issue was that I could do practically no color correction, as it would look totally off. I ended up using my TV, but just to test once done, as the computer could not drive an HLG device. In fact, even that video will show a recreated version from YouTube, not the original, which will only play back on a TV set.

Computers really only support HDR10; that is where we are right now, and the BBC has a specialised stream for HLG in their iPlayer.

--
instagram http://instagram.com/interceptor121
My flickr sets http://www.flickr.com/photos/interceptor121/
Youtube channel http://www.youtube.com/interceptor121
Underwater Photo and Video Blog http://interceptor121.com
Deer Photography workshops https://interceptor121.com/2021/09/26/2021-22-deer-photography-workshops-in-woburn/
If you want to get in touch, don't send me a PM; rather, contact me directly at my website/social media.
 
HLG is not supported by monitors for editing, so it remains a live broadcasting tool
Per their HLG FAQ (and another reference that I can’t seem to find right now), the BBC has found it a viable option to do display-referred editing on, say, a 600 cd/m² monitor, and then apply the appropriate inverse OOTF and OETF to get an HLG signal that displays satisfactorily on various monitors. There is no reason why the editing monitor would have to support HLG natively.
Neither the editing monitor nor, for that matter, any monitor other than some very expensive ones supports HLG at all
My point is that it doesn’t matter. You can grade in PQ if you like, and then convert that to an HLG signal that would produce the same output light if the display supported it. The chain that achieves that is:

(PQ signal) → PQ EOTF → (absolute display light) → scale peak luminance to max signal → (relative display light) → HLG OOTF⁻¹ → (relative scene light) → HLG OETF → (HLG signal)
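
A rough numerical sketch of that chain (my own, built from the published BT.2100 constants; it assumes a 1000 cd/m² mastering peak, where the HLG system gamma is nominally 1.2):

```python
import numpy as np

# PQ (SMPTE ST 2084) constants from BT.2100
M1, M2 = 2610 / 16384, 2523 / 32
C1, C2, C3 = 3424 / 4096, 2413 / 128, 2392 / 128

def pq_eotf(e):
    """PQ signal (0..1) -> absolute display light in cd/m^2."""
    p = np.power(np.clip(e, 0, 1), 1 / M2)
    return 10000 * np.power(np.maximum(p - C1, 0) / (C2 - C3 * p), 1 / M1)

def hlg_oetf(e):
    """Relative scene light (0..1) -> HLG signal."""
    a = 0.17883277
    b = 1 - 4 * a
    c = 0.5 - a * np.log(4 * a)
    return np.where(e <= 1 / 12,
                    np.sqrt(3 * e),
                    a * np.log(np.maximum(12 * e - b, 1e-12)) + c)

def pq_to_hlg(rgb_pq, peak=1000.0):
    """PQ -> HLG per the chain above, for an assumed 1000 cd/m^2 peak."""
    gamma = 1.2                                 # HLG system gamma at 1000 nits
    disp = pq_eotf(np.asarray(rgb_pq)) / peak   # relative display light
    y_d = disp @ np.array([0.2627, 0.6780, 0.0593])  # BT.2020 luminance
    scene = disp * np.maximum(y_d, 1e-12) ** ((1 - gamma) / gamma)  # OOTF^-1
    return hlg_oetf(np.clip(scene, 0, 1))

print(pq_to_hlg([0.58, 0.58, 0.58]))  # a grey PQ triplet -> ~0.75 HLG signal
```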
 
My point is that it doesn’t matter. You can grade in PQ if you like, and then convert that to an HLG signal that would produce the same output light if the display supported it. The chain that achieves that is:

(PQ signal) → PQ EOTF → (absolute display light) → scale peak luminance to max signal → (relative display light) → HLG OOTF⁻¹ → (relative scene light) → HLG OETF → (HLG signal)
Canon has just released HEIF HDR10 still support; it is a first.

When it comes to video, you have several options at present, of which the most common are:

1. Capture log -> drop into a PQ timeline (ideal, as PQ is very similar to log) -> grade -> normalise to the maximum luminance of the target display (normally 1000 nits) -> output HDR10 or Dolby Vision

2. Capture RAW video -> convert to log (as this is what editors handle) -> follow the process above

3. Capture HLG -> edit in HLG -> output (problems in the edit; good for straight use with limited corrections)

There are other permutations, with people capturing HLG to produce SDR, but there is a colour space transform, which can get complicated, and the colors go off quickly.

Practically, HLG is not used by anyone other than broadcasters or hobbyists who want to shoot directly in an HDR format.

I am not aware of any device that captures PQ, but I have not looked hard enough; to be frank, I may be wrong.

Due to compatibility, the log path is the most common, as you can also use it to create an SDR version out of the same source material.

RAW video is just starting to become mainstream. I have a recorder that can capture ProRes RAW from two of my cameras, but there is no support for lens corrections, so only large lenses adapted from DSLR or PL mount are really suitable. All my cameras have log.

So at present, when it comes to video, log is really the base for grading and the process where people have established tools.

 
HEIC is not HEIF
"What’s the difference between HEIC and HEIF?

HEIF is the name for the standard — High Efficiency Image Format — while HEIC is Apple’s chosen file name extension. The terms are basically interchangeable because the High Efficiency Video Compression (HEVC) standard, also known as H.265, is the base for both."

https://www.adobe.com/creativecloud/file-types/image/raster/heif-file.html

You don't know the industry lingo, probably because you don't know the industry.

--
http://www.libraw.org/
 
and HLG is not supported by monitors for editing, so it remains a live broadcasting tool
Yet again you post something that is blatantly wrong and can be easily disproven in 20 seconds of Googling:

https://www.eizo.com/products/coloredge/cg3145/

It took 20 seconds to find an example of a grading reference monitor that supports HLG.

Elsewhere you state "other than some expensive ones" - well you WERE talking about reference monitors for grading, weren't you? They're all expensive, whether for PQ or HLG.

If you don't care about being reference-grade, then do what the budget colorists do and use an LG C2 for grading (there is video of one colorist using it as an example) - it supports HLG and PQ and is very reasonably priced for the image quality it delivers. It appears the C2 will even let you force HLG behavior on input signals that lack appropriate metadata (I'm now even MORE tempted to buy one...).

I have no problem displaying HLG on my Vizio P65-F1, which is now quite old.
 
the next-lowest brightness seems to be limited by the duty cycle/spatial extent of their dithering setup.

-h
This is pretty common, not just for OLEDs, but LEDs in general.

Many years ago (before WS2812s became readily available/common), I was experimenting with I2C-controllable RGB LEDs for Christmas lights. Since the "native" LED PWM was linear, I implemented a rudimentary gamma curve ( https://github.com/Entropy512/I2C_RGB/blob/master/firmware/gammacurve_8to12.h ) - and yes, that shows one great example of why the sRGB linear foot/toe exists: I didn't use it and thus wasted some code values by having multiple input values map to 0, and 3 input values map to 1 output.
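
The linked table can be generated along these lines (a sketch reconstructed from the description, not the actual script; a plain 2.2 power curve with no linear toe, as described):

```python
def make_lut(gamma=2.2, in_bits=8, out_bits=12):
    """8-bit input -> 12-bit PWM value via a plain power-law gamma curve."""
    in_max, out_max = (1 << in_bits) - 1, (1 << out_bits) - 1
    return [round(((i / in_max) ** gamma) * out_max) for i in range(in_max + 1)]

lut = make_lut()
print(lut[:8])  # [0, 0, 0, 0, 0, 1, 1, 2] -- several inputs collapse to 0
                # or 1, the wasted codes an sRGB-style linear toe avoids
```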

To keep the core soft-PWM code running on 8-bit values, I wrapped Atmel's software PWM app note implementation with a sigma-delta modulator to achieve higher bit depths. This is kind of similar to the FRC implementations of many monitors (temporal dithering, gaining more luminance resolution at the cost of frame rate).
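
A first-order error-feedback (sigma-delta) stage for pushing a 12-bit level through an 8-bit PWM might look like this (my own minimal sketch, not Atmel's app-note code):

```python
def sigma_delta_12_to_8(target_12bit, frames):
    """Temporally dither a 12-bit level through an 8-bit PWM output."""
    acc, out = 0, []
    for _ in range(frames):
        acc += target_12bit          # accumulate the 12-bit target
        out.append(acc >> 4)         # emit the nearest 8-bit value
        acc -= (acc >> 4) << 4       # carry the residual (low 4 bits) forward
    return out                       # time-average approaches target / 16

print(sigma_delta_12_to_8(0x123, 8))  # mostly 0x12, occasionally 0x13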

The problem was that at very low brightness (1/4096 linear), the required duty cycle was so low that even with the soft-PWM's fastest refresh rate, the LED would visibly flicker.

I could probably have solved it with an analog LPF on the output, but it wasn't long before dedicated silicon (WS2812) with much better capability became available.
 
HEIC is not HEIF
"What’s the difference between HEIC and HEIF?

HEIF is the name for the standard — High Efficiency Image Format — while HEIC is Apple’s chosen file name extension. The terms are basically interchangeable because the High Efficiency Video Compression (HEVC) standard, also known as H.265, is the base for both."

https://www.adobe.com/creativecloud/file-types/image/raster/heif-file.html
The image file format ISO/IEC 23008-12:2022 also includes AVC, JPEG, VVC and EVC, and in addition there is an open format based on AV1 that is competing in the same space.

Apple's HEIC has fees, so it is not necessarily set for global domination.

All of those are high-efficiency image formats, as HEIF is a container name, not a codec name; potentially someone comes along tomorrow and adds another codec to the container, as happened with MP4 and other MPEG containers.

There are several MIME types that broadly fall into this category, and I was not necessarily referring to Apple HEIC.
 
Neither the editing monitor nor, for that matter, any monitor other than some very expensive ones supports HLG at all.

When you play back HLG on a standard monitor, you still need the monitor to support the BT.2020 color space, and there are some white point issues.

So in practical terms it is not a viable option. I have shot HLG video since 2016; here is an example:
The issue was that I could do practically no color correction, as it would look totally off. I ended up using my TV, but just to test once done, as the computer could not drive an HLG device. In fact, even that video will show a recreated version from YouTube, not the original, which will only play back on a TV set.

Computers really only support HDR10; that is where we are right now, and the BBC has a specialised stream for HLG in their iPlayer.
In practice, that example video looks very good full screen on my 9-year old Dell monitor.

Don
 
and HLG is not supported by monitors for editing, so it remains a live broadcasting tool
Yet again you post something that is blatantly wrong and can be easily disproven in 20 seconds of Googling:

https://www.eizo.com/products/coloredge/cg3145/

It took 20 seconds to find an example of a grading reference monitor that supports HLG.

Elsewhere you state "other than some expensive ones" - well you WERE talking about reference monitors for grading, weren't you? They're all expensive, whether for PQ or HLG.

If you don't care about being reference-grade, then do what the budget colorists do and use an LG C2 for grading (there is video of one colorist using it as an example) - it supports HLG and PQ and is very reasonably priced for the image quality it delivers. It appears the C2 will even let you force HLG behavior on input signals that lack appropriate metadata (I'm now even MORE tempted to buy one...).

I have no problem displaying HLG on my Vizio P65-F1, which is now quite old.
That is in fact a reference monitor, as I wrote, with a price point near €5,000.

Grading with a TV requires a card that supports HLG, and unfortunately there are none at the consumer level.

Displays and computers are set up for HDR10; they do not deal well with the HLG construct.

I am going to put you on the ignore list, as your only purpose is to inflame discussions and go off topic.
 
In practice, that example video looks very good full screen on my 9-year old Dell monitor.

Don
YouTube creates different versions of the file, with an adaptation that sometimes works and sometimes does not.

If you click on "Stats for nerds" you can see what is really running on your computer.

 
It has happened, with Adobe introducing experimental support for AVIF
The new HDR mode works as advertised in ACR (P3, 32-bit, Win 11). The way I look at it, it is just a way to ensure that the pipeline delivers the full 10 bits per channel properly to monitors that are so capable. Mine is, and it has a little less than a 10-stop contrast ratio once calibrated.

Incidentally, playing around cycling the HDR button on and off shows a weakness of Adobe's regular SDR 8-bit squeeze: now I understand the reason for the horrible reddish tinge in some of my blue skies.

Jack
 
It has happened, with Adobe introducing experimental support for AVIF
The new HDR mode works as advertised in ACR (P3, 32-bit, Win 11). The way I look at it, it is just a way to ensure that the pipeline delivers the full 10 bits per channel properly to monitors that are so capable. Mine is, and it has a little less than a 10-stop contrast ratio once calibrated.

Incidentally, playing around cycling the HDR button on and off shows a weakness of Adobe's regular SDR 8-bit squeeze: now I understand the reason for the horrible reddish tinge in some of my blue skies.

Jack
Good news that it works well on Windows.

One of the things that confuses me, and that I need to double-check, is the output.

Normally HDR10 has the BT.2020 gamut and a BT.2100-compliant transfer function with a specific matrix.

Most devices around have pretty low coverage of the BT.2020 gamut: my screen is around 72%, and even most HDR TV sets are not running close to 90%. However, many devices are able to cover P3, although there is a difference between DCI-P3 and Display P3. For video you have a set of procedures to work all the way from P3 and end up in an HDR container that is still within the P3 gamut. This ensures that what you edit on your screen is what you will see when you share it with an audience that has the same capabilities.

What I am not clear on is how I am going to work with the Adobe tools, which are mostly set up for Adobe RGB or ProPhoto RGB, to achieve a similar outcome.

I understand you can output sRGB, P3, or BT.2020 gamut with an HDR transfer function; however, only BT.2020 / BT.2100 is an expected combination.

AVIF is supported by Google Chrome and some other browsers, so it will be interesting to see how the whole thing plays out, as AVIF is set to become the JPEG replacement for browsers. Today, when I load some test images with various combinations, some display a question mark, which means the browser was not able to decode them.
 
