For those of you without Lightroom or an HDR monitor, here are a few histograms of one of my images with different screen settings. Image from a GFX 100s using the Acros simulation.
This first screenshot is without the HDR button pressed and is the same for all of my MacBook Pro's screen settings.
This next image is with my screen set to emulate sRGB:
Two things to note. First, there is a new red-colored area on the histogram labeled "HDR." This shows the image values that are above the SDR range. Unfortunately, the red means those values are outside the sRGB range and will display as pure white. But you can see that there are real values there, potentially displayable.
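If it helps to see that split spelled out, here is a rough sketch of how I think about it (my own illustration, not anything from Lightroom): macOS's extended-range representation normalizes linear values so that 1.0 is SDR reference white, and anything above 1.0 is what lands in that red "HDR" region and clips to white on a plain sRGB screen.

```swift
import Foundation

// Sketch only: assume linear luminance values normalized so that 1.0 equals
// SDR reference white (the EDR convention on macOS). Anything above 1.0
// falls into the histogram's red "HDR" region and clips to pure white on a
// plain sRGB display.
func splitAtSDRWhite(_ luminances: [Double]) -> (sdr: [Double], hdr: [Double]) {
    let sdr = luminances.filter { $0 <= 1.0 }
    let hdr = luminances.filter { $0 > 1.0 }
    return (sdr, hdr)
}

// Made-up sample: three pixels within SDR, two above SDR white.
let sample = [0.2, 0.8, 1.0, 1.6, 2.4]
let split = splitAtSDRWhite(sample)
print("SDR pixels: \(split.sdr.count), HDR pixels: \(split.hdr.count)")
```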
The second thing to note--and this is important!--is that the histogram in the SDR range, although visually squeezed, is more or less unchanged. You can see the same contours in both, although I'm not clear why the rise at the end of the non-HDR histogram is not showing.
Now, changing the screen display to P3 (your new-ish TV probably uses P3) greatly enlarges the range of luminance values available to be displayed. This is exactly the same histogram as in the HDR sRGB example, but the red is gone and all but the very last bit of the histogram is displayable. The values to the left of the white line (the SDR values) are unchanged. Areas of the image that previously were pure white now have detail: about 1.25 stops more highlight "headroom." (Each bar in the right part of the histogram represents one stop.)
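If you want to sanity-check what 1.25 stops means in linear terms, each stop is a doubling of luminance above SDR white. A quick back-of-the-envelope sketch (my own arithmetic, nothing Lightroom reports directly):

```swift
import Foundation

// Each stop of headroom is a doubling of luminance above SDR white,
// so 1.25 stops works out to roughly a 2.4x linear ratio.
let extraStops = 1.25
let linearRatio = pow(2.0, extraStops)   // ~2.38
print(String(format: "%.2f stops of headroom = %.2fx above SDR white", extraStops, linearRatio))
```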
If I set my MacBook Pro to its best screen setting (XDR), the highlight area enlarges even more:
In the case of this image, that last little bit of the histogram will now show. And there is still another 1.75 stops of headroom that could have displayed data, had my file contained any.
Of course, in all of these cases, you can use the normal tools to shift and expand/contract the histogram as you see fit (moving data out of the grayed-out area, etc.). Your usable area will, as you can see, vary with the monitor (or simulation) being used.
That all seems very easy to understand. Now here's the confusing part. The headroom area varies not only with the monitor but also with the brightness setting on that monitor. For example, if I lower my MacBook Pro's brightness (while Display is set to XDR) to slightly under its halfway point, the HDR area on the histogram grows to four stops rather than the three shown. If I then brighten the screen, the HDR area begins to shrink, reaching two stops at my computer's brightest setting.
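For anyone who wants to poke at this outside Lightroom, macOS itself reports a brightness-dependent headroom value. Here is a small AppKit sketch (my own test code, nothing to do with Lightroom) that reads the headroom reported for the main display; on my understanding, lowering the SDR brightness slider raises the reported headroom, which seems to match what the histogram is doing:

```swift
import AppKit

// Read the EDR headroom macOS reports for the main display. The value is a
// linear ratio relative to SDR white; log2 converts it to stops. On an XDR
// display the current value changes as the brightness slider moves.
if let screen = NSScreen.main {
    let current = Double(screen.maximumExtendedDynamicRangeColorComponentValue)
    let potential = Double(screen.maximumPotentialExtendedDynamicRangeColorComponentValue)
    print(String(format: "Current headroom: %.2fx (%.2f stops)", current, log2(current)))
    print(String(format: "Potential headroom: %.2fx (%.2f stops)", potential, log2(potential)))
}
```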
Thus, a brighter screen setting will clip the highlights. I tested this by increasing the exposure of the image in Lightroom until the rightmost data touched the right edge, giving a full-range histogram. At the low brightness setting, I could see detail in the highlight areas. As I brightened the screen, the highlights became increasingly blown, as you might expect, knowing that the usable highlight headroom is shrinking. I do not observe this when changing the screen brightness with the HDR button turned off.
I have not yet exported an HDR file and tried this test again. If this highlight behavior carries through to the final image or movie file, it will obviously make it much harder to know how your image will look when displayed. I'm hoping someone here can help clarify what is going on and what the solution might be.