10-bit video and HDR video?

MoistBurger
Can anyone tell me: if you record 10-bit video on your camera, could that then be categorized as HDR content?

Is HDR content actually 10-bit video?

And could you see the difference between 8-bit and 10-bit video on a regular television?

Just curious...
 
HDR video is such a new thing that there are no standards yet. It is a 10-bit file. Here is a quote from a review of a Samsung 10-bit HDR TV:


"...there’s no native HDR content available to consumers right now. But Samsung provided a USB stick containing UHD HDR clips of The Life Of Pi and Ridley Scott’s Exodus, remastered by Fox into HDR."

I wonder if it is possible to show 10-bit UHD video on an HDR TV via HDMI. One could use a 10-bit graphics card or a 10-bit 4K video camera like the Panasonic GH4.
 
Can anyone tell me: if you record 10-bit video on your camera, could that then be categorized as HDR content?
Nope.
Is HDR content actually 10-bit video?
Nope.
And could you see the difference between 8-bit and 10-bit video on a regular television?
Yup.

Don't confuse higher bit depth with higher dynamic range, though. A higher bit depth will simply make both colour gradation and high dynamic range look better, more natural.

Actual HDR TVs and video monitors will emerge eventually, hopefully soon, but that's another story.
 
To add a twist to HDR UHD, just after viewing some wonderful Sony & Samsung UHD LCDs, I looked at an OLED and went WOW!

It is the best-looking flat panel I have ever seen. You might say an HD OLED is as good as a UHD LCD and will be more compatible for the next few years - BUT it costs as much as a slightly larger UHD LCD!

HDR looks promising; it's a shame more HD sets are not 10-bit compatible!
 
Actual HDR TVs and video monitors will emerge eventually, hopefully soon, but that's another story.
They are here already; it's just the standard that is not yet ready. It would be interesting to see those HDR video files - are they 10-bit, and what kind of codec specs do they have?
 
Samsung and others are certainly developing prototypes of HDR displays, but I suspect that the video they exhibit is juiced up in post to enhance highlights that simply blow out in any actual camera.

The Panasonic VX870 and WX970 have an HDR mode, which supposedly captures frames at two different exposure settings, but the effect appears not much different from the "backlit" mode in other camcorders: some added detail in shadows, a bit less blow-out of highlights, but everything looks a bit soft or foggy.
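
For what it's worth, here is a rough Python sketch of the general idea behind merging two exposures - my own simplification, not Panasonic's actual algorithm. Each pixel is taken mostly from whichever frame exposes it closest to mid-grey, which lifts shadows and tames highlights but also flattens contrast, which may be why the result looks a bit soft or foggy.

import numpy as np

def merge_two_exposures(dark, bright, sigma=0.2):
    """Blend a short (dark) and a long (bright) exposure of the same scene.

    dark, bright: float arrays in [0, 1] with the same shape.
    Each pixel is weighted by how close its value is to mid-grey (0.5),
    so well-exposed pixels dominate the blend.
    """
    def weight(img):
        # Gaussian weight centred on mid-grey; near-black and near-white
        # pixels contribute very little.
        return np.exp(-((img - 0.5) ** 2) / (2.0 * sigma ** 2))

    w_dark, w_bright = weight(dark), weight(bright)
    total = w_dark + w_bright + 1e-6  # avoid division by zero
    return (w_dark * dark + w_bright * bright) / total

# Synthetic scene with deep shadows and a highlight that clips in the long exposure.
scene = np.linspace(0.0, 4.0, 8)        # linear scene luminance
dark = np.clip(scene * 0.25, 0, 1)      # short exposure keeps the highlights
bright = np.clip(scene * 1.0, 0, 1)     # long exposure keeps the shadows
print(merge_two_exposures(dark, bright))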

True-to-life HDR video would mean a viewing screen that, if portraying the sun, would be blindingly bright and deliver enough UV to burn your skin and blind your eyes. You'd need to wear those old 3D glasses just to cut down the glare.

Most of the time, such extremes of contrast are unnecessary or become a tedious special effect.

"Hey, Honey. Look at this HDR video of our cat! Now look at this video of a candle. Next scene: our house on fire."

The extra file size needed for such extremes would use up oceans of memory or bandwidth. The processing would throw off enough heat to make you sweat, too.

The price? As with yachts, if you need to ask about price you can't afford one. By comparison, Samsung's $130k 120" 4k screen is a pauper's toy. If people sell their house to buy one, at least they can live inside the delivery crate.

Cheaper to simply put an LED atop the screen and have it emit a flash every time a light burst is supposed to occur. Or maybe have yellow and red stars appear on the screen, as in cartoons and comics. Most video editing packages include filter or PIP effects that imitate flashes or beams. Most viewers won't notice the difference and, as with true HDR or 3D, will soon tire of videos contrived explicitly to convey the effects.
 
True-to-life HDR video would mean a viewing screen that, if portraying the sun, would be blindingly bright and deliver enough UV to burn your skin and blind your eyes. You'd need to wear those old 3D glasses just to cut down the glare.
This is exactly right. When you have more bits available to encode the brightness of each color channel, you can put them to work in two different ways:
  • Increase the number of steps available within the range of brightnesses that existing TVs are capable of showing, or
  • Come up with new display technology that can display brighter whites and darker blacks.
Generally speaking, the new TVs aren't really increasing the amount of available contrast all that much. They're already using local dimming to deepen the black levels, and they really can't increase the white levels because the whole industry is built around "white" not being too overwhelming. Think of all the TV ads and graphics that show images on a white background - if that background became four times brighter then it would start to be very annoying. (When desktop publishing software established the standard of "white" being the color of paper they sealed the fate of truly bright displays, IMHO).

So that leaves the first option - increasing the number of steps available within the existing range of brightnesses. The issue as I see it is that 8-bit video is pretty much capable of displaying enough brightness gradations already. If you create an image with a continuous shade of blue, green or red from black to full-on colour, I can't reliably tell where the line between the colours encoded with brightness levels of 100 vs. 101 is, or where the demarcation between any other two adjacent colours is. Adding more steps between 100 and 101 isn't going to make the picture look any different to me.

So I'm really not seeing what the big deal is over 10-bit video for display purposes. It's valuable in a capture device because it gives you more latitude for adjusting the tone curve in post-production. But once you've encoded it for final output, I'm really not buying how those additional colour bits are going to do me any good.
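
To put rough numbers on that, here is a quick Python sketch (my own, using full-range values for simplicity) that quantises a smooth grey ramp at 8 and 10 bits. It only shows how many steps each bit depth offers and how small an 8-bit step already is - whether the extra 10-bit steps are actually visible is the question above.

import numpy as np

# A smooth grey ramp from black to full brightness.
ramp = np.linspace(0.0, 1.0, 100000)

for bits in (8, 10):
    levels = 2 ** bits                         # 256 for 8-bit, 1024 for 10-bit
    quantised = np.round(ramp * (levels - 1))  # map to integer code values
    step = 1.0 / (levels - 1)                  # brightness jump per code value
    print(f"{bits}-bit: {len(np.unique(quantised))} distinct levels, "
          f"step = {step:.5f} of full brightness")

# 8-bit: 256 distinct levels, step = 0.00392 of full brightness
# 10-bit: 1024 distinct levels, step = 0.00098 of full brightness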
 
So I'm really not seeing what the big deal is over 10-bit video for display purposes. It's valuable in a capture device because it gives you more latitude for adjusting the tone curve in post-production. But once you've encoded it for final output, I'm really not buying how those additional colour bits are going to do me any good.
An HDR TV has a normal mode with conventional Rec.709 colors and contrast for everyday use.

The HDR mode is for HDR material and has deeper colors and higher brightness for extreme highlights. A 10-bit HDR file is graded so that normal colors and contrast look the same as before, but some glowing parts of the image or extreme highlights like neon lights or sunsets are brighter. Gold glows and headlights are bright. The extra bits are for the extremes, and the normal colors still get enough bits for good gradation.

If the system has a more lifelike color/contrast palette, the file can be flatter and there is no need to always stretch colors to the monitor's black/white extremes to get a decent image. I can already see this to some extent with my high-contrast plasma TV: I grade my photos and videos quite flat, but they still look good and natural on the plasma. If I look at the same photos on normal low-contrast monitors, they look dull or too flat.

When displays get bigger and crisper it is easier to see banding in gradations and other artefacts. 10-bit HDR will help: it has 1024 levels for each channel, while 8-bit has 256 levels and normal Rec.709 video actually uses only 220 of them (16-235).
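
A quick sanity check of those numbers in Python (my own sketch; the 64-940 limited range for 10-bit video is the usual figure, but treat it as my assumption here):

# Usable code values per channel for full range vs. legal/video range.
ranges = {
    "8-bit full range":    (0, 255),
    "8-bit Rec.709 video": (16, 235),   # the 220 levels mentioned above
    "10-bit full range":   (0, 1023),
    "10-bit video range":  (64, 940),   # assumed limited-range values
}

for name, (lo, hi) in ranges.items():
    print(f"{name}: {hi - lo + 1} levels")

# 8-bit full range: 256 levels
# 8-bit Rec.709 video: 220 levels
# 10-bit full range: 1024 levels
# 10-bit video range: 877 levels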

Digital cinema has already moved to an enlarged color space (DCI), so movies in the theater will get "HDR" anyway, and television must follow that development. I wonder when the photo world will follow with enlarged color and contrast too. Will there soon be a 10-bit ecosystem - 10-bit JPEGs, monitors, graphics cards, etc. - for normal users as well?
 
The HDR mode is for HDR material and has deeper colors and higher brightness for extreme highlights. A 10-bit HDR file is graded so that normal colors and contrast look the same as before, but some glowing parts of the image or extreme highlights like neon lights or sunsets are brighter. Gold glows and headlights are bright. The extra bits are for the extremes, and the normal colors still get enough bits for good gradation.
I will be interested to see one of these to see how much difference it makes. If it really can display "whiter than white" then it may be a useful feature. The big question will be the availability of content to take advantage of it.
 
I will be interested to see one of these to see how much difference it makes. If it really can display "whiter than white" then it may be a useful feature.
As soon as you've got a high quality HDR monitor or TV that can do over 2000 nits (cd/m2), you could probably see a difference in dynamic range even with 8 bits and HD resolution. Higher bit depths together with higher resolution will just make things look even better, more lifelike, with less visible digital artefacts.
The big question will be the availability of content to take advantage of it.
Not really. Cameras that do up to 14 stops are nothing new any longer, and 10 to 12 stops is not unheard of even among some consumer-grade cameras - with 10-bit colour depth. Hollywood and indie filmmakers have been using those cameras for a while now. Film has a decent dynamic range, too. To enjoy the full potential of all this we just need better displays and new standards, and they are under development.
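
To put those numbers side by side, here's a rough Python sketch (my own arithmetic, with an assumed 0.05 nit black level): each stop doubles the contrast ratio, so a 14-stop camera and a 2000 nit display land in roughly the same ballpark.

import math

def stops_to_ratio(stops):
    # Each stop of dynamic range doubles the contrast ratio.
    return 2 ** stops

def display_stops(peak_nits, black_nits):
    # Approximate dynamic range of a display, expressed in stops.
    return math.log2(peak_nits / black_nits)

print(f"14-stop camera: {stops_to_ratio(14):,}:1 contrast")
print(f"10-stop camera: {stops_to_ratio(10):,}:1 contrast")
print(f"2000 nit display with 0.05 nit black: {display_stops(2000, 0.05):.1f} stops "
      f"({2000 / 0.05:,.0f}:1)")

# 14-stop camera: 16,384:1 contrast
# 10-stop camera: 1,024:1 contrast
# 2000 nit display with 0.05 nit black: 15.3 stops (40,000:1)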

Like I said, actual high dynamic range TVs and video displays will start appearing eventually, along with new gamma curves to re-grade video for them. There will also be an alternative gamma curve optimised for conventional Rec.709 displays - the same content for multiple systems.

There are already working prototypes, and the first commercial products will probably (hopefully) start appearing soon after this first (cash-bait) wave of 4K hype for the mainstream has passed us. Apparently the forthcoming Dolby Vision HDR standard, for example, will support up to 12-bit encoding. Looks like Samsung are already up to something tangible on their own, too.

Suppose that's another reason why I am not in a big hurry to jump for the first 4K TVs, monitors and even cameras. I'd rather not spend my money on just more megapixels for the sake of more megapixels. I'd rather have better bit depth and high dynamic range, too. Apparently the better, high dynamic range displays aren’t too far away any longer.

So meanwhile, I think I’ll just concentrate on making content with cameras that offer high enough dynamic range and good enough quality in general, budget and talent permitting. 4K resolution alone seems less relevant right now. For now. But I digress.

That’s a whole other story for a different thread. :)
 
I will be interested to see one of these to see how much difference it makes. If it really can display "whiter than white" then it may be a useful feature.
As soon as you've got a high quality HDR monitor or TV that can do over 2000 nits (cd/m2), you could probably see a difference in dynamic range even with 8 bits and HD resolution.
As I mentioned above, the issue is that 8-bit screens can't be made too bright because then any graphic that displays information on a white background will have too much glare. The industry has standardized around "white" on 8-bit displays as being a valid background colour, which prevents it from being rendered as bright as real-life objects such as lights.
The big question will be the availability of content to take advantage of it.
Not really. Cameras that do up to 14 stops are nothing new any longer, and 10 to 12 stops is not unheard of even among some consumer-grade cameras - with 10-bit colour depth.
Of course there's no technical reason that content can't be delivered; the obstacle is economics and the motivation of the industry. 3D is an example of a format that hasn't lived up to expectations, and HDR video is arguably less compelling for most typical consumers. It remains to be seen whether consumers will have ready access to HDR content from major TV and film producers that would make the format commercially viable.

Even as a content producer, I don't think I'd be too eager to generate HDR productions until I was sure the format had staying power. There's a lot of change going on in the industry and I'm skeptical that all of it will stick.
 
From one HDR TV review:


"While I'm still an unabashed fan of 4K UHD's effect on picture quality, after spending a week with Samsung's 65JS9500 I've become convinced that HDR actually has even more of an impact on TV picture quality."
 
I will be interested to see one of these to see how much difference it makes. If it really can display "whiter than white" then it may be a useful feature.
As soon as you've got a high quality HDR monitor or TV that can do over 2000 nits (cd/m2), you could probably see a difference in dynamic range even with 8 bits and HD resolution.
As I mentioned above, the issue is that 8-bit screens can't be made too bright because then any graphic that displays information on a white background will have too much glare. The industry has standardized around "white" on 8-bit displays as being a valid background colour, which prevents it from being rendered as bright as real-life objects such as lights.
You're talking about a slightly different thing, it seems, but never mind - I think I've already depleted my geekery and bickering quota for this holiday weekend. ;)

Just wait and see. Literally. Meanwhile, let's just enjoy the holidays.
3D is an example of a format that hasn't lived up to expectations, and HDR video is arguably less compelling for most typical consumers.
3D and the forthcoming HDR video are like a toilet and a Ferrari: they have about as much in common - you enjoy both in a sitting position. You don't have to embrace HDR video, but don't be surprised if quite a few people do, eventually.

Other than that, I don't see much point in arguing about things we don't even have yet. Let's just wait and see. I for one am looking forward to the new HDR video standards and devices, hopefully soon-ish. You don't have to - that's quite okay.
It remains to be seen whether consumers will have ready access to HDR content from major TV and film producers that would make the format commercially viable.
Looks like the producers had little difficulty providing the same content first as a regular DVD, then as a Blu-ray, and now as downloadable or streaming HD video. Yet another chance to cash in again on the same content with minimal effort - I think they'll take it, like they have done ever since VHS videotapes. Today anyone can make their own Blu-ray discs. The new HDR content is likely to downgrade gracefully to lower standards anyway.
Even as a content producer, I don't think I'd be too eager to generate HDR productions until I was sure the format had staying power. There's a lot of change going on in the industry and I'm skeptical that all of it will stick.
You do just that. Meanwhile, like I said above, I won't be 'generating any special HDR productions' either. I'll just keep on shooting the best stuff I can, with gear that natively has as much dynamic range and colour depth as possible, within my budget.

I don't see much point in worrying about re-grading the footage for new gamma curves before there is a need for it. I just try to make sure that the footage has enough latitude for a possible re-grade later. Hopefully compelling enough content, too. That doesn't really hurt today, either.

Although the compelling content is the actual tricky part for sure, as always.

I was pretty sceptical about the gimmicky 3D video and TVs at the time, too, but I think this is different, and in this case my glass is half full rather than half empty. We'll see, eventually.
 
Just wait and see. Literally. Meanwhile, let's just enjoy the holidays.
Well said!
 
You run up against the human biology barrier. Even though 48 kHz sampling is available and used in music, most humans only hear up to about 15 kHz, which by Nyquist needs only about a 30 kHz sample rate - even less than 44.1 kHz. What is good is that when digital effects/processing are done at 48 kHz, fewer artifacts appear, so it's the post-production work that benefits from the reduced noise and artifacts.

Same with video: humans are quite insensitive to colour hues; we are far more sensitive to shades of grey. So it's not the TV screen that benefits that much from 10-bit - the post-production does. Looking more natural is not because we see more colour depth (we only see about 8 bits' worth), but because there is more room to adjust the picture with fewer artifacts introduced by processing (shadow/highlight tweaks) and more detail (DR) in the shadows. It's the DR that we notice much more than the colour.

4K is best for production and post, with less impact on the final film compared to 1080p. Most folks aren't even watching 1080p on their TV, as the signal from the provider is rarely more than 720p (even with satellite). Most movies on a cinema screen don't have the sharpness of 1080p, meaning it's sharper on their TVs at home. Blu-ray disc is about the only time one even has the chance to see a difference on a 1080p TV, so I am not convinced that 4K TV is needed presently (except for post-production work). ~ JM2C
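
Rough numbers behind both points, as a quick Python sketch (my own figures, not from any spec). The first part is the Nyquist arithmetic for hearing; the second shows how much data 4:2:0 chroma subsampling saves by storing colour at lower resolution than luma, which is how codecs exploit our insensitivity to colour detail.

# Audio: capturing frequencies up to f_max needs a sample rate of at least 2 * f_max.
f_max_hearing = 15_000  # Hz, roughly the upper limit of typical adult hearing
print("Nyquist rate:", 2 * f_max_hearing, "Hz (vs. 44,100 or 48,000 Hz in practice)")

# Video: we resolve luma detail far better than colour detail, so 4:2:0
# subsampling keeps full-resolution luma but quarter-resolution chroma.
width, height = 1920, 1080
luma = width * height
chroma = 2 * (width // 2) * (height // 2)   # two chroma planes, half resolution each way
full = 3 * width * height                   # 4:4:4, no subsampling
print(f"4:2:0 stores {luma + chroma:,} samples per frame vs. {full:,} for 4:4:4 - a 50% saving.")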

--
"Shoot Long and Prosper"
 
Here's a short article on HDR. It appears the BBC proposal is to use a different log curve, compressing more data into the highlights and using slightly more of the available codes within 10-bit. Lmax is 6.4, which implies roughly 6x the brightness before clipping?

See page 8
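
For illustration only, here is a small Python sketch of a hybrid log-gamma style curve of the kind the BBC proposal describes (the constants are the usual published HLG ones from ITU-R BT.2100; treat the exact values as an assumption on my part). The square-root segment covers normal scene tones, while the log segment squeezes the highlight stops into the top of the 10-bit code range.

import math

# Hybrid log-gamma OETF constants (ITU-R BT.2100 HLG).
A, B, C = 0.17883277, 0.28466892, 0.55991073

def hlg_oetf(E):
    # Map normalised scene light E in [0, 1] to a signal value in [0, 1].
    if E <= 1.0 / 12.0:
        return math.sqrt(3.0 * E)            # gamma-like segment for normal tones
    return A * math.log(12.0 * E - B) + C    # log segment for the highlights

for E in (1.0 / 12.0, 0.25, 0.5, 1.0):
    signal = hlg_oetf(E)
    print(f"scene light {E:5.3f} -> signal {signal:.3f} -> 10-bit code {round(signal * 1023)}")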

 
