# 8 Bit or 16 Bit, sRGB or Pro Photo

Started Sep 11, 2019 | Discussions thread
Re: Here's where 30 bit monitor path can make a BIG difference

knickerhawk wrote:

technoid wrote:

knickerhawk wrote:

There's no point in asking other questions when you haven't answered the one I asked. Again: why don't you see any difference between the 8-bit encoded version and the 16-bit encoded version using your "high bit display path"? The synthetic gradients in your test image should be the ideal opportunity to see the superiority of the version encoded at a deeper bit depth, yet you don't see any difference even at 800% apparently. Why?

I alluded earlier to the issue of 8 bit monitors that don't have internal LUTs and are dependent on the video card's LUTs. Here's a bit more detail that may explain what's going on, and also why pixelgenius and I see no effects with our 30 bit monitors. First, let's do a little math:

What's the deltaE2000 between ProPhoto RGB (30,30,30) and (31,31,31)? Turns out it is only 0.42 with a "perfect" display.

OK, now the problem with card drivers is that they have to skip RGB steps from time to time. The exception, where the LUTs are totally monotonic and never skip or duplicate a value as the 8 bit RGB values run from 0 to 255, is rare, and essentially impossible without internal LUTs in the display.
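The skipping and duplicating is easy to demonstrate with a short sketch. This is purely illustrative: the 1.02 gamma tweak below is a made-up calibration adjustment, not anyone's actual profile, but any non-identity curve squeezed through an 8-bit LUT behaves the same way, losing some output codes where the curve is shallow and skipping others where it is steep:

```python
# Simulate an 8-bit video-card LUT applying a small calibration curve.
# The 1.02 gamma tweak is a hypothetical adjustment chosen only to
# illustrate the effect of quantizing a curve to 8-bit entries.
gamma = 1.02
lut = [round(255 * (v / 255) ** gamma) for v in range(256)]

produced = set(lut)
skipped = sorted(v for v in range(256) if v not in produced)
duplicated = 256 - len(produced)  # inputs that collided on one output

print(f"output codes skipped:    {len(skipped)} (e.g. {skipped[:5]})")
print(f"output codes duplicated: {duplicated}")
```

Since the LUT maps 256 inputs onto the same 0-255 range, every duplicated output necessarily forces a skipped one elsewhere, which is exactly the "missing codes" problem described above.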

So let's look at the case where 31 is skipped. What's the deltaE2000 between (30,30,30) and (32,32,32)? Hey, it's still under 1, coming in at 0.85.

So it still shouldn't be a problem, right?
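Both neutral-axis figures above can be sanity-checked with a few lines. For two neutral colors (a* = b* = 0), the full CIEDE2000 formula collapses to ΔL*/S_L, because every chroma and hue term vanishes. A minimal sketch, assuming ProPhoto's simple 1.8 gamma encoding and treating the gray axis as the D50 neutral (so Y is just the linearized code value):

```python
def prophoto_gray_to_Lstar(code):
    """CIE L* for a neutral ProPhoto RGB code (0-255, simple 1.8 gamma)."""
    Y = (code / 255) ** 1.8  # relative luminance on the gray axis (Yn = 1)
    return 116 * Y ** (1 / 3) - 16 if Y > 0.008856 else 903.3 * Y

def de2000_neutral(c1, c2):
    """CIEDE2000 between two neutral grays: reduces to dL* / S_L."""
    L1, L2 = prophoto_gray_to_Lstar(c1), prophoto_gray_to_Lstar(c2)
    Lbar = (L1 + L2) / 2
    SL = 1 + 0.015 * (Lbar - 50) ** 2 / (20 + (Lbar - 50) ** 2) ** 0.5
    return abs(L1 - L2) / SL

print(de2000_neutral(30, 31))  # one-step difference
print(de2000_neutral(30, 32))  # skipped-step difference
```

This lands within a couple of hundredths of the 0.42 and 0.85 quoted above; the small residual depends on the exact white point and rounding conventions used.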

But the problem is much worse than that. The controller card's LUTs are calibrated independently. The R, G, and B channels don't follow each other exactly.

So let's examine the case where just the G channel gets bumped up from 30 to 31. One would expect this not to be a problem. But surprise: it is.

The deltaE2000 from (30,30,30) to (30,31,30) is … wait for it … 2.38!

With in-monitor calibration (usually 10 bits or more per channel), even an 8 bit video card doesn't have any missing or duplicated steps, so the actual error at the monitor is usually close to half a bit. Big improvement right there. Now make the data path a full 10 bits/channel with no missing codes and a reasonably deep monitor LUT, and the problem is solved.

Conversion of 16 bit to 8 bit neutral RGBs in this kind of system always keeps the R, G, and B values equal to each other on the black and white balls. The largest error from rounding 16 bits down to 8 is a deltaE2000 of 0.28.
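That worst case can likewise be estimated for the neutral axis with a brute-force scan. Same caveats as before: this is a sketch that assumes pure grays, ProPhoto's simple 1.8 gamma, and my own rounding convention, so the exact figure can differ by a few hundredths from the 0.28 quoted:

```python
def lstar(y):
    """CIE L* from relative luminance on the neutral axis (Yn = 1)."""
    return 116 * y ** (1 / 3) - 16 if y > 0.008856 else 903.3 * y

def sl(l1, l2):
    """CIEDE2000 lightness weighting S_L for two neutral samples."""
    lbar = (l1 + l2) / 2
    return 1 + 0.015 * (lbar - 50) ** 2 / (20 + (lbar - 50) ** 2) ** 0.5

worst = total = 0.0
for v16 in range(65536):
    v8 = round(v16 * 255 / 65535)        # 16-bit code rounded to 8 bits
    L16 = lstar((v16 / 65535) ** 1.8)    # 1.8 gamma assumed for ProPhoto
    L8 = lstar((v8 / 255) ** 1.8)
    de = abs(L16 - L8) / sl(L16, L8)     # neutral-axis dE2000
    worst = max(worst, de)
    total += de

print(f"max  dE2000: {worst:.2f}")
print(f"mean dE2000: {total / 65536:.2f}")
```

The max should land in the same ballpark as the 0.28 above, and the mean in the same ballpark as the 0.06 average reported later in the thread.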

This is why pixelgenius and I don't see any visual changes.

Thanks so much for the detailed and well-written explanation. The key here for me is the surprisingly low deltaE for going from 16 to 8 bits.

To be totally expected, for those of us who have tested this correctly.

Looking at my monitor and seeing the obvious differences in synthetic ramps always led me to blame the problem on the quantization error rather than something post-conversion in the output to the monitor.

Yeah, another assumption. 😱

One thing that I'm still trying to wrap my head around is why the hiccup in the graphics card's LUT would so obviously affect the 8-bit encoded version but not the 16-bit version, even though the 16-bit version will of course trip over the same bad RGB value(s) in the LUT. Thanks again. This has been helpful.

So, more data to back up what both of us have tried to explain to you: the simple colorimetric fact that converting the high bit data to 8 bits doesn't do squat, visually, to the actual data at any zoom ratio!

I took the two gray-ball images constructed by Bill. One is 16-bit Pro Photo; the dupe is converted to 8 bits per color. Dither is OFF.

Screen capture: they appear identical, but...

I extracted all the colors from each, converted them to Lab, then computed a deltaE for every pixel!

Here is the report. Again, as those who've done such tests before expected, the differences confirm the two are visually identical:

Number of Samples: 263169

Delta-E Formula dE2000

Overall - (263169 colors)

--------------------------------------------------

Average dE: 0.06

Max dE: 0.25

Min dE: 0.00

StdDev dE: 0.05

95th %ile dE: 0.10

Best 90% - (236851 colors)

--------------------------------------------------

Average dE: 0.04

Max dE: 0.14

Min dE: 0.00

StdDev dE: 0.04

Worst 10% - (26318 colors)

--------------------------------------------------

Average dE: 0.17

Max dE: 0.25

Min dE: 0.14

StdDev dE: 0.02

--------------------------------------------------

263169 pixels (full rez from Bill's balls). Average dE of 0.06!

The damn worst offender is equally INVISIBLE at 0.25 dE.

As both of us tried to explain to you, the reason they appear identical on-screen is that they are visually identical, at least on a display system that isn't screwed up: one that is high bit, ideally calibrated, and profiled.

Understand?
