HDMI vs DVI-D

haiiyaa

Hi,

My monitor can either display 8 bpc on HDMI, or 32-bit with DVI-D. Maybe my eyes are deceiving me, but DVI-D seems to show more color depth. I thought HDMI was the newer standard and would show more color depth?
 
Should be no difference in video quality. HDMI carries the audio signal in addition to the video, whereas DVI does not.
 
I know that. I just didn't think that 8bpc color depth was equal to 32-bit color depth.
 
8bpc = 24bit (3 colors x 8 bits)

32-bit color usually means the extra 8 bits are for transparency, at least according to this:

http://en.wikipedia.org/wiki/Color_depth
I wouldn't think there would be any difference between HDMI and DVI, but it all depends on the implementation details related to the video card outputs and the monitor inputs.
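
To put numbers on that, here's a quick Python sketch of the arithmetic (just illustrative):

# 8 bits per channel across R, G, and B gives 24-bit color; "32-bit" typically
# adds an 8-bit alpha (transparency) channel, so the count of displayable
# colors is the same either way.
bits_per_channel = 8
rgb_bits = bits_per_channel * 3   # 24-bit color
rgba_bits = rgb_bits + 8          # 32-bit = 24-bit RGB + 8-bit alpha
print(rgb_bits, rgba_bits, 2 ** rgb_bits)  # 24 32 16777216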
 
Each generation of the HDMI standard allows for more bandwidth (and thus higher resolutions, refresh rates, and bit depths; rough numbers below). For 10-bit color (30-bit total), you've typically needed to go to DisplayPort. So HDMI vs. DVI-D is largely just a matter of choosing the connector ends, and I've used cables with one connector type on each end. PCs usually don't implement the audio transfer over these links.

To the OP, what Eric writes is correct: 8 bpc = 24-bit = 32-bit color - the same thing as far as displayed colors go.
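
As a rough illustration of the bandwidth point (my own simplified arithmetic - it ignores blanking intervals and link-encoding overhead, which add a fair amount in practice):

# Uncompressed video bandwidth: pixels/frame x frames/s x bits/pixel.
def video_gbps(width, height, refresh_hz, bits_per_channel, channels=3):
    return width * height * refresh_hz * bits_per_channel * channels / 1e9

print(video_gbps(1920, 1080, 60, 8))   # ~2.99 Gbit/s at 8 bpc (24-bit)
print(video_gbps(1920, 1080, 60, 10))  # ~3.73 Gbit/s at 10 bpc (30-bit)

Higher bit depths and refresh rates push past what older HDMI/DVI links can carry, which is why 10-bit has usually meant DisplayPort.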
 
What video card (or chipset) do you have?

With some chipsets and monitor combinations, the drivers may treat HDMI differently. Here's an article on how to fix common issues with Nvidia and AMD cards when using an HDMI output:

https://pcmonitors.info/articles/correcting-hdmi-colour-on-nvidia-and-amd-gpus/

For example, until recently, most Nvidia drivers had a "bug" where they would only output a range of 16-235 for each channel (versus a range of 0-255 for each channel) when some monitors were connected via HDMI.

Here's a test image (which I found linked in an article on the subject) that you can view to tell if you have the problem:

[image: four grayscale squares - two near-white shades on the left, two near-black shades on the right]
If you see 4 shades (2 shades of white in 2 squares on the left, and 2 shades of black in 2 squares on the right), then you do not have an issue.

But, if you see only 2 shades (one column of white on the left, and one column of black on the right), then you probably have the issue, where you're giving the display a "Limited" vs "Full" RGB output via HDMI.
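
If you can't pull up that image, here's a small Python sketch (using Pillow) that generates a similar four-square pattern; the exact shade values are my assumption, not necessarily the ones in the original image:

from PIL import Image, ImageDraw

# Four vertical squares: two near-white shades, then two near-black shades.
# Shade values are assumed for illustration; the original test image may differ.
shades = [255, 250, 5, 0]

img = Image.new("RGB", (400, 100))
draw = ImageDraw.Draw(img)
for i, v in enumerate(shades):
    draw.rectangle([i * 100, 0, (i + 1) * 100 - 1, 99], fill=(v, v, v))
img.save("rgb_range_test.png")

View the saved image full-screen over the HDMI connection; if adjacent squares merge into two flat columns, you're likely getting Limited RGB.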

Basically, when in Limited RGB mode, any channel value below 16 will be sent as 16, and any value above 235 will be sent as 235.

That's the "Limited" RGB range, designed for TVs.
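
Here's a minimal sketch of the clamping described above (a simplification - some drivers rescale the whole 0-255 range into 16-235 rather than hard-clamping):

def to_limited(value):
    # Clamp a full-range (0-255) channel value into the limited 16-235 range,
    # per the behavior described above.
    return max(16, min(235, value))

# Near-black and near-white shades collapse together, which is why the
# four-square test shows only two distinct columns when the issue is present:
print([to_limited(v) for v in (0, 5, 250, 255)])  # -> [16, 16, 235, 235]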

It's my understanding that the issue was fixed (or rather, made configurable) in some of the newer Nvidia drivers (a fix appeared in one of the beta drivers a while back), and you'll now find a dynamic range setting under a display's color settings, so you can select Full (0-255) vs. Limited (16-235) RGB output when using an HDMI-connected display.

But, it's possible it's still broken with some monitors.

Again, what video card (or chipset) are you using for outputting HDMI?

If Nvidia, then I'd make sure to grab the latest drivers.

http://www.nvidia.com/Download/index.aspx?lang=en-us

If Windows 7 or 8.x, the latest should be 352.86 for most modern cards.

Here's the 32 bit version of it:

http://www.nvidia.com/download/driverResults.aspx/85050/en-us

Here's the 64 bit version of it:

http://www.nvidia.com/download/driverResults.aspx/85051/en-us

Check under the "Supported Products" tab on the driver download page to make sure your chipset or card is listed.

If a newer driver doesn't solve it, you can also use a DVI to HDMI cable (or conversion adapter), so that you're driving the display from your video card's DVI output. Of course, if you already have a DVI output on your video card and a DVI input on your display (which appears to be the case from your description), there's no need to go to that much trouble, since you can simply use the DVI connection instead.

--
JimC
------
 
