NEC 2690WUXi - can it handle regular sRGB?

Started Oct 1, 2007 | Discussions thread
Graystar Veteran Member • Posts: 8,373
Re: Calibration to sRGB in aRGB mode possible?

Will49 wrote:

There are so many incorrect items in this post I'm not really sure
where to begin.

Right back at ya.

First I would recommend you read the following white paper.

Thanks but I already read it.

Graystar wrote:

Yes, you’re missing something. You’re missing the spec, which says
the monitor is limited to a palette of 16.7 million colors at one
time.

"Internal 12-bit Look Up Tables (LUTs) allows the display of 16.7
million colors out of a palette of 69 billion"

“93% Adobe RGB coverage” refers solely to the range of colors that
can be potentially displayed. That has nothing to do with the
implementation of the Adobe RGB standard in any form within the
monitor.

Totally incorrect. The gamut size has nothing to do with the "bit
depth" or LUTs in the monitor.

Did you read what I wrote? Where did I ever equate or relate the gamut size to the bit depth? And that’s not even the point of the statements you quoted. My statement points out that the monitor in question (which is NOT the LCD2180WG-LED you linked to) is limited to a palette of 16.7 million colors at any given time. The number of colors that can be displayed is limited to a 24-bit palette, and THAT is correct.
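
If it helps, the spec’s numbers fall straight out of powers of two. A quick Python sketch (my own arithmetic here, nothing from NEC’s paper):

input_bits = 8    # per channel, what arrives over the DVI link
lut_bits = 12     # per channel, inside the monitor's internal LUTs

displayable_at_once = (2 ** input_bits) ** 3   # 16,777,216 (~16.7 million)
selectable_palette = (2 ** lut_bits) ** 3      # 68,719,476,736 (~69 billion)

print(f"{displayable_at_once:,} colors on screen at any one time")
print(f"{selectable_palette:,} colors the LUTs can pick them from")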

A 16.7 million color limitation makes the monitor an 8-bit
monitor.

No it doesn't. Using this logic, then there is no such thing as a > 8
bit consumer monitor because currently the maximum bit depth output
from a Mac or PC is 8 bits per color per pixel. It's what you do to
those 8 bits inside the monitor that makes it a > 8 bit monitor - but
again, that has nothing to do with the gamut.

Review of the spec and features show that only sRGB is
built into the monitor, which would be in line with the 8-bit
display.

Again the bit depth has nothing to do with the monitor gamut being
sRGB or ARGB. You could make a 4 or 16 bit monitor with ARGB coverage.

And again, where do you see me saying that the bit depth limits, equates to, or relates to the gamut??

The problem here is that you’ve taken my comments out of their intended context. You didn’t follow the previous trail of posts that led to these comments, which came as a response to someone who believed that the workings of this monitor are related to the aRGB standard. My point, which you totally missed so I’ll reword, is that the aRGB standard is a 16-bit standard. If you want to work in that standard then you must shuttle around 16-bit color information. However, there is no 16-bit color processing going on, nor is there any way to get 16-bit color information from an image that Photoshop is trying to display to the display itself. The set of standards that comprise the Adobe RGB color set isn’t used in any way, shape, or form within this monitor or any of its processing.

This monitor will only accept 8-bit data per color channel as input on the DVI connection, as per the DVI specification. As far as I can tell there is no dual-link DVI mode to get greater color information to this monitor. The bit depth limits the number of colors in the displayable palette. If you want to display a different set of 16.7 million colors from the available 69 billion, then you have to change the palette. But I never said that the monitor doesn’t really have 69 billion colors, or that those colors don’t cover 93% of the aRGB gamut...of course it does.
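
If the palette-switching point isn’t clear, here’s a toy Python model of it (purely illustrative...the real monitor firmware is obviously nothing like this.) Each of the 256 possible 8-bit input codes per channel gets mapped through a LUT to one of 4096 internal 12-bit drive levels. Reloading the LUT changes WHICH 16.7 million colors are on screen, not how many:

def make_lut(gamma):
    # 256 entries (8-bit input) -> 12-bit internal drive levels
    return [round(((code / 255) ** gamma) * 4095) for code in range(256)]

lut_a = make_lut(2.2)   # one possible "palette" of 16.7 million colors
lut_b = make_lut(1.8)   # a different slice of the 69 billion

code = 200                       # the same 8-bit value off the DVI link
print(lut_a[code], lut_b[code])  # two different 12-bit levels for one input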

If what I’ve just said is incorrect then I’ll be more than happy to read and accept your explanation of how you can get an LCD2690WUXi to accept 10- or 12-bit color data from the OS and have an active palette of greater than 16.7 million colors.

So you’re either in sRGB mode or you’re not (exactly like
high end CRTs.) That makes it an sRGB monitor. Which is really all
it can be, because unless you have a special video card designed to
output 10 bit color, all regular video cards expect to connect to
sRGB devices.

Again totally incorrect. The video card has absolutely nothing to do
with the gamut of the monitor. Video cards don't "expect" to be
connected to any specific gamut of monitor.

And again, and again, and again: where do you see me saying that the gamut is limited by anything?? I don’t even see the word “gamut” in my quote!

Current operating systems (Mac OS and Windows), applications (i.e.
Photoshop) color management systems (ColorSync etc), graphics cards
and digital video interfaces are currently limited to rendering 8 bit
color to the display.

Well, at least we agree on something. (I’ll just put aside Matrox’s Parhelia Gigacolor technology and their Photoshop plugin for seeing 10-bit color, since it’s far from the norm.)

High end CRTs were able to surpass the sRGB color space, and as
analog devices could display billions of colors.

No. Bit depth doesn't increase color gamut.

You have GOT to learn how to read!! You’re just filling in all this stuff from your head! First of all, where do I mention bit depth in that quote?? I didn’t say bit depth increased the gamut. CRTs can divide up the gamut they have into more than 16.7 million choices because they’re analog...they simply respond to whatever voltage levels are given to them. That’s what technologies such as Matrox Gigacolor and ATI’s Avivo do. These graphics cards can output 10-bit data on dual-link DVI (which the LCD2180WG-LED will happily accept) or a regular analog signal with more finely divided voltage levels, thus expanding the palette to a billion colors. But palette does not equal gamut, which is something you seem to think I keep saying.
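
And for completeness, the palette arithmetic for a 10-bit path (again, just my own illustration):

print(f"{(2 ** 8) ** 3:,}")    # 16,777,216 -- the 8-bit palette
print(f"{(2 ** 10) ** 3:,}")   # 1,073,741,824 -- the "billion colors" of a 10-bit signal

Bigger palette, same gamut.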

High end CRTs were not
able to surpass sRGB because they were analog.

The Radius PressView did, and so did the Sony Artisan, as well as the very expensive Barco Reference monitor.

This is too tiring. Thanks for your expertise.

PS: What's OmniColor?
