Does anybody have hands-on experience with a 30-bit LCD + nVidia GeForce on Linux?

Started Aug 18, 2014 | Discussions thread
knizek New Member • Posts: 9

Hi,

Some time ago, nVidia added support for 30-bit colour to its Linux drivers for the GeForce series. The X.org configuration is the same as for Quadros and is fairly simple. Xorg.0.log confirms it is up and running (EVGA GTX 750 Ti):

(**) NVIDIA(0): Depth 30, (--) framebuffer bpp 32
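
For anyone trying to reproduce this, the xorg.conf change is roughly the following (the identifiers are placeholders from my setup; the DefaultDepth/Depth entries are what matter):

Section "Screen"
    Identifier   "Screen0"
    Device       "Device0"
    Monitor      "Monitor0"
    DefaultDepth 30
    SubSection "Display"
        Depth 30
    EndSubSection
EndSection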

My display is a Dell U2713H (connected to the card's DisplayPort output), which has a 10-bit input, a 14-bit internal LUT and an 8-bit + 2-bit FRC panel. Based on reviews, this 8+2 FRC should look pretty close to a native 10-bit panel to the eye.

Hardware-wise, the setup should be okay. Now for the software support.

While X.org supports 30-bit output, there are some minor issues with the current graphics toolkits (Qt, GTK), which can mostly be resolved with a few patches to the source code. However, the usual applications keep sending 8-bit colour to the display, no matter whether they internally process images in 16-bit or even higher...

Based on this blog entry, it seems that only Krita (an image editor) is actually capable of sending 10-bit data to the display (when run in OpenGL mode).

This sounds promising. The only trouble is that in my case it does not work.
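
To rule out the lower layers, one can at least check whether the driver exposes a 10-bits-per-channel GLX framebuffer config to applications at all. Here is a minimal probe I sketched for that (the file name is just mine; assuming the Xlib and GLX development headers are installed, build with "cc probe10bit.c -lX11 -lGL"):

#include <stdio.h>
#include <X11/Xlib.h>
#include <GL/glx.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) {
        fprintf(stderr, "cannot open X display\n");
        return 1;
    }

    /* Ask GLX for a config with at least 10 bits per colour channel. */
    int attribs[] = {
        GLX_RENDER_TYPE,   GLX_RGBA_BIT,
        GLX_DRAWABLE_TYPE, GLX_WINDOW_BIT,
        GLX_RED_SIZE,   10,
        GLX_GREEN_SIZE, 10,
        GLX_BLUE_SIZE,  10,
        GLX_DOUBLEBUFFER, True,
        None
    };
    int n = 0;
    GLXFBConfig *cfgs = glXChooseFBConfig(dpy, DefaultScreen(dpy), attribs, &n);
    if (!cfgs || n == 0) {
        printf("no 10-bit-per-channel framebuffer config available\n");
        return 1;
    }

    /* Report the actual channel sizes of the first match. */
    int r, g, b;
    glXGetFBConfigAttrib(dpy, cfgs[0], GLX_RED_SIZE,   &r);
    glXGetFBConfigAttrib(dpy, cfgs[0], GLX_GREEN_SIZE, &g);
    glXGetFBConfigAttrib(dpy, cfgs[0], GLX_BLUE_SIZE,  &b);
    printf("%d matching configs; first one is R%d G%d B%d\n", n, r, g, b);

    XFree(cfgs);
    XCloseDisplay(dpy);
    return 0;
}

If this finds no matching configs even while X.org runs at depth 30, the limitation sits in the driver or the X server; if a 10-bit config is found, the suspect is more likely the application or toolkit.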

The test ramp (1024 steps of grey patches) looks the same as an 8-bit test ramp (256 steps of grey patches, sized to align with the 1024-step ramp for easy comparison). When I restart X.org at 24-bit depth, both ramps look the same as before.

To put it another way: there is clear banding at each 8-bit step (expected). The intermediate steps in the 1024-step ramp are indistinguishable across each run of 4 successive steps (even though the colour picker tool reads changing RGB values), which indicates my 30-bit display setup failed.

P.S. The test ramps are 16-bit PNGs generated with ImageMagick. No software calibration is loaded. Colour management in Krita is turned off (or rather, the image profile and the display profile are the same).
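
For reference, ramps like these can be generated along the following lines (a sketch, not necessarily my exact invocation; -scale magnifies by plain pixel replication, so the patch edges stay sharp):

# 1024-step ramp: a 1024-pixel 16-bit gradient holds 1024 distinct grey levels
convert -size 1x1024 gradient:black-white -rotate 90 -depth 16 \
    -scale 4096x200! ramp1024.png

# 256-step ramp at the same final size, so the two align for comparison
convert -size 1x256 gradient:black-white -rotate 90 -depth 16 \
    -scale 4096x200! ramp256.png

# sanity check: count the distinct grey levels actually stored in the file
convert ramp1024.png -format %c histogram:info:- | wc -l

The histogram check confirms the files themselves carry 1024 and 256 distinct levels, so any missing steps on screen come from the display pipeline, not from the images.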

Also, the Argyll CMS calibration report says the graphics card LUT appears to have 8-bit precision ("dispwin -v -yl -R"), although this might be due to my cheap colorimeters (huey, i1Display 2).

So, is there anybody out there on Linux with a working setup on an nVidia GeForce?

Or is it all marketing hype, and does something in the pipeline (application - X.org - nVidia driver - display) limit the bit depth anyway?
