Is anyone using an OLED monitor given digital images are often seen on HDR1000 displays?

Started 7 months ago | Discussions thread
hemiola Junior Member • Posts: 28
Re: Is anyone using an OLED monitor given digital images are often seen on HDR1000 displays?

LightSource wrote:

There are a few OLED monitors around. But then the LG C1 48" TV is essentially the same as the Gigabyte monitor.

It gives a 4K working space, usually calibrated for sRGB and Adobe RGB, but most importantly HDR1000. Given that few people view images as prints in person, OLED HDR1000 screens kind of make sense.
I'm sure they have their downsides, burn-in being one of the old and still somewhat outstanding issues.
It's well known that videographers use them. I just wonder how much of the highs and lows in our DR we lose without such bright monitors? Might HDR photos look better on HDR1000, even if HDR1000's purpose is for video?
I'm considering giving one a crack, otherwise settling for the Dell 27" UltraSharp 1440p.

I'm by no means an expert, so don't take my opinion as the letter of the law.

From what I've learned, HDR10 is the baseline HDR format, while the VESA DisplayHDR certifications (400, 600, 1000, True Black and so on) describe levels of display performance. More details on the different tiers here: https://displayhdr.org/

The 10 in HDR10 stands for 10-bit color depth. In theory at least, a "proper" HDR-capable display brings to the table not just added dynamic range, but also 10-bit color depth and a wider color gamut (at least DCI-P3, but ideally aiming for >80% coverage of Rec. 2020).
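
To put rough numbers on that, here's a minimal sketch of the SMPTE ST 2084 "PQ" curve that HDR10 uses (the constants are from the public spec; everything else is plain arithmetic):

# What a 10-bit code value means in absolute luminance (nits)
# under the SMPTE ST 2084 (PQ) EOTF used by HDR10.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(code, bits=10):
    v = code / (2 ** bits - 1)                # normalise to 0..1
    p = v ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

for code in (0, 512, 767, 1023):
    print(code, round(pq_to_nits(code), 1))   # 0, ~92, ~981, 10000 nits

The point being: 10 bits spread over a 0-10,000 nit range is already tight, so 8 bits wouldn't be nearly enough to avoid banding.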

HDR10+ is the royalty-free counterpart of Dolby Vision. Compared to HDR10, it chiefly adds dynamic metadata, so tone mapping can be adjusted scene by scene (or even frame by frame).
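
A toy illustration of why dynamic metadata matters (the numbers and the naive linear tone map below are made up for the example, not the actual HDR10+ metadata format):

# With static metadata (plain HDR10), one MaxCLL value tone-maps the
# whole title; with dynamic metadata (HDR10+/Dolby Vision), each scene
# can be mapped against its own peak. All values here are hypothetical.
DISPLAY_PEAK = 600        # what this particular display can show, in nits
STATIC_MAXCLL = 4000      # single peak value for the whole title

def tone_map(nits, content_peak):
    # naive linear squeeze of content_peak down to DISPLAY_PEAK
    return nits * min(1.0, DISPLAY_PEAK / content_peak)

for scene_peak in (150, 900, 4000):           # per-scene peak brightness
    static = tone_map(scene_peak, STATIC_MAXCLL)
    dynamic = tone_map(scene_peak, max(scene_peak, DISPLAY_PEAK))
    print(f"scene peak {scene_peak:>4} nits -> static {static:6.1f}, dynamic {dynamic:6.1f}")

With one static value, the dim 150-nit scene gets crushed to ~22 nits; with per-scene metadata it is shown as mastered.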

Until recently OLED displays didn't get very bright (most of them still don't), so the HDR1000 (and higher) certifications are more commonly found on IPS/VA panels with FALD or mini-LED backlights. OLED panels tend to get the DisplayHDR 400 or 500 True Black certifications instead: they have nearly perfect blacks, but peak brightness only a bit above 500 nits.
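
For reference, here are the rough minimum peak-luminance figures behind those VESA tiers, summarised from displayhdr.org (the full certifications also specify black level, gamut coverage and bit depth, which this list omits):

# Approximate minimum peak luminance per VESA DisplayHDR tier (cd/m^2).
DISPLAYHDR_PEAK_NITS = {
    "DisplayHDR 400": 400,
    "DisplayHDR 500": 500,
    "DisplayHDR 600": 600,
    "DisplayHDR 1000": 1000,
    "DisplayHDR 1400": 1400,
    "DisplayHDR True Black 400": 400,   # plus a near-zero black-level requirement
    "DisplayHDR True Black 500": 500,
}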

Actually, more people use OLED displays than is apparent at first glance: the AMOLED displays that go into smartphones are another variety of OLED.

HDR images (just like HDR video) should look MUCH BETTER on these displays - whether they are VA, IPS or OLED, as long as they are truly HDR capable - provided they are actually HDR images. As far as I know, exposure-bracketed images combined using the likes of Photomatix aren't true HDR images; they are just tone-compressed jpg versions (unless you save them in the .hdr or .exr formats). These images, just like any other jpeg, might slightly benefit from enhanced contrast, but no new information will be displayed that isn't already visible on a standard monitor.
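
If you want to check whether a file actually carries HDR data, here's a quick sketch using OpenCV, which can decode Radiance .hdr files as floating point ("photo.hdr" is a placeholder; reading .exr may additionally require the OPENCV_IO_ENABLE_OPENEXR environment variable in recent versions):

# A true HDR image decodes to float values that can exceed 1.0
# (brighter than SDR reference white); a tone-mapped jpg never can.
import cv2

img = cv2.imread("photo.hdr", cv2.IMREAD_ANYDEPTH | cv2.IMREAD_ANYCOLOR)
print("dtype:", img.dtype)             # float32 for .hdr, uint8 for jpg
print("max value:", float(img.max()))  # > 1.0 means real above-SDR highlights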

Last but not least, you need to consider which mode you work in. In Windows at least, 10-bit and wide color gamut only kick in in HDR mode. In the default SDR mode, the display is seen as an 8-bit display (at least on the ones I have tried). I'm not 100% sure about the color space, but again you might be restricted to sRGB (unless maybe you can load a custom wide-gamut profile that works in SDR).
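
You can see what that 8-bit bottleneck costs with a toy quantisation test (just numpy; it counts how many distinct steps of a smooth ramp survive at each bit depth):

# Quantise an ideal smooth gradient to 8 and 10 bits and count the
# distinct levels left -- fewer levels means more visible banding.
import numpy as np

ramp = np.linspace(0.0, 1.0, 100_000)
for bits in (8, 10):
    q = np.round(ramp * (2 ** bits - 1))
    print(f"{bits}-bit: {len(np.unique(q))} distinct levels")  # 256 vs 1024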

So in order to see a real benefit, you need true HDR content (whether video or stills), and you might need to be in HDR mode.
