HDR... any point?

Recently discovered that the TV I have hooked up to my PC and Xbox One X is HDR10 capable. Stupidly, you have to turn it on manually for each HDMI input!

Figured out how to enable HDR support in Windows (and on the Xbox), and while some games do look a little different (I won't say better, just brighter), I'm wondering if it's really worth bothering with, especially since games and other content that don't support HDR often look worse when it's enabled.

I mean, on PC there are only two or three games that support it (and Elite isn't one of them), and the only game I play on console (RDR2) doesn't support HDR. Aside from a few 4K HDR YouTube vids (which do look amazing) there's no video content either. It seems to be a bit of a fad that's good for showing off TVs in a showroom, but once you get it home it's rather pointless (anyone remember 3D TVs?).

Is there any point in keeping it enabled? Is there any killer app for HDR that'll make it seem useful, or is it just another gimmick?
 
Right now, there's probably little point; however, for games like Elite especially it would be quite a useful trick to support. Rendering at 10-bit output would at least reduce banding, even if it didn't use the wider gamut or dynamic range that HDR displays are supposed to support. If you just render "normal" content to such a display it will be displayed wrong, with much more vibrant colours and higher contrast than intended, and 8-bit output has a good chance of looking very quantised.
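The banding point is easy to demonstrate with numbers (a minimal sketch, nothing to do with any actual renderer): quantise a smooth gradient to 8-bit vs 10-bit code values and count how many distinct levels survive. Fewer levels across the same ramp means wider, more visible bands.

```python
# Minimal sketch: quantise a smooth 0..1 ramp to 8-bit vs 10-bit code
# values and count distinct output levels. Fewer levels across the same
# ramp means wider, more visible bands on screen.

WIDTH = 4096
ramp = [i / (WIDTH - 1) for i in range(WIDTH)]  # ideal smooth gradient

def quantise(signal, bits):
    """Round each 0..1 sample to the nearest representable code value."""
    levels = 2 ** bits - 1
    return [round(x * levels) / levels for x in signal]

for bits in (8, 10):
    distinct = len(set(quantise(ramp, bits)))
    print(f"{bits}-bit: {distinct} distinct levels across {WIDTH} pixels")
```

At 8 bits a 4096-pixel ramp collapses to 256 levels (bands 16 pixels wide); at 10 bits you get four times as many steps, each a quarter the size.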

Coming up: half-gigabyte screenshots.

Some more info about colour coverage:

See the chart here: https://www.noteloop.com/kit/display/color-space/ and you can click the names on the left too.

Most consumer devices (both computer monitors and TVs) are at, or struggling with, sRGB/rec.709 with up to 8-bit colour, the smallest triangle in the middle of the mess; TVs will have less bit depth since they're throwing away range for analogue broadcast signal levels. Calibration and response vary widely: some devices won't resolve anything in the dark areas, some struggle with brights, the primaries are usually off, and response curves are usually not great either. When you go to even slightly larger colour spaces like AdobeRGB or variants thereof, you will usually want more bit depth too to avoid visible quantisation, a trick that "workstation" GPUs and professional monitors have supported for a long time.
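The "wider gamut needs more bits" point follows from simple arithmetic (a sketch with an illustrative scale factor, not a measured gamut ratio): stretching the same number of code values over a wider range makes each quantisation step coarser, so you add bits to compensate.

```python
# Sketch: the same bit depth spread over a wider gamut gives coarser
# steps between adjacent code values. The 1.4x range is an illustrative
# stand-in for a wider gamut, not a measured AdobeRGB figure.

def step_size(range_width, bits):
    """Size of one quantisation step for a channel of a given bit depth."""
    return range_width / (2 ** bits - 1)

srgb_8 = step_size(1.0, 8)    # baseline: sRGB-sized range at 8 bits
wide_8 = step_size(1.4, 8)    # same 8 bits stretched over a wider gamut
wide_10 = step_size(1.4, 10)  # wider gamut, but with 10 bits

print(f"sRGB  8-bit step: {srgb_8:.6f}")
print(f"wide  8-bit step: {wide_8:.6f}  (40% coarser than sRGB)")
print(f"wide 10-bit step: {wide_10:.6f}")
```

Same content, wider triangle: 8 bits that were barely adequate for sRGB become visibly coarse, while 10 bits brings the step size back below the sRGB 8-bit baseline.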
 