So, I bought a brand new LG CX OLED to go along with my new Xbox Series X. I was super excited, as everyone told me the TV was the best thing ever. I had an old Samsung LED that had some basic HDR, but I never used it because it made everything greyish.
So, I've been playing a lot of games and I have no idea what the correct look of HDR is. Like, should colors pop? Should shadows and blacks be pure black (I mean, that's more on the OLED)? I do see that bright highlights pop like crazy; stuff like lamps looks insane. For example, Modern Warfare looks great, especially during the night missions and inside the cave. Forza Horizon 4 looks a bit greyish (might be the current season going on in the game, lol). Black Ops Cold War looks washed out, imo. AC Valhalla looks weird: interiors look grey, but outside it looks really good. I do know that every game looks different, and some are greatly implemented, like Gears 5, while others are bad, like NieR: Automata and some of the Auto HDR titles (some look great tho, like Doom 3).
So, what the fudge is the correct look of HDR, exactly? Is there any showcase for HDR that I should look for?
There are several shop/store demo videos for LG and Samsung TVs on YouTube, which will give you an idea if you watch them through the TV's own player. Certainly part of it is specular highlights, where parts of the screen will go nearer 1,000 nits (headlights, the sun), but much of it is richer colour. Some games just do not do it well at all, like Valhalla so far, but Gears is a great example of one that gets it right, particularly the blue lights on the armour (colour) and the underground segments lit by Jack.
I find HDR on Netflix to be really hit and miss. One show will look great, with life-like colour, and the next will be a grey, grainy mess. I believe this is due to the limited 15 Mbps bandwidth that Netflix operates with, though. To put that in context, a Blu-ray runs at 82–128 Mbps. You would expect a game to have no issue being nearer Blu-ray quality than Netflix quality.
Yeah, I’ve seen some of those demos. They look great on my screen, like, insanely great. I’m just not sure how it should look in games. Like, has anyone played State of Decay 2? Should night time be pitch black?
HDR should mean brighter highlights and wider colour, which generally means more accurate colour. It shouldn't necessarily pop more or be darker. HDR often looks darker because people tend to mess with their TV settings in SDR to make it overly bright, while HDR content targets reference brightness levels, which are darker than most people choose to view at.
As for games and night time: very, very few games will be pitch black, because they will almost always raise the gamma so you, the player, can see what you are doing. In most cases it would be rather pointless to leave you unable to see what is going on.
I personally find HDR to be a mofo to get right more often than I'd like. Some games just have trash HDR, and it's best left off. Other games have good HDR; you just need to get it calibrated correctly. But that's often a big hassle.
AC Valhalla is a good example. The elevated black levels are no good in interiors, and even outside it still has a more washed-out look than SDR. I really wish devs would add an option to enable or disable HDR in the game, so that we can leave the system HDR setting alone.
I found it on a Ubisoft forum (not my own discovery), but it worked for me. I had to do something after that awful cutscene with the seer in the first village, the one everyone complains about because it is so washed out.
I have a Samsung Q90R, which generally has a higher nit/candela level than an OLED; as such, I've got exposure on 0.2 and max luminance on 1,200. The TV brochure/stats say that the Q90's peak brightness is 1,300 cd/m², and that's also where the sun stops changing colour/shape on the demo image in the menu, so I'm pretty happy with that as a figure.
If you have an LG OLED, the CX maxes out at just 800 cd/m², so anything higher than that on the luminance slider shouldn't make any difference. I'd recommend setting luminance to 800 on an OLED, as that matches the TV's peak HDR brightness.
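To illustrate why that slider matters, here's a minimal sketch of the interaction between a game's "max luminance" setting and the panel's physical peak. This is purely hypothetical: the function names and the simple Reinhard-style curve are illustrative, not any actual game's or TV's code. The idea is that the game compresses scene brightness toward the slider's ceiling, and the display clips whatever still lands above its real peak.

```python
def tone_map(scene_nits: float, slider_max: float) -> float:
    """Compress scene brightness toward slider_max (toy Reinhard-style curve)."""
    x = scene_nits / slider_max
    return slider_max * x / (1.0 + x)

def panel_output(nits: float, panel_peak: float = 800.0) -> float:
    """The display hard-clips anything above its physical peak brightness."""
    return min(nits, panel_peak)

# With the slider set above the panel's peak (1,200 on an 800-nit OLED),
# bright highlights land in the 800-1,200 range and get clipped:
clipped = panel_output(tone_map(10000.0, 1200.0))   # ~1071 nits -> clipped to 800
# With the slider matched to the panel (800), the same highlight stays under
# the peak and detail is preserved:
preserved = panel_output(tone_map(10000.0, 800.0))  # ~741 nits -> no clipping
```

Real TVs roll off highlights more gracefully than a hard clip, and real games use fancier curves, but the basic reason to match the slider to the panel is the same.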
Not a big fan of HDR; I shut it off most of the time, but I'm seeing quite the difference with it on in Valhalla. Leave it off and everything looks so crisp and clean. Turn it on and things look more dreary brightness-wise, but then look up at the skies in the game. You'll see colored rays that just were not there before with HDR turned off. Even look at the ground with the sun setting, and you'll see way more varied colors on the rocks and snow. Even the sun is more visible: you can actually see it looking like a colored, circular ball of fire, whereas before it always looked like a round blurry thing where you couldn't make out its shape so well.
Dating back a few years, though, I think the last couple of Tomb Raider games always stopped me and made me think, OK, there is something to this HDR thing. The colors popped so much more with it turned on, and for once I never saw that dreary, dark, washed-out look that I've always seen accompany HDR in a lot of games. That still happens in Valhalla, of course, but man, the colored skies when the sun's beginning to set are unreal; turn HDR off and you won't even see them. It's crazy.
Yeah, the fire on torches stood out to me too. Beautiful color variation. Just walking in caves was always a showcase for HDR. Another one would be the sunlight cracking rays between the trees; it just gave a way better sense of atmosphere with a better sense of color, a very different mood with HDR turned on.
Someone correct me if I’m wrong with my understanding of how this works…
The main difference between HDR and SDR is how the signal is defined. SDR works on a relative scale, 0–100, basically telling your TV 0% (no light) to 100% (as bright as you can go). Obviously there isn't much room to work with there as far as contrast goes; there is only this small range.
HDR is different: it works on an absolute scale, like 0–1,000 or 0–10,000, this time referring to actual nits (brightness). The problem is that not many TVs can hit 1,000 nits, much less 10,000, and only OLEDs really get to 0 on the other end. So content has to be created with this in mind: if it calls for levels your TV can't handle, it won't look right, and well-implemented HDR designs around this. SDR doesn't have this problem, exactly, because the TV is just doing 0 to 100% of whatever it can do.
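The relative-vs-absolute distinction above can be sketched in code. The PQ constants below are the published SMPTE ST 2084 values used by HDR10; the gamma-2.2 curve for SDR and the 300-nit example panel peak are simplifying assumptions for illustration.

```python
# PQ (SMPTE ST 2084) constants, as published in the standard.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def sdr_to_nits(signal: float, display_peak: float = 300.0) -> float:
    """SDR is relative: a 100% signal just means 'as bright as this TV goes'.
    (Simplified gamma-2.2 curve; real SDR transfer functions differ slightly.)"""
    return (signal ** 2.2) * display_peak

def pq_to_nits(signal: float) -> float:
    """HDR10's PQ curve is absolute: a given code value always means the
    same physical brightness, up to a 10,000-nit ceiling."""
    p = signal ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

# The same "full" signal means very different things:
# sdr_to_nits(1.0) -> whatever this panel's peak happens to be (here 300 nits)
# pq_to_nits(1.0)  -> 10,000 nits, regardless of what the TV can actually show
```

That mismatch at the top end (content encoded up to 10,000 nits, panels topping out around 800–1,500) is exactly why tone mapping and per-game HDR calibration exist.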
Further complicating matters is the wider color gamut that comes along with HDR, on top of all the brightness stuff.