OLED or QLED for next gen?

I see absolutely no reason why you’d want to change the refresh rate of your TV, and my CX has never had an issue when left at 120Hz. Are you using VRR? I tend to leave it on.

Of course I use VRR. Everything is switched on. The only thing the Panasonic JZ980 doesn’t do is Dolby Vision at 120Hz.

To back up my point re the 120Hz mode, take Halo 3’s campaign for example: the game was updated to run at 120 fps on the Series X, but it’s not a locked 120 fps. I can feel & see the drop from 120 fps down to 80 fps or so (verifiable via the on-screen info option on the TV remote), even with VRR. But when I lock the console to 60Hz, it’s a buttery smooth, locked 60 fps.

Regarding Callisto Protocol, I can only assume the bulk of the development targeted the PS5, where the game was designed around 60Hz displays (120Hz needs to be activated by the devs on the PS5, i.e. unlike on the Series X, it’s not an option the player can pick at a hardware level). Maybe that’s why it runs better in the 60Hz mode.

The point is there isn’t a one-size-fits-all option out there. Some stuff has variables dependent on the display, its tech, the gaming hardware & the choices made by the devs themselves.

Sure, I agree. But surely you’d change the framerate in-game rather than on the console output?

Unless I’m mistaken, on the Series X there’s no way of changing the framerate settings for the Halo Master Chief Collection in-game. It’s done on the console itself (60Hz for 60 fps & 120Hz for 120 fps). I’ll need to go through the in-game settings again to verify this, but it seems to depend on the refresh rate set on the console.

Another thing I’ve noticed (which is a bit contrary to what specialist sites & some techy YouTubers say) is that my TV’s equivalent of HGiG (basically removing dynamic tone mapping of HDR & forcing the console & games to accept the peak brightness of the TV as the maximum, which in my case is 700 nits) isn’t the best option.

I’ve found that dynamic tone mapping on the Panasonic, along with the ambient sensor activated, offers a much deeper image. I mean, there’s a reason why the TV itself recommends dynamic tone mapping enabled as the ‘default Panasonic setting’, whereas people like Vincent Teoh always say to switch that off & use HGiG (or whatever the equivalent on the TV is).

HGiG is the most accurate to the creator’s intent. That’s why people like Vincent recommend it. It shows the content the way developers expect people to see it.

Using the TV’s dynamic tone mapping rather than HGiG basically means the TV will tone-map using its own algorithm, which makes the image less accurate to what the game actually wants to show.

In general, HGiG is for people who care about playing a game as intended. Dynamic tone mapping on the TV will in most cases produce a brighter, less accurate and more blown-out image.

However, there is a practicality aspect. HGiG pays no regard to the conditions you are gaming in, whereas dynamic tone mapping can use the TV’s ambient light sensor to produce an image better suited to bright viewing conditions.
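
To put the distinction in rough terms, here’s a toy sketch (Python, purely illustrative: the 700-nit peak, the knee point and the roll-off curve are my own assumptions, not what any real TV firmware does). With HGiG the TV applies no curve of its own, so the game’s output passes through and anything above the reported peak simply clips; with dynamic tone mapping the TV re-maps the content range itself.

```python
# Toy comparison of HGiG-style pass-through/clipping vs a TV's dynamic tone
# mapping. All numbers are illustrative assumptions; real algorithms are
# proprietary and usually analyse the scene frame by frame.

DISPLAY_PEAK = 700.0  # nits reported to the console during HDR calibration


def hgig(scene_nits: float) -> float:
    """HGiG-style behaviour: the TV adds no tone mapping of its own.
    A well-behaved game has already mapped its output to the reported
    peak, so clipping above it should rarely trigger."""
    return min(scene_nits, DISPLAY_PEAK)


def dynamic_tone_map(scene_nits: float,
                     max_content_nits: float = 4000.0,
                     knee: float = 0.65) -> float:
    """Crude stand-in for dynamic tone mapping: pass values below a knee
    point through unchanged, then compress everything up to the content's
    peak into the panel's remaining headroom with a soft roll-off."""
    knee_nits = knee * DISPLAY_PEAK                    # ~455 nits passes 1:1
    if scene_nits <= knee_nits:
        return scene_nits
    span_in = max_content_nits - knee_nits
    span_out = DISPLAY_PEAK - knee_nits
    t = min(scene_nits - knee_nits, span_in) / span_in
    return knee_nits + span_out * (t ** 0.5)           # soft roll-off


if __name__ == "__main__":
    for nits in (100, 400, 700, 1000, 4000):
        print(f"{nits:>5} nits in game -> HGiG {hgig(nits):4.0f} | DTM {dynamic_tone_map(nits):4.0f}")
```

In these toy numbers HGiG leaves everything the game intended untouched (and just clips anything above the peak), while dynamic tone mapping keeps some gradation in highlights above the panel’s peak at the cost of shifting tones the game never asked to be shifted.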

In essence, one of the issues with OLED screens is that most people are conditioned to high brightness and oversaturated colours from years of poor LED screens, and also by human nature. OLED comes along with a much more accurate colour space and contrast, but people aren’t used to it and want the over-sharpened, saturated, retina-burning brightness they are accustomed to.

I use dynamic tone mapping during sunny days as it’s just more practical. At night I usually use HGiG.

My previous TV was a Panasonic plasma.

The Callisto Protocol is an interesting example IMO: with dynamic tone mapping off, the picture is much more washed out (HDR was one of the complaints when it was released). For some reason, the TV’s dynamic tone mapping & ambient sensor seem to fix this (in all viewing conditions).

And the Panasonic JZ980 has one issue which is particular to Panasonic when in 120Hz mode: dynamic tone mapping is forced on (it’s a result of the “4K 120Hz bypass mode” update to all Panasonic TVs from 2021 which use a MediaTek chipset).

With all this interest in TVs, does anyone see a future where video game tech is built into the TV itself for native gaming, and in the process maybe solves the streaming issue?

What do you mean? Like putting gaming hardware in the TV?

In theory it’s doable. You could have something similar to, say, a Steam Deck or a ROG Ally inside a TV, playing games natively from a store. There isn’t too much to prevent that technically. The issue is cost. The high-end TV market probably doesn’t overlap that much with gamers overall. And obviously something like that would add cost, a lot of cost, plus weight and bulk.

That, and I feel like all the folks who keep talking about how mobile/ARM processors everywhere are going to make streaming unnecessary are completely ignoring that, no matter how good they get, larger silicon with access to more power is always going to be noticeably ahead.

I mean, those devices aren’t ARM. Right now I’d say the barrier, beyond the cost and significant market issues I outlined before, is that there isn’t really anyone who’d do it.

Say Steam made TVs… they’d do it. But the whole way these devices work is by powering a small screen where running at 720p doesn’t matter. It’s a bit different having a handheld AMD APU drive a massive screen.

But then consider that in a few years it’s possible you could get something not too far off the Series S in such a form factor. If someone made TVs and had an interest in it, I wouldn’t say it’s impossible. It would get people into the ecosystem. The most likely is Sony of course… but then such an approach would be counter to their general operating strategy.

Whilst you are correct that full-blown consoles and PCs will obviously always be ahead, we are now at a place where a handheld PC costing under £500 can play some AAA games released more than a year after the device came out. That’s quite remarkable, and I think that while the gap will never close, it will keep getting tighter.

Yes. Looking at teardowns of many TVs shows a lot of space for gaming hardware, and the big thing in TVs today is size. Larger screens seem to be what both customers and manufacturers are pursuing.

Agreed, I just thought it could be an avenue in the future. I also do wonder where you go with TV tech besides resolution. Most of the Sony TVs like the Bravias are usually priced well over a thousand dollars, and the new OLED TVs can go well over 4,000 dollars.

8K, 16K, 32K (even though there’s no content). People will keep upgrading if there’s a bigger number on the box.

The other trend seems to be brightness. Apparently my 2023 midrange OLED isn’t very bright compared to the latest and greatest, even though I’d have to wear sunglasses at full brightness. But maybe people want to watch TV while they’re tanning in the desert sun? I have no idea. But it’s a bigger number on the box!

Or they just keep adding different letters in front of OLED. :man_shrugging:

Vincent explains why.

Brightness is the new El Dorado of TV marketing speak. Now that resolution increases are tapped out (8K isn’t happening, not with the energy requirements), HDR peak brightness has become the new thing to boast about & sell TVs off the back of.

I mean, whatever floats their boat, but all SDR content in movies, when viewed correctly calibrated (in Filmmaker Mode for example), has identical brightness across the board irrespective of the TV, whilst HDR is very case dependent, e.g. on viewing conditions (dark room versus bright room).

Maybe it’s just me, but brightness isn’t something I seek in a TV, not when it can get too bright. I have an OLED with 700 nits peak brightness in HDR & honestly, some scenes are too much (I’ve been borderline blinded a couple of times in some video games when playing in a pitch-black room at night).
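
To put some rough numbers on that (a little sketch using the standard PQ curve from SMPTE ST 2084; the 700-nit figure is just my panel’s peak, and how each TV handles the signal above that point is exactly the part that varies):

```python
# Why calibrated SDR looks the same brightness on every TV but HDR doesn't:
# SDR reference white is graded to roughly 100 nits regardless of the panel,
# whereas HDR (PQ / SMPTE ST 2084) encodes absolute luminance up to 10,000
# nits, so anything above a given panel's peak (700 nits in my case) has to
# be clipped or tone-mapped by that specific TV.

# PQ inverse-EOTF constants from SMPTE ST 2084
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32


def pq_signal(nits: float) -> float:
    """Normalised PQ signal level (0..1) for a given luminance in nits."""
    y = min(max(nits, 0.0), 10000.0) / 10000.0
    return ((C1 + C2 * y**M1) / (1 + C3 * y**M1)) ** M2


if __name__ == "__main__":
    for label, nits in [("SDR reference white", 100),
                        ("my OLED's HDR peak", 700),
                        ("typical HDR10 mastering peak", 1000),
                        ("PQ container maximum", 10000)]:
        print(f"{label:<28} {nits:>6} nits -> PQ signal {pq_signal(nits):.2f}")
```

Roughly half of the PQ signal range sits at or below 100 nits, and everything my 700-nit panel can’t show lives in the top portion of the curve, so what you actually see up there depends on the TV’s tone mapping (or HGiG clipping) rather than on the content alone.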

Yep, I agree with you.

I had a Samsung S95B which got incredibly bright, and at first it’s fantastic, but you get used to it very quickly. I’ve also had a Samsung QLED which got even brighter, but it’s not all it’s cracked up to be. I have a Sony A95K now, which gets plenty bright, and it’s come to the point where I hope this set will do me fine for many years to come. There are zero issues with DSE, no banding, no other weird things. It’s kinda near flawless really.

Do you always game in a dark room or do you mix it up? I’m so used to always having the lights on, but that kinda defeats the full HDR effect, doesn’t it?

Sony are just stupid. Nobody needs 4000 nits. It’s completely pointless right now.

And it doesn’t compensate for all the issues with LED and local dimming. However many local dimming zones they have, it’s all smoke and mirrors, and it’s impossible to make it look close to OLED.

I support mini-LED. Yeah, I know it’s not as good as OLED for HDR, but I believe all forms of screen tech should be pushed to their limits, and Sony is at the forefront with mini-LED. You can get great results from them IMO.

I just wish their mini LED wasn’t more expensive than everyone else’s OLED lol.

That said, I understand the benefits of the different technologies and might still go mini-LED next time due to being in a bright room and burn-in paranoia.
