Games Analysis |OT| Time To Argue About Pixels And Frames!

Alex did an amazing review of DLSS 3 tech

Frame generation can be used without DLSS upscaling, which is another frontier where frames can be gained without affecting native image quality.

The only trade-off is latency.
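
For a rough sense of where that latency comes from: interpolation has to hold back the next rendered frame before it can generate the in-between one. A back-of-the-envelope sketch, with assumed numbers (the native frame rate and generation cost here are illustrative, not measured figures from the video):

```python
# Back-of-the-envelope latency cost of frame generation.
# Assumption: interpolation needs the *next* rendered frame before it
# can generate the in-between one, so the pipeline holds back roughly
# one native frame.

native_fps = 60                      # assumed native render rate
native_frame_ms = 1000 / native_fps  # ~16.7 ms per rendered frame

generation_cost_ms = 3               # hypothetical per-frame generation cost

# Displayed rate doubles, but input latency grows by about one native
# frame plus the generation pass itself.
displayed_fps = native_fps * 2
added_latency_ms = native_frame_ms + generation_cost_ms

print(f"{displayed_fps} fps shown, ~{added_latency_ms:.1f} ms extra latency")
# -> 120 fps shown, ~19.7 ms extra latency
```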

1 Like

Yeah, I’ve been listening to reviews and the 4090 is insanely powerful. Just wish it wasn’t so expensive lol

3 Likes

How much latency, though (can’t watch at the moment)? Because too much is not a worthwhile trade.

The future of benchmarks!?!?!? (they’re making automated tools)

1 Like

Automation is fine, but I think hiring more people to help with the work would go a long way toward getting that workload down. Throw in the automation and it should drop even more.

They have an enormous team, but when you have 7 days (or less) to test hardware, it’s ridiculous. These are YouTubers, and even “bigger” websites only have 3 or 4 people who specialize in this type of stuff. Automation is a massive resource saver for manual, time-consuming work like hardware benchmarking.

1 Like

How big is enormous? And how many people on their team do the benchmarking?

Edit: Decided to check their forums, but I guess the only people listed as staff are the ones who moderate the website. So I don’t know if they’re part of the testing crew.

LTT is at roughly 90 or so employees, and based on their past videos I’d say at least 10-15 of them do hardware benchmarking. But when you have one, maybe two review units sent to you, I don’t care if you have 100 people; they can’t do very much while sitting around waiting for tests to run.

2 Likes

So, I suppose the main issue is the space and tools needed to benchmark cards, which is something large companies like Nvidia avoid by having the facilities to get those tests done, with more employees and automation tools they likely built themselves.

The main issue is time. It’s a long process and they’re not given much time to do it. They can throw people at it, but if you can automate things, there’s no need to crunch your team (and proper tools are faster anyway).
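
For what it’s worth, the automation itself doesn’t have to be fancy; a harness that launches runs and crunches the frame-time logs is where most of the time savings come from. A minimal sketch, assuming each game exposes a scriptable benchmark mode and writes a frame-time log (the executable names, flags, and log format here are hypothetical placeholders):

```python
# Minimal sketch of an automated benchmark harness. Assumes each game
# has a scriptable benchmark mode and dumps one frame time (ms) per
# line to a log file; all names and flags below are hypothetical.
import statistics
import subprocess

RUNS = [
    ("CyberpunkBenchmark.exe", "--preset=ultra"),
    ("CyberpunkBenchmark.exe", "--preset=ultra --rt=on"),
]

def parse_frametimes(path: str) -> list[float]:
    """Read one frame time (ms) per line from the game's log."""
    with open(path) as f:
        return [float(line) for line in f if line.strip()]

for exe, flags in RUNS:
    subprocess.run([exe, *flags.split()], check=True)   # blocks until the run finishes
    times = parse_frametimes("frametimes.csv")          # assumed output file
    avg_fps = 1000 / statistics.mean(times)
    low_1pct = 1000 / sorted(times)[int(len(times) * 0.99)]  # 99th-percentile frame time -> 1% low
    print(f"{exe} {flags}: {avg_fps:.1f} fps avg, {low_1pct:.1f} fps 1% low")
```

The point is that nobody has to babysit the machine between runs; the tests queue up and the numbers fall out the other end.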

1 Like
  • Xbox Series S: 1080p/60fps | 18.60 GB

  • Xbox Series X: 2160p/60fps | 18.60 GB

  • PC: 2160p at max settings (RTX 4090/3080/3070 Ti/3060/3050) | 28.78 GB

  • Scorn runs at the highest settings on Steam Deck. However, the lower resolution causes some details to look washed out.

  • Ebb Software confirmed that Scorn will support Nvidia DLSS, but the game does not have the feature at this time.

  • Xbox Series S has lower LOD distance and texture filtering.

  • Load times are fast and within the usual numbers for each platform.

  • Xbox Series X runs at the equivalent of higher settings on PC.

  • Scorn is one of the most solid and unique-looking Unreal Engine 4 games I have seen.

  • The entire RTX 30 range is capable of running this game at the highest settings without any issues.

It looks and runs great on both Xbox consoles.

2 Likes

Native 4K/60fps? Now that’s impressive.

1 Like

Honestly, I think the game is really ugly most of the time, so 4K/60 isn’t that surprising. Environments tend to be small, with the weird aesthetic helping to carry what is otherwise a drab and uninteresting-looking game.

Sounds like that very first trailer wasn’t indicative of the game at all, then; it looked properly current-gen back in the day. Even the gameplay reveal looked good. Sad to hear this.

A few areas look fantastic, but the majority do not

Overwatch 2: PS5 vs Xbox Series X/S Upgrades Tested at 4K/120Hz - A Big Visual Update?

And who says voicing your concerns to corporations doesn’t work? :rofl:

The RTX 4080 12GB is a fantastic graphics card, but it’s not named right. Having two GPUs with the 4080 designation is confusing.

So, we’re pressing the “unlaunch” button on the 4080 12GB.

3 Likes

Was this ever discussed?

Also, I know it’s not a technical game comparison, but it is still a technical comparison of capabilities between the two systems, and the conclusions are near and dear to my heart.

Both Series X and PS5 are energy hogs when it comes to streaming video. Yes, the Series X fared better, but it still uses considerably more power than a dedicated video streaming device (an Apple TV, for example). Don’t use your consoles for video streaming if you can help it; your wallet (and the planet) will thank you, and you’re likely not getting the full capabilities anyway (I mentioned in the TV/Movie thread how HBO Max doesn’t support 4K/Dolby Vision on consoles in most regions, for example).
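
To put a rough number on the wallet part, here’s a back-of-the-envelope comparison; all of the wattages, hours, and the electricity price are assumptions for illustration, not figures from the report:

```python
# Rough yearly cost of streaming video on a console vs. a dedicated
# device. Wattages and usage are assumptions, not measured figures.
console_watts = 50        # assumed console draw while streaming
streamer_watts = 3        # assumed draw of a dedicated streaming box
hours_per_day = 3
price_per_kwh = 0.30      # assumed electricity price, USD

def yearly_cost(watts: float) -> float:
    kwh = watts * hours_per_day * 365 / 1000
    return kwh * price_per_kwh

print(f"console:  ${yearly_cost(console_watts):.2f}/yr")
print(f"streamer: ${yearly_cost(streamer_watts):.2f}/yr")
# -> console: $16.43/yr vs streamer: $0.99/yr at these assumptions
```

Not ruinous for one household, but multiplied across millions of consoles it adds up fast.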

1 Like

I hadn’t seen that report previously. The 4090 is rated at 450 watts, and some initial reviews show it consuming that much during benchmark runs. This Titan product stack would be what, 700-800 watts depending on RAM capacity?

These power consumption numbers are outrageous and would require a finely tuned, efficient power supply. I’m not sure if the sweet spot for PSUs is still at 50-60% of rated power draw. Throw in the CPU, RAM, and motherboard and you’re looking at PSUs specced high enough that total draw approaches what a single residential circuit can deliver.
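
Running the numbers on that, with illustrative component wattages (not measured figures) and taking the 50-60% sweet-spot claim at face value:

```python
# Wall draw vs. a standard 15 A / 120 V residential circuit (~1800 W),
# assuming the PSU sweet spot sits around 50-60% of its rating.
# Component wattages below are illustrative, not measured.

gpu_w = 800          # hypothetical Titan-class card
cpu_w = 250
rest_w = 100         # RAM, motherboard, drives, fans
dc_load = gpu_w + cpu_w + rest_w          # 1150 W at the components

psu_rating = dc_load / 0.55               # size so the load sits near 55%
efficiency = 0.90                         # assumed, roughly 80 Plus Gold
wall_draw = dc_load / efficiency          # what the outlet actually sees

print(f"PSU rating needed: ~{psu_rating:.0f} W")
print(f"Wall draw at load: ~{wall_draw:.0f} W of an ~1800 W circuit")
# -> PSU rating needed: ~2091 W; wall draw ~1278 W
```

At these assumptions you’re already past what most consumer PSU lines even offer, and the wall draw leaves little headroom for anything else on the same circuit.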

1 Like