A sensible discussion on performance targets for this generation

I feel strongly that it is time for a serious discussion outside of the Digital Foundry thread of tittle tattle that looks realistically at the PS5, Series X and how they stack up to PC - but more significantly what that means for relative performance this generation on console (I really don’t want this thread to deal in SX/PS5 comparisons).

I want to do this because across the internet, and even here, I’ve been seeing a LOT of people with wild, wild expectations about resolution, framerate and quality settings. People who clearly haven’t spent much time looking at how PCs run these games and just assume everything on PC is 4K native, 60 FPS and ultra settings. This is NOT the case.

The early games are, for me, very promising in that on the whole they offer players sensible options, letting them choose between 60 FPS and 30 FPS.

I’ve also been examining some PC benchmarks across a variety of games to see what is possible. A good example is Watch Dogs, with and without ray tracing.

The bottom line as far as I can tell is this - resolution is just dead now. Very few PC gamers will ever run their games at native 4K, and with good reason. A 2080 Ti in Watch Dogs will not hit native 4K at 60 FPS. So a card well above the console performance envelope is not hitting 60 FPS at native 4K using ultra settings, and that is WITHOUT ray tracing turned on. With RT on, that card doesn’t hit 60 even at 1080p with settings at ultra. So people seriously need a reality check on what they think the consoles can or will achieve.

Games should no longer be judged by the resolution metric at all. It simply shouldn’t be mentioned, because to all intents and purposes games on console are using DRS, and what is the relevant metric there? Upper and lower bounds? Average resolution? Even then, the heavy use of TAA and reconstruction techniques renders the resolution fairly irrelevant in assessing the final image quality and composition. Native 4K is a waste. You can look through any PC benchmarks to see why.
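
For anyone unfamiliar with how DRS actually behaves, here is a minimal sketch of the usual frame-time feedback loop. The numbers, names and thresholds are made up for illustration and aren’t taken from any particular engine:

```python
# Illustrative sketch of a dynamic resolution scaling (DRS) feedback loop.
# Numbers and thresholds are hypothetical; real engines use smarter heuristics
# (GPU timing queries, history buffers, per-pass scaling), but the core idea is the
# same: render below the output resolution when the GPU is over budget, then let
# TAA/reconstruction rebuild the final image.

TARGET_FRAME_MS = 16.7           # 60 FPS budget
MIN_SCALE, MAX_SCALE = 0.6, 1.0  # e.g. 4K output, internal render between ~1296p and 2160p

render_scale = 1.0

def update_render_scale(last_gpu_frame_ms: float) -> float:
    """Nudge the internal render resolution toward the frame-time budget."""
    global render_scale
    if last_gpu_frame_ms > TARGET_FRAME_MS:          # over budget: drop resolution
        render_scale -= 0.02
    elif last_gpu_frame_ms < TARGET_FRAME_MS * 0.9:  # comfortably under: creep back up
        render_scale += 0.01
    render_scale = max(MIN_SCALE, min(MAX_SCALE, render_scale))
    return render_scale

# After a heavy 19 ms frame, the next frame might render at 0.98 x (3840 x 2160) and keep
# dropping until the budget is met - which is why "what resolution does it run at?" no
# longer has a single honest answer.
```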

FPS - we need to understand that many engines don’t simply scale with frame-rate multipliers, so going from 30 to 60 isn’t always a case of doubling the power; it often requires more. And since at 60 you would like to avoid V-sync being turned on (it reduces responsiveness and spoils some of the advantages of 60), you can’t just target 60 - you need headroom. As we’ve seen with ACV, if you don’t have enough headroom you get tearing. On PC most people will either build in sufficient headroom or use VRR/FreeSync etc., but on console many people don’t have TVs new enough to use these features. So in reality you might need to target 75 FPS to get a reasonably solid 60. Which means that in many cases, if you take an Xbox One X version onto Series X, you will need to find savings in the render budget to hit that frame rate - so the Series X 60 FPS version may well technically be worse than the One X 30 FPS version. This is all true for GPU-bound games; CPU-bound games are a different story entirely, and one where FPS will be very heavily improved. But talking to a few devs, there are very, very few games that are truly CPU bound in all instances.
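
To make the headroom point concrete, the frame-time arithmetic looks like this (plain arithmetic only, nothing engine-specific):

```python
# Frame-time budgets behind the 30/60/75 FPS point.
for fps in (30, 60, 75):
    print(f"{fps} FPS -> {1000 / fps:.1f} ms per frame")

# 30 FPS -> 33.3 ms per frame
# 60 FPS -> 16.7 ms per frame
# 75 FPS -> 13.3 ms per frame
#
# Going from 30 to 60 halves the per-frame budget, but fixed costs (CPU submission,
# post-processing, waiting on V-sync) don't halve with it, so in practice you need more
# than 2x the GPU grunt. And building in ~75 FPS worth of headroom (13.3 ms) so a
# v-synced 60 never drops means finding roughly another 20% of frame time on top of that.
```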

So the bottom line here is that whilst 60 FPS games are great, we need to accept some sacrifices to run them even on the PS5 or Series X. Those sacrifices are going to be fidelity OR smoothness, and people need to decide what is most important to them. Which is why 4K/30 modes are important: some people want the graphical upgrades and, as the PC benchmarks suggest, are unlikely to see them in 60 FPS versions of games.

The reality of what these consoles are relative to PC is hopefully now becoming clearer.

1 Like

The gen is only just starting and few games are taking advantage of the new tech. I think any of this dialogue is premature for now.

2 Likes

Sure but the point is not that games won’t look better as the gen goes on. It’s that the expectations are from some quarters way way out of whack with reality. And utilising RDNA2 won’t magically get you to certain metrics. What we want is games that look and play better and 60 or 4K aren’t measures of that at all. The way we discuss games and the tech needs to evolve now.

1 Like

Digital Foundry have today compared the new Big Navi AMD cards with the 3070 and 3080…

Safe to say the AMD cards do fine on legacy stuff but once you hit next gen features like RT they get absolutely battered. And that’s without taking DLSS into account.

This is not the gen for ray tracing then.

2 Likes

My guess is we see the rise of options in fidelity and performance. I don’t think we’ll ever see PC-like options on a wide scale on console.

In terms of performance, I expect we’ll see Native/close to 4K 30fps with Raytracing.

Native/4K 60fps without Raytracing, barring demanding titles like Demon’s Souls.

MLSS 4K 60fps Raytracing on many Microsoft first party titles.

1080p-Supersampled 4K 60fps Raytracing on many Sony titles.

The problem with directly comparing with PC games is that the console versions are often running with settings that you can’t even replicate 100% on PC by messing with settings files and tuning specifically for it.

See WD Legion for example. At the PC settings even a 3080 has trouble running at native 4K, but apply the console settings and a 2060S (which is basically unplayable at 4K even with RT on low) runs the game at native 4K and at higher framerates than the consoles (though, since there’s no resolution scaling on PC, you can’t really lock a 2060S to 4K30).

But this makes comparisons extremely hard for those games.

Though, to comply with the thread: considering architecture and specs, the SX should perform somewhere between a 2080 Super and a 2080 Ti, at least when it comes to non-RT performance.

(RT performance is a bit more complicated. The Minecraft demo from March was in that same ballpark, but now we have WD Legion, where it’s below a 2060S, and RDNA2 GPUs are putting in poor performances in current RT games as well.)

Consoles are compromising on fidelity to achieve higher framerates and/or resolution. Sometimes this makes sense, other times it doesn’t. WDL looks good in some respects and awful in others, for example.

People forget that every gen starts slow. Once cross gen games aren’t a factor we’ll see more visually impressive games.

3 Likes

Feel like one of the bigger problems to come out of this gen is that if a game doesn’t have AAA cinematic graphics it’s already being called inferior, last-gen stuff.

2 Likes

Go and look at PC benchmarks for games: native 4K 60 is in no way a realistic proposition unless you heavily compromise on fidelity. Heavily compromise.

Games shouldn’t be targeting native 4K. It’s a complete waste of resources. And Demon’s Souls is nice looking but hardly going to be at the demanding end of gaming.

I think you are rather rapidly going to see 1440p become the upper standard on both machines. Frankly it’s the sensible PC resolution. 4K is just pointless for most titles, and given that even cross-gen games aren’t hitting native 4K at 60 on systems more powerful than the consoles, we shouldn’t expect it.

DRS will play a big part.

2 Likes

Agreed. Makes me so angry. Funny how some reviewers are saying Immortals Fenyx Rising has some absolutely stunning graphics, yet people who haven’t played it are just calling it a mobile game…

I think the average resolution will keep dropping as we go further into the generation, or at least we won’t get any native 4K60 AAA games.

We haven’t taken into account the greater geometric density and vastly better LODs that are going to be enabled by the SSDs. More data streaming in every second will need more compute. Look at the density and detail in the Unreal Engine 5 demo, which ran at 1440p/30 FPS on PS5. If games start to approach that level of detail, there is no way we are going to see native 4K titles.

Hellblade 2 might be a very good indicator of what sort of performance we can expect, as that was the most next-gen-looking title announced to date, IMO.

We will have to see what sort of performance uplift mesh shaders are going to provide; DRS is producing around 15% better performance at the moment. Microsoft’s DLSS equivalent should technically provide the biggest performance boost, and I don’t think Sony will have any answer to that besides regular supersampling/checkerboarding.

I’d personally be happy with 1440p native and MLSS 4K. I think 4K60 was a very high bar to set, especially natively; they will need MLSS to achieve it in true next-gen games.
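
To put rough numbers on why 1440p plus upscaling is attractive, here is the raw pixel-count arithmetic (nothing engine-specific, just resolutions):

```python
# Pixel-count arithmetic behind "1440p native + ML upscaling to 4K".
native_4k = 3840 * 2160  # 8,294,400 pixels
qhd_1440p = 2560 * 1440  # 3,686,400 pixels

print(f"1440p shades {qhd_1440p / native_4k:.0%} of the pixels of native 4K")
# -> 1440p shades 44% of the pixels of native 4K

# Rendering internally at 1440p hands back more than half of the per-pixel GPU cost to
# spend on RT, lighting or a 60 FPS target, and leaves the upscaler (ML reconstruction,
# checkerboarding, etc.) to fill out the 4K output.
```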

I’ve always been skeptical of the performance of AMD’s ray tracing and machine learning capabilities. All the current data seems to back that up, though there’s more research to be done.

But that doesn’t mean these features can’t have a big impact in, say, ray-traced GI and shadows, or in AI upscaling; rather, it will take a lot more work and ingenuity to have them make a discernible impact, or one comparable to what we see on Nvidia cards. It’s like trying to squeeze a full glass of lemonade from a slice of lemon, but I think there’s a vested interest from MS and AMD in getting the most that they can from this custom hardware.

One thing we don’t really think about is how performance targets are dictated by consumers. More powerful hardware allows for more processing, but how you use it really depends on who you’re making your product for.

I felt like the Unreal Engine 5 PlayStation 5 demo was eye-opening for many reasons. Most importantly for me was public reaction.

First off, from a tech perspective, it didn’t use any of the ray tracing capabilities of the PS5, but had an impressive software-based real-time GI solution. This lighting, using their Lumen technology, was a reason for the 30 FPS frame rate, but they’re confident they can get it up to 60 on the same hardware. Because most video cards don’t have ray tracing hardware, Epic has a vested interest in this software solution. But perhaps it can, in the future, leverage ray tracing hardware to assist in its implementation, and hopefully mean higher frame rates if it is the frame-rate bottleneck.

Secondly, the demo ran at 1440p upscaled to 4K (with a TAA solution, I believe). This, on a PS5, which isn’t a slouch. Although their Nanite technology can throw an impressive amount of fidelity onto our screens, there are limitations on what can be drawn; there is little information on whether it’s a polygon-count target that affects resolution and framerate, and whether reducing that count can positively impact overall performance.

And finally, it’s the public reaction to the demo that is interesting. Despite the low frame rate and resolution most people, including myself, were blown away. The level of fidelity and the quality of lighting trumped resolution and frame rate. We haven’t seen anything close to that level of lighting and detail in any game. Many of us would like a higher resolution or frame rate (or both), but that wouldn’t stop us from wanting to play something that looked like that. If I got a Halo game with that fidelity I’d play it despite the low resolution and frame rate, I’ll sheepishly admit. Multiplayer is a different story, but I wouldn’t mind a single-player campaign with that level of fidelity.

So with all that said, most people were blown away by what is essentially tech that doesn’t leverage brand new technologies and hardware capabilities, and doesn’t hit 4K or 60 fps.

And so I imagine that sub-4K at 30 fps will be the target for most games. Fidelity and detail matter more to the general audience than resolution or frame rate. Looking at some of Sony’s biggest first-party games at launch, both default to high-detail 30 fps modes. They wanted to hit a level of detail that was not possible at 60 fps.

I imagine fighting games, FPS and racing games hitting a default of 60, with 120 FPS performance options (this might be complicated for fighters, so maybe not for them), because that’s what the audience wants.

I expect framerates to be far more stable at native resolutions due to VRS.
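
For anyone who hasn’t looked at VRS, here is an illustrative sketch of the idea. The tile size, metrics and thresholds are hypothetical (real Tier 2 VRS drives this through a hardware shading-rate image), but the principle is the same: keep the native-resolution framebuffer and spend fewer pixel-shader invocations where it won’t be noticed.

```python
# Illustrative sketch of the idea behind Variable Rate Shading (VRS).
# The function, inputs and thresholds are made up for illustration.

def pick_shading_rate(motion: float, contrast: float) -> str:
    """Choose a coarse shading rate for one screen tile (say 16x16 pixels)."""
    if motion > 0.5 or contrast < 0.05:
        return "2x2"   # fast-moving or flat tile: one shaded sample per 2x2 pixel block
    if contrast < 0.15:
        return "2x1"   # moderate detail: halve shading along one axis
    return "1x1"       # detailed, stable tile: full-rate shading

# Shading a tile at 2x2 cuts its pixel-shader work by ~75%, and that reclaimed time is
# the headroom that helps hold frame rate at native resolution.
```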