I feel strongly that it is time for a serious discussion, outside of the Digital Foundry thread's tittle-tattle, that looks realistically at the PS5 and Series X and how they stack up to PC - but more significantly, at what that means for relative console performance this generation (I really don't want this thread to deal in SX/PS5 comparisons).
I want to do this because across the internet, and even here, I've been seeing a LOT of people with wild, wild expectations about resolution, framerate and quality settings. People who clearly haven't spent much time looking at how PCs run these games just assume everything on PC is native 4K, 60 FPS and ultra settings. This is NOT the case.
The early games are, to me, very promising in that on the whole they offer players sensible options, letting them choose between 60 FPS and 30.
I've also been examining PC benchmarks across a variety of games to see what is possible. A good example is Watch Dogs, here with and without ray tracing.
The bottom line, as far as I can tell, is this - resolution is just dead now. Very few PC gamers ever run their games at native 4K, and with good reason. A 2080 Ti in Watch Dogs will not hit native 4K at 60 FPS. So a card well above the console performance envelope is not hitting 60 FPS at native 4K on ultra settings, and that is WITHOUT ray tracing turned on. With RT on, that card doesn't hit 60 even at 1080p on ultra. So people seriously need a reality check on what they think the consoles can or will achieve.
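To put some rough numbers on why native 4K is such a heavy ask, here's a quick back-of-the-envelope sketch (my own illustration, not from any benchmark; shading cost doesn't scale perfectly linearly with pixel count, but it's a reasonable first approximation):

```python
# Back-of-the-envelope pixel math showing why native 4K is so expensive
# relative to the resolutions most PC gamers actually run at.

RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = RESOLUTIONS["1080p"][0] * RESOLUTIONS["1080p"][1]
for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.2f} MP, {pixels / base:.2f}x the pixels of 1080p")

# 4K pushes 4x the pixels of 1080p and 2.25x the pixels of 1440p, which is
# why a card that holds 60 FPS comfortably at 1440p can still miss 4K60.
```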
Games should no longer be judged by the resolution metric at all. It simply shouldn't be the headline number because, to all intents and purposes, console games are using DRS - and what is the relevant metric there? The upper and lower bounds? The average resolution? (See the sketch below.) Even then, the heavy use of TAA and reconstruction techniques renders the raw resolution fairly irrelevant in assessing the final image quality and composition. Native 4K is a waste. You can look through any PC benchmark to see why.
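To illustrate the problem, here's a toy sketch with made-up per-frame DRS values. Even if you could capture every frame, you'd still have to pick between several equally "true" numbers:

```python
# Why "what resolution does it run at?" stops being one number under DRS.
# These per-frame render heights are invented purely for illustration.

frame_heights = [2160, 1944, 1800, 1620, 1800, 2160, 1872, 1620, 2016, 1800]

lower = min(frame_heights)
upper = max(frame_heights)
average = sum(frame_heights) / len(frame_heights)

print(f"lower bound: {lower}p, upper bound: {upper}p, average: {average:.0f}p")
# Three different "resolutions" for the same game - and none of them captures
# what TAA / reconstruction does to the final image anyway.
```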
FPS - we are going to need to understand that many engines don't simply scale with framerate multipliers, so going from 30 to 60 isn't always a case of doubling the power - it often requires more. And since at 60 you would like to avoid V-sync being turned on (it reduces responsiveness and spoils some of the advantages of 60), you can't target exactly 60 - you need headroom. As we've seen with ACV, if you don't have enough headroom you get tearing. On PC, most people will either build in sufficient headroom or use VRR/FreeSync etc., but on console many people's TVs aren't new enough to support those features. So in reality you might need to target around 75 FPS to get a reasonably solid 60. The arithmetic is easier to see in milliseconds - see the sketch below.

Which means that, in many cases, if you take an Xbox One X version to Series X you will need to find savings in the render budget to hit that framerate - so the Series X 60 FPS version may well be technically worse-looking than the One X 30 FPS version. This is all true for GPU-bound games; CPU-bound games are a different story entirely, and one where framerates will be very heavily improved. But talking to a few devs, there are very, very few games that are truly CPU-bound in all instances.
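Here's that headroom argument in milliseconds (my own numbers, a sketch rather than anything from a dev):

```python
# A framerate target is really a per-frame time budget in milliseconds.

def frame_budget_ms(fps: float) -> float:
    return 1000.0 / fps

for target in (30, 60, 75):
    print(f"{target} FPS -> {frame_budget_ms(target):.1f} ms per frame")

# 30 FPS -> 33.3 ms, 60 FPS -> 16.7 ms, 75 FPS -> 13.3 ms.
# Halving the budget from 33.3 to 16.7 ms can need *more* than double the
# throughput, because some per-frame costs are fixed and don't shrink with
# resolution or settings. And a tear-free 60 without VRR means internally
# targeting ~75 FPS, i.e. a 13.3 ms budget - roughly 20% tighter than a
# bare 60, which is render budget that has to be clawed back elsewhere.
```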
So the bottom line here is that, whilst 60 FPS games are great, we need to accept some sacrifices to run them even on the PS5 or Series X. Those sacrifices come down to fidelity OR smoothness, and people need to decide which is more important to them. This is why 4K 30 modes matter: some people want the graphical upgrades and, as the PC benchmarks suggest, are unlikely to see them in the 60 FPS versions of games.
The reality of what these consoles are relative to PC is hopefully now becoming clearer.