Expectations of the new generation vs reality

So I have been disappointed with what the next gen has served up compared to my expectations. Not just with the XSX, but also with some of the PS5 games.

I have already outlined a few XSX games I have been disappointed in, but on the PS5 side, Godfall is really disappointing too. It runs at an average resolution of just 1350p in its 60fps mode. It also appears they removed ray tracing from the game after hyping it as an RT marvel for the PS5.

So back to my expectations.

First I have some questions.

If you have a game running at 4k30, will you require twice the power to double the frames?

Like say a 10 TFLOP card gives you 30fps, and assume it’s not CPU limited in any way; would you need a 20 TFLOP card to run the game at the exact same graphics settings but at 60fps?

What about resolution? If I am running a game at 1440p/60fps and I want to run it at native 4K/60, would I require twice the GPU power? (Strictly, 4K is 2.25x the pixels of 1440p, not double.)

I see a game running on Xbox One X, with its gimped CPU, getting say 1440p at 30fps. The Series X has at least twice the GPU power (RDNA 2 tflops vs GCN tflops) and four times the CPU power, so I fully expect that the next gen should be able to run that game at 4K/60, no problem.
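Here’s the back-of-envelope math behind that expectation (a naive sketch that assumes GPU cost scales linearly with pixels rendered per second, which the replies below suggest it doesn’t):

```python
# Naive linear-scaling estimate: GPU cost ~ pixels rendered per second.
def pixels(width, height):
    return width * height

p_1440 = pixels(2560, 1440)   # 3,686,400 pixels
p_4k   = pixels(3840, 2160)   # 8,294,400 pixels

# 4K is not double 1440p: it is 2.25x the pixel count.
print(p_4k / p_1440)                  # 2.25

# Going from 1440p/30 to 4K/60 means 2.25x the pixels at 2x the frames,
# so roughly 4.5x the pixel throughput under this naive model.
print((p_4k * 60) / (p_1440 * 30))    # 4.5
```

So the question really is whether ~2x the effective GPU power plus the architectural gains can cover a ~4.5x jump in pixel throughput.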

Am I correct, or am I wrong?

Just get a PC if this stuff bothers you.

3 Likes

The answer is that not every game scales the same, and not every engine scales the same. One might think that resolution and framerate scale linearly with power, but that is not always the case. You should also keep in mind that the consoles’ reported TFLOP numbers often make people line them up against specific PC hardware, BUT that only accounts for raw compute power, not the overall architecture of the SoC vs a PC’s dedicated GPU and CPU. Consoles make up, to an extent, for many of the PC’s advantages (larger caches, faster and larger memory pools) by letting devs code specifically for the machine (though we are perhaps already seeing the GDK move away from this somewhat).

But for example on Assassin’s Creed you were seeing PC benchmarks where an 80% more powerful GPU (based on TFLOPs) was only giving you 40% more frames at the same resolution and settings.
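To put a number on that benchmark (the 80%/40% figures are the ones quoted above; the efficiency calculation is just a sketch):

```python
# How much of the extra paper compute actually turned into frames.
tflops_gain = 0.80   # 80% more raw compute (from the benchmark above)
fps_gain    = 0.40   # 40% more frames at the same settings

scaling_efficiency = fps_gain / tflops_gain
print(f"{scaling_efficiency:.0%} of the TFLOP advantage showed up as frames")  # 50%
```

The other half of the paper advantage was lost to bottlenecks elsewhere in the pipeline.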

The thing to keep in mind is that a GPU is more than just its TFLOP number, so there are other constraints too: memory size, memory bandwidth, cache management, etc.

You’ve also got to consider that you’re often comparing AMD to NVIDIA architectures. Early benchmarking of PS5 and Xbox Series X ray tracing puts them roughly equivalent to a 2060 Super, way below what the raw TFLOPs of both machines would suggest, BUT AMD’s ray tracing will be less performant than NVIDIA’s simply down to the hardware differences on each chip.

As a general rule of thumb, I’d say this gen you can expect plenty of games that run dynamic res up to 4K/30 on the One X to run at the same or similar settings and res on the Series X but at 60fps. That’s probably a reasonable benchmark, especially for the next few years. Games that take advantage of ray tracing in a big way will almost certainly run at 30fps, and many likely way below native 4K rendering.

BUT the resolution games run at now is almost irrelevant, because, as has been pointed out to you, VRS, TAA and a variety of other techniques mean the internal render resolution is a poor guide to the end image on screen, and indeed often hard to accurately nail down, full stop. IF, for example, AMD get their ML-based DLSS competitor going on consoles, I’m willing to bet you’ll see lots of games render at 1080p-1440p max and use reconstruction to get to 4K, and still look better than a 1600p native image. That’s the thing about the future: resolution doesn’t tell the story.
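A rough sketch of why reconstruction is so attractive, assuming shading cost scales with internally rendered pixels and ignoring the (real but smaller) cost of the reconstruction pass itself:

```python
# Shading cost of common internal resolutions relative to native 4K.
def pixels(w, h):
    return w * h

native_4k = pixels(3840, 2160)

for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440)}.items():
    share = pixels(w, h) / native_4k
    print(f"render {name}, reconstruct to 4K: {share:.0%} of native shading cost")
# 1080p -> 25%; 1440p -> ~44% of the native 4K shading work
```

That freed-up 2-4x of shading budget is what gets spent on RT, richer settings, or hitting 60fps.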

1 Like

The PS5 being a 2x performance multiplier over the Pro has been observed for a while. Even in the first showings, games essentially either went 4K native at 30fps or around 1440p/60 at similar settings.
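Those two modes are nearly the same amount of raw work, for what it’s worth (same naive pixels-per-second caveat as earlier in the thread):

```python
# Pixel throughput of each mode; they land within ~12% of each other.
mode_4k30    = 3840 * 2160 * 30   # 248,832,000 pixels/s
mode_1440p60 = 2560 * 1440 * 60   # 221,184,000 pixels/s
print(mode_4k30 / mode_1440p60)   # ~1.125
```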

The Series X coming in even below the PS5 is a surprise.

But that doesn’t mean it’s the full potential of either machine. Launch games rarely exploit a console’s potential, let alone games that have been made in a work-from-home environment where even getting the game out on time has been an immense challenge.

It sucks, but keep in mind this is the equivalent of BF4 running at 720p and ranging from 40-60fps on the Xbox One, not the equivalent of, say, BF1 and BFV on the same console. So it will improve a lot (and hopefully quite fast too).

1 Like

The thing is that the output tends to improve because tools improve and devs’ time and knowledge improve, but also because smart cutbacks are found on console: last gen it was far-away objects only refreshing at half framerate, etc.; this gen it’s VRS or things like the reduced physics in ACV.
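The half-framerate trick for distant objects is conceptually very simple; here’s a toy sketch of the idea (the threshold and names are made up for illustration):

```python
# Toy LOD-style update throttling: far objects tick every other frame.
FAR_THRESHOLD = 100.0   # illustrative distance cutoff, in world units

def should_update(distance, frame_index):
    """Near objects update every frame; far ones only on even frames."""
    return distance < FAR_THRESHOLD or frame_index % 2 == 0

for frame in range(4):
    print(frame, should_update(30.0, frame), should_update(250.0, frame))
# The near object updates 4 times; the far one only twice.
```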

The consoles this time are probably not going to significantly punch above their PC equivalents, IMHO, because the architecture is so common across them: RDNA 2 features are baked in on both Xbox and PC, and while there are some custom things going on, those will only offset the hardware deficits a console SoC suffers compared to a PC equivalent. That’s not a bad thing anyway, because the Series X is pushing some serious numbers, but from what I’ve seen and heard, this is probably a gen too early for ray tracing to be a serious and consistent thing as we push into the gen on console.

I’d take a good illumination system over ray tracing right now. RT feels like something more suited to next gen, not now, when you could put better tech into those GPUs instead.

But don’t get me wrong, I would love to see RT if it doesn’t frequently compromise 60fps, so I’ll gladly take 1440p/60fps with RT if that’s what it takes.

1 Like

The thing that makes me laugh is that every time someone sees high-quality SSR in, say, Cyberpunk gameplay or COD (its RT is shadows only), they start jumping up and down about RT…

It’s probably, as you say, not worth the hit on framerate, and it’s not like you couldn’t improve lighting and reflections in other, less demanding ways.

To date, no game has come close to having the kind of GI that Red Dead Redemption 2 has, and that didn’t use any hardware RT. I’m personally more excited for the SDFGI that engines like Godot and Unreal are working on at the moment. The recent UE5 demo showed off a bit of that; Lumen seems to be their version of SDFGI plus a few of their own additions.
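For anyone wondering what the “signed distance field” part of SDFGI means: a toy sketch of sphere-tracing an SDF, which is the basic primitive these techniques march through to approximate visibility (this is just the general idea, not any engine’s actual implementation):

```python
import math

def scene_sdf(x, y, z):
    """Distance to the nearest surface: here, a unit sphere at the origin."""
    return math.sqrt(x*x + y*y + z*z) - 1.0

def sphere_trace(ox, oy, oz, dx, dy, dz, max_steps=64, eps=1e-3):
    """March along a ray, stepping by the SDF value (always a safe step)."""
    t = 0.0
    for _ in range(max_steps):
        d = scene_sdf(ox + dx * t, oy + dy * t, oz + dz * t)
        if d < eps:
            return t      # hit: something occludes the ray at distance t
        t += d
    return None           # miss: the ray escaped, no occlusion

print(sphere_trace(0, 0, -3, 0, 0, 1))   # ~2.0: hits the sphere's near face
print(sphere_trace(0, 2, -3, 0, 0, 1))   # None: passes above the sphere
```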

My advice: stop watching Digital Foundry videos and stop going on Twitter and reading others moaning about the same stuff. Just play the games, not the fps/res. I saw a lot of people quietly playing and enjoying their games right up until the comparisons came out; now everyone is moaning on both sides about X and Y. I genuinely think some people are actually upset after being happy beforehand, which to me is weird.

It’s bloody launch. I wish people would just be happy we have brand new epic consoles this year, the year where many people have died and been locked up in their houses because of you know what. We still managed to have a “normal” hobby that hasn’t been as affected as others.

Games will get better over time; launch games have NEVER been the gold standard of a next-gen console. It’s probably going to take until '22 before we start to see the nuts showcases of the true power of these next-gen consoles.

Wish people would stop being so downbeat and negative over this cool ass shit we got, my 2c.

1 Like

It annoys me that people are having a great time until they see a video showing that a console they don’t own runs a game 5 frames faster in one instance, and then suddenly the game is unplayable and a technical mess and they’re spamming developers and Xbox with complaints. Like you say, enjoy what you have; if you notice issues, then raise them, but don’t go looking for something to be concerned about.

1 Like

That’s true. And with DLSS and other reconstruction tech now being prominently used on PC, whereas older reconstruction tech used to be ignored there, even that will no longer be an advantage for consoles.

But they are starting out underperforming even compared to GPUs below their specs, so there’s big room to improve on that front as well. And spec-wise they are still well positioned versus where the PS4 and especially the Xbox One were.

I’m just happy we got 60fps modes for most games. I’d be happy with 1080p-1440p@60fps for the rest of the gen tbh.

The most disappointing thing is the VRAM capacity.

Interestingly, both consoles have their own methods to tackle this problem. But whether those will be effective remains the question for now. Let’s see how these consoles age in this respect.

The other major blow is RT. The way this technology is moving forward, these consoles might not age well, and devs will have a hard time once RT becomes a common thing on PC.

I think the biggest problem for these consoles is the use of AMD. AMD is great for cheap computation and graphics, but AMD is miles behind Nvidia on RT and DLSS. The problem is that Nvidia was brutal for both Sony and MS to work with and costs far more to incorporate into consoles; if either had gone with Nvidia, the machines would have been so much more expensive. Also, these AMD GPUs are more comparable to the 20-series cards, which are a gen behind the current 30-series cards that absolutely smoke them. As such, the RT and DLSS-style upscaling these consoles can manage will be rudimentary by comparison.

If you want native 4k/60 you need to get a PC.

I’m enjoying 60fps - resolution doesn’t really matter. JFO is gorgeous in the performance mode

2 Likes

There are a lot of complex elements that muddy the waters on expectations vs reality here, I think. If the target is better graphics, we will get it, but each new gen brings longer windows of cross-gen support. At first that was mainly 3rd parties, but now Sony is doing it for 1P too. Couple that with poorly timed dev-tool development on Xbox’s side and the timelines for XGS releases being mostly year-2 stuff, and I can see the disappointment.

OTOH, we should not always look to graphics alone. DualSense has some fantastic-sounding next-gen haptics, and Quick Resume is incredible even in its infancy. Super fast loading and extremely fleshed-out BC are also uniquely impressive this gen. So there is a lot of QoL stuff being nailed down better this gen compared to previous launch periods.

All that said, the most next-gen feature of any console in MANY cycles is machine learning and its potential for gaming applications, imho. Everyone is hung up on DLSS (which is legit amazing), but that is the low-hanging fruit here. If you wanna get an idea of what can be done, check out this channel: https://www.youtube.com/user/keeroyz

The area holding back photorealism in visuals is not graphics but animation and physics. Demon’s Souls gets praise for having nice destruction physics, but the standards we use to judge that are literally 20 years old at this point. In actuality, its destruction is horribly outdated and looks awful compared to anything resembling realism (e.g. swinging a sword at a sturdy table should never shatter it into its component parts, lol). There are major issues with animation too: in most games, characters open doors without even animating to open them. It’s the small stuff that makes the biggest difference. A positive example is Gears 5, where characters have pupils that dilate based on lighting, actors squint and react to light, and characters yell louder to speak over the loud noises in the environment… stuff like that makes games feel way less gamey.
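Something like the Gears 5 pupil behaviour is a good example of how cheap these touches can be; a toy sketch (my own made-up curve and constants, not how The Coalition actually did it):

```python
def target_dilation(luminance):
    """Bright light -> constricted pupil (0), darkness -> dilated (1)."""
    return max(0.0, min(1.0, 1.0 - luminance))

def update_pupil(current, luminance, dt, speed=4.0):
    """Ease toward the target so the eye visibly adapts over a beat."""
    target = target_dilation(luminance)
    return current + (target - current) * min(1.0, speed * dt)

pupil = 0.5
for _ in range(5):                        # character steps into bright sun
    pupil = update_pupil(pupil, luminance=0.9, dt=1 / 60)
    print(f"{pupil:.3f}")                 # dilation shrinks frame by frame
```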

This “graphics > all else” skeptical attitude develops every cycle now: the new features are pushed aside in wave-1 titles entirely to service cross-gen audiences, and everyone responds by finding it lackluster… only to be blown away at the subsequent E3 when the wave-2 games are finally shown off leveraging this stuff.

I knew at the start it was going to be like upgrading a PC. I’m happier than I thought I would be, because my Series X is silent and Quick Resume is really nice.

I got my first “next-gen feel” last gen from Fallout 4; I think it’s going to be a while until I feel that on gen 9. Cross-gen is going to last a while.

I think Nvidia is not the first choice (and may never be) because they can’t provide an SoC solution.

AMD has both CPU and GPU architectures, which are incorporated into an APU; Nvidia only has the GPU.

In the future, things might change if either company decides to make an ARM-based SoC. But that’s highly unlikely.

Well, NVIDIA has agreed to buy ARM. I’d be curious what kind of performance you could get out of a solution based on what Apple has done.

Aside from the Nintendo Switch, an ARM processor hasn’t been used in a graphics-first SoC yet.

Things could get very interesting. We may see the first full-fledged ray-tracing hardware on ARM come from Nvidia itself.

The most interesting thing will be the thermals.