Games Analysis |OT| Time To Argue About Pixels And Frames!

Very good point.

I’m a bit surprised that Arkane didn’t get it running better than on PS5. Parity is not what I expected; native 4K is what I was expecting for Deathloop. Unless Arkane wasn’t even allowed to work on the Xbox version until recently, but that doesn’t make sense, not even for Sony contracts/deals, lol. So they had a lot of time to optimize the XSX version.

It’s not full parity: XSX performs better than PS5 in some modes, but not by much.


Not sure… but hopefully it doesn’t mean they misspoke lol

Because it literally doesn’t matter. Just because the XSX has more in the GPU doesn’t mean they necessarily have to spend extra hours tapping into it for a few more pixels or frames that no one would really notice anyway. Plus, they had to split their time across versions, including the Series S, which they did a great job on, it seems!

While the XSX will outperform the PS5 in most instances, and that’s been reflected in many multi-platform games, there are some scenarios where both systems will be closer to parity. It’s doubtful that Arkane only just now got access to XSX dev kits, especially considering the other team at the studio has been working on Redfall. Also, to be fair, the XSX version does slightly outperform the PS5 version when frame rates drop below their targets, so the extra power is being utilized.

The gap should become more apparent as future games grow more complex in their rendering pipelines, and while Deathloop is artistically interesting, I wouldn’t call it a complex title in terms of rendering. So depending on where the bottlenecks are, the results could make sense, if only we had access to their profiling tools.


I’m fine with them porting the game and getting going on the next game.


That’s true… but it is a surprise to me that a 5-10 fps lead isn’t considered noteworthy anymore.


I agree mate, but I think it’s more down to the team no doubt being busy on a new project, and Team Xbox being more worried about getting the likes of Redfall out the door than about putting major resources into trying to outdo a PS5 port.

Maybe I’m on my own here, but I don’t like Deathloop at all, and I’m a bit worried since I didn’t like Youngblood either. I don’t really have high hopes for Arkane’s next game myself.

There is a little bit of competition at play here. They know AMD cannot compete at the top end. However, they are not saying anything all that different from what MS told DF in the build-up to Series X and at Hot Chips: that smaller nodes are getting increasingly expensive.


I thought I read rumors suggesting AMD would be competitive at the top end, maybe even surpassing Nvidia. Could be wrong since I don’t follow it extremely closely.

It is true, but against Intel’s CPU lineup, and in supercomputers against Nvidia.

The top-end consumer GPU market is with Nvidia right now.

Isn’t Nvidia at like 90% market share in GPUs, or something that ridiculous, anyway? I think they are the top dog and know their clientele will pay big bucks to get the latest and greatest. At least EVGA has shown they won’t play Nvidia’s game and will stop selling their cards; hopefully others follow suit so Nvidia changes how they operate, but I somehow doubt that will happen.

Arguably EVGA (and other vendors) lose more from it than Nvidia.

There’s a chance they’ll be competitive in standard rasterization. The RDNA2 line of cards was very competitive, and for many people I’m sure that’s enough. However, it’s usually when RT comes into play that AMD is still a ways behind.


This. If you have no interest in cutting-edge RT stuff, the AMD lineup is very competitive in rasterization.


And the DF article for those without the time to watch the entire video:

Some additional comments from Alex/Dictator on the different nature of artifacts posted at B3D: Nvidia DLSS 3 antialiasing discussion | Page 14 | Beyond3D Forum


Nvidia is taking on high-end CPUs with their GPUs now.

This is not what I expected from DLSS 3.

This is totally mind-blowing.

Gamers can eventually save money on their CPUs now.

I still reserve judgment until there are more reports on how the game feels. I’m not entirely sold on seeing 60 fps but still feeling like 30 fps because of the artificial frame generation.

Some of the artifacts are cringe too. Though Alex did say some are not perceptible because it’s only every other frame and not sustained across multiple frames.

At least it’s something new, and everything has to start somewhere, so it will be interesting to see how it progresses.


This frame generation is a 1.0 in its own realm. I am sure it will have some downsides, just like DLSS 1.0 did back then.

DLSS 4 will improve upon it just as DLSS 2.0 did upon DLSS 1.0.

This is truly fascinating. The GPU is fixing the CPU bottleneck. And here we thought asset streaming with the GPU was the big thing.
