The one we've all been waiting for (Digital Foundry) is up

Yeah that has to be it.

As for those games that have unlocked framerates and still go below 60fps sometimes, this is where VRR can help a lot, right? I remember how tremendously this helped Kingdom Hearts 3.

“There’s been some discussion - prompted by Mark Cerny no less - of how developers may struggle to scale workloads over many compute units, but in many of our tests, we saw point blank that this is exactly what is being delivered.”

I love this sentence from Richard. I would love for Richard to say point blank that Mark Cerny is talking absolute rubbish, because everyone knows parallelism is the name of the game in graphics development.

4 Likes

On one hand, I’m all for it and I’m still over the moon over Bethesda… on the other, damn it’s been a LONG time since a new Elder Scrolls came out. Personally, I love ESO, but I can admit it’s not the same by a country mile.

1 Like

I loved that line from Richard, and that comment from Cerny was from the original presentation earlier this year if I recall correctly… it was also precisely when I knew Sony was going to go all in on bullshit for their marketing strategy. At the lab where I work, we have many thousands of compute units that we scale machine-learning algorithms and biomolecular simulations across without too much trouble, so I’m pretty sure game developers who’ve already been doing the same on PC GPUs won’t have a problem.

1 Like

It was obvious BS from a mile away if you have even a little bit of tech knowledge. It was pure justification of a weakness.

3 Likes

I mean, I have no background in programming etc., but even I know that the whole point of graphics cards, and of programming for them, is that they are engineered to parallelize graphical workloads, and this has been the case since the dawn of the industry lol. Modern computer rendering would literally not exist if not for this fact. It’s not like games don’t scale when a new graphics card is released, or like Xbox One games, which were designed for consoles with 12 or 40 CUs, don’t automatically take advantage of the increased CU count of the Series X.
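To make that concrete, here is a minimal sketch in CUDA terms (a hypothetical `shade` kernel, nothing to do with any actual console toolchain or game engine) of the data-parallel pattern GPU work is built around: the same code simply gets spread over however many compute units the chip exposes, which is why a wider GPU picks up existing workloads without changes.

```cuda
// Minimal sketch, assuming a hypothetical per-pixel workload.
// The grid-stride loop is correct for any launch width, so the same kernel
// scales whether the device has 12, 40 or 52 compute units -- the hardware
// scheduler just distributes the blocks over whatever is there.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void shade(float* pixels, int n) {
    // Each thread handles every (gridDim.x * blockDim.x)-th pixel.
    for (int i = blockIdx.x * blockDim.x + threadIdx.x; i < n;
         i += gridDim.x * blockDim.x) {
        pixels[i] = pixels[i] * 0.5f + 0.25f;  // stand-in for shading work
    }
}

int main() {
    const int n = 1 << 20;
    float* pixels;
    cudaMallocManaged(&pixels, n * sizeof(float));
    for (int i = 0; i < n; ++i) pixels[i] = 1.0f;

    // Size the grid from the device's own multiprocessor count, so a bigger
    // chip automatically gets proportionally more blocks in flight.
    int smCount = 0;
    cudaDeviceGetAttribute(&smCount, cudaDevAttrMultiProcessorCount, 0);
    shade<<<smCount * 4, 256>>>(pixels, n);
    cudaDeviceSynchronize();

    printf("compute units: %d, pixel[0] = %f\n", smCount, pixels[0]);
    cudaFree(pixels);
    return 0;
}
```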

Honestly, when Mark Cerny tried to peddle that rubbish, he insulted even my intelligence, so God knows what actual developers felt lol. However, the worst part was that some developers actually tried to agree. I think they need to stop drinking the Sony Kool-Aid :joy:.

All because he didn’t want to admit that 36 CUs were chosen to ensure BC.

4 Likes

Cerny is much more of a PR man than many Sony fans will admit. MS has also out-engineered him twice in a row now.

6 Likes

In a time when COVID has delayed many new games on multiple platforms, great BC is even more important for a brand new console.

I hope for Sony users’ sake that their BC is up to par, because the paltry PS5 launch lineup alone is not going to be enough to justify the cost.

On its own, the Series X already seems justified as a major upgrade to existing titles, with better performance overall.

No. According to Sony fans, he never spins things as a positive and everything he says is gospel.

The fact of the matter is that Nvidia, AMD and Microsoft fundamentally disagree with Cerny’s assertion about shader core count. So either they are all wrong and Cerny is just smarter than everyone else, or he is wrong, and either he is too arrogant to admit it or he knows what he is saying is rubbish and needs to spin it as a positive. I will let other people be the judge of what the truth is.
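For what it’s worth, the back-of-the-envelope math behind the public peak-TFLOPS figures is simple (peak throughput only, not a claim about real-world performance): each RDNA 2 CU has 64 shader ALUs doing 2 FLOPs per clock via FMA, so the wider chip comes out ahead even at a lower clock.

```latex
\text{TFLOPS} \approx \text{CUs} \times 64\,\tfrac{\text{ALUs}}{\text{CU}} \times 2\,\tfrac{\text{FLOPs}}{\text{clock}} \times f_{\text{GHz}} \times 10^{-3}
% Series X: 52 CUs at 1.825 GHz
52 \times 64 \times 2 \times 1.825 \times 10^{-3} \approx 12.15\ \text{TFLOPS}
% PS5: 36 CUs at up to 2.23 GHz (variable clock)
36 \times 64 \times 2 \times 2.23 \times 10^{-3} \approx 10.28\ \text{TFLOPS}
```

The real debate is whether the narrower, higher-clocked design keeps its CUs better fed, not whether work can be spread across more of them.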

3 Likes

Spot on!

3 Likes

Very good and detailed video. They should also test Jedi Fallen Order; I would expect locked 60fps performance there.

They probably don’t recompile the shader code for RDNA2 optimizations like you get on PC.

1 Like

We’ll see if he’s right when the full teardown and DF face-off/performance analysis come out… Should be fun to watch.

2 Likes

Intel disagrees as well. The MHz wars ended in the early 2000s with multi-core processors, because pumping up clock speed started to hit thermal barriers, so you needed more cores to improve performance.

Also, the OG Xbox One had a faster GPU and CPU than the PS4 as far as MHz goes, and we saw how that went.

3 Likes

This makes me even more excited about everything. If IOI can patch Hitman 2 to take advantage of all of this, it will run SO much better.

1 Like

I haven’t seen such positive Xbox reception from the media since the 360 days. Incredible turnaround: Xbox has destroyed Sony on all fronts in terms of hardware, studios, BC, value, and messaging.

This should be a Harvard case study.

1 Like

I always wondered how Microsoft would get around the fact that RDNA is a different architecture than GCN, and we have our answer. I would like to mention that the IPC gains of RDNA are significant, to the point where the 5700 can go toe to toe with the Vega 64 - a 12.7 TF card.

GCN is not that different from RDNA. The instruction set is mostly the same IIRC, so there is binary compatibility (like on the CPU side). But on RDNA you might reorganize instructions, utilize the new 32-item wavefronts, and use some of the more advanced features that weren’t on Xbox One X, like more efficient instructions for packed data formats.
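A rough analogy in CUDA terms (a hypothetical reduction kernel, not actual GCN/RDNA shader code or anything from the Xbox toolchain): the hardware wave/warp width is a property of the chip, so portable kernels query it rather than hard-coding 64 or 32. GCN binaries running unmodified on RDNA are in a similar position: they still work, they just don’t exploit the newer wave32 mode or packed-math paths unless recompiled for them.

```cuda
// Minimal sketch, assuming a hypothetical block-wide sum.
// Nothing in the kernel assumes a particular wave width, so it runs
// correctly on any device; only a recompile/re-tune would exploit
// hardware-specific features such as a different wave size.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void blockSum(const float* in, float* out, int n) {
    extern __shared__ float scratch[];
    int tid = threadIdx.x;
    int i = blockIdx.x * blockDim.x + tid;
    scratch[tid] = (i < n) ? in[i] : 0.0f;
    __syncthreads();

    // Classic tree reduction over the block (blockDim.x is a power of two).
    for (int stride = blockDim.x / 2; stride > 0; stride >>= 1) {
        if (tid < stride) scratch[tid] += scratch[tid + stride];
        __syncthreads();
    }
    if (tid == 0) out[blockIdx.x] = scratch[0];
}

int main() {
    cudaDeviceProp prop;
    cudaGetDeviceProperties(&prop, 0);
    // warpSize is 32 on NVIDIA hardware; AMD's GCN used 64-wide waves and
    // RDNA added a 32-wide mode, which is the difference being discussed.
    printf("hardware wave width on this device: %d\n", prop.warpSize);

    const int n = 1 << 16, block = 256;
    float *in, *out;
    cudaMallocManaged(&in, n * sizeof(float));
    cudaMallocManaged(&out, (n / block) * sizeof(float));
    for (int i = 0; i < n; ++i) in[i] = 1.0f;

    blockSum<<<n / block, block, block * sizeof(float)>>>(in, out, n);
    cudaDeviceSynchronize();
    printf("first block sum: %f (expect %d)\n", out[0], block);

    cudaFree(in);
    cudaFree(out);
    return 0;
}
```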

Just by sheer power. Amazing. This thing is really a beast!

It is really promising for the future.

1 Like

I just thought of a game that would be super interesting to test and would probably bring the Series X to its knees.

Ark, that game runs like poo on everything lol

1 Like