So what's up with the Series X underperforming at launch?

I mean, he mentioned X settings compared to PC in previous AC games. The X is close to base console settings; basically only draw distance and LOD transitions are improved over them.

Since I don't think they have drastically changed it for Valhalla, I'm assuming it's still the same, but he does mention that the SX and PS5 are a perfect match for X settings.

Which brings me to this:

If VGTech is right, the game on the X has the same resolution range as the PS5 and SX (1440p to 2160p), but unlike them it does manage to hit native 4K counts, and it has a higher average resolution too.

So that's another point: a 5700 consistently delivers >2x X performance in games (there's a DF video where they showed this across quite a few games, and it can also be observed in other benchmarks). And both the SX and PS5 are failing to deliver that under the same settings (because the framerate doubled but the resolution decreased), which I think is quite bizarre for the SX because it manages to double X performance even in BC (which gets none of the RDNA 2 IPC improvements).
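To put numbers on the "framerate doubled but resolution decreased" point: the effective performance multiplier is roughly the fps ratio times the average pixels-per-frame ratio. A minimal sketch, with made-up placeholder resolutions rather than VGTech's measured averages:

```python
# Rough pixel-throughput comparison: the performance multiplier is roughly
# (fps ratio) x (average pixels-per-frame ratio). The resolutions below are
# hypothetical placeholders, NOT VGTech's measured averages.

def pixels(width: int, height: int) -> int:
    return width * height

# Hypothetical: last-gen console at 30 fps with a higher average resolution
# vs. a next-gen console at 60 fps with a lower average resolution.
one_x = 30 * pixels(3456, 1944)      # assumed ~1944p average
next_gen = 60 * pixels(2844, 1600)   # assumed ~1600p average

print(f"effective multiplier: {next_gen / one_x:.2f}x")  # ~1.35x, short of 2x
```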

Yes, that's assuming the settings of the SX and PS5 versions are the same as the One X version, which is yet to be confirmed. But it doesn't really matter; the point is that both consoles are performing at 5700 XT level or worse.

Around the 4 minute mark in the DF video.

They mentioned that unlike BF4, which pushed higher-quality settings and new effects on PS4 and Xbox One, this time they just doubled the framerate compared to the X while using the same settings, with the exception of the Series S, which has lower-quality shadows and shorter LOD transitions.

Kirby is an indie dev. He doesn't even have access to the most recent GDK. His point highlights a theory I had yesterday. This theory was based on DF's original Series X hardware reveal. SFS and all that Velocity stuff is supposed to offload work from the CPU. (PS5 has its own method of doing the same thing in terms of freeing up the CPU.) The launch games we're seeing are not applying SFS. Therefore the CPU is being bogged down. My guess is it's doing a lot of decompression it wasn't intended to do. In games with a large amount of texture data, it's probably hitting the Series X hard.
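Just to illustrate the general idea of why software decompression bogs a CPU down, here's a toy sketch; the zlib round-trip is purely illustrative and has nothing to do with the actual formats or hardware decompression paths the consoles use:

```python
# Toy illustration: decompressing asset data in software costs real CPU time
# that dedicated decompression hardware would otherwise absorb. zlib here is
# purely illustrative; the consoles use their own formats and hardware paths.
import time
import zlib

# Build a compressible dummy "texture" payload (~67 MB of repetitive data).
payload = (b"texture-block-" * 64) * 75_000
compressed = zlib.compress(payload, 6)

start = time.perf_counter()
zlib.decompress(compressed)
elapsed = time.perf_counter() - start
print(f"decompressed {len(payload) / 1e6:.0f} MB "
      f"in {elapsed * 1000:.1f} ms of CPU time")
```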

A few things: to me it's quite obvious the draw distance is better on the new consoles vs the One X, and I think the shadow quality too (from memory).

Which means direct multiplier comparisons are hard to make.

But factor in this…

A 13.45 TFLOP RTX 2080 Ti at 1440p ultra gets an average framerate of 67, yet a 7.45 TFLOP RTX 2070 at 1440p ultra gets an average framerate of 50.

So for close to double the raw power you are nowhere near double the framerate; performance isn't scaling linearly.
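Running the arithmetic on those figures:

```python
# Ratios from the figures quoted above: ~1.8x the raw compute buys only
# ~1.34x the framerate at 1440p ultra.
tflops_2080_ti, tflops_2070 = 13.45, 7.45
fps_2080_ti, fps_2070 = 67, 50

print(f"compute ratio:   {tflops_2080_ti / tflops_2070:.2f}x")  # 1.81x
print(f"framerate ratio: {fps_2080_ti / fps_2070:.2f}x")        # 1.34x
```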

Kirby says a lot of things and they don't pan out. I find it nonsensical that Xbox would have built their whole system around SFS when very few multiplat devs will utilise it. For example, before launch he claimed the SX was far easier to develop for, even though we had other reliable sources all saying this was not the case.

Also it makes no sense, since the dips we are seeing are clearly GPU-bound. I think I'd treat all this stuff with caution. We knew before launch that Xbox tools were later and less final, and that devs were having a harder time with them. This isn't a shock; DF were tipped off that multiplat games at launch would be better on PS5 and had been hinting at this for months.

Over time the tools will be refined and devs will become more accustomed to them.

From what I could see it's mostly pop-in and frame drops from loading off the disk without ANY feedback. So sometimes you get a miss, and that miss means it either loads a texture that is too big or one that arrives too late to be used properly (pop-in). In the first case you get a frame drop; in the second you get a pop-in, or a frame drop because the timing is off.
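A crude conceptual sketch of what streaming with vs without feedback means (this is just the logic of the idea, not the actual DirectX 12 Sampler Feedback API, which is a GPU-side feature):

```python
# Conceptual sketch only: without feedback, the streamer has to guess which
# mip level the GPU will sample. A wrong guess means either wasted bandwidth
# (too-big texture -> frame drop) or a late texture (pop-in). With sampler
# feedback, the GPU reports exactly which mip it actually touched.

def stream_without_feedback(guessed_mip: int, sampled_mip: int) -> str:
    # Lower mip index = bigger texture.
    if guessed_mip < sampled_mip:
        return "loaded a bigger mip than needed -> wasted bandwidth / frame drop"
    if guessed_mip > sampled_mip:
        return "needed mip not resident -> pop-in while it streams in"
    return "lucky guess -> no hitch"

def stream_with_feedback(sampled_mip: int) -> str:
    # The feedback map tells us the exact mip that was sampled.
    return f"request mip {sampled_mip} only -> minimal I/O, no guesswork"

print(stream_without_feedback(guessed_mip=0, sampled_mip=2))
print(stream_with_feedback(sampled_mip=2))
```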

The PS5 can brute-force this a bit more due to how its solid state drive works (higher peak speeds; whether it maintains those speeds remains to be seen), but the Xbox Series X is reliant on that Sampler Feedback Streaming (the Series S doesn't seem to have this problem as much, probably due to smaller textures). But really the differences are so minute that this is making a mouse into a Brontosaurus.

People should relax a bit and realize most people and families won’t care about these kinds of things. They just want to play a game.

I don't think only a "few" will utilize it; it will probably be almost a press of a button for most once it's implemented in engines like Unreal Engine or Unity. Fine-tuning is of course needed for the best results, but the default feedback will already help a lot.

It's also utilized on PC, so it will be more widely used than you think.

Sorry if it’s old news. But I saw this on Tweakers, a Dutch site and forum.

You don't see devs using SFS? Even the DF Series X reveal made it sound significant to the overall system's balance and efficiency. If 3rd-party devs don't use it, I expect balancing issues with the hardware. Again, I'm not a dev or qualified to give a lecture on it, but both Microsoft's own explanation and DF's made it seem like the CPU and memory setup rely on its use.

I dunno, to me it doesn't seem like SFS will offload the CPU. The way he's describing it, it seems more like the GPU expects SFS to be used at all times, and since the games aren't using it, MS is likely doing something in the driver that has a CPU impact.

It wouldn't explain all the performance issues (some of them are clearly GPU-bound scenarios), but it could explain why the SX is having more trouble keeping the fps high in 120 fps games with low resolutions.

The dips in Valhalla are GPU-bound, but in DMC5's 120 fps mode the SX has a performance advantage, except for some areas where the fps tanks for no apparent reason, which indicates that there's something very wrong with the CPU performance too.
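One rough way to tell those cases apart (just a heuristic I'm assuming from how dynamic resolution behaves, not DF's actual methodology): if the fps drops while the dynamic resolution is already at its floor, the GPU is the suspect; if the resolution is still high and the frame is late anyway, something CPU-side is holding things up. A sketch:

```python
# Rough heuristic, assuming we can see the dynamic resolution scale the game
# picked for a given frame (hypothetical data, not DF's actual methodology).

def classify_dip(fps: float, target_fps: float, res_scale: float,
                 res_floor: float = 0.5) -> str:
    if fps >= target_fps:
        return "no dip"
    if res_scale <= res_floor:
        return "likely GPU-bound: resolution bottomed out and it's still slow"
    return "likely CPU-bound: resolution headroom left, frame was late anyway"

print(classify_dip(fps=48.0, target_fps=60.0, res_scale=0.5))   # GPU suspect
print(classify_dip(fps=95.0, target_fps=120.0, res_scale=0.9))  # CPU suspect
```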

Hmmm, it looks like the XSX right now is not good with high frame rate modes.

I mean, right now they won't. As engines incorporate it they will, but that's a long way off yet, and even then we know these features take time to come to fruition. I just think the Xbox dev environment is a bit behind, nothing more than that.

It'll be interesting to see what happens with Cyberpunk, for example.

Pretty much most cross-gen titles are having some issues on the Series X.

Hmmm, is the CPU running without SMT for cross-gen titles?

Looks solid in MP though.

Think they’ve said they have more optimisation work to do on it.


Yeah, but this is normal for MP; they prioritize frame rates over everything.

I mean, 120 FPS is pretty much mainly of benefit to MP; in campaigns it's just a bonus. These games have just had their frame cap raised like you would on PC, and you'd have similar dips on a PC, so I guess it's a question of them optimising them. Doesn't really seem like much of an issue here though.


Hmm, the frame rate is all over the place; it reaches all the way down to 60 FPS. This is the same as DMC5 SE, right?

I think if your TV doesn't have VRR, then this mode is not good for you.

I just tested it yesterday and I have to say VRR works wonders here. Halo never felt so smooth.
