Games Analysis |OT| Time To Argue About Pixels And Frames!

Not really, because once the data is in memory the GPU already knows how to read only the portions of the textures that are needed instead of reading them entirely.

In fact, that's precisely what SFS does. It reads back from the GPU what was sampled and what was not, so the data that went unused can be discarded.

Basically, the GPU has been able to read just what it needs from memory for the longest time, but that process was a black box.

Then MS added Sampler Feedback in DX12U, which is hardware that lets you peek into what was read by the GPU far more efficiently.

And on top of that, on the SX they added Sampler Feedback Streaming, which uses this data to stream and unload assets more aggressively, since you know exactly what is read and at what quality.

But as you can see, it only affects what's stored in memory and has no effect on GPU bandwidth, as the whole process is dictated by what the GPU reads. (And the bandwidth needed to load these assets into place is largely insignificant and won't affect anything.)
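If it helps to make that concrete, here's a toy C++ sketch of the kind of residency decision that feedback data enables. To be clear, this is not the real D3D12 API (the real thing goes through opaque DXGI_FORMAT_SAMPLER_FEEDBACK_MIN_MIP_OPAQUE resources); the tiny 4-region map and every name here are made up purely for illustration:

```cpp
// Toy model of an SFS-style residency decision -- NOT the real D3D12 API.
// Assumption: each texture has a small "min mip" feedback map in which the
// GPU recorded the most detailed mip it actually sampled per region last
// frame (lower number = more detail), or kNotSampled if untouched.
#include <array>
#include <cstddef>
#include <cstdio>

constexpr int kNotSampled = 0xFF;

// Bring residency in line with what the GPU said it used: stream in finer
// mips where needed, evict detail (or whole regions) that went unsampled.
void ApplyFeedback(const std::array<int, 4>& minMipFeedback,
                   std::array<int, 4>& residentMip)
{
    for (std::size_t i = 0; i < residentMip.size(); ++i) {
        const int wanted = minMipFeedback[i];
        if (wanted == kNotSampled)
            residentMip[i] = kNotSampled;   // never sampled: drop the region
        else if (wanted < residentMip[i])
            residentMip[i] = wanted;        // stream in the finer mips
        else if (wanted > residentMip[i])
            residentMip[i] = wanted;        // evict mips finer than what was used
    }
}

int main()
{
    std::array<int, 4> residentMip{3, 3, 3, 3};              // mip 3 everywhere
    const std::array<int, 4> feedback{0, 2, kNotSampled, 3}; // the GPU's report

    ApplyFeedback(feedback, residentMip);
    for (std::size_t i = 0; i < residentMip.size(); ++i)
        std::printf("region %zu -> resident mip %d\n", i, residentMip[i]);
    // Region 0 streams in mips 2..0, region 1 streams in mip 2, region 2 is
    // dropped entirely, region 3 keeps exactly what it had.
}
```

The point being: the streaming side only moves modest amounts of data around in response to what the GPU reported. It doesn't change how the GPU itself consumes bandwidth.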


I don't think you have any friggin idea about how a unified memory architecture works. So you are telling me that memory bandwidth doesn't affect how fast data gets loaded from the SSD? I don't think that's how it works, mate, especially in a DMA-like architecture. It's very difficult to imagine that unified memory with 560 GB/s of bandwidth will load data from the SSD at the same rate as 160 GB/s RAM in the context of DirectStorage.

@Kirby0Louise Maybe you can enlighten us and end this discussion once and for all. Tell us how Azure devs have FAILED to design the XSS to perform on par with 1440p gaming on the One X.

Excuse me, I have no idea about them or their qualifications, but if I'm not satisfied with anyone's explanation, it's within my rights to engage in further healthy discussion, even with someone of really high stature. Btw, there was absolutely no need for that condescending attack, mate. :slightly_smiling_face:

Compare SSD bandwidth to RAM bandwidth. It's around two orders of magnitude slower. Bus contention from CPU access should be the more serious issue, I imagine.
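To put rough numbers on it (going from the published specs, from memory, so double-check me): the Series X SSD is 2.4 GB/s raw, ~4.8 GB/s with compression, against 560 GB/s on the fast 10 GB memory pool:

4.8 / 560 ≈ 0.9%

So even a fully saturated, decompressed stream eats under 1% of memory bandwidth. On the Series S (224 GB/s on the 8 GB pool) it's still only ~2%. The CPU hammering the shared bus is a far bigger contender for bandwidth than the SSD will ever be.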

Where have Microsoft engineers stated that they designed the Series S to be on par with a 1440p One X? They have told us numerous times it's designed to play Series X games at significantly lower resolutions. Which it does.


Your issue is that you think anyone is saying MS failed at anything, when that's never been indicated, hinted at, insinuated, or said. You won't be satisfied because you have a pre-conceived notion of how this works, while others are just trying to do their best to explain things for people to understand.

There’s no need to take things to extremes. Some of us just want to talk tech, not try and push a narrative around their favorite brand.


Why would it be an issue? The Series S was never advertised for 4K gaming. It's another thing entirely that cross-gen, unoptimised titles usually take up more bandwidth than is absolutely necessary. The One X was advertised as a 4K30 machine, whereas the Series S was instead designed for achieving next-gen visuals at 1440p 60/120fps. Those 1440p visuals on the XSS will definitely look on par with or better than the same resolution on the One X when properly optimised, which is what I believe.

Speak for yourself, mate, as there has been no hostility towards anyone from my side.

By default, we are all more hostile than KageMaru, who has the patience of a saint.

I’m talking about making claims like we are saying MS failed at anything while putting “failed” in bold and all caps. That is what I was referring to by taking things to an extreme. I don’t think anyone is being hostile here. If anything, some are just expressing confusion over some of the claims.

My sarcasm meter must be busted :sweat_smile: I can be direct but I never try to be hostile. I know tone can be difficult to read over text at times but I never have bad intentions. There are enough assholes on the internet to fill that role lol

I was being totally sincere. I only know you from this thread and it seems like you always take the time to make long measured posts, and you often have multiple people disagreeing with you. I don’t have enough knowledge of the subject to really follow the conversations, but that’s my takeaway, at least.

1 Like

I don't know what you are now talking about. According to you, CPU bandwidth on the shared bus is a non-issue but I/O is, and needs SFS. Where is this all coming from?

This is all moot. There are simply GPU tasks at higher resolutions that the Xbox One X is better suited for, because it has more math grunt and raw bandwidth than the Series S. I even named some of them for you. No RDNA2 feature is fixing that :woman_shrugging:

The reason it was a mistake isn't so much that it doesn't hit that target all the time as that it never made sense as a target in the first place, imho. These are 1080p TV owners. It would have been much cleaner, message-wise, to market it as the next-gen X1S successor targeting 1080p TV owners. Marketing it as 1440p makes it seem almost like it should be next gen wrt resolutions in ways it is not actually designed to be.

Wat. Nobody is saying this is the case. Because it isn't the case. The XSS is just an XSX for 1080p TVs. During the awkward cross-gen phase, running game code never built to leverage any of its proper next-gen feature set, it is rendering with one hand tied behind its back. That is not the scenario MS designed it to tackle. It was designed specifically for proper next-gen gaming at 1080p. As of yet, there are precisely zero proper next-gen games on XSX/XSS, so we have no remotely meaningful benchmarks to evaluate anything yet. Trying to compare it on cross-gen stuff when the X1X is built for 4K targets is just dumb.

Yes absolutely agreed.

One question I’d like to ask you is once cross gen development is left behind and these systems are better utilized, do you think the scope and demands of “proper” next gen games will be the same as they are now?

My view on XSS is this:

  1. We aren’t seeing the full potential of the machine, as most games aren’t making any use of the DX12 Ultimate features that are hardware-accelerated on the Series consoles. Many are basically using ‘back compat +’ that treats RDNA2 like it is GCN as well. The SS can definitely do more than it is doing, but so can the SX, so the relative scale is going to persist.

  2. We aren’t suddenly going to see greater relative performance from the SS once these next-gen features are being used, as the SX will use them too and the SS ends up with the same relative disadvantage. Where the SX is full native 4K, the SS will be 1440p with a couple of settings turned down. Where the SX is running a dynamic res that goes as low as 1080p (as has already happened in some titles), we are going to see the SS running sub-HD resolutions (there’s a rough sketch of how that kind of dynamic res loop works after this list). That’s not going anywhere.

  3. Engines are getting more demanding, and the SS is going to struggle with UE5. The Coalition’s presentation earlier in the year made that clear. Prepare yourselves for 720p SS games with stripped-back graphical settings where that engine is concerned, perhaps as a maximum.

  4. But the talk I see on some forums of the SS ‘not lasting the generation’ is nonsense. The relative power difference between the SX and SS is not going to change, and I would say we have already seen the gamut of resolutions the SX is going to manage – the Series X is not going to start running games below the resolutions we have already seen, and developers will continue to respond by shunting the SS down to sub-HD with stripped-back settings. The question is, are Series S owners going to be happy with this as a status quo?

  5. In my view, they should be. Anyone buying the SS, for a price lower than the Switch, should understand that they are getting a lower-spec device that plays the same games, just not as attractively. That’s the deal they took. It’s still a great value proposition – especially if it’s intended to be run with a 1080p display.
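On the dynamic res point in item 2, here's a rough C++ sketch of how that kind of resolution controller typically works (the constants and names are all made up, not any engine's actual implementation). The thing to notice is that the controller has no idea which console it's on; it just chases a frame-time budget, which is exactly why the SX and SS slide down the resolution ladder together:

```cpp
// Minimal frame-time-driven dynamic resolution controller -- a sketch
// under assumed constants, not any engine's real implementation.
#include <algorithm>
#include <cstdio>

struct DrsController {
    double scale    = 1.00;  // 1.0 = full target resolution per axis
    double minScale = 0.50;  // e.g. 2160p -> 1080p on SX, 1440p -> 720p on SS
    double targetMs = 16.6;  // GPU budget for 60 fps

    // Call once per frame with the previous frame's measured GPU time.
    void Update(double gpuMs) {
        if (gpuMs > targetMs)            scale -= 0.05;  // over budget: drop res
        else if (gpuMs < targetMs * 0.9) scale += 0.02;  // headroom: claw it back
        scale = std::clamp(scale, minScale, 1.0);
    }
};

int main() {
    DrsController drs;
    const double frames[] = {18.0, 19.5, 18.2, 15.0, 14.0, 13.5}; // heavy, then light
    for (double ms : frames) {
        drs.Update(ms);
        std::printf("gpu %4.1f ms -> scale %.2f\n", ms, drs.scale);
    }
}
```

Same controller, same budget, different GPUs: the weaker machine just spends more of its time pinned near minScale.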

What are the chances of the Series S and X getting ML?

Do you mean ML super sampling, something along the lines of DLSS?

Well, we know the architecture supports ML, although I think at nowhere near the capacity of tensor cores. Someone more knowledgeable will have to weigh in on what might be possible. Obviously, the actual tech MS has been working on is not here yet.
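For a rough sense of scale, the figures MS gave at Hot Chips (quoting from memory, so treat as ballpark) were 12.15 TFLOPS FP32 for the SX GPU, with the added low-precision rates stacking up as:

12.15 × 2 ≈ 24.3 TFLOPS FP16
12.15 × 4 ≈ 48.6 TOPS INT8
12.15 × 8 ≈ 97 TOPS INT4

The catch versus tensor cores isn't just the peak rate: there's no dedicated ML hardware, so any inference time comes straight out of the same shader ALUs that are doing the rendering, rather than running alongside it.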

MS has been talking about DirectML for years now, but we’re yet to see resolution reconstruction materialise beyond some tech demos showing early work with the Forza engine. I would logically infer that MS would have worked with AMD to make sure the ML tech in the Series consoles was designed to deliver this if it was on their roadmap, but who knows.

EDIT:

But I would add, as with my other points above, this is not going to change the relative difference between the SX and SS. If some ML super sampling technique comes along, both consoles will have it. Developers will use it to push the SX further too, and the relative gap will stay the same.

There is also the fact that, with DLSS at least, the benefits reduce significantly at lower resolutions. 1440p to 2160p seems to be the sweet spot. So trying to use it to improve the SS when it’s chugging out sub-HD resolutions is not likely to have amazing results.


Thank you.

Genuine question: I’ve seen that AMD is roughly 2-3 years behind Nvidia in technology. Why didn’t MS and Sony go to Nvidia for their next-gen consoles? I think Nvidia would have made a much more powerful machine with DLSS.

What are the chances of these console manufacturers going to Nvidia for their 10th generation consoles ?

The only difference between the XSX and XSS will be render resolutions for different visual features. That’s what we have been promised by the next-gen hardware/software stack. I don’t think you should treat the XSS as something like a Nintendo Switch. This is the same console that ran Ori 2 at native 4K, even though it’s a simpler game to render. I think consumer expectations for the XSS should be a replica of the XSX experience at 1440p/1080p.

It was along the lines of Nvidia not allowing Sony to customize the GPU for the PS3, whereas AMD allows a lot of customization, as they have dedicated teams set up to work on each console’s GPU. Nvidia might be more technologically advanced in certain respects, but is bogged down by typically higher pricing as well as a lack of flexibility.
