Games Analysis |OT| Time To Argue About Pixels And Frames!

If I remember rightly the OG Xbox did have an Nvidia GPU, but they fell out over pricing, with Nvidia asking for more to keep supplying after the initial agreement. Don’t quote me on that, I’m going off memory, but I definitely remember reading there was some bad blood between them after the OG.


I honestly think it is just because of both console makers wanting to go with x86 CPUs and AMD is the only company that can easily provide that and a GPU as one package.


Not accurate according to the press around that time. The story being told was that for a long, long time during development of the HW, Sony were planning to use a separate Cell processor to handle GPU tasks, and they continued down this track until late in the schedule. Then, at the last minute, they realized it wasn’t going to work and went to Nvidia for a GPU for their already late (compared to the 360) console. So it’s more likely that Nvidia didn’t have time to customise to any degree.


That’s not what I asked. Leave the hyperbole out of this conversation please, I’m not treating the XSS like the Switch. So again, when new gen games are in full swing and these systems are better utilized, do you think the scope and demands of these next gen games will be the same as they are now?

Costs. Integration. Nvidia doesn’t have the CPU component and their GPU components are astronomically priced.

The only reason Nintendo got Nvidia for the Switch was that Nvidia had warehouses full of old SoCs lying around unused from their failed attempts to expand their market.

The other big aspect is backwards compatibility. Far easier to do with more similar hardware than with different hardware.

Yeah, it is this and the lack of a high-performance APU. I don’t know if this is still a problem, since Nintendo is doing okay with them. Could it have just been a contract issue on MS’s side that they took advantage of?

The APU problem could be solved in a generation if ARM continues to move into the high end. It probably wouldn’t be ideal to change from x86 to ARM, though.

The thing is as the generation goes on we expect the ‘render resolution’ to reduce. That happens every generation as the games become more demanding.

So most games don’t run at 4K on the Series X and over time I think we’ll see some titles dip significantly below that. We are of course in times where the actual render resolution is less relevant to the IQ - given the temporal reconstructions we have.

But the Series S is roughly 3 times less powerful than the Series X and has lower GPU bandwidth. Put that together and you can expect games to render at up to 3 times the pixel count on Series X, assuming all things are equal and the S’s bandwidth isn’t a problem. And we can expect games on Series X to render down to perhaps 1080p at times.

So if your measure is going to be a resolution number, by the end of the gen you’ll be disappointed. I think taking marketing details too literally isn’t really going to help you, because of course they are going to show best-case scenarios, which are in some, or even lots of, cases achievable. But devs have already gone on record to explain that not all engines are equal, and there are bottlenecks on the Series S beyond the 3x power multiplier, like the GPU bandwidth, that will potentially be problematic even when games are utilising the full RDNA 2 feature set. Even if we take something like UE5: what resolution and framerate was the demo running at on Series X? That isn’t a game, of course, but the demands on consoles usually go up, not down.

Where I agree is that the Series S will do best in games utilizing the RDNA 2 features in full. It will come closest to the 3x power multiplier then. But equally, those games will likely see lower resolutions on Series X too, so you aren’t going to magically change the power disparity between the two systems. Unless you believe render resolution is going up (it isn’t), the Series S will hit lower numbers rather than higher ones over time. Just the laws of physics in play. However, my bet is the IQ will improve regardless of the numbers.
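The rough 3x scaling above can be sanity-checked with some back-of-the-envelope math. This is a sketch only: it assumes render pixel count scales linearly with GPU compute (real workloads don’t strictly do that) and uses the public 12.15 vs 4.0 TFLOPS figures.

```python
# Back-of-the-envelope only: assumes pixel count scales linearly with GPU
# compute, using the public 12.15 (Series X) vs 4.0 (Series S) TFLOPS specs.
def scaled_height(sx_height: int, power_ratio: float = 12.15 / 4.0) -> int:
    """Pixel count divides by the power ratio, so height scales with its sqrt."""
    return round(sx_height / power_ratio ** 0.5)

for h in (2160, 1440, 1080):
    print(f"Series X {h}p -> Series S ~{scaled_height(h)}p")
# 2160p -> ~1239p, 1440p -> ~826p, 1080p -> ~620p
```

Under that naive assumption, a Series X title dipping to 1080p lands the Series S around 620p, which is roughly the territory of the sub-720p examples seen so far.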


Not necessarily; the PS4 still has a bunch of 1080p games (and has been consistent for series that were already below that, like AC or Watch Dogs).

Also, as you note, the SS has 3x less processing power than the SX, but it’s targeting 4x less resolution, so proportionally the console is able to hold its target resolution better than the SX. We’ve already seen some of that: for example, Metro Exodus can go sub-1080p on PS5 and ~720p on SS, Diablo II: Resurrected runs at 720p on SS when SX and PS5 are at 960p, and many others.
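A quick, rough check of those examples (16:9 pixel counts assumed; the ~3x gap uses the public 12.15 vs 4.0 TFLOPS specs, and the Metro comparison is against the PS5, so treat it as approximate):

```python
# Rough check of the "holds resolution proportionally better" point.
# Pixel counts assume 16:9; the ~3x gap uses 12.15 vs 4.0 TFLOPS.
# Note the Metro comparison in the post is against PS5, so it's approximate.
def pixels(height: int) -> int:
    return round(height * 16 / 9) * height

POWER_RATIO = 12.15 / 4.0  # ~3.04x

for game, big, small in [("Metro Exodus (vs PS5)", 1080, 720),
                         ("Diablo II: Resurrected", 960, 720)]:
    px_ratio = pixels(big) / pixels(small)
    print(f"{game}: {px_ratio:.2f}x pixel gap vs {POWER_RATIO:.2f}x power gap")
```

Both observed pixel gaps (~2.25x and ~1.78x) come in under the ~3x compute gap, which is consistent with the SS holding its lower target proportionally better.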


I think the point is the marketed target for 1440p isn’t a realistic claim considering how the demands for future games will only go up, not come down. So it’s unlikely we’ll be seeing resolution increases beyond what we’ve been seeing from games over the last year.

Ah, for 1440p I totally agree. But I think that was just a marketing bluff with no real substance; the machine was designed for 1080p and it shows.

The leaked MS docs even mentioned that they gave the SS a slightly higher tflop/pixel ratio than the SX, because not all tasks scale with resolution, and so they needed to overshoot in order to allow the SS to run the same games as the SX with a resolution delta and still keep the same framerates.

If you consider this a primary goal, you will see that the target must be 1080p, as at 1440p the SS actually has a lower “density” than the SX at 4K.
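A minimal sketch of that tflop-per-pixel “density” math, using the public spec numbers (12.15 TF Series X, 4.0 TF Series S) and standard 16:9 resolutions:

```python
# Sketch of the implied "TFLOPS per megapixel" comparison.
# Public specs assumed: 12.15 TF Series X, 4.0 TF Series S.
def tf_per_mpixel(tflops: float, width: int, height: int) -> float:
    return tflops / (width * height / 1e6)

sx_4k   = tf_per_mpixel(12.15, 3840, 2160)  # ~1.47 TF/Mpx
ss_1440 = tf_per_mpixel(4.0, 2560, 1440)    # ~1.09 TF/Mpx (below SX at 4K)
ss_1080 = tf_per_mpixel(4.0, 1920, 1080)    # ~1.93 TF/Mpx (above SX at 4K)
print(f"SX@4K {sx_4k:.2f} | SS@1440p {ss_1440:.2f} | SS@1080p {ss_1080:.2f}")
```

So the per-pixel “density” only exceeds the SX-at-4K figure when the SS targets 1080p, which matches the conclusion that 1080p must be the intended target.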


Agreed and I think that’s the point some of us have been trying to express the last couple days lol.


Don’t mind me then :stuck_out_tongue:

lol I do appreciate your contribution to the conversation and the thread. :grinning_face_with_smiling_eyes:


In the era of DLSS why should we care about render resolution?

Maybe because these devices don’t have DLSS…


We shouldn’t but you keep quoting resolutions at the thread…

So why are we trying to predict things that are as far out as 10 years from now?

Because that’s where the main difference lies in between these 2 consoles.

Unless I’m mistaken, this conversation started because another poster said that there is no reason why the One X should be running games at higher resolutions than the XSS at 60fps, and you later responded to their claims. So this entire recent topic has been about render resolutions. Why dispute what I was saying with claims that cross-gen games don’t count because they aren’t optimized, if we shouldn’t care about resolutions?

Personally I agree. I think we should focus less on resolutions, at least natively. However the entire start of the XSS vs X1X conversation began because some here took issue with the fact that the 1X has certain advantages over the XSS at these higher resolutions.


Render resolution or AI upscaling both work for me, honestly. Although I’m not sure about the temporal techniques, because they lack a great amount of the information needed to upscale properly.

This doesn’t change the fact that the entire basis of your claims and debates has been how well the XSS and X1X handle higher resolutions at 60fps.

Also, it’s the temporal element of DLSS, on top of the ML, that helps add even more detail. One of the biggest faults of a technique like FSR is that it lacks the temporal data to fill in the gaps and provide the missing detail.