I'm saving up for a new PC right now. I find this tech super fascinating. Gonna buy a beast rig and ride it for a decade like I usually do. Probably a 7950X or 13900K and a 4090.
This tech removes the bottleneck from other hardware components.
But I think it will only work at very high frame rates, like 90+ or maybe 120 fps.
I suspect that using this technique at low frame rates like 30 or 60 fps would make the artifacts easily visible in motion.
That's what I want to do too, but the pricing on some of this stuff today is insane. The DDR5 and X670 AM5 motherboard costs really surprised me. I didn't expect them to be at the levels they are. Never would have guessed $800 for a motherboard (non-server, non-dual-CPU). My heart wants a 7950X, AM5, DDR5, and an RTX 4090, but my brain says NOPE.
Shame that new technology costs so much.
My early takeaway from this tech is that it will save gamers money on their CPUs.
This will definitely help you run your rig for a long, long time.
Tech is moving very fast
And in varying directions as well.
On one side, you have hugely powerful hardware at ridiculous prices, but on the other, you can get the same power at a fraction of the cost via cloud gaming.
Options are up for everyone in this market right now
Agree. I just assumed up front that I'm spending 3-4k for the tower. Then I use it until it essentially becomes useless or ceases functioning. I'll usually do one GPU upgrade about 4-5 years in.
Well I guess people can stop bugging DF to review Stadia games now…
If it already takes high refresh rates to be useful technology, doesn't that mean you need a quality CPU in the first place?
Producing 90-120 fps these days is quite easy for mid-range CPUs. It could be an issue with lower-end CPUs, but mid-range ones should easily deliver enough performance for this tech to work.
My hunch is that with this tech, mid-range CPUs could deliver performance similar to high-end CPUs.
I wonder why quite a few of these indie games still don't run that great, even on XSX. Slime Rancher, for example, becomes a slowdown fest once you have several corrals full of slimes.
Outer Wilds, which recently received its XSX patch, feels great when it's smooth at 60fps, but too often it has framerate issues in the starting area. According to Reddit, without VRR it's way, way worse. Why is this? Is this an engine matter?
I mean, cool that it's being advertised as 60fps by the studio, but it's far from a locked 60fps, and honestly with this hardware I'd expect 120fps. Is there anyone with insight into this?
Is Outer Wilds a full-on Gen 9 update or is it Gen9Aware?
If it's the latter, it's not taking advantage of the RDNA 2 upgrades, including the improved IPC, and instead runs like a 12 TF GCN card.
There could be a million reasons why a game drops frames, and more often than not we're just doing guesswork based on what's being shown on screen. Context matters a whole lot in many determinations.
I have no idea how to check that. They did recently patch it for XSX, but I'm guessing it's just a resolution and framerate job on the Xbox One(X) version.
If you're ever curious whether a game is a full-on Gen 9 game or a Gen9Aware game, highlight any installed game you have, hit the back/pop tart button, and select File Info in the bottom left corner of the screen. There, you'll see the file info for the game.
I touched on this in one of my previous videos, but they have changed the location of the File Info option to where I described above. Below is a timestamped version of that video explaining what to look for to tell whether it's a Gen 9 or Gen9Aware game.
Good to know. I'll go check this out when I get home.
You know the mistake I sometimes make, and probably a lot of people do? We see a game, look at the graphics, animations, draw distance, and scope of the game world, and think "this doesn't look all that crazy, this game should easily be able to run at 60fps on this hardware." But clearly a lot of things come into play with cases like these.
I have no doubt the studio behind Outer Wilds would love to have the game silky smooth, with no hiccups at all, but maybe it's out of their hands.
This oversimplification is not true.
I'm sorry, I don't understand. Has there been a report that revealed something different? I'm not saying games aren't enhanced when updated as Gen9Aware. The whole point of updating an Xbox One game to Gen9Aware is to give it enhancements like a higher frame rate. IIRC Microsoft confirmed to DF that games running under Gen9Aware mode do not take advantage of any of the additional features in the new GPU. That was around launch though, so if something has changed since then, I'd love to read about it.
The GPU in Series consoles is not a GCN GPU, so it can't behave like one. The GPUs in One and Series are (mostly) binary compatible, but that doesn't mean hardware improvements (caches, IPC improvements) in the GPU are or can be switched off.
Making your game Gen 9 or Gen9Aware is a compiler/settings switch. That doesn't mean your shader code suddenly uses RDNA features. You have to program for that.
I know the Series GPUs are not GCN, and no one said anything about anything being switched off. There is a difference between something being switched off and something just not being properly utilized. MS themselves said that when running BC games, the system runs in a back compat mode that takes advantage of the cores and clocks but none of the RDNA 2 architectural boosts. Was what I said earlier an oversimplification? Yes, but most of what's discussed here is. What I said earlier, though, falls more in line with reality than saying BC games take full advantage of the IPC gains based on benchmark results.
You said it runs like a 12 Tflop GCN card, which is an oversimplification and not how a GPU works.
I thought it was pretty clear that wasn't meant to be taken literally, because that makes no sense. I was talking about the general performance level we see when running BC games when other factors like memory bandwidth are not the limiting factor. We've seen games running in BC mode with dropped frames that would run at a locked 60fps on a 12 TF RDNA 2 GPU. I was trying to help illustrate why a game might be running into performance issues when it otherwise should be running better.