30fps is fine for this game and many people have been expecting that for Starfield for a long time now. As long as it’s consistent and responsive, I doubt most will care.
I do wonder if they’ll eventually add a 40fps option for those who want a little more fluidity. What’s interesting is that the PC recommended requirements list a Ryzen 5 3600X, which is close to the CPU performance of the consoles. So are these recommended settings targeting 30fps? That would be surprising, since I’d expect most recommended settings to target 60fps on PC. I’d love for them to just unlock the frame rate up to 60fps on consoles to see how the performance profile looks. I’m betting they won’t though, since the performance swings would likely make for a poor gaming experience.
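To illustrate why a 40fps mode is a popular middle ground, here's a quick sketch of the frame-time arithmetic (my own illustration, not anything from the game): 40fps sits exactly halfway between 30fps and 60fps in frame time, and divides evenly into a 120Hz refresh cycle.

```python
def frame_time_ms(fps: int) -> float:
    """Milliseconds each frame is on screen at a given frame rate."""
    return 1000 / fps

def refreshes_per_frame(fps: int, refresh_hz: int = 120) -> float:
    """How many 120Hz display refreshes each game frame occupies."""
    return refresh_hz / fps

for fps in (30, 40, 60):
    print(f"{fps}fps -> {frame_time_ms(fps):.2f} ms/frame, "
          f"{refreshes_per_frame(fps):.0f} refreshes at 120Hz")
# 30fps -> 33.33 ms/frame, 4 refreshes at 120Hz
# 40fps -> 25.00 ms/frame, 3 refreshes at 120Hz
# 60fps -> 16.67 ms/frame, 2 refreshes at 120Hz
```

That even division is why 40fps modes require a 120Hz display: each frame maps cleanly onto a whole number of refreshes, so pacing stays consistent.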
It’s hard to determine what they consider the target for recommended settings. Is it a stable 1440p/60fps with visual settings on par with the Series X? Hard to say. For example, Nixxes’ recommended settings for Spider-Man: Miles Morales on PC were 1080p/60fps using the Medium preset.
With that said, the Starfield recommended PC specs are not a one-for-one comparison to a Series X. For starters, the Series X has a CPU more comparable to a Ryzen 7 3700X, so it’s actually a step above the recommended PC CPU.
Then there is the memory requirement. The Series X has a 16GB memory pool shared between the GPU and CPU (I think that is 10GB for the GPU and 6GB for system/games). In comparison, the recommended RX 6800 XT has 16GB of dedicated GDDR6 (the RTX 2080 has 8GB). That’s on top of the 16GB of system RAM recommended, although realistically you probably only have 10-12GB free after Windows takes its share.
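A rough tally of those figures, using the estimates from this post (the 10GB/6GB console split and the Windows overhead are guesses, not official numbers):

```python
# Series X: one 16GB GDDR6 pool shared between GPU and CPU.
series_x_total = 16          # GB, shared pool
series_x_gpu = 10            # GB, GPU-optimized portion (my estimate)
series_x_system = series_x_total - series_x_gpu  # GB left for system/games

# Recommended PC: dedicated pools on each side.
pc_vram = 16                 # GB, RX 6800 XT GDDR6
pc_system_ram = 16           # GB, recommended system RAM
windows_overhead = 4         # GB, rough guess for OS + background apps
pc_free_ram = pc_system_ram - windows_overhead

print(f"Series X: {series_x_gpu}GB GPU + {series_x_system}GB system, shared")
print(f"PC: {pc_vram}GB VRAM + ~{pc_free_ram}GB usable RAM, dedicated")
```

Even with Windows taking its cut, the recommended PC ends up with far more total memory than the console’s shared pool.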
The PC also requires an SSD, so whatever storage advantage the Series X would have had over PC is somewhat negated.
So in summary, it seems (at least on paper) that apart from the CPU, the PC has the advantage in every other category.
Pretty sure the real-world performance is closer to the 3600 due to the cut-down cache in the console CPUs. In benchmarks you can see the 4700S (a console APU with a defective GPU) land just above the 3600 and a good bit below the 3700X.
The performance target and quality settings those recommended specs are aiming for are the important part, though. It’ll be interesting to see how Starfield performs on more mid-range CPUs.
Consistency, indeed. That’s super important, and so is motion blur. Really hoping they got something good there, because that goes a long way.
I’m guessing the team is still busy smoothing out the framerate as we speak. But as someone else said last week, it’s very likely an older build they showed anyway. Also, IGN played it on Xbox Series X, not PC, so that’s definitely a good sign. I doubt Todd would let IGN play a poorly performing version.
Yeah, the addition of object-based motion blur is a really nice touch. I know plenty of people don’t like it, but I’m a big fan when it’s implemented well, and with id Software helping them, I’m sure it’ll look great.
Oh man, I need to take the time to watch the DF video. It’s using object-based motion blur? Are there any other games that use this, just to get an idea of things? RDR2? Did their previous games use it too, or is this new for them?
Plenty of games use object-based motion blur; it’s just a first for a Bethesda game. Seeing as they’ve regularly been 30fps on consoles, that’s a nice addition to have.
Awesome to hear. So it should definitely feel and look smoother than Skyrim on Xbox One or unmodded Fallout 4 on console? Looking forward to it.
You can’t directly compare PC CPU and GPU performance with consoles. On gaming PCs, the CPU and GPU each have their own dedicated memory pool; consoles have unified memory access and often smaller caches, so the performance picture is different.