Todd Howard asked on-air why Bethesda didn't optimise Starfield for PC: 'We did [...] you may need to upgrade your PC'

https://sopuli.xyz/post/3218355


lol. Has anyone found ways to optimize Starfield on their PC, like reducing stuttering, FPS drops, etc.?

I’ll believe they actually optimized their PC port when their PC ports start showing some signs of effort at being actual PC ports

No FOV slider, after FO76 had one, is to me a great sign of how little Bethesda actually cares about the platform that keeps them popular (thanks to mods)
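For what it's worth, the community found an ini workaround for the missing FOV slider shortly after launch. This is only a sketch of the commonly reported edit — the file location and key names come from modder reports, not from anything Bethesda officially documents:

```ini
; Create (or edit) StarfieldCustom.ini in Documents\My Games\Starfield
; Key names below are community-reported, not an official Bethesda setting list
[Camera]
fFPWorldFOV=100.0
fTPWorldFOV=100.0
```

The first value is reported to control first-person FOV and the second third-person; values in the 90-110 range are the ones people typically share.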

They don’t want to put the work in for the biggest benefit of PC gaming.

I don’t think any PC should be able to run a game well at max settings on release. Leave room for it to still look good five years from now.

And have the bare-bones settings work on shitty systems that do need upgrading.

Bethesda just wants to make games that run on a very small slice of PCs, and who can blame them when they can’t even release a stable game on a single console? They’re just not good at it.

I don’t think any PC should be able to run a game well at max settings on release. Leave room for it to still look good five years from now.

This is the mentality they want you to have. And it's a shit one.

If you’ve got a 5-year-old PC, sorry, you shouldn’t expect to play on max, let alone anything over medium.

People need to temper their expectations about what a PC can actually do over time.

We’re talking about modern hardware; nobody is expecting a 5-year-old PC to run maxed-out games anywhere near as well as the latest hardware should. Read the comment chain again, because you missed the person’s original point…

I think you’re imagining modern hardware as just a 4090. “Modern hardware” here means current-generation GPUs/CPUs, and yes, those should be able to run at max settings. The performance of this generation’s low-to-mid-range hardware overlaps with the mid-to-high end of the last generation (and so on), so talking purely in years doesn’t really translate. People holding on to their 1080 Tis obviously shouldn’t expect max settings, but people with 3080s, 4070s, 6800 XTs (and to some degree even 2080 Tis/3070s) absolutely should be able to play on max settings.

I have an i9-9900K and a 4070 Ti and can play it butter smooth at 4K, 100% rendering. The CPU is definitely starting to show its age, but I haven’t had any complaints about Starfield’s performance.

That said I can’t fucking stand consoles. I get that companies would be stupid to not sell something to as many people as possible, but I’m so sick and tired of seeing good games handicapped because they need to run on a rotten potato Xbox from 10 years ago or whatever…

Like 40-45 FPS? I've seen a couple of people say this now, but every outlet I've seen benchmark performance contradicts it. I don't consider 40 FPS smooth, but I guess consoles even have to suffer with 30 FPS in some cases.

I actually never pulled up an FPS meter, as it has been so smooth I never felt the need to check. I’ll see when I get home later what it actually is in Neon or somewhere “busy.”

Do you have motion blur on?

I’ll never understand why developers add stuff that makes the game look so much worse…

Looking at you, chromatic aberration, motion blur, film grain, vignette…

The first thing I do with a new game is check graphics settings and nuke that extra garbage lol

Yup, sure, add it, but at least disable it all by default. That said, motion blur does make low FPS look better, if you can put up with the blur (I can't); it's used heavily on consoles for exactly that reason.