Why is it bizarre? They clearly put all their effort into making it run on the Xbox, and that's AMD hardware.
Starfield
Welcome to the Starfield community on Lemmy.zip!
Don't worry though, Todd Howard himself said that Bethesda definitely did a lot of work on optimizing Starfield. This is all still the fault of the end users, who just need to "upgrade their hardware." Just ignore the decrepit Gamebryo engine that still has all the same old bugs and quirks that it's had for nearly two decades.
Indeed. "This would have the fewest bugs any Bethesda game ever shipped with," as said by MS.
Which is probably the platonic ideal of "damning with faint praise."
I mean, to their credit, the game is relatively bug-free. Still a few oddities here and there and AI that probably needs a tweak or two, but otherwise it's been stable for me and I've not soft locked myself out of major quests...yet...
Don’t worry, there will be plenty of bugs left for AMD users as well.
Yep, like the Sun not showing up
https://www.pcgamer.com/in-starfield-the-sun-literally-doesnt-shine-on-amd-gpu-users/
Someone tried to argue that this game is as polished as Tears of the Kingdom lol
Ahahahahahaha!
The worst that game suffers from is duplication glitches.
In this thread, people who understand very little about technology and how it works
Sounds like about 80% of Starfield discussion at the moment.
It is? Am I doing something wrong? Because I get a solid 60-70 fps at all times on a 3070ti
From what I recall from one of their Directs, Digital Foundry corroborated another outlet's finding that ultra settings (and I think specifically ultra shadows) are unoptimized. Tons of weird frame time jittering, and like a 15% drop in FPS compared to AMD. So, if you have shadows turned to High or lower, that'll explain it. Otherwise, what they're saying is an AMD equivalent would be getting 70-80 fps in your case.
What CPU are you using? I've read it can be CPU heavy.
Not original dude but
I'm using an i7 11700k and a 1660 super and getting 60fps with occasional drops to 50.
AMD being a “partner” is business speak for “AMD paid us a bunch of money because having their brand on our product is a much larger advertising reach than they can accomplish on their own”.
That performance is better on AMD is in no way “bizarre”… it’s exactly what would be expected.
It's unexpected for nvidia users, who have grown used to games being optimised for them rather than AMD users.
PS5 and Xbox Series both run on AMD hardware. Do you really think AMD has the cash to bribe Microsoft?
Bribery?
Every time you start a game and see an Intel, AMD, Nvidia or other logo outside of the studio or publisher, that’s paid advertising, plain and simple.
I haven’t seen any logos launching Starfield. It’s kinda nice actually
I want to know how the hell I am lucky enough to not have any real performance or graphical issues...
I'm not even using a supported GPU (1660 Super) and it's still very playable with the lowest fps being 27 and the highest being about 70.
Outside is on the low end. Interiors are higher, with empty interiors (i.e. no NPCs) being the fastest. Just dropping a single NPC into a space where I was getting 72 fps drops the frame rate to 50. NPCs aren't handled by the GPU; they are CPU bound.
My CPU is a Ryzen 5 3600X, the exact AMD chip Bethesda lists as the recommended CPU. In fact, other than my GPU, the rest of my system meets the recommended requirements.
Edit: I kinda wonder if it's simply how things are tested in QA. For years I've seen users claiming to have high-end systems run into tons of problems across various games, and I'm starting to think that, unless they're simply lying about their specs (which seems an odd thing to do if you want real support), their hardware is simply too new and the testing focus was on the hardware more users actually have. Going by Steam hardware survey stats, most people have pretty old stuff, while only a small fraction are on super high-end systems.
I've had a similar experience (and similar performance) on my all-AMD rig. There's slowdown in cities but nothing that makes the game unplayable.
Still, it should be optimized.
I can't argue there. Considering how well it runs on this card as it is, their minimum requirements feel way off, and they could have supported some older hardware if they'd optimized certain systems a little more. I don't want to make it sound like no effort was made, because... damn. I've seen every release from Morrowind to Starfield as it was at launch, and this is by far the best-built right out of the gate.
Yeah, I second that. The game runs perfectly playably for me on low-to-medium settings, and I have a barely better GPU, a 1660 Ti, with a 10th-gen laptop i7.
I'm in. 2070S and also don't experience what everybody is talking about.
Kept getting “your gpu is too old” error messages, although I know others have played it online with 980ti. All the google results said to update windows but I wasn’t even on windows. Gave up on it.
In my mind the 980ti is still only a couple years old, so I went to check. Eight years. Fuck me, I'm still not used to this starting-to-feel-old thing.
Yeah, but they say you can run it on a 1070, and that's basically the same level as a 980ti.
Anyways I saw 980ti benchmarks out there so i don’t think that’s the problem, it’s some dx12 compatibility nonsense. Which shouldn’t be an issue because it supposedly runs on steam deck.
You can force it to use Resizable BAR and get more fps. It just needs to be enabled, and it's such an easy thing for the Bethesda devs to do, yet people need Nvidia Profile Inspector to enable it. For no reason.
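For anyone wanting to verify whether Resizable BAR actually took effect: this is a hedged sketch, not Bethesda's or NVIDIA's tooling. On Linux, `lspci -vv` shows the GPU's framebuffer BAR (Region 1 on NVIDIA cards); if it's sized near full VRAM instead of the legacy 256M window, ReBAR is active. The snippet below parses one captured example line (the address and 8G size are made up for illustration); on a real box you'd feed it actual `lspci -vv` output.

```shell
# Classify a captured lspci region line: legacy 256M window vs large (ReBAR) window.
# The sample line below is a hypothetical example, not from a specific card.
line='Region 1: Memory at 4000000000 (64-bit, prefetchable) [size=8G]'
case "$line" in
  *"[size=256M]"*) status="ReBAR off (legacy 256M window)" ;;
  *"G]"*)          status="ReBAR on (large BAR window)" ;;
  *)               status="unknown" ;;
esac
echo "$status"
```

Note this only tells you the BAR size the system negotiated; whether a given game benefits from it is a separate question.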