Starfield is a “bizarrely worse experience” on Nvidia and Intel, says Digital Foundry
(www.theverge.com)
It's not bizarre. As others have pointed out, AMD has clearly had a hand in making sure this performs better on their GPUs. One instance could be a coincidence, but when you've got multiple instances of things being 'missing', 'not optimised properly', etc. for RTX cards, you have to wonder whether it's a bunch of coincidences or deliberate.
This has taken a lot of shine off AMD for me. They seem to be employing a Russia-esque strategy of "If I can't improve myself then I guess I'd better make things suck for other people so I don't seem as bad"
A graphics card company ensuring software performs better on their GPU?
Time to switch to nvidia. They would never do such a thing.
LOL
Jesus, Nvidia has been doing cutthroat shit like this for years.
Feel like this isn't the best take. AMD working with Bethesda to make sure the game works on their cards doesn't come close to implying they made sure it didn't work on Nvidia cards. Nvidia should've been working to make sure the game ran well on their cards too.
Nvidia has been pulling the tricks you're talking about for years now, though
Microsoft owns Bethesda. Microsoft owns Xbox.
Xbox uses AMD GPUs and CPUs.
So the game being optimised for AMD makes absolute sense for Microsoft.
AMD paying for access to optimise for their PC CPUs and GPUs makes sense for AMD.
However, not optimising the game for Intel and Nvidia does not make sense for Microsoft. This is more likely an oversight/typical poor AAA game launch than a deliberate play to benefit AMD. Other games, Cyberpunk 2077 for example, had problems across CPUs and GPUs from every vendor. What we have here is selection bias: fewer problems on AMD systems, and an otherwise reasonably solid launch.
It's frustrating, but most of the issues are optimisation, not game-breaking. The experience on Intel/Nvidia systems is good, just not as good as it could be. One of the examples in the article was a framerate of 95 FPS vs 105 FPS - that may have been avoidable, but it's a minor annoyance at best. Some of this (not all, but some) is just obsessing over minutiae and things that won't affect the player experience.
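To put that 95 vs 105 FPS gap in perspective, here's a quick back-of-the-envelope in Python (the only inputs are the article's two numbers):

```python
# Convert the article's two framerates into per-frame render times.
for fps in (95, 105):
    print(f"{fps} FPS -> {1000 / fps:.2f} ms per frame")

# Output: 95 FPS -> 10.53 ms, 105 FPS -> 9.52 ms.
# That's about 1 ms per frame, a ~10% gap: visible on a benchmark
# chart, very hard to actually feel in play.
```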
So basically a storm in a teacup, and much of this is the usual post-launch technical difficulty that will be optimised away with patches. This is why people shouldn't buy games at launch, although so far at least we haven't seen the game-breaking bugs that have dogged other AAA titles at launch.
NVIDIA's entire business model is brand-exclusive proprietary software. Last I checked you can use FSR on NVIDIA but you can't use DLSS on AMD.
DLSS doesn't run on older nVidia hardware either, as it's designed to utilize the tensor cores of the RTX series. I recall reading somewhere that while it could technically be made to run without them, without the specific cores optimized to do the calculations required it would run terribly. Then again it might just be a blatant lie ¯\_(ツ)_/¯
FSR, on the other hand, is designed to run on standard GPU hardware, and seeing as the tech is open source they can't exactly hide any code that would break compatibility with nVidia.
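For anyone curious, the scale factors behind FSR's quality modes are public, so you can sanity-check what your card actually renders. A rough Python sketch (mode names and factors as published for FSR 1.0; individual games may expose different presets):

```python
# Per-axis scale factors for FSR 1.0's quality modes (per AMD's docs;
# a given game may ship different presets).
FSR_MODES = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal resolution the GPU renders before FSR upscales it."""
    scale = FSR_MODES[mode]
    return round(out_w / scale), round(out_h / scale)

# e.g. 4K output in Quality mode is rendered internally at 1440p:
print(render_resolution(3840, 2160, "Quality"))  # (2560, 1440)
```

Since the upscale itself is ordinary shader work on that internal image, there's nothing vendor-specific in the pipeline, which is exactly why it runs fine on nVidia and Intel cards too.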
brand-exclusive proprietary software
But to be fair, nVidia has also been pumping massive amounts of $$$ into R&D in both the Graphics and AI space.
They need return on their R&D investment somehow.
And it's not like they are cutting AMD out of the AI enhanced stuff.
They just aren't going to spend $$$ and effort to help AMD implement their solutions, and AMD doesn't have the hardware to run the AI functions properly.
AMD could implement something like RTX if they wanted to; nVidia's research papers are out there.
But they haven't, because they don't have the know-how to implement it.
And it isn't like AMD is sharing with Intel any of their R&D work they do on the CPU side.
That's a pretty bold conspiracy theory. Nvidia outsells AMD by a pretty huge margin. As does Intel in CPUs. What would get Bethesda to deliberately favor AMD tech and hobble Nvidia? That would merely give them a LOT of negative press, as we are seeing now.
The idea of bribery is right out because Bethesda is owned by MS. The idea of laziness also doesn't hold up because, as above, there are more Intel/Nvidia users, so if laziness were the goal it'd be easier to prioritize only the most common hardware.
Most likely it's as someone below said: this game was primarily designed around console performance, and both consoles, the Xbox and PS5, use AMD hardware. And Bethesda is either too inept or too time-constrained to get it to run well on the most common PC hardware. This is pretty damn common in the games industry: letting PC performance flounder because PC is a smaller share of sales.
But MS has no incentive for Nvidia cards to not work well, because 99% of PC users are Windows users and most likely run this on Game Pass, an MS product.
Also because on PC you can always lower settings or throw better hardware at the problem, so even a badly optimized port should eventually run acceptably. But if you fuck it up on console, you get Cyberpunk on the PS4 and have to spend a ridiculous amount of time and money to make it work.
And this is Bethesda we are talking about; at this point I wouldn't be surprised if the PC versions are designed from the get-go under the expectation that the modding scene will come to the rescue and fix everything for them, no matter how terrible their work is.
From whom does Microsoft source the CPUs and GPUs for every single XBox?
Yep.
Same people Sony does. It isn't about Nvidia. It's about lazy developers not optimizing for PC.
Xboxes run on AMD GPUs.