[–] [email protected] 4 points 6 months ago* (last edited 6 months ago) (3 children)

On Windows, Nvidia without thinking twice. On Linux, it depends on RDNA 4 and the next Nvidia driver release, but probably still Nvidia.

Unfortunately, as much as I would rather buy from someone else, AMD's products are just inferior, especially the software.

Examples of AMD being worse:

  • AMD's own OpenGL implementation is a joke; the open-source implementation used on Linux is several times faster and is made for free by volunteers, without internal knowledge (you can check which implementation you're actually running; see the snippet after this list)
  • AMD will never run PhysX, which gets less relevant every day, but if the AMD of the past had proposed an alternative, we would have a standardized physics extension in DirectX by now, like with DLSS
  • AMD's ray accelerators are "incomplete" compared to Nvidia's RT cores, which is why ray tracing is better on Nvidia, and why they are changing how they work with RDNA 4
  • GCN was terrible and very different from Nvidia's architecture, which made it hard to optimize for both. RDNA is more similar, but now AMD has a plethora of old junk to keep compatible with RDNA
  • Nvidia has been constantly investing in new software technologies (nowadays it's mainly AI); AMD didn't, and now it's always playing catch-up
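
As an aside on the first point: a quick way to see which OpenGL implementation you're actually getting on Linux is to read the strings the driver reports. A minimal sketch, assuming glxinfo (from the mesa-utils package) is installed; on Mesa's radeonsi the renderer string mentions it explicitly:

```python
# Minimal sketch: print which OpenGL driver/renderer is in use on a Linux box.
# Assumes the glxinfo utility (package "mesa-utils" on most distros) is installed.
import subprocess

def print_gl_strings() -> None:
    out = subprocess.run(["glxinfo"], capture_output=True, text=True, check=True).stdout
    for line in out.splitlines():
        # glxinfo labels these lines "OpenGL renderer string" / "OpenGL version string"
        if line.strip().startswith(("OpenGL renderer string", "OpenGL version string")):
            print(line.strip())

if __name__ == "__main__":
    # On a Mesa/radeonsi stack you typically see something like
    # "OpenGL renderer string: AMD Radeon RX ... (radeonsi, ...)".
    print_gl_strings()
```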

AMD also has its wins, for example:

  • They often make their stuff open source, mainly because it's convenient for their underdog position
  • They have a pretty good software stack on Linux (much better than on Windows), partly because it's not entirely done by them
  • Nvidia has been a bad-faith actor in the Linux space for many years, even if it's now on a redemption arc
  • Modern AMD GPUs seem to be catching up in compute performance
  • AMD is less greedy with VRAM, mainly because they are less at risk of competing with their own enterprise lineup
  • Nvidia's current prices are stupid

I would still prefer Nvidia right now, but maybe it's gonna change with the next releases.

P.S. I have used a GTX 1060, an RX 480, and a Vega 56.

[–] [email protected] 2 points 6 months ago

but if the AMD of the past had proposed an alternative, we would have a standardized physics extension in DirectX by now, like with DLSS

Why the fuck put this on AMD when it was Nvidia who did their usual proprietary bullshit? "AMD is worse than Nvidia because they didn't provide us with a better alternative!" ???

[–] vikingtons 2 points 6 months ago* (last edited 6 months ago)

For your points against:

The OpenGL UMD was completely re-engineered. This premiered with the 22.7.1 release, so nearly two years ago. AMD now have the most performant, highest quality OpenGL UMD in the industry, which is particularly relevant for workstation use cases (where OpenGL remains the backbone of WS graphics).

PhysX is proprietary and I don't know what can be done about that, so your point is valid here. Though given the rise of other physics engines, I don't really know if this is a big hit? Do we really want further consolidation in game systems?

AMD's approach to ray acceleration has always favoured die-area efficiency up until now, though I can totally understand your disappointment with the performance in that area. That said, the moment I'll really care about RTRT in gaming is when it's no longer contingent on the raster model. Reflections, shadows and GI are nice and all, but we're still not really there yet.

I don't know how GCN was such a terrible arch when it was the basis of an entire console generation. An argument could be made that its GPGPU design may have hindered it at gaming on desktops, but it matured extremely well over time with driver upgrades, despite their given price + perf targets at release. Aside from that (and related to point 1), RDNA UMDs are all PAL-based. I'm not sure what you're alluding to with this? Could you please elaborate?

Your final remark is untrue (FMF, AL+, gfx feature interop, mic ANS, a plethora of GPUOpen technologies), but I will forgive you for not keeping up with a vendor's tech if you don't actively use their products.

[–] Woozythebear 0 points 6 months ago (1 children)
[–] [email protected] 5 points 6 months ago (1 children)

I'm literally using a full AMD PC right now. I dislike Nvidia as much as the next person; I think they use terrible monopolistic practices, and if the competition were on par I would not buy Nvidia. But they aren't.

[–] Woozythebear 4 points 6 months ago (1 children)

The guy asked what's better for gaming and you went on a rant about Nvidia being better because of AI workloads and other software.

AMD makes the better cards for gaming. Nvidia may have better ray tracing, but most games don't even use ray tracing, so with Nvidia you will spend an extra 30% to get the same gaming performance as an AMD card that actually has enough VRAM to play games at ultra settings and higher resolutions.

[–] [email protected] 0 points 6 months ago* (last edited 6 months ago)

Well, if you are not gonna use Nvidia's extra stuff, buy an AMD, by all means.

But what you're saying is disingenuous. "AI and other software" is not entirely unrelated to gaming. Things like HairWorks, PhysX, and most GameWorks features in general run on CUDA. And on the AI side (which I don't care about that much) there is DLSS, and they are working on AI-enhanced rendering.
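
If you want a quick sanity check of whether any of that CUDA-based stuff can even run on a given machine, here is a minimal sketch (assuming Linux, where the Nvidia driver ships libcuda.so.1) that just asks the driver how many CUDA devices it sees; on an AMD-only system the library simply isn't there:

```python
# Minimal sketch: ask the Nvidia driver how many CUDA devices are visible.
# Assumes Linux, where the proprietary driver installs libcuda.so.1;
# no CUDA toolkit is needed, just the driver library.
import ctypes

def cuda_device_count() -> int:
    try:
        lib = ctypes.CDLL("libcuda.so.1")  # shipped with the Nvidia driver only
    except OSError:
        return 0  # no Nvidia driver present, so CUDA-only features won't run
    if lib.cuInit(0) != 0:  # CUDA_SUCCESS == 0
        return 0
    count = ctypes.c_int(0)
    lib.cuDeviceGetCount(ctypes.byref(count))
    return count.value

if __name__ == "__main__":
    print(f"CUDA devices visible to the driver: {cuda_device_count()}")
```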

Most games don't use those technologies, but some do, and you will miss out on those.