this post was submitted on 19 Feb 2025
128 points (99.2% liked)

PC Gaming


For PC gaming news and discussion. PCGamingWiki

top 16 comments
[–] [email protected] 4 points 2 days ago

...I guess I really should just play my Steam backlog before upgrading anyways

[–] 9point6 21 points 3 days ago* (last edited 3 days ago) (1 children)

Well, that's pretty shitty. I assumed PhysX was only a CUDA blob at this point anyway rather than anything hardware-specific, so why drop support?

Edit: ah, so it's 32-bit CUDA in general they're killing off, which makes a bit more sense, as that probably does come down to real hardware differences.

Hopefully they open-source it, at least.
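
To make the 32-bit angle concrete, here is a minimal sketch (plain C++ against the CUDA runtime API, built as a 32-bit binary) of the kind of device probe a legacy GPU-PhysX title effectively performs. The exact error such an app would see on a driver without 32-bit CUDA support is an assumption here, not something Nvidia has documented in this announcement.

```cpp
// probe_cuda32.cpp -- minimal sketch: what a 32-bit process sees when it asks
// the driver for CUDA devices. Build as a 32-bit binary and link cudart.
// Assumption: on drivers that dropped 32-bit CUDA, the query fails or reports
// zero devices, which is why legacy GPU PhysX falls back to the CPU.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int deviceCount = 0;
    cudaError_t err = cudaGetDeviceCount(&deviceCount);

    if (err != cudaSuccess || deviceCount == 0) {
        // Legacy 32-bit PhysX ends up on this path: no usable CUDA device,
        // so the effects either run on the CPU or get disabled outright.
        std::printf("No 32-bit CUDA device available: %s\n",
                    cudaGetErrorString(err));
        return 1;
    }

    cudaDeviceProp prop{};
    cudaGetDeviceProperties(&prop, 0);
    std::printf("CUDA device 0: %s (compute %d.%d)\n",
                prop.name, prop.major, prop.minor);
    return 0;
}
```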

[–] [email protected] 2 points 2 days ago

Hahahahaha, Nvidia open-sourcing anything? They fight tooth and nail against any form of open source every single chance they get. Only under a ton of pressure will they give any ground.

Nvidia is one of the more anti-open-source companies.

[–] [email protected] 11 points 3 days ago (1 children)

I thought my 3080 was an irresponsible splurge when I bought it, but every day I love that thing more and more.

[–] [email protected] 5 points 3 days ago

Shit, with the current 50-series pricing and availability, the 4090 I got myself for Christmas is looking responsible too. It doesn't even need a 1000-watt PSU.

[–] _sideffect 6 points 3 days ago

"Pay us and we'll stream it to you instead!"

[–] MeaanBeaan 5 points 3 days ago

I was playing Arkham Knight last night on GeForce Now and I could not, for the life of me, get the fancy fog and debris to work. Every time I'd turn them on, the game would tell me I needed to restart. Once I did, the settings would just revert, even though I had the option turned on to have GeForce Now save my game configs. At the time I thought it was a bug, since AFAIK the 4080 they're using on their rigs supports these features fine. Now I'm wondering if it was an intentional choice to not allow those features on GeForce Now, so as not to make the 50-series cards look bad.

[–] Jumi 2 points 2 days ago

Seems like my trusty 3090 will keep running for a long time.

[–] [email protected] 4 points 3 days ago

And it's not the first downgrade. I've noticed a decline across generations going back to before the 900 series.

[–] [email protected] 1 points 3 days ago (1 children)

Is this the return of dedicated PhysX cards?

[–] VindictiveJudge 2 points 2 days ago

You could definitely drop in an old GPU just for PhysX; the driver still supports that, and it wouldn't even need to be a good one. You could also go into the driver settings and make the CPU run PhysX if you have enough cores.
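
For a sense of what "PhysX on the CPU" means at the API level, here is a minimal sketch using the modern open-source PhysX SDK (4.x-style C++). The legacy 32-bit runtimes these old games actually ship use an older API, so treat this as illustrative only: the point is that when no valid CUDA context exists, the scene simply keeps its CPU worker threads.

```cpp
// physx_cpu_fallback.cpp -- sketch of CPU vs GPU rigid-body dispatch in the
// modern PhysX SDK (4.x-style API). Old 32-bit games use an older runtime,
// but the idea is the same: no usable CUDA context => simulate on the CPU.
#include <PxPhysicsAPI.h>

using namespace physx;

static PxDefaultAllocator     gAllocator;
static PxDefaultErrorCallback gErrorCallback;

PxScene* createScene(bool wantGpu) {
    PxFoundation* foundation =
        PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics* physics =
        PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    PxSceneDesc desc(physics->getTolerancesScale());
    desc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
    desc.filterShader  = PxDefaultSimulationFilterShader;
    // CPU worker threads do the simulation whenever the GPU path is unavailable.
    desc.cpuDispatcher = PxDefaultCpuDispatcherCreate(4);

    if (wantGpu) {
        PxCudaContextManagerDesc cudaDesc;
        PxCudaContextManager* cuda =
            PxCreateCudaContextManager(*foundation, cudaDesc);
        if (cuda && cuda->contextIsValid()) {
            // Only with a valid CUDA context does rigid-body work move to the GPU.
            desc.cudaContextManager = cuda;
            desc.flags |= PxSceneFlag::eENABLE_GPU_DYNAMICS;
            desc.broadPhaseType = PxBroadPhaseType::eGPU;
        }
        // Otherwise the CPU dispatcher set above is silently kept.
    }

    return physics->createScene(desc);
}
```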

[–] Gutek8134 -5 points 3 days ago (2 children)

Ah, the classic:

What're you gonna use your $1000 GPU for? Hosting a local LLM? Video editing? 3D graphics? ...Running new games on the highest settings?

Nah, I'm gonna replay this 10+ year old game.

[–] JordanZ 18 points 3 days ago
[–] [email protected] 4 points 3 days ago (1 children)

Old games are better and there's always a way to push them farther. This comment is stupid af.

[–] Gutek8134 4 points 3 days ago (1 children)

Oh, I am in this group. I have a mid-range PC (Ryzen 5 3600, GTX 1660S) and still mostly play indies or 5+ year old games, because they're (usually) patched and dirt cheap.

[–] [email protected] 1 points 3 days ago

Oh, I think the double sarcasm was missed in your previous comment. My bad, man. It sounded like you were making fun of people who play old games on high-end PCs.