...I guess I really should just play my Steam backlog before upgrading anyways
PC Gaming
For PC gaming news and discussion. PCGamingWiki
Rules:
- Be Respectful.
- No Spam or Porn.
- No Advertising.
- No Memes.
- No Tech Support.
- No questions about buying/building computers.
- No game suggestions, friend requests, surveys, or begging.
- No Let's Plays, streams, highlight reels/montages, random videos or shorts.
- No off-topic posts/comments, within reason.
- Use the original source, no clickbait titles, no duplicates. (Submissions should be from the original source if possible, unless from paywalled or non-English sources. If the title is clickbait or lacks context you may lightly edit the title.)
Well that's pretty shitty, I assumed it could only be a CUDA blob at this point anyway rather than any specific hardware, so why drop support?
Edit: ah so it's 32-bit CUDA in general they're killing off, which makes a bit more sense as that probably does result in hardware differences.
Hopefully they open source it at least
Hahahahaha Nvidia open sourcing anything? They literally fight tooth and nail against any form of open source every single chance they get. Only under a ton of pressure will they give any ground to open source.
Nvidia is one of the more anti-open-source companies.
I thought my 3080 was an irresponsible splurge when I bought it, but every day I love that thing more and more.
Shit, with the current 50 series pricing and availability, the 4090 I got myself for Christmas is looking responsible too. It doesn't even need a 1000 watt PSU.
I was playing Arkham Knight last night on GeForce Now and I could not for the life of me get the fancy fog and debris to work. Every time I'd turn them on, the game would tell me I needed to restart. Once I did, the settings would just revert, even though I had the option turned on to have GeForce Now save my game configs. At the time I thought it was a bug, since afaik the 4080 they're using on their rigs supports these features fine. Now I'm wondering if it was an intentional choice to not allow those features on GeForce Now, so as not to make the 50 series cards look bad.
"Pay us and we'll stream it to you instead!"
Seems like my trusty 3090 will run for a long time
And it's not the first downgrade. I've noticed a decline over the generations of releases going back to before the 900 series
Is this the return of dedicated PhysX cards?
You could definitely just drop an old GPU in for PhysX. The driver still supports that. It wouldn't even need to be a good one. You could also go into the driver settings and make the CPU run PhysX if you have enough cores.
Ah, the classic:
What're you gonna use your $1000 GPU for? Locally hosting an LLM? Video editing? 3D graphics? ...Running new games on highest settings?
Nah, I'm gonna replay this 10+ year old game.
Old games are better, and there's always a way to push them further. This comment is stupid af.
Oh, I am in this group. I have a mid range PC (Ryzen 5 3600, GTX 1660S) and still mostly play indies or 5+ year old games, because they're (usually) patched and dirt cheap.
Oh, I think the double sarcasm was missed in your previous comment. My bad, man. It sounded like you were making fun of people who play old games on high-end PCs.