this post was submitted on 31 Jan 2025
71 points (83.8% liked)

Technology

top 24 comments
[–] [email protected] 43 points 1 day ago (3 children)

The consumer GPU market is becoming a dystopia at the top end. AMD has publicly retreated from it and Intel is likely a decade away from competing there. I guess I'll stay in the midrange moving forward. Fuck Nvidia.

😞

[–] AtHeartEngineer 20 points 1 day ago (2 children)

If AMD were smart, they'd release an upper-midrange card with 40+ GB of VRAM. It doesn't even have to be their high-end card; people wanting to do local/self-hosted AI stuff would swarm on those.
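
For a rough sense of why that much VRAM matters, here's a back-of-the-envelope sketch; the bytes-per-parameter figures are the usual approximations, and the 20% overhead for KV cache and activations is an assumption:

```python
# Rough VRAM estimate for running a local LLM: the weights dominate, so
# bytes-per-parameter times parameter count gives a ballpark figure.
# The 20% overhead for KV cache and activations is an assumption.

BYTES_PER_PARAM = {"fp16": 2.0, "q8": 1.0, "q4": 0.5}

def vram_gb(params_billion: float, quant: str, overhead: float = 1.2) -> float:
    """Approximate VRAM (in GB) needed to hold the model weights."""
    return params_billion * BYTES_PER_PARAM[quant] * overhead

for size in (7, 32, 70):
    for quant in ("fp16", "q8", "q4"):
        print(f"{size}B @ {quant}: ~{vram_gb(size, quant):.0f} GB")

# A 32B model at 4-bit quantization lands around ~19 GB, which is why
# cards with 24-40+ GB of VRAM are so attractive for local inference.
```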

[–] foggenbooty 2 points 12 hours ago (1 children)

Please just give us self-hosting nerds SR-IOV on affordable cards. I really want to have a Linux VM and a Windows VM that both have access to a GPU simultaneously.

I was hoping Intel would let some of these enterprise-locked features trickle down as a value-add, but no dice. Every year AMD just undercuts NVIDIA by a small amount, but it doesn't compete on some of the tech NVIDIA has, so it's a wash.

But they're too concerned it would eat into their enterprise cards, where they make boatloads, so it's not going to happen. Imagine if consumer CPUs didn't support virtualization; it would be insane, and that's where we are with GPUs today.
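
For anyone curious whether their card even advertises SR-IOV, here's a minimal sketch that reads the standard Linux sysfs attributes; on most consumer GPUs the virtual-function count simply comes back as zero or missing, which is exactly the complaint:

```python
# Minimal sketch: check whether any PCI display controller on a Linux host
# exposes SR-IOV virtual functions (sriov_totalvfs > 0). On most consumer
# GPUs this file is absent or reads 0.
from pathlib import Path

for dev in Path("/sys/bus/pci/devices").iterdir():
    cls = (dev / "class").read_text().strip()
    if not cls.startswith("0x03"):   # 0x03xxxx = display controllers
        continue
    vf_file = dev / "sriov_totalvfs"
    total_vfs = int(vf_file.read_text()) if vf_file.exists() else 0
    print(f"{dev.name}: SR-IOV VFs supported = {total_vfs}")
```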

[–] [email protected] 1 points 12 hours ago

@foggenbooty @AtHeartEngineer I spent hours and hours trying to get GPU passthrough working on my VMs and only found agony and despair.
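
For what it's worth, the usual first debugging step for VFIO passthrough is checking how the kernel has split devices into IOMMU groups; this is a minimal sketch that just reads the standard sysfs layout, assuming a Linux host with the IOMMU enabled:

```python
# Minimal sketch: list IOMMU groups and the PCI devices in each. For VFIO
# passthrough, the GPU (and its audio function) should sit in a group that
# isn't shared with devices the host still needs.
from pathlib import Path

groups = Path("/sys/kernel/iommu_groups")
if not groups.exists():
    print("No IOMMU groups - is the IOMMU enabled in firmware and on the kernel cmdline?")
else:
    for group in sorted(groups.iterdir(), key=lambda p: int(p.name)):
        devices = [d.name for d in (group / "devices").iterdir()]
        print(f"group {group.name}: {', '.join(devices)}")
```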

[–] eager_eagle 5 points 1 day ago* (last edited 1 day ago) (1 children)

Yeah, I've been wanting a card like that to run local models since 2020, when I got a 3080. Back then I'd have spent a bit more to get one with the same performance but some 20GB of VRAM.

Nowadays, if they released an RX 9070 with at least 24GB at a price between the 16GB model and an RTX 5080 (also 16GB), that would be neat.

[–] AtHeartEngineer 3 points 1 day ago

Same, I've got a modded 2080 Ti with 22GB of VRAM running DeepSeek 32B and it's great... but it's an old card, and with it being modded I don't know what its life expectancy is.
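
For anyone wanting to poke at a setup like that, here's a minimal sketch for querying a locally hosted model, assuming it's served through something like Ollama; the model tag and port are assumptions, so adjust to your own setup:

```python
# Minimal sketch: query a locally hosted model through Ollama's HTTP API.
# The model tag and default port are assumptions; adjust to your setup.
import json
import urllib.request

payload = json.dumps({
    "model": "deepseek-r1:32b",   # assumed tag for a 32B distill
    "prompt": "Summarize why VRAM matters for local inference.",
    "stream": False,
}).encode()

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```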

[–] Diplomjodler3 26 points 1 day ago (6 children)

I don't get why people are so keen on handing over such a huge amount of money just for bragging rights. The midrange is perfectly fine for playing any game these days. Those top end GPUs are getting an absolutely inordinate amount of attention compared to the relevance they have to most people.

[–] atempuser23 1 points 9 hours ago

It's a hobby. It's easy to drop thousands on the expensive ends of hobbies.

As for the amount of attention: I've watched far more sports car reviews than mid-size sedan reviews.

[–] [email protected] 4 points 1 day ago (1 children)

Another problem is how big games are made now: they're made (relatively) quickly and perform poorly. GPUs from two or three generations ago could be running beautiful games at beautiful framerates; instead, those games run like ass. Nvidia wants these games to rely on their DLSS shit to give people a reason to keep buying new GPUs every cycle. So people feel like they need to upgrade when they really don't; they should stop buying these poorly made games instead.

[–] atempuser23 2 points 9 hours ago

I think this is on purpose. The kind of people who spend $1000+ on a GPU are more likely to spend $75 on a new release video game.

[–] filister 10 points 1 day ago

The problem is that NVIDIA is consistently gimping the midrange, making it a very unattractive proposition.

[–] poleslav 6 points 1 day ago

As someone who does VR in flight sims on one of the least optimized games (DCS), I can see the allure. Aside from that one niche, though, I can't think of many uses for a 90-series card.

[–] tabular 2 points 1 day ago

I want 4K 144Hz 🀠

[–] [email protected] 3 points 1 day ago

Yup, my 6650 XT is perfectly fine, and my SO has a 6700 XT. Both are way more than we need, and we paid $200-300 for them on sale. Why get the top end? Mine is roughly equivalent to current consoles, so I doubt I'm missing out on much except RTX, but I also don't care enough about RTX to 10x my GPU cost.

[–] latenightnoir 3 points 1 day ago* (last edited 1 day ago)

This was my exact thinking the moment I realised I, yet again, needed a GPU upgrade (thanks, Unreal 5...), which is why I seared my soul and dished out for a 4080 Super, in the hope that I'll be covered for at least a decade. The 40-series at least still seems to be built mainly for pretty pictures.

Genuinely not worth paying attention to this nonsense. Maybe - MAYBE - AMD will pull a Comrade and shift its full focus to creating genuinely good and progressively better GPUs, meant for friggin' graphics processing and not this "AI" tumor. But that's a big-ass "maybe."

[–] latenightnoir 38 points 1 day ago

Aaah, they're doing the scarcity thing... Cool. Coolcoolcool.

[–] TheGrandNagus 26 points 1 day ago* (last edited 1 day ago) (1 children)

Sigh. Nvidia and Intel are doing paper launches, while AMD has delayed to March.

[–] halcyoncmdr 41 points 1 day ago (2 children)

Nvidia doesn't give a shit about gamers anymore. The incremental improvements are a side effect. That's why they're so focused on software enhancements like DLSS instead; it gives them the marketing numbers without having to make hardware improvements for gaming.

Their bread and butter now is AI and large-scale machine learning, where businesses are buying thousands of cards at a time. It's also why they're so stingy with VRAM on their cards: large amounts of VRAM aren't as necessary for most workloads outside gaming now, and it saves them millions of dollars every generation.

[–] TheGrandNagus 6 points 1 day ago* (last edited 1 day ago) (1 children)

You're right; however, I'd say that Nvidia has always been stingy with VRAM. The 1060 had 6GB while the RX 480 had 8GB, for example; the 970 had 3.5GB of usable VRAM while the R9 390 had 8GB, and there are similar examples going back a long way.

It has gotten pretty bad recently, worse than normal. AI is also very VRAM-intensive (even more so than gaming), so I imagine they've been diverting those chips to their AI/enterprise cards.

[–] [email protected] 1 points 15 hours ago (1 children)

Well, Nvidia seemingly forgot to price gouge on VRAM for the 3060, and they had a 12GB standard version for a while. That should have been the low-end standard, with 24GB for mid and 32GB for high, but they've adjusted.

[–] TheGrandNagus 3 points 13 hours ago* (last edited 11 hours ago) (1 children)

They only did that because they were forced by AMD's VRAM choices and unexpectedly great RDNA2 architecture.

Because of the memory bus the 3060 had, it essentially had to have either 6GB or 12GB of VRAM, and it would have looked stupid next to AMD with only 6GB, so they changed it to 12GB fairly late in development.

It led to the bizarre situation of the 3060 Ti (based on the 3070 die) having less VRAM at 8GB.

So yeah, less that they didn't want to price gouge, more that AMD was giving 12GB for similarly priced cards that were also much faster, and Nvidia knew that 6GB would look like a joke in comparison.
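
As a rough illustration of that bus-width constraint: each GDDR6 chip sits on a 32-bit channel and, at the time, came in 1GB or 2GB densities, so the bus width fixes the number of chips and therefore the capacity options (clamshell designs aside):

```python
# Rough illustration of the bus-width constraint: each GDDR6 chip uses a
# 32-bit interface, and common densities were 1 GB or 2 GB per chip, so
# the bus width fixes the chip count and hence the capacity options.

def vram_options(bus_width_bits: int, densities_gb=(1, 2)) -> list[int]:
    chips = bus_width_bits // 32          # one chip per 32-bit channel
    return [chips * d for d in densities_gb]

print("RTX 3060 (192-bit bus):", vram_options(192), "GB")     # [6, 12]
print("RTX 3060 Ti (256-bit bus):", vram_options(256), "GB")  # [8, 16]
```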

[–] [email protected] 2 points 12 hours ago

Thanks, I didn't know that!

[–] TheDemonBuer 8 points 1 day ago

Nvidia doesn't give a shit about gamers anymore... Their bread and butter now is AI and large-scale machine learning, where businesses are buying thousands of cards at a time.

I'm just quoting this for emphasis.

[–] [email protected] 14 points 1 day ago

How nice of Nvidia, helping gamers save money in these hard times /s