this post was submitted on 02 Feb 2025
280 points (79.7% liked)

Technology

(page 2) 50 comments
[–] [email protected] 399 points 1 week ago (9 children)

Edward Snowden doing GPU reviews? This timeline is becoming weirder every day.

[–] [email protected] 1 points 6 days ago

Does he work for Nvidia? Seems out of character for him.

[–] Winged_Hussar 91 points 1 week ago

Legitimately thought this was a hard-drive.net post

[–] eager_eagle 49 points 1 week ago (1 children)

I bet he just wants a card to self-host models and not give companies his data, but the amount of VRAM is indeed ridiculous.

[–] [email protected] 25 points 1 week ago (8 children)

Exactly, I'm in the same situation now, and the 8GB in those cheaper cards doesn't even let you run a 13B model. I'm trying to research whether I can run a 13B one on a 3060 with 12 GB.

[–] [email protected] 4 points 1 week ago

I'm running deepseek-r1:14b on a 12GB rx6700. It just about fits in memory and is pretty fast.
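The back-of-the-envelope math behind this exchange can be sketched in a few lines. This is a rough sketch, not a precise model: the `vram_gb` function and the fixed 1.5 GB overhead constant are assumptions for illustration, and real usage also depends on KV cache size, context length, and runtime overhead.

```python
# Rough VRAM estimate for loading an LLM's weights locally.
# Assumption: memory ~= (parameters * bits per weight) plus a fixed
# overhead for the runtime, KV cache, etc. (1.5 GB is a guess).
def vram_gb(n_params_billion: float, bits_per_weight: int,
            overhead_gb: float = 1.5) -> float:
    """Approximate GB of VRAM needed: weights plus fixed overhead."""
    weights_gb = n_params_billion * 1e9 * bits_per_weight / 8 / 1e9
    return weights_gb + overhead_gb

for bits in (16, 8, 4):
    print(f"13B model at {bits}-bit: ~{vram_gb(13, bits):.1f} GB")
```

Under these assumptions, a 13B model needs roughly 27.5 GB at 16-bit, 14.5 GB at 8-bit, and about 8 GB at 4-bit quantization — which is why a 4-bit quantized ~13–14B model "just about fits" on a 12 GB card, while an 8 GB card is out of the question.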

[–] [email protected] 63 points 1 week ago (4 children)

Everyone who bought the 7900 XTX is laughing their arse off, running 20 GiB models with MUCH better performance than a 4080/4080 Super lol

[–] [email protected] 39 points 1 week ago

He's not wrong

[–] deleted 28 points 1 week ago (2 children)

I legit tried to understand how a lackluster VRAM capacity could spy on us.

[–] avieshek 1 points 6 days ago

You depend on the cloud instead~

[–] timewarp 26 points 1 week ago* (last edited 1 week ago) (6 children)

The video card monopoly (but also other manufacturers) has been limiting functionality for a long time. It started with them restricting vGPU — the feature that lets Linux users virtualize their GPU for things like playing Windows games at near-native speed from Linux — to enterprise garbage products. This is one of the big reasons Windows still has such a large market share as the main desktop OS.

Now they want to restrict people from running AI locally so that they get stuck with crap like Copilot-enabled PCs or whatever dumb names they come up with. These actions are intentional. It is anti-consumer and anti-competitive, but don't expect our government to care or do anything about it.

[–] [email protected] 1 points 6 days ago

But that's assuming there is actual high demand for running big models locally; so far I've only seen hobbyists do it.

I agree with you in theory that they just want more money, but idk if they actually think locally run AI is that big of a threat (I hope it is).

[–] ObviouslyNotBanana 24 points 1 week ago (2 children)

What the fuck is going on with the world
