this post was submitted on 17 Dec 2024

Technology

[–] brucethemoose 6 points 20 hours ago (2 children)

You should think about selling it TBH. 3090 prices are shooting up like crazy, and may be at a peak, because they are the last affordable card to self host LLMs.

[–] [email protected] 2 points 14 hours ago* (last edited 14 hours ago) (1 children)

Never even thought of that, is there a good website to sell a GPU on or is it pretty much just eBay?

I just don't play games like I used to, just videos now. Poor thing hardly gets any use.

[–] brucethemoose 2 points 11 hours ago

You could list it locally depending on where you are, through FB marketplace or Craigslist.

Otherwise, yeah, eBay.

[–] [email protected] 3 points 19 hours ago (1 children)

Can’t you run LLMs on 4090/5090 maybe 5080? Basically any Nvidia card with 24GB+ of VRAM?

[–] brucethemoose 9 points 18 hours ago* (last edited 18 hours ago) (1 children)

Yeah, but they're not worth it.

The 4090 is basically no better than the 3090 for this because it has the same amount of VRAM, but twice the price... so you might as well get 2x 3090s.

The 5090 will be hilariously expensive, and 24GB -> 32GB is not that great, as you still can't run 70B class models in that pool... again, might as well get 2x 3090s. I would not even bother trading my single 3090 for a 5090.
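A rough sketch of the arithmetic behind the "32GB still can't run a 70B model" claim. The numbers here are my assumptions, not from the thread: roughly 0.5 bytes per parameter for common 4-bit quantization, plus a hypothetical ~20% overhead for KV cache and activations at modest context lengths.

```python
# Back-of-envelope VRAM estimate for running a quantized LLM.
# Assumed figures: ~0.5 bytes/param at 4-bit quant, ~20% runtime overhead.

def vram_needed_gb(params_billions: float,
                   bytes_per_param: float = 0.5,
                   overhead: float = 0.20) -> float:
    """Very rough VRAM requirement in GB for inference."""
    weights_gb = params_billions * bytes_per_param  # 1e9 params * bytes -> GB
    return weights_gb * (1 + overhead)

# Compare a 70B-class model against common card/pool sizes.
for card_gb in (24, 32, 48):
    need = vram_needed_gb(70)
    verdict = "fits" if need <= card_gb else "does NOT fit"
    print(f"70B model needs ~{need:.0f} GB -> {verdict} in {card_gb} GB")
```

Under those assumptions a 70B model wants roughly 42 GB, so it overflows both a single 24 GB card and a 32 GB 5090, but lands comfortably in the 48 GB of 2x 3090s.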

If AMD sold a 48GB consumer card, you would see them dominate the open source LLM space in a month, because every single backend dev would buy one and get their projects working on them. Same with Intel. VRAM is basically the only thing that matters, and 24GB is kinda pitiful at a 4090's price.

[–] [email protected] 4 points 18 hours ago

I'd already be happy if AMD goes with 24 GB on their upper midrange cards, but I would not be surprised if they stick with 16 GB. 48 GB seems extremely unlikely, unfortunately.

Doing LLMs with 8 GB is not fun, especially not with RDNA 2 which has so many issues with ROCm.