this post was submitted on 03 Dec 2024
242 points (97.6% liked)

Technology



If even half of Intel's claims are true, this could be a big shake-up in the midrange market, which has been entirely abandoned by both Nvidia and AMD.

[–] [email protected] 12 points 2 months ago (1 children)

Yeah, AMD and Intel should be offering high-VRAM SKUs for hobbyists. I doubt it would cost them much to double the RAM, and they could mark those cards up a bit.

I'd buy the B580 if it had 24GB of RAM; at 12GB, I'll probably give it a pass because my 6650 XT is still fine.

[–] M600 2 points 2 months ago (2 children)

Don’t you need Nvidia cards to run AI stuff?

[–] [email protected] 12 points 2 months ago* (last edited 2 months ago)

Nah, ollama works with AMD just fine; you just need a model that fits in your VRAM.

I'm guessing someone would get Intel to work as well if they had enough VRAM.
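The "fits in your VRAM" point is the crux of the 12GB vs. 24GB debate above. A rough back-of-envelope sketch (my own illustration, not from the thread): a model's weight footprint is roughly parameter count times bits per weight, so quantization decides what a given card can hold, before counting KV cache and runtime overhead.

```python
def weight_vram_gb(params_billions: float, bits_per_weight: float) -> float:
    """Rough VRAM (decimal GB) needed just for an LLM's weights,
    ignoring KV cache and runtime overhead."""
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

# A 7B model at 4-bit quantization is about 3.5 GB of weights, easy on 12GB.
print(round(weight_vram_gb(7, 4), 1))   # 3.5
# A 24B model at 4-bit is about 12 GB of weights alone, which is why a
# hypothetical 24GB B580 would open up a much larger class of models.
print(round(weight_vram_gb(24, 4), 1))  # 12.0
```

This is only a lower bound; real usage adds a few GB for context and activations.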

[–] Wooki 3 points 2 months ago