this post was submitted on 07 Dec 2023
307 points (97.8% liked)

Technology

all 27 comments
[–] Thcdenton 80 points 9 months ago (1 children)
[–] [email protected] 32 points 9 months ago (2 children)

Who is that for? I would like to give the middle finger to 75% of those companies.

[–] NateNate60 69 points 9 months ago

This is Linus Torvalds, creator, namesake, and supreme dictator of the Linux kernel. It's from a video of him talking about his frustrations in working with NVIDIA. Essentially, NVIDIA treats Linux like a second-class citizen and its components don't play nicely with the rest of the Linux code base. In this scene, Torvalds shows his middle finger and says "NVIDIA, fuck you!".

[–] [email protected] 47 points 9 months ago (1 children)

Linus Torvalds (Linux kernel) responding to Nvidia (who consistently treats Linux as an afterthought on the consumer side)

[–] Shadywack 22 points 9 months ago (1 children)

It’s far worse than simply ignoring what their users and customers want.

Nvidia makes a shit ton of money off of Linux. Between what their Tegra chips sold for and their data center products being used on Linux (most recently in the AI space), their company is practically built on its success in using Linux for all their backend and supercomputing research.

They are literally as successful as they are because they use Linux themselves, yet they treat the community like shit. They take, take, take, and then they go and shit on the front lawn.

Their senior leadership is ethically bankrupt trash and can go fuck themselves.

[–] bruhduh 2 points 9 months ago* (last edited 9 months ago)

Nvidia and Linux = Apple and open source, literally the same story

[–] [email protected] 60 points 9 months ago (1 children)

Took long enough - at a certain point, Nvidia's pricing just to get CUDA stops making sense compared to the cost of investing in ROCm and oneAPI.

All they had to do was find the right balance, but apparently they decided to see how much money the printer could make...
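
For context on why investing in ROCm is a credible exit from CUDA lock-in: PyTorch's ROCm builds reuse the torch.cuda interface, so much framework-level code is already vendor-neutral. A minimal sketch, assuming a PyTorch install built for either backend:

```python
import torch

# PyTorch's ROCm builds expose the torch.cuda API, so the same device
# selection works whether the install targets CUDA or ROCm.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# torch.version.hip is populated on ROCm builds, torch.version.cuda on CUDA builds.
backend = "ROCm/HIP" if getattr(torch.version, "hip", None) else "CUDA"
print(f"device={device}, backend={backend}")

x = torch.randn(2048, 2048, device=device)
y = x @ x.T  # dispatches to cuBLAS on Nvidia, rocBLAS on AMD
print(y.shape)
```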

[–] [email protected] 22 points 9 months ago

I keep hearing how good AI is at coding these days, so why can't they just use it to rewrite all the model and library code to full AMD support?

/s

[–] [email protected] 52 points 9 months ago (2 children)

Large companies that are themselves (near) monopolies see the risk of only having one supplier. This should be evident to all spectators.

[–] [email protected] 19 points 9 months ago (1 children)

Nuh, free markut regulgates itself. Smol govment only way (except for suppressing the minorities).

[–] stevehobbes 2 points 9 months ago

I mean, this is kinda the free market at work? Nvidia built and dominated a market, and AMD and Intel are pouring billions in to give people an alternative which will drive prices down?

[–] [email protected] 12 points 9 months ago

True, but I'd wager this will also help boost alternatives to Nvidia for consumers, and probably bring better Linux support along with it.

[–] [email protected] 21 points 9 months ago (1 children)

I would kill to run my models on my own AMD linux server.

[–] dublet 6 points 9 months ago (1 children)

Does GPT4all not allow that? Or do you have specific other models?

[–] [email protected] 8 points 9 months ago* (last edited 9 months ago) (2 children)

I haven't super looked into it, but I'm not interested in playing the GPU game against the gamers, so if AMD can do a Tesla equivalent with gobs of RAM and no display hardware I'd be all about it.

Right now it's looking like I'm going to build a server with a pair of K80s off eBay for a hundred bucks, which will give me 48GB of VRAM to run models in.
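
A rough sketch of how a model can be spread across two such cards with Hugging Face transformers plus accelerate. The model name is only a placeholder, and cards as old as the K80 may require older framework and driver versions than current releases support:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "EleutherAI/gpt-neox-20b"  # placeholder; pick a model that fits 48GB in fp16

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # halves memory vs fp32
    device_map="auto",          # accelerate shards layers across all visible GPUs
)

prompt = "Cheap datacenter GPUs are"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```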

[–] dublet 4 points 9 months ago (1 children)

Some of the LLMs it ships with are very reasonably sized and still impressive. I can run them on a laptop with 32GB of RAM.
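
For reference, a minimal sketch of the gpt4all Python bindings; the model file name below is just an example from their catalog, which the library downloads on first use and runs on the CPU in system RAM:

```python
from gpt4all import GPT4All

# Example model file; a 4-bit quantized 3B-7B model fits easily in 32GB of system RAM.
model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf")

with model.chat_session():
    reply = model.generate("Summarize why quantized models run fine on CPUs.", max_tokens=200)
    print(reply)
```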

[–] [email protected] 0 points 9 months ago

This is very interesting! Thanks for the link. I'll dig into this when I manage to have some time.

[–] [email protected] 4 points 9 months ago* (last edited 9 months ago)

"if AMD can do a Tesla equivalent with gobs of RAM and no display hardware I’d be all about it."

That segment of the market is less price-sensitive than gamers, which is why Nvidia is demanding the prices that they are for it.

An Nvidia H100 will give you 80GB of VRAM, but you'll pay $30,000 for it.

AMD competing more strongly with Nvidia in that sector will improve pricing, but I doubt very much that it will make compute cards cheaper than GPUs.

Besides, if you did wind up with compute cards being cheaper, you'd have gamers just rendering frames on compute cards and then using something else to push the image to the screen. I know that Linux can do that with PRIME, and I assume that Windows can as well. That'd cause their attempt to split the market by price to fail. Nah, they're going to split things up by amount of VRAM on the card, not by whether there's a video interface on it.

I suspect that a better option is to figure out ways to reasonably split up models to run on lower-VRAM GPUs in parallel.
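
A toy sketch of that last idea, assuming two CUDA-visible GPUs: put the first half of the layers on one card and the second half on the other, so only activations cross the bus. Libraries like accelerate and DeepSpeed automate this for real models:

```python
import torch
import torch.nn as nn

# Naive pipeline split: stage 0 lives on GPU 0, stage 1 on GPU 1.
stage0 = nn.Sequential(nn.Linear(4096, 4096), nn.ReLU()).to("cuda:0")
stage1 = nn.Sequential(nn.Linear(4096, 4096), nn.ReLU()).to("cuda:1")

def forward(x: torch.Tensor) -> torch.Tensor:
    x = stage0(x.to("cuda:0"))
    x = stage1(x.to("cuda:1"))  # only the activation tensor hops between cards
    return x

out = forward(torch.randn(8, 4096))
print(out.shape, out.device)  # torch.Size([8, 4096]) cuda:1
```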

[–] [email protected] 11 points 9 months ago (1 children)
[–] [email protected] 3 points 9 months ago (1 children)

Thanks. I usually submit biz-related stuff here and hardware-related stuff to c/[email protected]. It's not very active, but I haven't really found a good replacement for r/hardware on Lemmy.

[–] [email protected] 4 points 9 months ago

I see.

On another note, the way you linked the sub doesn't work for me, should be: [email protected]