this post was submitted on 23 Jan 2025
29 points (100.0% liked)

PC Master Race

Today my PC finally ate it. No POST, no disk activity, so I'm pretty sure the mobo has failed. I built this PC 8 or 10 years ago, and I'm honestly too old and out of touch to know where to start on a rebuild lol.

I'm an Arch Linux user, my job is in machine learning, and I'm looking at a soup-to-nuts rebuild, but I don't know where to start. I want as much future-proofing as I can get, and I'm happy with a budget anywhere from $2k to $8k. I don't game now, but I might want to in the future.

So it seems like, to leverage good ML tooling, I'm locked into CUDA, so probably an Nvidia GPU. Does that mean the 4070 Ti is the knee in the curve? CPU-wise I came from AMD, but I have no idea anymore. RAM speed is something I have never once considered. And mobo-wise, I have a couple of M.2 drives now, but I'm not sure what else should drive the decision. I have one monitor currently that I intend to replace, so I'm not sure why I would need multiple GPUs or anything that requires a lot of PCIe connections.

I want a plain old closed black case, no color-changing gamer shit, and about as much computing power as I can get. PCPartPicker came up a little short, so how do I start?

I've got maybe a week of lead time, then I would like to pull the trigger. This whole build process was a lot easier circa 2003!

[–] [email protected] 4 points 20 hours ago (5 children)

VRAM is king for AI workloads. If you're at all interested in running LLMs, you want to maximize VRAM. The RTX 3090 or 4090 are your options if you want 24GB and CUDA. If you get a 4090, be sure you get a power supply that supports the 12VHPWR connector. Don't skimp on power. I'm a little out of the loop, but I think you'll want a PCIe 5.0 PSU. https://www.pcguide.com/gpu/pcie-5-0-psus-explained/
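To make the "maximize VRAM" point concrete, here's a back-of-envelope sketch of how much memory a model's weights alone occupy: parameter count times bytes per parameter. (The function name and the precision values are my own illustration, not anything from a specific library; real usage also needs headroom for KV cache and activations.)

```python
def weights_vram_gb(params_billion: float, bytes_per_param: float = 2.0) -> float:
    """Approximate VRAM (GB) needed just to hold a model's weights.

    bytes_per_param: 2.0 for fp16/bf16, 1.0 for 8-bit quantization,
    roughly 0.5 for 4-bit quantization.
    """
    return params_billion * 1e9 * bytes_per_param / 1e9  # bytes -> GB

# A 13B-parameter model in fp16 needs ~26 GB for weights alone,
# which already overflows a 24 GB card before any overhead.
print(weights_vram_gb(13, 2.0))  # 26.0

# The same model quantized to 4-bit fits comfortably: ~6.5 GB.
print(weights_vram_gb(13, 0.5))  # 6.5
```

That's why 24GB cards are the practical floor for running mid-size LLMs without aggressive quantization.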

If you're not interested in LLMs and you're sure your machine learning tasks don't/won't need that much VRAM, then yes, the 4070 Ti is the sweet spot.

logicalincrements.com is aimed at gaming, but it's still a great starting point for any workload. You'll probably want to go higher on memory and skew more toward multi-core performance compared to gaming builds, IMO. Don't even think about less than 32GB. A lot of build guides online skimp on RAM because they're only thinking about gaming.

[–] [email protected] 2 points 20 hours ago (4 children)

This is all great info, and the new power supply and RAM kit stuff is blowing my mind. Fortunately my work is not LLM-related, just simple neural networks, but I don't know how that might affect hardware best practices.

[–] scribbler 3 points 18 hours ago

When training, you'll want way more VRAM than you need for inference alone - get a 90-series GPU for the memory.
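A rough illustration of why training needs so much more memory than inference: with an Adam-style optimizer you hold the weights, the gradients, and two optimizer moment estimates per parameter, before even counting activations. This sketch uses the common 4x fp32-Adam rule of thumb; the function names are mine, and activation memory (which is batch- and architecture-dependent) is deliberately left out.

```python
def inference_vram_gb(params_million: float, bytes_per_param: float = 4.0) -> float:
    """Approximate inference VRAM (GB): essentially just the weights."""
    return params_million * 1e6 * bytes_per_param / 1e9

def training_vram_gb(params_million: float, bytes_per_param: float = 4.0) -> float:
    """Approximate training VRAM (GB) for weights + gradients + Adam moments.

    Counts 4 copies per parameter (weights, gradients, and Adam's two
    moment estimates); activation memory is extra and batch-dependent.
    """
    copies = 4
    return params_million * 1e6 * bytes_per_param * copies / 1e9

# A 100M-parameter network in fp32: ~0.4 GB to run, ~1.6 GB to train
# before activations, so the training headroom adds up fast.
print(inference_vram_gb(100))  # 0.4
print(training_vram_gb(100))   # 1.6
```

Even for "simple" networks, that multiplier plus activations and batch size is what eats a 24GB card.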
