this post was submitted on 23 Jan 2025
PC Master Race
VRAM is king for AI workloads. If you're at all interested in running LLMs, you want to maximize VRAM. The RTX 3090 and 4090 are your options if you want 24GB and CUDA. If you get a 4090, be sure you get a power supply that supports the 12VHPWR connector. Don't skimp on power. I'm a little out of the loop, but I think you'll want an ATX 3.0 / PCIe 5.0 PSU. https://www.pcguide.com/gpu/pcie-5-0-psus-explained/
If you're not interested in LLMs and you're sure your machine learning tasks don't/won't need that much VRAM, then yes, the 4070 Ti is the sweet spot.
logicalincrements.com is aimed at gaming, but it's still a great starting point for any workload. You'll probably want to go higher on memory and skew more toward multi-core performance compared to gaming builds, IMO. Don't even think about less than 32GB. A lot of build guides online skimp on RAM because they're only thinking about gaming.
This is all great info, and the power supply and RAM stuff is blowing my mind. Fortunately, my work is not LLM-related, just simple neural networks, but I don't know how that might affect best practices for hardware.
When training, you'll want far more VRAM than you need to run inference - get a 90-series GPU for the memory.
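A rough back-of-envelope sketch of why training costs so much more memory than inference: with a standard Adam setup you hold the weights, the gradients, and two fp32 optimizer moment buffers per parameter, while inference only needs the weights. The function and constants below are my own illustration, not from the thread, and it deliberately ignores activations and framework overhead (which add more on top during training).

```python
def estimate_vram_gb(n_params, dtype_bytes=4, training=True, optimizer="adam"):
    """Rough VRAM estimate in GB, ignoring activations and framework overhead.

    Assumption: fp32 weights/gradients and fp32 Adam moment buffers
    (2 extra values per parameter). Inference only needs the weights.
    """
    weights = n_params * dtype_bytes
    if not training:
        return weights / 1e9
    grads = n_params * dtype_bytes
    # Adam keeps two moment buffers per parameter (8 bytes at fp32)
    optim_states = n_params * 8 if optimizer == "adam" else 0
    return (weights + grads + optim_states) / 1e9

# A hypothetical 1-billion-parameter fp32 model:
print(estimate_vram_gb(1e9, training=False))  # 4.0  -> ~4 GB just to load it
print(estimate_vram_gb(1e9, training=True))   # 16.0 -> ~16 GB before activations
```

So even before counting activations, training can need roughly 4x the memory of inference, which is why the extra VRAM on a 3090/4090 matters so much more for training than for running a model.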