this post was submitted on 11 Mar 2025
34 points (100.0% liked)

PC Master Race


Hey team,

Years ago, SO gifted me an Alienware Aurora R7.

It has an Intel i7-8700, an Nvidia GTX 1080 (8 GB VRAM), and 16 GB of DDR4 RAM. (What else is relevant to this question?)

My question to you is basically this -

Given that I'm not gaming with it anymore, I want to use it for only two things -

  1. Plex Server
  2. Running random local LLM stuff like Kotaemon (https://github.com/Cinnamon/kotaemon)

Let's say I have $1000 to throw at a GPU and I'd like to get 16 GB VRAM (or 24 if it's possible). I want to install it myself rather than take it into a shop.

What's a GPU I can buy that will fit:

  1. within my budget?
  2. within the chassis?
  3. with the CPU and motherboard without issues?
  4. with the needs I've detailed (namely Plex transcoding, and running ML models)?

I am an absolute noob when it comes to figuring out what hardware to buy (hey, we got us an Alienware sucker here).

So lemmings, help me out! I'd rather not ask ChatGPT.

[–] [email protected] 9 points 1 week ago (1 children)

Not 100% sure about the chassis and compatibility with the rest of your rig, so hopefully someone can double-check me, but in general I would try to get the AMD 9070 XT if you can find it near its MSRP ($600). That might be a bit difficult since it just launched, so I would wait until it's more widely available if you can.

If not that, I'd recommend the 4060 Ti (16 GB). Note that there's an 8 GB version too; do NOT buy that one. Nvidia is generally better for running AI, but the new AMD GPUs are pretty good too.
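For a sense of why 16 GB is the sweet spot, here's some rough sizing math. The bits-per-weight figure and the 20% overhead factor are ballpark assumptions, not exact; real usage also grows with context length:

```python
def est_vram_gb(params_billions: float, bits_per_weight: float,
                overhead: float = 1.2) -> float:
    """Rough VRAM estimate: quantized weights plus ~20% for KV cache/runtime."""
    return params_billions * bits_per_weight / 8 * overhead

# Common GGUF Q4-class quants land around 4-5 bits per weight.
for params, label in [(8, "8B"), (13, "13B"), (32, "32B"), (70, "70B")]:
    need = est_vram_gb(params, 4.5)
    print(f"{label} @ ~Q4: ~{need:.1f} GB (fits 16 GB: {need <= 16}, 24 GB: {need <= 24})")
```

By this math, 8B and 13B models at Q4 fit comfortably in 16 GB, a 32B only squeezes into 24 GB, and 70B needs neither card but a multi-GPU or CPU-offload setup.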

[–] damnthefilibuster 4 points 1 week ago (2 children)

Thank you for that! I’ll look at the AMD. I thought most ML tools didn’t support AMD out of the box, though? Is that no longer the case?

The 4060 Ti 16 GB version… that sounds good. About $500?

[–] [email protected] 7 points 1 week ago* (last edited 1 week ago) (1 children)

AMD's compatibility has gotten better over the past year, and tools that aren't compatible usually have workarounds. But yeah, Nvidia is probably the safer bet if LLMs are important to you.

The 4060 Ti 16 GB version… that sounds good. About $500?

More like ~$800-900 unless you're okay with buying used. The market is pretty darn bad, and it's gotten SO much worse due to the tariff scare. Like I said, you're better off waiting a few months if you can.

[–] damnthefilibuster 2 points 1 week ago (2 children)

I don’t shy away from buying refurb electronics. But is there a problem with buying used GPUs?

Not looking forward to buying new during this tariff era. So perhaps my local marketplaces might be best…

[–] Mistic 3 points 1 week ago

Not really. You just have to make sure you got what you ordered by doing a bunch of tests. Plus, you usually don't get a warranty.
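One of those tests is simply confirming the card reports the VRAM it was sold with (fake/reflashed cards often don't). A minimal sketch; the 5% tolerance is an assumption to allow for driver-reserved memory, and the pynvml (nvidia-ml-py) usage is hypothetical since it needs an Nvidia card and driver present:

```python
def vram_as_advertised(reported_bytes: int, advertised_gib: int,
                       tolerance: float = 0.05) -> bool:
    """True if the driver-reported VRAM is within tolerance of the box spec."""
    return reported_bytes >= advertised_gib * 1024**3 * (1 - tolerance)

# On the machine itself, nvidia-ml-py (pynvml) can supply the real number.
# Hypothetical usage (requires an Nvidia card and driver):
# import pynvml
# pynvml.nvmlInit()
# handle = pynvml.nvmlDeviceGetHandleByIndex(0)
# mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
# print(vram_as_advertised(mem.total, 16))
```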

[–] deepfriedchril 3 points 1 week ago

I've bought used for my last 2 gpus, no issues.

[–] [email protected] 3 points 1 week ago (1 children)

https://github.com/ollama/ollama/blob/main/docs/gpu.md

Ollama supports AMD and Nvidia GPUs, including your existing 1080.
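Once it's running, Ollama serves a local REST API on port 11434, so wiring it into other tools takes only the standard library. A sketch; the model name is just an example and must be pulled first:

```python
import json
import urllib.request

def generate_request(model: str, prompt: str) -> dict:
    """Build a non-streaming request body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(model: str, prompt: str,
               host: str = "http://localhost:11434") -> str:
    """POST a prompt to a local Ollama server and return the generated text."""
    body = json.dumps(generate_request(model, prompt)).encode()
    req = urllib.request.Request(f"{host}/api/generate", data=body,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Needs a running server and a pulled model, e.g. `ollama pull llama3`:
# print(ask_ollama("llama3", "Why does VRAM matter for LLMs?"))
```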

[–] damnthefilibuster 2 points 1 week ago

Wow, thanks for finding that doc. Yeah, I’ve made it work on my system. But I’d like to use some of the bigger models.