this post was submitted on 27 Jan 2024
23 points (96.0% liked)
LocalLLaMA
Community to discuss LLaMA, the large language model created by Meta AI.
This is intended to be a replacement for r/LocalLLaMA on Reddit.
To do general-purpose GPU compute on AMD hardware you need a GPU that is supported by ROCm (AMD's equivalent of CUDA). Most gaming GPUs are not.
There is a list here, but be aware that it covers the latest ROCm version; some tools may still use older versions with a different set of supported devices.
https://rocm.docs.amd.com/projects/install-on-linux/en/latest/reference/system-requirements.html#supported-gpus
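If you just want to check whether your setup works end to end, here's a minimal sketch assuming a ROCm build of PyTorch is installed (ROCm builds expose the GPU through the usual `torch.cuda` API):

```python
# Minimal ROCm sanity check, assuming a ROCm build of PyTorch
# (installed from the ROCm wheels, not the default CUDA ones).
import torch

# torch.version.hip is set on ROCm builds and None on CUDA builds.
print("HIP version:", torch.version.hip)

# ROCm builds reuse the torch.cuda namespace for the GPU.
if torch.cuda.is_available():
    print("Detected GPU:", torch.cuda.get_device_name(0))
else:
    print("No ROCm-supported GPU detected.")
```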
Has that changed recently? I've run ROCm successfully on an RX 6800. I seem to recall the card was supported; it was the host OS (Arch) that was not.
No, gfx1030 (the architecture the RX 6800 uses) is still supported.
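If you want to confirm which gfx target your card reports, here's a small sketch that calls `rocminfo` (assumes the ROCm runtime is installed so the `rocminfo` binary is on PATH; the regex is an assumption about its output format):

```python
# List the gfx targets rocminfo reports. On an RX 6800 this
# should include gfx1030. Parsing the text output with a regex
# is an assumption about rocminfo's format.
import re
import subprocess

out = subprocess.run(["rocminfo"], capture_output=True, text=True, check=True).stdout
print(sorted(set(re.findall(r"gfx[0-9a-f]+", out))))
```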