this post was submitted on 27 Jan 2024
LocalLLaMA
Community to discuss LLaMA, the large language model created by Meta AI.
This is intended to be a replacement for r/LocalLLaMA on Reddit.
Llama.cpp supports OpenCL as well, and in my limited experience it performs better than ROCm. OpenCL should work on basically any GPU.
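For reference, a minimal sketch of building and running llama.cpp with its CLBlast-based OpenCL backend as it shipped around this time; the build flag and binary name reflect that era and may differ in newer releases, and the model path is just a placeholder:

```sh
# Build llama.cpp with the CLBlast (OpenCL) backend
# (assumes CLBlast and the OpenCL headers are already installed).
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make LLAMA_CLBLAST=1

# Offload layers to the GPU with -ngl; any GPU with a working
# OpenCL driver should be usable. Model path is hypothetical.
./main -m models/llama-2-7b.Q4_K_M.gguf -ngl 32 -p "Hello"
```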