this post was submitted on 06 Sep 2023
26 points (93.3% liked)
LocalLLaMA
Community to discuss LLaMA, the large language model created by Meta AI.
This is intended to be a replacement for r/LocalLLaMA on Reddit.
you are viewing a single comment's thread
If you don't need CUDA or AI, the 7900 is great.
You can port CUDA apps to ROCm HIP and run them on AMD GPUs — the hipify tools do most of the conversion automatically. It's easy.
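For anyone curious, here's roughly what the result looks like: a minimal HIP vector-add sketch (file name and build line are just assumptions, and real code would check the return values of the hip* calls). hipify-perl/hipify-clang generate code like this from existing CUDA sources by swapping cudaMalloc for hipMalloc, cudaMemcpy for hipMemcpy, and so on, while the kernel itself stays the same.

```cpp
// Minimal sketch of a CUDA-style kernel ported to HIP.
// Build (assuming a ROCm install): hipcc vector_add.hip -o vector_add
#include <hip/hip_runtime.h>
#include <cstdio>
#include <vector>

__global__ void vector_add(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // same indexing as CUDA
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    std::vector<float> ha(n, 1.0f), hb(n, 2.0f), hc(n);

    // hipMalloc / hipMemcpy are drop-in replacements for the cuda* calls.
    float *da, *db, *dc;
    hipMalloc((void**)&da, n * sizeof(float));
    hipMalloc((void**)&db, n * sizeof(float));
    hipMalloc((void**)&dc, n * sizeof(float));
    hipMemcpy(da, ha.data(), n * sizeof(float), hipMemcpyHostToDevice);
    hipMemcpy(db, hb.data(), n * sizeof(float), hipMemcpyHostToDevice);

    // Portable replacement for the CUDA <<<blocks, threads>>> launch syntax.
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    hipLaunchKernelGGL(vector_add, dim3(blocks), dim3(threads), 0, 0, da, db, dc, n);

    hipMemcpy(hc.data(), dc, n * sizeof(float), hipMemcpyDeviceToHost);
    printf("c[0] = %f\n", hc[0]);  // expect 3.0

    hipFree(da); hipFree(db); hipFree(dc);
    return 0;
}
```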
Whoa, new to me. I'll have to dig in on that.