this post was submitted on 14 May 2024
44 points (86.7% liked)
ChatGPT
Unofficial ChatGPT community to discuss anything ChatGPT
you are viewing a single comment's thread
Local AI is the way. It's just that current models aren't GPT-4 quality yet, and you'd probably need 1 TB of VRAM to run them.
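For a sense of scale, here's the usual back-of-the-envelope estimate (weights only; real usage adds KV cache and activations on top), which is where the replies below get their numbers:

```python
# Rough VRAM needed just to hold a model's weights: params * bytes per parameter.
# Ignores KV cache and activations, so treat these as lower bounds.
def weight_vram_gb(params_billions: float, bytes_per_param: float) -> float:
    return params_billions * 1e9 * bytes_per_param / 1e9

for precision, nbytes in [("fp16", 2), ("8-bit", 1), ("4-bit", 0.5)]:
    print(f"70B @ {precision}: ~{weight_vram_gb(70, nbytes):.0f} GB")
# 70B @ fp16:  ~140 GB
# 70B @ 8-bit: ~70 GB
# 70B @ 4-bit: ~35 GB
```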
Surprisingly, there's a way to run Llama 3 70B on 4 GB of VRAM.
https://huggingface.co/blog/lyogavin/llama3-airllm
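The trick in that post is loading the model layer by layer from disk so only one layer sits in VRAM at a time. A rough sketch following the blog's example (the model repo here is a placeholder for whichever Llama 3 70B checkpoint you have access to; the official one is gated, and the exact airllm API may have changed since the post):

```python
# pip install airllm
# Sketch of the AirLLM approach from the linked post: layers stream from disk,
# so peak VRAM stays around one layer (~4 GB) instead of the whole model.
from airllm import AutoModel

model = AutoModel.from_pretrained("meta-llama/Meta-Llama-3-70B-Instruct")  # placeholder repo

input_tokens = model.tokenizer(
    ["What is the capital of the United States?"],
    return_tensors="pt",
    truncation=True,
    max_length=128,
)

output = model.generate(
    input_tokens["input_ids"].cuda(),
    max_new_tokens=20,
    use_cache=True,
    return_dict_in_generate=True,
)
print(model.tokenizer.decode(output.sequences[0]))
```

The catch is speed: reloading every layer from disk per token makes generation painfully slow, so it's more a curiosity than a daily driver.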
Llama 3 70B is pretty good, and you can run it on two RTX 3090s. Not cheap, but doable.
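The usual way to make that fit (per the arithmetic above, ~35 GB of weights in 48 GB of combined VRAM) is 4-bit quantization with the layers sharded across both cards. A minimal sketch with Hugging Face transformers + bitsandbytes, assuming you have access to the gated meta-llama repo:

```python
# pip install transformers accelerate bitsandbytes
# Minimal sketch: Llama 3 70B in 4-bit, sharded across two 24 GB GPUs.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "meta-llama/Meta-Llama-3-70B-Instruct"  # gated repo; requires approval

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",  # accelerate splits layers across both 3090s
    quantization_config=BitsAndBytesConfig(
        load_in_4bit=True,
        bnb_4bit_compute_dtype=torch.float16,
    ),
)

inputs = tokenizer("Explain VRAM in one sentence.", return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```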
You could also use something like RunPod to test it out cheaply before buying hardware.