this post was submitted on 01 Oct 2024
366 points (91.2% liked)

Programmer Humor

[–] bi_tux 6 points 4 days ago (4 children)

You don't even need a supported GPU; I run Ollama on my RX 6700 XT
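
For anyone curious how an officially-unsupported card like the RX 6700 XT (gfx1031) can work: a common community workaround is to make ROCm treat it as the supported gfx1030 target via an environment variable. This is a sketch based on community usage, not official AMD or Ollama support, so treat the version string as an assumption for this particular card:

```shell
# Tell ROCm to report the card as gfx1030 (10.3.0), which Ollama's
# ROCm backend supports -- a community workaround for the RX 6700 XT.
export HSA_OVERRIDE_GFX_VERSION=10.3.0

# Then start the server as usual (commented out here):
# ollama serve
```

Other RDNA2 cards in the same family reportedly work with the same override, but anything outside that family needs a different (or no) value.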

[–] [email protected] 3 points 4 days ago (2 children)

You don't even need a GPU, I can run Ollama with Open WebUI on my CPU with an 8B model fast af

[–] bi_tux 2 points 3 days ago (1 children)

I tried it with my CPU (with Llama 3.0 7B), but unfortunately it ran really slowly (I've got a Ryzen 5700X)

[–] tomjuggler 2 points 3 days ago

I ran it on my dual-core Celeron and... just kidding. Try the mini Llama 1B, I'm in the same boat with a Ryzen 5000-something CPU
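
Once a small model like that is pulled, you can talk to the local server from code instead of the CLI. A minimal sketch of hitting Ollama's REST API (`/api/generate` on its default port 11434) from Python, using only the standard library; the model tag `llama3.2:1b` is an example of a small 1B model, not something from the thread:

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint."""
    payload = json.dumps({
        "model": model,    # e.g. "llama3.2:1b" -- small enough to run fine on CPU
        "prompt": prompt,
        "stream": False,   # return the whole answer as a single JSON object
    }).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )

# Actually sending it requires a running `ollama serve`:
# with urllib.request.urlopen(build_request("llama3.2:1b", "Hello")) as resp:
#     print(json.loads(resp.read())["response"])
```

With `"stream": False` the response body is one JSON object whose `response` field holds the generated text; leaving streaming on instead yields one JSON object per line as tokens arrive.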
