this post was submitted on 01 Feb 2025
955 points (98.6% liked)
Political Memes
I mean, obviously you need to run a lower-parameter model locally; that's not a fault of the model, it's just that you don't have the same computational power.
In both cases I was talking about local models: the 32B-parameter deepseek-r1 vs an equivalent uncensored model from Hugging Face.
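The compute constraint above mostly comes down to memory for the weights. A rough back-of-envelope sketch (my numbers, not from the thread; real runtimes also need room for the KV cache and other overhead):

```python
# Approximate memory needed just to hold a model's weights,
# at different quantization levels. Illustrative estimate only.
def weight_gb(params_billions: float, bits_per_weight: int) -> float:
    """Gigabytes required for the raw weights."""
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

for params in (7, 32, 70):
    for bits in (16, 8, 4):
        print(f"{params}B @ {bits}-bit: {weight_gb(params, bits):.1f} GB")
```

By this estimate a 32B model at 4-bit quantization needs around 16 GB for the weights alone, which is why most consumer GPUs push people toward the smaller distills.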