Someone made a GPT-like chatbot that runs locally on Raspberry Pi, and you can too
(www.xda-developers.com)
Sorry, just curious in general: how fast are these local LLMs? Maybe someone else can give some rough info.
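If you want to measure it on your own hardware, here's a minimal sketch using llama-cpp-python with a small quantized GGUF model (the model path and settings are placeholders, not something from the article):

```python
# Rough tokens-per-second measurement for a local LLM.
# Assumes: `pip install llama-cpp-python` and a quantized GGUF model
# already downloaded to disk (the path below is a placeholder).
import time

from llama_cpp import Llama

# A small quantized model keeps memory use low enough for a Pi-class board.
llm = Llama(model_path="./models/tinyllama-q4_0.gguf", n_ctx=512, verbose=False)

prompt = "Explain what a Raspberry Pi is in one sentence."

start = time.perf_counter()
out = llm(prompt, max_tokens=64)
elapsed = time.perf_counter() - start

generated = out["usage"]["completion_tokens"]
print(out["choices"][0]["text"].strip())
print(f"{generated} tokens in {elapsed:.1f}s -> {generated / elapsed:.2f} tok/s")
```

The number you get will obviously depend a lot on the model size, quantization, and the board itself, so treat any single figure as a ballpark.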