this post was submitted on 14 Dec 2023
LocalLLaMA
Community to discuss about LLaMA, the large language model created by Meta AI.
This is intended to be a replacement for r/LocalLLaMA on Reddit.
you are viewing a single comment's thread
If you want to learn machine learning, you could play around with the classic examples that recognize single handwritten digits from the MNIST dataset, or something in that league.
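Something like this minimal PyTorch sketch (assuming torch and torchvision are installed; the network size and hyperparameters are just illustrative) is the kind of thing that trains fine on a laptop CPU in a few minutes:

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Download MNIST and convert the 28x28 grayscale images to tensors.
train_data = datasets.MNIST(root="data", train=True, download=True,
                            transform=transforms.ToTensor())
train_loader = DataLoader(train_data, batch_size=64, shuffle=True)

# A small multilayer perceptron is enough to get decent accuracy on MNIST.
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(28 * 28, 128),
    nn.ReLU(),
    nn.Linear(128, 10),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Train for a few epochs on the CPU.
for epoch in range(3):
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch + 1}: loss {loss.item():.4f}")
```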
I think training an LLM that is even somewhat useful will be way out of scope for the RAM and compute such a laptop has to offer. Maybe it could correct grammar, if you don't mind waiting a long, long time - something with the intelligence of autocomplete. But definitely nothing coherent or intelligent that can answer your questions.
You could rent a VM in the cloud. Services like runpod.io or vast.ai offer a proper GPU for around $2 an hour. There are also Amazon, Google, Azure, Lambda...
Do the cloud services see everything - the text/image data used for training and the finished trained model? If so, runpod.io etc. are not a solution.