this post was submitted on 14 Dec 2023
LocalLLaMA
Community to discuss LLaMA, the large language model created by Meta AI.
This is intended to be a replacement for r/LocalLLaMA on Reddit.
How did you determine the dataset size? I mean, if it's just a few megabytes of French books, I'm not surprised you don't get any results out of that. It also depends on how you feed the data in and which parameters you choose for training and for the model architecture. There are several scientific papers researching, for example, the dataset size needed for a given model parameter count (scaling laws).
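For a rough sense of scale, here is a minimal sketch. It assumes the ~20 training tokens per parameter rule of thumb from the Chinchilla scaling-law paper and ~4 bytes of raw text per token; both are ballpark figures for illustration, not exact requirements:

```python
# Rough rule of thumb (Chinchilla, Hoffmann et al. 2022): a compute-optimal
# model wants roughly 20 training tokens per parameter. The ~4 bytes of raw
# text per token is likewise only a rough average for Latin-script text.

def estimate_tokens(n_params: int, tokens_per_param: float = 20.0) -> int:
    """Estimate how many training tokens a model with n_params parameters wants."""
    return int(n_params * tokens_per_param)

def tokens_to_megabytes(n_tokens: int, bytes_per_token: float = 4.0) -> float:
    """Very rough corpus size in MB of raw text for a given token count."""
    return n_tokens * bytes_per_token / 1e6

for n_params in (1e6, 10e6, 100e6):
    toks = estimate_tokens(int(n_params))
    print(f"{n_params / 1e6:>5.0f}M params -> ~{toks / 1e6:.0f}M tokens "
          f"(~{tokens_to_megabytes(toks):.0f} MB of text)")
```

Even a 1M-parameter toy model comes out wanting on the order of tens of megabytes of text by this estimate, so "a few megabytes of French books" is well under what the papers suggest.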
Once you've chosen an appropriate dataset size, have a look at your loss graphs. Do they converge? Did you run training long enough? I suspect it would take weeks (or even months) on an old laptop CPU before you see any results, even at that model size.
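If it helps, here is a minimal sketch of that convergence check, assuming you log one loss value per training step to a file (the file name `train_loss.txt` is hypothetical):

```python
# Sanity check for a training-loss log: smooth the raw losses and see whether
# the tail of the curve is still trending downward (i.e. not yet converged).

def moving_average(values, window=100):
    """Simple trailing moving average over the given window."""
    out = []
    for i in range(len(values)):
        chunk = values[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def still_improving(losses, window=100, tolerance=1e-3):
    """True if the smoothed loss dropped by more than `tolerance` over the last window."""
    smoothed = moving_average(losses, window)
    if len(smoothed) < 2 * window:
        return True  # too little data to call it converged
    return smoothed[-window] - smoothed[-1] > tolerance

# Example (hypothetical log file with one loss per line):
# losses = [float(line) for line in open("train_loss.txt")]
# print("keep training" if still_improving(losses) else "loss has plateaued; inspect the curve")
```

If the smoothed curve is still clearly falling, the model simply hasn't trained long enough; if it has flattened out at a high loss, the problem is more likely the dataset size or the hyperparameters.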