Glagnar

A paper released by Meta a few days ago details a method for extending the context, or "memory", of an LLM up to 32k tokens. What is interesting is that they give a mention to https://kaiokendev.github.io/

This is a blog post written by a guy in his spare time who independently came up with the same method at around the same time; he calls it SuperHOT.
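For anyone curious, the trick both write-ups describe boils down to scaling the rotary (RoPE) position indices so a longer sequence is squeezed back into the position range the model was trained on. Here is a rough sketch of the idea in PyTorch; the function name, dimensions, and lengths below are my own illustration, not code from the paper or the blog post:

```python
import torch

def rope_angles(positions: torch.Tensor, dim: int, base: float = 10000.0,
                scale: float = 1.0) -> torch.Tensor:
    """Rotary (RoPE) angles for the given positions.

    A `scale` below 1 is the interpolation trick: position indices are
    squeezed back into the range the model saw during training
    (e.g. scale = 2048 / 8192 = 0.25 to run an 8k context on a
    model trained with 2k).
    """
    inv_freq = 1.0 / (base ** (torch.arange(0, dim, 2, dtype=torch.float32) / dim))
    # Interpolated positions: out-of-range indices map back into the trained range.
    scaled_pos = positions.to(torch.float32) * scale
    return torch.outer(scaled_pos, inv_freq)  # shape: (seq_len, dim / 2)

# Example: a model trained with a 2048-token context, extended to 8192.
trained_len, target_len = 2048, 8192
angles = rope_angles(torch.arange(target_len), dim=128,
                     scale=trained_len / target_len)
cos, sin = angles.cos(), angles.sin()  # applied to query/key pairs as usual
```

The real methods also involve a small amount of fine-tuning at the longer length, but the position scaling above is the core change.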

It's really exciting how AI/machine learning can be advanced by relatively ordinary people putting in hard work, without the resources of Microsoft etc.

[email protected] 12 points 1 year ago

I've gotten so much out of Terraria over the years, it makes me feel a bit guilty having paid less than £5 for it. Truly a great game.