this post was submitted on 08 Jul 2024
AI Companions
I'm not sure what to make of this. The invention of the transformer architecture is what led to the breakthrough of LLMs, and it's why they're as capable as they are today, especially the generative pre-trained transformer (GPT). So the headline could also be: "Tokens are a big reason today's generative AI is so awesome". That doesn't mean the approach is without shortcomings. But I wonder why we don't see token-free LLMs take the lead. Maybe 2022 is too recent for that research to have matured?

And other scientists are researching speculative decoding, too. The latest publication from Meta also included a model that can predict multiple subsequent tokens at once. Maybe that is the future.

It's certainly not easy to find a proper representation of written language. It's not very mathematical, and characters or syllables don't really map to semantics or anything else useful. English is just a product of evolution, and it's not the only language. Though that could be mitigated by including other languages when designing the tokenizer, and as far as I know, they already do that.
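To make the token point concrete, here's a quick look at what a BPE tokenizer actually does to text. This uses OpenAI's tiktoken library purely for illustration (my choice of library and encoding, not something from the article): common English words map to one or two tokens, while rarer or non-English words fragment into more pieces, and a byte-level, token-free model would instead consume every raw byte.

```python
# Minimal BPE tokenization demo using OpenAI's tiktoken library.
# pip install tiktoken
import tiktoken

# cl100k_base is the encoding used by GPT-3.5/GPT-4 era models.
enc = tiktoken.get_encoding("cl100k_base")

for text in ["hello world", "tokenization", "Tokenisierung"]:
    ids = enc.encode(text)
    pieces = [enc.decode([i]) for i in ids]
    # Token count vs. raw UTF-8 byte count: a token-free (byte-level)
    # model would have to process every one of those bytes as input.
    print(f"{text!r}: {len(ids)} tokens {pieces}, "
          f"{len(text.encode('utf-8'))} bytes")
```

The byte counts hint at the trade-off: token-free models dodge the tokenizer's quirks, but they have to process several times more input positions for the same text, which is probably part of why they haven't taken the lead yet.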
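And since I brought up speculative decoding, here's a toy sketch of the core idea, with made-up stand-in functions instead of real models: a cheap draft model guesses a few tokens ahead, the expensive target model verifies them and keeps the longest agreeing prefix, so the greedy output stays exactly the same while the big model takes fewer sequential steps.

```python
# Toy sketch of greedy speculative decoding. The "models" here are
# stand-in functions mapping a context to the next token id; in
# reality they'd be a small draft LLM and a large target LLM.

def target_step(ctx):
    # Pretend expensive model: deterministically cycles 0,1,2,3.
    return (ctx[-1] + 1) % 4

def draft_step(ctx):
    # Pretend cheap model: agrees with the target except after a 2.
    return 0 if ctx[-1] == 2 else (ctx[-1] + 1) % 4

def speculative_decode(prompt, k=4, max_new=12):
    seq = list(prompt)
    while len(seq) - len(prompt) < max_new:
        # 1) Draft model proposes k tokens autoregressively (cheap).
        ctx, proposals = list(seq), []
        for _ in range(k):
            proposals.append(draft_step(ctx))
            ctx.append(proposals[-1])
        # 2) Target model verifies the proposals (a single batched
        #    pass in a real implementation); keep the agreeing prefix.
        ctx = list(seq)
        for t in proposals:
            if target_step(ctx) != t:
                break
            ctx.append(t)
        # 3) The target always contributes one token of its own, so
        #    the output matches plain greedy decoding exactly.
        ctx.append(target_step(ctx))
        seq = ctx
    return seq[len(prompt):len(prompt) + max_new]

print(speculative_decode([0]))  # identical to plain greedy decoding
```

The speed-up lives in step 2: the target model scores all k proposals in parallel instead of generating them one at a time, and with a decent draft model most proposals get accepted.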