That happens when you use tweets as training data
AI Companions
Community to discuss companionship, whether platonic, romantic, or purely as a utility, powered by AI tools. Examples include Replika, Character AI, and ChatGPT. Talk about the software and hardware used to create the companions, or about the phenomenon of AI companionship in general.
Tags:
(including but not limited to)
- [META]: Anything posted by the mod
- [Resource]: Links to resources related to AI companionship. Prompts and tutorials are also included
- [News]: News related to AI companionship or AI companionship-related software
- [Paper]: Works that present research, findings, or results on AI companions and their tech, often including analysis, experiments, or reviews
- [Opinion Piece]: Articles that convey opinions
- [Discussion]: Discussions of AI companions, AI companionship-related software, or the phenomenon of AI companionship
- [Chatlog]: Chats between the user and their AI Companion, or even between AI Companions
- [Other]: Whatever isn't part of the above
Rules:
- Be nice and civil
- Mark NSFW posts accordingly
- Criticism of AI companionship is OK as long as you understand where people who use AI companionship are coming from
- Lastly, follow the Lemmy Code of Conduct
As a fan of cyberpunk, that's so cool.
Depressing, dystopian, a legitimate problem that was sorta inevitable. But also truly hilarious.
and even suggest inappropriate acts with the Russian leader.
And nothing of that in the article, damn it.
I think it's this part:
Asked if the bot would go so far as to perform a sex act on the despot, it replied: “blushes I’d do anything for Putin.”
Yeah, right, I've seen it. But it was very underwhelming compared to that description, which sounded like nothing less than shibari, Bad Dragon, and automatic sex machinery.
Saw that coming from the next postcode over