this post was submitted on 25 Jan 2025
74 points (93.0% liked)
Technology
The context only mattered because you were talking about the bot missing the euphemism. It doesn't matter if the bot is invested in the fantasy; that is what it's supposed to do. It's up to the user to understand it's a fantasy and not reality.
Many video games let you do violent things to innocent NPCs. These games are invested in the fantasy, as well as trying to immerse you in it. Although it's not exactly the same, it's not up to the game or the chatbot to break character.
LLMs are quickly going to be included in video games, and I would rather not have safeguards (censorship) just because a very small percentage of people with clear mental-health issues can't deal with them.
I believe even non-AI media could be held liable if it encouraged suicide. It doesn't seem like much of a leap to say a "This is for entertainment purposes only" disclaimer wouldn't excuse a long series of insults and calls to commit suicide. If two characters are talking to each other and one encourages self-harm, that's different: the encouragement is directed at another fictional character, not the viewer.
NPCs, exactly. Do bad things to this collection of pixels, not people in general. The immersion factor would also play in favor of the developer. In a game like Postal you kill innocent people but you're given a setting and a persona. "Here's your sandbox. Go nuts!" The chat system in question is meant to mimic real chatting with real people. It wasn't sending messages within a GoT MMO or whatnot.
There are lots of ways to include AI in games without it generating voice or text. Even so, that's going to be much more than a chat system. If Character AI had their act together, I bet they'd offer the same service as voice chat even. This service was making the real world the sandbox!