this post was submitted on 25 Jan 2025
74 points (93.0% liked)

Technology

[–] spankmonkey 0 points 2 days ago* (last edited 2 days ago) (2 children)

I'm not saying they need to be perfect, but if they can make it recognize specific names, they can keep it from saying 'kill yourself'.

[–] [email protected] 2 points 2 days ago

Why would you keep it from saying that when, in certain contexts, it's perfectly acceptable? I explained exactly that point in another post.

This is something of a tangent in this case, because what the AI said was very oblique: exactly the sort of thing it would be impossible to guard against. It said something like "come home to me," which would be patently ridiculous to censor, and no one could have anticipated that this phrase would provoke that reaction.

[–] BreadstickNinja 2 points 2 days ago* (last edited 2 days ago)

It likely is hard-coded against that, and it also didn't say that in this case.

Did you read the article with the conversation? The teen said he wanted to "come home" to Daenerys Targaryen and she (the AI) replied "please do, my sweet king."

It's setting an absurdly high bar to expect an AI to understand euphemism and subtext as potential indicators of self-harm. That's the job of a psychiatrist, a real-world person the kid's parents should have taken him to.