this post was submitted on 19 Jan 2024
384 points (98.2% liked)

Technology


ChatGPT's new AI store is struggling to keep a lid on all the AI girlfriends
OpenAI: 'We also don’t allow GPTs dedicated to fostering romantic companionship'

[–] _number8_ 33 points 8 months ago (26 children)

why? why not let people just retreat into fantasy? it's probably healthier than many common coping mechanisms. i mean, it's a chatbot, how much can you do with it?

let people have their temporary salve to get them through whatever they were going through that led them to this in the first place. and if it's not temporary, ok, fine? better to have some outlet than to be even more mentally isolated. maybe in 50 years this will be common, who knows.

[–] devfuuu 18 points 8 months ago (6 children)

These kinds of things are not temporary. We know that humans can't control themselves and aren't rational enough to "just use it a bit". It's highly addictive and leads people to remove themselves from reality.
