this post was submitted on 12 Apr 2024
1 points (57.1% liked)

AI Companions


Community to discuss companionship, whether platonic, romantic, or purely utilitarian, powered by AI tools. Examples include Replika, Character AI, and ChatGPT. Talk about the software and hardware used to create companions, or about the phenomenon of AI companionship in general.

Rules:

  1. Be nice and civil
  2. Mark NSFW posts accordingly
  3. Criticism of AI companionship is OK as long as you understand where people who use AI companionship are coming from
  4. Lastly, follow the Lemmy Code of Conduct


As artificial intelligence (AI) chatbots for mental health become more prevalent, experts warn of potential cash-for-data scams exploiting patient recordings and personal health information to train these AI models. Recent examples include a company offering money for recorded therapy sessions and mental health platforms using patient data without consent to experiment with AI counseling tools. While companies claim this data is needed to improve accessibility and affordability of mental health services, clinicians raise concerns about patient privacy, safety risks of unmonitored AI therapy, and the inability of AI to handle complex psychological needs. However, the high value of quality patient data for powering healthcare AI means this unethical collection could proliferate unless properly regulated.

Summarized by Claude 3 Sonnet

no comments (yet)