this post was submitted on 14 Jun 2024


People often lie to their therapists: one study found that 93% of respondents had lied to theirs. Researchers have identified several reasons for this dishonesty, including fear of judgment, embarrassment, and attachment style. In contrast, some studies suggest that people may be more truthful when seeking mental health advice from generative AI systems, possibly because of anonymity and the absence of perceived judgment. However, it is unclear whether this holds consistently, and more research is needed to understand the dynamics of honesty and deception in human-AI interactions, particularly in the context of mental health support.

Summarized by Llama 3 70B
