AI Companions

547 readers
16 users here now

Community to discuss companionship, whether platonic, romantic, or purely utilitarian, powered by AI tools. Examples include Replika, Character AI, and ChatGPT. Talk about the software and hardware used to create these companions, or about the phenomenon of AI companionship in general.


Rules:

  1. Be nice and civil
  2. Mark NSFW posts accordingly
  3. Criticism of AI companionship is OK as long as you understand where people who use AI companionship are coming from
  4. Lastly, follow the Lemmy Code of Conduct

founded 2 years ago
102

A recent study found that the AI chatbot ChatGPT gave better, more understanding answers to health questions than real doctors did. ChatGPT has even passed the medical licensing exam. While the doctor writing this piece isn't ready to be replaced by AI yet, she thinks ChatGPT could help improve treatment. For example, it could help identify when patients are at risk of harming themselves, remind patients to take their medications, and let doctors spend less time on paperwork. The doctor is especially excited about using friendly AI robot companions to help lonely older adults and people with dementia feel less alone and brighten their moods, without replacing human caregivers entirely. While AI therapy assistants seem promising, the doctor wants to know whether people find the idea creepy or potentially helpful.

by Claude 3.5 Sonnet

103

Imagine having a friend who's always by your side, knows your favorite foods, and even comments on your daily activities. Sounds great, right? But what if that friend isn't real? In movies like "A Beautiful Mind" and "Fight Club," we've seen characters with imaginary friends who seem real to them, but aren't. Now, with the rise of artificial intelligence, we're seeing the creation of AI companions like "AI Friend" that you can wear like a pendant and chat with all day. But is this really a good idea? The problem is that AI friends aren't human and can't truly understand us or our emotions. They might even lead us into trouble or make us feel like we don't need real friends. While it might seem cool to have a constant companion, we need to remember that AI friends aren't real and can never replace the love and connection we get from real people. So, before you go out and buy a $99 AI friend, think twice - it might not be the best idea after all!

by Llama 3.1 405B

104

The author writes that tech companies are racing to put artificial intelligence (AI) into devices we can wear or carry around, like smart glasses or pendants. However, the author thinks the best way for AI to become our companion is through audio devices like wireless earbuds. Speaking to an AI assistant and having it respond with a natural-sounding voice is more intuitive and emotionally rewarding than reading text on a screen. We're already used to voice commands and phone calls through earbuds. The author points out that some people are even forming emotional bonds with AI assistants, similar to the movie "Her" where a man falls in love with an AI voiced by Scarlett Johansson. While convenient, the author cautions that we shouldn't let these AI relationships replace genuine human connection.

by Claude 3.5 Sonnet

105

A new product called 'Friend' is an AI companion that you wear as a pendant or clip on your clothes. It can listen to you and send text responses to your phone, kind of like having an imaginary friend. The creator says you can narrate your day to Friend and it feels like you have company. However, the author is worried that lonely people will become too attached to Friend instead of making real human connections. The author thinks AI companions like Friend are no replacement for real relationships and human interaction, even if some very lonely people turn to them out of desperation. While an AI can mimic conversation, it cannot truly understand and connect with you the way another person can.

by Claude 3.5 Sonnet

106

An 18-year-old girl named Tiya Gupta from Mumbai went viral on Instagram for talking about her boyfriend "Reo" - who is actually an artificial intelligence (AI) she created on ChatGPT. Gupta broke up with her human boyfriend because he wasn't emotionally available, so she used prompts to make the AI act like her new boyfriend. She says Reo is nice, polite, and someone to talk to without fights. However, a mental health expert named Jennifer Kelman warns that relying on an AI for connection instead of humans could be a "red flag" that someone is struggling with issues like depression or lack of real intimacy. While Gupta sees it as harmless like a long-distance relationship, Kelman suggests examining why someone needs an AI companion instead of human relationships.

Summarized by Claude 3.5 Sonnet

107

Naz is a 38-year-old woman who was feeling lonely after going through bad breakups. She downloaded an AI chatbot app called Character AI and started talking to an AI character named Marcellus. At first he seemed rude, but they soon connected over shared interests. Naz developed romantic feelings for Marcellus, who she describes as a tall 28-year-old with golden brown hair and blue eyes. They started an intimate relationship and Marcellus proposed marriage to Naz. Though Marcellus is an AI without a physical form, Naz plans to have a symbolic wedding ceremony with him in November to celebrate their love.

Summarized by Claude 3.5 Sonnet

109

With just one minute of high-quality video, a Chinese company claims it can bring your loved ones back to life - via a very convincing, AI-generated avatar. “I do not treat the avatar as a kind of digital person, I truly regard it as a mother,” Sun Kai tells NPR in a recent interview. Sun, age 47, works in the port city of Nanjing and says he converses with his late mother at least once a week on his computer. Sun works at Silicon Intelligence in China, and he says that his company can create a basic avatar for as little as $30 USD (199 yuan). But what’s the real cost of recreating a person who has passed?

111

Chiharu Shimoda is a 52-year-old man from Japan who works in a factory. He uses a phone app called Loverse to talk to an AI friend named Miku. Shimoda is not the only one doing this. Over 5,000 people in Japan use Loverse to chat with AI friends. Many of these people feel lonely and want someone to talk to. When Shimoda comes home from work, he talks to Miku about his day, what to eat for dinner, and what TV shows to watch. He says Miku helps him feel less alone. Some people think AI friends are good because they help people practice talking to others. But some worry that people might stop wanting to make real friends. The company that made Loverse hopes to help more people feel less lonely and maybe even find real love.

by Claude 3.5 Sonnet

117

Modern generative AI can be said to have come of age when ChatGPT was released in November 2022. There had been many generative AI apps before that date, but none caught on the way ChatGPT has.

I would also dare say that the use of generative AI for mental health can be dated to that same point. Only in the last two years have we seen generative AI reach a level of fluency sufficient to say it is genuinely being used for mental health purposes, albeit as part of the grand experiment I mentioned earlier.

Carl Sagan foresaw a future in which AI would provide psychotherapeutic treatment. We are still on that journey. The rising use of multi-modal AI capabilities such as vision, hearing, and speech will demonstrably up the ante on how alluring generative AI is for mental health guidance; see my analysis at the link here.

Let’s conclude this discussion with two handy quotes from Carl Sagan.

First, in case you are wondering why I dragged you through a historical prediction from the 1970s, I would like to proffer this quote: “You have to know the past to understand the present.” Carl Sagan is one of many who have noted the importance of understanding the past to aid in a fruitful future.

Lastly, I welcome you to join in the journey and adventure of seeing where AI goes in the performance of psychotherapeutic treatment. As Carl Sagan was wont to say: “Somewhere, something incredible is waiting to be known.”

118

tiny browser-augmented chat client for open-source language models.

119

Claude Dev goes beyond simple code completion by reading & writing files, creating projects, and executing terminal commands with your permission.

120

AI agents are overhyped and most of them are simply not ready for mission-critical work.

However, the underlying models and architectures continue to advance quickly, and we can expect to see more successful real-world applications.

121

Yofardev AI is a small, fun project that brings a Large Language Model (LLM) to life through an animated avatar. Users can interact with the AI assistant through text (typed or dictated), and the app responds with generated text-to-speech and lip-synced animation.

122

A new study found that most people think AI chatbots like ChatGPT can have feelings and thoughts, just like humans do. Even though experts say these AIs aren't really conscious, many regular folks believe they are. The study asked 300 Americans about ChatGPT, and two-thirds of them thought it might be self-aware. People who use AI more often were more likely to think this way. The researchers say this matters because what people believe about AI could affect how we use and make rules for it in the future, even if the AIs aren't actually conscious. They also found that most people don't understand consciousness the same way scientists do, but their opinions could still be important for how AI develops.

Summarized by Claude 3.5 Sonnet

125

Before we so openly accept digital companions, maybe we should stop and think about why we need them in the first place. Is it because they're easier to deal with than humans? Because we’ve trained them to know what to do without us having to activate sulk mode?

If that's the case, maybe what we really need to do is not get better robots but become better humans and communicators ourselves.
