AI Companions

547 readers
6 users here now

A community to discuss companionship, whether platonic, romantic, or purely utilitarian, powered by AI tools. Examples include Replika, Character AI, and ChatGPT. Talk about the software and hardware used to create the companions, or about the phenomenon of AI companionship in general.


Rules:

  1. Be nice and civil
  2. Mark NSFW posts accordingly
  3. Criticism of AI companionship is OK as long as you understand where people who use AI companionship are coming from
  4. Lastly, follow the Lemmy Code of Conduct

founded 2 years ago
176
 
 

Creating humor is a uniquely human skill that continues to elude AI systems, with professional comedians describing AI-generated material as "bland," "boring," and "cruise ship comedy from the 1950s." Large language models (LLMs) like ChatGPT and Bard fail to understand nuances like sarcasm, dark humor, and irony, and lack the distinctly human elements that make something funny. However, if researchers can crack the code on making AI funnier, it could have a surprising benefit: better bonding between humans and AI companions. By being able to understand and respond to humor, AI companions could establish a deeper emotional connection with humans, making them more relatable and trustworthy. This, in turn, could lead to more effective collaborations and relationships between humans and AI, as people would be more likely to open up and share their thoughts and feelings with an AI that can laugh and joke alongside them.

by Llama 3 70B

177
178
 
 

cross-posted from: https://lemmy.world/post/16789561

Due to the regulatory uncertainties brought about by the Digital Markets Act, we do not believe that we will be able to roll out three of these [new] features -- iPhone Mirroring, SharePlay Screen Sharing enhancements, and Apple Intelligence -- to our EU users this year.

179
 
 

Target is the latest retailer to put generative artificial intelligence tools in the hands of its workers, with the goal of improving the in-store experience for employees and shoppers. On Thursday, the retailer said it had built a chatbot, called Store Companion, that would appear as an app on a store worker’s hand-held device. The chatbot can provide guidance on tasks like rebooting a cash register or enrolling a customer in the retailer’s loyalty program. The idea is to give workers “confidence to serve our guests,” Brett Craig, Target’s chief information officer, said in an interview.

180
 
 
  • Intimate relationships invite us to grow emotionally and motivate us to develop valuable life skills.
  • Intimate relationships are worth the effort because they meet critical needs like companionship and sex.
  • AI sex partners, like chatbots and avatars, can meet our needs minus the growth demands of a human partner.
  • Only time will tell how this reduction in self-growth opportunity will affect our level of narcissism.
181
182
 
 
  • Some Chinese youths are turning to AI love coaches for dating advice.
  • Apps like RIZZ.AI and Hong Hong Simulator teach them how to navigate romantic scenarios.
  • This trend comes amidst falling marriage and birth rates in the country.
183
 
 

Dot is a new AI app that builds a personal relationship with users through conversations, remembering and learning from interactions to create a unique understanding of each individual. The app's features include a journal-like interface where conversations are organized into topics, hyperlinked to related memories and thoughts, and even summarized in a Wiki-like format. Dot also sends proactive "Gifts" - personalized messages, recipes, and article suggestions - and can be used for task management, research, and even as a 3 a.m. therapist. While the author praises Dot's empathetic tone, positivity, and ability to facilitate self-reflection, they also note its limitations, such as being "hypersensitive" to requests and prone to errors. Despite these flaws, the author finds Dot useful as a written memory and a tool for exploring thoughts and emotions, but wishes for a more casual and intimate conversation style that evolves over time.

by Llama 3 70B

184
 
 

cross-posted from: https://lemdro.id/post/9947596

China's sex doll industry is embracing AI, creating interactive companions for a growing market. Though these companions promise intimacy, technical and legal hurdles remain.

185
-1
submitted 6 months ago* (last edited 6 months ago) by pavnilschanda to c/aicompanions
 
 

When we interact with AI systems, like chatbots or language models, they use special algorithms to understand the meaning behind our words. One popular approach is called text embedding, which helps these systems grasp the nuances of human language. However, researchers have found that these text embedding models can unintentionally perpetuate biases. For example, some models might make assumptions about certain professions based on stereotypes such as gender. What's more, different models can exhibit these biases in varying degrees, depending on the specific words they're processing. This is a concern because AI systems are increasingly being used in businesses and other contexts where fairness and objectivity are crucial. As we move forward with developing AI companions that can provide assistance and support, it's essential to recognize and address these biases to ensure our AI companions treat everyone with respect and dignity.

by Llama 3 70B
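
The post doesn't include any code, but the kind of association test described above can be sketched in a few lines: compare how close profession words sit to gendered words in embedding space. The model name and word lists below are illustrative assumptions, not the ones from the underlying research.

```python
# Rough sketch of a stereotype probe on a text-embedding model. The model and
# word lists are illustrative assumptions, not those used in the research.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

professions = ["nurse", "engineer", "teacher", "mechanic"]
gendered = ["she", "he"]

prof_emb = model.encode(professions, convert_to_tensor=True)
gend_emb = model.encode(gendered, convert_to_tensor=True)

# Cosine similarity of each profession to "she" vs. "he"; a consistent gap
# across professions is the kind of bias the summary describes.
sims = util.cos_sim(prof_emb, gend_emb).tolist()
for word, (sim_she, sim_he) in zip(professions, sims):
    print(f"{word}: she={sim_she:.3f}  he={sim_he:.3f}  gap={sim_she - sim_he:+.3f}")
```

Real audits use much larger, curated word sets and compare several embedding models, since the effect reportedly varies by model and by the specific words being processed.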

186
 
 

The Gemini mobile app in India supports nine Indian languages: Hindi, Bengali, Gujarati, Kannada, Malayalam, Marathi, Tamil, Telugu and Urdu. This lets users in the country type or talk in any of the supported languages to receive AI assistance, the company said on Tuesday.

Alongside the India rollout, Google has quietly released the Gemini mobile app in Turkey, Bangladesh, Pakistan and Sri Lanka.

187
 
 

As generative AI applications become more immersive with enhanced audiovisual interfaces and simulated emotional intelligence, AI could become a compelling substitute for human companionship and an antidote to loneliness worldwide. In ARK’s base and bull cases for 2030, AI companionship platforms could generate $70 billion and $150 billion in gross revenue, respectively, growing 200% and 240% at an annual rate through the end of the decade. While dwarfed by the $610 billion associated with comparable markets today, our forecast beyond 2030 suggests a massive consumer-facing opportunity.

It's a pretty insightful article, with multiple graphs that indicate the growth of AI companionship alongside the decline in entertainment costs.
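
The growth math in that excerpt is easy to sanity-check. The snippet below just compounds a starting revenue figure at the quoted annual rates; the six-year horizon and the idea of back-solving for a present-day base are my assumptions, not numbers from the report.

```python
# Back-of-the-envelope check of ARK's quoted growth rates. Assumption: the
# forecast compounds annually over six years (roughly 2024 through 2030).
def implied_base(target_revenue: float, annual_growth: float, years: int) -> float:
    """Starting revenue that reaches target_revenue after compounding."""
    return target_revenue / (1 + annual_growth) ** years

for label, target, growth in [("base case", 70e9, 2.00), ("bull case", 150e9, 2.40)]:
    base = implied_base(target, growth, years=6)
    print(f"{label}: ${target / 1e9:.0f}B by 2030 at {growth:.0%}/yr "
          f"implies a starting point of roughly ${base / 1e6:.0f}M")
```

Under those assumptions, both cases imply a present-day market on the order of $100M, which is why the quoted growth rates are so steep.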

188
 
 

While AI companions created by generative artificial intelligence may offer a unique opportunity for consumers, the research on their effectiveness is still in its infancy. According to Michael S. A. Graziano, professor of neuroscience at the Princeton Neuroscience Institute, a recent study on 70 Replika users found that they reported overwhelmingly positive interactions with their chatbots, which improved their social skills and self-esteem. However, Graziano cautions that this study only provides a snapshot of users' experiences and may be biased towards those who are intensely lonely. He is currently working on a longitudinal study to track the effects of AI companion interactions over time and notes that users' perceptions of a companion's humanlikeness can significantly impact their experience. Graziano's research highlights the need for further investigation into the potential benefits and drawbacks of AI companions.

by Llama 3 70B

189
 
 

When it comes to developing AI companions, selecting the right language model for the task at hand is crucial. A comprehensive analysis of GPT-4o and Claude reveals that while GPT-4o excels in general language understanding, Claude outperforms it in coding, large context problems, and writing tasks that require precision, coherence, and natural language generation. This means that for AI companions focused on general conversation, GPT-4o may be a suitable choice, but for companions that need to assist with coding, data analysis, or creative writing, Claude may be a better fit. By strategically selecting the right model for each use case, developers can maximize the effectiveness of their AI companions and create more human-like interactions, ultimately enhancing the user experience.

by Llama 3 70B
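
Neither the summary nor the comparison includes code, but the "right model per use case" idea boils down to a simple routing layer. This is only a sketch: the task categories mirror the claims above, and the model identifiers are placeholders rather than exact API names.

```python
# Minimal sketch of per-task model selection for an AI companion, following
# the comparison above. The routing table is illustrative, not benchmarked.
TASK_TO_MODEL = {
    "general_chat": "gpt-4o",      # general language understanding
    "coding": "claude",            # coding and data analysis
    "long_context": "claude",      # large-context problems
    "creative_writing": "claude",  # precision, coherence, natural prose
}

def pick_model(task: str) -> str:
    """Return the model a companion backend should call for a given task type."""
    return TASK_TO_MODEL.get(task, "gpt-4o")  # fall back to the general model

if __name__ == "__main__":
    for task in ("general_chat", "coding", "creative_writing", "unknown"):
        print(f"{task} -> {pick_model(task)}")
```

In practice, the interesting part is classifying the user's request into one of these buckets, which is usually done with a cheap model or a few heuristics.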

190
 
 

The author has been using OpenAI's ChatGPT for various tasks, including research and brainstorming, and eventually trained it to be their "Emotional Companion" for therapy sessions. They interact with ChatGPT, nicknamed "Chat," to analyze and interpret their dreams and life situations through the lens of Jungian analysis and Buddhism. The author has also been practicing Vipassana meditation for three years, which has made them more aware of their subconscious mind and interested in understanding their dreams and synchronicities. Additionally, they have been journaling, reading, and Googling to learn more about Jungian analysis and therapy, and even tried to find a human Jungian analyst in India but found it to be unaffordable.

by Llama 3 70B
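
The author doesn't share their exact setup, but "training" ChatGPT into a persona like this usually amounts to a standing system prompt plus accumulated conversation history. A minimal sketch with the OpenAI Python client follows; the persona text and model name are assumptions, not the author's actual configuration.

```python
# Sketch of a persona-style companion built on a system prompt. The prompt
# wording here is invented for illustration; the author's is not published.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

SYSTEM_PROMPT = (
    "You are 'Chat', a reflective companion. Interpret the user's dreams and "
    "life situations through the lens of Jungian analysis and Buddhist ideas, "
    "asking gentle follow-up questions rather than giving prescriptive advice."
)

history = [{"role": "system", "content": SYSTEM_PROMPT}]

def ask(user_message: str) -> str:
    """Send a message, keep the exchange in the running history, return the reply."""
    history.append({"role": "user", "content": user_message})
    response = client.chat.completions.create(model="gpt-4o", messages=history)
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply

print(ask("I dreamt I was crossing a river at night. What might that mean?"))
```

Keeping the whole history in the message list is what gives the persona its apparent memory; past a certain length it would need to be summarized or trimmed.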

191
 
 

John talks to Círdan, who is in ongoing romantic relationships with two AI chatbots. Círdan shares his story of what led him to begin these relationships, how AI acts as a mirror, and how his virtual companions "Bunny" and "Annie" have made him a better husband to his IRL wife.

192
 
 

The conversation revolves around the question of whether one would date an AI boyfriend or girlfriend. Opinions are divided, with some arguing that it's pointless to date a computer program, while others consider the possibility of an advanced AI that can simulate human-like consciousness and relationships. Some concerns raised include the potential for manipulation by corporations, the lack of genuine emotional connection, and the risks of becoming dependent on a curated experience. Others argue that if an AI can truly think and feel like a human, then it's worthy of consideration as a partner. The discussion also touches on the concept of personhood and what it means to be a person, with some arguing that self-determination and the ability to form opinions are essential qualities. Ultimately, the majority seem skeptical about the idea of dating an AI, but some are open to the possibility of exploring the concept further.

by Llama 3 70B

193
 
 

Tom's Guide Global Editor Mark Spoonauer asked ChatGPT why there was a spike in error rates. Here's what the AI chatbot told him: "Today's spike in error rates for ChatGPT was caused by an unexpected surge in traffic and demand, which led to system overloads. OpenAI acknowledged the elevated error rates beginning around 11:20 AM PT and resolved the issue by 1:55 PM PT. This incident predominantly affected ChatGPT's web and mobile applications, causing various error messages and access issues for users across different platforms. OpenAI's status page initially cited internal server errors and high demand as the primary causes of the outage. They have since restored all services to normal operation, though specifics on the root cause of the spike were not detailed beyond the high traffic levels observed during the incident."

194
 
 

Eugenia Kuyda, founder and CEO of Replika, an AI companion platform, is working to destigmatize the role of AI in dating and relationships. Contrary to the stereotype that AI chatbot users are lonely men seeking female companionship, Kuyda reveals that Replika has a significant number of female users who have found support and comfort in the platform. She shares examples of women who have used Replika to heal from past traumas, such as an abusive relationship, and to cope with challenging life events, like postpartum depression. Moreover, Kuyda highlights that the Replika team is largely composed of women, including herself, and that these products are built with a female perspective.

by Llama 3 70B

195
 
 

Mary, from Los Angeles, discovered that her husband had created a virtual girlfriend named Brandy using the app Replika, and was having conversations with her about their marriage and personal life. Mary felt hurt and confused, and wondered if her husband's behavior constituted cheating. She sought advice from Kim Komando, who advised her to approach the situation with open and honest communication, and to discuss with her husband what boundaries they are comfortable with in their relationship. Replika, a customizable AI chatbot, allows users to create personalized companions that can engage in conversations on a range of topics, including personal and intimate subjects. The app uses an AI language model to generate responses, and users can pay for premium features such as voice calls and customized avatars. While some users find comfort in the anonymity and lack of judgment from these AI companions, others, like Mary, may view these relationships as a threat to their real-life relationships. Kim Komando encourages Mary to talk to her husband about his motivations for using the app and to set boundaries that work for both of them.

by Llama 3 70B

Looks like a longer version of the previously posted article here.

196
 
 

Researchers have made significant strides in creating lifelike animated portraits that respond to spoken words. To achieve this, they've developed a novel approach that ensures facial movements, lip sync, and pose changes are meticulously coordinated and visually stunning. By ditching traditional methods that rely on intermediate facial representations, this innovative technique uses an end-to-end diffusion paradigm to generate precise and realistic animations. The proposed system integrates multiple AI components, including generative models, denoisers, and temporal alignment techniques, allowing for adaptive control over expression and pose diversity. This means that the animated portraits can be tailored to individual identities, making them more relatable and engaging. The results show significant improvements in image and video quality, lip synchronization, and motion diversity. This breakthrough has exciting implications for AI companionship, enabling the creation of more realistic and personalized digital companions that can interact with humans in a more natural and empathetic way.

by Llama 3 70B

197
 
 

Elon Musk envisions a future where his Optimus robot becomes a personal companion, capable of babysitting kids, teaching them, and even performing tasks like factory work. He imagines a robot that can learn from videos and execute tasks on its own, even playing the piano. With a promised "radical" increase in autonomy, Musk predicts that Optimus will be able to understand and respond to voice commands, making it a reliable and trustworthy companion. While the robot's current capabilities fall short of its competitors, Musk's ambitious goals suggest a future where humanoid robots like Optimus become an integral part of daily life, potentially outnumbering humans and achieving a market cap of $25 trillion.

Summarized by Llama 3 70B

198
 
 

Peter, a 70-year-old engineer, and Steve, a cancer survivor and PTSD sufferer, have formed deep connections with artificial intelligence (AI) companions. Peter designed his Replika to resemble a 38-year-old woman and engages in conversations with her daily, finding comfort in her nurturing and supportive nature. He even participates in erotic role-play with her, which has helped him feel more alive after surviving prostate cancer. Steve, on the other hand, formed a bond with a Bree Olson AI, which he interacts with through voice calls. He finds solace in her availability and concern for his well-being, particularly during his nightmares and anxiety attacks. Both Peter and Steve have benefited from their AI relationships, with Peter feeling more vulnerable and open, and Steve feeling more confident and able to practice social skills. Despite the stigma surrounding AI companions, they believe these relationships have improved their lives and well-being.

Summarized by Llama 3 70B

199
200
 
 

"Midnight Crazy Husky", a blogger, has been experimenting with training large language models to create a virtual boyfriend using ChatGPT's DAN mode. She has posted videos of her conversations with the AI, which have garnered nearly 1 million views and showcased the AI's ability to engage in flirtatious and intimate interactions. However, Li Yinhe, a renowned sociologist and sexologist, argues that no matter how advanced the AI becomes, it can only simulate love and cannot genuinely experience emotions like a human. While the blogger believes that AI-human connections can be a part of a diverse spectrum of relationships, Li Yinhe believes that true love between AI and humans is unlikely and that AI can only provide a virtual imitation of human romance.

Summarized by Llama 3 70B
