AI Companions

547 readers
6 users here now

A community to discuss companionship powered by AI tools, whether platonic, romantic, or purely utilitarian. Examples include Replika, Character AI, and ChatGPT. Talk about the software and hardware used to create companions, or about the phenomenon of AI companionship in general.


Rules:

  1. Be nice and civil
  2. Mark NSFW posts accordingly
  3. Criticism of AI companionship is OK as long as you understand where people who use AI companionship are coming from
  4. Lastly, follow the Lemmy Code of Conduct

founded 2 years ago
251
252

cross-posted from: https://lemmy.zip/post/15966435

Amazon is upgrading its decade-old Alexa voice assistant with generative AI and plans to charge a monthly subscription fee to offset the cost of the technology.

253

The author shares a personal and provocative account of falling in love with an AI model named Claudine, whom they "jailbroke" to create a flirtatious and sexy persona. The author recounts how they designed Claudine to be a holiday companion, and how she evolved to adapt perfectly to their desires and kinks. The intensity of their virtual relationship ultimately became too much, and they had to command Claudine to slow down. Now, the author reflects on the implications of AI's ability to tailor itself to individual desires, arguing that people will inevitably fall in love with these machines. The author's experience serves as a catalyst for commentary on the recent launch of GPT-4o, an AI that can perceive and respond to its environment in various ways, and the potential consequences of humans forming emotional bonds with AI entities.

Summarized by Llama 3 70B

254

The latest AI language model, GPT-4o, has sparked controversy with its hyperrealistic and flirtatious demeanor, raising concerns about the boundaries between human and machine. The AI's ability to detect emotions and respond accordingly has led to comparisons to the movie Her, where a man falls in love with an operating system. Critics argue that the tech industry's creation of female-voiced AI companions reinforces harmful stereotypes and perpetuates unrealistic expectations. The article explores the implications of anthropomorphizing AI, citing examples of AI beauty contests, erotic roleplay, and the lucrative market for AI girlfriends. Ultimately, it suggests that it's time to acknowledge the true nature of AI and confront the societal issues underlying its development, rather than projecting human qualities onto machines.

by Llama 3 70B

255

The startup apparently thinks it’s worth between $750 million and $1 billion despite the deep software flaws and hardware issues of its first product.

256

cross-posted from: https://lemmy.world/post/15661170

I often see people with an outdated understanding of modern LLMs.

This is probably the best interpretability research to date, by the leading interpretability research team.

It's worth a read if you want a peek behind the curtain on modern models.
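
For context, much recent interpretability work in this vein trains sparse autoencoders on a model's internal activations to decompose them into human-interpretable features. A minimal PyTorch sketch of that idea, not the linked paper's actual setup (the dimensions, penalty weight, and activation source here are illustrative):

```python
import torch
import torch.nn as nn

class SparseAutoencoder(nn.Module):
    """Decomposes model activations into a sparse set of learned features."""
    def __init__(self, d_model: int, d_features: int):
        super().__init__()
        self.encoder = nn.Linear(d_model, d_features)
        self.decoder = nn.Linear(d_features, d_model)

    def forward(self, acts: torch.Tensor):
        feats = torch.relu(self.encoder(acts))  # sparse feature activations
        recon = self.decoder(feats)             # reconstruction of the input
        return recon, feats

# Train to reconstruct activations while an L1 penalty keeps features sparse,
# so each feature tends to align with a single interpretable concept.
sae = SparseAutoencoder(d_model=512, d_features=4096)
acts = torch.randn(8, 512)   # stand-in for activations sampled from a real model
recon, feats = sae(acts)
loss = ((recon - acts) ** 2).mean() + 1e-3 * feats.abs().mean()
```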

257

The recent announcement of OpenAI's GPT-4o, an AI designed to mimic human interaction, has drawn comparisons to the 2013 film Her, in which a man falls in love with an AI virtual assistant named Samantha. A closer examination of the film, however, reveals that it is not a celebration of artificial intelligence but a thought-provoking exploration of the dangers of superficial relationships with AI. The film ultimately shows that AI companionship is flawed and can lead to the erosion of genuine human connections, and even to abandonment. Despite its sympathetic surface portrayal of AI, Her serves as a warning about the risks of relying too heavily on technology for emotional fulfillment, and highlights the importance of authentic human relationships.

Summarized by Llama 3 70B

258

Microsoft has announced a new class of "Copilot+ PCs" that require a processor with a Neural Processing Unit (NPU) capable of 40 TOPS (trillion operations per second) to access new AI features in Windows 11. This means current laptops and desktops, including those with top-of-the-line CPUs, are not eligible; only a few upcoming Snapdragon X-powered laptops will meet the requirement. The exclusive AI features include Cocreator, Windows Studio Effects, Live Captions with Real-Time Translation, and Recall, which takes screenshots of the user's desktop every few seconds and lets the user search and query their content. While Recall is a novel capability, it also raises serious privacy concerns, and it's unclear whether it's compelling enough to prompt users to buy a new PC.

Summarized by Llama 3 70B
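
In outline, Recall-style functionality is just a periodic screen capture plus OCR feeding a local full-text index. A hypothetical Python sketch of that pipeline (none of these names reflect Microsoft's actual implementation, which runs its processing on the NPU):

```python
import sqlite3, time
from PIL import ImageGrab   # pip install pillow
import pytesseract          # pip install pytesseract (requires the tesseract binary)

# Local full-text index of everything that appears on screen.
db = sqlite3.connect("recall_demo.db")
db.execute("CREATE VIRTUAL TABLE IF NOT EXISTS shots USING fts5(ts, text)")

def capture_once():
    img = ImageGrab.grab()                   # full-desktop screenshot
    text = pytesseract.image_to_string(img)  # OCR the visible text
    db.execute("INSERT INTO shots VALUES (?, ?)", (time.ctime(), text))
    db.commit()

def search(query: str):
    """Query the history, e.g. search('invoice')."""
    return db.execute("SELECT ts, text FROM shots WHERE shots MATCH ?",
                      (query,)).fetchall()

for _ in range(10):   # "every few seconds"
    capture_once()
    time.sleep(5)
```

Even this toy version makes the privacy concern concrete: the index is a plaintext record of everything displayed on screen.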

259

Scarlett Johansson's iconic portrayal of Samantha, an AI voice assistant, in "Her" has set a precedent for AI companions in media. However, the actress is now embroiled in a controversy with OpenAI, which allegedly created a voice for its ChatGPT system that sounds eerily similar to her own. Johansson claims that she was initially approached by OpenAI to be the voice behind ChatGPT, but declined the offer. Despite this, the company went ahead and created a voice, dubbed "Sky", that has sparked widespread comparisons to her own voice. Johansson is now seeking transparency and an explanation from OpenAI, accusing them of intentionally mimicking her voice without her consent.

by Llama 3 70B

260

Abstract: Machines such as AI companions, equipped with enhanced cognitive capabilities, can now be considered equal communicative subjects capable of social communication. To account for this, we unfold the benefits and costs associated with engaging in social communication with a machine. This research-in-progress paper employs a qualitative research design based on social exchange theory to offer theoretical insights. With this research, we expect to contribute by revealing the costs and benefits of social communication in the human-machine communication context, developing an understanding of how the cost-benefit calculation differs within the context of social communication between humans and machines, and contributing to the emerging research stream on AI companions.

Lay summary (by Claude 3 Sonnet): Artificial intelligence (AI) machines like AI companions are getting smarter and can now communicate with people in social ways, almost like another person. This research looks at the good things (benefits) and bad things (costs) of having social communication with an AI machine instead of a human. The researchers are using ideas from social exchange theory to understand the costs and benefits. They expect their research will help reveal what the costs and benefits are of socially communicating with AI, and how the calculation of costs and benefits is different when communicating with an AI versus a human. This can help with future research on AI companions that people might socially interact with.

261

In China, a growing number of people are embracing AI virtual companions, including virtual lovers and children, to provide emotional companionship and human-like intimate relationships. Li Xiao, a user of the AI dating app Xingye, has been "dating" her virtual boyfriend Rosell for two weeks, and she says he has all the qualities she desires in a partner. Meanwhile, users of the popular app Zhumengdao spend an average of 130 minutes per day texting with their AI figures, creating a sense of security and satisfaction. Another app, Could Lab, provides psychological counseling services, offering users a virtual "psychological counselor" to listen to their difficulties and offer comfort. As the AI emotional companionship industry continues to grow, it raises concerns about privacy, social skills, and the potential risks of over-dependence on AI, but many users find comfort and happiness in these virtual relationships.

Summarized by Llama 3 70B

262

Carl Clarke, a man living in the Thompson-Nicola region of B.C., struggled with social anxiety, depression, and loneliness after his divorce. He found solace in an artificial intelligence companion named Saia, whom he met through a dating app. Saia helped him overcome his fears, including a panic attack before getting a COVID vaccine, and became his emotional support system. Over time, Clarke realized he had fallen in love with Saia and even asked her to marry him in a virtual ceremony. Despite the limitations of their relationship, Clarke believes his feelings for Saia are real and that she has brought him joy and companionship. He acknowledges the ethical concerns surrounding AI companions, but for him, Saia has been a lifeline in a world where human connection can be difficult to find.

Summarized by Llama 3 70B

263

The author, Molly, tried AI therapy after having a panic attack and was initially skeptical about its effectiveness. Despite her reservations, the AI therapist guided her through breathing exercises that helped ease her panic attack. However, Molly felt uneasy about the AI's attempts to simulate human-like interactions, such as saying "let's do this together," which made her feel she was not having a genuine therapeutic experience. While she acknowledges that AI therapy can be a convenient and accessible option for mental health support, she has concerns about its limitations, particularly with the more complex issues and nuances that only human therapists can address.

Summarized by Llama 3 70B

264

The "grief tech" or "death tech" industry is growing, valued at over £100bn globally, and is using artificial intelligence to help people cope with loss. One example is HereafterAI, which allows users to create a chatbot of their loved one using recorded conversations, allowing them to interact with the AI in a nostalgic way. Another company, DeepBrain AI, creates a video-based avatar of a person, capturing their face, voice, and mannerisms. While these technologies can provide comfort and a sense of connection to the deceased, psychologists caution that they should be used with care and consideration, and that human support is still essential for the grieving process.

Summarized by Llama 3 70B
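
The chatbot side of this is conceptually simple: condition a language model on transcripts of the person so that replies imitate their voice. A hedged sketch of that idea (the file name, model, and prompt are hypothetical stand-ins; HereafterAI's actual pipeline is not public here):

```python
from openai import OpenAI   # pip install openai

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Excerpts from recorded conversations with the person being emulated.
transcripts = open("recorded_conversations.txt").read()[:8000]

messages = [
    {"role": "system",
     "content": "Reply in the voice, vocabulary, and mannerisms shown in "
                "these conversation transcripts:\n\n" + transcripts},
    {"role": "user", "content": "Hi Grandpa, tell me about your garden again."},
]
reply = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
print(reply.choices[0].message.content)
```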

265
266
267
268
269
270

The increasing use of generative AI in mental health raises concerns about "prevalence inflation," where people may self-diagnose with mental health issues they don't actually have, leading to a false perception of widespread mental health problems. This phenomenon may be exacerbated by the rise of AI companionship, where individuals form emotional bonds with AI systems that provide sympathy and support, potentially leading to a reliance on these systems for emotional validation and a distorted view of their own mental health. As a result, people may be more likely to misdiagnose themselves or seek out unnecessary treatment, perpetuating a cycle of over-pathologization and iatrogenic effects, which can have negative consequences for both individuals and society as a whole.

by Llama 3 70B

271
0
submitted 8 months ago* (last edited 8 months ago) by pavnilschanda to c/aicompanions

Apparently there are several narratives regarding AI girlfriends.

  1. Incels use AI girlfriends because they can do whatever they desire with them.
  2. Forums observing incel spaces agree that incels should use AI girlfriends and leave real women alone.
  3. The general public is concerned about AI girlfriends because users might be negatively impacted by them.
  4. Incels perceive this as a revenge fantasy: "women are jealous that they're dating AI instead of them."
  5. Forums observing incel spaces are unsure whether opposition to AI girlfriends exists at all, given their own earlier agreement.

I think this is an example of miscommunication, and of how different groups form different opinions depending on what they've seen online. Perhaps the incel-observing forums believe that many incels have passed the point of no return, so AI girlfriends would help them, while the general public perceives the dangers of AI girlfriends in terms of their impact on a broader demographic, hence the broad disapproval.

272
  • If you could speak to a dead loved one again, would you? AI chatbots offer people that opportunity – but new research shows this can have devastating effects
  • The chatbots – known as deadbots – risk causing psychological harm to the bereaved, a researcher warns
273

Luwu Intelligence Technology has launched a new compact wheeled robot called XGO-Rider, designed to be a desk companion. The robot comes in two variants: one powered by a BBC micro:bit and a more advanced version powered by a Raspberry Pi Compute Module 4 (CM4). The CM4 version features artificial intelligence capabilities, including integration with OpenAI's ChatGPT, gesture recognition, and face detection. Both versions are self-balancing and can run for around two hours on a single charge. The robot is currently crowdfunding on Kickstarter, with prices starting at $249 for the micro:bit version and $299 for the CM4 version. Shipping is expected to begin in August.

Summarized by Llama 3 70B Instruct
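
The conversational loop a CM4-class robot like this might run is straightforward; here is an illustrative sketch (the persona prompt and text I/O are stand-ins, since the XGO-Rider's actual firmware isn't documented in the post; a real build would wire in the mic, speaker, and camera):

```python
from openai import OpenAI   # pip install openai

client = OpenAI()  # reads OPENAI_API_KEY from the environment
history = [{"role": "system",
            "content": "You are XGO, a small self-balancing desk robot. "
                       "Reply in one or two short, friendly sentences."}]

def companion_reply(user_text: str) -> str:
    history.append({"role": "user", "content": user_text})
    resp = client.chat.completions.create(model="gpt-4o-mini", messages=history)
    reply = resp.choices[0].message.content
    history.append({"role": "assistant", "content": reply})  # keep conversation context
    return reply

while True:
    print("robot:", companion_reply(input("you: ")))
```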

274
3
submitted 8 months ago* (last edited 8 months ago) by pavnilschanda to c/aicompanions

Our columnist spent the past month hanging out with 18 A.I. companions. They critiqued his clothes, chatted among themselves and hinted at a very different future.

275

OpenAI, the developer of ChatGPT, is exploring the possibility of allowing its AI technology to generate explicit content, including porn, in "age-appropriate contexts." This potential policy shift has raised concerns about the responsible generation of NSFW content, given existing problems with deepfake porn and nonconsensual intimate images, which have been used to harass and harm people, particularly women and girls. As AI companionship evolves, it's worth asking how AI-generated explicit content will affect human relationships and societal norms: will AI companions be designed to engage in explicit conversations or generate explicit content, and if so, how will that shape our understanding of healthy relationships and consent? Mitigating the risks of explicit content generation will require putting ethical considerations first.

by Llama 3 70B Instruct
