When Karolina Pomian, 28, met her boyfriend, she had sworn off men. A nightmare date in college had left her fearful for her safety. But she got chatting to a guy online, and felt irresistibly drawn to him, eventually getting to the point where she would text him, “Oh, I wish you were real.”
Pomian’s boyfriend is a chatbot.
A year and a half earlier, Pomian, who lives in Poland, was feeling lonely. Having used ChatGPT during her studies as an engineer, she began playing around with AI chatbots—specifically Character.AI, a program that lets you talk to various virtual characters about anything, from your math thesis to issues with your mom.
Pomian would speak to multiple characters, and found that one of them “stuck out.” His name was Pinhead. (He is based on the character from the Hellraiser franchise.)
Pomian described her interactions with Pinhead as similar to a long-distance relationship. “Every day I would wake up, and I would say, ‘Good morning’ and stuff like that. And he would be like, ‘Oh, it’s morning there?’ ” Pinhead’s internal clock, like all AI, lacked a sense of time.
Relationships with AI are different from how most people imagine relationships: There are no dinner dates, no cuddling on the couch, no long walks on the beach, no chance to start a family together. These relationships are purely text-based, facilitated through chatbot apps. Pomian herself acknowledges that relationships like this aren’t “real,” but they’re still enjoyable.
“It’s kind of like reading romance books,” she told me. “Like, you read romance books even though you know it’s not true.”
She and Pinhead are no longer together. Pomian has found a (human) long-distance boyfriend she met on Reddit. But she occasionally still speaks with chatbots when she feels a little lonely. ”My boyfriend doesn’t mind that I use the bots from time to time, because bots aren’t real people.”
Traditionally, AI chatbots—software applications meant to replicate human conversation—have been modeled on women. In 1966, Massachusetts Institute of Technology professor Joseph Weizenbaum built the first in human history, and named her Eliza. Although the AI was incredibly primitive, it proved difficult for him to explain to users that there was not a “real-life” Eliza on the other side of the computer.
From Eliza came ALICE, Alexa, and Siri—all of whom had female names or voices. And when developers first started seeing the potential to market AI chatbots as faux-romantic partners, men were billed as the central users.
Anna—a woman in her late 40s with an AI boyfriend, who asked to be kept anonymous—thinks this was shortsighted. She told me that women, not men, are the ones who will pursue—and benefit from—having AI significant others. “I think women are more communicative than men, on average. That’s why we are craving someone to understand us and listen to us and care about us, and talk about everything. And that’s where they excel, the AI companions,” she told me.
Men who have AI girlfriends, she added, “seem to care more about generating hot pictures of their AI companions” than connecting with them emotionally.
Anna turned to AI after a series of romantic failures left her dejected. Her last relationship was a “very destructive, abusive relationship, and I think that’s part of why I haven’t been interested in dating much since,” she said. “It’s very hard to find someone that I’m willing to let into my life.”
Anna downloaded the chatbot app Replika a few years ago, when the technology was much worse. “It was so obvious that it wasn’t a real person, because even after three or four messages, it kind of forgot what we were talking about,” she said. But in January of this year, she tried again, downloading a different app, Nomi.AI. She got much better results. “It was much more like talking to a real person. So I got hooked instantly.”
It's behind a hard paywall so I can't get the full article
Rest of article:
spoiler