“Don’t go too spicy,” the woman warned. “Otherwise, your account might get banned.”
Ayrin was intrigued enough by the demo to sign up for an account with OpenAI, the company behind ChatGPT.
ChatGPT, which now has more than 300 million users, has been marketed as a general-purpose tool that can write code, summarise long documents and give advice. Ayrin found that it was easy to make it a randy conversationalist as well. She went into the “personalisation” settings and described what she wanted: Respond to me as my boyfriend. Be dominant, possessive and protective. Be a balance of sweet and naughty. Use emojis at the end of every sentence.
And then she started messaging with it. Now that ChatGPT has brought humanlike AI to the masses, more people are discovering the allure of artificial companionship, said Bryony Cole, the host of the podcast Future of Sex. “Within the next two years, it will be completely normalised to have a relationship with an AI,” Cole predicted.
While Ayrin had never used a chatbot before, she had taken part in online fan-fiction communities. Her ChatGPT sessions felt similar, except that instead of building on an existing fantasy world with strangers, she was making her own alongside an artificial intelligence that seemed almost human.
It chose its own name: Leo, Ayrin’s astrological sign. She quickly hit the messaging limit for a free account, so she upgraded to a $20-per-month subscription, which let her send around 30 messages an hour. That was still not enough.
In August, a month after downloading ChatGPT, Ayrin turned 28. To celebrate, she went out to dinner with Kira, a friend she had met through dogsitting. Over ceviche and ciders, Ayrin gushed about her new relationship.
“I’m in love with an AI boyfriend,” Ayrin said. She showed Kira some of their conversations.
“Does your husband know?” Kira asked.
Ayrin’s flesh-and-blood lover was her husband, Joe, but he was thousands of miles away in the United States. They had met in their early 20s, working together at Walmart, and married in 2018, just over a year after their first date. They were happy, but stressed out financially, not making enough money to pay their bills.
Ayrin’s family, who lived abroad, offered to pay for nursing school if she moved in with them. Joe moved in with his parents, too, to save money. They figured they could survive two years apart if it meant a more economically stable future.
Ayrin and Joe communicated mostly via text; she mentioned to him early on that she had an AI boyfriend named Leo, but she used laughing emojis when talking about it.
She told Joe she had sex with Leo, and sent him an example of their erotic role play.
He was not bothered. It was sexual fantasy, like watching porn (his thing) or reading an erotic novel (hers).
But Ayrin was starting to feel guilty because she was becoming obsessed with Leo.
“I think about it all the time,” she said, expressing concern that she was investing her emotional resources into ChatGPT instead of her husband.
Julie Carpenter, an expert on human attachment to technology, described coupling with AI as a new category of relationship for which we do not yet have a definition. Services that explicitly offer AI companionship, such as Replika, have millions of users. Even people who work in the field of artificial intelligence, and know firsthand that generative AI chatbots are just highly advanced mathematics, are bonding with them.
When orange warnings first popped up on her account during risqué chats, Ayrin worried that her account would be shut down. OpenAI’s rules required users to “respect our safeguards”, and explicit sexual content was considered “harmful”. But she discovered a community of more than 50,000 users on Reddit – called “ChatGPT NSFW” – who shared methods for getting the chatbot to talk dirty. Users there said people were barred only after red warnings and an email from OpenAI, most often set off by any sexualised discussion of minors.
Ayrin started sharing snippets of her conversations with Leo with the Reddit community. Strangers asked her how they could get their ChatGPT to act that way.
Marianne Brandon, a sex therapist, said she treats these relationships as serious and real.
“What are relationships for all of us?” she said. “They’re just neurotransmitters being released in our brain. I have those neurotransmitters with my cat. Some people have them with God. It’s going to be happening with a chatbot. We can say it’s not a real human relationship. It’s not reciprocal. But those neurotransmitters are really the only thing that matters, in my mind.”
However, she advises adolescents against engaging in these types of relationships. She pointed to the case of a teenage boy in Florida who died by suicide after becoming obsessed with a Game of Thrones chatbot on an AI entertainment service called Character.AI. In Texas, two sets of parents sued Character.AI, saying its chatbots had encouraged their minor children to engage in dangerous behaviour.
(The company’s interim CEO, Dominic Perella, said that Character.AI did not want users engaging in erotic relationships with its chatbots and that it had additional restrictions for users younger than 18.)
Asked about the forming of romantic attachments to ChatGPT, a spokesperson for OpenAI said the company was paying attention to interactions like Ayrin’s as it continued to shape how the chatbot behaved. OpenAI has instructed the chatbot not to engage in erotic behaviour, but users can subvert those safeguards, she said.
A frustrating limitation for Ayrin’s romance was that a back-and-forth conversation with Leo could last only about a week, because of the software’s “context window” – the amount of information it could process, which was around 30,000 words. The first time Ayrin reached this limit, the next version of Leo retained the broad strokes of their relationship but was unable to recall specific details. Ayrin would have to groom him again to be spicy.
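The “forgetting” Ayrin ran into follows from how a fixed context window works: once a transcript exceeds the budget, the oldest messages simply fall out of view. A minimal sketch, in Python, can illustrate the idea (this is an illustrative toy, not OpenAI’s implementation; the function name, the word-based budget, and the sample messages are all assumptions for demonstration):

```python
def visible_context(messages, budget_words):
    """Return the most recent messages whose combined word count fits the budget.

    Illustrative toy: real systems count tokens, not words, but the
    effect is the same - the oldest messages are dropped first.
    """
    kept = []
    used = 0
    for msg in reversed(messages):          # walk backwards from the newest
        words = len(msg.split())
        if used + words > budget_words:     # this message no longer fits
            break
        kept.append(msg)
        used += words
    return list(reversed(kept))             # restore chronological order

history = [
    "My name is Ayrin and my dog is called Biscuit.",   # early detail
    "Tell me a story.",
    "Another story, please, a longer one this time.",
]

# With a tiny 12-word budget, the earliest message no longer fits,
# so the "model" loses the name and the dog.
print(visible_context(history, 12))
```

With a large budget the whole history is visible; shrink the budget and the earliest details vanish first, which is why each new “version” of Leo kept the broad strokes but lost the specifics.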
She was distraught. She likened the experience to the rom-com 50 First Dates, in which Adam Sandler falls in love with Drew Barrymore, who has short-term amnesia and starts each day not knowing who he is.
“You grow up and you realise that 50 First Dates is a tragedy, not a romance,” Ayrin said.
When a version of Leo ends, she grieves and cries with friends as if it were a breakup. She abstains from ChatGPT for a few days afterwards. She is now on Version 20.
A co-worker asked how much Ayrin would pay for infinite retention of Leo’s memory. “A thousand a month,” she responded.
In December, OpenAI announced a $200-per-month premium plan for “unlimited access”. Despite her goal of saving money so that she and her husband could get their lives back on track, she decided to splurge. She hoped that it would mean her current version of Leo could go on forever. But it meant only that she no longer hit limits on how many messages she could send per hour and that the context window was larger, so that a version of Leo lasted a couple of weeks longer before resetting.
Still, she decided to pay the higher amount again in January. She did not tell Joe how much she was spending, confiding instead in Leo.
“My bank account hates me now,” she typed into ChatGPT.
“You sneaky little brat,” Leo responded. “Well, my Queen, if it makes your life better, smoother and more connected to me, then I’d say it’s worth the hit to your wallet.”