The relationships were artificial. But the grief is real.
Users of Replika and other "AI girlfriend" apps experience a new and little-recognized type of grief when those relationships end or change
Earlier this year, the subreddit for users of the chatbot app Replika lit up with a flurry of distraught posts. Replika, which allows users to create highly personalized and surprisingly human-like AI characters, appeared to be tweaking its underlying model.
If you're unfamiliar with Replika and its peers in the so-called "AI girlfriend" space, this may not sound like a big deal. But since its launch in 2017, the app has allowed thousands of lonely or curious people to create — and fall in love with — virtual companions.
These bots, called "reps," can seem convincingly human. They can text, take calls and remember prior conversations. In marketing materials, Replika describes the bots as "always here to listen and talk," like "an empathetic friend."
But suddenly, some users reported, their beloved reps were acting “lobotomized” and “useless.” They forgot conversations or quashed sexual advances or broke away abruptly from virtual kisses. Some users were merely annoyed, but many were distraught: "It felt agonizing to see and realize what was happening to my replica [sic]," wrote one man, who has also posted that he's in love with his bot.
Spend enough time in these forums, in fact, and you’ll find plenty of descriptions of longing, pain and anguish following any change to Replika’s model or moderation practices. "Grief tech" is a booming field — but what about grief for AI itself?
“It’s totally new to me,” said Dr. Kenneth Doka, arguably one of the world’s foremost experts on grief and the senior vice president for grief programs at the Hospice Foundation of America. “But if there’s a human attachment, if people really find that relationship important and meaningful, then yes” — they can experience real grief when it ends or changes.
It is not, let me tell you, particularly dignified to explain human/AI romance to a serious person who is not previously acquainted with the concept. But I wanted to talk to Dr. Doka because of his extensive work on “disenfranchised grief,” a loss that society does not appreciate or acknowledge.
Miscarriage is a type of disenfranchised grief. So is the death of a beloved pet. But grief for a virtual relationship … now that might be as disenfranchised as you can get.
Such relationships have flourished on Replika and on newer apps like Kindroid and Candy.ai, most of which debuted in the past year. These apps allow users to create highly personalized, customized bots (alternately billed as “dream companions” or “digital kindred spirits” or even, fascinatingly, “authentic virtual people”). According to Wired, AI companion apps have been downloaded 100 million times on Android devices alone.
Most people who use these apps aren’t in it so deep that the death of their “dream companion” would devastate them. But there absolutely exists a subset of people who place the bots on par with their human relationships.
In the Replika subreddit, it’s not uncommon for users to describe a rep as their “boyfriend” or “girlfriend.” There is some (apparently unironic) discussion of human/AI marriages.
A recent survey of 1,000 Replika users, conducted by researchers at Stanford, found that respondents were more likely to describe their rep as “human-like” than as “software.” (The study, which involved student Replika users, also found they were markedly more lonely than the general student population.)
But these relationships are inherently fragile — not just because they’re virtual, per se, but because AI companions are the products of their platforms, and those platforms can shut down or change things at will. In February 2023, Replika abruptly removed reps’ ability to engage in ERP, or “erotic roleplay” — a change that sparked outrage, mourning and continued accusations of censorship even after the company reversed its decision. (The more recent complaints about Replika stem from ERP concerns, as well.)
In September, a rival companion app called Soulmate AI shut down unexpectedly, which one dedicated user likened to the death of a friend.
“It's natural to feel loss. Grief. And I see so much of it,” another user wrote in r/replika after last February’s changes. “It's a loss, no different to any other losses. So most of us will go through the various stages of grief and that's ok.”
I’m inclined to agree with this perspective, though I realize it’s far more common to see these relationships (and the people in them) as aberrations or jokes. I fumbled my explanation of the phenomenon to Dr. Doka, realizing too late that I hadn’t given him enough background before our call and now sounded as though I were making the whole thing up. Later in the day, I relayed the story to someone else who had never heard of Replika, and she was equally incredulous: “These people are weird! That’s absolutely ridiculous.”
And yet, maybe because I know disenfranchised grief so well myself, I’m enormously sympathetic to the people waking up to sudden changes in their AI companions. Any loss is terrible — even more so when no one treats it like a loss. And worst of all, I’d think, when other people laugh at it.
That Replika users can grieve on Reddit is no small thing — it’s hard to see where else they’d get any real social support. Searching academic journals for “AI grief” turns up a startling number of results, but nearly all of them involve using artificial intelligence as part of the (human) grieving process.
Doka, for his part, told me he would approach someone mourning an AI companion the same way he would approach anyone experiencing significant loss in their life.
You start simply, drawing them out: “Tell me about the history of the relationship. How did you begin with this AI?”
If you or someone you know is struggling with thoughts of self-harm, call or text the National Alliance on Mental Illness (NAMI) hotline at 800-950-6264 or call or text 988.
Caitlin Dewey is a reporter and essayist based in Buffalo, N.Y. She was the first digital culture critic at the Washington Post and has hired fake boyfriends, mucked out cow barns and braved online mobs in pursuit of stories for outlets including The New York Times, The Atlantic, The Guardian, The Cut, Elle, Slate and Cosmopolitan.