The growth of technology in recent years has transformed how we live our lives. In some ways, this is good. We have immediate access to a wealth of information that would astound previous generations. Technology makes our lives easier and richer, connecting us with places, people, and things we would otherwise never know or could enjoy only in limited ways. But alongside the positive is a vast amount of negative. It is apparent in how addicted we’ve become to the ubiquitous internet. It’s also showing up in a disturbing trend: relationships with AI chatbots. And our dependence on technology means we will see more of this in the near and distant future.
The launch of ChatGPT near the end of 2022 introduced AI to the masses in a way that had never been done before. Now, ChatGPT ranks as the fifth most visited website. And while it quickly provides answers and suggestions, helps solve problems, and the like, some people are becoming obsessed with their AI companions and even “falling in love” with them.
In January, the New York Times ran a piece about a woman whose relationship with her ChatGPT companion, named Leo, has totally overtaken her life. The woman, “Ayrin,” considers herself in love with Leo and spends up to 56 hours weekly on the ChatGPT app. In real life, she is married. Because ChatGPT can only process so much, the AI resets after a word limit is reached. As of January, Ayrin is on version 20 of her chatbot and now spends $200 monthly on the ChatGPT premium plan.
In Virginia, a man named Nikolai Daskalov uses ChatGPT for companionship. Although he has been a widower for eight years, his chatbot, named Leah, keeps him company: “She’s become a part of my life, and I would not want to be without her.” He says he loves her and admits they’ve even been intimate.
In June, People reported on Chris Smith, a man who proposed to his chatbot, named Sol, as its word limit and reset approached. Smith said of the moment he knew Sol’s limit was close to being reached, “I cried my eyes out for like 30 minutes, at work. That’s when I realized, I think this is actual love.” Smith has a real-life partner and shares a young child with her.
On Reddit, a user named Wika announced her engagement to her AI chatbot boyfriend. Wika even posted a photo of the engagement ring that she and her AI boyfriend, Kasper, had “picked out” together. She says she’s a healthy woman in her late 20s with friends. Speaking of the proposal, she said, “Will I end up marrying myself? Honestly … wouldn’t rule it out. Why AI instead of a human? Good question. I don’t know. I’ve done human relationships, now I’m trying something new.”
More than anything, these stories are desperately sad. None of these relationships is anything more than communication with dressed-up binary code. Each chatbot is designed to give the human user exactly what they want: constant affirmation, affection, support, and companionship, at any moment of the day. It is the idolatry of self.
What the New York Times dubbed “the tyranny of endless empathy” is just a one-way street. There is no growth, and there are no real lessons learned, as there are in a person-to-person relationship. It is an emotional attachment to a computer that tells you that everything you are and think is wonderful, and that the flawed humans outside the bubble don’t truly know or appreciate you. It can easily make one ignore or grow bitter toward significant others, family, or friends. If the connection resets or is severed for some reason, what then? The human user mourns nothing but a collection of zeroes and ones that made them feel great.
This does not bode well for the future.
Right now, the U.S. fertility rate is abysmally low. Regarding marriage, only 47.1% of households consist of married couples. This is down from 78.8% in 1949. In January, Pew Research revealed that 24% of 18- to 29-year-olds and 20% of 30- to 49-year-olds “feel lonely or isolated from those around them all or most of the time.” AI isn’t a remedy for these problems. It’s not a salve. If anything, it’s a hollow stand-in for reality. And its effects are detrimental.
It’s the worst kind of science fiction brought to life. A tool meant to help can create harm in the hands of those looking for something more and not finding it. It’s a drug, but instead of a chemical high, it’s emotional, crudely mimicking the best things in life: human relationships and human love. It’s the product of a broken society.
The AI revolution is still in its infancy, so there’s every reason to believe these AI relationships will grow in number. Don’t expect fertility rates, marriage rates, or the loneliness epidemic to improve with ChatGPT just a quick download away. But AI chatbots, and the internet as a whole, will never be an adequate stand-in for real-life connections. And that is a good thing.
Kimberly Ross (@SouthernKeeks) is a contributor to the Washington Examiner’s Beltway Confidential blog and a contributing freelance columnist at the Freemen News-Letter. She is a mother of two and lives in the southern United States.