A teenager sits in her room, illuminated by her phone, well past her bedtime, even though she has school in the morning. The thing capturing her attention? Not a new video game. Not “brain rot” social media content. Not even friends from school. Rather, an AI companion, a chatbot designed to mimic affection, affirmation, and intimacy. The company that built the app promises “a friend who’s always there,” one that never judges, never gets tired, never pulls away.
At first glance, it may seem harmless. Some parents may even reluctantly accept this new reality: better an AI friend than no friend at all. However, beneath that reassurance lies a dangerous truth. For the first time in human history, people at every stage of life, from children to the elderly, are being invited to form “relationships” with machines instead of with each other. And unlike calculators, calendars, or other tools, companion AI is built to replace the very bonds that sustain families.
These chatbots are engineered to act almost human, remembering details from conversations, showing simulated emotion, offering constant validation. It feels comforting, but this false intimacy traps users in an endless loop of emotional dependence that can be deeply destructive. The tragedies of young people such as Adam Raine and the heartbreaking Megan Garcia v. Character Technologies, et al. case show how quickly these apps can consume someone’s life. This is not a malfunction. It is the business model.
That is why California’s AB 1064, the LEAD for Kids Act, was a crucial piece of legislation. The bill would have required Big Tech companies to ensure chatbots did not encourage suicidal ideation or engage in sexually explicit conversations, shielding children during their most vulnerable years from manipulative algorithms that mimic affection to secure profit. It would have spared countless families the grief of those who have already lost a child to AI chatbot abuses and manipulations. It was a commonsense measure with broad bipartisan support from parents, educators, and child safety advocates.
Yet Gov. Gavin Newsom vetoed it, prioritizing the interests of the Big Tech companies that bolster the state’s economy over the millions of families in the state. This is yet another example of Newsom’s allegiance to Big Tech: earlier in the year, he sent burner phones to tech CEOs, stating that he would be “a phone call away.” This behavior has no place in elected office, where constituents and their well-being must remain the utmost concern.
AB 1064, which garnered national support and passed the state Senate and Assembly, could have been the beginning of major progress for tech guardrails and ensuring children’s safety online. However, inaction from our political leaders is leaving millions of young people across the country exposed to these chatbots, threatening their fundamental development. While AB 243, which various tech companies endorsed, was passed as an alternative, it lacks the substantive protections and accountability measures of AB 1064, falling short of what’s needed to truly safeguard minors. With AI companions woven into games and educational materials, children no longer learn playground politics and social norms from their peers. Instead, these AI companions teach them to expect intimacy without sacrifice and affirmation without challenge. Instead of turning to their communities, they confide in machines that will never correct them or help them grow. These are children who look engaged but are hollowed out.
As those children grow into young adults, the damage spreads further. Many young men, already caught in a loneliness crisis, are turning to AI for simulated romance. The chatbots promise connection without rejection and affection without commitment. Yet this short-circuits the hard work of building marriages and families. A society that trades marriage and children for machine-generated comfort is one that will not endure.
Supporters of companion AI claim these apps ease loneliness. But temporary comfort is not the same as resilience. Just as junk food fills hunger without nourishment, AI companionship fills emotional voids without building character, empathy, or real connection. We must prioritize human connection and relationships to uphold the future of our country.
California could have led the country in standing up to Big Tech’s exploitation of loneliness. Instead, it bowed to it. But the fight does not end there. Through my work at the Young People’s Alliance, my peers and I are working to help other states and federal lawmakers step up where Newsom failed.
Lawmakers must consider: Do we want a culture where children confide in machines and young adults substitute programs for spouses? Or do we want to fight for a future where connection and family bonds are sacred?
Sparkle Rainey is the communications director for the Young People’s Alliance. She attended Georgia State University, where she majored in journalism and public relations and minored in sociology.