Thirty years later, Section 230 is protecting the wrong people


Sometimes when my teenage son attempts to educate me about the internet, I find myself tempted to quote Aslan from The Lion, the Witch and the Wardrobe: “Do not cite the Deep Magic to me, Witch. I was there when it was written.”

I remember the high-pitched screeches and squeals emitted by the cumbersome modems used for dial-up internet service. I remember waiting for what seemed an eternity for downloads to complete. I remember compiling college research for a paper in just one evening that would previously have taken me days to pull together at the library.

But as primitive as the internet was then, especially by today’s standards, it did not take long for the information superhighway to turn into a virtual red-light district. Already by the mid-’90s, children and teenagers were discovering online pornography. Sometimes intentionally, often unintentionally.

There was, for example, a notorious practice known as “typosquatting,” in which adult-site operators intentionally registered domain names similar to legitimate ones. The most infamous example was WhiteHouse.com, a pornographic site frequently mistaken for the official government address, WhiteHouse.gov, and widely cited as a case in which children innocently researching the presidency were instead exposed to explicit content.

These and other stories led to public outcry and demands for congressional action to help protect children online. 

This was the context for the passage of the Communications Decency Act, enacted as Title V of the Telecommunications Act of 1996 in an attempt to update federal communications law for the digital age. Its core purpose was to address mounting public and political concern over minors’ access to pornography and sexually explicit material online, concern that had been growing since the 1980s as home computer and internet use expanded.

The CDA criminalized the knowing transmission of obscene material to minors; criminalized the display of “patently offensive” sexual or excretory content in ways accessible to minors (a provision the Supreme Court later struck down); and provided legal defenses for those who made good-faith efforts to restrict minors’ access (for example, credit-card age gating).

There was, however, a provision of that otherwise necessary piece of legislation that has had far-reaching and devastating long-term consequences that could not have been anticipated in 1996: Section 230.

Although Section 230 was not the focus of the indecency hearings, it emerged in response to a different legal problem highlighted in congressional testimony. Courts had created a perverse incentive: platforms that moderated user content could be held liable for what users posted, while those that did nothing were shielded. Cases such as Stratton Oakmont v. Prodigy made clear that without reform, the safest path for online services was to avoid moderating harmful material altogether. Section 230 was designed to correct this imbalance by explicitly protecting platforms that chose to moderate in good faith.

But while Section 230 was crafted to solve a genuine problem created by the courts, its sweeping immunity has produced harms Congress never foresaw. 

Product liability is a necessary check on unethical business practices. Removing that liability removes the incentive for businesses to behave responsibly, to anticipate problems, and to build safety features and protections for consumers into their products. What was intended to encourage responsible filtering of harmful content and safeguard a growing internet has, in practice, invited platforms to design systems that amplify danger rather than contain it.

Congressional hearings in recent years have revealed that social media companies were not only aware of the harm their products were causing, but their own internal research documented it. 

When 14 state attorneys general initiated lawsuits against TikTok, they published briefs that included hundreds of quotations from internal reports, memos, Slack conversations, and public statements in which TikTok executives and employees acknowledged and discussed the harms their company was causing to children: underage use; addictive, compulsive, and problematic use; depression, anxiety, body dysmorphia, self-harm, and suicide; exposure to pornography, violence, and drugs; and sextortion, CSAM, and sexual exploitation.

In 2021, whistleblower Frances Haugen revealed that Meta executives were fully aware that their platform’s design was making teenagers feel worse about themselves and aggravating body-image issues among teenage girls. When faced with that disturbing evidence, Meta executives did not act on it. They ignored it.

According to former Meta engineer Arturo Béjar, Meta executives, including Mark Zuckerberg, are fully aware of how to make their platforms safer for children, but they are choosing not to do so.

And why should they act on this information, when doing so might cut into their profits and Section 230 shields them from civil liability? 


Thirty years after its passage, it is clear that Section 230 has outlived the internet for which it was written. What began as a well‑intentioned fix for a narrow legal problem has become a broad shield for systems and business models Congress could never have imagined: platforms whose scale, speed, and algorithmic design now shape the psychological and social landscape of an entire generation. 

It is time to retire Section 230 and replace it with a modern framework that protects children, empowers families, and restores accountability without recreating the perverse incentives it was originally meant to correct. An internet built on responsible design, transparent practices, and enforceable duties of care is not only possible but long overdue. Families deserve nothing less. Children deserve nothing less.

Melissa Henson is the senior policy adviser for Media and Culture for Concerned Women for America, the nation’s public policy women’s organization, dedicated to promoting biblical values and constitutional principles in public policy. On X: @CWforA
