How an online transparency mandate can improve content moderation


Online content moderation has been a controversial topic in the country’s political discourse in recent years. It has also sat at the center of the nation’s political divide, with those on the Right supporting complete free speech on the internet, especially on social media, while those ideologically aligned with the Left favor speech and content restrictions under the guise of protection and safety. At the core of this division is government involvement in social media content regulation. The political rift was further complicated by the recent Supreme Court opinion in Murthy v. Missouri.

However, all hope may not be lost. Many of the problems surrounding the debate over online content and government moderation could be solved through policies that ensure and promote transparency. It’s a concept championed by Mike Matthys, co-founder of the Institute for a Better Internet.

“Due to the immense political difficulty to reach agreement on examples and definitions of online safety and viewpoint neutrality, the obvious solution is to ensure transparency of all types of content moderation and related enforcement actions,” Matthys said. “With a transparency mandate, Democrats could measure each company’s performance for specific types of online safety issues. Republicans could monitor their content moderation performance for any potential viewpoint neutrality issues. Platforms would be easily scrutinized and measured on a peer-to-peer basis with other platforms for both online safety and viewpoint neutrality.”

To accomplish this, social media platforms would need to be far more transparent about their inner workings, including publishing more detailed information on their protocols for moderating and regulating content.

“To ensure that such a transparency mandate would effectively shed light on these issues, the platforms would need to publish far more detailed reports than the generalized broad-brush summaries they publish today,” Matthys said. “Reporting on all types of enforcement actions, including those that are normally hidden from the affected users, would be required including the specific reason for the action and specific content categories of the affected content, and the online username affected, as long as the reports do not violate the privacy of individual users who have not opted out of reporting privacy, to reveal whether the actions discriminate against certain users or viewpoints.”

“Such transparency would change the tenor of the constant claims by the Right that platforms are censoring conservative viewpoints and the constant claims from the Left that platforms are not adequately censoring unsafe content,” he said. “The results and performance of each platform would be openly published monthly or quarterly and available for all to review and analyze.” 

Furthermore, the burden of transparency and the implementation of such a mandate would not fall on social media platforms alone. The government would also have to engage willingly and provide honest data regarding its role in the regulation of online content. This is a crucial step in moving the initiative forward.

“Transparency would need to also include all communications from the government and government-funded entities — with the exception of specific and actual law enforcement and national security emergencies,” Matthys said. “Academics and media would be able to judge whether such government communications are legitimately informative or thinly veiled threats and implied coercion to influence online censorship actions, as the Murthy plaintiffs suggested. These transparency protections would, of course, apply regardless of which political party is controlling the White House and administrative agencies in the future.”

“A transparency mandate would not require any changes to online content moderation policies by the platforms and would not require politically difficult definitions of online safety or viewpoint neutrality, which would be hard to enforce,” he said. “Rather, transparency relies on the expectation by the online platforms that all their content moderation actions or nonactions would be available for scrutiny by media and the public.” 

With 47 days until the presidential election, Matthys’s suggestion of a transparency mandate transcends political parties and ideologies. It would be effective regardless of who wins in November, former President Donald Trump or Vice President Kamala Harris. It serves the interests of the public and should enjoy widespread support.


“Leaders of both political parties are motivated to improve the quality of online content and the effectiveness and fairness of content moderation,” Matthys said. “Public surveys show that the public is not satisfied nor trusting of current online content moderation by the platform companies. There is considerable support for transparency in Congress, and only by avoiding any attempt to define the nitty-gritty details of online safety and viewpoint neutrality could such a proposal have a chance for bipartisan support.” 

Transparency is good, yet it’s a concept that seems to be missing from much of what the government does. As the influence of social media continues to grow, Matthys’s proposed transparency mandate is the right choice to help lessen the political strife and division surrounding online content.
