Meta’s oversight board called out the Facebook parent company for allowing celebrity and other high-profile accounts to break its content rules, leaving flagged posts up longer while under review, and giving preferential treatment to its business partners.
The oversight board concluded in a report released Tuesday that Facebook misused its two-tiered content moderation system, which handles VIP users differently from everyone else. The VIP list, also known as “cross-check,” includes users such as former President Donald Trump, commentator Candace Owens, and even the company’s founder, Mark Zuckerberg. If a VIP is thought to have broken the rules, their content is reviewed by human moderators rather than by algorithms. Meta asked the board in September to assess the system and offer recommendations for improvement. The report is based on several months of reviewing internal documents and speaking with staff.
“While Meta told the board that cross-check aims to advance Meta’s human rights commitments, we found that the program appears more directly structured to satisfy business concerns,” the report said. “The board understands that Meta is a business, but by providing extra protection to certain users selected largely according to business interests, cross-check allows content which would otherwise be removed quickly to remain up for a longer period, potentially causing harm.”
The board also found that Meta failed to track whether cross-check reviews actually produced more accurate moderation decisions, and it criticized the lack of transparency around the entire system.
Members of the oversight board noted that the system did not have the effects Meta wanted from it. “I sincerely believe that there are a lot of people at Meta who do believe in the values of free speech and the values of protecting journalism and protecting people working in civil society,” board member Alan Rusbridger, a British journalist, told The Verge. “But the program that they had crafted wasn’t doing those things. It was protecting a limited number of people who didn’t even know that they were on the list.”
The board recommended that Meta hide cross-check users’ posts marked as “high severity” while the content is under review, maintain a review queue for business partners that is separate from other queues, and publish “clear, public criteria” so outsiders can verify which accounts are included in the cross-check system.