“Yet within the social platforms’ walled gardens, society and government are subordinate to private censorship, with social media companies, through their content moderation policies, now deciding what we see and say and even what policies our elected officials are permitted to publicly embrace on their platforms.”
The non-partisan Real Clear Foundation published a white paper by Kalev Leetaru, internet entrepreneur and theorist, that sets out in detail the ways in which the status quo in social media content moderation is unacceptable. Social media companies have an absolute right under the First Amendment to moderate their platforms as they wish. But “network effects” – the tendency of mass audiences to cluster around a few platforms like Facebook, Twitter and Google – ensure that a decision to remove a post or deplatform a site amounts to “private censorship” and exile from the debate.
Worse, when private companies establish rules for content, those rules are in Leetaru’s words “opaque and their enforcement uneven.” Witness the travails of John Stossel, the Emmy Award-winning television journalist and libertarian, who chronicled how he has been misquoted and mischaracterized by Facebook’s “fact checkers.” Liberals are upset that social media algorithms fuel division and hatred. Conservatives are hopping mad about having their posts slapped with “warning labels,” or deleted altogether.
So what to do?
Leetaru turns to Wikipedia as a model of transparency. Wikipedia has clear rules, a chronologically documented history of all content actions, and publicly archived conversations between contributors and moderators. On Wikipedia’s “Talk” pages, such open debate eventually leads to a settled consensus.
In a similar way, social media companies – in exchange for the extraordinary protections they receive from Section 230, far beyond any enjoyed by traditional publishers – should have reciprocal obligations. They should be required to “fully publish all their policies, guidelines, and precedents, eliminate their unpublished exemptions, clearly explain every decision in plain language, and offer rapid appeals” to make moderation “more objective and standardized.” Leetaru argues for tearing away the veil from social media removal decisions. This would benefit the companies themselves, because their decisions “would no longer seem as politically motivated and capricious.”
This approach fits in well with bipartisan legislation, the Platform Accountability and Consumer Transparency Act (PACT Act), co-sponsored by Sens. Brian Schatz (D-HI) and John Thune (R-SD), which would require more transparent processes in content moderation, along with an appeals process for the cancelled. Kalev Leetaru’s paper could offer a strong and detailed plan for fleshing out such transparency and accountability.