The City Law School’s Professor Elaine Fahey says the controversial board raises concerns about censorship and whether the social media giant holds unaccountable corporate power over free speech.
By Professor Elaine Fahey (re-published from The Conversation)
Referred to by some as Facebook’s “supreme court”, the oversight board tasked with reversing or upholding Facebook’s content moderation decisions has ruled that the social media company’s ban of Donald Trump should be maintained.
The board upheld Facebook’s January 7 decision to ban then-President Trump from posting content on Facebook and Instagram, after his social media activity was partially blamed for inciting the violence at the January 6 Capitol riots, during which five people died. However, the board noted that indefinite suspensions were not described in Facebook’s content policies – and so the ban will be reviewed again in six months.
This outcome is hardly surprising. Leading legal scholars had argued against reinstating Trump, whose words carry significant weight and whose apparent support for the rioters violated Facebook’s community standards.
But the decision is also controversial. Those loyal to Trump may see the decision as partisan, or else as a dangerous precedent for censoring speech on the internet. Others have argued that banning Trump reveals a double standard – with other sitting and former leaders around the world avoiding a ban despite also being culpable for inciting violence. The oversight board must focus on building a reputation for consistency if it’s to be taken seriously as an independent regulator of online speech.
The oversight board is one of the most controversial and significant bodies ever developed to moderate content on the internet. Created in 2019, it is the “first body of its kind in the world” – an expert-led, independent organisation with the power to impose binding decisions on Facebook and to overrule the company’s chief executive, Mark Zuckerberg.
The origins of the board lie in law professor Noah Feldman’s idea that Facebook “needed its own supreme court”, given the volume and importance of the speech the platform hosts. So did it get one?
The board has about 20 members, made up of experts and civic leaders, who have been through special training. Unlike many international courts, it is balanced in regional, gender and racial terms. It’s funded through a US$130 million (£93.7 million) trust from Facebook. In 2020, Facebook unveiled the board’s bylaws and announced its members and their first cases.
A case is referred to the board either by Facebook itself or through direct submissions from users who disagree with Facebook taking down their content or leaving someone else’s content up. Facebook has agreed to respect and act on the board’s decisions unless it would be unlawful to do so.
Does it work?
Legal scholars and experts are wary about judging the board in its early stages of development. Some argue that the board’s first set of decisions has shown a decidedly “libertarian tilt”, mainly overturning decisions to take down posts about misinformation. The board has arguably appeared more concerned with the risks of excluding speech from public discussion than with the risks of including it.
But the limited nature of the board’s binding authority has been criticised. And experts argue that the current model arguably gives Facebook significant power to determine which cases go before the board in the first place.
Others are more hopeful. Legal expert Kate Klonick has eloquently argued, as part of a high-profile study of its development, that the board has great potential to set new precedents for user participation in the governance of private platforms. It’s this wider impact of the board’s decisions, with the potential to guide state policymakers and the moderation guidelines of other social media companies, that makes its rulings so consequential in the wider field of internet law.
This is also what makes the board so controversial, raising concerns about censorship and whether Facebook holds unaccountable corporate power over free speech. In the US, opinions about Trump’s social media ban are split along partisan lines, showing that the board’s decisions are rarely going to satisfy everyone.
Trump still online
In any case, in advance of the board’s decision, Trump announced he had developed a new “communications platform” to share his press releases. It remains to be seen whether the ability of users to share content from this new platform onto Facebook will render the board’s decision moot. Trump’s rival “social media platform”, announced in March 2021, is yet to appear online.
Facebook’s oversight board has the potential to become one of the best-funded and most interesting international court-like bodies in the world. Its success in achieving consistent decisions and nudging other platforms to self-regulate remains to be seen – but, until November 5 2021, its decision to keep Trump off Facebook and Instagram is binding.