They meet mostly on Zoom, but I prefer to picture the members of this court, or council, or whatever it is, wearing reflective suits and hovering via hologram around a glowing table.
The members include two people who were reportedly on presidential shortlists for the US Supreme Court, along with a Yemeni Nobel Peace Prize laureate, a British Pulitzer winner, Colombia's leading human rights lawyer and a former prime minister of Denmark. The 20 of them come, in all, from 18 countries on six continents, and speak 27 languages among them.
This is the Oversight Board, a hitherto obscure body that will, over the next 87 days, rule on one of the most important questions in the world: Should Donald Trump be permitted to return to Facebook and reconnect with his millions of followers?
The decision has major consequences not just for American politics, but also for the way in which social media is regulated, and for the possible emergence of a new kind of transnational corporate power at a moment when almost no power seems legitimate.
The board will seriously examine the Trump question, guided by Facebook's own rules as well as international human rights law. If Facebook accepts its rulings, as it has pledged to do, as well as the board's broader guidance, the company will endow this obscure panel with a new kind of legitimacy.
"Either it's nothing, or it's the New World Order," said a lecturer at Harvard Law School who studies content moderation, Evelyn Douek, who pushed Facebook to send the Trump case to the Oversight Board.
It might surprise you to know that such a board exists — that one of the world's most powerful executives would go to such lengths to give up control of a key tool, the delete key.
But after four years of unending criticism for being too slow to act on the rise of right-wing populism on the platform, and parallel complaints from the right over alleged censorship, you can see why Mark Zuckerberg, Facebook's chief executive, was drawn to the idea of handing the thorniest calls off to experts, and washing his hands of the decisions.
Zuckerberg floated the notion of an independent content moderation body back in 2018, and Facebook finally appointed its members last May. The company put US$130 million ($180.3m) into a legally independent trust with a staff of 30, which two people involved said paid six figures annually to each board member for what has become a commitment of roughly 15 hours a week.
The board is structurally independent, and Zuckerberg has promised its decisions will be binding. The members I spoke to said they felt no particular obligation to Facebook's shareholders.
The company, meanwhile, has pledged to abide by decisions on topics as varied as nudity and hate speech, in hopes that it will ultimately shield Zuckerberg from making endless, impossibly controversial public choices.
But the board has been handling pretty humdrum stuff so far. It has spent a lot of time, two people involved told me, discussing nipples, and how artificial intelligence can identify different nipples in different contexts.
Board members have also begun pushing to have more power over the crucial question of how Facebook amplifies content, rather than just deciding on taking posts down and putting them up, those people said.
In October, it took on a half-dozen cases, about posts by random users, not world leaders: Can Facebook users in Brazil post images of women's nipples to educate their followers about breast cancer? Should the platform allow users to repost a Muslim leader's angry tweet about France? It is expected to finally issue rulings at the end of this week, after what participants described as a long training followed by slow and intense deliberations.
And it has faced questions about whether it would ever be more than a public relations gesture, including from critics who started an alternate "Real Facebook Oversight Board" to call for a sweeping crackdown on the platform. So when Facebook suspended Trump's account indefinitely after the attack on the Capitol on January 6, the Oversight Board's leaders didn't disguise their eagerness to take on a big and meaty question.
"This is the kind of case the oversight board is for," said one of the board's co-chairs, Jamal Greene, a former Supreme Court clerk and Kamala Harris aide who is the Dwight Professor of Law at Columbia University Law School and a prominent legal scholar.
Another board co-chair, the conservative former federal judge and Stanford law professor Michael McConnell, told me before Facebook finally referred the case that it was "quite appropriate for the board to hear" the questions raised by the Trump ban.
It's hard to imagine a more consequential case. The decisions by Twitter and Facebook to bar Trump immediately reshaped the American political landscape. In the course of a few hours after the Capitol riots, they simply vapourised the most important figure in the history of social media.
The board took up the case Thursday, and will appoint a panel of five randomly selected board members, at least one of them American, to decide what is to be done with Trump's account. The full, 20-person board will review the decision, and could reinstate Trump's direct connection to millions of supporters, or sever it for good.
The odds aren't bad for Trump. Kate Klonick, an assistant professor at the St. John's University School of Law who described platforms as "New Governors" in an influential 2018 Harvard Law Review article, said the reaction to the Trump ban among legal academics has been "tepid and very qualified support for the outcome from people who are experts in free speech, mixed with long-term fear about what this is all going to mean for democracy going forward."
Noah Feldman, the Felix Frankfurter Professor of Law at Harvard Law School, who first brought the notion of a Facebook Supreme Court to the company, said he thought conservatives dismayed by the recent crackdown might be surprised to find an ally in this new international institution. "They may come to realise that the Oversight Board is more responsive to freedom of expression concerns than any platform can be given real world politics," he said.
Nick Clegg, Facebook's vice president for global affairs, said he was "very confident" the board would affirm the company's decision to suspend Trump's account the morning after the mob stormed the Capitol, though less sure what recommendations it would make about allowing him to return to the platform in the future.
The Oversight Board appears particularly relevant right now because it represents a new kind of governance, in which transnational corporations compete for power with democratically elected leaders.
The board doesn't have "Facebook" in its name, or Facebook blue on its website, for a reason: Clegg said he hoped it would "develop a life of its own," get buy-in from other platforms (no dice so far), and "even be co-opted in some shape or form by governments."
And if this sounds far-fetched, or sinister, consider the trends in public opinion: A global survey last week by the P.R. firm Edelman found that governments, the media and nongovernmental organisations alike had seen the public's faith sink in the COVID era. The all-caps headline: "BUSINESS NOW ONLY INSTITUTION SEEN AS BOTH COMPETENT AND ETHICAL."
That's how some of the board's members see it as well.
"Practically the only entities that I trust less than the companies would be the government," McConnell said.
To others, the idea of global corporations becoming de facto governments is dystopian — and the board's promise reflects low expectations for democratic governance. "No board, whether corporate or 'independent,' can or should replace a parliament," said Marietje Schaake, a Dutch politician who is a member of the "real" board.
"Both the storming of the Capitol and social media companies' panicked reactions have laid bare the depth of unchecked power social media companies hold over the public debate and public safety. The balancing and weighing of rights and interests belongs with democratically legitimate decision makers. There must be accountability beyond self-regulation."
The board's decision in the Trump case — due before the end of April — has obvious implications here in the United States, but it could also set the company's policy in other big democracies with leaders of the same new right-wing populist ilk, like Brazil, India and the Philippines. For them, too, Facebook is a major source of power, and they're now eyeing Palo Alto warily.
The Trump ban is "a dangerous precedent," an official in India's ruling party tweeted. In Brazil, as in the United States, conservatives have begun shifting their followers to Telegram, a messaging service.
The emergence of this new kind of governance, and this new kind of decision, signals the return of gatekeeping. The moves also underscore who really keeps the gate, and who has lost that power. That space between government and corporate power used to be occupied by a widely trusted mass media.
"The media played a role of this sort at a certain point in history, as a kind of trusted intermediary, but there are good reasons for it not to play that role anymore," Green said.
"There's got to be something in between private commercial incentives and government."