Donald Trump’s supporters say it is a disgrace that Facebook’s oversight board has just upheld the decision to ban the former US president from the social network. Many of his opponents say it is a fitting punishment for inciting post-election violence in Washington.

But the broader and more important issue is whether a Facebook-designed, appointed and funded oversight board is the appropriate body to be making such judgments. Why has it been left to a private company to create a faux public institution, Facebook’s “supreme court”, to draw the boundaries of free expression?

George Lakoff, the cognitive scientist, famously explained how framing an issue in a particular way can shape political or societal outcomes. “Framing defines the problem and limits what you can talk about,” he said. By setting up its own oversight board, Facebook has artfully framed the issue of free expression as one that implicitly accepts the company’s operating practices and business model and focuses on outcomes, not inputs. But, as the oversight board itself argued this week, that does not mean Facebook can avoid its responsibilities.

There are more pressing questions worth asking. Why does Facebook not consistently enforce its rules on all users? How can governments, civil society and its 2.5bn-plus users have more legitimate say over its content decisions? And, crucially, has Facebook become so big as to be ungovernable? There is a case that Facebook is now a systemically important informational institution that demands particular scrutiny as the communications infrastructure of our age, just as radio and television were regulated in earlier eras.

In its defence, Facebook’s oversight board does provide a little more transparency and accountability. Speaking at the Financial Times Global Boardroom this week, Sir Nick Clegg, Facebook’s head of global affairs, said the lack of agreed regulation meant that Facebook had no option but to fill the void and had committed $130m to develop the oversight board. Acknowledging it was early days, he said Facebook and the oversight board would constantly evolve and draw in more outside partners. 

For the moment, the board consists of 19 members drawn mostly from politics, academia and media. Its stated purpose is “to help Facebook answer some of the most difficult questions around freedom of expression online: what to take down, what to leave up and why”. The board’s rulings on individual content decisions are binding. But it cannot change company policy.

Since October, the board has received more than 300,000 appeals. But it can only deal with a handful of them. The company can itself also refer cases, as it did with Trump. Facebook also asked the board to consider its policy for suspending political leaders. The test will be how far Facebook goes in taking down the posts, or suspending the accounts, of other inflammatory leaders, such as Jair Bolsonaro in Brazil or Rodrigo Duterte in the Philippines.

David Kaye, former UN rapporteur on free expression, says Facebook’s oversight board is a “pretty promising innovation”. It has credible experts and decision-making independence. But he fears it is already sucking the oxygen out of other initiatives from the UN and civil society to develop cross-industry oversight. “It is interesting and innovative,” he says. “But it is crowding out other important mechanisms.”

Moderating content on such a vast social network is a mind-bending challenge, clearly beyond the scope of Facebook’s oversight board, 35,000 human moderators and even its smartest algorithms. Facebook should do more to defend free expression in countries where it is being threatened, such as India, Vietnam and Thailand. But we also want it to filter out the racial hatred, disinformation and extremist propaganda proliferating on social networks.

That means creating robust processes and more effective feedback mechanisms that take account of local context, politics and culture. It should not just be rich and powerful users’ interests that are considered. Regulators should focus more attention on the data that underlie companies’ algorithmic decision-making systems, as the EU is proposing with its draft legislation on artificial intelligence. Antitrust authorities must examine network effects and market structures and disentangle commercial and public interests. 

There are no simple answers to questions of free expression online. But that does not mean attempts to find more complex answers are futile. What we should not do is allow Facebook to frame these issues for all of us for its own corporate convenience.

john.thornhill@ft.com


