Facebook’s Oversight Board, which will soon decide whether Donald Trump can return to the platform, has expanded its content-moderation powers, a sign that the California group sees this independent panel as the answer to its editorial dilemmas.
The social media giant’s “supreme court”-style body, which reviews contested decisions to remove content from Facebook or Instagram, will now also examine posts left online despite user reports, according to statements published on Tuesday.
“Allowing users to appeal content they want removed from Facebook is a significant expansion of the Oversight Board’s capabilities,” said Thomas Hughes, the board’s director.
“The board was created to ensure that Facebook makes fewer of these critically important content decisions on its own, and that better decisions can be made through an independent and transparent process that works to protect human rights and freedom of expression,” he said.
The board, which took office last year, issued its first rulings, which are binding on Facebook, in January.
The most anticipated is expected by the end of the week: it concerns former US President Donald Trump, banned from Facebook (and other social networks) after the January 6 riot at the US Capitol.
The Republican billionaire came under fire for his repeated, unfounded allegations of electoral fraud and his words of encouragement to the hundreds of supporters who violently stormed the seat of the US Congress.
Fears for freedom of expression
In the United States, this digital ostracism was widely described as regrettable but necessary by civil society and many elected officials. In Europe, however, it drew criticism from associations and leaders, including German Chancellor Angela Merkel, who are concerned about tech companies’ power over free speech.
Accused for years of censoring certain voices or, on the contrary, of enabling disinformation, harassment, and the activities of violent groups, Facebook eventually created this oversight body, made up of some twenty independent international members, including professors, lawyers, journalists, and human rights defenders.
“We are delighted that the Oversight Board is expanding its scope and influence,” said Guy Rosen, the group’s vice-president in charge of platform integrity.
But for some observers, the very existence of this body shows that Facebook increasingly behaves like a publisher that must make editorial decisions, rather than a neutral host enforcing rules without political judgment.
The social network shapes itself through decisions to remove content here and allow it there, according to Emily Bell, a professor at the Columbia Journalism School.
“This is what a news outlet does,” she explained on Twitter. The job of founder and chief executive Mark Zuckerberg and the other executives of the California corporation “is to run a huge advertising company. So the Oversight Board becomes the editorial board by default.”
But given the board’s limited capacity – it has issued only a handful of decisions and recommendations so far – Emily Bell and other observers doubt its ability to tackle core issues that go beyond emblematic cases.
“Facebook knows that it cannot do moderation at scale,” she concluded. “They will use humans (alongside automated systems, editor’s note) to judge high-profile cases.”
“Given Facebook’s lack of transparency and consistency in its handling of false content, we do not see how expanding the board will ensure that disinformation is systematically removed from the platform,” said Jo Lukito, a professor at the University of Texas School of Journalism.
The board also ruled on Tuesday on Facebook’s decision to remove a video, posted by a user in the Netherlands, showing a young child meeting adults with blackened faces dressed as “Zwarte Piet” (“Black Pete”), a companion of St. Nicholas whose depiction sparks cultural debate every year.
The Oversight Board sided with the social network, with a majority of members citing “racial stereotypes” and finding “sufficient evidence of harm to justify removal.”