By Rob Lever
Washington - Facebook's independent oversight board rules Wednesday on the platform's ban of former US president Donald Trump in a case that could set a precedent for how social media handles harmful content from world leaders.
The ruling, set for release at 1300 GMT Wednesday, is likely to be a defining moment for the leading social network's so-called "supreme court" envisioned by company founder Mark Zuckerberg to make thorny decisions on what to allow or remove from Facebook.
The oversight board, which makes decisions that are binding on Facebook and cannot be appealed, will rule on whether to leave the Trump ban in place or allow him back on the platform. It may also make related recommendations to the California-based social media giant.
"This is a huge decision, it's getting a lot of attention and deservedly so," said Daniel Kreiss, University of North Carolina professor and researcher specializing in politics and social media.
"This is significant for the global precedent it will set. If they uphold the ruling, I think you will see more robust enforcement around the world."
The oversight panel, made up of jurists, policy experts, journalists and others from across the globe, will make perhaps its most significant decision at a time when social platforms are struggling to remain open to political discourse while filtering out incitements to violence, misinformation and abusive comments.
"This Facebook oversight decision is a key litmus test for whether country-sized technology companies can effectively regulate themselves," said Lindsay Gorman, emerging technologies fellow at the nonprofit Alliance for Securing Democracy.
The oversight board is Facebook's "best attempt to stave off looming government regulation," Gorman added.
- Too late? -
Trump was suspended from Facebook after he posted a video during the deadly January 6 rampage by his supporters at the US Capitol, telling them: "We love you, you're very special."
Facebook made the suspension indefinite the following day, and Trump was also taken off other platforms including Twitter and YouTube.
Some analysts said Facebook and other social networks should have acted on Trump sooner, after years of giving him an exemption from rules on hateful content because of his "newsworthiness" as a political leader.
"If anything, the decision to originally ban... Trump should have come much earlier," said Samuel Woolley, a University of Texas professor specializing in computational propaganda.
"He was using Facebook and other platforms to actively spread patently false content about electoral processes -- very effectively undermining US democracy."
Trump has remained undeterred by the bans, sending out frequent statements via email and on Tuesday launching a blog-like website that he billed as "a beacon of freedom" and a "place to speak freely and safely."
Visitors to the website can like Trump's posts and repost them on Facebook and Twitter.
Facebook itself referred Trump's case to the oversight board, in line with its stance that company executives should not be the ones making key decisions on content and political speech. The panel has received more than 9,000 comments in the case.
But the move by Facebook and others has drawn a torrent of criticism from Trump supporters, who argue that large tech platforms are biased and stifle opposing views.
The ban has also sparked concern from others, including German Chancellor Angela Merkel, who called Facebook's move "problematic," and from civil liberties activists.
Jameel Jaffer, executive director of Columbia University's Knight First Amendment Institute, said the issue is more complex than simply evaluating Trump's comments.
"I'm hopeful the board will use this case as an opportunity to put a spotlight on Facebook's decisions about the design of its platform," Jaffer said.
"These engineering decisions are often invisible, but they determine which speech proliferates on the platform, how quickly it spreads, who sees it, and in what context they see it."
In its submission to the board, the institute said Facebook should conduct "an independent study of how its platform may have contributed to the events of January 6" and that the panel should rule on Trump "only after the company has provided it with the results of that study."
Elizabeth Renieris, director of the Notre Dame-IBM Technology Ethics Lab, said the ruling is unlikely to end the controversy about content moderation.
"The board's analysis and reasoning in this instance could very well help shape the policies of Facebook and other digital platforms regarding how to treat political leaders and other public figures in the future," she said.
"Whatever the decision, we should remain uneasy about the fact that decisions of this nature are being made by unelected, unaccountable corporations and their self-appointed assessors."
Agence France-Presse