Facebook plans launch of its own “Supreme Court” for handling takedown appeals

“If someone disagrees with a decision we’ve made, they can appeal to us first, and soon they will be able to further appeal this to the independent board,” company CEO Mark Zuckerberg wrote in a letter (PDF). “As an independent organization, we hope it gives people confidence that their views will be heard and that Facebook doesn’t have the ultimate power over their expression.”

The board will launch with at least 11 members and should eventually grow to 40. The entity will contract its services to Facebook. Participants will serve a maximum of three three-year terms each and will be paid for their time. Their decisions will “be made publicly available and archived in a database of case decisions,” with details subject to certain data or privacy restrictions. Facebook can also contact the board for an “automatic and expedited review” in exceptional circumstances, “when content could result in urgent real world consequences,” such as a mass murderer livestreaming his crimes.

The panel’s decisions will be binding, Facebook added, and the company will implement its findings promptly, “unless implementation of a resolution could violate the law.”

Trans-what-ency?

Today, Facebook has a few problems with content moderation. The first is, simply, scale. The site boasts more than 2.4 billion monthly active users and even if only half of those accounts posted content, the result would still be a mind-boggling volume of photos, memes, videos, and text. Keeping up with content that other users flag and report is an utterly thankless task that has lingering harmful effects on the people who do it.

The size of the job is only one challenge, though. The rules themselves are another. Facebook has had an extremely difficult time in recent years trying to write universal guidelines defining hateful or threatening speech. Attempts to unify its policies have resulted in some frankly bonkers outcomes. The site infamously removed the Declaration of Independence in 2018, flagging it as hate speech, two years after sparking criticism for pulling down a famous, Pulitzer-winning photograph from the Vietnam War. Documents obtained by ProPublica in 2017 revealed that while “black children” did not make the cut as a protected category on Facebook, “white men” explicitly did, heightening criticism that Facebook policy operated outside of any real-world understanding of hate speech and Internet violence.

Even more challenging for literally billions of users: Despite Facebook’s existing community guidelines, finding out why a post has been blocked or deleted, or appealing that decision, has proven to be an enormous black box. Facebook only instituted an appeals process in April 2018, and even that process has left many users wondering why a post they reported was not removed, or why a post they made was. Several months later, a large coalition of digital and civil rights groups called on the company to make its appeals process transparent to users, as well as to issue regular transparency reports about “community standards enforcement.”

That’s where the new board comes in, theoretically. Zuckerberg first suggested in early 2018 that content decisions could rest with such a body, telling Vox, “You can imagine some sort of structure, almost like a Supreme Court, that is made up of independent folks who don’t work for Facebook, who ultimately make the final judgment call on what should be acceptable speech in a community that reflects the social norms and values of people all around the world.”

This time we mean it

To its credit, Facebook has started issuing regular transparency reports showing how many posts were removed in a given quarter for being spam, fake accounts, hate speech, terrorist propaganda, harassment, or child sexual exploitation. Quantifying the problem, however, has not solved it.

Facebook executive Monika Bickert was joined earlier today by counterparts from Twitter and Google, as well as a representative from the Anti-Defamation League, to testify before the Senate regarding the prevalence of extremism on their platforms. Bickert in her written testimony (PDF) spoke extensively of Facebook’s efforts to combat hate speech, extremism, and terrorist content on the site. The company currently has 30,000 employees working on content moderation around the world, Bickert said. Those employees review content in more than 50 languages, in shifts, 24 hours a day.

This leads to an obvious question: If 30,000 moderators, all of whom are already badly overworked and struggling as the system changes endlessly around them, can’t keep up with everything… why should a panel of 40 people, at first chosen by Facebook but operating independently of it, be able to stem the tide any better?

In this sense, perhaps Zuckerberg’s Supreme Court metaphor is apt. The panel will not by any stretch of the imagination be able to take on all cases brought before it. Staff will pick and choose which cases deserve appeal, according to a flowchart provided by Facebook.

Some appeals will be heard; others, ignored. Initially, all cases will be referred to the board by Facebook itself, and users will not be able to submit their own appeals for more than a year. Users, then, still don’t necessarily have a fair mechanism for bringing forth complaints.
