Facebook has released a lengthy 22-point document that explains more fully what its “community standards” are—in short, what is and isn’t allowed on the platform.
Facebook representatives declined to respond to Ars’ request for comment on the record, insisting that we speak to them only on background. It is not clear why the company, after 14 years, is finally releasing its guidelines now.
Last year, ProPublica obtained a slide deck outlining some of the mystifying rules, which allowed, for instance, attacks on a subset of a group (“radical Muslims” or “white female drivers”) but not larger groups with immutable characteristics (“all men”).
The new rubric, which was released Tuesday, attempts to explain in plain English and in more detail how Facebook thinks about its seemingly reluctant role as a regulator. Other tech companies, notably YouTube, have also been trying to make their internal standards processes clearer to their user bases.
“We do not tolerate harassment on Facebook,” the company wrote in one of the sections.
“We want people to feel safe to engage and connect with their community. Our harassment policy applies to both public and private individuals because we want to prevent unwanted or malicious contact on the platform. Context and intent matter, and we allow people to share and re-share posts if it is clear that something was shared in order to condemn or draw attention to harassment. In addition to reporting such behavior and content, we encourage people to use tools available on Facebook to help protect against it.”
The company also noted that, for the first time, it will add a formal appeals process for users to challenge content removals.
“Over the coming year, we are going to build out the ability for people to appeal our decisions,” Monika Bickert, vice president of global policy management, wrote in a blog post. “As a first step, we are launching appeals for posts that were removed for nudity / sexual activity, hate speech or graphic violence.”