Facebook Publishes 27-Page Content Moderation Guide


Facebook has published a 27-page document outlining how the company moderates content on their platform, along with a new appeals process for users who believe their content was unfairly deleted.

In a surprise attempt at transparency, Facebook has revealed their 27-page community standards document, which outlines what content is banned from the platform and why user accounts may be suspended for publishing certain content. This appears to be another push by Facebook to show just how much content they have to moderate, following recent criticism from Congress over the alleged limiting of certain Facebook pages, such as that of conservative YouTube stars Diamond and Silk.

“I’ve been wanting to do this for a long time,” said Monika Bickert, Facebook’s Head of Global Policy Management, according to Wired. Facebook’s decision to publish their content guidelines appears to be an effort to show the public that the company does have set rules and guidelines for what it allows on the platform. “I have actually had conversations where I talked about our standards and people said, ‘I didn’t actually realize you guys have policies,'” said Bickert.

According to the content policy, Facebook leans towards allowing content to stay on the platform rather than removing it. “We err on the side of allowing content, even when some find it objectionable, unless removing that content prevents a specific harm,” the community standards document states. The document released today clearly explains a number of situations in which content should be removed; for example, videos that show “tossing, rotating, or shaking of an infant too young to stand by their wrists, ankles, arms, legs, or neck” should be removed from the platform. Similarly, links to psychological resources and help centers must be provided to anyone who posts “images where more than one cut of self mutilation is present on a body part and the primary subject of the image is one or more unhealed cuts.”

A whole section of the community standards document is dedicated to “hate speech.” The section reads:

We do not allow hate speech on Facebook because it creates an environment of intimidation and exclusion and in some cases may promote real-world violence.

We define hate speech as a direct attack on people based on what we call protected characteristics — race, ethnicity, national origin, religious affiliation, sexual orientation, sex, gender, gender identity, and serious disability or disease. We also provide some protections for immigration status. We define attack as violent or dehumanizing speech, statements of inferiority, or calls for exclusion or segregation. We separate attacks into three tiers of severity, as described below.

Sometimes people share content containing someone else’s hate speech for the purpose of raising awareness or educating others. Similarly, in some cases, words or terms that might otherwise violate our standards are used self-referentially or in an empowering way. When this is the case, we allow the content, but we expect people to clearly indicate their intent, which helps us better understand why they shared it. Where the intention is unclear, we may remove the content.

We allow humor and social commentary related to these topics. In addition, we believe that people are more responsible when they share this kind of commentary using their authentic identity.

In another section, Facebook outlines how they’re attempting to fight fake news (a rough sketch of the demotion pattern these points describe follows the list):

Disrupting economic incentives for people, Pages, and domains that propagate misinformation

Using various signals, including feedback from our community, to inform a machine learning model that predicts which stories may be false

Reducing the distribution of content rated as false by independent third-party fact-checkers

Empowering people to decide for themselves what to read, trust, and share by informing them with more context and promoting news literacy

Collaborating with academics and other organizations to help solve this challenging issue
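
Facebook has not published the code behind this system, but the second and third points describe a familiar ranking pattern: a model scores each story using signals such as user reports, and stories predicted or confirmed to be false are demoted in the News Feed rather than deleted. The following is a minimal sketch of that demotion logic in Python; every name, signal, weight, and threshold here is hypothetical.

```python
# Illustrative sketch only -- Facebook has not published this system's code.
# All signal names, weights, and thresholds below are hypothetical.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Story:
    url: str
    base_rank: float           # score from the normal News Feed ranking
    user_reports: int          # the "feedback from our community" signal
    fact_check: Optional[str]  # e.g. "false" from a third-party fact-checker

def predicted_false_probability(story: Story) -> float:
    """Stand-in for the machine learning model that 'predicts which
    stories may be false' -- here just a toy function of one signal."""
    return min(1.0, story.user_reports / 100.0)

def adjusted_rank(story: Story) -> float:
    """Reduce the distribution of suspect content instead of removing it."""
    rank = story.base_rank
    if story.fact_check == "false":
        rank *= 0.2  # heavy demotion for fact-checked falsehoods
    elif predicted_false_probability(story) > 0.8:
        rank *= 0.5  # softer demotion on a model prediction alone
    return rank

stories = [
    Story("example.com/a", 1.0, user_reports=3, fact_check=None),
    Story("example.com/b", 1.0, user_reports=95, fact_check=None),
    Story("example.com/c", 1.0, user_reports=10, fact_check="false"),
]
for s in sorted(stories, key=adjusted_rank, reverse=True):
    print(f"{s.url}: adjusted rank {adjusted_rank(s):.2f}")
```

Note that in this pattern nothing is removed: the penalty only shrinks how widely a story is distributed, which is consistent with Facebook’s stated preference for allowing content to stay up rather than deleting it.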

Facebook’s full community standards can be read here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan_ or email him at lnolan@breitbart.com 
