Report: Leaked Documents Explain Facebook’s Content Moderation System


Recently leaked internal documents partially explain Facebook’s muddled process for deciding what content on its platform counts as “hate speech.”

Motherboard recently published a long-form article outlining how Facebook moderates content on its platform. Motherboard spoke to a number of current and former Facebook employees, including Dave Willner, Facebook’s first head of content policy, about content moderation on the site.

Willner described the early days of Facebook: “Originally, it was this guy named Karthi who looked at all the porn, but even from the beginning the moderation teams were fairly serious,” Willner told Motherboard. “It was a college site, and the concerns were largely taking down the dick pics, plus some spam and phishing. When I was joining the moderation team, there was a single page of prohibitions and it said ‘take down anything that makes you feel uncomfortable.’ There was basically this Word document with ‘Hitler is bad and so is not wearing pants.’”

In 2009, the company had only 12 people moderating the content of more than 120 million users around the world. It took a paramilitary organization in Iran killing a protester and posting a video of the execution to the platform for Facebook to begin expanding its content moderation team, though Facebook chose to leave the video on the platform. Willner served as Facebook’s head of content policy between 2010 and 2013, a period in which he said Facebook’s role in the world became clearer: “If we’re into people sharing and making the world more connected, this is what matters. That was a moment that crystallized: ‘This is the site.’ It’s not ‘I had eggs today,’ it’s the hard stuff.”

As Facebook grew, the website found itself dealing with complex geopolitical situations, in-depth histories, and confusing contexts, which made deciding what should be allowed on the site an extremely challenging task: “If you say, ‘Why doesn’t it say use your judgment?’ We A/B tested that,” Willner said. “If you take two moderators and compare their answers double-blind, they don’t agree with each other a lot. If you can’t ensure consistency, Facebook functionally has no policy.”

According to some employees, Facebook preferred to take a wider view of content on the platform, choosing to protect the Facebook ideology and mission over defining what specifically counts as hate speech: “Facebook would tend to take a much more philosophical/rational approach to these kinds of questions, less humanistic/emotional,” one former employee told Motherboard. “Engineering mindset tended to prevail, zooming out from strong feelings about special cases, trying to come up with general, abstract, systems-level solutions. Of course, the employees are all people too. But the point is that the company tries to solve things at a systems-scale.”

Motherboard has published some of Facebook’s internal training documents relating to hate speech and various other content moderation issues, including a reproduction of a Facebook training slide showing how Facebook defines and deals with hate speech.

Monika Bickert, Facebook’s head of global policy management, told Motherboard that Facebook often has to tailor policy enforcement to individual countries’ laws, which is not an easy feat given the hate speech restrictions in certain countries. “We have a global policy, but we want to be mindful of how speech that violates that policy manifests itself in a certain location versus others,” she said, adding that Facebook has ongoing policy conversations about hate speech and hate organizations. “We’re constantly looking at how do we refine [policies], or add nuance, or change them as circumstances change.”

Nine countries, including France, Germany, Austria, Israel, and Slovakia, legally pursue Holocaust denial. Facebook blocks users with IP addresses from these countries from viewing Holocaust denial content on its platform, and it respects local laws “when the government actively pursues their enforcement,” according to internal documents.
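As a rough illustration of how that kind of country-specific blocking could work, here is a minimal Python sketch. It is hypothetical rather than Facebook’s actual implementation; the country list, the is_visible function, and the "holocaust_denial" category label are assumptions made purely for the example.

    # Hypothetical sketch of geo-restricting a content category by viewer country.
    # Countries (ISO 3166-1 alpha-2) where, per the leaked documents, the ban is
    # enforced when the government actively pursues enforcement (illustrative subset).
    ENFORCING_COUNTRIES = {"FR", "DE", "AT", "IL", "SK"}

    def is_visible(content_category: str, viewer_country: str) -> bool:
        """Return True if content of this category may be shown to a viewer
        whose IP address geolocates to viewer_country."""
        if content_category == "holocaust_denial" and viewer_country in ENFORCING_COUNTRIES:
            return False  # hidden only for viewers in countries that enforce the ban
        return True       # otherwise left up under the global policy

    # The same post would be hidden in Germany but visible in the United States.
    assert is_visible("holocaust_denial", "DE") is False
    assert is_visible("holocaust_denial", "US") is True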

Motherboard’s investigation into Facebook’s hate speech rules can be found here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan_ or email him at lnolan@breitbart.com
