Shock Claim: Facebook Moderators Told to ‘Err on the Side of an Adult’ with Potential Child Sexual Abuse Material

Mark Zuckerberg smiles during testimony (Pool/Getty)

According to recent reports, Facebook’s content moderators are told to “err on the side of an adult” when unsure of the age of a victim in potential child sexual abuse material (CSAM).

A recent report from the New York Times reveals that Facebook’s training documents instruct the company’s content moderators to “err on the side of an adult” when they cannot determine the age of an individual shown in a photo or video suspected of being CSAM.

Companies like Facebook are required to report child sexual abuse material found on their platforms to the National Center for Missing and Exploited Children (NCMEC). Many tech firms employ content moderators to review content flagged as potential CSAM.

The Verge reports that the policy instructing Mark Zuckerberg’s moderators to “err on the side of an adult” was created for Facebook’s content moderators at the third-party contractor Accenture, and is mentioned in an August California Law Review article, which states:

Interviewees also described a policy called “bumping up,” which each of them personally disagreed with. The policy applies when a content moderator is unable to readily determine whether the subject in a suspected CSAM photo is a minor (“B”) or an adult (“C”). In such situations, content moderators are instructed to assume the subject is an adult, thereby allowing more images to go unreported to NCMEC.

Facebook’s head of safety, Antigone Davis, confirmed the policy to the Times and said it is designed to account for the privacy of those posting sexual imagery of adults. “The sexual abuse of children online is abhorrent,” Davis told the NYT, emphasizing that the company employs a rigorous review process and flags more potential CSAM than any other company.

Read more at the New York Times here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan or email him at lnolan@breitbart.com
