Facebook’s content moderators are reportedly told to “err on the side of an adult” when they cannot determine the age of a victim in potential child sexual abuse material (CSAM).
A recent report from the New York Times reveals that the training documents Facebook provides to its content moderators instruct them to make this assumption whenever they don’t know the age of an individual shown in a photo or video suspected of being CSAM.
Companies like Facebook are required by law to report child sexual abuse material found on their platforms to the National Center for Missing and Exploited Children (NCMEC). Many tech firms employ content moderators to review content flagged as potential CSAM.
The Verge reports that the “err on the side of an adult” policy was written for Facebook’s content moderators at the third-party contractor Accenture, and that it is described in a California Law Review article from August, which states:
Interviewees also described a policy called “bumping up,” which each of them personally disagreed with. The policy applies when a content moderator is unable to readily determine whether the subject in a suspected CSAM photo is a minor (“B”) or an adult (“C”). In such situations, content moderators are instructed to assume the subject is an adult, thereby allowing more images to go unreported to NCMEC.
Facebook’s head of safety, Antigone Davis, confirmed the policy to the Times, saying it is intended to account for the privacy of adults who post sexual imagery of themselves. “The sexual abuse of children online is abhorrent,” Davis said, emphasizing that the company employs a rigorous review process and flags more potential CSAM than any other company.
Read more at the New York Times here.