Predator’s Paradise: Facebook Faces Criticism over Sexual Comments on Underage Instagram Photos

Facebook's Mark Zuckerberg askew on a TV (MANDEL NGAN/Getty)

Facebook-owned Instagram is facing intense criticism for failing to remove accounts that post photos of children in swimwear or little clothing, images that attract hundreds of sexualized comments from sickos who feel free to use Mark Zuckerberg's platform to share their interests with like-minded perverts.

The Guardian reports that the Facebook-owned photo-sharing app Instagram is failing to remove accounts that post pictures of children in swimwear or partial states of dress, pictures that attract hundreds of sexualized comments. Facebook (now known as Meta) claims that it takes a zero-tolerance approach to child exploitation, but accounts that have been flagged as suspicious have remained on the platform and been deemed acceptable by the company's automated moderation system.

Mark Zuckerberg discusses Instagram (AFP/Getty)

One researcher reported an account posting photos of children in sexualized poses. Instagram responded the same day, saying that due to “high volume” it had not been able to review the report but that its “technology has found that this account probably doesn’t go against our community guidelines.” The account remained live with more than 33,000 followers.

The accounts are reportedly often used for “breadcrumbing,” where those seeking child sexual abuse images will post technically legal images to attract other internet predators who then arrange to meet up in private messaging groups to share even more extreme content.

Andy Burrows, the head of online safety policy at the NSPCC, said that the accounts are like a “shop window” for pedophiles. He commented: “Companies should be proactively identifying this content and then removing it themselves. But even when it is reported to them, they are judging that it’s not a threat to children and should remain on the site.”

Facebook said that it has strict rules against content that sexually exploits or endangers children and that it removes it when made aware of it. A spokesperson commented: “We’re also focused on preventing harm by banning suspicious profiles, restricting adults from messaging children they’re not connected with and defaulting under-18s to private accounts.”

Breitbart News recently reported that Facebook’s training documents for the company’s content moderators instruct them to “err on the side of an adult” when they don’t know the age of an individual shown in a photo or video suspected of being child sexual abuse material (CSAM).

Facebook’s head of safety, Antigone Davis, told the New York Times that the policy is as described and that it attempts to take into account the privacy concerns of those posting sexual imagery of adults. Davis told the Times: “The sexual abuse of children online is abhorrent,” and emphasized that the company employs a rigorous review process that flags more potential CSAM than any other company.

Read more at the Guardian here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan or email him at lnolan@breitbart.com
