Unredacted Complaint Reveals Horrifying Level of Pedos Sexually Targeting Children on Instagram, Facebook

Mark Zuckerberg (Drew Angerer /Getty)

A New Mexico complaint against Mark Zuckerberg’s Meta has been unredacted to reveal horrifying claims of sexual harassment against children on Facebook and Instagram. One internal presentation estimated that 100,000 children a day are targeted by pedophiles on Zuckerberg’s platforms, including by receiving “pictures of adult genitalia.”

Digital Content Next CEO Jason Kint pointed out in an X/Twitter thread that the “most redactions just now removed in the New Mexico attorney general’s complaint vs Instagram and Facebook” show that “it is even more shocking, infuriating and stomach turning” than people may have realized. The mostly unredacted complaint is available here.

In one highlighted section, the complaint claims that “100,000 children per day received online sexual harassment, such as pictures of adult genitalia.”

The complaint goes on to state the following:

One internal document shows Meta scrambling in 2020 to address an Apple executive whose 12-year-old was solicited on the platform, noting, “this is the kind of thing that pisses Apple off to the extent of threatening to remove us from the App Store,” asking whether there was a timeline for when “we’ll stop adults from messaging minors on IG Direct,” and noting that if they did not address other accounts with “sugardaddy” — “they will reply with 100 more accounts if we’re not able to take them down.”

“A May 2018 presentation to Meta’s Audit Committee confirms this fact: ‘While user-provided data indicates a decline in usage among young users, this age data is unavailable because a disproportionate number of our young users register with an inaccurate age,'” the suit continues.

“Two years later, a January 2020 presentation entitled, ‘Succeeding in US Messaging’ demonstrated both the depth of Meta’s knowledge regarding usage of Messenger by children and its ambitions to leverage that usage to further entice younger generations to use its products,” the complaint reads.

The complaint goes on to say Meta is aware that its platforms are popular with children as young as age six:

One of Meta’s “End Game” goals was to “become the primary kid messaging app in the US by 2022.” The document confirmed that “in the US — Messenger is popular… with Kids (13% primacy, US 510).” Meta’s knowledge that its platforms were used by and “popular with” children as young as 6-years-old makes its failures to protect minors against CSAM and solicitation all the more egregious.

Elsewhere in the thread, Kint points out an internal conversation highlighted in the complaint, in which one Meta employee states that nothing is being done to counter child grooming on the company’s social media platforms, adding, “I’d argue we’re making it worse.”

“What specifically are we doing for child grooming,” an employee asks, to which a second employee replies, “Somewhere between zero and negligible. Child safety is an explicit non-goal this half. I’d argue we’re making it worse with Interop, but that’s a can of worms.”

Moreover, Meta’s “People You May Know” (PYMK) feature allegedly contributed to 75 percent of all inappropriate adult-minor contact and “had a direct link to trafficking,” prompting employees to ask, “How on earth have we not just turned off PYMK between adults and children?” the complaint added.

Another apparent internal conversation showed an employee saying, “teenage self harm and suicide are so difficult to explain publicly that our current response looks convoluted and evasive… The fact that we have age limits which are unenforced (unenforceable?) and that there are, as I understand it, important differences in the stringency of our policies on IG vs Blue App [Facebook] makes it difficult to claim we are doing all we can.”

After that, someone else chimed in, asking whether Meta could improve its policies or whether it was a question of enforcement, adding, “We can definitely say that we need to improve our enforcement or our policies.”

Another chart highlighted in the complaint appears to show that Instagram’s own research found that 13 to 15-year-old users were more likely than average to be exposed to adult nudity and sexual activity.

You can follow Alana Mastrangelo on Facebook and X/Twitter at @ARmastrangelo, and on Instagram.
