Report: Facebook Moderators Use Drugs to Escape ‘Misery’ of Platform that Gives Them PTSD


Some Facebook content moderators have resorted to using drugs on their breaks to escape the “misery” of reviewing traumatic images on the platform, a job that has led some to develop PTSD-like symptoms after they leave the social media giant.

In an article, the Verge documented the work lives of Facebook content moderators working at the contractor Cognizant, and detailed the problems that the moderators face.

“The moderators told me it’s a place where the conspiracy videos and memes that they see each day gradually lead them to embrace fringe views,” reported the Verge, Monday. “One auditor walks the floor promoting the idea that the Earth is flat. A former employee told me he has begun to question certain aspects of the Holocaust. Another former employee, who told me he has mapped every escape route out of his house and sleeps with a gun at his side, said: ‘I no longer believe 9/11 was a terrorist attack.'”

According to the Verge, moderators have also been “found having sex inside stairwells and a room reserved for lactating mothers,” eager “for a dopamine rush amid the misery,” and moderators “cope with seeing traumatic images and videos by telling dark jokes about committing suicide, then smoking weed during breaks to numb their emotions.”

“Moderators are routinely high at work,” the Verge claimed, adding, “Employees are developing PTSD-like symptoms after they leave the company, but are no longer eligible for any support from Facebook or Cognizant.”

In September, it was reported that a former Facebook content moderator was suing the social network for allegedly giving her PTSD and psychological trauma, and in July, an undercover reporter claimed the company failed to remove examples of child abuse.

In 2017, the Guardian revealed that a Facebook bug “exposed the identities of moderators to potential terrorists.”

Charlie Nash is a reporter for Breitbart Tech. You can follow him on Twitter @MrNashington, or like his page at Facebook.

