Microsoft Employees Sue Company over PTSD Suffered from Filtering Child Porn, Murder


Two members of Microsoft’s Online Safety Team are suing the company, citing post-traumatic stress and auditory hallucinations after being made to filter through and remove videos and pictures of child porn, murder, and bestiality as part of their job.

“Members of Microsoft’s Online Safety Team had ‘God-like’ status, former employees Henry Soto and Greg Blauert allege in a lawsuit filed on Dec. 30. They ‘could literally view any customer’s communications at any time,'” reported the Daily Beast. “Specifically, they were asked to screen Microsoft users’ communications for child pornography and evidence of other crimes.”

“But Big Brother didn’t offer a good health care plan, the Microsoft employees allege. After years of being made to watch the ‘most twisted’ videos on the internet, employees said they suffered severe psychological distress, while the company allegedly refused to provide a specially trained therapist or to pay for therapy,” the report alleges. “The two former employees and their families are suing for damages from what they describe as permanent psychological injuries, for which they were denied worker’s compensation.”

Soto, who was one of the first employees in the department, claims to have been “involuntarily transferred” to the position, adding that he was “not informed prior to the transfer as to the full nature” of the job.

Soto was also required to remain in the position for at least a year and a half before he could request another transfer, the standard practice for any position at Microsoft.

As part of the job, Soto claimed that he had to watch and filter through “horrible brutality, murder, indescribable sexual assaults, videos of humans dying and, in general, videos and photographs designed to entertain the most twisted and sick-minded people in the world.”

“Many people simply cannot imagine what Mr. Soto had to view on a daily basis as most people do not understand how horrible and inhumane the worst people in the world can be,” proclaimed the lawsuit, adding, “He had trouble with sleep disturbance, [and] nightmares.”

“He suffered from an internal video screen in his head and could see disturbing images, he suffered from irritability, increased startle, anticipatory anxiety, and was easily distractible,” the lawsuit claims.

Microsoft defended the department and its employees, citing the importance of the job in an email.

“Microsoft applies industry-leading, cutting-edge technology to help detect and classify illegal images of child abuse and exploitation that are shared by users on Microsoft Services,” responded a Microsoft spokesperson to the Daily Beast. “Once verified by a specially trained employee, the company removes the image, reports it to the National Center for Missing & Exploited Children, and bans the users who shared the images from our services. We have put in place robust wellness programs to ensure the employees who handle this material have the resources and support they need.”

However, the “wellness program” allegedly consisted of just an “under-trained counselor,” whose involvement resulted in a diagnosis of “compassion fatigue,” rather than the fully fledged mental health program that was allegedly available to Microsoft’s Digital Crimes Unit.

Greg Blauert, the other plaintiff in the lawsuit, cited similar concerns about the job, declaring that he had to sift through “thousands of images of child pornography, adult pornography and bestiality that graphically depicted the violence and depravity of the perpetrators.”

“He began experiencing nightmares and intrusive images,” claimed the Daily Beast. “If he or a co-worker broke down at work, their employers allegedly encouraged them to merely ‘leave work early’ as part of the department’s ‘Wellness Plan.'”

Both men have been diagnosed with PTSD and are currently on leave from work.

Charlie Nash is a reporter for Breitbart Tech. You can follow him on Twitter @MrNashington or like his page at Facebook.
