A UK coroner recently listed the Instagram algorithm as a contributing factor in the death of a 14-year-old girl who took her own life in 2017 after viewing thousands of posts on the platform promoting self-harm. The coroner described the teen’s death as “an act of self-harm whilst suffering from depression and the negative effects of online content.”

Bloomberg reports that during a court hearing in London last week, a coroner named Andrew Walker began the task of determining how responsible social media algorithms are for harm to the mental health of underage users.

Walker was tasked with determining the full causes of death for a 14-year-old girl named Molly Russell, who took her own life in 2017 after viewing thousands of posts promoting self-harm on platforms such as Meta-owned Instagram and Pinterest. At one point during the inquest, Walker described the content that Russell liked or saved as so disturbing that he found it “almost impossible to watch.”


Walker concluded that Russell’s death could not be ruled a suicide, instead describing her cause of death as “an act of self-harm whilst suffering from depression and the negative effects of online content.”

Walker based his conclusion on Russell’s “prolific” use of Instagram: she liked, shared, or saved 16,300 posts in the six months leading up to her death, and saved a further 5,739 pins on Pinterest over the same period.

Walker stated: “The platforms operated in such a way using algorithms as to result, in some circumstances, of binge periods of images, video clips and text,” which “romanticized acts of self-harm” and “sought to isolate and discourage discussion with those who may have been able to help.”

Following Walker’s ruling, Russell’s family issued a statement, telling Ars Technica: “This past fortnight has been particularly painful for our family. We’re missing Molly more agonizingly than usual, but we hope that the scrutiny this case has received will help prevent similar deaths encouraged by the disturbing content that is still to this day available on social media platforms including those run by Meta.”

Oliver Sanders, the family’s lawyer, has requested that Walker “send instructions on how to prevent this happening again to Pinterest, Meta, the UK government, and the communications regulator.”

Internal Facebook documents show that the company knows the Instagram platform is toxic for teen girls.

As the Wall Street Journal reported:

“Thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse,” the researchers said in a March 2020 slide presentation posted to Facebook’s internal message board, reviewed by The Wall Street Journal. “Comparisons on Instagram can change how young women view and describe themselves.”

“We make body image issues worse for one in three teen girls,” said one slide from 2019, summarizing research about teen girls who experience the issues.

“Teens blame Instagram for increases in the rate of anxiety and depression,” said another slide. “This reaction was unprompted and consistent across all groups.”

Among teens who reported suicidal thoughts, 13% of British users and 6% of American users traced the desire to kill themselves to Instagram, one presentation showed.

Read more at Bloomberg here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan