Instagram Bans ‘Graphic’ Self-Harm Pictures to Prevent Teen Suicide

Teens using Instagram (Drew Angerer/Getty)

In a press release Thursday, the Facebook-owned platform Instagram announced it would start banning “graphic” self-harm pictures uploaded to the platform.

“At Instagram, nothing is more important to us than the safety of the people in our community. Over the past month we have seen that we are not where we need to be on self-harm and suicide, and that we need to do more to keep the most vulnerable people who use Instagram safe,” declared Head of Instagram Adam Mosseri in the press release. “That’s why today, following a comprehensive review with global experts and academics on youth, mental health and suicide prevention, we’re announcing further changes to our approach on self-harm content.”

Mosseri announced that the Facebook-owned photo-sharing platform would no longer “allow any graphic images of self-harm, such as cutting on Instagram – even if it would previously have been allowed as admission.”

“We have never allowed posts that promote or encourage suicide or self-harm, and will continue to remove it when reported,” Mosseri proclaimed.

Mosseri also declared that the platform “will not show non-graphic, self-harm related content – such as healed scars – in search, hashtags and the explore tab, and we won’t be recommending it.”

“We are not removing this type of content from Instagram entirely, as we don’t want to stigmatize or isolate people who may be in distress and posting self-harm related content as a cry for help,” he explained. “We want to support people in their time of need – so we are also focused on getting more resources to people posting and searching for self-harm related content and directing them to organizations that can help.”

Mosseri also claimed that Instagram will continue “to consult with experts to find out what more we can do.”

Earlier this week, Mosseri wrote an article for the Telegraph admitting that Instagram has failed to protect vulnerable users from posts promoting self-harm and suicide, and revealing the platform’s plan to introduce “sensitivity screens” to self-harm posts.

Mosseri did not say at the time that Instagram would outright ban “graphic” self-harm content.

In the article, Mosseri referenced 14-year-old Instagram user Molly Russell, who took her own life in 2017.

According to the BBC, Russell’s family “looked into her Instagram account” following her suicide and “found distressing material about depression and suicide,” prompting Russell’s father to claim Instagram “helped kill my daughter.”

Charlie Nash is a reporter for Breitbart Tech. You can follow him on Twitter @MrNashington, or like his page at Facebook.
