Instagram Expands Ban on Suicide Content to Include Cartoons, Memes

Facebook-owned image sharing platform Instagram has expanded its ban on suicide- and self-harm-related content to include cartoon and meme imagery, TechCrunch reports. Instagram head Adam Mosseri explained the decision in a company blog post, stating:

This past month, we further expanded our policies to prohibit more types of self-harm and suicide content. We will no longer allow fictional depictions of self-harm or suicide on Instagram, such as drawings or memes or content from films or comics that use graphic imagery. We will also remove other imagery that may not show self-harm or suicide, but does include associated materials or methods.

Mosseri met with the UK’s health secretary earlier this year to discuss how Instagram handles self-harm content. Instagram has faced pressure in the UK to change its policies following a public outcry over the suicide of a 14-year-old schoolgirl who had viewed suicide-related content on the platform. Facebook announced that it would prohibit self-harm-related images, such as those depicting cutting, and would restrict access to photos and videos of healed scars by not recommending them in searches.

Instagram stated that it has doubled the amount of self-harm-related content it takes action on since the policy change, with Mosseri noting that in the three months following the ban on graphic images of cutting, the platform “removed, reduced the visibility of, or added sensitivity screens to more than 834,000 pieces of content.”

Mosseri told BBC News: “It will take time to fully implement. It’s not going to be the last step we take.” He stated in the blog post that the new policy is “based on expert advice from academics and mental health organizations like the Samaritans in the UK and National Suicide Prevention Lifeline in the US,” saying: “We aim to strike the difficult balance between allowing people to share their mental health experiences while also protecting others from being exposed to potentially harmful content.”

He added: “Accounts sharing this type of content will also not be recommended in search or in our discovery surfaces, like Explore. And we’ll send more people more resources with localized helplines like the Samaritans and PAPYRUS in the UK or the National Suicide Prevention Lifeline and The Trevor Project in the United States.”

Mosseri stated that these are complex issues that “no single company or set of policies and practices alone can solve.” He added: “But getting our approach right requires more than a single change to our policies or a one-time update to our technology. Our work here is never done. Our policies and technology have to evolve as new trends emerge and behaviors change.”

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan or email him at lnolan@breitbart.com
