Facebook Begins to Explain the ‘Problematic’ Content It ‘Demotes’

Mark Zuckerberg surrounded by guards (Chip Somodevilla/Getty)

Social media giant Facebook is beginning to reveal more about the types of content that the company considers “problematic,” which the Masters of the Universe then “demote” on the News Feed, ensuring that fewer people ever come across the content.

The Verge reports that Facebook is attempting to provide users with more information relating to the company’s moderation of the News Feed and the types of content it demotes rather than removes entirely. This week, Facebook published its new “Content Distribution Guidelines” which detail around three dozen types of posts that it demotes from the News Feed for various reasons.

A lit sign is seen at the entrance to Facebook’s corporate headquarters in Menlo Park, California, on March 21, 2018. Facebook chief Mark Zuckerberg vowed on March 21 to “step up” to fix problems at the social media giant, as it fights a snowballing scandal over the hijacking of personal data from millions of its users. (JOSH EDELSON/AFP/Getty Images)

The company says the content it demotes includes clickbait-style posts and material from repeat policy offenders. Facebook’s demotion process relies heavily on machine learning algorithms that automatically detect this type of content and throttle the reach of offending posts and comments.
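Facebook has not published the mechanics of its ranking pipeline, but the basic idea of a demotion can be shown in a short, purely illustrative sketch: an upstream classifier flags a post as a “problematic” type, and the ranking step multiplies its score down so fewer users ever see it, rather than removing it outright. The category names, multipliers, and data structures below are hypothetical assumptions, not taken from Facebook’s guidelines.

```python
# Illustrative sketch of score-based demotion in a feed-ranking step.
# This is NOT Facebook's actual system; the categories, demotion factors,
# and Post structure are hypothetical, chosen only to show reduced reach
# versus outright removal.

from dataclasses import dataclass, field

# Hypothetical demotion multipliers per flagged category (assumption).
DEMOTION_FACTORS = {
    "clickbait": 0.5,               # reach roughly halved
    "health_misinformation": 0.2,
    "repeat_offender": 0.3,
}

@dataclass
class Post:
    post_id: str
    base_rank_score: float                  # engagement-based score from the ranker
    flags: list[str] = field(default_factory=list)  # labels from upstream classifiers

def demoted_score(post: Post) -> float:
    """Apply every applicable demotion multiplier to the post's rank score.

    The post stays in the feed (it is not removed), but a lower score means
    the ranking system places it where fewer users will scroll past it.
    """
    score = post.base_rank_score
    for flag in post.flags:
        score *= DEMOTION_FACTORS.get(flag, 1.0)
    return score

if __name__ == "__main__":
    post = Post(post_id="123", base_rank_score=0.9, flags=["clickbait"])
    print(demoted_score(post))  # 0.45 -- reduced reach, not removal
```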

Mark Zuckerberg, CEO of Facebook (Associated Press)

Facebook’s guidelines don’t detail how a demotion works, how much it reduces the reach of the offending content, or how different types of offending posts are demoted relative to one another, such as whether a spam link is throttled as much as health misinformation on the platform.

Jason Hirsch, Facebook’s head of integrity policy, explained why the company chose to detail the content it demotes, telling the Verge: “We want to give a clearer sense of what we think is problematic but not worth removing.” Hirsch added that the company plans to add more information to the guidelines but likely will not rank the severity of demotions “for adversarial reasons.”

The recent move towards alleged transparency by Facebook could be another attempt by the company to gain some positive publicity following a series of damning reports from the Wall Street Journal. According to the Wall Street Journal, internal Facebook documents reveal that tech giant Apple threatened to remove Facebook from its App Store in 2019 following a report from BBC News that detailed the human trafficking taking place across the social media platform.

The Wall Street Journal also claimed that Facebook’s News Feed algorithm underwent a major change in 2018 that appeared to promote outrageous and negative content on the platform. When informed of this, top executives including CEO Mark Zuckerberg were allegedly hesitant to address the issue.

In another report titled “Facebook Knows Instagram Is Toxic for Teen Girls, Company Documents Show,” the Wall Street Journal claims that Facebook is aware that its photo-sharing app Instagram can have a negative effect on the body image of young women.

Read more at the Verge here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan or contact via secure email at the address lucasnolan@protonmail.com
