Facebook has announced the formation of an independent body to review and rule on content decisions, rulings that could shape the company’s policies in the future.
Facebook announced this week that it will develop an independent body to make decisions about what content should be removed from the social media platform, Yahoo News reports. While Facebook has improved its ability to detect “hate speech” and terrorism-related content on the platform, the company has still come under fire for its decisions about removing certain types of content – this independent body appears to be an attempt to remedy that situation.
“I have come to believe that we shouldn’t be making so many decisions about free expression and safety on our own,” said Facebook chief executive Mark Zuckerberg in a media briefing. Currently, any content that may violate the site’s rules is detected by Facebook’s artificial intelligence software or reported by users, and is then sent to Facebook’s internal review system. The new independent body will act as a “higher court,” considering appeals over content removed from the platform.
The company also plans to publish content removal summaries at a frequency on par with the release of its earnings reports, a practice it intends to begin next month. “We have made progress getting hate, bullying and terrorism off our network,” Zuckerberg said. “It’s about finding the right balance between giving people a voice and keeping people safe.”
Facebook has often struggled to understand the many languages used on the platform, along with the cultural context and significance of certain phrases or statements. “We are getting better at proactively identifying violating content before anyone reports it, specifically for hate speech and violence and graphic content,” Facebook said in its new transparency report. “But there are still areas where we have more work to do.”