Facebook is now blaming the multitude of languages spoken on the platform for the company's failure to effectively monitor content.
Reuters reports that as Facebook grows and spreads worldwide, the company is facing increasing issues with content moderation due to the large number of languages spoken by users. Facebook currently offers its 2.3 billion users access to 111 different languages on the site's menus and prompts, and Reuters claims another 31 are widely spoken across the platform.
The website's "community standards," which outline the content that users are not allowed to post on the platform, such as "hate speech" and celebrations of violence, had been translated into only 41 of the 111 languages supported on Facebook as of March. Facebook's content moderation team comprises 15,000 people speaking 50 languages, though professional translators are employed when needed.
But as countries such as Australia, Singapore, and the U.K. threaten to impose harsh new regulations on companies that fail to police hate speech — with steep fines or jail time for executives in cases of violation — Facebook will need to beef up its content moderation across all languages. The Facebook vice president in charge of standards, Monika Bickert, previously told Reuters that it was "a heavy lift to translate into all those different languages."
A Facebook spokesperson told Reuters that the company is prioritizing a number of languages for translation, such as Khmer, the official language of Cambodia, and Sinhala, the dominant language of Sri Lanka. Facebook was recently blocked in Sri Lanka to prevent the spread of "hate speech" following the Easter Sunday bombings, which took the lives of 290 people.
Phil Robertson, deputy director of Human Rights Watch's Asia Division, commented on Facebook's failure to monitor content, which has allegedly led to the spread of "hate speech" in countries such as Myanmar: "These are supposed to be the rules of the road and both customers and regulators should insist social media platforms make the rules known and effectively police them. Failure to do so opens the door to serious abuses."