Facebook ‘Misplaced’ Key Guidelines on Censoring ‘Dangerous Individuals’

Facebook CEO and founder Mark Zuckerberg testifies during a US House Committee on Energy and Commerce hearing about Facebook on Capitol Hill in Washington, DC, April 11, 2018. (SAUL LOEB/AFP/Getty Images)

Facebook failed to transfer a key exception to its rules on censoring discussion about “dangerous individuals” to a new system, leading to the guidelines effectively being “lost” for three years, according to an investigation from the company’s quasi-independent “oversight board.”

The oversight board discovered the “misplaced” rule in the course of an investigation of the removal of an Instagram post discussing the solitary confinement of a member of the Kurdistan Workers’ Party (PKK), a left-wing militant group considered a terrorist organization by multiple governments. The rules the Masters of the Universe “misplaced” state that posts discussing the conditions of confinement of a designated individual should not be censored off Mark Zuckerberg’s platforms.

Via the Oversight Board:

The Board found that Facebook’s original decision to remove the content was not in line with the company’s Community Standards. As the misplaced internal guidance specifies that users can discuss the conditions of confinement of an individual who has been designated as dangerous, the post was permitted under Facebook’s rules.

The Board is concerned that Facebook lost specific guidance on an important policy exception for three years. Facebook’s policy of defaulting towards removing content showing “support” for designated individuals, while keeping key exceptions hidden from the public, allowed this mistake to go unnoticed for an extended period. Facebook only learned that this policy was not being applied because of the user who decided to appeal the company’s decision to the Board.

While Facebook told the Board that it is conducting a review of how it failed to transfer this guidance to its new review system, it also stated “it is not technically feasible to determine how many pieces of content were removed when this policy guidance was not available to reviewers.” The Board believes that Facebook’s mistake may have led to many other posts being wrongly removed and that Facebook’s transparency reporting is not sufficient to assess whether this type of error reflects a systemic problem. Facebook’s actions in this case indicate that the company is failing to respect the right to remedy, contravening its Corporate Human Rights Policy (Section 3).

The Oversight Board overturned the removal of the Instagram post, concluding that even with the misplaced guidelines, discussion of a dangerous individual’s solitary confinement would not be a breach of Facebook’s rules.

Allum Bokhari is the senior technology correspondent at Breitbart News. He is the author of #DELETED: Big Tech’s Battle to Erase the Trump Movement and Steal The Election.
