A senior ACLU manager is welcoming Facebook’s decision to let corporate-favored political groups censor mainstream news reports distributed via its network.
“We do not think Facebook should set itself up as an arbiter of truth … [but] this may be the best, most carefully crafted approach for the company to take,” said a statement from Jay Stanley, a “Senior Policy Analyst” for the ACLU’s speech, privacy, and technology project. He wrote:
It is an approach based on combatting bad speech with more speech. Instead of squelching or censoring stories, Facebook includes more information with posts, telling people, in effect, that “this party here says this material shouldn’t be trusted.” That does not create the censorship concerns that more heavy-handed approaches might take. We applaud Facebook for responding to the pressure it is under on this issue with a thoughtful, largely pro-speech approach.
Facebook founder and chief executive Mark Zuckerberg announced his “Faceblocking” plan Thursday, Dec. 15.
Zuckerberg’s scheme allows a hand-picked set of outside groups to tag articles and posts by media outlets as “fake.” Once tagged as “fake,” the posts and news articles will be given a much lower priority for distribution to Facebook users who are interested in similar political news.
So far, the planned system does not allow outside groups to tag Facebook posts with other terms, such as “heretical,” “good” or “beneficial.”
The ACLU manager admits problems:
Perhaps the biggest question is what the boundaries will be for how this system is applied. As I discussed in my prior post, the question of what is fact and what is fiction is a morass that is often impossible to neutrally or objectively determine. Armies of philosophers working for over two thousand years have been unable to come up with a satisfactory answer to the question of how to distinguish the two. And there is an enormous amount of material out there fitting every gradation between the most egregious hoax and the merely mistaken and badly argued. What if a piece is largely true, but includes a single intentional, consequential lie?
Facebook’s answer is that it is, for now at least, focusing its efforts on “the worst of the worst, on the clear hoaxes spread by spammers for their own gain.” From what we were told, it also sounds like whatever algorithm they use to refer stories to the 3rd party fact checkers will not only incorporate the number of fake news flags received from users, but also focus on pieces that are actually trending…
Facebook will likely find it impossible to both enable fact-checking, and to be seen as neutral by those who reject those facts and any organizations that validate them.
In an earlier post, Stanley suggested that Facebook’s agreement to censor the news would threaten progressives’ political goals:
for Facebook to assume the burden of trying to solve a larger societal problem of fake news by tweaking these algorithms would likely just make the situation worse. To its current role as commercially motivated curator of things-that-will-please-its-users would be added a new role: guardian of the social good. And that would be based on who-knows-what judgment of what that good might be at a given time. If the company had been around in the 1950s and 1960s, for example, how would it have handled information about Martin Luther King, Malcolm X, gay rights, and women’s rights? A lot of material that is now seen as vital to social progress would then have been widely seen as beyond the pale. The company already has a frightening amount of power, and this would increase it dangerously. We wouldn’t want the government doing this kind of censorship—that would almost certainly be unconstitutional—and many of the reasons that would be a bad idea would also apply to Facebook, which is the government of its own vast realm.
Stanley also recognized — and smeared — the mainstream public’s rejection of the establishment media. The popular opposition proved itself in the Nov. 8 election, which prompted an establishment backlash against the new media. That backlash includes calls for censorship of supposed “fake news,” and for suppression of the public’s free speech:
At the end of the day, fake news is not a symptom of a problem with our social-communications sites, but a societal problem. Facebook and other sites are just the medium.
… the existence of a specific political movement that rejects the “mainstream media” in favor of a group of ideological news outlets like Breitbart and Infowars … what is new is a large number of Americans who have rejected the heretofore commonly accepted [establishment] sources of the facts that those narratives are built out of … This phenomenon has been dubbed “epistemic closure.” While originally a charge levied at intellectuals at Washington think tanks, it is an apt term for everyday readers of Breitbart and its ilk who close themselves off from alternate sources of information.
This is not a problem that can be fixed by Facebook; it is a social problem that exists at the current moment in our history.
Read it here.