Facebook’s Clegg on Amplifying January 6: Can’t Say ‘Yes or No’

Nick Clegg, Facebook’s vice president for global affairs and communications, said Sunday on CNN’s “State of the Union” that he could not answer whether the social media platform amplified “pro-insurrection voices ahead of January 6.”

Anchor Dana Bash asked, “Just a simple yes or no, did Facebook’s algorithms amplify or spread pro-insurrection voices ahead of January 6?”

Clegg replied, “Let me be clear, because there’s been a lot of, I think, misleading discussion about what the algorithms do. There are hundreds of thousands of them at Facebook, as there are at many online companies. But what the ranking algorithms do, and these are the crucial algorithms, is help decide what you see more prominently in your news feed on Facebook than other pieces of content. If you remove the algorithms, which I think is one of Frances Haugen’s central recommendations, the first thing that would happen is people would see more, not less, hate speech. More, not less, misinformation. These algorithms are precisely designed to work like giant spam filters to identify and deprecate bad content. Of course, it has downsides, but it also has a very powerful positive effect.”

Bash said, “But my question is specifically about January 6, did the algorithms that are in place amplify pro-insurrection voices ahead of January 6?”

Clegg said, “Given that we have thousands of algorithms and millions of people using this, I can’t give you a yes-or-no answer about the individual, personalized feeds that each person uses. We work with law enforcement, of course, to give them content that might show up on our platform. As for January 6, the responsibility for that lies with the people who broke the law, who inflicted the violence, and who aided and abetted them in the media.”

Bash said, “But is it a problem that Facebook, you’re not really sure if your platform allowed it to fester and amplify what ended up as this huge attack?”

Clegg said, “What I was simply saying is that the algorithm, the whole point, of course, of Facebook, is that each person’s news feed is individual to them. It’s like an individual fingerprint, and that’s basically determined by the interaction of your choices, your friends, your family, the groups you choose to be part of.”

He added, “I can’t give a generic answer about each person’s individual feed. What I can say is that where we see content we think is relevant to the investigations of law enforcement, of course, we cooperate with them. But if I may, if our algorithms are as nefarious as some people suggest, why is it that precisely those systems have succeeded in reducing the prevalence of hate speech on our platforms to as little as 0.05 percent? That means that for every 10,000 bits of content, you would see only five bits of hate speech.”

Follow Pam Key on Twitter @pamkeyNEN
