Facebook Reveals How It Will Determine ‘Trustworthiness’ of News Outlets to ‘Shift the Balance’

A picture taken in Moscow on March 22, 2018, shows the Russian-language version of Facebook's about page featuring the face of founder and CEO Mark Zuckerberg.

Alex Hardiman, Facebook’s head of news products, revealed further details about the social network’s plans to “rank” news outlets by “trustworthiness,” during the Facebook Developer Conference this week.

“For too long, I think it’s safe to say, the news feed unintentionally incentivized the wrong type of behavior. We sometimes saw that volume and engagement and clicks often won out over quality journalism and serious news,” Hardiman declared. “The reality is that not all news is created equal, and so we’re confident that the changes we’re making will improve the quality of news for people by quite a bit over the next 12 to 24 months. Now our first step in emphasizing high-quality news is focused on ranking changes in News Feed, designed to shift the balance of the types of news that people see.”

“I think in January, many of you are aware that Mark announced three principles that drive this work. We want to support quality news that is trusted, informative, and local or personally relevant to you,” she continued. “Today, unfortunately, we know that trust in the media is low and polarization is a real concern. So we want to make sure we are valuing and supporting the publishers that people trust. Our goal here is to expose people to news that is broadly trusted by members of diverse groups, so that people have access to a shared set of facts. So to understand which news sources are broadly trusted, we survey a diverse and random representative sample of people on our platform, and in these surveys we ask people two questions.”

“First, how familiar they are with the source, and then second, how much they actually trust that news source,” explained Hardiman. “We want to make sure the trustworthiness of a publication is determined by people who are discerning, and quite honestly even skeptical about the news that they read, and so to prevent this model from being skewed too far in either direction, we then de-bias the results by discarding the trust ratings from people who are most and least skeptical about the source. This is not at all measured by popular vote. And from there, we use the answers to build a model that reflects how broadly trusted each publication is.”
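The procedure Hardiman describes — surveying respondents on familiarity and trust, then discarding the ratings from the most and least skeptical respondents before aggregating — resembles a trimmed mean. The sketch below is an illustrative reconstruction under that reading; the function name, rating scales, thresholds, and trim fraction are all assumptions, not Facebook's actual implementation.

```python
def broad_trust_score(responses, trim_fraction=0.1, min_familiarity=2):
    """Score one outlet from survey responses.

    responses: list of (familiarity, trust) tuples, each rated 1-5.
    Hypothetical sketch: scales and parameters are illustrative guesses.
    """
    # Only count respondents at least somewhat familiar with the source.
    trust = sorted(t for f, t in responses if f >= min_familiarity)
    if not trust:
        return None
    # Drop the most- and least-skeptical tails to de-bias the average,
    # as described in the quoted remarks (a trimmed mean).
    k = int(len(trust) * trim_fraction)
    trimmed = trust[k:len(trust) - k] if len(trust) > 2 * k else trust
    return sum(trimmed) / len(trimmed)

# Made-up example: ten survey responses for one outlet.
sample = [(5, 5), (4, 4), (3, 4), (5, 1), (2, 3),
          (4, 5), (3, 2), (5, 4), (1, 5), (4, 3)]
print(broad_trust_score(sample))
```

Because the result is an average over the trimmed middle rather than a raw tally, a burst of extreme ratings in either direction moves the score far less than it would a simple popular vote — consistent with Hardiman's "not at all measured by popular vote" remark.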

Hardiman then said that Facebook will also ask users how informative they find an article.

“We know from our research that informative news encourages civic engagement and community activism and it can breed greater empathy between people. So for this ranking, we ask people to tell us how informative a post is in their news feed on a scale of one to five, and then we use that to create a prediction model for how informative we think other posts in their news feed are going to be for them,” she proclaimed. “We’re looking at ways to measure how much people actually feel more informed by the news they read on Facebook, and we’re also looking at ways to measure how much people actually are more informed based on their knowledge of current events in the world.”
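Facebook has not disclosed what its prediction model looks like, but the idea of training on one-to-five ratings and predicting scores for unrated posts can be sketched with a deliberately naive stand-in: a per-source average rating with a global-mean fallback. Everything here — the function names, the per-source approach, and the sample data — is an illustrative assumption.

```python
from collections import defaultdict

def fit_informative_model(ratings):
    """ratings: list of (source, score) pairs, score rated 1-5.

    Returns a predict(source) function. Naive stand-in for whatever
    model Facebook actually uses, which it has not described.
    """
    by_source = defaultdict(list)
    for source, score in ratings:
        by_source[source].append(score)
    overall = sum(s for _, s in ratings) / len(ratings)
    means = {src: sum(v) / len(v) for src, v in by_source.items()}
    # Fall back to the global mean for sources with no ratings yet.
    return lambda source: means.get(source, overall)

# Made-up training data: user ratings of posts from two sources.
predict = fit_informative_model([
    ("local-paper", 5), ("local-paper", 4),
    ("viral-site", 2), ("viral-site", 1),
])
print(predict("local-paper"))   # 4.5
print(predict("unseen-site"))   # 3.0
```

Any real system would presumably use far richer per-post and per-user features, but the structure — learn from explicit one-to-five labels, then score unrated posts — matches the description in the quote.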

The third principle, according to Hardiman, is prioritizing local news.

Facebook’s head of news products also said that the methods described for ranking news sources were only the beginning, and that Facebook has been talking to “external researchers, academics, media partners, and others” to determine which news sources should be prioritized over others.

Hardiman did not reveal which external researchers, academics, and media partners the social network had been working with, or whether they were ideologically diverse.

Charlie Nash is a reporter for Breitbart Tech. You can follow him on Twitter @MrNashington, or like his page at Facebook.
