In a report published by Sputnik News, psychologist Robert Epstein reveals evidence that Google is manipulating search results related to Hillary Clinton in a way that may “shift as many as 3 million votes” in the upcoming presidential election.

Earlier this year, Matt Lieberman of Sourcefed published a video claiming that Google’s autocomplete suggestions were biased in favour of Clinton. The video went viral, and an abridged version of it was viewed over 25 million times on Facebook.

Epstein set out with his colleagues at the American Institute for Behavioral Research and Technology (AIBRT) to investigate the claims. They concluded that, whilst the investigation is ongoing, their report “generally supports” Lieberman’s video.

To test these claims, Epstein and his associates ran hundreds of different election-related search terms through Google, using Yahoo and Bing search as controls. Each search was also conducted through proxy servers, including the Tor network, to make it very difficult for Google to identify the researchers and thus customize the search results for them.
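As an illustration of that kind of set-up, the sketch below pulls autocomplete suggestions from the browser-facing endpoints that Google and Bing expose and compares them for the same partial query, optionally routing the requests through a local Tor SOCKS proxy. The endpoint URLs, the example query and the proxy address are assumptions based on commonly used but undocumented interfaces, not details taken from Epstein’s report, and they may change or be rate-limited at any time.

```python
# Sketch only: compare autocomplete suggestions from Google and Bing for one partial query,
# optionally through a local Tor SOCKS proxy (requires `pip install requests[socks]`).
# The suggestion endpoints below are undocumented, browser-facing interfaces and may change.
import json
import requests

TOR_PROXIES = {
    "http": "socks5h://127.0.0.1:9050",   # assumes a local Tor client is listening here
    "https": "socks5h://127.0.0.1:9050",
}

def google_suggestions(query, proxies=None):
    # Browser-style endpoint; the response is JSON shaped like [query, [suggestion, ...]]
    resp = requests.get(
        "https://suggestqueries.google.com/complete/search",
        params={"client": "firefox", "q": query},
        proxies=proxies,
        timeout=10,
    )
    resp.raise_for_status()
    return json.loads(resp.text)[1]

def bing_suggestions(query, proxies=None):
    # OpenSearch-style endpoint used by browsers; same [query, [suggestion, ...]] shape
    resp = requests.get(
        "https://api.bing.com/osjson.aspx",
        params={"query": query},
        proxies=proxies,
        timeout=10,
    )
    resp.raise_for_status()
    return json.loads(resp.text)[1]

if __name__ == "__main__":
    term = "hillary clinton cri"  # example partial query
    print("Google:", google_suggestions(term))             # direct request
    print("Bing:  ", bing_suggestions(term, TOR_PROXIES))  # routed through Tor
```

Running comparisons like this for many partial queries, from several network locations, is essentially the control-and-compare design Epstein describes. He writes: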

It is somewhat difficult to get the Google search bar to suggest negative searches related to Mrs. Clinton or to make any Clinton-related suggestions when one types a negative search term. Bing and Yahoo, on the other hand, often show a number of negative suggestions in response to the same search terms. Bing and Yahoo seem to be showing us what people are actually searching for; Google is showing us something else — but what, and for what purpose?

As for Google Trends, as Lieberman reported, Google indeed withholds negative search terms for Mrs. Clinton even when such terms show high popularity in Trends. We have also found that Google often suggests positive search terms for Mrs. Clinton even when such terms are nearly invisible in Trends. The widely held belief, reinforced by Google’s own documentation, that Google’s search suggestions are based on “what other people are searching for” seems to be untrue in many instances.

Google tries to explain away such findings by saying its search bar is programmed to avoid suggesting searches that portray people in a negative light. As far as we can tell, this claim is false; Google suppresses negative suggestions selectively, not across the board. It is easy to get autocomplete to suggest negative searches related to prominent people, one of whom happens to be Mrs. Clinton’s opponent.

Epstein attached screenshots of some of the searches he conducted, clearly showing a lack of negative suggestions for Clinton, in contrast to the negative suggestions returned for rival Donald Trump. Nor was it a case of people simply not searching for these terms on Google: Google Trends compares the search rates of the relevant keywords, and the negative Hillary terms usually trend higher.
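The Trends comparison can be sketched in a similar spirit. The snippet below uses pytrends, an unofficial third-party Python client for Google Trends rather than an official Google API, and the keywords are purely illustrative rather than the exact terms from the report.

```python
# Sketch only: compare relative search interest for a few illustrative keywords using
# pytrends, an unofficial third-party client for Google Trends (pip install pytrends).
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=360)

# Illustrative keywords, not the exact terms examined in the report
keywords = ["hillary clinton email", "hillary clinton speech"]
pytrends.build_payload(keywords, timeframe="today 12-m", geo="US")

interest = pytrends.interest_over_time()  # pandas DataFrame, one column per keyword
print(interest[keywords].mean())          # average relative interest over the period
```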


Moreover, Epstein suggests that these autocomplete results will have a dramatic effect on the election itself:

Over time, differentially suppressing negative search suggestions will repeatedly expose millions of people to far more positive search results for one political candidate than for the other. Research I have been conducting since 2013 with Ronald Robertson of Northeastern University has shown that high-ranking search results that favor one candidate can easily shift 20 percent or more of undecided voters toward that candidate — up to 80 percent in some demographic groups, as I noted earlier. This is because of the enormous trust people have in computer-generated search results, which people mistakenly believe are completely impartial and objective — just as they mistakenly believe search suggestions are completely impartial and objective.

Perhaps you are skeptical about my claims. Perhaps you are also not seeing, on balance, a pro-Hillary bias in the search suggestions you receive on your computer. Perhaps you are also not concerned about the possibility that search suggestions can be used systematically to nudge people’s searches in one direction or another. If you are skeptical in any or all of these ways, ask yourself this: Why, to begin with, is Google censoring its search suggestions? (And it certainly acknowledges doing so.) Why doesn’t it just show us, say, the top ten most popular searches related to whatever we are typing? Why, in particular, is it suppressing negative information? Are Google’s leaders afraid we will have panic attacks and sue the company if we are directed to dark and disturbing web pages? Do they not trust us to make up our own minds about things? Do they think we are children?

Without whistleblowers or warrants, no one can prove Google executives are using digital shenanigans to influence elections, but I don’t see how we can rule out that possibility. There is nothing illegal about manipulating people using search suggestions and search rankings — quite the contrary, in fact — and it makes good financial sense for a company to use every legal means at its disposal to support its preferred candidates.
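To see how a shift rate like the 20 percent figure Epstein cites could scale up to the headline claim of “as many as 3 million votes”, a back-of-the-envelope calculation is enough. The size of the undecided, Google-using electorate below is an assumed number chosen purely for illustration, not a figure from the report.

```python
# Back-of-the-envelope arithmetic only; the voter pool is an assumption for illustration,
# not a figure taken from Epstein's report.
undecided_google_users = 15_000_000  # hypothetical undecided voters who rely on Google search
shift_rate = 0.20                    # the minimum shift Epstein reports among undecided voters

votes_shifted = undecided_google_users * shift_rate
print(f"{votes_shifted:,.0f} votes shifted")  # 3,000,000 votes shifted
```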

In a postscript, Epstein reveals that he is, in fact, a Clinton supporter himself, but concludes that he does not believe it would be “right for her to win the presidency because of the invisible, large-scale manipulations of a private company. That would make democracy meaningless.”

You can read his full report at Sputnik News.

Jack Hadfield is a student at the University of Warwick and a regular contributor to Breitbart Tech. You can follow him on Twitter @ToryBastard_ or email him at jack@yiannopoulos.net.