Former Google engineer Mike Wacker claims that CEO Sundar Pichai lied to Congress when he stated that the company does not manually intervene on users' search results.
In a Medium blog post titled “Google’s Manual Interventions in Search Results,” former Google engineer Mike Wacker alleges that Google CEO Sundar Pichai lied to Congress when he stated, “We don’t manually intervene on any particular search result.” Wacker notes that Pichai’s comment came in response to a question from Rep. Zoe Lofgren (D-CA), who asked why photos of President Trump were returned under image searches for the term “idiot.”
Pichai claimed that this was an automated process, as Google ranked results based on things such as “relevance, freshness, popularity,” etc., and outright stated that the company does not intervene on particular search results. Wacker notes that a few days later, Slate writer April Glaser searched for the term “abortion” on YouTube and was unhappy with the results returned, writing an article about the issue. Wacker writes:
She would later receive a response from a YouTube spokesperson. However, this answer was different from the answer that Sundar Pichai delivered to Rep. Lofgren. Instead of explaining how search works and how these results were produced by an objective, automated process, the spokesperson “stressed that the company is working to provide more credible news content from its search and discovery algorithms.”
Wacker began his own investigation into Google’s search algorithm where he made a discovery. “I eventually found the smoking gun: the exact change where Google had altered the search results for abortion. My initial reaction was a mix of excitement and shock.”
Wacker goes on to describe the full details of the evidence he found that shows that YouTube — a Google-owned company — actively manipulated user search results:
To reference the infamous phrase “alternative facts,” the change essentially used an alternative algorithm that delivers alternative search results. A special file named youtube_controversial_query_blacklist could be used to manually trigger this alternative algorithm. For any search or query that matched an entry on that blacklist, YouTube would blacklist the normal search results, switching over to the alternative search results instead.
The smoking gun I had discovered was a change that added two entries to that blacklist: “abortion” and “abortions”. As a result of this change, searches for those terms displayed the alternative search results. The change had been made at Dec 14, 2018, 3:17 PM PST, mere hours after April Glaser of Slate had emailed YouTube.
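Based on Wacker's description, the mechanism works roughly like a lookup table: queries matching an entry in the blacklist file are routed to a different ranking algorithm. The following is a minimal, purely illustrative sketch of that behavior; the function names and data structures are hypothetical and are not Google's actual code, the only detail taken from Wacker's account being the blacklist filename and its entries.

```python
# Illustrative sketch of the blacklist-switching behavior Wacker describes.
# All names here are hypothetical except the blacklist entries he cites.

# Entries from the file Wacker names: youtube_controversial_query_blacklist
CONTROVERSIAL_QUERY_BLACKLIST = {"abortion", "abortions", "maxine waters"}

def normal_results(query):
    # Placeholder for the standard, automated ranking Pichai described
    # (relevance, freshness, popularity, etc.).
    return [f"normal result for {query!r}"]

def alternative_results(query):
    # Placeholder for the alternative ranking used for blacklisted queries.
    return [f"alternative result for {query!r}"]

def search(query):
    # If the query matches a blacklist entry, the normal results are
    # discarded and the alternative algorithm is used instead.
    if query.lower() in CONTROVERSIAL_QUERY_BLACKLIST:
        return alternative_results(query)
    return normal_results(query)
```

Under this sketch, a search for “abortion” would take the alternative path while an unlisted query would not, matching Wacker's account of the two terms added on December 14, 2018.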
Wacker then goes on to give examples of some of the blacklists created by Google to suppress certain content:
It didn’t end there. Herein lies the key point: it’s not about where the blacklist begins, but where the blacklist ends. In this case, that contrast could not be more stark: the first entry on the blacklist was added because of a mass shooting, and the last entry was added because a Slate writer complained about search results for abortion.
Two other additions to the blacklist deserve additional scrutiny. The first one is related to a member of Congress. On December 14, 2017, “maxine waters” was added to the blacklist. This change had been made because a single employee had complained that search results for her were low quality. The potential motivations and biases of that employee are not known. Another employee then compared the normal search results and the alternative search results for “maxine waters” and decided to switch over to the alternative search results. The criteria used to determine which search results are better are also not known.
The consequences of this change would then carry over into the 2018 midterm election. During that election, users who searched for Maxine Waters on YouTube would have received the alternative search results, whereas users who searched for her opponent, Omar Navarro, would have received the normal search results.
In Wacker’s opinion, it is clear that Pichai did not tell the truth to Congress:
Sundar’s answer misled Rep. Lofgren, who responded that the potential biases of Google’s employees “has nothing to do with the algorithms and the really automated process that is the search engine that serves us.” As it turns out, the question of “what we’re going to show the user” is not dictated solely by algorithms and automated processes, and the biases of Google’s employees could have something to do with those search results.
Read Wacker’s full exposé at Medium here.