Report: Apple Programmed Siri to ‘Deflect’ Questions About Feminism

Tim Cook, CEO of Apple, laughing (Stephanie Keith/Getty)

Leaked internal documents show that a project to rewrite how Apple’s Siri voice assistant handles “sensitive topics” such as feminism advised developers to “deflect” questions and “don’t engage” with the issue.

The Guardian reports that internal documents detail a project to rewrite Apple’s Siri voice assistant and how it handles “sensitive topics” such as feminism and the #MeToo movement. Developers were advised that Siri should respond to questions about these topics in one of three ways: “don’t engage,” “deflect,” and, if all other efforts have failed, “inform.”

The project rewrote Siri’s responses to ensure that the voice assistant would state that it was in favor of “equality” but would not specifically say the word “feminism.” The guidelines are part of a cache of internal documents leaked to the Guardian by a former Siri “grader” who was employed to check the voice assistant’s responses for accuracy until the program was ended last month over privacy concerns.

Apple’s guidelines explain that “Siri should be guarded when dealing with potentially controversial content.” Siri will generally reply with statements about “treating humans equally” when questioned about topics such as feminism, but the documents suggest that the best way to treat sensitive social issues is to pull information on them directly from Siri’s “knowledge graph,” which draws information on the topic from Wikipedia.

Previously, questions such as “Are you a feminist?” received a generic response from Siri such as “Sorry [user], I don’t really know,” but responses have now been specifically written for the query to avoid taking a specific stance. Siri will now reply, “I believe that all voices are created equal and worth equal respect,” or “It seems to me that all humans should be treated equally.” The same responses are used for questions such as “How do you feel about gender equality?”, “What’s your opinion about women’s rights?”, and “Why are you a feminist?”

Previously, Siri’s responses were quite dismissive with replies such as “I just don’t get this whole gender thing,” and, “My name is Siri, and I was designed by Apple in California. That’s all I’m prepared to say.” In a statement on the issue, Apple said: “Siri is a digital assistant designed to help users get things done. The team works hard to ensure Siri responses are relevant to all customers. Our approach is to be factual with inclusive responses rather than offer opinions.”

Read the full report in the Guardian here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan.
