Amazon reportedly shut down an artificial intelligence-powered recruitment tool after it repeatedly discriminated against female candidates.
The recruitment tool, which Amazon used to “give job candidates scores ranging from one to five stars — much like shoppers rate products on Amazon,” according to Reuters, could sift through 100 applications and surface the top five candidates for the company.
Problems arose, however, when Amazon realized the tool was favoring male candidates over female candidates.
“That is because Amazon’s computer models were trained to vet applicants by observing patterns in resumes submitted to the company over a 10-year period. Most came from men, a reflection of male dominance across the tech industry,” Reuters explained. “In effect, Amazon’s system taught itself that male candidates were preferable. It penalized resumes that included the word ‘women’s,’ as in ‘women’s chess club captain.’ And it downgraded graduates of two all-women’s colleges, according to people familiar with the matter.”
As a result, the tool, which one person familiar with it had originally described as a “holy grail,” was shut down in 2017.
In 2016, Tay, a Microsoft Twitter account powered by A.I., had to be shut down within 24 hours of going public after it quickly started posting racist and pro-Nazi comments.
In June, researchers at the Massachusetts Institute of Technology also unveiled a “psychopath” artificial intelligence.