Amazon has long been recognized for its innovation, and a computer program designed to screen job applications for the company was supposed to be a new, helpful tool. According to Reuters, however, the tool didn’t end up being as fair as expected. The artificial intelligence tended to exclude women from the top five applicants it selected.
“Everyone wanted this holy grail,” one anonymous source said. “They literally wanted it to be an engine where I’m going to give you 100 resumes, it will spit out the top five, and we’ll hire those.” There was only one problem: the engine was trained on the company’s hiring patterns from the previous 10 years, a period before women had made significant strides in technology-dominated workplaces. As a result, the AI developed a preference for male candidates.
The tool began to automatically penalize resumes that contained the word “women’s,” as in “women’s chess club captain.” It went so far as to downgrade two applicants who had attended all-women’s colleges. The AI was also observed to be partial to “masculine language.” After being trained to recognize around 50,000 terms that appear on resumes, it began to disregard the skills listed on the resumes themselves and instead paid attention to word choice. Men were more likely to use verbs like “executed” and “captured,” so the program began to favor resumes that used those words.
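The mechanism described above can be shown with a toy sketch (this is an illustration of the general failure mode, not Amazon’s actual system; the sample resumes and scoring rule are invented for demonstration). A scorer trained only on historical hire/reject labels, with gender never given as an input, still learns to reward words like “executed” and penalize words like “women’s” because those words correlate with the biased labels:

```python
# Toy illustration of learned hiring bias (NOT Amazon's system):
# a word-frequency scorer trained on historically biased labels
# ends up penalizing the token "women's" without ever seeing gender.
from collections import Counter

# Hypothetical historical data: 1 = hired, 0 = rejected.
historical = [
    ("executed project roadmap captured market share", 1),
    ("executed migration led platform team", 1),
    ("captured requirements executed delivery plan", 1),
    ("women's chess club captain built compiler", 0),
    ("women's coding society president shipped features", 0),
    ("collaborated on research published papers", 0),
]

hired, rejected = Counter(), Counter()
for resume, label in historical:
    for word in resume.split():
        (hired if label else rejected)[word] += 1

def score(resume):
    """Average per-word hired/rejected ratio (add-one smoothing
    so unseen words score neutrally at 1.0)."""
    words = resume.split()
    total = sum((hired[w] + 1) / (rejected[w] + 1) for w in words)
    return total / max(len(words), 1)

# Two equally qualified hypothetical candidates:
a = score("executed trading system captured latency gains")
b = score("women's robotics team captain built trading system")
print(a > b)  # the scorer ranks candidate A higher purely on word choice
```

Because the model only sees word statistics, removing the gender field does nothing: the bias rides in on proxy terms, which is why simply “not asking about gender” did not make the real system fair.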
Aside from the gender bias, the tool was faulty in other ways, such as suggesting unqualified applicants for certain jobs. Amazon, which is usually ahead of the game when it comes to finding new automated ways to get things done, was eventually forced to shut down the project. While the company isn’t releasing any official statements about it, plenty of anonymous workers came forward to give Reuters the scoop.
Amazon can still use automation for smaller tasks, like eliminating duplicate resumes. However, the kind of unbiased analysis the tool was originally designed to provide is probably still a long way off, experts say.
“How to ensure that the algorithm is fair, how to make sure the algorithm is really interpretable and explainable – that’s still quite far off,” said computer scientist Nihar Shah, who teaches machine learning at Carnegie Mellon University.
Still, other companies, such as Hilton Worldwide Holdings Inc and Goldman Sachs Group Inc, are giving machine recruiting a go, and will hopefully see less biased results.