
OPINION

Amazon Is Just the Tip of the AI Bias Iceberg

Amazon recently disclosed that back in 2015 it decided to scrap a recruitment tool it had been using to hire talent, after finding that the tool was biased against women. That story has been covered extensively, but there is a much bigger one still to tell: A substantial amount of the artificial intelligence technology currently used for recruitment and human resources has been operating unchecked, without any form of regulation, for some time.

Before exploring this, it will help to understand why it happened with Amazon’s software: what were the ghosts in the machine? I’ll offer some insights into how similar incidents can be avoided, and then explain why this has opened a giant can of worms for the rest of the US$638 billion-a-year employee recruitment industry.

Two Decades of Male Imprinting

Some of you may be surprised to learn that artificial intelligence has been used in the recruitment process for at least two decades. Technologies like natural language processing, semantic matching and Boolean string search have likely played a part in most job placements across the Western world.

A more commonly known fact is that, historically and even currently, men have dominated the IT space. Today, the technical staffs of major companies like Google and Microsoft are only about 20 percent and 19 percent women, respectively, according to Statista. Considering these statistics, it’s no wonder that we create technologies with an unconscious bias against women.

So let’s recap: More than 20 years ago, a male-dominated tech industry began creating AI systems to help it hire more tech employees. The industry then hired predominantly men, based on the recommendations of unconsciously biased machines.

After 20-plus years of positive feedback for recommending male candidates, the machine imprints its profile of the ideal candidate for a tech company. What we’re left with is what Amazon discovered: AI systems with an inherent bias against anyone who included the word “women’s” on a resume, or who attended a women’s college.

However, this problem isn’t limited to Amazon. It’s a problem for any tech company that has been experimenting with AI recruitment over the last two decades.

AI Is Like a Child

So, what is at the center of this Ouroboros of male favoritism? It’s quite simple: There have been too many men in charge of creating technologies, resulting in unconscious masculine bias within the code, machine learning and AI.

Women have not played a large enough role in the development of the tech industry. The development of tech keywords, programming languages and other skills has largely been carried out in a boys’ club. While a woman programmer might have all the same skills as her male counterpart, if she does not present those skills exactly the way male programmers before her have done, she may be overlooked by AI for superficial reasons.

Think of technology as a child. The environment it is created in and the lessons it is taught shape the way it enters the world. If it is only ever taught from a male perspective, then guess what? It is going to favor men. Even with machine learning, the platform’s core foundation is the set of signals it is given to weigh and learn from. Bias will persist unless the technology is built by a wider demographic of people.
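To make that concrete, here is a minimal, hypothetical sketch in Python (using scikit-learn 1.0 or newer) of how a resume screener trained on historically biased hiring decisions absorbs that bias. The resumes, the labels and the penalized token are all invented for illustration; this is not Amazon’s system or any vendor’s actual product.

```python
# A toy sketch, not any real screening product: a text classifier trained
# on historically biased hiring decisions. The resumes, labels and the
# penalized token ("women's") are invented for illustration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# 1 = historically hired, 0 = historically rejected. The only systematic
# difference between the two groups is the word "women's".
resumes = [
    "python developer, captain of the chess club",
    "python developer, hackathon winner",
    "python developer, captain of the women's chess club",
    "python developer, president of the women's coding society",
]
hired = [1, 1, 0, 0]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# The learned weight for the token "women" comes out negative: the model has
# quietly turned the historical pattern into a screening rule.
weights = dict(zip(vectorizer.get_feature_names_out(), model.coef_[0]))
print(round(weights["women"], 3))
```

The point of the sketch is that the model never “decides” to discriminate; it simply rewards whatever correlated with past hires and penalizes whatever did not.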

You may think this is trivial. Just because a female candidate writes that she was “head of the women’s chess league” or “president of the women’s computer club in college,” that couldn’t possibly put her at a disadvantage in the eyes of an unprejudiced machine, could it?

While it certainly isn’t black and white, across millions of resumes even a 5 percent bias against applicants who use language like this could affect a significant number of women. If the people ultimately in charge of hiring consistently choose candidates whose resumes display masculine language, the AI slowly but surely will start feeding hirers more resumes that share those traits.

Millions of Women Affected

Some quick back-of-the-envelope math: The U.S. economy sees 60 million people change jobs every year, and we can assume that half of them are women, or 30 million American women. If 5 percent of them suffered discrimination due to unconscious bias within AI, that could mean 1.5 million women affected every year. That is simply unacceptable.
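For what it’s worth, that arithmetic is easy to check. In the sketch below, the 60 million job changes, the 50/50 gender split and the 5 percent bias rate are the assumptions stated above, not measured data.

```python
# Back-of-the-envelope check of the figures above. All three inputs are
# assumptions from the article, not measured data.
job_changes_per_year = 60_000_000
women_share = 0.5
assumed_bias_rate = 0.05

women_changing_jobs = job_changes_per_year * women_share    # 30,000,000
women_affected = women_changing_jobs * assumed_bias_rate    # 1,500,000
print(f"{women_affected:,.0f} women potentially affected per year")
```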

Technology is here to serve us, and it can do so well, but it is not without its shortcomings, which, more often than not, are a reflection of our own shortcomings as a society. If there is any doubt that most of the labor force is touched one way or another by AI technology, consider that recruitment agencies place 15 million Americans into work annually, and that all 17,100 recruitment agencies in the U.S. already use, or soon will be using, an AI product of some sort to manage their processes.

So, what is the next logical step toward resolving this? We all know prevention is the best cure, so we need to encourage more women to enter and advance within the tech space. Concerted efforts to promote equality and diversity across the workplace will help ensure that issues such as this do not happen again. This is not an overnight fix, however, and it is definitely easier said than done.

Obviously, the main initiative should be to hire more women in tech, not only because this will help reset the AI algorithms and lead them to recommend more women, but also because women should be involved in developing these technologies in the first place. Women need to be represented just as much as men in the modern workplace.

An HR Storm Is Coming

With the Amazon situation understood in a nutshell, let’s go back to that can of worms I mentioned. The second-largest company in the world by market cap, a technology house, just admitted that its recruitment technology was biased by masculine language.

In the U.S., there are currently more than 4,000 job boards, 17,000 recruitment agencies, 100 applicant tracking systems, and dozens of matching-technology software companies. None of them has the resources of Amazon, and none has mentioned any issues with masculine language resulting in bias. What does this lead you to believe?

It leads me to believe that an entire industry that has relied on this technology for 20 years has most probably been running on unconsciously biased systems, and that the people who have suffered because of it are millions upon millions of women. The lack of representation of women in tech is a global problem, and the numbers only get worse the further back you look over those 20 years. There is no doubt in my mind that the entire industry needs to wake up to this issue and resolve it, fast.

The question is, what happens to the women who, even now, are not getting the right opportunities because of the AI currently in use? I am not aware of any company that can viably and independently test AI solutions for bias, but we need a body that can do so if we are to rely on these solutions with confidence. This could potentially be the largest-scale technology bug ever; it is as if the millennium bug had come true in the recruitment market.

My theory on how this has managed to go on for so long is that if you asked anyone, they would tell you they believe technology, a computer AI, is emotionless and therefore objective. Emotionless it is, but that does not stop it from adhering to whatever rules and language it has been programmed to follow.

AI’s fundamental qualities include not only a lack of emotion or prejudice, but also a lack of common sense, which in this case would mean knowing that whether a resume’s language is masculine or feminine is not relevant to the shortlisting process. Instead, it does the complete opposite and uses that language as a reference point for shortlisting, resulting in bias.

Our assumptions about technology and our persistent sci-fi understanding of AI have allowed this error to continue, and the consequences have likely been far larger than we will ever be able to measure.

I believe that a storm is coming for the recruitment and HR industries, and Amazon is the whistleblower. This is an industry-wide problem that needs to be addressed as soon as possible.

The opinions expressed in this article are those of the author and do not necessarily reflect the views of ECT News Network.

Arran James Stewart

Arran James Stewart is the co-owner of blockchain recruitment platform Job.com. Drawing on a decade’s worth of experience in the recruitment industry, he has consistently sought to bring recruitment to the cutting edge of technology. He helped develop one of the world’s first multi-post-to-media-buy talent attraction portals, and also helped reinvent the way job content finds candidates by using matching technology against job aggregation.
