How to take diversity into account when recruiting through AI
In conversations about diversity and inclusion in the workplace, the practical benefits of a diverse workforce can get lost in the talk. Consider the numbers: the 20 most diverse companies in the S&P 500 achieve higher long-term profitability than their less diverse peers. Companies with top-quartile leadership diversity are more likely to outperform on profitability and value creation. And by 2025, advancing gender equality in the workplace alone could add $12 trillion to global GDP. Despite these clear benefits, 48% of organizations are either not on track to meet their diversity goals or have no goals at all.
How can companies struggling with diversity improve their scores? The first step is to recognize human bias in recruiting. On average, recruiters spend just 7 seconds reviewing an individual résumé. In that short window, they often rely on snap judgments that can be swayed by similarity bias, contrast effects, and more. Human recruiters may not have the time to review each application in depth, but they should be aware of the subjectivity that human review brings to the hiring process.
Is software the solution? 90% of large companies and 68% of small businesses use applicant tracking systems, but recruiting software isn’t immune to bias either. Software that finds résumés by keyword matching may reward a candidate’s ability to write a keyword-rich résumé rather than reflect their actual qualifications; searching for synonyms solves part of the problem, but not all of it. Amazon offers a cautionary example of recruiting software gone wrong. In 2018, the company scrapped its state-of-the-art recruiting AI after the system taught itself to penalize résumés that contained the word “women’s” or listed all-women’s colleges. AIs are not programmed with human prejudice; they learn it by processing data and recognizing patterns, and if the data they are given is biased, the results reflect that bias. Amazon’s AI was trained on a decade of résumés and hiring decisions, so it picked up the existing biases in the screening process and amplified them.
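To make the keyword-matching pitfall concrete, here is a minimal sketch. The scorer and the keyword list are hypothetical, not taken from any real applicant tracking system; the point is only that a count of keyword hits rewards keyword-stuffing, not qualification.

```python
# Hypothetical keyword-based résumé scorer (illustrative, not a real ATS).
# The score simply counts which target keywords appear in the text, so a
# résumé stuffed with the right words outranks an equally qualified one
# that describes the same skills in synonyms.

KEYWORDS = {"python", "aws", "agile"}  # assumed keywords from a job posting


def keyword_score(resume_text: str) -> int:
    """Count how many target keywords appear in the résumé text."""
    words = set(resume_text.lower().split())
    return len(KEYWORDS & words)


stuffed = "python python aws agile agile cloud"             # keyword-rich
qualified = "built cloud services on amazon ec2 with django"  # synonyms only

print(keyword_score(stuffed))    # matches all three keywords
print(keyword_score(qualified))  # matches none, despite similar skills
```

Synonym expansion would catch “amazon ec2” for “aws”, but as the text notes, no synonym list covers every way a candidate can phrase genuine experience.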
What can be done to improve the situation? 81% of HR professionals admit that their current diversity practices are average or worse, and many are unsure how to train an AI without prejudice. On the data side, ways to correct what an AI examines include collecting data from a range of industries, jobs, and candidates; removing factors such as age, gender, and name from the initial screening; and evaluating how well a candidate and a company fit together, from both perspectives. On the human side, important steps include setting clear, trackable improvement goals; establishing a standard for diversity training and opportunities to learn from colleagues; building strategic partnerships with external organizations, schools, and universities; and ensuring that leadership and policies support diversity and inclusion at every level.
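The step of removing factors such as age, gender, and name from the initial screening can be sketched as a simple redaction pass over a candidate record. The field names below are illustrative assumptions, not from any real system.

```python
# Hypothetical blind-screening sketch: strip bias-prone fields from a
# candidate record before reviewers or a model see it. Field names are
# illustrative only.

SENSITIVE_FIELDS = {"name", "age", "gender"}


def redact(candidate: dict) -> dict:
    """Return a copy of the record without bias-prone fields."""
    return {k: v for k, v in candidate.items() if k not in SENSITIVE_FIELDS}


candidate = {
    "name": "Jane Doe",
    "age": 29,
    "gender": "female",
    "skills": ["python", "sql"],
    "experience_years": 5,
}

print(redact(candidate))  # only skills and experience remain
```

In practice, free-text fields (a cover letter mentioning a name or an all-women’s college) also need redaction, which is exactly the harder problem Amazon’s system exposed.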
Overcoming recruiting bias can help a company thrive. And in the words of Lena Waithe, Emmy Award-winning writer, producer, and actress, “The only way to really see change is to help shape it.”