
AI can help diversity recruiting, but ask these questions first.

LAST UPDATED: January 22, 2024

One trend that gained momentum in 2023 is the use of artificial intelligence (AI) within diversity recruiting. A growing number of AI-based tools are now available that can, in theory, remove unconscious bias and streamline the recruiting process, ultimately helping companies increase diversity or identify why candidates may be leaving their talent pipeline.

I get it. I am continuously talking with recruiting software vendors to learn about the newest products and services in the diversity recruiting space. So many new solutions pop up daily that it can be enough to make your head spin.

One straightforward example is GapJumpers, an AI-powered platform that removes identifying information from resumes to help eliminate unconscious bias in the initial screening process. There are also more complex solutions, such as HireVue, an AI-based interviewing platform that uses machine learning and predictive analytics to assess candidate responses and provide feedback on their potential fit for the role.

Algorithms, AI, and data analysis are now helping businesses find the right internal and external talent to fill positions and decide how to manage and develop their employees. The right AI tools can assess candidates' and current employees' skills, experience, and other specified criteria while excluding characteristics that could introduce bias. Theoretically. But that is not always the case.

You can use AI to make your job postings more visible to potential applicants, speed up initial screening and interview scheduling, and remove human bias to help you bring in pre-vetted candidates from underrepresented groups. But whenever I talk with HR, Talent Acquisition, and DEI leaders, I feel obligated to raise some of the challenges I have with AI. So before you go all in, here are three questions I recommend you ask yourself first.

Question #1: How does your workplace intend to use AI?

AI is too often used as a replacement for the inner work that the people doing the hiring need to do. You can remove names all you want and create blind resumes, but eventually the hiring team will still have to face their own biases at other milestones in the interview process. Do not use AI as a replacement tool. AI is one part of a multi-faceted solution.

Another component is that inclusive hiring training still needs to happen for every single person on the front lines to effectively increase diversity. This includes, but is not limited to, your recruiters, hiring managers, interview teams, and workplace ambassadors. 

Question #2: Who programs and creates the AI? What are they doing to mitigate bias? 

Despite the good intentions behind it, AI can be biased too. Think about it. Who programs and creates the AI software? Humans. As a result, bias may be baked into the design. Therefore, it is crucial that you ask exactly what the creators of the software are doing to mitigate their own bias.

Is the organization from which you are making your purchase doing its own internal work to reduce bias? If not, that will be reflected in their software design, ultimately increasing the probability of bias being built into it. That said, this does not exempt your team or your organization from holding yourselves accountable and doing the work, as mentioned above. Part of that work is regularly assessing the outcomes.

Question #3: How will you audit for impact? 

Although AI can be a supportive tool, you still need to monitor your ATS data to find out who is getting ahead in your hiring process, who is getting left behind, and why. Do not underestimate the WHY. Consistently auditing for impact is how you identify and mitigate bias, and how you provide the checks and balances that ensure your AI is working as effectively as you had hoped.

Look at your data and see if the software is delivering what it is supposed to. Is it helpful? Is it adding value? Or does your tool have side effects that you were not aware of? Depending on the answers to those questions, you can better eliminate bias and build a more inclusive and equitable hiring process within your organization. 
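
If it helps to make "auditing for impact" concrete, here is a minimal sketch of the kind of pass-through analysis described above. It assumes a hypothetical ATS export with one row per candidate and made-up column names ("group" for the candidate's self-identified demographic group, "stage_reached" for the furthest stage they got to); your own system's fields and stage names will differ.

    # A minimal sketch, assuming a hypothetical ATS export with one row
    # per candidate and made-up columns "group" and "stage_reached".
    import pandas as pd

    # Illustrative pipeline stages, ordered from first to last.
    STAGES = ["applied", "screened", "interviewed", "offered", "hired"]

    def pass_through_rates(ats_export: pd.DataFrame) -> pd.DataFrame:
        """Share of each group's applicants who reached each stage."""
        order = {stage: i for i, stage in enumerate(STAGES)}
        # For every stage, flag whether the candidate got at least that far.
        reached = pd.DataFrame({
            stage: ats_export["stage_reached"].map(order) >= i
            for i, stage in enumerate(STAGES)
        })
        counts = reached.groupby(ats_export["group"]).sum()
        # Normalize by each group's applicant count so groups of different
        # sizes can be compared stage by stage.
        return counts.div(counts["applied"], axis=0).round(2)

    # Example usage with an anonymized export:
    # df = pd.read_csv("ats_export.csv")
    # print(pass_through_rates(df))

A table like this makes it easier to spot the stage where candidates from a given group disproportionately drop out, which is exactly the "why" worth digging into.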

It is important to note here that I am not saying do not use AI. What I highly recommend is that you are thoughtful when deciding which software to use or which company to partner with. Just as you are doing the work and holding yourself accountable, asking these tough questions enables you to hold them, the creators of the artificial intelligence software, accountable as well. Only then will we be able to see true change within the hiring system as a whole.

Download our worksheet here to help you assess the risks and advantages of integrating AI tools into your hiring process to build a more diverse workplace.