Why reducing AI bias in hiring is essential

November 8, 2019 | 4 minute read


AI is expected to unleash a massive wave of digital transformation in the business sphere, especially where hiring is concerned. HR pros will likely use AI to supercharge and optimize their recruitment processes to match exceptional candidates with open positions in record time. However, reducing AI bias in hiring will be essential to cultivating such a high-performing talent pipeline. Here’s how companies can begin addressing this challenge as they adopt new IT in hiring.

How AI will transform recruitment

Finding the right candidate for a position can be a time-consuming and painstaking process, so it’s no wonder that HR pros are looking to AI for assistance. AI can help recruit passive qualified candidates by drawing in talented professionals who have the right expertise for the job but may not be actively searching for their next gig. Digital HR tools could even infer skills based on experience listed on a résumé, automatically providing valuable insight to human hiring managers.

AI tools can also screen candidates for potential fit and vet résumés in a fraction of the time it would take a human. In addition, AI can use sentiment analysis to assess a candidate’s strengths and challenge areas and leverage predictive analytics to determine how successful a candidate’s tenure would be if they were hired. Automated helpers can even streamline the tasks involved in coordinating candidate interviews or keep applicants apprised of how the process is going via chatbots.
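
To make the résumé-vetting piece concrete, here is a purely illustrative Python sketch of the kind of screening step these tools automate: scoring a résumé against a job’s required skills. The skill list and résumé text below are hypothetical, and commercial recruiting platforms rely on far richer natural language processing and machine learning than simple keyword matching.

```python
# Illustrative only: score a résumé by how many required skills it mentions.
# Skill list and résumé text are hypothetical; real tools use richer NLP.
REQUIRED_SKILLS = {"python", "sql", "data analysis", "stakeholder communication"}

def score_resume(resume_text: str, required_skills: set[str]) -> float:
    """Return the fraction of required skills mentioned in the résumé."""
    text = resume_text.lower()
    matched = {skill for skill in required_skills if skill in text}
    return len(matched) / len(required_skills)

resume = """Data analyst with 5 years of experience in SQL and Python,
leading data analysis projects for marketing teams."""

print(f"Fit score: {score_resume(resume, REQUIRED_SKILLS):.2f}")  # 0.75 here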

Businesses that successfully tap HR technology are already enjoying impressive outcomes. According to Josh Bersin’s HR Technology Market 2019 report, a recent Bersin by Deloitte study found that companies leveraging advanced technology for recruiting achieve significant improvements in time to hire, quality of hire, and cost to hire. With more abundant resources at their fingertips thanks to innovative IT in hiring, HR can focus on other initiatives that enhance the company’s attractiveness to potential applicants.

Concerns around AI bias in hiring

Serious concerns about AI bias in hiring must be addressed before HR professionals can fully rely on the technology. As with so many AI initiatives, the quality and type of historical data used directly influence how a system evaluates candidates. If that data reflects biased assessments from human hiring managers or a history of hiring predominantly non-diverse candidates, for example, it can steer the AI recruiting process in the same direction. Even AI recruitment tools that don’t appear to display any prejudices at first can pick them up as they vet candidates, with nobody the wiser as the talent pool quickly drains.
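
To see how that can happen, here is a minimal sketch using entirely synthetic, hypothetical data: historical hiring decisions that penalized one group become the training labels, and even though the protected attribute is excluded from the model’s inputs, a correlated proxy feature lets the bias back in.

```python
# Minimal sketch (synthetic, hypothetical data) of how biased historical
# hiring labels can leak into a screening model even when the protected
# attribute itself is never used as a feature.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

group = rng.integers(0, 2, n)          # demographic group: 0 or 1
skill = rng.normal(0, 1, n)            # true qualification, identical across groups
proxy = group + rng.normal(0, 0.5, n)  # feature correlated with group membership

# Historical decisions: driven by skill, but penalizing group 1 (past bias).
hired = (skill - 0.8 * group + rng.normal(0, 0.5, n)) > 0

# Train only on "neutral" features -- the protected attribute is dropped.
X = np.column_stack([skill, proxy])
model = LogisticRegression().fit(X, hired)

# Equally skilled candidates still receive different scores by group,
# because the proxy feature carries the historical bias forward.
scores = model.predict_proba(X)[:, 1]
print("mean score, group 0:", scores[group == 0].mean())
print("mean score, group 1:", scores[group == 1].mean())
```

The takeaway: simply dropping the sensitive attribute doesn’t scrub bias out of the historical labels a model learns from.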

As The Verge reports, Amazon scuttled a secret AI initiative to vet candidates after the system began consistently downgrading female candidates. Even worse, those involved with the project reportedly could not guarantee that additional biases had not crept into the design. Although Amazon insists that it never used the AI tool to vet real-world candidates, it’s not hard to see how quickly the revelation of biased AI-driven hiring practices could have led to a lawsuit and major damage to the retail behemoth’s corporate image.

There’s another worrying knock-on effect HR managers should consider: if AI tools discriminate against diverse candidates, the damage can snowball into future recruitment. Brilliant potential candidates who look at a company’s top leadership and see only white men represented there may decide the organization isn’t committed to diversity and never apply. Before long, the talent pipeline could be in bad shape.

How to prevent AI bias in hiring

One way to prevent AI bias in hiring is to ensure that certain demographic groups are neither under-represented nor over-represented in the data your recruitment systems use, since skewed data sets could end up disqualifying excellent candidates who happen to be diverse. As HR Dive notes, HR professionals should pointedly question their service providers on exactly how they have trained their systems to eliminate human bias. Some companies are also using apps like Blendoor, which anonymously matches them with qualified candidates they might otherwise overlook, to reduce unconscious bias in their hiring processes.
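
A practical starting point is auditing the historical data before any model is trained. The sketch below (file name and column names are hypothetical) summarizes each group’s share of the applicant pool and its historical selection rate, flagging groups that fall below the commonly used four-fifths threshold.

```python
# Sketch of a pre-training audit of historical recruiting data.
# File and column names ("group", "hired") are hypothetical placeholders.
import pandas as pd

# "group" is a demographic attribute collected for auditing purposes;
# "hired" is the historical outcome (1 = hired, 0 = not hired).
candidates = pd.read_csv("historical_candidates.csv")

summary = candidates.groupby("group")["hired"].agg(
    applicants="size", selection_rate="mean"
)
summary["share_of_pool"] = summary["applicants"] / summary["applicants"].sum()

# A common screening heuristic (the "four-fifths rule"): flag groups whose
# selection rate falls below 80% of the highest group's rate.
summary["impact_ratio"] = summary["selection_rate"] / summary["selection_rate"].max()
summary["flag"] = summary["impact_ratio"] < 0.8

print(summary)
```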

Because the recruitment process involves sensitive personal data, companies should be as open as possible with candidates about how their data is being used. It’s also crucial to have proper data security measures in place to prevent a breach. One way to do this is to strengthen endpoint security, particularly in commonly overlooked areas like the print environment. By reducing AI bias and using transparent, secure practices to process candidate data, companies can ensure a healthy talent pipeline for years to come.
