
How AI and machine learning can embed recruitment bias

Tue 7 Jun 2022

In many countries around the world, the employment market has shifted firmly in favour of job seekers. With companies needing to fill an increasing number of vacancies, applicants with in-demand skill sets are in a uniquely strong position to negotiate. Yet, even in this buoyant job market, some applicants are facing unexpected barriers to getting the job they want.

Algorithmic bias occurs when AI solutions make decisions based on biased inputs, creating systematically unfair outcomes for applicants. Because an AI system is only as good as the data fed into it, training on poor-quality data increases the chance of unfair decisions.
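To see how this happens, consider a minimal, hypothetical sketch: a screening model trained on historical hiring decisions that were themselves skewed against one group will reproduce that skew, even for equally skilled candidates. The data, feature names and model choice below are illustrative assumptions, not taken from any real recruitment system.

```python
# Minimal sketch of biased training data producing biased decisions.
# All data here is synthetic and purely illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Candidate skill is the only factor that *should* matter.
skill = rng.normal(size=n)
# A demographic attribute that should be irrelevant (0 or 1).
group = rng.integers(0, 2, size=n)

# Historical decisions were biased: group-1 candidates needed noticeably
# higher skill to be hired in the past.
hired_historically = (skill - 0.8 * group + rng.normal(scale=0.3, size=n)) > 0

# Train on the biased labels, with the demographic attribute as a feature.
X = np.column_stack([skill, group])
model = LogisticRegression().fit(X, hired_historically)

# Compare two candidates with identical, average skill.
for g in (0, 1):
    candidate = [[0.0, g]]
    p = model.predict_proba(candidate)[0, 1]
    print(f"group {g}: predicted chance of advancing {p:.0%}")
```

Run on this synthetic data, the model gives the equally skilled candidate from the historically disadvantaged group a markedly lower predicted chance of advancing, purely because the labels it learned from were biased.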

With a survey from Tidio finding that 85% of recruiters believe that AI will replace some elements of the hiring process, it’s vital that all enterprises that use AI-backed HR technologies are fully aware of the potential issues around algorithmic bias.

Built-in biases

Countless high-profile examples of algorithmic bias in recruitment have been reported in recent years. For example, tech giant Amazon abandoned an AI-based recruitment solution after it was found to have taught itself that male candidates were preferable to female applicants.

A study published by Harvard Business Review also found evidence of AI-enabled anti-Black bias in recruiting. The report, titled The Elephant in AI, found that almost one in three respondents had been offered job alerts below their current skill level, and 40% had received recommendations based on their identities rather than their actual qualifications.

It’s not hard to see why HR professionals are actively looking for ways to incorporate advanced technologies to improve the hiring process. A survey of talent acquisition leaders found that 52% consider screening candidates from a large pool of applicants to be the most difficult aspect of recruitment.

Paired with the fact that hiring managers often take just seconds to look at a CV before making a decision, the effective use of AI could support this process and allow HR professionals to focus their time on assessing applicant information.

Essential audits

The extent to which HR professionals dismiss AI recruitment bias as a non-issue suggests that some in the industry are not sufficiently aware of the challenges raised by the widespread introduction of AI solutions in the recruitment process.

According to the Tidio survey, 59% of HR respondents believe that introducing AI to the recruitment process will remove unintentional bias, with only 14% disagreeing. In reality, AI can just as easily entrench bias, and regular auditing of AI recruitment solutions is a powerful way to ensure that no protected groups are being discriminated against.

For example, if an audit found that an unexpectedly large percentage of a certain group was being rejected by the AI before reaching the interview stage, it would be worth examining the reasons manually. In practice, this means human oversight of the rejected CVs to check whether those applications were turned down for fair and genuine reasons.
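As a rough illustration, the sketch below computes selection rates per group from hypothetical screening outcomes and flags any group whose rate falls below 80% of the best-performing group's rate, a common rule of thumb (the "four-fifths rule") for spotting possible adverse impact. The column names and figures are illustrative assumptions, not a prescribed audit standard.

```python
import pandas as pd

# Hypothetical screening outcomes exported from an applicant tracking system.
outcomes = pd.DataFrame({
    "group":    ["A"] * 200 + ["B"] * 200,
    "advanced": [True] * 120 + [False] * 80 + [True] * 70 + [False] * 130,
})

# Selection rate per group: share of applicants who reached the interview stage.
rates = outcomes.groupby("group")["advanced"].mean()
print(rates)

# Flag groups whose selection rate is below 80% of the highest group's rate.
ratio = rates / rates.max()
flagged = ratio[ratio < 0.8]
if not flagged.empty:
    print("Manual review suggested - selection-rate ratio below four-fifths:")
    print(flagged)
```

A check like this only flags where to look; the human review of rejected applications described above is still what determines whether the decisions were actually fair.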

However, even undergoing an audit does not guarantee that AI decision-making will be accurate, as there is currently no widely recognised industry standard for what an AI audit should encompass. Leading auditors conduct audits in very different ways: some use computer scientists to test back-end processes, whereas others use human consultants to assess biases.

The growing use of AI in the recruitment pipeline is showing no signs of slowing down anytime soon. While there are clear challenges that need to be overcome before AI-backed recruitment solutions can be utilised more widely in businesses, it is a positive development that more and more organisations are understanding the importance of removing biases from the recruitment process.


Tags:

AI