AI in recruitment: are the rewards worth the risks?

AI is now used to automate recruitment more than ever before. AI tools can scan CVs, analyse video interviews, and compare new applicants with successful past ones to spot personality traits that fit the role, allegedly eliminating human bias.

A recent survey by Ceridian showed that 42% of executives worldwide are already using AI in recruitment, and a further 46% plan to do so. Between them, that’s nearly nine in ten.

However, AI recruitment tools are not hypercompetent robot HR managers who can understand abstract hiring criteria the way humans can. All they can do is find patterns: they make statistical links between features of past CVs and interviews and the personality traits a company wants, then look for those same features in new applicants.
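To make that concrete, here is a minimal, purely illustrative sketch in Python (using scikit-learn; the CVs, outcomes, and wording are all invented for the example, not taken from any real tool). A model trained on past hiring decisions simply learns which words correlate with being hired, then scores new applicants by those same correlations.

```python
# Illustrative only: a toy "CV screener" that learns statistical links between
# CV wording and past hiring outcomes, then scores a new applicant.
# The CVs, outcomes, and vocabulary below are invented.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

past_cvs = [
    "led project team executed product launch",
    "captured new clients managed budget",
    "volunteer tutor women's chess club captain",
    "customer service admin support",
]
hired = [1, 1, 0, 0]  # outcomes of past hiring decisions, biases included

vectoriser = CountVectorizer()
X = vectoriser.fit_transform(past_cvs)

model = LogisticRegression()
model.fit(X, hired)

# Score a new applicant: the model only measures resemblance to past hires,
# not actual suitability for the role.
new_cv = ["women's football team captain executed fundraising strategy"]
print(model.predict_proba(vectoriser.transform(new_cv))[0, 1])
```

Nothing in that loop knows what the job actually requires; the model only measures how much a new CV resembles the CVs of people hired before, which is exactly where the problems below creep in.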

This may not be as helpful for recruiters as it seems. The link between behavioural cues and personality is still debated by psychologists. The software also raises ethical concerns about privacy and consent. What’s more, it may actually perpetuate bias: if the successful past applicants are mostly white men, and you tell an AI to find people with similar behavioural cues, there are no prizes for guessing what you’ll get.

In 2018, Amazon scrapped an AI CV-scanner for exactly that reason: trained on applications that were mostly from men, the AI had become sexist. It downgraded CVs that mentioned “women’s” activities and favoured words more commonly used by men, like “captured” and “executed”. Meanwhile, a German study found that AI video analysis could be judging candidates’ “personality traits” on superficial cues like their video background, their hairstyle, and whether they wore glasses.

It’s not yet clear whether AIs can be taught to overcome this kind of bias. Training them to disregard appearance and intonation would defeat the purpose. In fact, since the AI runs on the kind of superficial differences that lead to bias, it could be more useful as a way to spot biases to avoid.

Privacy also raises thorny problems: not all candidates want their personality analysed by a computer, but giving them the choice to opt out, or to select which results get shared with the employer, could reintroduce bias. Candidates might only share flattering results, or interviewers could be less willing to hire those who withheld them.

Ethical AI recruitment demands highly controlled use by specialists who understand the tools. At this point, it’s worth asking whether the time and effort saved are worth the time and effort it will take to make AI recruitment truly ethical. AI analysis of hiring practices, rather than the people being hired, might be a better route to eliminate bias.
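If that route appeals, here is a rough sketch of what analysing hiring practices could look like (Python; the applicant records are invented, and a real audit would need far more statistical and legal care). It computes selection rates by group and flags possible adverse impact using the widely cited “four-fifths” rule of thumb.

```python
# Illustrative only: a simple adverse-impact check on past hiring data.
# The applicant records are invented for the example.
from collections import Counter

applicants = [
    {"group": "men",   "hired": True},
    {"group": "men",   "hired": True},
    {"group": "men",   "hired": False},
    {"group": "women", "hired": True},
    {"group": "women", "hired": False},
    {"group": "women", "hired": False},
]

totals = Counter(a["group"] for a in applicants)
hires = Counter(a["group"] for a in applicants if a["hired"])

# Selection rate per group, compared against the best-performing group.
rates = {group: hires[group] / totals[group] for group in totals}
best = max(rates.values())

for group, rate in rates.items():
    ratio = rate / best
    flag = "possible adverse impact" if ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rate:.0%}, ratio {ratio:.2f} ({flag})")
```

A check like this turns the question around: instead of asking an algorithm to judge candidates, it asks whether the organisation’s own decisions show patterns worth investigating.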
