In answer to a global human talent shortage, artificial intelligence (AI) is increasingly being used by recruiters to source, screen and select talent. While AI might be useful for recruiters, what are the implications of letting AI technologies take over the recruitment process? How is it already impacting the candidate experience?
Nick Sutton, Brunel Australasia’s General Manager and a recruitment expert with over 30 years of experience in the industry, unpacks what the future may look like if ‘the bots’ are in charge of the recruiting process.
An unsettling new world
Picture this. You arrive for your job interview, nervous. After receiving automated rejection letters from the last 10 jobs you applied for, you’ve finally gotten past the mechanised gatekeepers and landed an interview. You hope the interviewer is personable – a bit of chit-chat always helps you to relax and ease into the formal component of the process. You’re ushered into a room where a lone chair has been placed in front of a large screen. There is not a human in sight. This does not bode well. As you take a seat, an electronic voice greets you by name and, without preamble, says, ‘Let’s begin’. There’s not much point in making small talk with a machine, you suppose.
A question flashes across the screen and the electronic voice speaks the first interview question. Care has been taken to make the voice sound friendly, but there’s no escaping the fact that it’s not human. Intelligent, yes, but artificial…also yes. A recording light comes on and a large timer starts a countdown. You know enough about AI to understand that not only are you being recorded, your body language is also being analysed. Beads of sweat form on your forehead and you pray your minor facial twitch doesn’t kick in – a glitch in your hardwiring that comes to the forefront in stressful situations. Most humans don’t notice, but you’re grimly aware this machine won’t miss a thing.
After you stumble through your answer, there is an eerie silence as the final seconds tick down on the clock. Should you say more? What if you run over? There’s no reassuring response, no acknowledgement that your point has been understood, no visual cues to tell you whether or not your answer has hit the mark. You’re still recovering when the next question is displayed, and the arbitrarily allocated countdown begins again…
Sound like a far-fetched scene from some dystopian sci-fi movie? Think again. With increasing AI involvement in the recruitment process, interviews like this are already taking place in various parts of the world.
Accelerating growth of AI in recruitment
The global AI market, valued at $422.37 billion, is forecast to grow at 39.4 per cent between 2022 and 2028. In response to an ongoing talent shortage (one study predicts a global shortfall of 85 million workers by 2030), the recruitment industry is increasingly turning to AI for solutions. It’s easy to understand the appeal. AI is enticing – it offers capabilities that far exceed human capacity, and we are not far from the human elements of recruitment becoming fully automated. But what does it mean for recruiters, clients and candidates when we hand the search, screening and assessment components of recruitment over to AI?
Speed and efficiency…at the cost of the candidate experience?
It’s a competitive world out there when it comes to sourcing talent. With more jobs available than qualified workers to fill them, recruiters must cast the net wider to secure the right person for the job. AI recruiting tools allow for this, broadening the scope of available talent. Hiring teams are now able to access multiple talent pools from a single source – connecting with millions of candidates in a way that was simply not possible in the past. AI can also remove manual aspects of recruitment, such as scheduling and managing candidate pipeline data. It all sounds amazing…but how does this play out for the candidate?
I want to share my experience of being in the job market and applying for a leadership role via SEEK a few years ago. As an experienced recruiter, I carefully worded my resume and was excited by the prospect of an opportunity to transfer the skills and competencies I’d gained leading teams and driving growth in one industry sector to a complementary one. I submitted my application…and was rejected within 20 minutes. I received an automated response thanking me for my application and stating my experience didn’t match the criteria. Of course, we can’t win them all, but to be rejected by AI – not even a recruiter – was deflating to say the least.
In such a tight candidate market, can recruiters really afford to treat applicants this way? An impersonal, automated rejection email is certainly no match for a recruiter picking up the phone. Also, candidates need to consider carefully: how can they get past the first gatekeeper when it’s a machine? Who is educating candidates on how to write their resumes, now that they are more likely to be read in the first instance by a machine, not a human? And who is a talented candidate more likely to seek out – a recruiter who takes the time to develop a connection with them, or a recruitment agency that communicates via AI?
Does AI really remove bias?
Companies are increasingly invested in creating diverse and inclusive workplaces, but is this investment being unintentionally thrown out the window by a bot at the start of the candidate journey? Proponents of AI argue that removing the human element helps to eliminate bias – conscious or unconscious – in the recruiting process…but is this truly the case?
Let’s take a look at algorithms, which are now commonly used to make hiring decisions. People assume that algorithms are unbiased, and while it’s true that they apply their rules consistently, those rules are designed by humans, who carry biases of their own – so are we actually better off? For example, algorithms may screen out applicants with long periods of unemployment, creating bias against parents or those who have suffered illness, injury or disability. Algorithms may also rule out candidates based on incarceration records, the language used in a resume, or even physical distance from a job. These parameters can create biases against qualified candidates who have difficult pasts, come from different cultures, or live in low socio-economic areas. In attempting to eliminate bias by outsourcing recruitment to AI, we are in fact creating new sets of biases that automatically preclude countless qualified applicants.
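To make the point concrete, here is a deliberately simplified sketch of the kind of rule-based screening described above. The rules, field names and thresholds are invented for illustration – they are not drawn from any real screening product – but they show how ‘objective’ filters can quietly encode the biases their designers never intended.

```python
# Hypothetical resume-screening rules, for illustration only.
# Every threshold below is a human choice, and each one silently
# excludes a class of qualified candidates.

def passes_screen(candidate: dict) -> bool:
    """Return True only if the candidate clears every automated filter."""
    # Rule 1: reject any employment gap longer than 12 months.
    # This screens out parents, carers and people recovering from
    # illness, injury or disability.
    if candidate["longest_gap_months"] > 12:
        return False
    # Rule 2: reject candidates living more than 30 km from the job.
    # This disadvantages applicants in outer or low socio-economic areas.
    if candidate["distance_km"] > 30:
        return False
    return True

# A capable candidate who took two years out as a carer is rejected
# before any human ever sees the application.
carer = {"longest_gap_months": 24, "distance_km": 10}
print(passes_screen(carer))  # False
```

Each rule is applied with perfect consistency, which is what makes the system feel ‘objective’ – yet the outcome is determined entirely by the subjective choices baked into the thresholds.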
AI certainly has a place in the recruitment industry and offers numerous benefits. However, it would serve us well to move carefully into a future where bots control the entire recruitment process. At its heart, recruitment is a people business. Replacing humans with AI threatens the very fibre of the industry, removing an essential human component and reducing applicants to a list of skills and qualifications. While we do need speed to market, and can benefit from broader insights into where talent is and what candidates are looking for in their next role, we still need human recruiters for the warmth, connection and relationship building that serves the candidate well and entices the right person to the job. We still need humans to practise compassion, build community and notice the subtleties an algorithm might miss. That is the art of a good recruiter. So, until they can clone us, I think we are safe for now.
About the author
Overseeing Brunel’s West Coast operations since January 2019, Nick Sutton has more than 30 years’ experience in the recruitment industry across the Oil & Gas, Mining, Civils and Infrastructure sectors. He is an accomplished professional with extensive skills in business leadership, stakeholder and organisational engagement, and the development and retention of high-performing talent across industry sectors.