While many people might be happy to have AI make suggestions around which film to watch next, or to recommend other music they may enjoy, handing it responsibility for the selection process in their next job application can raise understandable concerns.
In this blog post, we take a closer look at the pitfalls and the potential of using AI in recruitment, and where it might add value rather than systemize bias.
There's no denying that in the modern world, AI has a lot of great uses. Across a range of industries, it's increasingly being used to streamline workflows and cut costs. In fact, in the recruitment world, AI, and more specifically machine learning, has been used in various forms for some time now.
Common applications include automated interviews and resume (or CV) screening, as well as administrative duties such as identifying duplicate candidate records. But the truth is, there's a lot we don't know about how AI is being used in recruitment. And that's a concern.
For the recruitment business, where AI can really shine is in crawling talent pools for potential candidates based on specific job criteria such as qualifications and experience. It can also learn broadly what kind of candidate is a good fit for a particular role. However, its real potential benefits come from automating some of the repetitive and time-consuming work which goes along with screening, assessing, and scheduling.
Even as far back as 2017, studies showed that hiring decision-makers saw the cost- and time-saving potential of AI, alongside its ability to deliver better candidate matches. Interestingly, 43% of decision-makers also believed that it would remove human bias. And that's exactly where AI has come unstuck: when it's been used to replace human decision-making in an effort to speed up the hiring process.
When we talk about AI in recruitment, we really mean machine learning: computer systems that learn from statistical models and data sets. As Amazon discovered in 2015, using historical human data to inform AI decision models can come at a cost to both potential candidates and their future employers. Their experimental hiring tool, which made use of machine learning, was informed by ten years of resumes submitted to the company.
The issue was that the computer system was essentially given a decade of data revealing male dominance in the tech industry and adapted its algorithms accordingly. The result? Resumes containing the word "women" were automatically downgraded. When this "algorithmic bias" was discovered, interventions were made to make the system gender-neutral, but lingering concerns remained that unintended discrimination might occur in the future, raising important questions about how to ensure that recruitment AI is transparent and fair.
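To make the mechanism concrete, here is a minimal, hypothetical Python sketch. It is not Amazon's actual system, and the resume snippets, labels, and model choice are all invented for illustration: a simple classifier is trained on historically skewed hiring outcomes, and it ends up assigning a negative weight to the word "women", a term that says nothing about a candidate's ability.

# A toy illustration of algorithmic bias: the model inherits whatever
# pattern the historical hiring data contains, including gender bias.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Invented historical data: resume snippets and whether the (historically
# male-dominated) process resulted in a hire.
resumes = [
    "software engineer java python cloud",
    "backend engineer c++ distributed systems",
    "software engineer python, women's coding club mentor",
    "java developer, women in tech chapter lead",
]
hired = [1, 1, 0, 0]  # the skewed historical outcomes the model learns from

# Turn the text into word counts and fit a simple classifier to the outcomes.
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# The learned weight for the token "women" comes out negative: the model has
# absorbed the historical bias, not a real signal about candidate quality.
idx = vectorizer.vocabulary_["women"]
print("weight for 'women':", model.coef_[0][idx])

The point of the sketch is that nothing malicious is coded anywhere; the bias arrives entirely through the training data, which is exactly why it can go unnoticed until someone audits the model.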
In fact, transparency over the use of AI in recruitment is one of the biggest challenges currently facing the industry. That's where new legislation is playing a crucial role.
A game changer in terms of the use of AI in recruitment, New York City's AI employment law restricts the use of "automated employment decision tools" and requires recruiters to put such tools through an annual bias audit to catch potential bias in their systems.
Recruiters will also have to be transparent about where AI has been used in the recruitment process, and even provide candidates with alternative options for processing their applications. Undisclosed or biased use of AI in the hiring process will also be subject to penalty fines.
Taking effect January 1, 2023, New York's new law blazes the trail for further legislation to tackle the inherent bias and potential discrimination generated by machine learning algorithms.
Similar conversations are taking place in the European Union, where the AI Act is set to institute self-certification programs and government oversight. This law will create transparency requirements for AI systems that interact with people, and will attempt to ban a few "unacceptable" uses of AI systems. Individuals or companies located within the European Union, placing an AI system on the market in the European Union, or using an AI system within the European Union would be subject to the regulation.
It's easy to be dismissive of AI in recruitment. After all, it's a people business. But there are some clear use cases where it can make trustworthy and time-saving improvements to the placement process. AI can also be trained to trawl records and match specific criteria. For more common roles requiring limited experience or technical skills, there is perhaps wider scope for AI to assist with search and selection. The real danger arises when AI trained on imperfect data, and therefore inherently biased, is used for key decision-making.
Conceivably, AI could be used to handle the entire recruitment process, which perhaps less scrupulous recruiters might find appealing. But for most recruiters there's a sense of professional pride, and a human aspect to hiring (building relationships with employers and candidates) which AI simply can't replace.
As an example, there could be three equally well-qualified candidates in the running for an open position. Their education, qualifications, and experience all fit the bill, but what about their cultural fit? A good recruiter understands the importance of aligning the employer's value proposition with the right candidates. AI can easily miss the subtle verbal and non-verbal nuances which influence human decision-making. Humans can also see beyond the data, especially when you're looking for someone with transferable skills from outside a particular industry. Good recruiters know that instinct, as much as anything else, comes into play when making the right hire.
There's also the candidate's perspective to consider. They may have a (sometimes valid) suspicion of any technology used to influence the hiring process. For example, even though Applicant Tracking Systems (ATS) have been around for decades, you can still find plenty of blog posts and guides on "how to beat the ATS". Fears run even deeper with undisclosed uses of AI, particularly when it comes to diversity and inclusion in the workplace. As the Amazon experiment revealed, machine learning opens up the potential for discriminatory algorithms, and runs the inherent risk of excluding the candidates you really want in favor of the ones it thinks you want.
Although there are some tasks and activities which can be comfortably replaced by AI, when it comes to key decision-making, sourcing the right talent and identifying the best-fit candidate are best left in the hands of humans. But how can you speed up the time to hire without using AI?
Recruitment automation can take care of repetitive, repeatable everyday tasks and give recruiters more time to spend with candidates and employers. It's particularly helpful for keeping all parties updated through emails, form letters, and notifications, cutting down the time needed to manually create correspondence.
PCRecruiter is an ATS/CRM hybrid which offers recruiters a huge range of opportunities to automate certain aspects of their recruitment process, making workflows more streamlined and cost-effective. Tailored to specific workflows, it's a seamless, customizable, and powerful tool for handling your end-to-end talent sourcing and placement.
Want to learn more about the benefits of using automation in recruitment? Read our blog post 5 Ways To Use Automation In Recruitment.