Ever carefully crafted a job application for a role you’re certain you’re perfect for, only to never hear back? There’s a good chance no one ever saw your application, even if you took the internet’s advice to copy-paste all of the skills from the job description.
Employers, especially large companies, are increasingly using artificial intelligence (AI) tools to quickly whittle down applicants into shortlists to help make hiring decisions. One of the most widely used AI tools is an applicant tracking system (ATS), which can filter and rank candidates’ resumes against an employer’s criteria before a human recruiter looks at the best matches.
And the systems are getting smarter: some AI companies claim their platforms can not only pinpoint the most qualified candidate, but also predict which one is most likely to excel in a given role.
“The first thing workers have got to understand is: Nobody is looking at your resume. You have to run the gauntlet of the AI before you get seen [by a recruiter],” says Joseph Fuller, a professor of management practice at Harvard Business School.
While AI hiring tools can save time and money for businesses when all they want to do is fill a job, experts caution that the platforms can overlook qualified candidates, and even introduce new biases into hiring processes, if they’re not carefully used.
Meanwhile, human job seekers are usually in the dark about exactly which AI tools are being used and how their algorithms work, prompting frustrated searches for advice on how to “beat” recruitment software, much of it only scratching the surface.
AI can ‘hide’ prospective workers
Last year, Fuller co-authored research into “hidden workers”: applicants who are overlooked by companies due to their hiring practices, including their use of AI tools.
The researchers interviewed more than 2,250 executives across the United States, United Kingdom and Germany. They found more than 90 per cent of companies were using tools like ATS to initially filter and rank candidates.
But they often weren’t using them well. Sometimes, candidates were scored against bloated job descriptions filled with unnecessary and inflexible criteria, which left some qualified candidates “hidden” below others the software deemed a better fit.
Depending on how the AI was configured, it could down-rank or filter out candidates over factors such as a gap in their career history, or their lack of a university degree, even when the role didn’t require post-secondary education.
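The rigid screening described above can be pictured with a toy sketch. The field names, thresholds and logic below are hypothetical, invented for illustration; they are not taken from any real ATS product.

```python
# A simplified, hypothetical illustration of rigid ATS-style filtering.
# Hard cutoffs on a career gap or a degree can exclude qualified people
# before a recruiter ever sees them.

def rigid_ats_filter(candidates, require_degree=True, max_gap_years=1):
    """Drop candidates on inflexible criteria; return the surviving shortlist."""
    shortlist = []
    for c in candidates:
        if require_degree and not c["has_degree"]:
            continue  # excluded even if the role doesn't truly need a degree
        if c["career_gap_years"] > max_gap_years:
            continue  # a caregiving or retraining gap "hides" the candidate
        shortlist.append(c)
    return shortlist

candidates = [
    {"name": "A", "has_degree": True,  "career_gap_years": 0},
    {"name": "B", "has_degree": False, "career_gap_years": 0},  # qualified, no degree
    {"name": "C", "has_degree": True,  "career_gap_years": 3},  # qualified, career gap
]

print([c["name"] for c in rigid_ats_filter(candidates)])  # prints ['A']
```

Only candidate A survives, even though B and C may be equally able to do the job, which is exactly the "hidden workers" effect the researchers describe.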
“Ninety-plus per cent of companies just acknowledge, ‘We know that this process excludes qualified people,'” Fuller told CBC News.
Those overlooked candidates included immigrants, veterans, people with disabilities, caregivers and neurodiverse people, among others, he added.
The researchers urged employers to write new job descriptions, and to configure their AI to include candidates whose skills and experiences met a role’s core requirements, rather than excluding them based on other criteria.
The new rules of (AI) hiring
The U.S. government has issued guidance to employers about the potential for automated hiring software to discriminate against candidates with disabilities, even when the AI claims to be “bias-free.”
And from April of this year, employers in New York City will have to tell candidates and employees when they use AI tools in hiring and promotion, and audit those technologies for bias.
While Canada’s federal government has its own AI use policy, there are no rules or guidance for other employers, although legislation currently before Parliament would require creators and users of “high-impact” AI systems to adopt measures to mitigate harm and bias (details about what is considered “high-impact” AI haven’t yet been spelled out).
So for now, it’s up to employers and their hiring teams to understand how their AI software works, and any potential downsides.
“I advise HR practitioners they have to look into and have open conversations with their vendors: ‘OK, so what’s in your system? What’s the algorithm like? … What is it tracking? What is it allowing me to do?’” said Pamela Lirio, an associate professor of international human resources management at the Université de Montréal.
Lirio, who specializes in new technologies, says it’s also important to question who built the AI and whose data it was trained on, pointing to the example of Amazon, which in 2018 scrapped its internal recruiting AI tool after discovering it was biased against female job applicants.
The system had been trained on the resumes of past applicants, who were overwhelmingly men, so the AI taught itself to down-rank applicants whose resumes mentioned competing in women’s sports leagues or graduating from women’s colleges.
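The mechanism behind that failure is easy to reproduce in miniature. The toy scorer below is not Amazon’s actual system; the training resumes and scoring rule are invented. It simply rewards tokens by how often they appeared in past hires’ resumes, so terms absent from a historically skewed record, such as “women’s,” earn no credit.

```python
# A toy illustration of bias inherited from skewed training data:
# a scorer that learns only from past hires rewards whatever those
# resumes had in common and gives nothing to terms never seen there.

from collections import Counter

past_hires = [  # hypothetical, historically skewed training data
    "captain men's rugby team software engineer",
    "software engineer men's chess club",
    "software engineer hackathon winner",
]

weights = Counter()
for resume in past_hires:
    weights.update(resume.split())

def score(resume):
    # Tokens unseen in the (skewed) history contribute zero.
    return sum(weights[t] for t in resume.split())

print(score("software engineer women's chess club captain"))  # prints 9
print(score("software engineer men's chess club captain"))    # prints 11
```

The two resumes differ by a single word, yet the second scores higher purely because “men’s” appears in the hiring history, which is the pattern Amazon discovered and Lindsay warns other companies could replicate.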
As AI becomes smarter and more attuned to the kinds of candidates an employer likes, based on who they’ve hired in the past, companies run the risk of replicating Amazon’s mistake, says Susie Lindsay, counsel at the Law Commission of Ontario who has researched the potential regulation of AI in Canada.
“If you quite simply are going to use a hiring tool for looking at resumes, or even look at a tool for your existing employees to decide who to promote, and you’re looking at who’s been successful to date, you’re … not giving the opportunity for people who don’t fit that exact model to potentially advance,” Lindsay said.
Can you actually ‘beat’ hiring bots?
Do a web search for “how to beat ATS” and you’ll find thousands of results, including from professional resume writers and online tools offering tips to help stuff your resume with the right keywords to get past the AI and onto a recruiter’s desk.
But keywords are only one of many data points that increasingly advanced AI systems use. Others include the names of companies you’ve worked for in the past, how far into your career you are, and even how far you live from the organization you’re applying to.
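To see why keyword-stuffing alone has limited payoff, consider a sketch of multi-signal ranking. The features mirror the data points mentioned above, but every weight, field name and formula here is an invented assumption, not how any real vendor’s ranking works.

```python
# A hypothetical multi-signal ranking score: keyword overlap is just one
# input alongside seniority fit and proximity. All weights are invented.

def rank_score(candidate, job):
    keyword_overlap = len(set(candidate["skills"]) & set(job["skills"]))
    seniority_fit = 1.0 if candidate["years_experience"] >= job["min_years"] else 0.0
    proximity = 1.0 / (1.0 + candidate["distance_km"] / 50)  # decays with distance
    return 2.0 * keyword_overlap + 3.0 * seniority_fit + 1.0 * proximity

candidate = {"skills": ["python", "sql"], "years_experience": 5, "distance_km": 25}
job = {"skills": ["python", "sql", "aws"], "min_years": 3}
print(rank_score(candidate, job))
```

In a setup like this, adding one more matching keyword nudges a single term of the sum, while the other signals, which an applicant can’t easily fake, still dominate the ranking.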
“With a proper AI system that’s able to understand the context of the skill and the relationships between the skills, [keyword-stuffing] is just not as fruitful as it used to be,” says Morgan Llewellyn, chief data scientist at recruiting technology company Jobvite.
Instead of trying to fool the algorithm, experts recommend applying for jobs that fit the skills, knowledge and experience you really do have, keeping in mind that a human always makes the final decision.
“Even if you put this keyword, OK, well, what have you done? What was your job function, job title that you’ve done in the past?” says Robert Spilak, vice-president at ATS provider TalentNest.
“You should meet the requirements [of the role]. If you don’t meet any of those, then of course, [a] human or some automation will filter you out.”