A Blog by Jonathan Low


May 6, 2024

Why HR Is Struggling With Unintended Consequences Of AI In Hiring

Be careful what you wish for. AI was supposed to reduce costs and increase productivity in the hiring process, especially for HR recruiters. But then the applicant pool embraced it.

HR departments are reporting exhaustion and frustration not just with the surge in applicant volume that AI has created, but with the need to double-check discrepancies, claims and outright fraud on AI-generated applications, to say nothing of the technology's insensitivity to bias. There is, as well, a need for more flexibility in recognizing when a candidate who lacks one specific skill, and whose application is rejected by AI for that reason, may actually be the best person for the job. The upshot is that, at least for now, AI may be increasing workload and degrading productivity until HR professionals and others responsible for hiring get a better handle on how best to use it. JL

Amanda Hoover reports in Wired:

There’s a disconnect in the online job hunt: “Everyone has a job they’re offering. Everyone is looking for a job. No one is getting it.” It’s not uncommon to now receive hundreds or thousands of applicants. Rounds of layoffs since 2022 have sent a mass of skilled tech workers job hunting, and the wide adoption of generative AI has upended the recruitment process, allowing people to bulk apply to roles. (But) those eager for work are hitting a wall: overwhelmed recruiters and hiring managers. Much is still unknown about how and why AI makes the choices it does, and it has a history of bias. (Recruiters) want to understand why AI makes the decisions it does, and (need) more room for nuance: Not all qualified applicants are going to fit into a role perfectly.

So far, over 3,000 people have applied to one open data science vacancy at a US health tech company this year. The top candidates are given a lengthy and difficult task assessment, which very few pass, says a recruiter at the company, who asked to remain anonymous because they are not authorized to speak publicly.

The recruiter says they believe some who did pass may have used artificial intelligence to solve the problem. There was odd wording in some submissions, the recruiter explains; others disclosed using AI; and in one case, when the person moved on to the next interview, they couldn’t answer questions about the task. “Not only have they wasted their time, but they wasted my time,” says the recruiter. “It’s really frustrating.”


It’s not uncommon for tech roles to now receive hundreds or thousands of applicants. Round after round of layoffs since late 2022 have sent a mass of skilled tech workers job hunting, and the wide adoption of generative AI has also upended the recruitment process, allowing people to bulk apply to roles. All of those eager for work are hitting a wall: overwhelmed recruiters and hiring managers.

WIRED spoke with seven recruiters and hiring managers across tech and other industries, who expressed trepidation about the new tech—for now, much is still unknown about how and why AI makes the choices it does, and it has a history of making biased decisions. They want to understand why the AI is making the decisions it does, and to have more room for nuance before embracing it: Not all qualified applicants are going to fit into a role perfectly, one recruiter tells WIRED.

Recruiters say they are met with droves of résumés sent through tools like LinkedIn’s Easy Apply feature, which allows people to apply for jobs quickly within the site’s platform. Then there are third-party tools to write résumés or cover letters, and there’s generative AI built into tools on sites of major players like LinkedIn and Indeed—some for job seekers, some for recruiters. These come alongside a growing number of tools to automate the recruiting process, leaving some workers wondering if a person or bot is looking at their résumé.

“To a job seeker and a recruiter, the AI is a little bit of a black box,” says Hilke Schellmann, whose book The Algorithm looks at software that automates résumé screening and human resources. “What exactly are the criteria of why people are suggested to a recruiter? We don’t know.”

Still, generative AI tools for both recruiters and job seekers are becoming more common. LinkedIn launched a new AI chatbot earlier this year, meant to help people navigate job hunting. The hope was that it would help people better see whether they align well with a job, or better tailor their résumés to it, peeling back the curtain that separates a job seeker and the hiring process.

That came after LinkedIn began rolling out a new set of generative AI tools for recruiters to source candidates in October. With the sourcing tool, recruiters can search a phrase like “I want to hire engineers in Texas,” and profiles of people that may meet those criteria appear, as do other specific skills that may be related to the role. They can also send messages written with generative AI and set automatic follow-up messages. LinkedIn’s data shows that AI-generated messages are accepted about 40 percent more frequently than one-off messages written only by a recruiter.

“We’re really focused on helping to make recruiters’ lives more efficient,” says Peter Rigano, director of product management at LinkedIn. By substituting the tedious parts of the job with generative AI, the company hopes recruiters can focus on “more rewarding aspects of their job,” like actually connecting and talking to job seekers.

Indeed also announced new AI tools in April. The company says its Smart Sourcing tool recommends candidate profiles based on an employer’s needs. It uses AI to read résumés of “active” profiles (which include those who have searched for jobs in the last 30 days or updated their profiles) and summarize why the person might be a good fit, but Indeed says the tool will also note that some gaps could be overlooked, like if a person has four years of experience when a job description asks for five. As with LinkedIn, employers can also send AI-generated messages to candidates. Both companies built their generative features on OpenAI tools, as well as their own internal data or models.

But the changes may not fix everything recruiters are dealing with. The recruiter from the health tech company says their company rarely posts jobs to LinkedIn; the job platform’s Easy Apply feature sends too many unqualified applicants their way. And with so many people out of work and applying, they’re relying more on inbound candidates, and have little need to source for more options.

Another recruiter from a second health tech company, who requested anonymity because they are not authorized to speak to the press, says the inbound candidates on LinkedIn often aren’t good matches or high-quality candidates, but that the site remains their “bread and butter” for sourcing. They pulled up the site’s generative messaging tool and had it draft potential outreach to candidates. While the recruiter told WIRED it wasn’t a bad first shot, they would still opt to add other details to further personalize it.

While LinkedIn’s tool shows candidates in an order that can feel preferential, Rigano says the company has programmed it to be representative of people in the category, by showing men and women proportionally to their presence in the industry, or by highlighting other relevant skills a person could search that might bring up candidates that would slip through the cracks of an initial job search. Indeed's head of responsible AI, Trey Causey, tells WIRED the company has engineers, scientists, and researchers who evaluate the system's fairness, and that the company takes feedback from people about ways to improve its generative AI systems. “However, no system can ever be completely unbiased, as there isn’t a single definition of bias and definitions often conflict,” Causey says.

These tools may favor more active profiles on their sites—which makes sense for recruiters hoping to reach people who are actually checking their messages. But that also could exclude people who have been less active on the sites, or potentially even stepped away from the workforce due to reasons like illness or caregiving.

Bias is a top concern in automated hiring. HR tools have been found to make rash, negative judgments about applicants with Black-sounding names, to prefer men, and to skip over candidates who don’t check every box or who have employment gaps on their résumés.

Sim Bhatia, the people operations manager at Reality Defender, a company that detects deepfakes, says she doesn’t use any AI tools to evaluate applicants. For now, the tools are not as useful as they are risky, she says. She can filter for applicants based in New York, where Reality Defender’s office is, without using the generative tools. Using the still-developing technology might pose a data safety issue for candidates, she says, or for current employees if it’s used in the company’s system.

Bhatia says she is reviewing applicants herself, looking at résumés and screening applicants over the phone, which takes about 10 hours a week as the company’s small staff looks to expand. But like many recruiters, she’s not ruling out the technology’s future potential. “I’m excited to see it evolve, as with any technology that’s arisen,” Bhatia says. But in her opinion AI is “not there yet.”

And as more generative tools are built into the systems they use, recruiters are still learning where AI works and doesn’t in the process, says Leanne Getz, vice president of delivery channels at IT staffing firm Experis. “There’s no doubt that AI is going to add significant value into the recruiting space,” she says. But the tools won’t fully automate the hiring process. “We’re a people organization. The AI can’t replace what our recruiters can do day to day.”

Some with hiring power are becoming wary of the career social sites altogether. Krysten Copeland, founder of PR firm KC & Co Communications, says she doesn’t plan to post new jobs on LinkedIn. In the fall, she was looking for a public relations manager and had 600 people apply to the role. There were some quality candidates who ultimately didn’t work out, but also some odd ones, Copeland says. She believes one lied about prior work experience, a suspicion that arose after checking in with colleagues she knew at that workplace, and says that person was one of the top candidates LinkedIn recommended. LinkedIn can’t verify every claim made on a person’s profile, but it does have a feature for people to verify their profiles, and its extensive lists of professional connections generally build trust among recruiters and hiring managers.

Ultimately, there’s a disconnect in the online job hunt, Copeland says: “Everyone has a job they’re offering. Everyone is looking for a job. No one is getting it.”

