AI Hiring Tools Leave Tech Professionals Frustrated, Distrustful

AI-driven hiring tools are reshaping how technology professionals search for jobs, but not in ways that inspire confidence.

A Dice survey of more than 200 tech workers found widespread frustration with automated screening, with many saying the process favors keyword gaming over real qualifications and leaves them feeling dehumanized.

Most respondents said they believe AI systems regularly miss qualified candidates who don’t tailor resumes with the “right” keywords.

That has pushed many professionals to alter or even strip down their resumes to improve compatibility with automated tools, often removing details about personality or accomplishments.

Nearly eight in ten said they feel pressured to exaggerate their qualifications just to get noticed.

For job seekers, the flaws are tangible: Some noted that correctly spelling the name of a software tool could hurt their chances if the AI system expects a common typo. Others said the platforms fail to recognize transferable skills, a serious limitation in a field where adaptability is often critical.

Jonathan Kestenbaum, managing director for tech strategy and partners at AMS, says to pass AI screening systems, IT professionals should tailor their resumes to include context and keyword-rich language.

“They should also incorporate descriptions of their technical achievements that showcase the candidate’s ability to harness the power of AI and machine learning,” he adds.

Fadl Al Tarzi, CEO and cofounder of Nexford University, says if a resume isn’t optimized with the right language, it may never make it past the first filter.

“That means listing ‘AI’ isn’t enough. You need to show how you've used it,” he says.

He suggests translating technical projects into real-world outcomes powered by AI.

“Done well, you’re not just checking a box for an algorithm—you’re making the case to the human being behind the screen,” Al Tarzi says.

Kestenbaum says senior technologists, with decades of domain knowledge, can still demonstrate value in a hiring process that seems to increasingly reward those who “game the system” rather than demonstrate proven expertise.

“Seasoned IT pros can showcase their measurable outcomes and align their deep domain knowledge beyond surface-level metrics in the hiring process,” he explains.

By demonstrating their ability to adapt to and harness AI-driven analytics and machine learning, senior professionals can show how they have served as force multipliers, using their full skill set to build scalable systems, increase efficiency, or reduce cost.

Meanwhile, trust in the process is faltering. Most of those surveyed expressed worry that no human ever sees their application, while others expressed concern that algorithmic bias is reinforcing existing inequities in the workforce.

Kestenbaum says talent acquisition teams must leverage AI ethically in their recruiting and hiring processes.

“While AI digests a huge amount of information to drive efficiency, talent leaders still play a central role in building relationships and making final hiring decisions, ensuring the human touch is at the center of talent acquisition and management,” he says.

Although AI drives efficiency in talent acquisition and management by streamlining processes, enhancing decision-making, and improving overall outcomes, the new technology makes it even more important to recognize the soft skills that propel innovation in the IT sector.

“Filtering out diverse or unconventional candidates in the hiring process can lead to setbacks on innovation as well as lead to IT organizations missing out on value-adding perspectives that can drive growth and efficiency,” Kestenbaum says.

He recommends that HR leaders leverage AI while recognizing its limitations, whether bias or outdated data, and put human-AI collaboration at the organization’s core.

Al Tarzi cautions that when hiring systems reduce candidates to keyword matches, the result isn’t efficiency but missed potential.

“Tech teams that reward compliance over creativity will end up with echo chambers rather than breakthroughs,” he says.

From his perspective, bias in hiring isn’t theoretical—it’s already built into the signals managers reward.

“AI-driven hiring tools trained on preferences for universities with AI-integrated coursework will amplify the problem,” he says.

This advantaging of “AI-forward” universities while overlooking self-taught professionals and career switchers is a structural cause of bias in hiring.

“Bias is shifting from demographics to access, and it’s no less damaging,” Al Tarzi says. “Tech leaders need to spot this and act.”

Experiences varied across demographics: early-career and highly experienced candidates reported the highest levels of distrust, women were more likely than men to reshape resumes for AI filters, and mid-career professionals expressed slightly greater tolerance of the systems.

The consequences extend beyond frustration. Three in ten respondents said they are considering leaving the industry altogether, citing the hiring process as a major factor. Many described their experiences with AI-driven screening as “hopeless” and “dehumanizing.”

For employers, the findings highlight a risk that talent pipelines will shrink as disillusioned professionals disengage or depart. At the same time, organizations may be missing out on candidates whose skills and creativity don’t align neatly with algorithmic models.

“If we misuse AI in hiring, we risk splitting the workforce—those who game the system and those who give up on it,” Al Tarzi says. “That divide weakens not just careers, but the long-term health of the industry.”

He says if AI-driven systems feel restrictive or unfair, great candidates will lose trust in the process.

“The consequences are lower retention, weaker diversity, and diminished confidence in the industry itself,” he warns.