AI Hiring Systems Are Screening CVs Written by AI

Artificial intelligence is transforming recruitment into a closed loop where AI-generated applications are filtered by automated hiring systems, weakening employers’ ability to identify genuine talent.
Recruitment has traditionally relied on human judgement. Recruiters assessed potential, interpreted experience and evaluated candidates through conversation, context and instinct.
That model is rapidly changing.
Artificial intelligence is now embedded across the hiring pipeline. Employers use automated screening systems, ranking algorithms and AI interview tools to manage growing application volumes. Candidates respond by using generative AI to write CVs, tailor cover letters and prepare interview answers.
The result is a system where machines increasingly interact with machines while humans appear only at the final stage.
Efficiency has improved. Signal quality has not.
Applicant tracking systems have become the first gatekeeper
Applicant tracking systems, commonly referred to as ATS platforms, now sit at the centre of modern hiring pipelines.
Research compiled by Tracker-RMS and SelectSoftware Reviews suggests around 70% of large companies now rely on ATS software to screen CVs, while almost every Fortune 500 company uses an ATS system somewhere in its recruitment process.
These systems filter applications based on keyword alignment with job descriptions and structured CV formatting.
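The internals of commercial ATS products are proprietary, but the filtering step described above can be sketched as a simple keyword-alignment score. Everything below (the function names, the stopword list, the 0.5 threshold) is an illustrative assumption, not a description of any real vendor's system:

```python
# Illustrative sketch only: approximates ATS-style screening as the
# fraction of job-description keywords that also appear in the CV.
import re

STOPWORDS = {"and", "the", "a", "an", "of", "to", "in", "for", "with"}

def keyword_score(job_description: str, cv_text: str) -> float:
    """Fraction of job-description terms that also appear in the CV."""
    tokenise = lambda text: set(re.findall(r"[a-z]+", text.lower()))
    jd_terms = tokenise(job_description) - STOPWORDS
    cv_terms = tokenise(cv_text) - STOPWORDS
    return len(jd_terms & cv_terms) / len(jd_terms) if jd_terms else 0.0

def passes_screen(job_description: str, cv_text: str,
                  threshold: float = 0.5) -> bool:
    """A CV reaches a human only if enough keywords align."""
    return keyword_score(job_description, cv_text) >= threshold
```

Under a scheme like this, a strong candidate whose CV happens to use different vocabulary scores low and is filtered out, which is precisely the trade-off discussed below.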
For employers dealing with hundreds of applications per vacancy, the operational appeal is obvious. Automated screening can reduce the time spent reviewing CVs by as much as 75%.
The trade-off is structural.
When algorithms prioritise keyword alignment, candidates begin optimising their applications for the algorithm rather than for the employer.
Candidates are adapting by using AI against the system
Automation on the employer side has triggered automation on the candidate side.
Data compiled by StandOut CV using recruitment technology research from Beamery suggests around 46% of UK job seekers now use AI tools during their job search, including for CV writing, cover letters and interview preparation.
This behaviour is rational.
If automated systems determine which applications reach human recruiters, candidates must optimise their applications to survive that filter.
Generative AI tools allow candidates to create highly polished, keyword-optimised CVs in seconds. Cover letters can be tailored instantly to job descriptions. Interview answers can be rehearsed and refined using AI assistance.
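The candidate-side tactic amounts to finding which job-description terms a CV is missing and adding them before submission. The sketch below is a deliberately crude stand-in for what generative tools do more fluently; the function names and stopword list are illustrative assumptions:

```python
# Illustrative sketch only: "optimising" a CV for a keyword filter by
# appending whichever job-description terms it does not yet mention.
import re

STOPWORDS = {"and", "the", "a", "an", "of", "to", "in", "for", "with"}

def terms(text: str) -> set:
    return set(re.findall(r"[a-z]+", text.lower())) - STOPWORDS

def missing_keywords(job_description: str, cv_text: str) -> set:
    """Job-description terms the CV does not yet contain."""
    return terms(job_description) - terms(cv_text)

def optimise_cv(job_description: str, cv_text: str) -> str:
    """Append missing terms as a skills line: optimising for the
    filter rather than for the employer."""
    gaps = sorted(missing_keywords(job_description, cv_text))
    return cv_text if not gaps else cv_text + "\nSkills: " + ", ".join(gaps)
```

After one pass of this, the CV clears any pure keyword filter regardless of whether the candidate has the underlying skills, which is the signal-quality problem the rest of the article examines.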
The hiring process becomes less about presenting capability and more about navigating the ranking logic of the system.
Recruitment risks becoming an AI-to-AI process
The structural consequence is an algorithmic feedback loop.
Candidates generate applications with AI tools. Employers evaluate those applications using AI screening systems.
Human judgement often enters the process only after several automated filters have already determined which candidates remain.
Trust in this system is already fragile.
A Gartner survey found only 26% of job applicants trust artificial intelligence to evaluate them fairly, even though more than half believe AI is already involved in screening their application information.
That gap between adoption and trust signals a growing legitimacy problem in the hiring process.
Application volumes are overwhelming human recruiters
The adoption of automated hiring tools is largely a response to scale.
Digital platforms have dramatically lowered the cost of applying for jobs.
Recruitment commentary widely circulated within LinkedIn’s own ecosystem suggests the platform now processes around 11,000 job applications per minute, with application volumes increasing sharply year-on-year.
For employers, this creates an unavoidable constraint.
No hiring team can meaningfully review this volume of applications manually. Automated filtering becomes operationally necessary.
The side effect is that human judgement increasingly occurs only after automated systems have already shaped the shortlist.
Talented candidates may therefore disappear inside the filtering process before any human ever evaluates them.
The credibility of the cover letter is collapsing
One of the clearest signals of this shift is the declining credibility of the cover letter.
Reporting in The Times has highlighted how recruiters from firms including Robert Walters, Hays, Randstad and SThree increasingly view cover letters with suspicion because generative AI tools can now produce polished letters instantly.
Applications can therefore appear highly articulate and tailored without necessarily reflecting the candidate’s real thinking or communication ability.
A polished cover letter once implied effort and motivation.
Today it may simply indicate competent prompt engineering.
Recruiters are becoming system operators
Artificial intelligence is not eliminating recruiters, but it is changing their role.
Recruitment software increasingly handles administrative tasks such as candidate sourcing, screening, scheduling and communication. StandOut CV reports that 43% of large organisations are already using AI tools during candidate interviews, while automation saves recruiters roughly 4.5 hours of work per week.
These efficiency gains are real.
They also shift the recruiter’s role away from early-stage talent evaluation and towards process management.
Traditional headhunters continue to thrive in senior and specialist markets where judgement and networks remain decisive. In high-volume recruitment, however, algorithmic filtering is becoming the dominant mechanism.
The hidden cost for employers
The assumption that automation primarily harms candidates is incomplete.
Candidates certainly lose transparency. Many receive automated rejection notices without explanation and have no idea whether a human ever reviewed their application.
Employers face a different problem.
They lose signal quality.
When candidates optimise applications using AI and employers filter them using algorithms, the hiring process becomes better at sorting documents and worse at identifying genuine capability.
Strong candidates who do not optimise for ATS logic may be filtered out early. Candidates who understand algorithmic formatting may appear stronger than they actually are.
The system becomes more efficient and less informative at the same time.
Bias and accountability remain unresolved
Artificial intelligence does not remove bias from hiring decisions. It relocates it.
Automated hiring systems rely on training data and pattern recognition. Where historical hiring data contains bias, algorithms can reproduce those patterns.
The larger challenge is transparency.
Once hiring decisions are mediated through software systems operated by third-party vendors, explaining why a candidate was filtered out becomes difficult. The low level of trust recorded in the Gartner survey suggests that candidates already recognise this opacity.
The uncomfortable outcome
Artificial intelligence is not making recruitment irrelevant. It is making it faster.
The question is whether it is making it better.
Candidates optimise their applications using AI tools. Employers filter those applications using automated systems.
Both sides gain efficiency.
Both sides lose clarity.
Recruitment was once designed to identify human capability. As more of the process becomes automated, employers face a quieter risk.
The hiring system they built to save time may gradually make it harder to recognise the very people they are trying to hire.
Sources
- Gartner (2025). Survey Shows Only 26% of Job Applicants Trust AI to Evaluate Them Fairly
- StandOut CV (2025). AI in Recruitment Statistics (UK)
- Tracker-RMS / SelectSoftware Reviews (2024). Applicant Tracking System Statistics
- LinkedIn (2025). Platform analytics and recruiter reporting on application volumes
- The Times (2026). Why the Cover Letter Is Dead – and How to Get Hired Without One

