Artificial intelligence has already taken over large parts of modern hiring. CV screening, shortlisting and candidate ranking are now routinely automated, and the systems doing that work have replaced junior recruiters and even some of the engineers who once built them. What has largely remained untouched, until now, is the interview: a human candidate speaking to a human interviewer in real time. That assumption is starting to fray, not through corporate announcements, but through unsettling first-hand accounts from people who believed they were speaking to real humans.
“My interviewer wasn’t even human”
In one account posted to the r/interviews subreddit on Reddit, a job applicant described receiving a routine email invitation for an online interview. The link worked as expected. The video loaded. On screen was a woman who smiled, nodded, and began asking questions. At first, nothing seemed unusual.

As the interview continued, the candidate noticed her head movements repeating in a way that felt unnatural. Small facial twitches appeared every few seconds. The applicant assumed it was a poor internet connection and carried on.

When the conversation moved into substantive questions, the delivery stood out. “There was no hesitation. No ‘uh’. No pauses,” the user wrote. The interviewer responded instantly to every answer, with polished, perfectly structured language.

Curious, the candidate asked a question back: “Why do you think this role matters?” The response came immediately. It was fluent, confident, and, in the user’s words, “textbook-perfect”. The candidate asked the same question again. The answer was identical. Same wording. Same cadence. They asked a third time. The response did not change.

Shortly afterwards, the screen froze briefly. When the video resumed, the interviewer continued speaking as if nothing had happened.

The candidate ended the post by questioning whether companies should be allowed to conduct interviews using AI avatars without disclosure, writing: “I’m not against AI in hiring, but if an interviewer is basically a talking bot, shouldn’t candidates at least be told?” The post sparked hundreds of comments, many expressing concern about transparency in recruitment and how easily AI can now pass as human in a professional setting.
“Five minutes in, I realised my candidate wasn’t human”
It is not only employers experimenting with automation. Some candidates, too, have turned to technological workarounds to get a foot in the door.

In a separate but closely related account posted to r/recruiting, the experience unfolded from the other side of the interview. The poster said they were interviewing candidates for an AI engineering role and joined a video call expecting a routine screening. Almost immediately, they noticed something off about the candidate’s movements. “This person’s head moves a lot when they talk,” the recruiter wrote. “Weirdly repetitive. It is not natural. It is almost looping.”

The recruiter continued the interview, assuming camera lag. Then the candidate began speaking uninterrupted for nearly two minutes. “No pauses. No filler words. Just continuous, textbook-perfect talking.” To test the situation, the recruiter asked a basic question: “What is AI?” The response came back scripted. When the question was repeated, the answer was identical. The third attempt produced the same result. Shortly after, the call disconnected.

According to the recruiter, HR later confirmed what had happened. The real candidate had joined briefly at the beginning of the call to introduce themselves. After that, an AI agent had taken over the interview. “It even looked almost identical to the person’s LinkedIn photo,” the recruiter wrote. They concluded bluntly: “So yeah. Not just fake resumes anymore. Fake candidates are now literally joining interviews. Recruiting hell has officially entered the uncanny valley.”

Even if this is not yet a formal or widespread practice, it fits squarely with the direction hiring has been moving in. Tasks once handled by people are increasingly outsourced to systems designed to simulate them, often well enough to pass without challenge. Interviews have long been one of the few remaining points of genuine human contact in recruitment.
These accounts suggest that boundary is starting to blur.

What makes the moment unsettling is not just that AI can now mimic faces, voices and conversational rhythm, but that it is learning to do so rapidly. Voice modulation tools already erase tell-tale pauses. Visual generators can produce lifelike faces that hold up under casual scrutiny. Behavioural models are improving at replicating imperfection itself. For now, repetition gives these systems away. Soon, even that tell may disappear.