What you missed:
Glitch in the matrix 🤖
A jobseeker's AI interview went haywire, looping endlessly and misnaming him, raising major concerns about tech-led hiring.
👉 ChipChick
IBM swaps HR for AI 👋
IBM is replacing chunks of its HR department with custom-built AI agents. Efficiency win? Or cultural crisis in the making?
👉 Economic Times
Gen Z won’t settle 😤
A recent Deloitte survey highlights shifting workplace priorities among Gen Z and millennials, who prioritize meaningful work, flexibility, and mental wellbeing over traditional incentives like salary or job titles.
👉 News.com.au
Middle managers are not OK 😩
A recent report finds that 72% of senior executives feel stressed by cuts to middle management, which leave the remaining managers overburdened and employees with less support.
👉 CFO Dive
And speaking of broken systems… let’s talk about that AI interview fail that turned a real opportunity into a real joke—and what it reveals about the human cost of automation 👇
An AI glitch recently turned a dream job interview into a comedy of errors — and it says a lot about the future of hiring.
Last week, TikTok user Leo (@leohumpsalot) finally landed an interview for his dream job as a news reporter. The catch? The interviewer wasn’t a person. It was an AI bot.
What could go wrong? Turns out, a lot.
The chatbot kicked things off with: “Let’s circle back. Tell me about a time when, when, when…” and promptly got stuck in a loop. Leo couldn’t get a word in. Then, without skipping a beat, the bot thanked him for his “great responses.” Later, he received a rejection email addressed to “Henry”, congratulating him on an interview that apparently happened the day before.
You can watch the full breakdown here on ChipChick.
It’s funny in a surreal way, but it’s also revealing. This wasn’t just a one-off tech fail. It’s a snapshot of what can happen when automation runs ahead of common sense.
🤝 Supported by Notion
Thousands of startups use Notion as a connected workspace to create and share docs, take notes, manage projects, and organize knowledge—all in one place.
We partnered with Notion to give you up to 6 months free of new Plus plans, including unlimited Notion AI (up to $6,000 in value)!
More and more companies are jumping on the AI hiring bandwagon, sold on the promise of speed and objectivity. The reality? Most of the tech still struggles to deliver what it promises. Sure, AI can scan a CV in seconds. But hold a decent conversation with a nervous, real-life human? That’s another story.
A recent Australian study found that AI hiring tools often struggle with accents and speech variations. In some cases, error rates spiked up to 22%. It’s not just a glitch, it’s an equity problem.
And the data that powers these tools? It’s often built on biased historical hiring practices. This analysis shows how things like career gaps, which disproportionately affect women and caregivers, can be treated as red flags.
Beyond the bugs, there’s the actual candidate experience to consider. In Leo’s case, he was misnamed, ignored, and then ghosted. That’s more than a poor user interface. It’s a complete breakdown in respect.
A study from Amity University found that AI-led interviews increase anxiety. Candidates feel pressure to perform flawlessly in front of a machine, with no room for clarification or recovery.
And when things go wrong, there’s often no clear channel to fix it. No apology. No human to contact. Just an awkward story for your next group chat or, in Leo’s case, a viral TikTok.
There’s a place for AI in hiring. It can help speed up screening and reduce admin. But let’s not confuse efficiency with empathy. If someone is chatting with a bot instead of a human, they should know and have the option to opt out.
And these tools need guardrails. Recruitics warns that companies are opening themselves up to serious legal and reputational risks if they’re not auditing their AI tools properly.
As Milo’s AI blog puts it, AI works best when it’s supervised by humans, designed thoughtfully, and used to complement, not replace, real conversation.
Leo’s experience was ridiculous, yes. But also revealing. Technology might help scale hiring processes, but it doesn’t mean we should take real people out of the loop.
If your candidate feels like they’re being interviewed by a bugged-out chatbot that doesn't even know their name, it’s not just a tech fail. It’s a brand fail.
How did you find this edition? Your feedback helps us improve. If you have thoughts, just reply to this email.