AI isn't failing. Your workload is.
Stephen Oakes

I’ve been spending a lot of my time recently hanging around in Teams calls, Slack threads, and dimly-lit parking lots speaking with recruiters, vendors, and project managers about AI in hiring. The conversations tend to sound very similar:
What are you using it for? Automation, efficiency, scale.
Are we going to be replaced? Not yet. Probably. Maybe.
What results have you seen? Just the best. Everyone is happier. My inbox is empty. Meetings cancel themselves. It’s great.
It sounds impressive. But is it? A peek behind the curtain - at adoption rates and real ROI - reveals a costly pattern: AI is being added without redesigning the workflows beneath it. Isolated tasks are optimized but the joins between them are ignored. This results in task efficiency, but process friction.
The assumption seems to be: add intelligence, get transformation. But that isn’t how it works. You can strap a horn onto a horse’s head, but it’s still a horse. Only now it’s confused. And you’re calling it a unicorn.
The ROI mirage
Enterprise spend on GenAI is eye-watering. MIT’s Project NANDA estimates $30-40bn in 2025 alone. Yet about 95% of organizations report no measurable return. Gartner expects over 40% of agentic AI projects to be cancelled by the end of 2027.
This isn’t a bubble bursting. It’s a diagnostic error. The models are capable. The use cases exist. The systems they’re dropped into are the problem.
A day in the TA trenches
Recruiters face an expanding buffet of siloed solutions: LLM-powered sourcing, auto-personalized outreach, AI screeners, schedulers, summarization bots. Each relieves a narrow pain point. None resolves the system.
A day in the life still looks like this:
Source in one tool
Draft outreach in another
Chase replies across inboxes
Paste notes into the ATS
Update hiring managers in Slack or email
Reconcile activity in dashboards and spreadsheets
Every step is slightly easier. The whole is worse. Context-switching grows. Data quality slides. Cognitive load spikes. AI, meant to simplify, starts to overwhelm. These aren’t just inconveniences. They’re symptoms of workflows that were never re-architected for intelligence at scale.
The cost of broken workflows
This failure to redesign workflows is felt by everyone, each in a different way:
For recruiters: AI turns into another tab to juggle. They spend more time managing tools than moving candidates. Each copy-paste is a reminder that the system was built for the demo, not the desk.
For candidates: the experience is fractured. They face duplicate questions, delayed responses, and dropped threads. The process feels automated but impersonal: efficient in theory but frustrating in reality.
For hiring managers: the signals arrive fuzzy. Updates are fragmented across emails, dashboards, and chats. They hear about activity, not progress. Trust in both process and pipeline erodes.
The cracks show up in the metrics: sliding response rates, longer time-to-fill, more drop-outs, uneven satisfaction scores. But the real damage is experiential. A hiring journey that feels harder, not smoother, for everyone involved.
And when workflows break, credibility breaks with them. Executive trust in the pipeline fades. TA becomes reactive, not strategic. Requisitions stall. Hiring managers go rogue. The cost isn't just inefficiency; it's a loss of confidence in the entire function. And when that confidence dissipates, TA loses influence over priorities, over tooling, and over headcount. It becomes a service desk, not a strategic partner.
What the 5% get right
The minority seeing ROI aren’t using smarter tools. They’re redesigning the work.
They start by mapping the flow: who does what, when, with which inputs, and where context goes missing. Then they place AI at inflection points: decision moments, handovers, time sinks.
In recruitment, that looks like:
Inside the ATS, not around it. Surface matched candidates as requisitions evolve, using signals from hiring manager feedback, interview outcomes, and market response.
Memory-capable copilots. Track recruiter and candidate interactions, maintain context, and recommend next actions such as nudge, escalate, or pause.
Embedded analytics. Generate reporting from the workflow itself, with no duplicate data entry or side spreadsheets.
These systems adapt. They learn from recruiter behaviour and outcomes. They reduce clicks, not just keystrokes. They remove steps, not just automate them.
Most systems fail here because they’re built for the demo, not the desk.
Shiny features win trials. Seamless workflows win adoption.
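The "nudge, escalate, or pause" recommendation above doesn't need a frontier model to prototype. A minimal sketch in Python, assuming a hypothetical Interaction record — every name and threshold here is illustrative, not part of any real ATS or copilot API:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical sketch: Interaction, recommend_next_action, and the
# day-count thresholds are all illustrative assumptions.

@dataclass
class Interaction:
    candidate_id: str
    last_touch: datetime   # most recent recruiter or candidate activity
    awaiting: str          # "candidate_reply" | "recruiter_action" | "feedback"

def recommend_next_action(i: Interaction, now: datetime) -> str:
    """Suggest nudge / escalate / pause from elapsed time and who owes the next move."""
    idle = now - i.last_touch
    if i.awaiting == "candidate_reply":
        if idle > timedelta(days=10):
            return "pause"      # stale thread: park it, stop burning recruiter time
        if idle > timedelta(days=3):
            return "nudge"      # polite follow-up
    elif i.awaiting == "feedback" and idle > timedelta(days=2):
        return "escalate"       # hiring-manager feedback is the bottleneck
    return "wait"
```

Even a rule table this crude has one advantage over a black box: the recommendations are auditable, and a model can replace the thresholds later without changing the workflow around them.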
A practical playbook
A simple north star keeps everyone aligned: fewer tabs, fewer handovers, faster next step. If a tool doesn’t move those three, it doesn’t make the cut.
If you lead talent, here’s how to redesign the system:
Map the reality
Document the recruiter’s actual journey: tools, tabs, waits, backtracks. Count the handovers and copy-paste moments.
Why it matters: Recruiters feel seen. Candidates stop falling through gaps. Hiring managers finally understand why updates feel fragmented.
Instrument the seams
Where does context vanish? Where does work pause? Mark those points with a quick status, timestamp, and reason - without drowning recruiters in admin.
Why it matters: Recruiters can flag friction fast. Candidates get fewer delays. Hiring managers get clearer signals.
Orchestrate, then automate
Choose a single orchestration layer (often the ATS) to own state and context. Automate only once flow is reliable.
Why it matters: One source of truth for recruiters. A smoother journey for candidates. A pipeline hiring managers can trust.
Close the learning loop
Feed reply rates, pass-through rates, and time-to-next-action back into prompts, rankings, and routing.
Why it matters: Recruiters waste less time. Candidates get smarter nudges. Hiring managers see steady improvement.
Manage the change
Teach the workflow, not just the tools. Set rules for response times and handoffs. Close off shortcuts that let old habits sneak back in.
Why it matters: Recruiters build consistent habits. Candidates see follow-through. Hiring managers feel a steadier cadence.
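"Instrument the seams" can start very small: a timestamped event at every handover, carrying a status and a one-line reason. A hedged Python sketch, with every name (SeamEvent, record, stalled) a hypothetical stand-in rather than any real ATS integration:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Illustrative sketch of seam instrumentation: log a tiny event at each
# handover so the seam itself becomes measurable. All names are assumptions.

@dataclass
class SeamEvent:
    requisition_id: str
    stage: str    # e.g. "screen -> interview"
    status: str   # "handed_over" | "paused" | "resumed"
    reason: str   # one short line, not an essay
    at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

log: list[SeamEvent] = []

def record(req: str, stage: str, status: str, reason: str) -> None:
    """Capture a handover or pause without adding admin beyond one call."""
    log.append(SeamEvent(req, stage, status, reason))

def stalled(req: str) -> list[SeamEvent]:
    """Surface the pauses for a requisition: where work stops, and why."""
    return [e for e in log if e.requisition_id == req and e.status == "paused"]
```

Three fields per event is the whole ask. The payoff is that "where does context vanish?" stops being a retrospective guess and becomes a query.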
Slow is smooth. Smooth is fast.
Plan. Research. Ideate. Test. Iterate. Measure. Repeat. This is not waterfall or agile. It is disciplined, iterative change.
The stakes are high. The tools are powerful. But workflows - the everyday paths of action, collaboration, and decision - determine whether AI compounds or corrodes.
Fix the workflow first, then act. Align the flow, then scale the impact.
Because AI isn’t failing. Your workflow is.