TL;DR: Rather than fixating on which jobs AI will eliminate, HR leaders should focus on the far larger challenge of workforce transformation — using a practical framework to reskill, adapt, and build organisational resilience as roles are reshaped over the next few years.
The Question HR Leaders Are Actually Asking
You've probably seen the headlines. "Bill Gates names the three jobs AI can't replace." "Which careers will survive the robot takeover?" It's compelling clickbait — and almost entirely the wrong question if you're an HR leader trying to build a workforce strategy for the next three years.
The real challenge isn't identifying a handful of "safe" job titles and steering your people towards them. It's understanding that the dominant story isn't elimination — it's transformation. According to Boston Consulting Group, 50–55% of jobs will be reshaped within the next two to three years, while only 10–15% are likely to be eliminated over a five-year horizon. That's a workforce planning challenge of enormous scale, and it lands squarely on HR's desk.
We know the anxiety is real — for employees and for HR leaders themselves. But the organisations that navigate this well won't be the ones that predicted the future most accurately. They'll be the ones that built the frameworks, skills, and culture to adapt as it unfolds. That's what this article is about: a practical, four-step framework you can start applying to your own workforce today.
What the Data Actually Tells Us About UK Jobs and AI
Before you can plan effectively, you need to separate signal from noise. The headlines conflate three very different things: jobs being eliminated, tasks being automated, and roles being reshaped. These are not the same, and treating them as such leads to either panic or complacency — neither of which serves your organisation.
The scale of exposure is real. The Institute for Public Policy Research (IPPR) estimates that up to 8 million UK jobs are at risk from AI, concentrated in routine and repetitive roles — clerical, secretarial, data entry, and administrative work. But "at risk" means exposed to change, not necessarily eliminated. That distinction matters enormously when you're making workforce decisions.
Near-term employer intent is shifting, though. According to the British Chambers of Commerce (April 2026), 1 in 6 UK employers — 17% — expect AI to reduce their headcount within the next 12 months. More striking still, 50% of UK executives now predict a net reduction in total employment within a decade, up from just 33% two years ago (Accenture, via Gotrade). The direction of travel is clear, even if the pace remains uncertain.
Entry-level roles are bearing the brunt of early disruption. Junior vacancies — graduate schemes, apprenticeships, internships — have dropped 32% since the launch of ChatGPT, and 40% of employers plan to hire fewer graduates in 2026 (People Management). Finance, administration, and customer service are seeing the sharpest impact.
The picture for resilient roles is equally consistent. The NHS is actively recruiting 50,000+ nurses. Electricians face roughly a 16% automation risk. Social care sits at 10–15%. Teaching is around 19%. The UK creative industries grew 9% between 2015 and 2022 and face low automation risk overall (LiveCareer UK / IPPR analysis, 2025). The pattern is clear: roles requiring physical presence, human judgement, empathy, and genuine creativity are the most resilient. Not because AI can't touch them, but because the value they deliver is fundamentally human in nature.
Why 'Which Jobs Survive?' Is the Wrong Framework
Here's the problem with the "which jobs survive AI?" question: it treats the job title as the unit of analysis. It isn't. The tasks within a job are.
Most roles contain a mix of automatable and non-automatable work. A finance manager's day might include data entry and report generation — both high automation candidates — alongside stakeholder communication, commercial judgement, and navigating organisational politics, which are far harder to replicate. The job title "Finance Manager" doesn't tell you much. The task composition tells you everything.
This concept of task decomposition is central to any meaningful AI impact assessment. As explored in how AI is changing HR roles and what it means for your team, AI can automate roughly a third of most roles' tasks — what's sometimes called the 30% rule. But that third varies enormously depending on the specific role, sector, and organisation. A customer service adviser in a highly scripted environment faces very different exposure from one handling complex complaints that require empathy and discretion.
The implication for HR is significant. Your goal isn't to identify "safe" job titles and protect them. It's to understand which tasks across your workforce are shifting — and to plan accordingly. That reframe positions HR as a strategic architect of workforce change, not a passive observer waiting to see which roles survive.
Step 1: Conduct an AI Exposure Assessment Across Your Workforce
The foundation of any credible workforce planning response is an AI Exposure Assessment — a structured process for understanding where your organisation is most vulnerable to disruption, and where it isn't.
The process has three core steps. First, map roles by their actual task composition — not job titles, but the specific activities people perform day to day. Second, score those tasks by automation likelihood using consistent criteria. Tasks that are routine, repetitive, data-processing heavy, or rule-based carry higher exposure. Tasks requiring empathy, physical presence, complex judgement, or genuine creativity carry lower exposure. Third, assign roles to exposure tiers: High Exposure (more than 50% of tasks automatable), Medium Exposure (20–50%), and Low Exposure (below 20%).
In a UK context, clerical roles, secretarial positions, data entry, administrative assistants, and cashiers typically fall into the High Exposure tier. Healthcare, social care, skilled trades, and teaching typically sit in the Low Exposure tier — though even these roles contain tasks that AI will increasingly support.
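The tier assignment described above is simple enough to express directly. Here is a minimal sketch; the role names and task-share figures are illustrative assumptions, not real assessment data:

```python
# Sketch of the exposure-tier assignment described above.
# Thresholds follow the article: >50% automatable = High,
# 20-50% = Medium, <20% = Low.

def exposure_tier(automatable_share: float) -> str:
    """Map the share of automatable tasks (0.0-1.0) to an exposure tier."""
    if automatable_share > 0.50:
        return "High Exposure"
    if automatable_share >= 0.20:
        return "Medium Exposure"
    return "Low Exposure"

# Hypothetical roles with an estimated share of automatable tasks
roles = {
    "Data Entry Clerk": 0.70,
    "Finance Manager": 0.35,
    "Social Care Worker": 0.12,
}

for role, share in roles.items():
    print(f"{role}: {exposure_tier(share)}")
```

In practice the "automatable share" for each role comes out of the task-mapping and scoring steps, not a guess — which is why involving line managers in that mapping matters so much.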
One practical note: don't conduct this assessment in isolation. Line managers and employees know the actual task mix of their roles far better than any org chart or job description. Involving them in the process produces better data and, crucially, builds the trust you'll need when it comes to communicating findings.
The assessment will almost certainly surface skills gaps alongside exposure levels. This shouldn't come as a surprise — 97% of British organisations already report significant AI skills gaps, with a third saying this is actively impacting their business goals (British Chambers of Commerce, April 2026). The assessment doesn't create this problem; it makes it visible so you can address it.
Questions to ask in your AI exposure assessment:

- What are the core tasks this role performs on a daily and weekly basis?
- Which of those tasks are routine, repetitive, or rule-based?
- Which require human judgement, empathy, or physical presence?
- How much of the role involves processing or summarising data vs. interpreting and acting on it?
- If AI tools were available for this role today, which tasks would change first?
Step 2: Prioritise Reskilling — The Human Skills That AI Can't Replicate
Once you've mapped exposure levels, the next question is where to invest. Not every role warrants the same reskilling response, and HR teams don't have unlimited capacity. A simple 2x2 matrix helps prioritise.
Plot roles on two axes: current AI exposure level (high to low) and strategic importance to the business (high to low). The quadrant that demands your most urgent attention is high exposure, high strategic importance — these are roles where AI disruption is likely and the business can't afford to lose the capability. Reskilling here is urgent. Roles that are high exposure but lower strategic importance may be better candidates for redesign, redeployment, or managed reduction. Low exposure, high strategic importance roles are your future-proof anchors — protect them, develop them, and make sure you're not inadvertently hollowing them out through short-term cost pressures.
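The matrix lends itself to a simple lookup. A sketch follows; the recommended actions for three quadrants come from the paragraph above, while the low-exposure, low-importance quadrant (which the article doesn't spell out) is an assumed placeholder:

```python
# Sketch of the 2x2 prioritisation matrix described above.
# The (low, low) action is an assumption; the other three follow the text.

def reskilling_priority(exposure: str, strategic_importance: str) -> str:
    """Map (exposure, importance), each 'high' or 'low', to a response."""
    matrix = {
        ("high", "high"): "Urgent reskilling",
        ("high", "low"): "Redesign, redeploy, or managed reduction",
        ("low", "high"): "Protect and develop (future-proof anchor)",
        ("low", "low"): "Monitor at the next quarterly review",  # assumed
    }
    return matrix[(exposure, strategic_importance)]

print(reskilling_priority("high", "high"))
```

Even this crude binary version forces a useful conversation: a role can't sit in two quadrants, so leadership has to commit to a judgement about both its exposure and its strategic weight.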
The skills worth investing in are consistent across sectors: critical thinking and complex problem-solving, emotional intelligence and empathy, creativity and innovation, cross-functional collaboration, and — perhaps most importantly — AI literacy. That last one doesn't mean everyone needs to become a data scientist. It means people can work alongside AI tools effectively, understand their limitations, and apply human judgement where it matters.
The CBI has warned that guidance on AI skills training in the UK remains inadequate. HR leaders who build structured reskilling programmes now have a genuine competitive advantage — not just in productivity, but in talent retention.
One area that deserves specific attention: the youth pipeline. The 32% drop in junior vacancies since ChatGPT's launch isn't just a short-term hiring trend — it risks hollowing out leadership pipelines over the next decade. Reskilling strategy must include entry-level pathways, not just mid-career and senior roles. If the next generation of managers never gets the foundational experience that junior roles traditionally provided, organisations will feel that gap acutely in five to seven years.
Step 3: Manage the Human Side — Anxiety, Trust, and Change Readiness
Workforce planning for AI isn't just a structural exercise. It's a change management challenge — and the human dimension is where many organisations underinvest.
According to the British Chambers of Commerce (April 2026), 65% of UK employees are anxious about AI replacing their jobs. That's not a fringe concern — it's the majority of your workforce. Ignoring it is a retention and engagement risk, not just a communications oversight.
Three practical actions make a real difference here. First, communicate proactively. Silence breeds rumour, and rumour breeds anxiety. Share your assessment findings and reskilling plans with employees — even when the picture is incomplete. People can handle uncertainty far better than they can handle being kept in the dark. Second, involve employees in role redesign. People accept change significantly better when they've had a hand in shaping it. Where roles are being restructured, bring the people in those roles into the conversation early. Third, create psychological safety around AI experimentation. Reward curiosity and learning, not just output. If people feel they'll be judged for trying new tools and getting it wrong, they won't try — and your AI adoption will stall.
There's also a risk worth naming directly: when employees are anxious and unsupported, they turn to unsanctioned tools for answers. That means personal ChatGPT accounts, free AI assistants, and other platforms that sit entirely outside your organisation's data governance framework. Employees asking sensitive questions about their roles, their performance, or their employment rights through unsanctioned channels creates real data security and compliance risks — particularly under GDPR.
HR's role here is dual: manage the change, and model responsible AI adoption. Both matter.
Step 4: Build a Living Workforce Strategy — Not a One-Off Audit
Here's the uncomfortable truth about AI workforce planning: by the time you've finished your first assessment, the landscape will have shifted. AI capability is evolving faster than annual planning cycles can accommodate. A one-off audit, however thorough, will be out of date within months.
The solution is to treat workforce strategy as a living document rather than a periodic deliverable. A quarterly review cadence for AI exposure assessments is a reasonable minimum — more frequent for roles in high-exposure sectors. The metrics worth tracking include reskilling participation rates, internal mobility rates, AI tool adoption rates, employee confidence scores from pulse surveys, and time-to-fill for roles that have been redesigned.
Scenario planning is also worth building into your process. Develop two or three workforce scenarios based on different AI adoption speeds — a conservative trajectory, a moderate one, and an accelerated one. This isn't about predicting the future; it's about avoiding being caught flat-footed when change arrives faster than expected. Firms using bespoke AI are already seeing the effects: 10% of UK firms with advanced AI deployments report 20% staffing reductions and three times higher job role restructuring than non-AI adopters (British Chambers of Commerce, April 2026).
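One way to make the three scenarios concrete is to model how quickly roles migrate between exposure tiers under each adoption speed. The sketch below does this for a hypothetical organisation; every number — headcounts, migration rates, time horizon — is a placeholder assumption, not a forecast:

```python
# Illustrative scenario sketch: project how many roles sit in the
# High Exposure tier under three assumed AI adoption speeds.
# All figures are hypothetical placeholders, not forecasts.

def project_high_exposure(current_high: int, medium_pool: int,
                          migration_rate: float, years: int) -> int:
    """Each year, a fixed share of Medium Exposure roles shifts to High."""
    high, medium = current_high, medium_pool
    for _ in range(years):
        shifted = round(medium * migration_rate)
        high += shifted
        medium -= shifted
    return high

# Assumed annual migration rates for each scenario
scenarios = {"conservative": 0.05, "moderate": 0.10, "accelerated": 0.20}
for name, rate in scenarios.items():
    projected = project_high_exposure(current_high=120, medium_pool=400,
                                      migration_rate=rate, years=3)
    print(f"{name}: {projected} High Exposure roles after 3 years")
```

The value isn't in the numbers themselves but in the gap between scenarios: if the accelerated case would leave you with far more high-exposure roles than your reskilling capacity can absorb, that's the trigger point to plan for now.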
Consider designating an AI Workforce Lead within your HR function — someone who owns the ongoing assessment process and bridges HR strategy with technology adoption. This doesn't need to be a new hire; it's often a development opportunity for an existing team member with both people and analytical skills. For those wanting to go deeper on the technology implementation side, AURA's AI implementation blueprint provides a practical starting point.
How AI Tools Can Support Your Workforce Planning (Without Replacing Your Judgement)
There's an irony worth acknowledging here. HR teams are being asked to lead the most complex workforce transformation in a generation — often with the same headcount and resources they had before AI became a boardroom priority. Doing this work manually, at scale, is genuinely difficult.
AI tools can help surface patterns in workforce data that manual analysis would miss: skills gap clusters by department, turnover risk concentrated in specific role types, learning engagement rates that reveal where reskilling is landing and where it isn't. These are insights that take weeks to compile manually and can be available in hours with the right tools.
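At its simplest, "surfacing patterns" means aggregating records that no one has time to read individually. A toy illustration, with entirely fabricated example records, of clustering reported skills gaps by department:

```python
# Toy illustration of pattern-surfacing in workforce data:
# counting reported skills gaps by department. Records are hypothetical.
from collections import Counter

records = [
    {"dept": "Finance", "gap": "AI literacy"},
    {"dept": "Finance", "gap": "data interpretation"},
    {"dept": "Customer Service", "gap": "AI literacy"},
    {"dept": "Finance", "gap": "AI literacy"},
]

gap_clusters = Counter((r["dept"], r["gap"]) for r in records)
for (dept, gap), count in gap_clusters.most_common():
    print(f"{dept} / {gap}: {count} reports")
```

Real tooling does this across thousands of records and many more dimensions, but the principle is the same: aggregation reveals the clusters, and human judgement decides what to do about them.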
Think of Aura as the tool that handles the volume so your team can focus on the judgement. When employees have questions about role changes, reskilling programmes, or what AI means for their specific job, they need accurate, instant answers grounded in your company's actual policies — not a three-day wait for an HR response, and certainly not a personal ChatGPT session that sits outside your data governance framework. Aura provides that 24/7 support layer, in multiple languages, freeing your HR team to focus on the strategic workforce planning work this moment demands.
The goal isn't to automate HR. It's to give HR the capacity to do the work that only humans can do.
The Bottom Line: Workforce Resilience Is an HR Strategy, Not a Prediction Game
The question "which jobs will survive AI?" is the wrong one. The right question is: how do we build a workforce that can adapt as the landscape shifts — and how do we lead that process with clarity, empathy, and rigour?
The four-step framework outlined here gives you a starting point: assess exposure at the task level, prioritise reskilling using a structured matrix, manage the human side with proactive communication and psychological safety, and build a living strategy that evolves as AI capability does.
The organisations that navigate this well won't be those with the most accurate predictions. They'll be the ones where HR led the conversation — ahead of IT, ahead of finance, and ahead of the anxiety that comes from silence.
If you're ready to explore how AI can support your HR team in doing this work — from answering employee questions about role changes to freeing capacity for strategic planning — visit Aura at aura-hr.tech to see how it works in practice.