TL;DR: The '30% rule' is a research-backed pattern suggesting roughly 30% of tasks in knowledge-worker roles like HR can be automated with today's AI — not to replace HR teams, but to free up capacity for more strategic work.
Introduction: A Rule of Thumb Worth Taking Seriously
Picture this: it's Tuesday morning, and your inbox contains fourteen variations of the same few questions. Holiday entitlement. Parental leave. Sick pay thresholds. Your team is capable of genuinely strategic work — manager development, workforce planning, culture initiatives — but the day keeps disappearing into queries that, frankly, a well-designed system could handle just as well.
This is where the so-called "30% rule" becomes a useful lens. It's not a formally codified law or an industry standard with a governing body behind it. Rather, it's a pattern that has emerged consistently from automation research by organisations like McKinsey, the World Economic Forum, and the CIPD: roughly 30% of tasks within most knowledge-worker roles can be automated or significantly augmented by AI using technology that exists today.
The distinction matters. This isn't a prediction about job losses or a vendor's sales pitch. It's a strategic prompt — an invitation to ask which third of your HR team's work could be handled differently, and what you'd do with the capacity that frees up.
In this article, we'll define the concept properly, map it to HR specifically, and give you a practical framework for using it as a planning tool — including how to take it into a business case conversation with your CFO.
What Is the 30% Rule in AI?
The 30% rule is a shorthand drawn from automation research, and it's worth being precise about what it actually claims. According to McKinsey Global Institute's research on the future of work in the UK (2023), approximately 30% of hours worked across the UK economy could be automated using existing AI and automation technology. Critically, this refers to tasks within jobs, not whole roles. AI rarely eliminates a position entirely; it restructures what people spend their time doing.
This distinction is important because it changes the strategic question entirely. The question isn't "will AI replace my HR team?" It's "which specific tasks in my HR team's week are candidates for automation — and what happens to the capacity that creates?"
Alongside the 30% concept, there's a related framework worth understanding: the 10-20-70 rule of AI value creation, drawn from McKinsey and Google's enterprise AI adoption research. It holds that only 10% of the value from an AI initiative comes from the algorithm itself. Another 20% comes from data and technology infrastructure. The remaining 70% comes from people, processes, and organisational change.
That's a striking number, and it has direct implications for HR. It means that the technology is, in a sense, the easy part. The harder and more valuable work — getting adoption right, governing AI responsibly, redesigning workflows, building trust with employees — is fundamentally a human challenge. And it's one that HR is uniquely positioned to lead.
Think of it like a dishwasher. It handles the washing reliably and efficiently. But someone still has to load it, unload it, decide what goes in, and deal with the pan that needs hand-washing regardless. The machine doesn't replace the cook; it changes what the cook spends their time on.
Why HR Is One of the Best-Placed Functions to Apply This Framework
HR is, structurally, one of the highest-volume, highest-repetition functions in any organisation. Policy queries, holiday requests, onboarding admin, payroll FAQs, benefits questions — these are precisely the task types that automation research consistently flags as most amenable to AI. They're rule-based, high-frequency, and largely consistent in their answers.
The CIPD has noted that HR professionals spend a disproportionate share of their working time on transactional queries rather than the strategic work they were hired to do. This isn't a criticism of HR teams — it's a structural reality of lean HR functions. In many UK mid-sized firms, the ratio is one HR professional to 100 or more employees. The maths simply doesn't work for anything other than triage.
And yet, contrast that transactional burden with the work that genuinely requires human judgment: a grievance investigation that requires careful listening and legal awareness; a redundancy conversation that demands empathy and precision in equal measure; a culture initiative that needs someone who understands the organisation's history and unspoken dynamics; coaching a manager through a difficult performance conversation. None of these are automatable. All of them are where HR's real value lies.
The 30% rule, applied to HR, isn't a threat. It's an argument for why HR should be leading AI adoption rather than waiting for it to arrive. Almost a third of UK employers currently lack clear visibility of their future skills needs for the next two to three years (Personnel Today, 2024) — which means HR teams are already stretched on forward planning. Freeing up capacity isn't a luxury; it's a strategic necessity.
Mapping the 30%: Which HR Tasks Are Most Automatable?
The following isn't a definitive list — every organisation's HR function looks different. Think of it as a starting taxonomy you can adapt to your own context.
Automate: High Automation Potential
These are the tasks where AI can handle the query end-to-end, with appropriate oversight:
- Answering policy and compliance questions (holiday entitlement, parental leave rules, statutory sick pay thresholds)
- Onboarding document checklists, reminders, and first-day logistics
- Benefits FAQs and enrolment guidance
- First-line employee queries routed through a self-service interface
- Scheduling and calendar coordination
- Basic reporting and data pulls from your HRIS
- First drafts of job descriptions and standard HR communications
These tasks share a common characteristic: the answer is largely determined by policy or statute. There's a right answer, it's consistent, and it doesn't require reading the room.
Augment: AI-Assisted, Human-Led
These tasks benefit from AI doing the heavy lifting on information gathering or drafting, but require a human to review, contextualise, and decide:
- Performance review summaries and trend analysis
- Learning and development recommendations based on skills data
- Exit interview analysis and theme identification
- Workforce planning data modelling
- Drafting HR policy updates or employee communications
Keep Human: Low Automation Potential
These are the tasks where AI should not be making decisions, and where its role — if any — is purely research support:
- Grievance and disciplinary proceedings
- Redundancy and restructuring conversations
- Mental health and wellbeing support
- Culture and values work
- Complex employment law judgments
- Manager coaching and development
A practical note for UK HR Directors: employment law complexity adds an important layer here. TUPE transfers, IR35 determinations, Working Time Regulations disputes — these involve fact-specific legal judgments where even AI-assisted research needs human oversight before any action is taken. The stakes of getting it wrong are significant, and an employment tribunal won't accept "the AI said so" as a defence.
What the Other 70% Unlocks: The Real Strategic Opportunity
Here's the reframe that matters: the point of automating 30% of HR tasks isn't to reduce headcount. It's to redirect 30% of HR capacity toward work that actually moves the needle.
What does that look like in practice? It might mean running quarterly manager development programmes that currently can't be resourced. It might mean building a genuine workforce planning capability rather than reacting to attrition. It might mean leading the organisation's AI governance framework — which, given that HR sits at the intersection of people, policy, and compliance, is arguably HR's job to own. It might simply mean having the bandwidth to be proactive rather than perpetually reactive.
This connects back to the 10-20-70 principle. The 70% people-and-process component of AI value creation is precisely where HR's expertise is irreplaceable. Change management, adoption, governance, trust-building with employees — these aren't IT problems. They're HR problems. And they're where the real return on AI investment is realised.
The evidence supports this framing. McKinsey's Superagency in the Workplace report (2025) found that employees who use AI effectively in their roles report higher engagement and a greater sense of purpose at work — not the displacement anxiety that dominates the headlines. The narrative of AI as a threat to meaningful work is, at least in well-implemented cases, simply not what the data shows.
Tools like Aura are built on exactly this logic. Aura handles the repetitive policy and compliance queries that consume HR bandwidth — think holiday entitlement questions at 11pm, parental leave queries from your Paris office, sick pay rules for a remote worker in Germany — so your team can focus on the work that genuinely requires human judgment. When a query falls outside the automatable 30%, Aura creates a ticket and routes it to your designated HR expert with full context attached. No dead ends, no dropped threads.
How to Use the 30% Rule to Build a Business Case for AI Investment
This is the section to take into your next CFO conversation. Here's a five-step framework for turning the 30% concept into a concrete investment case.
Step 1 — Task audit. Spend two weeks logging where HR time actually goes. Be granular: not "employee queries" but "holiday entitlement questions," "parental leave queries," "payroll discrepancy questions." Categorise each task type using the Automate/Augment/Keep Human framework above.
Step 2 — Quantify the 30%. Calculate the hours per week your team spends on automatable tasks. Multiply by average hourly cost (salary plus on-costs). This is your baseline ROI figure — the cost of doing manually what AI could handle. For a team of three HR professionals spending 40% of their time on first-line queries, the numbers add up quickly.
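To make the Step 2 arithmetic concrete, here is a minimal sketch of the baseline calculation. Every figure below (team size, hourly cost, automatable share, working weeks) is an illustrative assumption, not a benchmark — substitute the numbers from your own task audit:

```python
# Baseline cost of manually handling automatable HR tasks.
# All inputs are illustrative assumptions; replace with your audit data.

TEAM_SIZE = 3              # HR professionals
HOURS_PER_WEEK = 37.5      # contracted hours per person (assumed)
AUTOMATABLE_SHARE = 0.40   # share of time on first-line queries (from audit)
HOURLY_COST_GBP = 32.0     # salary plus on-costs, per hour (assumed)
WORKING_WEEKS = 46         # weeks per year, net of leave (assumed)

# Hours per week the team spends on tasks in the "Automate" column
automatable_hours_week = TEAM_SIZE * HOURS_PER_WEEK * AUTOMATABLE_SHARE

# Annualised cost of doing that work manually — your baseline ROI figure
annual_cost = automatable_hours_week * HOURLY_COST_GBP * WORKING_WEEKS

print(f"Automatable hours per week: {automatable_hours_week:.0f}")
print(f"Annual cost of manual handling: £{annual_cost:,.0f}")
```

Even with conservative inputs like these, a three-person team spending 40% of its time on first-line queries represents tens of thousands of pounds a year in baseline cost — which is the figure to put alongside the tool's price in the CFO conversation.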
Step 3 — Model the redirect value. What would your team do with those hours? Attach a value to it. "We could run the manager coaching programme we've been unable to resource, which we estimate would reduce voluntary turnover by X%." Vague efficiency claims don't land with CFOs; specific redirected capacity does.
Step 4 — Identify the risk of inaction. Shadow AI is already happening in your organisation, whether you've sanctioned it or not. Research from BusinessMole (2024) found that around 30% of workers in organisations without clear AI policies use AI in inconsistent, ungoverned ways — creating real compliance and data security risk. Employees are already using ChatGPT to draft HR-adjacent communications, summarise policies, and answer their own questions. The question isn't whether AI is in your organisation; it's whether it's governed or ungoverned.
Step 5 — Select tools with the right architecture. There's a meaningful difference between purpose-built HR AI — grounded in your specific policies and UK employment law, GDPR-compliant, with source citations and controlled knowledge bases — and generic tools like ChatGPT or Microsoft Copilot, which lack source control, carry data security risks, and are prone to hallucination on compliance-sensitive questions. For HR specifically, where a wrong answer about statutory sick pay or a TUPE obligation can have legal consequences, that distinction matters enormously.
A brief governance note: the EU AI Act and the UK's emerging AI regulation framework both have implications for HR technology procurement. High-risk AI applications — which include systems used in employment decisions — face stricter requirements around transparency, human oversight, and documentation. Factor this into your business case now, not after procurement.
The Balance Question: When Should AI Defer to Human Judgment?
The right balance between AI and human judgment in HR isn't a fixed percentage — it's a decision framework. Three factors should determine where any given task sits on the spectrum.
Emotional stakes. Bereavement, mental health disclosures, disciplinary proceedings, redundancy notifications — these require human presence, not just human oversight. An employee in distress needs to feel heard by a person, not processed by a system. AI has no role in leading these interactions.
Legal complexity. TUPE transfers, discrimination claims, whistleblowing disclosures, IR35 determinations — these involve fact-specific legal judgments where the consequences of error are significant. AI can assist with research and precedent, but a human with appropriate expertise needs to own the decision.
Individual context. Policy questions have consistent answers; people's situations don't. An employee asking about flexible working arrangements might be managing a disability, a caring responsibility, or a family crisis. The policy answer is straightforward; the right response requires reading the situation. That's a human skill.
The practical implication: any AI system used in HR must have a clear, frictionless escalation path to a human. Not a dead end, not a "please contact HR during office hours" message, but an intelligent handoff that preserves context and ensures continuity. It's called Human Resources for a reason. AI handles the repeatable; humans handle the irreplaceable.
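The three factors above can be sketched as a simple triage function. This is a minimal illustration of the decision framework, not a production rule set — the field names and routing labels are assumptions for the sake of the example:

```python
# Illustrative triage sketch for routing an HR query, mirroring the
# three factors above: emotional stakes, legal complexity, and
# individual context. Real systems would use richer signals than booleans.

from dataclasses import dataclass

@dataclass
class HRQuery:
    emotional_stakes: bool    # e.g. bereavement, disciplinary, redundancy
    legal_complexity: bool    # e.g. TUPE, discrimination, IR35
    individual_context: bool  # answer depends on the person's situation

def route(query: HRQuery) -> str:
    """Return a handling tier; a human owns anything non-routine."""
    if query.emotional_stakes:
        return "human-led"                       # AI has no role in leading these
    if query.legal_complexity:
        return "human-led, AI research support"  # AI assists; a human decides
    if query.individual_context:
        return "augment"                         # AI drafts; a human contextualises
    return "automate"                            # policy-determined, consistent answer

# A routine policy question can be automated; a grievance cannot.
print(route(HRQuery(False, False, False)))  # automate
print(route(HRQuery(True, False, False)))   # human-led
```

Note the ordering: emotional stakes trump everything else, so a query that is both legally complex and emotionally charged still routes to a human first — which is exactly the escalation behaviour the framework calls for.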
Conclusion: The 30% Rule as a Strategic Lens, Not a Magic Number
The 30% rule isn't a precise formula, and it was never meant to be. It's a useful prompt — a way of asking "which third of our HR work could AI handle better than we currently do, and what would we do with the capacity that creates?"
The answer to that question is where the real strategic opportunity lies. Automating the routine doesn't diminish HR — it elevates it. It creates the space for HR professionals to do the work that genuinely requires their expertise: the judgment calls, the difficult conversations, the culture work, the strategic planning that organisations desperately need but rarely get because the inbox is always full.
And remember the 10-20-70 principle. The technology — the algorithm, the platform, the integration — is the easy part. The 70% that determines whether AI actually delivers value is people, process, and change management. That's HR's domain. It's where HR Directors earn their strategic seat at the table, and where the profession's future is being written right now.
Over the next three to five years, the HR functions that thrive will be those that made deliberate choices about which work to automate, which to augment, and which to keep firmly human — and built the governance frameworks to make those choices stick.
If you'd like to see how Aura approaches this balance — purpose-built for HR, grounded in your policies and UK employment law, with intelligent escalation built in — explore how Aura works.