The Data (Use and Access) Act 2026 received Royal Assent and came into force earlier this year. There is no grace period, no transitional window, and no sector exemption for HR. If you run an HR consultancy or recruitment firm that uses AI tools — for CV screening, candidate scoring, performance assessment, or employee monitoring — your legal position changed the moment this Act came into force. Most firms in the sector haven’t mapped it.
This is not a theoretical future risk. It’s a current compliance obligation. Here is what the Act changed, what it means specifically for HR consultancies, and what a compliant position now requires.
What the Data (Use and Access) Act 2026 actually changed
The Act amends the UK GDPR and the Data Protection Act 2018. It is not a wholesale replacement — it is a targeted set of changes to existing data law, several of which have direct implications for HR practice. The four most relevant changes for HR consultancies are:
1. Strengthened rights around automated decision-making
The Act tightens and clarifies the rules around decisions made solely by automated means. This was already a regulated area under UK GDPR Article 22, but the Act strengthens the individual's rights to request human review of an automated outcome, to receive an explanation of it, and to contest it. In an HR context, this applies directly to any AI tool that scores, ranks, or filters candidates without meaningful human oversight at the decision point. If your ATS uses algorithmic shortlisting, your screening process uses AI-powered CV analysis, or you use any tool that assigns candidates a score that influences hiring outcomes, you are in scope. The strengthened rights mean candidates can now formally request human review of their rejection, and a documented explanation of the automated logic must be available on request.
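As a rough technical sketch (not legal advice), the documentation requirement can be modelled as a decision record that keeps the automated score, a plain-English summary of the logic, and whether a human has reviewed the outcome. Every class, field, and function name below is an illustrative assumption, not a format prescribed by the Act:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class AutomatedDecisionRecord:
    """One automated screening outcome, kept so a candidate's
    review request can be answered with a documented explanation."""
    candidate_id: str
    tool_name: str       # e.g. the ATS or CV-screening tool
    score: float         # the automated score that influenced the outcome
    logic_summary: str   # plain-English description of the automated logic
    decided_at: datetime
    human_reviewed: bool = False
    reviewer: Optional[str] = None

    def request_human_review(self, reviewer: str) -> None:
        """Record that a named human has reviewed the automated outcome."""
        self.human_reviewed = True
        self.reviewer = reviewer

    def explanation(self) -> str:
        """Return the documented explanation a candidate may request."""
        status = (f"reviewed by {self.reviewer}" if self.human_reviewed
                  else "not yet human-reviewed")
        return (f"Decision for {self.candidate_id} by {self.tool_name} "
                f"(score {self.score:.2f}, {status}): {self.logic_summary}")
```

The point of a structure like this is not the code itself but the audit trail: if a record like this exists for every automated rejection, the review and explanation rights can actually be honoured.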
2. New transparency obligations for data intermediaries
HR consultancies and recruitment firms occupy a legally complex position: they collect and process candidate data on behalf of client employers, making them data processors (or in some cases joint controllers) relative to their clients. The Act introduces new obligations around data intermediary transparency — including clearer disclosure requirements about how candidate data flows between your firm and client employers, and how it is used in automated or AI-assisted processes. If your contracts with client employers do not already clearly define data controller and data processor roles, the Act has made that gap more consequential.
3. Changes to the legitimate interests basis
The Act introduces a list of “recognised legitimate interests” — processing purposes where the legitimate interests basis can be relied upon without a full balancing test. Candidate data processing for active recruitment may qualify, but the scope is defined narrowly. Retaining candidate data for future roles, using candidate data to train AI screening models, or sharing candidate data with third-party tools without explicit consent are not automatically covered. HR consultancies that have historically relied on loose applications of legitimate interests for candidate data retention need to review that position.
4. Senior Responsible Individual requirement
The Act introduces the Senior Responsible Individual (SRI) role, replacing the Data Protection Officer designation for many organisations. For HR consultancies handling significant volumes of candidate and employee data, the SRI requirement applies. The SRI must be named, must have demonstrable responsibility for data governance decisions, and must be the documented point of accountability for automated processing decisions. This is not administrative paperwork — it is a named individual with legal accountability. Claiming the responsibility is “shared across the team” is not a compliant position.
Where HR consultancies are most exposed
Across the sector, the most common compliance gaps we see fall into three categories:
AI tools adopted without governance. AI-assisted screening tools, interview analysis platforms, and candidate ranking software have been adopted at speed over the past two years. In most cases, the decision to adopt was made by a hiring manager or operations lead, not a compliance function. The result is that many HR consultancies are using tools that generate automated assessments of candidates with no documented oversight process, no audit trail of decisions, and no mechanism for the candidate to request human review. Every one of those tools is now subject to the Act’s automated decision-making provisions.
Candidate data retained beyond its purpose. Building a candidate database is a legitimate business objective. Retaining CVs and contact details indefinitely without a documented retention schedule, a renewal of consent, or a lawful basis for ongoing retention is not. The Act sharpens the ICO's enforcement position on data minimisation. Candidate databases that have never been audited for stale records, lapsed consent, or retention without purpose are a visible enforcement risk.
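The audit described above can be sketched in a few lines. The record fields and the 24-month retention period below are illustrative assumptions for the sketch, not figures taken from the Act:

```python
from datetime import datetime, timedelta, timezone

RETENTION_PERIOD = timedelta(days=730)  # illustrative 24-month retention rule

def flag_stale_records(records, now=None):
    """Return candidate records held past the retention period
    with neither renewed consent nor a documented lawful basis."""
    now = now or datetime.now(timezone.utc)
    return [
        r for r in records
        if now - r["collected_at"] > RETENTION_PERIOD
        and not r.get("consent_renewed")
        and not r.get("lawful_basis")
    ]
```

Run against a real candidate database, the output of a check like this is the starting list for deletion or consent-renewal decisions, which is exactly the audit most firms have never performed.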
Unclear controller/processor boundaries with client employers. When a candidate’s data is shared with a client employer, who is responsible for what? In most HR consultancy arrangements, the answer to that question is either undefined or buried in a commercial contract that was not drafted with data law in mind. The Act’s transparency obligations make this ambiguity harder to defend.
What a compliant position looks like
Compliance with the Data (Use and Access) Act 2026 in an HR context requires four things to be in place:
- An AI and automated processing inventory. Every tool that makes or influences candidate or employee decisions must be documented, including its purpose, its data inputs, and what human oversight exists at each decision point.
- A candidate-facing transparency notice that clearly explains when automated processing is involved in hiring decisions and how candidates can exercise their right to request human review.
- A named Senior Responsible Individual with documented accountability for data governance and automated decision oversight — not a committee, a named person.
- A data retention schedule covering candidate data, with clear rules on how long data is held, on what basis, and what the process is for deletion or consent renewal.
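To make the first checklist item concrete, here is a minimal sketch of what an automated-processing inventory entry might look like, with a check for tools that lack a documented human oversight point. The structure and field names are assumptions for illustration, not a prescribed format:

```python
from dataclasses import dataclass

@dataclass
class ToolInventoryEntry:
    """One row in the AI and automated processing inventory."""
    tool: str             # e.g. "CV screening platform"
    purpose: str          # why the tool is used
    data_inputs: list     # categories of candidate/employee data it receives
    oversight_point: str  # where a human reviews the output ("" = none documented)

def undocumented_oversight(inventory):
    """Return tools with no documented human oversight point,
    the gap the Act's automated decision-making provisions expose."""
    return [e.tool for e in inventory if not e.oversight_point.strip()]
```

Whether the inventory lives in code, a spreadsheet, or a register is immaterial; what matters is that every tool has a row, and no row has an empty oversight entry.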
None of this is disproportionately complex for a professional HR consultancy. But each item requires deliberate action — it does not self-assemble from existing processes, and it does not emerge from a generic GDPR policy written before AI screening tools existed.
The practical question is when, not if
The ICO has an AI Auditing Framework, enforcement powers under existing data law, and an established pattern of using complaint-led investigations to open broader audits of organisations. In a sector where candidates regularly receive unexplained automated rejections, the number of complaints that could trigger ICO attention is not small. The question for HR consultancies is not whether the Act applies to them — it does. The question is whether they want to address it now, or explain their position to the ICO later.
Find out exactly where your HR consultancy stands
We offer a structured AI Compliance Framework for UK professional services firms — including HR consultancies. It covers your AI tool inventory, automated decision documentation, SRI appointment, and candidate transparency obligations. Fixed fee: £497.
About the author: Scott Neve is the founder of Ops Intel, a Newcastle-based AI compliance and automation consultancy. He works with HR consultancies, solicitors, accountants, and professional services firms across the North East and wider UK.