
Making the ICO Accountability Framework work in practice


In over six years of enforcing the UK GDPR, the ICO has never issued a fine for failing to maintain a Record of Processing Activities (ROPA). Article 30 requires controllers to document what personal data they collect, why they collect it, and how they use it – yet not once has a controller been penalised for failing to create or maintain a compliant record. Likewise, Data Protection Impact Assessments (DPIAs) – structured risk assessments that organisations must complete before implementing new systems or processes involving personal data – have yielded only a handful of reprimands and zero fines.

Yet the accountability framework has never been more extensive. In October 2024, the ICO released a nine-toolkit Data Protection Audit Framework covering governance, training, AI, and age-appropriate design, among other areas. Organisations can now assess themselves against the same standards the ICO's own assurance teams use during audits. And the Data Use and Access Act 2025, which received Royal Assent in June, intentionally retained every accountability obligation that the shelved DPDI Bill had proposed to weaken.

The result is a paradox with direct consequences for compliance practice. The regulatory obligations are well-defined and settled, the tools are comprehensive and publicly accessible, yet the enforcement pattern incentivises organisations to invest in security rather than governance. Understanding this paradox is where making the framework work in practice begins.

The accountability obligation is a burden of proof

Article 5(2) of the UK GDPR establishes a dual obligation that is deceptively straightforward: controllers must comply with the six data protection principles, and they must be able to demonstrate that compliance. The latter obligation is the one that matters for practical purposes. Compliance alone is insufficient; an organisation must be able to provide evidence that it has complied.

This burden of proof runs through several interconnected provisions. Article 24 requires controllers to implement "appropriate technical and organisational measures" based on risk and to review those measures regularly. Article 30 requires written records of processing activities to be available to the ICO on request. Article 35 requires DPIAs to be completed before new systems or processes involving personal data go live. Articles 37 to 39 require certain organisations to appoint a Data Protection Officer – the individual responsible for monitoring compliance with data protection law – with statutory independence.
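The Article 30 record has a well-defined shape: the regulation itself lists the headings a controller's record must cover. As an illustration only – the field names below are my own, not an ICO template – a single ROPA entry can be modelled as a structured record rather than a free-text spreadsheet row:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class RopaEntry:
    """One processing activity, mirroring the Article 30(1) headings.

    Field names are illustrative, not an official ICO schema.
    """
    activity: str                       # the specific processing activity
    purpose: str                        # Art. 30(1)(b): purpose of the processing
    data_subjects: List[str]            # Art. 30(1)(c): categories of data subjects
    data_categories: List[str]          # Art. 30(1)(c): categories of personal data
    recipients: List[str]               # Art. 30(1)(d): categories of recipients
    third_country_transfers: List[str]  # Art. 30(1)(e): transfers outside the UK
    retention: str                      # Art. 30(1)(f): envisaged retention period
    security_measures: List[str]        # Art. 30(1)(g): general description of measures

entry = RopaEntry(
    activity="Payroll processing",
    purpose="Calculating and paying staff salaries",
    data_subjects=["employees"],
    data_categories=["bank details", "salary", "National Insurance number"],
    recipients=["payroll bureau", "HMRC"],
    third_country_transfers=[],
    retention="6 years after employment ends",
    security_measures=["role-based access control", "encryption at rest"],
)
```

The point of structuring the record this way is that each Article 30(1) heading becomes a field that is either populated or visibly empty – a gap the ICO can see on request, and one the organisation can see first.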

These provisions interlock. An inaccurate ROPA means the organisation cannot fulfil its transparency obligations, because it cannot inform individuals about processing it has failed to document internally. A missing DPIA means a system went live without a structured assessment of its potential impact on individuals. Weakness in one area of accountability creates exposure across several.

The DUAA 2025 closed the escape route

Prior to mid-2025, some organisations delayed accountability investment on the assumption that future legislative changes would ease the burden. The DPDI Bill, introduced under the previous government, proposed replacing mandatory DPOs with a less independent "Senior Responsible Individual," limiting ROPA requirements to high-risk processing only, and giving controllers broader discretion over DPIAs.

The DUAA 2025 rejected all of these proposals. The DPO, ROPA, and DPIA obligations remain intact. The only textual amendment to the accountability provisions was a minor rewording – "an element by which to demonstrate" became "a means of demonstrating" – with no substantive effect. The accountability framework, as it stands, is settled architecture.

I’m bringing this up because the confusion in the market is real. Several published sources incorrectly claim the DUAA introduces Senior Responsible Individuals replacing DPOs. This is wrong – the SRI provision was in the DPDI Bill and was explicitly not carried forward. Any compliance programme relying on these inaccurate sources risks building against requirements that do not exist in the enacted legislation.

The enforcement pattern does not align with the framework

The comprehensive nature of the framework sits in stark contrast to the ICO's enforcement record. The large monetary penalties – Capita at £14 million, PSNI at £750,000, Advanced Computer Software Group at £3 million – were assessed under Article 5(1)(f) (integrity and confidentiality) and Article 32 (security of processing). The accountability-specific provisions generated reprimands: Chelmer Valley High School for deploying facial recognition without completing a DPIA, Serco Leisure for fingerprint scanning with a defective DPIA and a post-hoc Legitimate Interests Assessment conducted only after the ICO began investigating.

There are signs this pattern may be shifting. The ICO framed the security failures at Capita as "ultimately a cyber governance problem." The DPP Law fine of £60,000 was the first instance where late breach notification was explicitly cited as an aggravating factor. But the current incentive structure remains: organisations that invest in security controls are treated more favourably than those that invest in governance documentation. In my experience, this misalignment is a significant reason why practitioners consistently find under-investment in governance documentation across UK organisations.

The gap hides in plain sight

Survey evidence reinforces what the enforcement data implies. The GRC Solutions GDPR Benchmark Report 2025 found UK organisations operating at "limited" or "developing" levels of accountability maturity across all sectors surveyed. Where ROPAs exist, they are typically outdated spreadsheets with vaguely stated purposes – "HR" rather than the specific processing activities being carried out. DPIAs are treated as formalities, with risks defaulted to "low" and minimal documented mitigation. Training is completed annually to satisfy a requirement rather than to change how people handle data. Processor audit rights are written into contracts under Article 28 but rarely exercised.
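The "HR rather than the specific processing activity" problem is easy to detect mechanically. The sketch below is a hypothetical sanity check – the vague-term list and the word-count heuristic are my own assumptions, not an ICO rule – that flags ROPA rows whose stated purpose reads as a department label rather than an activity:

```python
# Hypothetical heuristic: a purpose that is empty, a bare department
# name, or fewer than three words is probably a label, not an activity.
VAGUE_TERMS = {"hr", "marketing", "it", "admin", "finance"}

def flag_vague_purposes(ropa_rows):
    """Return the rows whose stated purpose looks like a label, not an activity."""
    flagged = []
    for row in ropa_rows:
        purpose = row.get("purpose", "").strip().lower()
        if not purpose or purpose in VAGUE_TERMS or len(purpose.split()) < 3:
            flagged.append(row)
    return flagged

rows = [
    {"activity": "Staff records", "purpose": "HR"},
    {"activity": "Payroll", "purpose": "Calculating and paying staff salaries"},
]
print(flag_vague_purposes(rows))  # flags the "HR" row only
```

A heuristic like this is no substitute for reviewing the record, but it surfaces the most common failure mode in minutes rather than waiting for an auditor to find it.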

The most dangerous compliance posture is not the absence of documentation. It is documentation that does not reflect operational reality. An organisation with a comprehensive set of policies describing a governance programme that does not exist in practice is, in some respects, worse off than one with no policies at all: the documentation creates an illusion of compliance that collapses under scrutiny.

What an ICO audit actually tests

The ICO's audit methodology is designed to find exactly this gap. Consensual audits – approximately 20 to 30 per year, conducted under section 129 of the DPA 2018 – follow a structured process: risk-based planning, document review, on-site interviews with staff, records inspection, and observation of actual processing activities. Each scope area is rated on a four-tier scale from high assurance to very limited assurance, with recommendations assigned priority ratings.

The critical distinction is methodology: the ICO does not simply evaluate whether policies exist. It tests whether they work in practice. Staff interviews reveal whether people follow the procedures their organisation claims to have. Records inspection shows whether the ROPA reflects current processing or was last updated two years ago. The nine toolkits, now publicly available, set out the specific control measures the ICO expects – effectively publishing the exam paper in advance.
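The same documentation-versus-reality test can be run internally before the ICO runs it for you. As a minimal sketch – the data shapes and the idea of a reconciliation pass are my assumptions, not ICO methodology – comparing the systems named in the ROPA against an asset inventory exposes both kinds of drift:

```python
# Hypothetical reconciliation: gaps in either direction mean the ROPA
# no longer reflects current processing.
def reconcile(ropa_systems, inventory_systems):
    documented = set(ropa_systems)
    live = set(inventory_systems)
    return {
        "undocumented": sorted(live - documented),  # processing with no ROPA entry
        "stale": sorted(documented - live),         # ROPA entries for retired systems
    }

result = reconcile(
    ropa_systems=["CRM", "Payroll"],
    inventory_systems=["CRM", "Payroll", "RecruitmentATS"],
)
print(result)  # {'undocumented': ['RecruitmentATS'], 'stale': []}
```

An "undocumented" hit is the more dangerous finding: it is precisely the scenario in which transparency obligations cannot be met, because the organisation cannot tell individuals about processing it has not recorded internally.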

I have worked with accountability expectations across enough different contexts to recognise a consistent pattern: organisations that score poorly are not those without policies. They are the ones whose policies describe a compliance programme that bears little resemblance to how data is actually handled. The framework works when it is treated as a living governance structure. It fails when it is treated as a filing exercise – and closing that gap starts with an honest assessment of whether the documentation matches the reality.