The Data Protection and Digital Information Bill, currently before the UK Parliament, proposes a slew of changes to UK data protection law, including to law enforcement and intelligence services data processing, regulated by Parts 3 and 4 of the Data Protection Act 2018 (DPA 2018) respectively.

This post critiques three proposed changes to these bespoke regimes: (1) allowing joint processing; (2) removing the obligation to log justifications; and (3) enabling large-scale special circumstances transfers.  It recommends each is reconsidered, given their potential to undermine fundamental rights to data protection and privacy.

Joint Processing

When police (or other ‘competent authorities’) process personal data for ‘law enforcement purposes’ they are currently regulated by Part 3 of the DPA 2018, which implemented the EU Law Enforcement Directive (LED).  Police processing for any other purpose remains subject to the general UK GDPR obligations.  In contrast, data processing for any purpose by the intelligence services—MI5, MI6, and GCHQ—is regulated solely by Part 4, which reflects the Council of Europe’s modernised Convention 108+.  As the Bill’s Explanatory Notes (EN) record at [22], there is an “increasing expectation” police and intelligence services will cooperate in joint operational partnerships.  According to the Bill’s Impact Assessment (IA) at [478], however, the fact these agencies are subject to different data protection regimes “adds friction when working in partnership and presents challenges”.

Clauses 25 and 26 seek to “simplify data protection considerations by enabling a single set of data protection rules to apply”: Delegated Powers Memorandum at [58].  These clauses allow the Secretary of State to issue a notice certifying that police data processing will be regulated exclusively by Part 4 when working jointly with intelligence services.  The intelligence services believe this “will lead to more dynamic working practices with police”, including “improved confidence in sharing data”: IA at [356].  Before issuing a notice, the Secretary of State would need to consult the Information Commissioner’s Office (ICO) and be satisfied that it “is required for the purposes of safeguarding national security”.  Notices would be reviewed annually and could be challenged on judicial review grounds before the Tribunal.

This proposal deserves further scrutiny.  As the ICO noted at [186] of an earlier consultation response, “people have more limited rights under the intelligence regime than where processing is carried out by competent authorities for law enforcement purposes”—these regimes “deliberately impose differing obligations”.  Police could seek a notice under these new provisions when acting jointly with intelligence services during what are (otherwise) ordinary law enforcement investigations.  Such a notice may well be granted on the basis these investigations implicate ‘national security’, given the (deliberate) ambiguity of this term.  While this area is complex, a more rights-friendly way to achieve the Bill’s simplification aim here may instead be to regulate the associated intelligence services processing under Part 3 and/or the UK GDPR. Fundamentally, whether it is ever appropriate to exempt police from these provisions is debatable.

The fact that UK intelligence services data processing is exclusively regulated by Part 4 may moreover be a bug rather than a feature of UK data protection law.  At [32] of its review of the EU Commission’s draft UK LED adequacy decision, the European Data Protection Board (EDPB) suggested “a closer look” is warranted here, as intelligence services’ activities “fall both within the scope of law enforcement and national security”.  Comments from Singh LJ and Holgate J at [130] of a recent Divisional Court judgment on investigatory powers also appear apt:

“When the security and intelligence agencies act for an ordinary criminal purpose, we cannot see any logical or practical reason why they should not be subject to the same legal regime as the police.  The mere fact that in general they operate in the field of national security cannot suffice for this purpose.  It is the particular function in issue which is relevant”.

Logging Justifications

Police must keep logs of data processing activities under s 62 of the DPA 2018.  As well as recording the time, date, and, if possible, identity of the person undertaking processing, police must also currently record the justification (ie reason) for processing.

Clause 16 would remove the justification obligation.  It is apparently “technically challenging” and “resource intensive”, requiring human input: EN at [21] and [221].  It supposedly “holds limited value in maintaining accountability” as “an individual misusing the database is unlikely to record an honest justification”: IA at [139].  The status quo also creates “non-compliance risks” for competent authorities operating old databases—a transitional provision exempting compliance for such databases expires in May 2023: IA at [150]–[152].

This proposal may downplay the status quo’s benefits while inflating its difficulties.  At p 26 of its LED review, the Article 29 Working Party (WP29)—the EDPB’s predecessor—described logging, including justifications, as “a crucial tool for data protection monitoring” with a “two-fold goal”.  Contrary to the IA’s claims, recording a false justification may well assist with “punitive action” against a misbehaving police officer, as the WP29 explained: the officer is committed to the contemporaneous justification recorded; without it, they could freely construct one ex post facto.  Logging also serves “as a deterrent”: requiring officers to consciously record justifications may lead them to reflect on whether processing is legitimate, avoiding misconduct at the outset.

The human input required is also relatively minor—at most, two minutes, according to the IA at [142]—and in any event warranted given its deterrent effect. Other LED jurisdictions operate with this obligation; indeed, some, like Austria, have imposed additional logging requirements, as Matthias M Hudobnik explains at p 496. A reader may have limited sympathy regarding ‘non-compliance risks’: competent authorities have had years to develop new databases since the LED was finalised—doing so would also be consistent with their overarching s 57 ‘data protection by design and default’ obligations, as the WP29 noted at p 27 of its LED review.  Regardless, an alternative would be to further extend this transitional period—a process expressly contemplated by Article 63(3) of the LED, which permits a further three-year extension to May 2026.

Large-Scale Special Circumstances Transfers

Sections 72–78 of Part 3 govern transfers of personal data by police to equivalent overseas competent authorities, providing three tiered transfer mechanisms.  In “special circumstances”, when adequacy regulations (s 74A) and appropriate safeguards (s 75) are unavailable, s 76 permits transfers when necessary for various “special purposes”, operating as derogations.  Two of these are transfers “in individual cases for any of the law enforcement purposes” or “in individual cases for a legal purpose”—unless the public interest in the transfer is outweighed by a data subject’s fundamental rights and freedoms.

Paragraphs 7(4)(c)–(d) of Schedule 6 to the Bill would replace ‘in individual cases’ with ‘in particular circumstances’ for these two derogations.  This is intended to facilitate “more frequent large-scale transfers” under s 76 (IA at [477]), by “making clearer that transfers can take place involving a broader set or category of data in particular circumstances”, particularly for “operations and investigations that are broad in scope”: EN at [746].  A new clause would emphasise that “the amount of personal data transferred in reliance on this section must not be excessive in relation to the special purpose relied on”.

This expansion of these two s 76 derogations risks undermining data protection rights.  Recital (72) of the LED cautions that such derogations “should be interpreted restrictively and should not allow frequent, massive and structural transfers of personal data, or large-scale transfers of data, but should be limited to data strictly necessary”.  The UK Supreme Court has emphasised (at [11], [157], and [223]–[224]) the importance of interpreting s 76 narrowly pursuant to Recital (72).  When discussing the analogous GDPR provision, the European Data Protection Supervisor warned at [224] that “[t]here is a risk that the protection afforded to individuals … would be significantly weakened if any set of transfers, including those that are repeated or massive, could always be justified by one of the derogations and would thus escape from the requirement to enter into appropriate safeguards”.  Ultimately, it may be more appropriate for large-scale transfers to be undertaken using the existing alternative transfer mechanisms.


It is encouraging that the UK Government has delayed the second reading of the Bill “to allow Ministers to consider the legislation further”.  On 3 October 2022, however, the Secretary of State for Digital, Culture, Media and Sport appeared to announce additional data protection reforms, potentially going well beyond the Bill.  Time should be taken to comprehensively review proposed changes to this area of law, including but not limited to the matters above.  Any potential weakening of data protection safeguards should be embarked on with caution.

Policymakers and MPs should also bear in mind that, in addition to undermining data protection and privacy rights, these three amendments would require the UK to diverge from EU data protection law.  Each provision under amendment was referenced by the EU Commission in its UK LED adequacy decision: see (80) (special circumstances), (90) (logging), and (128) (joint processing).  The Commission has also recently been asked whether it intends to revisit its UK adequacy decisions given the Bill.  Whether the proposals outlined above pose any real threat to adequacy is beyond the scope of this post, but merits further scrutiny.  In any event, their potential impact on data protection and privacy alone should give cause for concern.

Tim Cochrane, PhD Candidate at the University of Cambridge Faculty of Law, Centre for Intellectual Property and Information Law, and Fitzwilliam College.