Inforrm covered a wide range of data protection and privacy cases in 2025. Following our posts in 2018, 2019, 2020, 2021, 2022 and 2023, here is our selection of notable privacy and data protection cases from England and Wales, the ECtHR, the CJEU, Australia and Canada in 2025.

1.  Farley & Ors v Paymaster (1836) Limited (t/a Equiniti) [2025] EWCA Civ 1117

The case concerned 432 current and former Sussex Police officers whose annual pension benefit statements were mistakenly posted to out-of-date addresses in 2019 by Equiniti, which manages their pension scheme. Each statement contained sensitive information including names, dates of birth, National Insurance numbers, salary details, and pension information. In February 2024, the High Court had struck out all but 14 of the claims on the basis that claimants needed to prove their data had been accessed by unauthorised third parties. Nicklin J held that, absent such proof, there was no “real” processing of the data and thus no actionable GDPR breach ([2024] EWHC 383 (KB)). The Court of Appeal allowed the appeal, holding that proof of third-party disclosure is not an essential ingredient of an allegation of processing or infringement under the GDPR. It also held that there is no threshold of seriousness for bringing a data breach claim and that an allegation of “distress” is not an essential ingredient of a claim. The Panopticon Blog had a post about the case.

2.  Information Commissioner’s Office v Clearview AI Inc [2025] UKUT 319 (AAC)

Clearview AI, a US company offering facial recognition software, scraped billions of images from the internet to build its database. The Information Commissioner had issued a £7.5 million fine and an enforcement notice ordering Clearview to stop using UK residents’ data and to delete such data from its systems. The First-tier Tribunal (“FTT”) had ruled in 2023 that the ICO lacked jurisdiction to impose these sanctions on the basis that Clearview AI’s activities—scraping publicly available images of UK residents to support facial recognition services for foreign law enforcement—constituted an act of state beyond the ICO’s jurisdiction. The Upper Tribunal found that the FTT had erred in law in finding that Clearview’s processing was outside the material scope of the GDPR by operation of Article 2(2)(a). The Upper Tribunal agreed, however, that Clearview’s processing fell within the territorial scope of the GDPR. The concept of “behavioural monitoring” in GDPR Article 3(2)(b) should be interpreted broadly and includes “passive collection, sorting, classification and automated data storing” with a view to future use for profiling purposes (including by another controller) – active human involvement is not required. This meant that Clearview’s automated collection of information about individuals through the creation and maintenance of its facial recognition database constituted behavioural monitoring, even though no active real-time monitoring of the individuals occurred. Privacy International had a post about the decision.

3.  Ashley v HMRC [2025] 4 WLR 29

Mr Ashley made a subject access request (SAR) to HMRC under UK GDPR Article 15. HMRC initially refused to provide any data at all in response but, after the issue of proceedings, provided some personal data. It had applied a “sufficient proximity” test, asking whether information in which Mr Ashley was identified or identifiable was “sufficiently proximate” to him. Heather Williams J rejected this approach, holding that information will amount to personal data if it is linked to the data subject by reason of its content, purpose, or effect. For example, valuation figures relating to each of the properties owned by Mr Ashley and caught up in a tax dispute were personal data, even if Mr Ashley was not directly named in those valuations. The court also held that subject access requests are not confined to particular agencies or departments within HMRC, so a comprehensive search across the organisation was required. The Panopticon Blog had a post about this decision.

4.  Ministry of Defence v Global Media and Entertainment Limited [2025] EWHC 1806 (Admin)

Chamberlain J discharged a “super-injunction” which had been granted contra mundum in September 2023. The injunction prevented any reporting of a catastrophic data breach affecting almost 19,000 Afghan applicants for UK relocation and of the fact that the injunction had been granted. The data leak occurred in 2022 when a UK government employee accidentally disclosed a spreadsheet containing 33,000 records of gravely sensitive personal details relating to Afghan applicants, members of the armed forces, and MI6 agents following the Taliban takeover in Afghanistan in 2021. The MOD assessed that public disclosure would expose thousands of people to the risk of extra-judicial killing or serious violence by the Taliban, and may have resulted in some deaths.

5.  Green v United Kingdom [2025] ECHR 91

In this case the Court of Appeal had granted the applicant an interim injunction to protect the confidentiality of material covered by non-disclosure agreements (see ABC & Ors v Telegraph Media Group Ltd [2019] EMLR 5). However, on 25 October 2018, in the House of Lords, Lord Hain made a short personal statement naming the applicant. These comments were widely reported and the orders for anonymity were subsequently discharged. The applicant complained that the failure by Parliament to prevent Lord Hain from revealing this information was a violation of his privacy rights under Article 8. The Court of Human Rights held that there was no violation of Article 8, giving considerable weight to the constitutional role of parliamentary privilege and the limits of State responsibility for statements in Parliament. We had an Inforrm post about the judgment.

6.  EDPS v Single Resolution Board (SRB) (C‑413/23 P)

The case concerned the EU’s Single Resolution Board (“SRB”), an EU agency that had organised a consultation with creditors and shareholders of a Spanish bank. The SRB shared the creditors’ and shareholders’ comments with a consulting firm, replacing respondents’ names with alphanumeric codes. The European Data Protection Supervisor (“EDPS”) argued that the statements were pseudonymised personal data, not anonymised, because the SRB held the alphanumeric codes that could link the responses given during the registration phase to those given during the consultation, even though the identifying information from the registration phase was not transferred to the recipient. The CJEU held that the comments remained personal data for the SRB because it had additional information enabling it to link the data to the data subjects. However, if the recipient could not attribute the comments to an identifiable data subject during its processing because, in reality, the “risk of identification appears insignificant” – such as where re-identification would be unlawful or would require disproportionate effort – the data were not personal data in its hands. In other words, the same data may constitute personal data in the hands of one party, but not in the hands of another.

7.  X v Russmedia Digital SRL (C-492/23) (2025)

Russmedia Digital, a Romanian business, owns the website <www.publi24.ro>, an online marketplace where advertisements may be published either free of charge or for a fee. On 1 August 2018, an unidentified user posted on the marketplace an advertisement falsely representing a woman (the “claimant”) as offering sexual services, including her photographs and telephone number, without her consent. Once notified, Russmedia removed the advertisement within an hour, but it had already been reproduced on several third-party websites, which indicated the original website as their source. The claimant brought proceedings for breaches of her image rights, reputation and data protection rights. The Grand Chamber found that the operator of an online marketplace such as Russmedia Digital is a controller, within the meaning of the GDPR, of the personal data contained in an advertisement published on its marketplace, even though the advertisement is designed and placed by a user; the online marketplace and the advertising user are joint controllers. As a result, operators of online marketplaces are required to have systems in place for identifying advertisements containing sensitive personal data and are obliged to verify the identity of the user-advertiser and whether that person is the person whose sensitive data appears in the advertisement. If the user-advertiser is not the data subject, publication of the advertisement should be refused.

8.  Australian Information Commissioner v Australian Clinical Labs Limited (No 2) [2025] FCA 1224

The case followed a catastrophic cyber-attack on ACL’s Medlab Pathology business, which resulted in the exfiltration of sensitive health, financial, and contact data belonging to over 223,000 individuals. The court ordered Australian Clinical Labs (ACL) to pay AU$5.8 million for privacy interferences affecting those individuals. The Court held, first, that ACL contravened section 13G(a) of the Privacy Act by failing to implement adequate cybersecurity controls to protect personal information from unauthorised access, in breach of Australian Privacy Principle 11.1(b): the antivirus software deployed on the Medlab server did not prevent or detect a threat actor uploading data to the internet between December 2021 and July 2022. Second, ACL contravened section 26WH(2) by failing to carry out a reasonable and expeditious assessment within 30 days to determine whether there were reasonable grounds to believe the cyberattack amounted to an eligible data breach. Third, having formed the view by 16 June 2022 that an eligible data breach had occurred, ACL contravened section 26WK(2) by failing to notify the Commissioner “as soon as practicable”—the court found it was practicable to notify within two to three days, yet ACL delayed for 24 days. The court concluded that ACL engaged in a separate contravention of section 13G for each of the 223,000 individuals affected, rather than treating the inadequate security as a single breach; this approach to counting contraventions significantly increased the potential penalty exposure. The penalty of AU$5.8 million was calculated across three categories of contraventions: inadequate security (AU$4 million), failure to assess promptly (AU$300,000), and delayed notification (AU$1.5 million).

9.  Kurraba Group Pty Ltd & Smith v Williams [2025] NSWDC 396

This case marks the first published judicial consideration of Australia’s new statutory tort for serious invasions of privacy, which commenced on 10 June 2025. The litigation involved a dispute in which the defendant published private wedding photographs of a company CEO as part of an alleged extortion campaign. Gibson DCJ determined that the wedding photographs were never intended for public consumption and that their use to portray “moral delinquency” constituted a serious invasion of privacy. The court granted an interlocutory injunction, the first of its kind under the new regime.

10.  Insurance Corporation of British Columbia v. Ari, 2025 BCCA 131

The case arose after an Insurance Corporation of British Columbia (ICBC) employee accessed and disclosed private information of certain policyholders without authorisation.  The British Columbia Court of Appeal in 2025 upheld a damages award of CA$15,000 per class member without proof of consequential loss or harm for breach of privacy under the BC Privacy Act.  The defendant had argued that aggregate damages should be limited to a nominal amount of CA$500 per class member for the “mere fact their privacy was violated”. The BC Supreme Court rejected this position, finding it would “trivialise” the privacy interest violated and render section 1 of the Privacy Act meaningless. The Court of Appeal agreed, finding that damages serve to deter and compensate for the loss of privacy itself.

 Please add additional cases or analysis via the comments. We would like to thank readers for their contributions, reposting and reading of this series to date.