The suicide on 7 February 2026 of Princess Dickson, a social influencer's sixteen-year-old daughter who was subject to "years of online abuse and bullying on Tattle Life", brought into renewed focus the responsibilities of online forums to ensure that the creation and dissemination of information about identifiable individuals, especially by "anonymous strangers", is lawful.

In particular, on 9 February 2026 it triggered an open letter from some twenty MPs to UK Information Commissioner John Edwards (as well as to Dame Melanie Dawes, the Chief Executive of Ofcom) which detailed Princess' experience, noting that this "followed more than five years of harassment directed at her mother" and that after Princess' death users were "posting degrading and offensive comments about both Princess and her mother". It requested, inter alia, an investigation of Tattle Life for potential breaches of data protection legislation and even the taking of action to remove or suspend the site as a whole. Within the week, Edwards replied with his own public letter stating that the Information Commissioner's Office (ICO) was already investigating Tattle Life and that this "will establish whether there are any infringements upon which we can take enforcement action". Irrespective of whether it is directed at adults, children or both, online abuse involving personal information can clearly cause very serious harm, and this is so irrespective of whether a service is likely to be accessed by minors.

However, whilst tackling such online abuse holistically is a crucial part of the ICO's remit, it is not currently in an effective position to do so. This is because in 2025 it deliberately chose to delete all mention of previous guidance (drawn up in 2013) setting out a general policy on the responsibilities of online forums, after having failed for years to follow through on its publicly announced intention to update that guidance for the General Data Protection Regulation (GDPR), which has applied since May 2018. Furthermore, its 26 March 2026 response to my Freedom of Information (FOI) request indicated that during the intervening period of almost eight years it failed to produce even a partial draft of an updated policy, and that it holds no record of the decision to remove the now defunct policy guidance, nor any discussion related to its announced intention to produce an updated version or of what became of that intention.

The origins of the ICO's defunct policy trace back to its explicit refusal in the early 2010s to regulate another deeply problematic online forum, evocatively named Solicitors from Hell. In a subsequent High Court judgment shutting down the site under data protection (and also harassment) law, Mr Justice Tugendhat strongly criticised the ICO, stating that he "did not find it possible to reconcile" its views "with authoritative statements of the law". He further clarified that data protection "does envisage that the Information Commissioner should consider what it is acceptable for one individual to say about another" (at [100]). In response, the ICO's new policy recognised that such online forums did have terms and conditions of use and were therefore data controllers. As a result, it held that they must have complaint policies "sufficient to deal with … derogatory, threatening or abusive posts" as well as "disputes between individuals about the factual accuracy of posts". The ICO also indicated that it would take action through complaint-handling and wider regulation to ensure the adequacy of online forums' policies and procedures.

Figure 1 – ICO's intention to update post-GDPR (removed in 2025)

Unfortunately, despite rising public concern, the ICO did little to implement its policy and, in particular, never deployed any of its formal regulatory powers. Nevertheless, after the GDPR began to apply in May 2018, the policy guidance was not only retained on the ICO website but the ICO also announced an intention to "update this guidance to explain the requirements" of the new law (see end of webpage). It should be noted here that the new law has been widely recognised as being especially exacting, including in comparison with the status quo ante. Indeed, in December 2025 the Court of Justice of the EU in Russmedia controversially held not only that the intermediary shields did not apply at all here, but also that the GDPR's wording required an online service which merely organised "the classification which will determine the arrangements" (at [72]) for the dissemination of personal data (even of adults) to verify the identity (and not just the age) of its users, to ensure that a lawful lifting of the prohibition on special category data processing applied prior to any publication, and to be able to demonstrate that the GDPR's accuracy duties had also been met.

Figure 2 – ICO's FOI response, 26 March 2026

Despite the ICO's publicly announced post-GDPR intention, in the many years that followed it failed to update its online forum policy in any way. Even more concerningly, sometime during the first half of 2025 all mention of this policy guidance, including the intended updating, was removed without warning or notice from the ICO website (compare the end of these two archived versions). It would appear that the policy itself may have been removed somewhat earlier, although it was still there in June 2023. Furthermore, the ICO's 26 March 2026 response to my FOI request (see Figure 2 above) revealed that in the almost eight years since the GDPR it failed to draft (even internally) any updated guidance "covering the same issues, either in whole or in part" as were focused on in the 2013 policy. It also indicated that it holds no record of the decision to remove that policy from its website or the rationale for doing so, let alone any discussion of either that decision or its announced intention to produce updated policy guidance.

Given the deliberate absence of any overarching ICO policy in this area, and notwithstanding the Information Commissioner's 13 February 2026 public letter, the ICO is sadly not currently in a good position to holistically investigate online forums such as Tattle Life for data protection violations related to derogatory, threatening or abusive posts. Yet this is undoubtedly an important part of its duties and, following previous judicial criticism, it did draw up a policy in 2013 which was meant to discharge them. It is therefore particularly troubling that it failed to undertake any post-GDPR updating of this policy, even internally, despite announcing an intention to do so for over half a decade.

Given the very strong and justified public concern about the very serious harms to both adults and children resulting from widespread violations of data protection on such forums, it is even more disturbing that all mention of the previous policy guidance and the announced intention to update it was removed from the ICO's website last year, without any notice or justification. The ICO now indicates that it holds no record on this matter. Questions should therefore be asked about whether the ICO is properly and transparently discharging its vital regulatory responsibilities in this highly impactful and crucial area.

David Erdos is Professor of Law and the Open Society and Co-Director of the Centre for Intellectual Property and Information Law in the Faculty of Law, and WYNG Fellow at Trinity Hall, University of Cambridge. He is also a fixed-term member of Matrix Chambers.