Earlier this week, in a written statement to the House of Commons, the UK’s Secretary of State for Digital, Culture, Media and Sport, the Rt Hon Nicky Morgan MP, announced the end of the UK’s controversial age-verification plans for online porn.

Part 3 of the Digital Economy Act 2017 (hereafter: the DEA), as given further effect by the Online Pornography (Commercial Basis) Regulations 2019 (SI No 23 of 2019), seeks to regulate online pornography. Section 14 DEA imposes a requirement on pornography websites to prevent access by persons under 18; section 16 DEA permits the designation of an age-verification regulator (AVR); and, pursuant to the procedure in section 17 DEA, the Secretary of State for Digital, Culture, Media & Sport announced in the House of Commons on 20 February 2018 that the British Board of Film Classification (BBFC) was designated as the AVR.

Pursuant to section 25 DEA, the AVR published Guidance on Age-verification Arrangements (pdf) in October 2018, and this was approved by the House of Commons on 17 December 2018. At this point, the EU Commission should have been notified, pursuant to Articles 4 and 5 of Directive (EU) 2015/1535, which lays down a procedure for the provision of information to the Commission to ensure as much transparency as possible as regards national initiatives for the establishment of technical regulations and of rules on Information Society services. For example, the Court of Justice of the European Union recently held that German legislation prohibiting internet search engines from using newspaper or magazine snippets without the publisher’s authorisation was subject to such prior notification to the Commission and had to be disregarded in its absence (see Case C-299/17 VG Media v Google). However, the UK Government failed to make the necessary notification.

The AVR was to have begun enforcement on 15 July 2019. It had set up a voluntary, non-statutory age certification scheme, with which online age-verification services would have sought to comply. Sites that failed to verify users’ ages could have been blocked, and could have been prevented from accessing payment services and other ancillary services, pursuant to sections 21 and 23 DEA.

However, on 20 June, having discovered the failure to notify, the Government announced in the House of Commons that there would be a postponement of up to six months to allow notification to the EU. Now comes the news that the whole scheme is to be abandoned. In her statement to the Commons, the Secretary of State said [with added links]:

Protecting children is at the heart of our online harms agenda, and is key to wider Government priorities. … The Government published the Online Harms White Paper in April this year [the White Paper; pdf]. … The Government announced as part of the Queen’s Speech [on 14 October 2019] that we will publish draft legislation for pre-legislative scrutiny. It is important that our policy aims and our overall policy on protecting children from online harms are developed coherently in view of these developments with the aim of bringing forward the most comprehensive approach possible to protecting children.

The Government have concluded that this objective of coherence will be best achieved through our wider online harms proposals and, as a consequence, will not be commencing part 3 of the Digital Economy Act 2017 concerning age verification for online pornography. The Digital Economy Act objectives will therefore be delivered through our proposed online harms regulatory regime. … [emphasis added]

All of this is not quite a complete abandonment of the age-verification plans, nor is it exactly back to the drawing board, since the commitment is now to do something in the context of the implementation of the White Paper, which envisages that the proposed independent regulator for online safety will provide guidance on the steps internet providers should take to ensure that children are unable to access inappropriate content, including guidance on age verification (see [7.52] 76). Hence, if an age-verification scheme similar to the now-abandoned one is established as part of the online harms regulatory regime, then all that has happened is another (if potentially rather longer) postponement. Moreover, since the scope of the White Paper is much broader than that of Part 3 DEA, covering all “online harms” rather than being specifically confined to porn, the postponed regime might turn out to be more comprehensive than the abandoned one.

As Pamela Weaver pointed out to me on Twitter, the following day, 17 October 2019, the Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport, Matt Warman MP, confirmed this analysis when he provided further detail on the Government’s change of approach to online age-verification:

… we can protect children better and more comprehensively through the online harms agenda [in the White Paper] … than we can through the measures in the Digital Economy Act 2017. I shall be straightforward: it will take slightly longer to do it through this mechanism, but we will go as fast as we can and deliver on the agenda in a more comprehensive way. … This week, the Government announced as part of the Queen’s Speech that they would publish draft legislation for pre-legislative scrutiny next year … This is not an indefinite postponement of the measures that we are seeking to introduce; it is an extension of what they will achieve. …

Age verification will be a key part of the online harms agenda. It will be a key tool in the box, but the toolbox will, through the online harms agenda, be bigger. … I hope that the BBFC will be a key part of the future of this process, because its expertise is in the classification of content. … However, I am not seeking to make age verification line up with that timescale [of the implementation of the White Paper]. We will do this aspect of the policy as quickly as we possibly can.

Politicians on the Irish side of the Irish Sea have also been considering online age-verification measures, though with a general focus rather than one confined to porn. A personal public services number (PPSN) enables access to social welfare benefits and other public services; and it has recently been supplemented by a controversial public services card (PSC), undergirded by a single online login for public services (MyGovID). Jim Daly TD (on 22 January 2018) and Hildegarde Naughton TD (on 2 April 2019) have both called for such IDs to be linked to social media accounts for age-verification purposes.

In a refinement of his initial suggestion (on 27 December 2018 and 20 February 2018), Daly has also suggested that the State should produce a separate system of online codes to verify the age of an applicant for an account with an internet provider. However, these calls have, so far, received a mixed political welcome. On the one hand, in response to Daly’s first suggestions, the Taoiseach said there is no plan to link the PSC to social media accounts. On the other hand, in response to Daly’s subsequent suggestions, the now-outgoing EU Commissioner for Digital Economy and Society, Mariya Gabriel, agreed that strong authentication is an essential element in creating an internet that people can trust. As Ronan Lupton pointed out to me on Twitter, implementation may not be too far away: the incoming President of the Commission, Ursula von der Leyen, has committed to the introduction of a new EU Digital Services Act to “upgrade … safety rules for digital platforms, services and products” (pdf; p13). And the Minister for Social Protection, Regina Doherty TD, urged the Government to examine whether MyGovID could also be used for age-verification. The least that can be said, therefore, is that online age-verification measures are certainly part of the political and policy discussions around online safety in Ireland.

As a consequence, Irish politicians have been watching the UK’s age-verification developments with interest. On 19 June 2019, giving a reply during Leaders’ Questions in the Dáil, the Taoiseach, Leo Varadkar TD, suggested that, “at the end of the year or perhaps after a year or so of implementation, the Minister for Justice and Equality would make contact with his British counterpart to seek advice and a report on whether the law [Part 3 of the DEA] has been effective and whether there have been unintended consequences”; and he made similar comments the following day, even as the UK was postponing implementation. The Secretary of State’s reason this week for the UK not commencing Part 3 of the DEA is that coherence requires that it be folded into the remit of the regulator to be established pursuant to the White Paper. A similar solution may be emerging in Ireland.

Long-standing Irish Government policy is to establish a similar regulator, the Digital Safety Commissioner. The process probably began with the establishment of the now-defunct Office of Internet Safety [the OIS] in 2007. In May 2014, the Internet Content Governance Advisory Group [the ICGAG] recommended the replacement of the OIS by a National Council for Child Internet Safety, subject to an Inter-Departmental Committee on Internet Content Governance (see the Report of the ICGAG (pdf) 30-33), but it did not propose a regulator with enforcement powers, such as a Digital Safety Commissioner. The most substantial such proposal is the Law Reform Commission Report on Harmful Communications and Digital Safety (2016) (pdf) [the LRC Report] 141-145, which was adopted as formal Government policy in the Action Plan for Online Safety 2018-2019 (pdf) 43-45.

As a consequence, on 4 March 2019, the Minister for Communications, Climate Action and Environment, Richard Bruton TD, announced that he would introduce legislation to establish an Online Safety Commissioner to protect children online. This does not seem very far from the position that the UK Government is now taking. Similarly, on 20 June 2019, giving a reply during Leaders’ Questions in the Dáil, the Tánaiste, Simon Coveney TD (standing in for the Taoiseach), said that it is intended to bring forward the necessary legislation before the end of the year to end the era of self-regulation on social media. On 20 September 2019, in his speech to the Digital Summit, Dublin, the Taoiseach said that these proposals are now at an advanced stage of development. And, in his statement at the Leaders’ Dialogue on the Christchurch Call, at the United Nations in New York on 23 September 2019, he said the proposed legislation will both regulate harmful content online and implement a recent EU Directive on audio-visual media services (AVMS II).

Proposals for digital regulation have not been confined to the Government. On 22 February 2018, a back-bench bill, the Digital Safety Commissioner Bill 2017, passed Second Stage in the Dáil and was referred to the Select Committee on Communications, Climate Action and Environment (though it has not since been debated in that Committee). It would implement the recommendations of the LRC Report, especially as regards the establishment of the Digital Safety Commissioner. On 24 June 2019, in the context of implementing AVMS II, the Broadcasting Authority of Ireland (BAI) proposed that it – rather than a new Digital Safety Commissioner – should be the statutory regulator for online videos and harmful online content (in the UK, the Department for Digital, Culture, Media & Sport has recently consulted on appointing Ofcom (which combines functions equivalent to those of both the BAI and the Commission for Communications Regulation (ComReg)) as the national regulatory authority for AVMS II purposes, at least on an interim basis, until the regulator proposed in the White Paper is established), but the Irish Government has not yet responded to the BAI’s proposals. And, on 16 October, another back-bench bill, the Children’s Digital Protection Bill 2018, was passed in the Seanad. It would regulate legal but age-inappropriate content by means of a takedown procedure, enforced by ComReg.

It is likely that both of these Bills will be overtaken by the Government’s pending proposals, but they demonstrate a commitment by both Houses of the Oireachtas, as well as by the Government and beyond, to deliver regulatory solutions to the problems of online safety in general and the protection of children in particular, whether in the shape of additional functions for existing regulators (such as the BAI and ComReg) or the establishment of a new Digital Safety Commissioner. Hence, in the context of age-verification, in March 2018, the Joint Oireachtas Committee on Children and Youth Affairs (chaired by Daly) said that self-certification of age “is not a robust system”; and it therefore recommended that the Digital Safety Commissioner should be tasked with developing a more accurate system of age-verification. It would not, therefore, be a surprise if the functions of that office were to include the regulation of online age-verification, perhaps similar to the ill-fated UK proposals, but with a broader focus, even without the benefit of a year’s experience with the AVR.

It may be, therefore, that both jurisdictions are converging on similar solutions, which would see online age-verification requirements regulated as part of the functions of an online safety regulator.

This post originally appeared on the Cearta.ie blog and is reproduced with permission and thanks.