The dust has settled since the government released its draft Online Safety Bill. Now is therefore a good time to evaluate its aims, methods, and potential impacts, which we will do in this two-part post.
The first post will look at the overall architecture of the bill, discussing what it is trying to do and how it is trying to do it. The second post will survey responses to the bill from academics and civil society campaigners, discussing whether the bill does too much or not enough.
The general strategy of the Online Safety Bill is to place duties on “regulated services”, requiring them to identify and mitigate system-level risks of harm to their users. This post will focus on the meaning of “regulated services” and the various duties that the bill places them under. As things stand, the bill would give significant powers to OFCOM, which would act as regulator and enforcer of the various duties created under the bill. This first post will conclude with a look at the new powers that would be given to OFCOM under the bill.
The bill would apply to “regulated services”. The definition of regulated services is found in section 3: regulated services are either “user-to-user services” or “search services” which “have links to the United Kingdom” and which are not exempt.
The first important thing to note is the broadness in the drafting of all these definitions. A service has links to the UK if it has a significant number of users in the UK, if UK users are a target market, or if there are “reasonable grounds to believe that there is a material risk of significant harm to individuals in the UK” using the service. Thus, territorially, a very wide range of online services could be caught.
A “user-to-user service” (since publication of the draft bill generally called a ‘U2U’ service in commentary), defined in section 2, is a service which allows users to share user generated content with other users. The definition excludes content generated by the site itself, and content shared by those employed by the service.
This is a widely defined provision. Obviously intended to catch large social media organisations like Facebook, Twitter, Instagram and TikTok, it is nonetheless drafted broadly enough to also include smaller blogs, websites for shopping, online gaming sites and other categories of online platform which host user generated content.
However, the exact nature of how those sites will be regulated will be dependent on their classification by OFCOM as category 1 or 2A/2B services. Category 1 is reserved for services with greater functionality and larger user bases, and services classified as such are subject to stricter duties, which will be explained in the duties section; machinery for classification is currently found in Schedule 4 of the bill.
Some exceptions apply, but these are tightly drafted. Functions such as email and SMS/MMS services, limited functionality services (such as services where users can only comment on site generated content), internal services such as intranets, and public bodies in the exercise of a public function are exempt. Exemptions can be found in Schedule 1; per s.3(8), the Secretary of State can amend the exempt services found there.
“Search services” are defined as services providing an internet search engine that are not U2U services. Many of the same duties apply to search and U2U services, so the two will largely be dealt with together.
Risk Assessments and Safety Duties
Fundamental to the strategy of the Online Safety Bill as it stands is the requirement that U2U and search service providers perform risk assessments which assess the potential for harm produced by their systems. It is worth reiterating that the bill attempts to work at a systems level: individual users will not have any action taken against them on the basis of the Online Safety Bill. What is necessary is to identify how the systems which underpin the services – the algorithms that disseminate content on U2U services, for example – can be potentially harmful. The risk assessment duties of all sites are currently found in s. 7 for U2U services and s. 17 for search services.
All services will have to perform illegal content risk assessments, responding to risk profiles created by OFCOM. These assessments will have to identify, among other things, how quickly illegal content is disseminated on the service, how widely, and the severity of the harm the content may cause. In particular, the bill identifies terrorism and child sexual exploitation and abuse (CSEA) content as being a priority, as well as other “priority illegal content” which will be designated in regulations by the secretary of state. Services likely to be accessed by children will have to perform a “children’s risk assessment”, assessing the likelihood and severity of children accessing harmful content using their services.
Correspondingly, all services have a “safety duty” to take “proportionate steps to mitigate and effectively manage the risks of harm to individuals” which are identified in the risk assessments. For the highest priority content – terrorism and CSEA content – OFCOM will lay down codes of practice helping services to minimise it. For “priority illegal content”, the duty requires services to minimise the presence of illegal content, the time it is accessible on the service, and its dissemination on the service. Further, services have a duty to respond “swiftly” to complaints of illegal content. The safety duties are largely repeated for services likely to be accessed by children, who must mitigate the risk of children coming across harmful content.
Those U2U services designated “category 1” will further have to carry out what the bill calls an “adults’ risk assessment”. This, with the accompanying safety duty, has proven to be one of the most controversial aspects of the bill. It requires category 1 U2U services to identify risks from so-called “legal but harmful” content and proportionately mitigate them. As things stand, the content of this duty is not entirely defined, as part of what is to be considered harmful will be designated in regulations by the relevant secretary of state. However, consultation responses suggest this could include content about self-harm, eating disorders, and suicide.
S. 46(3)–(5) also specifies that content will be considered harmful if there is a “material risk of the content having… a significant adverse physical or psychological impact on an adult of ordinary sensibilities”. A detailed discussion of the potential impacts of this provision and responses to it will be contained in the subsequent post.
Other Category 1 Duties
A number of other duties apply to Category 1 services. These are largely aimed at ensuring that a good balance is struck between the implementation of the adult safety duties and the need to protect both freedom of expression and privacy. The bill clearly envisages that more significant duties need to be placed on category 1 services: for example, they will have to carry out impact assessments on how their policies implementing their safety duties will affect freedom of expression and privacy. Non-category 1 services merely have a duty to “have regard to” the importance of freedom of expression and privacy.

S. 13 places a duty to protect content of “democratic importance” on category 1 services. This amounts to a requirement to “take into account” the importance of content which contributes to democratic debate in the UK when making decisions about whether to remove content from their sites. It is unclear that a requirement to merely “take into account” its importance is on par with the significance ECHR jurisprudence has placed on political speech within the wider area of free expression.
Content of democratic importance is defined in an open-ended manner. While “news publisher content” is assumed to be of democratic importance, it is otherwise defined as content that “is or appears to be specifically intended to contribute to democratic political debate in the United Kingdom”. This seems to place significant discretion onto the category 1 services to decide what ought to be protected as political speech.
Category 1 services must also protect “journalistic content”. This functions largely the same way as the duty to protect content of democratic importance, but also includes a number of duties which give both uploaders and creators of journalistic content an expedited appeals procedure when content considered to be “journalistic” is acted against (such as being taken down). As with content of democratic importance, the definition of journalism will be significant for this provision, although the bill leaves this undefined.
Role of OFCOM
As a regulator, OFCOM’s duties under the bill can be grouped into those helping regulated services comply with their own duties, and powers of enforcement where regulated services do not comply. The bill would give OFCOM significant enforcement powers to police compliance.
Under ss. 72-73, senior managers of regulated services can receive criminal sanctions for not complying with OFCOM investigations. S. 75 gives OFCOM wide powers of entry and inspection. OFCOM can impose penalties of either £18 million or 10% of annual turnover, whichever is greater; for comparison, under UK GDPR, the Information Commissioner can only impose penalties of up to £17.5 million or 4% of annual turnover. Given that category 1 services are in part determined by the size of the service, the 10% penalty may well be the larger.
OFCOM will also be the first place services will look for compliance with the new regime. The various risk assessment duties previously detailed will have to correspond to risk profiles created by OFCOM. This applies to both category 1 services with adult safety duties and other services with illegal content risk assessment duties.
OFCOM also has a duty to prepare codes of practice which help regulated services comply with their various duties. Before setting these out, OFCOM is required to consult with a number of different stakeholders, the list of whom is set out in s. 29(5). These codes of practice will be advisory only, but it is likely that compliance with them will be equated to compliance with the wider duties under the bill. As such, they are likely to become a very significant feature of the regulatory landscape.
Overall, the bill presents a large new regulatory regime. Whether it is likely to be successful or not as things stand will be considered in the next post.
Rafe Jennings is an aspiring barrister with an interest in freedom of expression and privacy online
This post originally appeared on the UK Human Rights Blog and is reproduced with permission and thanks