The Online Safety Bill will become law in the UK as soon as it receives Royal Assent. The legislation will introduce a new regulatory regime for online platforms and search engines which target the UK, imposing wide-ranging obligations on in-scope services with serious consequences for non-compliance.
After a long and controversial passage through Parliament since the Online Safety Bill was first published in 2021, the Bill completed its final stage in the House of Lords on 19 September 2023. Royal Assent is expected to be granted by the end of October 2023.
The Online Safety Act (OSA), as it will be, is part of the Government’s mission to “make the UK the safest place in the world to be online”. It is a vast piece of legislation and has been described by the Government as the “most powerful child protection laws in a generation”.
What measures does the OSA introduce?
The OSA will impose new duties on ‘user-to-user’ services and search services to tackle (1) illegal content, which includes content relating to terrorism and child sexual exploitation and abuse, and (2) content that is harmful to children on their platforms.
New offences have also been created, including the offences of epilepsy trolling, cyber-flashing, and sharing intimate images online, including “deepfake” pornography.
Since its inception, the OSA has undergone significant changes over the course of Parliamentary debate. Particularly onerous provisions added to the Bill include:
- Stricter requirements for certain categories of services to proactively prevent under-18s from seeing the “highest risk” forms of content, such as content that encourages, promotes, or provides instructions for suicide, self-harm and eating disorders.
- Explicit requirements for online providers to implement age verification and age estimation measures that are effective in preventing children from accessing pornography.
- New obligations to seek to protect users from ‘scam ads’ and online fraud.
- Stricter user empowerment provisions to enable adult users to avoid content they do not want to see (e.g., abusive content).
- Greater powers for coroners to access children’s data on behalf of bereaved parents.
Ofcom, as the appointed regulator, has been tasked with developing codes of practice which will indicate what steps will need to be taken to comply with the legislation. It will also be granted extensive new powers to ensure the OSA is adequately enforced and complied with.
When will the changes take effect?
The Government and Ofcom have stressed that the changes introduced by the OSA will be implemented as quickly as possible once it becomes law.
According to Ofcom’s current roadmap to regulation, the regulator will adopt a phased approach to the OSA’s implementation. Phase one will focus on illegal harms, with Ofcom planning to publish its codes of practice relating to illegal content duties very shortly after commencement.
Phase two will focus on child safety duties and pornography. Ofcom intends to consult on its draft guidance on age assurance in Autumn 2023, with a further consultation on its draft codes of practice relating to the protection of children in around Spring 2024.
The final phase concerns transparency, user empowerment, and other duties on categorised platforms. Ofcom is currently considering responses to a call for evidence on the thresholds of these categorised services, and will be publishing a further call for evidence in Autumn 2023 on the duties that apply to categorised services.
Who needs to comply?
Services caught by the OSA include not only social media companies and search engines, but also any services that allow users to encounter content published by one another, such as forums, blogs, gaming services, chat services, dating apps and messaging services. Businesses of all types and sizes will therefore be required to comply with the legislation.
Any service that targets the UK will be caught, so international services that may have a relatively modest UK user base will still need to comply.
In-scope services will be expected to:
- Remove illegal content quickly or prevent it from appearing in the first place.
- Prevent children from accessing harmful and age-inappropriate content.
- Enforce age limits and age-verification measures.
- Ensure the risks and dangers posed to children on the largest social media platforms are more transparent (e.g., through undertaking risk assessments and publishing summaries of these in their terms of service).
- Provide parents and children with clear and accessible ways to report any problems encountered online.
- Deliver upon the promises made to users in their terms of service.
Failing to comply with the obligations imposed by the OSA will carry serious consequences. In-scope services that are found to be in breach of their duties could face fines of up to £18 million or 10% of their global annual revenue (whichever is higher). Senior executives and managers could also face criminal prosecution for certain offences created by the OSA.