The Online Safety Bill was reintroduced to Parliament late last year, with new amendments receiving scrutiny in the House of Commons in December, before the bill entered the House of Lords in January.
The bill continues to evolve. Most notably, the government has removed the requirement for user-to-user platforms and search engines to prevent adult users from encountering “legal but harmful” content, instead requiring online platforms to provide users with enhanced content controls (i.e., a choice as to what content is seen) and imposing further obligations around transparency and the enforcement of online platforms’ terms and conditions. “Legal but harmful” obligations remain in place for under-18s. The government has recently agreed to introduce criminal sanctions for senior managers who fail to take proportionate measures to protect children from potentially harmful content.
The bill enjoys considerable cross-party support. In Parliament on 5 December 2022, Damian Collins MP praised the work done by MPs in progressing the legislation:
As Members know, there is a tradition in the United States that when the President signs a new Bill into law, people gather around him in the Oval Office, and multiple pens are used and presented to people who had a part in that Bill being drafted. If we required the King to do something similar with this Bill and gave a pen to every Minister, every Member who had served on a scrutiny Committee and every hon. Member who introduced an amendment that was accepted, we would need a lot of pens and it would take a long time. In some ways, however, that shows the House at its best; the Bill’s introduction has been a highly collaborative process.
Damian Collins is right that the bill has involved an extraordinary number of amendments, but whether this is a good recipe for well-thought-through legislation on something as fundamental as what information is available online is a separate question. Numerous free speech groups and lawyers have suggested that the legislation, while well intentioned, fails to grapple with the nature of the internet and is either unworkable or may have dramatic consequences for free speech.
In many ways, Parliament has saved itself from the difficult questions by kicking the can down the road, requiring Ofcom to produce ‘codes of practice’ which will fundamentally shape what content can be shown on social media and what people can be shown when they look for content on search engines. This in itself is problematic. First, it means that Parliament is introducing legislation without itself having worked through and debated the implications of what it is mandating at a high level. Second, there is a democratic deficit in expecting an unelected regulator, more practised in dealing with linear broadcasting complaints, to draw the boundaries in British society around what information and speech is permitted online and how content is moderated. So, despite good intentions, the legislation in its current form leaves many open questions and has the potential to lead to unintended and damaging outcomes.
The systems effect
The OSB has been heralded as a “systems” bill, targeted at regulating the algorithmic processes and technologies used by platforms rather than individual pieces of content. Damian Collins MP expressed his support for this approach in recent debates, saying “if people posted individually and organically…and that sat on a [social media] channel that hardly anyone saw, the amount of harm done would be very small”.
But mandating that algorithmic systems and technologies be implemented across the internet has potentially enormous consequences for niche or smaller user-to-user services. Take one example: Mastodon, an open-source alternative that has grown in popularity since Elon Musk’s takeover of Twitter. Mastodon allows users to run self-hosted social networking services and has been heralded by many well-known celebrities as a safer space and an alternative to larger social media sites. But what would the effect of the OSB be on this platform?
On Mastodon, each user is a member of a specific Mastodon server (or “Instance”) which operates as a federated social network. Each Instance operates its own content moderation policies, run by unpaid volunteers, summed up in Mastodon’s terms and conditions with the statement: “Who owns Mastodon? Everybody!”. The OSB’s requirements on platforms to tackle illegal and harmful content (rather than relying on “self-moderating” processes) might make such self-governing communities unworkable, expose volunteers to civil penalties, or in some cases even subject them to criminal investigations.
This isn’t just a Mastodon problem. Almost every online platform that allows user-to-user engagement or search will be caught by the OSB. From Wikipedia, to Mumsnet, to Minecraft, to Signal, to Tinder, to your local community forum, every online platform or communication channel around the globe which ‘targets the UK’ will have to comply with an increasingly onerous array of obligations.
Following political pressure from backbenchers, the government has confirmed it supports an amendment to the OSB imposing criminal liability on senior managers of online platforms who have consented to or connived in ignoring enforceable requirements of the bill, thereby “risking serious harm to children”. The bill already included a provision making senior managers liable if they failed to comply with requests to provide information (‘information notices’) sent by Ofcom.
Wikipedia has spoken out about the proposal, arguing that the risk of criminal sanctions could affect what is widely regarded as a public interest resource. All content on Wikipedia is produced by volunteers, and, similar to Mastodon, the community decides what is acceptable. The Wikimedia Foundation (which hosts the encyclopaedia) does not involve itself in those decisions. The possibility of criminal liability for senior managers would force it to intervene if a volunteer editor kept up an article that could be deemed illegal, or harmful and accessible to children, under UK law – requiring the platform to make judgment calls about public interest content, including decisions about which encyclopaedia entries should be accessible to under-18s.
Another proposed amendment to the OSB is to include in the definition of “priority illegal content” – i.e., content which platforms must proactively implement technological processes to remove – any content which could be seen to promote, aid or abet illegal immigration. So, in 2024, it is feasible that a tweet supporting the plight of those arriving on the UK’s shores in small boats could be deemed ‘priority illegal content’, and platforms that fail to stem that content could be open to sanction. Search engines could be penalised if they allow UK users to access websites that discuss seeking asylum by illegal means. And the decision about what is available and what isn’t will often be taken by computer algorithms which struggle to understand context, potentially using technology designed by Ofcom or mandated via a technology notice.
Other unintended consequences are not hard to imagine. Would algorithms remove all tweets promoting an environmental protest, on the basis that they are procuring a Public Order offence? Would the UK population be banned from discussing the unofficial ‘420 Day’, where cannabis producers, consumers and advocates around the world celebrate marijuana use? Would discussion online about the pros and cons of buying shares be removed for potential violation of financial services legislation?
Quite apart from the potential impact of these amendments on freedom of expression, industry body TechUK has said that expanding criminal liability will be perceived as a “very open-ended risk by investors”. Against this hostile regulatory backdrop, it is hard to conceive that there won’t be a significant impact on the UK technology sector. The comparatively moderate regulation introduced by the European Union’s Digital Services Act last year could mean talent and investment in the sector shifting to Berlin or Lisbon instead.
The government says that extensive steps have been taken to ensure that journalistic and news publisher content is given special protection within the OSB, including the introduction of a “temporary must carry” provision whereby platforms will need to notify news publishers and offer a right of appeal before removing or moderating journalistic content.
Despite these good intentions, there are concerns that other provisions in the OSB could seriously jeopardise journalistic sources and confidential journalistic material, since the bill does not incorporate the protections of s. 10 Contempt of Court Act 1981 and Article 10 ECHR – which ensure that journalists are entitled as a matter of law to protect the identity of their sources save in limited circumstances.
A report by Index on Censorship has considered the powers awarded to Ofcom to impose “s. 104 notices” (“s. 110 technology notices” under the amended bill) on operators of private messaging apps and other online services (including those which currently use end-to-end encryption), requiring them to use technology which monitors the private correspondence of UK citizens in order to identify terrorism and child exploitation content. These notices, which the government appears to view as a necessary measure to ensure the effective identification and removal of the most harmful material, could require providers to override the protections offered by end-to-end encryption. According to Index on Censorship (supported by a legal opinion from Matthew Ryder KC), the powers envisaged under the bill in essence provide for state-backed surveillance powers which go far beyond those currently available in UK law. Surveillance of this type would, for example, only be available under the Investigatory Powers Act in the interests of national security, and even then only with a warrant from the Secretary of State, who must be satisfied the request is “necessary and proportionate”. Ofcom could therefore be granted a wider remit on mass surveillance than GCHQ. Signal, the platform favoured by investigative journalists and whistleblowers, has spoken out about the bill, stating it would “create an unprecedented regime of mass surveillance that would all but eliminate the ability of people in the UK to communicate with each other outside of government interference.”
The unintended consequences here are obvious, with a severely detrimental impact on journalism. Individuals could be subject to ongoing surveillance ordered by a regulator and operated on an indiscriminate basis via algorithms, with some of that content then being escalated for human review. This in turn could expose journalistic sources and endanger individuals investigating politically sensitive issues. Index on Censorship warns that “unless the government reconsiders or parliament pushes back, these powers are set on a collision course with independent media and journalism as well as marginalised groups”.
Given the extremely wide-ranging nature of the bill, important issues like this – deserving of attention due to their potentially enormous impact on journalism, human rights, and data privacy – risk being swept through with insufficient Parliamentary debate.
Despite the obvious enthusiasm in the Commons to enact the legislation with speed, the hope is that the House of Lords or the government will take a more sober approach. Debate and scrutiny are needed across all aspects of the OSB, including further engagement with stakeholders across the industry and proper examination of the potential impacts of introducing this legislation.
If the history of this bill is anything to go by, it’s likely that there will be a few more hurdles to overcome in the coming months.
This post originally appeared on RPC Perspectives and is reproduced with permission and thanks.
Nadia Tymkiw is a Senior Associate at RPC.