Inforrm's Blog

DCMS Report: Time to regulate social media? – Zoe McCallum

Last month the Commons DCMS Committee on Disinformation and Fake News [pdf] recommended a new regulator for tech companies and an enhanced role for the ICO, paid for by a levy on tech giants.

These recommendations were highly political, made against a background of declining public trust in big tech in the wake of Cambridge Analytica and other data breach, disinformation and election interference scandals. So why the push for regulation – and are the recommendations sensible?

The push towards tougher regulation

The Report quotes Margot James MP (Digital Minister at DCMS, closely involved in the White Paper soon to be announced) lambasting tech companies for failing to act swiftly to remove harmful online content:

There have been no fewer than fifteen voluntary codes of practice agreed with platforms since 2008. Where we are now is an absolute indictment of a system that has relied far too little on the rule of law.

The criticisms of social media companies driving the regulatory push are not limited to their role in the proliferation of fake news and in election day scandals. They are:

Dragging feet / hiding under the table

Whilst tech companies can (and do) work to tackle these issues, the Report criticises them for not doing it fast enough – and for disclaiming liability in respect of some “online harms”. Specifically, it claims:

The Report’s recommendations

To tackle these “online harms” the Committee recommends:

Comment: Are the recommendations sensible?

The last recommendation concerning the ICO is straightforwardly based on a misconception (see [42]). As ICO Guidance makes clear, the definition of “personal data” in the GDPR/DPA is already wide enough to cover profiling by social media companies. The Committee was right, however, to describe the ICO as underfunded relative to the task in hand.

Contrary to the report, “technical experts” are no more qualified to develop a code of practice for social media companies than TV camera operators are to develop the Broadcasters’ code. Nonetheless, there is (relatively speaking) an emerging consensus around forms of online harm. Social media companies have admitted on occasion that the scale of some tasks was too difficult to tackle e.g. in relation to incitement to hatred in Burma.

As to whether self-regulation is preferable to independent regulation, comparison to IPSO vs the Royal Charter system is tempting but unhelpful. That is because social media companies “select” rather than “generate” content. Via algorithms they wield enormous power to decide what of the endless internet we consume via newsfeed. But it is the power of a gatekeeper, not the writer – and an interference with the power to curate a newsfeed is not in the same league as an interference with journalistic freedom.

There are analogies to broadcast regulation. Historically, one purported justification for censoring broadcasters more heavily than print media (e.g. the watershed) was the sheer power of moving images to provoke a reaction. Social media newsfeeds are full of moving images and arguably more compelling than TV – because they involve interactivity. If the rationale still held for regulating broadcasters, it would carry over to the internet. But the Broadcasters’ Code is part-product of a historical hangover: when only analogue transmission was possible, spectrum scarcity was the rationale for imposing onerous requirements relating to impartiality in TV news and election coverage. Now there is a proliferation of channels, the internet competes with TV and there are multiple technological solutions to control access to both by vulnerable users. Heavy-handed regulation is no longer justified – even though it persists.

It would be sensible (and cost-effective) to adopt a unified approach to regulating broadcast and social media – and take the opportunity to overhaul the broadcast system. That solution is unlikely to be palatable to tech companies but preferable to a German-style NetzDG law. I doubt that the government or Ofcom have the appetite – but via the White Paper, we’ll find out soon.

Zoe McCallum is a barrister at Matrix Chambers, practising in the field of media and information law.

This post originally appeared on the Matrix Media and Information Law website and is reproduced with permission and thanks.
