The International Forum for Responsible Media Blog

Regulating online platforms for misinformation and disinformation – Mark Bunting

How to deal with misinformation is a topic of significant debate in the UK, and a focus of the LSE Commission on Truth, Trust and Technology, which will launch its report in November. The Commons Digital, Culture, Media and Sport Select Committee has released its interim report from its inquiry into fake news, and is likely to publish its final report in the coming months.

A White Paper outlining the Government’s internet safety strategy is expected this winter. Communications regulator Ofcom recently released a discussion document that looks at how lessons from its regulation of content standards for broadcast and on-demand video could provide insights to policy makers addressing harmful online content.

It’s often argued that ‘fake news’ is an impossible problem to regulate. Differentiating ‘fake’ from ‘true’ has kept philosophers busy for centuries; even an agreed definition of ‘news’ is elusive in today’s information-saturated, open-to-all social media environments. As ‘fake news’ shades into ‘opinions we don’t agree with’ or simply ‘facts we don’t like’, many would be wary of giving any regulator the power to determine which perspectives are permissible, and which aren’t.

The irony, though, is that the big platforms are already deeply implicated in a kind of ‘regulation’ of news. Facebook works with fact-checkers to assess disputed stories – and now photos and videos – which then show up less frequently in the News Feed. Its tools rank news sources for ‘trustworthiness’, based on ‘user signals’, and promote or suppress their content accordingly. Google News generally – but not always – prioritises coverage from mainstream news providers. And the policies and strategies of these giants and others (Apple, Snap, Twitter) make a significant difference to news providers’ economic opportunities.

Of course these are private firms, not government agencies. There’s no evidence, so far as I’m aware, that they leverage their power to influence news providers’ editorial coverage. While there is hot debate about whether their content ranking tools are biased in favour of certain outlets, it seems more likely that any apparent bias arises unintentionally from algorithmic content selection. Moreover, online platforms do not control access to the news market – most people get news from a range of sources, and few news providers are wholly dependent on platforms for audiences and revenue.

But with the internet now the main source of news in many developed markets, the ability of the biggest platforms to shape both the demand and supply sides of the news market is a legitimate matter of public interest. Platforms have made clear that they intend to use their power, in part in response to intense media, political and public pressure to do so. As Facebook founder Mark Zuckerberg recently put it:

“When you build services that connect billions of people across countries and cultures, you’re going to see all of the good that humanity can do, and you’re also going to see people try to abuse those services in every way possible. Our responsibility at Facebook is to amplify the good and mitigate the bad” [my emphasis].

Policy debate in the UK is increasingly focused on how, not whether, to regulate platforms that host content. The Government’s response to its Internet Safety Strategy Green Paper introduced plans for a statutory social media code of practice and transparency reporting. The Digital, Culture, Media and Sport Select Committee’s report on ‘fake news’ called on Government to define ‘misinformation and disinformation’ in legislation, and require platforms “to take responsibility for the way in which their platforms are used” – including legal liability to act against harmful and illegal content. The Cairncross Review is assessing the impact of the digital advertising market on the sustainability of high-quality journalism, and whether it incentivises the proliferation of inaccurate or misleading news.

Now Ofcom has published a discussion paper, drawing on its experience of broadcast regulation, that considers the application of content standards to online platforms. It stresses that existing frameworks could not be transferred wholesale to the online world, and that audience expectations differ between broadcasting and online – and, one might add, between different online environments. With respect to news, it doubts that standards for impartiality and accuracy could or should be imposed online. Instead it suggests a new set of regulatory principles for online platforms: protection and assurance; upholding freedom of expression; adaptability; transparency; enforcement and independence.

Behind these apparently uncontroversial sentiments lies an important shift. Ofcom is outlining an approach to regulation that relates to content, and that applies to companies providing access to content – but it is not regulation of content. Ofcom is not suggesting that it, or any other statutory regulator, should exercise control over what content is permissible online.

Instead Ofcom says that protecting people from online harms requires Parliament to set clear statutory objectives regarding content, reflecting societal norms; and regulation of platforms to ensure they adopt practices or procedures designed to secure these objectives. This is regulation of platforms as intermediaries – with the power to shape content environments, and to prioritise some content over other content – not as publishers. In particular, Ofcom identifies a need to focus on the processes that platforms employ to identify, assess and address harmful content, and their handling of subsequent complaints and appeals.

Implicit in its analysis is a view that where platforms have power – and this may only apply to the most significant platforms – this should be constrained. But this must be done by requiring them to observe principles of good governance in making content-related policy – not by specifying what content they may or must not carry.

In a recent article for the Journal of Cyber Policy, and a related paper published by Communications Chambers, I have described this as ‘procedural accountability’. ‘Procedural’ content regulation defines standards for the processes platforms use to make, implement and enforce their content policies. The goal is to make platforms better ‘regulators’ – more transparent, evidence-based, accessible and proportionate – and to test, validate and improve their efforts to address complex content problems. As a result, the risks of online communication will be better managed, users will have greater confidence in the safety of online content, and we will all understand more about platforms’ policies and impacts.

Bringing this back to ‘fake news’, the critical question is not ‘what is true?’ vs ‘what is fake?’, but ‘who controls the conditions in which truth is determined?’ Increasingly, the answer is the big online platforms. Consequently, the job of whoever ends up regulating ‘fake news’ is not to decide the truth – but to ensure that platforms’ power to do so is used cautiously and accountably.

Mark Bunting is a member of Communications Chambers.

This post originally appeared on the LSE Media Policy Project Blog and is reproduced with permission and thanks.

2 Comments

  1. truthaholics

    Reblogged this on | truthaholics and commented:
    “Bringing this back to ‘fake news’, the critical question is not ‘what is true?’ vs ‘what is fake?’, but ‘who controls the conditions in which truth is determined?’ Increasingly, the answer is the big online platforms. Consequently, the job of whoever ends up regulating ‘fake news’ is not to decide the truth – but to ensure that platforms’ power to do so is used cautiously and accountably.”

  2. Richard Gerrard

Interesting. There are a few other takes on Fake News here, including expert opinion on implications for geopolitics, international law, journalism, PR, celebrities, politicians and the military: https://www.carter-ruck.com/insights
