The UK is ‘clearly a target for Russia’s disinformation campaigns,’ according to a new report. Protecting our democratic discourse from a hostile state is the role of the intelligence agencies. Integral to that process are the social media platforms, which are private actors. What role should platforms have in a national security context? LSE Visiting Fellow Monica Horten discusses the findings of the Russia report from the UK Parliament’s Intelligence and Security Committee, released on 21 July, which exposes some of the issues.
The Russia report* confirms that the UK is a target for online political interference by the Russian State (para 31), but it exposes a gaping hole in the ability of the UK authorities to tackle the problem. It paints a worrying picture of the intelligence agencies abrogating their responsibility to protect the discourse and processes of the UK against the activities of foreign powers. Despite the known interference on social media, including with the 2016 referendum, there seems to be little understanding of what happened or what to do about it.
After a delay of many months, the Russia report was published on 21 July by the UK Parliament’s Intelligence and Security Committee, which has statutory responsibility for oversight of the UK Intelligence Community. The release follows a petition to get it into the public domain.
It describes Russia as a highly capable cyber actor (para 13)*, a hostile State (para 33) that is targeting the UK with campaigns to undermine our democratic discourse, either by promoting its own agendas or simply by sowing confusion. It is covertly using online methods, including on social media platforms, to spread false, distracting and distorting narratives (para 31). Specific tactics include bots and trolls, and the use of State-owned international media (para 28) that typically generate high levels of influence, with social media posts attaining a high reach.
A point of concern raised in the report is that the UK’s State security services claim that they are not responsible (para 33) for tackling this hostile State interference.
The intelligence agencies suggest that government responsibility lies not with them but with DCMS – the Department for Digital, Culture, Media and Sport. DCMS says it is only responsible for policy regarding the use of disinformation, not for state security to protect the public against hostile attacks. There seems to be total confusion in government about who is responsible for cyber-policy overall. The report describes ‘an unnecessarily complicated wiring diagram of responsibilities’ (para 18).
The intelligence agencies conducted no threat assessments regarding the interference via social media, either before or after the 2016 referendum. The report suggests that this was a failure to protect our democracy and that ‘it is important to establish whether a hostile state took deliberate action with the aim of influencing a democratic process, irrespective of whether it was successful or not’ (para 39).
The lack of a threat assessment is particularly shocking given the evidence that is not only in the public domain, but held by the UK Parliament’s DCMS Select Committee. This evidence (I have sifted through quite a bit of it**) reveals not only the bot and troll activity, but also how a range of covert online techniques were used to influence the 2016 referendum.
As the report rightly says, the intelligence agencies are responsible for safeguarding the democratic processes (paras 31, 33, 34) against interference from a hostile foreign State and from ‘actions intended to undermine our democracy’ (paras 34, 66). In that context, they are responsible for protecting democratic discourse. In the 21st century, the main venue for democratic discourse is provided by the social media platforms.
The Russia report suggests that the intelligence agencies could have acted in this regard. They could have “stood on the shoulders” (para 46) of this evidence to find the owners of suspicious social media accounts and disrupted malicious activity.
The report then slips in a paragraph about social media platforms (para 35). It says they “hold the key” and are “failing to play their part.” It calls on the government to “establish a protocol to ensure they take covert hostile use of their platforms seriously and have clear timescales within which they commit to removing such material”.
This reflects the UK government’s policy on social media (well-intentioned but in need of more work, as outlined by Graham Smith on his Cyberleagle blog). However, the issue at stake here is national security. As the Russia report has established, this is the role of the intelligence agencies. Asking social media platforms to take on a national security role would seem problematic.
Social media platforms are private actors. The largest players, such as Facebook, Google, Twitter and Microsoft, are global corporations. They have no public accountability, and there is no regulatory oversight.
They do not police their platforms according to the law, but according to their own internal policies***. Disinformation poses a particular problem because it is usually context-sensitive, and it is not always obvious what does and does not fall into the category to be taken down. Figuring that out requires a complex understanding of the political, social and cultural context.
How, therefore, should the hostile use of platforms be defined? A platform needs a definition to know what to seek out and take down. How should a platform know when something is intentionally false, creating a distorting or distracting narrative, with the aim of either influencing UK politics directly or sowing discord and division in this country?
Social media platforms are likely to make decisions about content take-downs according to criteria that include corporate risk factors, which will be assessed across many countries. We cannot expect they will take decisions according to (what to them will be) narrow UK criteria.
The mechanisms put in place by the platforms to identify what they call ‘false news’ are inappropriate for this task. They rely on a pedantic fact-checking exercise, whereas hostile States seeking to disrupt our democracy use highly sophisticated techniques that would fall straight through that net – for example, dropping pieces of information into a text that otherwise appears legitimate.
Content moderation is not the solution either. To a content moderator on the other side of the world, with no contextual knowledge of the UK political situation, a post or a meme may not seem to violate the platform policies, yet it may be deeply destructive from a UK perspective. By contrast, innocent, lawful memes and posts do get caught up in trawls by the platforms’ automated systems.
Social media platforms are the vehicle, but they are not the perpetrator that is creating false narratives. They hold data that could help in uncovering the perpetrators, but they are not law enforcers. They have no mandate to address issues of UK national security and neither should they.
The question that should be asked is about the kind of co-operation that could help the intelligence services protect democratic discourse. However, this also raises the tricky issue of how far the state can demand, for example, to see data without being overly intrusive into individual rights. There are already concerns about the bulk powers given to the intelligence services under the Investigatory Powers Act.
It is not going to be easy to find the right balance. A threat assessment and enquiry into the 2016 referendum would seem to be a necessary first step. Aligning responsibility in government would be helpful too. There will be a need to call on social media platforms to assist. However, outsourcing to a private actor would be an abrogation of duty.
As the Russia report establishes, it is the duty of the intelligence agencies to maintain national security and to safeguard our democratic processes. That is surely the case now, as it ever was, back in the days of James Bond.
*Paragraph numbers refer to the Russia report: Intelligence and Security Committee of Parliament: Russia (HC 632) Presented to Parliament pursuant to section 3 of the Justice and Security Act 2013. Ordered by the House of Commons to be printed on 21 July 2020.
**I presented on micro-targeting in the 2016 referendum at a seminar hosted by MEP Alexandra Geese in the European Parliament on 12 November 2019 (European Parliament Micro-targeting & Profiling Event, November 2019).
***I’m just completing a preliminary study looking into Facebook’s enforcement policies and their impact on a small fleet of pages.
This post was originally published on Monica Horten’s Iptegrity blog and on the LSE Media Policy Project blog, and is reposted with thanks.