
Facebook’s Frankenstein’s Monster: freedom of expression and the problem with fake news and violent and sexual content – Peter Coe

Nick Hopkins’ recent Guardian article on Facebook’s policies on violent and sexual content has, once again, brought the right to freedom of expression and the role of social media platforms under scrutiny.

Facebook’s latest statistics reveal that the platform has 1.28 billion daily active users. Other social media sites also continue to grow rapidly. For example, Twitter states that it currently has 313 million monthly active users. In April 2017 Instagram announced that it had reached 700 million users, in February 2016 WhatsApp reached 1 billion users, and LinkedIn now has over 500 million members.

These huge numbers mean that the ability of social media platforms to manage what their users post on their sites remains largely uncharted, and very much a matter of trial and error. This is reflected in Hopkins’ article, which reports that Facebook ‘executives are struggling to react to new challenges such as “revenge porn”’ and that ‘moderators say they are overwhelmed by the volume of work, which means they have just 10 seconds to make a decision’ on whether a communication should be removed.

For some time, platforms such as Facebook have been under pressure to identify the authors of, for instance, defamatory material, revenge porn, cyber-bullying, hate speech and communications inciting terrorism, so as to enable successful civil actions and/or criminal prosecutions (for example, under section 3 of the Terrorism Act 2006, the platforms themselves may be subject to criminal proceedings if they fail to take down unlawful ‘terrorism-related’ material upon receiving appropriate notice). This issue has been animated by the recent ‘fake news’ phenomenon, and Facebook, in particular, has been the subject of strong criticism in the wake of the US election.

With the traditional media (the press, television and radio broadcasting, and book and journal publishing), the danger posed by contributors is mitigated by the presence of an active intermediary. In these contexts the journalist, editor, publisher or presenter can vouch for the integrity and reliability of their source or author. They can also check the story prior to publication or broadcast and, if need be, refer it to their legal team to prevent the publication of material that may present legal risks. However, in the online world, it is unusual for platforms to assume this responsibility, for two reasons.

Firstly, due to the volume of users, monitoring content is extremely difficult for social media platforms (indeed, in relation to Internet Service Providers, monitoring such content is arguably impossible). One issue that animates this, and that has been the subject of widespread news coverage relating to online bullying, is anonymous and pseudonymous expression, and the tension it has created between free speech principles and the real-name policies of social media platforms. Facebook’s anonymity and pseudonymity policy relies on users reporting fellow users who use pseudonyms. However, in many instances, these users are likely to have no idea that a pseudonym is being used. In any event, from a practical perspective, it is almost impossible for platforms such as Facebook to monitor and vet the millions of messages carried each week.

According to Hopkins’ article, Facebook alone reviews more than 6.5 million reports a week relating to potentially fake accounts – known as FNRP (fake, not real person). Furthermore, the real-name policy conflicts with the advice given to police officers to use a pseudonym on social media to protect their identity; many police officers do use pseudonyms for this purpose on Facebook, among other platforms. For the same reason, the General Medical Council supports the right of doctors to use social media anonymously or pseudonymously. Additionally, Facebook’s real-name policy has been held to infringe German data protection law (specifically, section 13(6) of the Telemedia Act 2007 and section 3a of the Data Protection Act 2003).

Secondly, there has been a disinclination amongst social media companies to play the role of arbiter, as this leaves them open to claims of censorship. This particular challenge, and the tension it creates, is illustrated by Facebook’s reaction to the criticism, referred to above, relating to the US election and fake news. The platform announced that it would work with a third-party fact-checking organisation whilst, at the same time, and rather contradictorily, reiterating its commitment to ‘giving people a voice’ and insisting that it ‘cannot become an arbiter of truth’, with Mark Zuckerberg stating:

‘We believe in giving people a voice, which means erring on the side of letting people share what they want whenever possible. We need to be careful not to discourage sharing of opinions or to mistakenly restrict accurate content.’

In my view, pressure to act as intermediaries that censor material may deter social media platforms, website operators and ISPs from providing their services, and/or encourage them to act as arbiters of truth. This, in turn, could have a chilling effect on free speech. The criticism levelled at social media platforms for their reluctance to act as proactive intermediaries in relation to violent and sexual communications such as revenge porn, terrorist activity and cyber-bullying, and in relation to the fake news phenomenon, together with the platforms’ consequent responses, is indicative of the challenges that emanate from the apparent conflict between their perceived roles: whether they should be required to operate as media companies or as technology companies.

Indeed, until as recently as November 2016, Mark Zuckerberg had consistently described Facebook as a ‘tech company’ rather than a ‘media company’. However, in December 2016, he finally conceded that Facebook is a ‘media company’. Until social media platforms work out for themselves what they actually are, there will be a lack of clarity, both internally and externally, as to their actual and perceived roles and responsibilities, an issue that will be reflected in their policies and guidelines. In the meantime, the Frankenstein-like monster they have created will continue to grow, as will the free speech challenges that come with it.

Peter Coe, Senior Lecturer in Law, Aston Law School, Aston University; Barrister, East Anglian Chambers; Consultant Lawyer, Addleshaw Goddard (AG Integrate).
