In a recent article Damian Tambini suggested that it may at some point in the future be necessary to ‘break up’ Facebook. Following further discussion during the House of Lords Inquiry on Internet Regulation he expands on this theme.
I suggested that a first stage could be behavioural rules designed to separate out different functions within the company. Such rules are one way in which specific social objectives are baked into competition law in order to deal with the potential negative consequences of market dominance. An analogy is the public interest test for media mergers.
In the Enterprise Act (2002, s58) there is a specific public interest test which is applied in the case of media mergers. The test gives ministers powers to attach conditions to mergers. These are the powers which underlie, for example, the undertakings being offered by the companies bidding for control of Sky. Ofcom and the Competition and Markets Authority have both agreed that there is a specific public interest in the continuation of the quality independent news provided by Sky News. The ‘suitors’ wishing to purchase Sky have therefore offered commitments first to continue funding Sky News, and second to maintain its independence, for example by separating the management of Sky News from the rest of the company.
These public interest rules relate to a specific historical period in which news, and particularly broadcast news, has played a hugely important role in society. They were developed over a long period of time, and apply also to mergers involving newspapers, for the same reason. These rules, along with self-regulatory ethical codes and ethical practices such as the separation of editorial and advertising, form an important part of a regulatory system for the news media that protects against concentrations of unaccountable power and propaganda.
The rise of platform power raises a number of additional social objectives which are reflected neither in the legislation nor in the guidance offered by regulators. These include foreign interference, hate speech, misinformation, election offences, and the use and abuse of personal data for targeting purposes. Facebook and other dominant platforms are developing ethical principles and practices – analogous to the separation of advertising and editorial – to regulate their own services by designing in ethics, but they may need help from civil society to work through this wide range of issues. One way the Inquiry could encourage platforms to work harder on developing their ethical practices would be to examine what forms of internal structural separation might ensure a more ethical Facebook. Ultimately, these could be written into a public interest test, but the first stage would be self-regulation.
Separate advertising and editorial?
One proposal, for example, could be to separate advertising from what we might call the ‘editorial’ functions (curation of the newsfeed and the relevance algorithm). Many of the problems associated with Facebook in recent years relate to this function within the company: for example opaque targeting, proto-censorship, and advertising-funded propaganda and hate. Might it be possible for Facebook to operate a strict internal separation between advertising and editorial, as has been the case in powerful news media for approximately a century? This could be done on a voluntary basis, but some co-regulatory oversight by a regulator would help.
There are many precedents for separating editorial functions from advertising, and not only in newspapers. UK public broadcaster Channel Four was originally prevented from selling its own ads, to protect the public service nature of its output from commercial imperatives. If Facebook were able to separate ad sales from the newsfeed, it would be freer to develop its own ethical algorithm in ways that benefit society, and not only its shareholders.
A new institution
This negotiation, to be effective, requires a new institution: if Facebook (or other dominant platforms) do not develop their own ethical separations of functions, in the way that newspapers and others have developed their own approaches to these fundamental ethical questions, then some of this might have to be enforced by a regulator. We are currently having a societal debate about the ethics and responsibilities of platforms like Facebook, but the discussion is fragmented: there is no ‘credible threat’ of regulation or breakup that will ensure that such companies deliver on their responsibilities.
During my evidence I was critical of the government’s Digital Charter process because it lacks transparency, independence, and public involvement. Historically, through for example Royal Commissions on the press and broadcasting policy commissions, there has been a convention that major matters of policy affecting media freedom and autonomy are dealt with through commissions that are independent of government, with transparent terms of reference and a clear process. The Digital Charter is driven by a safety agenda, which is hugely important but only part of the regulatory challenge of platforms. The process, which is run by the government, is neither transparent nor consultative. There is no guarantee it will be continued by any future government. (The Opposition, for example, has plans for its own Charter.)
We need a new institution that is capable of articulating to dominant platforms the broad range of societal interests in their operation, developing sensible ways in which those interests can be addressed, and monitoring the extent to which principles are adhered to. This should not in the first instance be a ‘regulator’, and it would not license social networks, but it should set out objectives of regulation which could be implemented by platforms and report on their implementation. It should be able to monitor transparency reports and audit self-regulation. If separation between advertising and editorial on platforms is agreed as an objective, and works on a voluntary basis, breaking up the company through law (which would likely require the involvement of the EU or other countries) would not be necessary.
This institution must be independent of the government of the day and must be able to advise Parliament on whether it is necessary to use the full range of tools in fiscal, competition, and other forms of regulation to encourage more public-interest oriented behaviour on the part of platforms. If Facebook intends to maintain its dominant position in the social media market, it has a responsibility to work with Parliament and civil society to ensure it serves all our interests.
This post originally appeared on the LSE Media Policy Project Blog and is reproduced with permission and thanks.