This is one of a series of posts about the pending EU General Data Protection Regulation (GDPR), and its consequences for intermediaries and user speech online. In an earlier introduction and FAQ, I discussed the GDPR’s impact on both data protection law and Internet intermediary liability law. Developments culminating in the GDPR have put these two very different fields on a collision course – but they lack a common vocabulary and are in many cases animated by different goals. Laws addressing concerns in either field without consideration for the concerns of the other can do real harm to users’ rights to privacy, freedom of expression, and freedom to access information online.

In my previous two posts about the GDPR (here and here), I identified serious problems with its notice and takedown process, and resulting threats to Internet users’ rights to free expression and access to information. The legal framework of intermediary liability provides a lens for identifying these problems.  It also offers a set of ready-made tools to address them.  Lawmakers could and should take advantage of these tools to improve the GDPR.

The cleanest and simplest solution to the GDPR’s notice-and-takedown problems comes from existing law under the EU’s eCommerce Directive. That body of law could govern removal of user content by intermediaries, leaving intact the GDPR’s current provisions for deleting the back-end data that companies collect and store about user behavior. More ambitiously, GDPR drafters could craft a new and better process. European lawmakers could take either approach without undermining other important data protection goals or provisions of the Regulation.

Existing EU legal and policy resources could vastly improve the GDPR’s notice-and-takedown process

1.  Applying existing eCommerce Directive law directly to the GDPR

Existing law under the eCommerce Directive provides the most obvious and simple way to sweep away the problems created by the GDPR’s current takedown process. The GDPR could state clearly that its erasure obligations, for intermediaries processing third-party content, are subject to Articles 12-15 of the eCommerce Directive. Those articles cover enumerated activities such as hosting or caching user-generated content. Unlike the current GDPR, they protect online expression by requiring intermediaries to remove unlawful content only once they know about it, typically following notice and takedown processes. To incorporate them, GDPR drafters could add the following italicized language at the end of Article 2.3:

Intermediary service providers’ obligations under this Regulation regarding rectification, erasure, objection or restriction of information provided by a recipient of the service shall arise only when an intermediary service provider has acquired knowledge or awareness in a manner that would give rise to liability under Articles 12 to 15 of Directive 2000/31/EC. Where Member State law specifies procedures or other requirements for intermediary liability under Articles 12 to 15, those shall apply.

Internet content removal laws under Member State legislation or case law implementing the eCommerce Directive were designed for precisely the situation an Internet intermediary encounters when faced with a “Right to Be Forgotten” erasure request. Their purpose is to balance the rights of aggrieved parties seeking removal of online content or links, on the one hand, against the rights of other Internet users to share or access information, on the other. Of course, existing laws are far from perfect. The widespread over-removal documented in academic studies illustrates the need for improvement in those laws as well. But they are worlds better than the GDPR’s current provisions, and they bring to bear the right set of considerations about the rights and responsibilities of multiple parties on the Internet.

Invoking the eCommerce Directive within the GDPR is also a clean solution as a drafting matter. With a few new sentences, the GDPR could eliminate a thicket of ill-suited rules for intermediaries, without changing removal processes that work for back-end data collected by Internet companies about users. This approach, providing separate and distinct erasure rules for information provided to intermediaries by third parties, tracks the Article 29 Working Party’s 2008 recommendations for search engines. It also tracks the notice-and-takedown remedy provided by the CJEU in Costeja, which incorporates limitations that are important for indexed public content, but would not be needed in erasing stored, back-end data. As I will discuss below, this can be done without any change to the substantive privacy protections defined by the GDPR in its “Right to Be Forgotten” provisions and elsewhere.

2.  Crafting new GDPR removal processes consistent with intermediary liability principles

Of course, another option is to craft a new process that incorporates proportional protections for online expression. This would be challenging in the time remaining, but if GDPR drafters wanted expert input about notice and takedown, they would not have to look far. The European Commission has developed considerable internal expertise in precisely this area in recent years. As part of the 2012 Notice and Action Initiative, the Commission conducted a lengthy public consultation. The resulting staff working document provides a thorough and nuanced review of notice and takedown law and practice in Europe, and discusses concerns raised by stakeholders including free expression advocates. The Commission is delving into the topic again through the Digital Single Market project.

Europe also has a number of well-established civil society organizations that have thought hard about the nuts-and-bolts procedural aspects of content removal.  Article 19 has a concrete and sophisticated model for notice and takedown – which looks nothing like the GDPR.  La Quadrature du Net has also published extensive commentary and concrete recommendations for notice and takedown, and in its early responses to the Costeja case called for regulatory limitations to protect free expression.  These groups and others could provide thoughtful input.

Improving GDPR notice and takedown to protect online free expression would not harm the privacy rights protected by other parts of the GDPR

Either of these approaches – invoking the eCommerce Directive, or inventing a better removal process – could be carried out without undermining the GDPR’s other achievements for data protection and privacy.

1. Protecting users’ rights to delete data tracking their online behavior

First, improved notice and takedown rules need not have any effect on rights or processes for deleting the other kinds of personal data held by Internet companies.  Much of the GDPR is designed for this important, separate purpose – giving data subjects legal erasure rights with respect to the stored, back-end data that companies hold about their online behavior.   The GDPR’s removal process seems designed for this pure user-to-business, two-party interaction.  Applying it to the very different situation that arises when one Internet user wants to delete content posted by another is dangerous to online expression, for the reasons I set out in my second post.  But using this single set of rules for both situations is a drafting choice, not a necessity. Drafters could invoke eCommerce law or other improved provisions for content notice and takedown without changing provisions for back-end data erasure at all.

2. Protecting the “Right to Be Forgotten”

Second, content removal process issues can be separated from the substantive scope of the “Right to Be Forgotten.” European lawmakers could decide that this right is very broad, and that most user erasure requests should be granted; or they could decide the opposite. That decision should not affect, or be affected by, the procedural rules for implementing an erasure request. Well-crafted processes remain important to protect whatever content does fall outside of the “Right to Be Forgotten,” and to prevent it from being unfairly targeted and removed from the Internet.

Procedural protections are especially important because the rights and remedies created by the GDPR will be around for a long time, and will affect a broad and evolving Internet ecosystem – not just the large and well-resourced companies that appear in current headlines. Some of those companies, including Google, allocate considerable resources in an effort to avoid over-removal of content under intermediary liability law. Processing requests carefully and rejecting the ones they believe are legally unfounded is, in my opinion, an important service to users. But it is not behavior that should be taken for granted in crafting laws of general application. The law should not incorporate any assumption that all intermediaries will put effort into avoiding over-removal, or even that those doing so now will continue to do so.

3. Companies’ voluntary removals of lawful content

Finally, processes for content removal under the law can, and in this case should, be considered separately from processes companies use for discretionary content removal under their own community guidelines or policies. The two kinds of removal pose important and related questions – about rights, procedure, and transparency in particular. Comparison may be fruitful in other contexts. But only one kind of removal, the one compelled by law, is being decided in the next six weeks under the GDPR. The tools to improve protections for lawful online expression are readily available, drawing on existing intermediary liability law and models put forth by civil society groups. Lawmakers should use them.

Conclusion

Public discussion of the GDPR has understandably been dominated by topics more traditionally associated with data protection, such as the data transfer provisions thrown into the spotlight by the recent Schrems case.  There has been very little public discussion of the Regulation’s notice and takedown provisions.   But principles for notice and takedown have been extensively discussed, debated, and passed into law in the field of intermediary liability.  By invoking the protections European laws create in that context, lawmakers can fix these serious problems with the GDPR while still achieving its data protection goals.

Daphne Keller is Director of Intermediary Liability at The Center for Internet and Society at Stanford Law School.

Disclosure: Daphne Keller previously worked on “Right to Be Forgotten” issues as Associate General Counsel at Google. The version of this post available through Inforrm contains proposed language different from that which appeared in a footnote to the original Stanford CIS post.

Request: Comments and feedback on this analysis are very welcome, here or on Twitter @daphnehk.

A version of this post was originally published on the Stanford Center for Internet and Society blog and the Internet Policy Review News & Comments.