
Final Draft of Europe’s “Right to be Forgotten” Law – Daphne Keller

The probably-really-almost-totally final 2016 General Data Protection Regulation (GDPR) is here!  Lawyers around the world have been hunkered down, analyzing its 200-plus pages. In the “Right to Be Forgotten” (RTBF) provisions, not much has changed from prior drafts. The law still sets out a notice and takedown process that strongly encourages Internet intermediaries to delete challenged content, even if the challenge is legally groundless.

The problems I identified in earlier drafts could have been avoided with simple changes – putting procedural checks on invalid erasure requests, while giving effect to valid ones.  Those changes would not have diminished any gains for online privacy rights under the GDPR, or affected Internet users’ ability to delete data collected by companies and held in back-end logs, accounts, or profiling systems.  The opportunity to make those targeted changes has now passed.

The silver lining is that the final GDPR text is riddled with ambiguous passages on key points.  Happy holidays, data protection lawyers – it’s the gift of lifetime employment!  These ambiguities will move debates that should have been resolved in the lawmaking process into a new phase, centered on advocacy before regulators and courts.  For RTBF issues, there are enough important ambiguities to keep the public discussion going for a long time.  The right interpretations can help lawmakers protect online privacy and data protection rights without doing unnecessary and disproportionate harm to free expression and information access.

In overview, here is how the notice and takedown provisions landed in the final draft.   For the quick-and-dirty version, just read the underlined parts.

o   Intermediaries must immediately take content offline when they receive an erasure request, and keep it offline until they figure out whether the request is legally valid.  (Art. 17a.1.a and c)  For example, if someone claims that online information is inaccurate, the intermediary must “restrict” the information by taking it offline “for a period enabling the controller to verify the accuracy of the data.” (Art. 17a.1.a)  That kind of factual investigation of user-generated content is not something intermediaries are likely to attempt.  Information restricted under this remove-then-verify standard is unlikely to ever be reinstated. There are powerful policy arguments, as well as legal arguments based on fundamental rights, against this presumption of guilt for online speakers.  Those arguments should prevail.  But Article 17a won’t make it easy.

What changed: Prior language on “restriction” was confusing and scattered.  The final text in 17a cleans it up in ways that eliminate arguable exceptions or defenses from older drafts, but those were never strong anyway.

o   Intermediaries must decide what to erase without any meaningful guidance about how to weigh the rights at issue.  Hopefully regulators will consult with stakeholders, including civil society groups and experts on free expression and information, to develop concrete guidelines — like the very thoughtful ones they previously published for search engines.

o   The person whose online expression is erased is not notified or given an opportunity to defend it.  The GDPR does not spell this out, but leaves intact the language that regulators have already interpreted to preclude notice in the search engine context.  Under that interpretation, however, intermediaries can sometimes consult with the publisher before erasing the content.

o   The intermediary seemingly must disclose the online speaker’s identity to the person requesting erasure.   I think DPAs or courts will find ways to interpret this part out of existence.  It’s seriously inconsistent with the GDPR’s pro-privacy goals.  Applying it to RTBF content removals was probably a drafting mistake.  The GDPR language on this is fuzzy enough that smart data protection lawyers should find a way around it, if they want to.  It says that the “data subject shall have the right to obtain from the controller” certain information, including “where the personal data are not collected from the data subject, any available information as to their source.”  Art. 15.1(g).  There is similar language at Article 14a(2)(g).

o   There is no specified form for requesting erasure or required information that data subjects must include to explain why erasure is justified.  This, too, can and hopefully will be remedied in guidance from regulators.

o   Most removals are to be processed within a one-month turnaround time.  (Art. 12.2)  That seems reasonable in most cases.  For rapidly growing companies that don’t yet have legal teams dedicated to such things, however, it may lead to rushed and inaccurate removal decisions.

This post originally appeared on The Center for Internet and Society Blog and is reproduced with permission and thanks.
