1 Forgetting Made Easy

In the landmark case Google v. AEPD, handed down in May 2014, citizens of the European Union were granted one version of a right to be forgotten based on portions of the 1995 European Union Data Protection Directive. Within the first thirty-one days, Google received a reported 70,000 requests to remove 250,000 search results linking individuals in the EU to information they would rather be dissociated from. That is a lot of lost information. Responding quickly to technological threats is admirable, but extreme responses that do not account for social variation and legal uncertainty can lead to internal injustices and cross-cultural struggles.

Europe has a long history of privacy regulation,1 much of which is relevant to the right to be forgotten. A number of unifying documents were passed in the twentieth century and continue to be refined and interpreted through two main branches: the Council of Europe and the European Union.2 The Council of Europe’s European Convention on Human Rights (ECHR) is over sixty years old and conveys the “right to respect for private and family life” in article 8. The Council’s Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (Convention 108) proactively addressed computational data processing in 1981. Since 2000, article 7 of the Charter of Fundamental Rights of the European Union has protected “respect for private and family life.” Additionally, the Charter explicitly provides for the protection of personal data in the next article (8). Arguably the most influential privacy regulation ever conceived, the EU Data Protection Directive (DP Directive) was passed in 1995 and is the core of the European data-protection framework. It directed all EU member countries to create data-protection agencies and establish means of protecting specified data rights granted to citizens. It has significantly influenced most, but not all, data-protection laws around the world. The two organizations are cooperative partners, engaging in joint programs and initiatives, and play complementary roles. All twenty-eight EU member states are among the forty-seven members of the Council of Europe, and so privacy regulations and rights must account for article 10 of the ECHR, the right to “receive and impart information and ideas,” and its corresponding right in article 11 of the Charter. Most importantly, in comparing European and U.S. legal cultures, each right must be accorded “equal respect” as courts seek to strike a “fair balance” when presented with controversies involving multiple rights, interests, and circumstances, as directed by the European Court of Human Rights (ECtHR).3 These and other agreements have had important impacts on the protection of privacy in European countries. Each European country briefly surveyed in the following pages has a different relationship to, role within, and adapted legal structure for these regional intergovernmental organizations, laws, and rights. I have chosen to limit discussion of these interplays for the sake of brevity and to focus on the way in which the national systems have developed and deployed protection of information privacy and redemptive concepts.

Before looking at individual European countries, an important difference between European and American information policy to keep in mind is the default for sharing. The default in the U.S. generally permits the collection and transfer of personal information and prevents abuse through self-regulation and market forces. National systems within Europe operate under comprehensive regimes that protect personal information across both the public and private sectors and are enforced by specialized data-protection agencies. The structure, resources, regulatory tools, and enforcement tactics of data-protection authorities, as well as their place within the national regulatory system, vary across nations, but the default is generally not open sharing. For instance, cross-marketing efforts in the 1990s between financial services and insurance companies in Germany required each customer to consent to the transfer.4 Many countries restrict the transfer of publicly available data and protect information in public registries, such as voting records, real estate transactions, and judicial decisions. Because this approach applies to both the public and private sectors, sharing between the two types of institutions is also restricted. Another example is France’s injunctive relief for violations of the intimacy of private life, which has ranged from the destruction of art to the suppression of books and extends to information prior to disclosure.5 This difference in default matters greatly to the right to be forgotten and is entangled with the way in which countries have navigated their relationships to intergovernmental organizations and laws.

Many national legal systems in Europe embrace a notion of redemption by which an individual may preclude another from identifying him or her in relation to his or her past. While the initial release of personal information may be warranted, over time the justifications for continuing to make available or to disclose old data wane as privacy rights gain legitimacy. This dynamic is essentially a right to be forgotten and finds different roots, interpretations, and enforcement across European countries. It is no surprise that countries with rich personality rights like Germany, France, and Italy—those that encompass and firmly protect dignity, honor, and the right to private life—have had established and evolved forms of the right to be forgotten for decades. What is somewhat surprising is that in all three of these influential civil law countries (wherein codes and statutes intended to reach every eventuality are the main source of law), privacy rights derived at various periods from judicial developments of the kind associated with the common law (wherein the main source of law is judicial opinions that ensure consistency through adherence to precedent), a trajectory not shared by the UK.

The United Kingdom, while part of the European Union, is unique within Europe because it has a common law legal system, as opposed to the rest of the countries of Europe (with the exception of Ireland), which have civil law systems. This difference is emphasized by the United Kingdom’s lack of a single constitutional document. The country has of course protected privacy, reputation, and speech through other claims and rights but has done so under its “uncodified constitution.” Like many of the other countries in Europe, the UK has relied heavily on defamation law to protect reputation, as a right recognized within the right to respect for private and family life. The United Kingdom’s balance between the preservation of reputation and freedom of expression has shifted, particularly because the Defamation Act of 2013 made a number of changes that brought the tort closer to its U.S. counterpart.

While defamation now provides less control of individual reputation than in the past, the Human Rights Act of 1998 and the Data Protection Act of 1998 have increased the UK’s recognition and enforcement of privacy rights through tort actions. Prior to the Human Rights Act, which was passed in the UK to give full effect to the ECHR and states that it is unlawful “for a public authority to act in a way which is incompatible with a Convention right,” the UK protected privacy through an expansion of its law of confidentiality.6 Confidentiality claims center on the inappropriate sharing of personal information; historically they required the claimant to have an expectation of trust based on the nature of the relationship and the information exchanged, but the action has since been expanded to reach incidents where no prior relationship exists.7

A series of relatively recent opinions from English courts, most involving claims filed by celebrities like Michael Douglas and Naomi Campbell trying to prevent or be compensated for the publication of their personal information, reveal the doctrinal shift prompted by the need to fit national legal protections within the ECHR.8 By the time Max Mosley sued over the publication of video and pictures taken of him engaging in sexual acts under uncommon circumstances, the groundwork had been laid for his victory.9 Mosley’s claim that he had an expectation of privacy in information regarding his sexual activity was easily accepted.10 Additionally, the court rejected the publisher’s argument that the criminal element (prostitution) necessarily meant the information was of public interest.11 Finally, the allegation of a Nazi theme, which according to the court might have been of public interest, was dropped for lack of evidence.12 The court balanced article 8 (right to respect for one’s private and family life) and article 10 (freedom of expression and information), diminishing somewhat the unique distinction that England possessed in regard to protecting privacy interests.13

The UK’s data-protection regime, in place since 1984, has been similarly altered to fit within broader European goals. Senior High Court Justice Tugendhat explained in 1999, “It is vital to note that the 1984 Act created, and the 1998 Act continues, concepts entirely new to English law. These rights do not depend on whether the data subject . . . would have rights under the existing law of confidentiality or defamation or any other tort or statute.”14 The implementation of the DP Directive required certain updates to the Data Protection Act of 1984, which were put in place by the Data Protection Act of 1998. Enforced by the UK’s data-protection agency, the Information Commissioner’s Office (ICO), the privacy regime is overseen by the Information Tribunal. While British jurisprudence experienced a bit of a jolt and data protection was restructured slightly, the UK has not embraced the right to be forgotten. The UK’s House of Lords EU Home Affairs, Health and Education Sub-Committee has fought against the incorporation of a right to be forgotten into EU law, calling it “misguided in principle and unworkable in practice.”15

Postwar Germans have embraced the protection of personal information more directly.16 Although reluctant to recognize expansive personality rights in the nineteenth and early twentieth centuries, German courts actively and consistently emphasized individual human dignity as the guiding principle of the Basic Law, which was drafted and instituted in response to World War II and serves as constitutional law in Germany.17 Additional judicial steps were taken in Schacht (1954) when a newspaper published portions of a letter to the editor that created misleading impressions about the attorney who had written it on behalf of his client. The Federal Supreme Court found that the published version was a gross misrepresentation and had not been consented to; it ordered the paper to print the full statement.18 In 1973, when Princess Soraya, ex-wife of the shah of Iran and a German resident, sued a weekly paper for publishing a wholly fabricated interview with her, the Federal Constitutional Court upheld prior court decisions awarding damages for defamation and privacy invasions. Previously, in the prewar era, such cases were heard only by criminal courts and punished with a fine.19 The court found that the Basic Law protected “above all . . . a person’s private sphere, i.e. the sphere in which he desires to . . . remain free from any outside interference,”20 and in a controversial exercise of power, awarded substantial damages beyond the civil code for the violation of human dignity.

The German right to be forgotten is a personality right situated in the principle of informational self-determination; it incorporates the right to decide for oneself how one will be portrayed to third parties or the public, as well as what and to what extent personal information is made available and to whom. This principle is an extension of other rights, namely, articles 1 (“right to human dignity”) and 2 (“right to free development of one’s personality”) of the German constitution. A new, extensive national census in 1983 provoked public indignation and resistance that led to one of the most important early European data-protection cases. Suspending the census, the German Constitutional Court (BVerfG) interpreted the rights in light of technological innovation: “The worth and dignity of individuals, who through free self-determination function as members of a free society, lie at the core of the constitutional order. In addition to specific guarantees of freedom, the general right of personality guaranteed in Article 2.1 in conjunction with Article 1.1 of the Basic Law, which can also become important precisely in view of modern developments and the concomitant new threats to the personality, serves to protect that worth and dignity.”21 This 1983 ruling also mentions the obligations to provide details and clarification on data collected and stored, as well as to delete data.22 Most importantly, the ruling is foundational to the principle that “every use of personal data intrudes upon personal freedom and therefore requires legal justification.”23

Although Germany’s twentieth-century history differs markedly from that of other European countries, it, like all countries, must draft or interpret its own laws and jurisprudence in light of unifying European documents like the European Convention on Human Rights—a feat of legal reasoning handled by ECtHR judges who hear allegations of ECHR violations by Council of Europe member states. One of Germany’s, and Europe’s, most important privacy cases involved a 1999 German BVerfG ruling against Caroline von Hannover (princess of Monaco), who had sought relief for the magazine publication of pictures depicting her going about her daily life (leaving restaurants or walking on the sidewalk).24 The BVerfG reasoned that she was a public figure and had to tolerate publication of images taken in public so as not to infringe on the freedom of the press and the public’s legitimate interest in her.25 However, the ECtHR overturned the decision, arguing that the German court had deprived von Hannover of her right to privacy and to respect for her family life.26 When the BVerfG dismissed another of von Hannover’s claims for injunctive relief in 2008, over a further set of photos, its decision was upheld by the ECtHR because the accompanying story discussed Prince Rainier’s illness, a matter of legitimate public interest, as opposed to publication for entertainment purposes alone.27

Specifically, the German right to informational self-determination entered the conversation about the right to be forgotten when Wolfgang Werlé and Manfred Lauber started filing lawsuits against websites to have their names disconnected from the murder of the actor Walter Sedlmayr, of which they were convicted in 1993. The conviction is a matter of public record, and because of the actor’s fame, numerous news stories were drafted and disseminated about the crime. The two successfully convinced some sites to remove their names and sued a number of other media outlets, including the German-language Wikipedia.28 Lower courts were split on whether online archives were a current dissemination of a story, which would have violated Germany’s interpretation of its right to the development of one’s personality as it had been extended to other individuals who had served their time.29 But when the cases reached the Federal Court of Justice of Germany (BGH) in 2009, it found against Werlé and Lauber by focusing on whether the information was clearly marked as old and outdated, in which case it could be retained.30 This case, discussed later in chapter 4, takes into account not only the public interest but also the user experience with the information and the expressive interests of the data controller; it represents a refreshingly nuanced approach to privacy.

Franz Werro has investigated the “transatlantic clash” of the Swiss right to be forgotten and the right to inform in the U.S.31 Switzerland, a member of the Council of Europe but not part of the European Union, holds a strong right to be forgotten based on the country’s recognition of a general right to personality, which is similar to the right in Germany and has been interpreted repeatedly to protect criminally prosecuted individuals from being associated with the proceedings once the individual is considered rehabilitated (usually after the judicial proceedings or time served).32 Although the Swiss civil code and federal law recognize the primacy of the ECHR, national decisions on the right remain the dominant source on the subject because the right to be forgotten has yet to be heard explicitly by the ECtHR.33 Under Swiss law, publishing the name of an individual in relation to his or her criminal past is acceptable only if the conviction remains newsworthy, and even then it is not permissible if significant rehabilitation efforts have been made. For instance, the Swiss Federal Court has prevented the publication of reports on white-collar crimes34 and an armed bank robbery,35 as well as the broadcast of a 1980 documentary on a man sentenced to death in 1939, in an action brought by his son.36

French privacy rights are, like others, one in a bundle of personality rights that also include moral rights of creators for the purposes of copyright, the right to control the use of one’s image, and the right to protect one’s honor and reputation. Although France, too, is a civil law system, its privacy rights developed in a “remarkably uncivil” way.37 Without legislation on the books, French judges essentially created a right to oppose the publication of private facts through common law methods based on tort principles and expanded it into recognition of a substantive right to privacy in the 1950s and 1960s. A right to be forgotten of sorts developed through extensive case law on various aspects of French personality rights and modern data-protection reform. In what is known as the Rachel affair of 1858, a famous actress was photographed on her deathbed at the request of her sister with the express agreement that the photographs would not be shared with third parties, but a sketch of the deathbed scene appeared for sale in local stores shortly after.38 Rachel’s sister successfully sued to have the sketches seized and destroyed. A number of French privacy attributes are revealed through analysis of the case. The attorney and lecturer in law Jeanne M. Hauch has explained that, prior to legislation and convergence through France’s participation in the United Nations, Council of Europe, and European Union, French courts found “liability without much discussion of the reasonableness of the defendant’s conduct,”39 focused on the suffering of the plaintiff, preferred specific rather than monetary relief, and found that the right of privacy trumps the personality rights of expression and informational interest. The Rachel decision uses words like “absolute right” and “[no matter] the celebrity of the person involved,” reflecting “a feeling that protection tempered by a requirement of negligence or recklessness is simply not stringent enough where the human personality is at issue.”40

One of the most prominent privacy cases in the country’s history illustrates the development of a unique protection of private life. In 1867, the famous Three Musketeers author Alexandre Dumas père filed a claim revolving around a set of untoward photos disseminated by the photographer. Dumas admitted that he had sold his rights in the photographs to the man he was suing for publishing them.41 The Dumas court adopted ideas regarding private life that were expressed when the first law lifting post-Napoleonic censorship of the press was passed in 1819.42 The court explained that even if a person had consented to exposure, that person must retain the right to withdraw in order to protect his or her dignity.43 Only upon dissemination of the information may the person realize its ramifications, particularly when it violates the expectations that the subject had in mind upon its release. Although Dumas “had forgotten to take care of his dignity, [exposure may] remind him that private life must be walled off in the interest of individuals, and often in the interest of good morals as well.”44

Preliminary injunctive relief that would prevent the disclosure of information to avoid privacy violations was tested and established nearly one hundred years later in the Philipe affair, after several reporters bombarded the hospital room of the nine-year-old son of the famous actor Gérard Philipe. Philipe’s widow brought an action to have the publications, set to come out in the next weekly issue, seized prior to hitting newsstands. Even though the article was supportive of the boy and most of the pictures in the article had already been published, the lower and appellate courts found the intrusion of the press intolerable, and the injunction was upheld by the highest court in 1966, further solidifying a strong privacy right that would restrict expression where necessary.45 A host of privacy claims decided through the 1950s and 1960s further refined levels of violations on the basis of the circumstances and associated relief. While the French maintained a newsworthiness component, they did not extend it to a number of things happening in public view, because the public’s legitimate interest does not extend to knowing about personal interests or activities like hobbies or relationships.

Judicial interpretations of personality rights were codified in article 9 of the French Civil Code in 1970, which explicitly guarantees individuals the right to demand respect for their private life and grants judges the power to prescribe “all measures” to avoid or remedy a violation of intimacy of private life.46 In the late 1970s, France also enacted its first data-protection laws. Chapter 5 of Law 78–17/1978 included a right of individuals to demand erasure of personal data that is no longer relevant.47 Additionally, “numerical oblivion” is a notion adopted specifically for the banking sector that provides individuals with a right to have personal information erased from databases after a reasonable period of time.48

Fast-forward to 2012: a Parisian civil court ordered Google.com and Google.fr to remove from search results all links to Diana Z and her past as a porn star.49 French President Nicolas Sarkozy declared, “Regulating the Internet to correct the excesses and abuses that come from the total absence of rules is a moral imperative!”50 France was among the first countries to enact data-protection laws, and the 1978 legislation was amended in 2004 to meet EU standards. In the amended legislation, article 38 provides a right to object to personal data processing for legitimate reasons, and article 40 provides a procedural right to rectify, complete, or erase “incomplete” data.51 The French data-protection agency (CNIL) pioneered recognition of a right to be forgotten for a digital world. In 2010, France gave the concept of le droit à l’oubli (right of oblivion) momentum. The campaign, led by Nathalie Kosciusko-Morizet, the French secretary of state in charge of the digital economy, drafted codes of conduct, one for behavioral advertising and one for social networks and search engines, to be signed by industry members.52 The charter was signed by a number of actors, but Google and Facebook did not sign. France has remained relatively isolated from industry pressure not only because of its fierce cultural and legal protection of individuals but also because many of the information sectors were nationalized in France during the mid-twentieth century.

The Italian legal culture has embraced as part of its personality rights what is called the right to personal identity, an individual’s right “to appear and to be represented in social life (especially by the mass media) in a way that fits with, or at least does not falsify or distort, his or her personal history.”53 The Italian legal scholar Giorgio Pino explains that the Italian Civil Code of 1942 explicitly recognizes only the right to one’s name and pseudonym, likeness, and moral copyright and that the Italian constitution is less explicit in its recognition of a right to the development of the person than are other national constitutions such as those of Germany and Spain. Nonetheless, Italian judges, not unlike French and German judges before them, interpreted these rights to craft a right to personal identity using common law techniques in a civil law system.54 For instance, in Pangrazi and Silvetti v. Comitato Referendum (1975), a photograph of an unmarried farming couple was used without their consent on posters supporting a traditionalist, antidivorce movement to which they were opposed.55 The judge expanded on the plaintiffs’ claim that the poster violated their likeness rights by adding that the image violated the individuals’ right to have their personal identity protected from misrepresentation by widespread publication. Pino further explains that judicial developments have given the Italian right to personality distinct features: misrepresentation of the person must be made public to be actionable; it does not protect privately held ideas or thoughts; and the right is flexible, covering various contexts such as political, sexual, and professional identities. As an example in the right-to-oblivion context, a convicted murderer who had later been pardoned successfully challenged a newspaper trivia game that featured his crime thirty years on. The Tribunale Roma ruled that the paper “shed ‘false light in the public eye’ about the actual personality of the subject” and had not proved justifiable public interest.56

The Italian courts and data-protection agencies have focused on time and data quality to extend protection to citizens in a related way. In 1984, the Italian Court of Cassation established a balancing test for old information when 1972–1973 newspaper reports concerning the fund of two real estate companies were claimed to be continually damaging a decade later.57 The court used three criteria to weigh article 2043 of the Italian Civil Code, providing for recourse when loss is suffered, in light of article 21 of the Italian constitution, providing for freedom of the press (codified in the Italian Press Act): (1) the information must be of social or public interest, (2) the coverage must be correct or at least lead to serious investigation toward discovering truth, and (3) the information must be presented objectively and civilly.58 The court considered the social utility of the information, its incompleteness, and the harm and potential harm, with a special focus on time.59

In 2004, the Italian data-protection agency (Garante) interpreted the data-quality principle in article 11 of its data-protection legislation to require the deletion of personal data that no longer serves the initial objective of its collection, particularly upon notification from the data subject.60 In 2005, the Garante dealt with the online retrieval of an Italian Antitrust Authority decision against a company for misleading advertisements.61 The agency determined that external search engines should be restricted from crawling the information by applying “Robot Meta Tags” to certain pages and asked the Antitrust Authority to create a time-based policy for hiding the sanctions.62 After five years, the Antitrust Authority would apply metatags to the pages to signal to external crawlers that the pages should not be indexed.63 In a decision out of the Third Civil Division in April 2012, the Italian Supreme Court of Cassation in Rome ordered online news archives to be kept up-to-date so that the right to be forgotten could be enforced in terms of keeping personal data accurate and timely.64 The ruling was directed at Italian newspapers like the Corriere della Sera, one of Italy’s most dominant news sources.65 The case arose when search engines were connecting a politician from Lombardy to news coverage of his 1993 arrest and charges of corruption for which he was later acquitted.66 Finding no recourse with the Garante or the Court of Milan, the claimant filed suit with the Court of Cassation to at least resolve the incompleteness of the information in the digital news archives.67 The court agreed that the information was incomplete in light of later events and that archived articles must be accompanied by relevant updates.68
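The “Robot Meta Tags” at issue are the standard robots-exclusion mechanism: an HTML meta element that asks compliant crawlers (Googlebot, for example) not to index a page while leaving the page itself online. The following is a minimal sketch of the five-year policy described above; the function and field names are hypothetical, since the decisions themselves specify no code.

```python
from datetime import date, timedelta

# Minimal sketch of the time-based de-indexing policy described above.
# The five-year window comes from the text; everything else is hypothetical.
NOINDEX_TAG = '<meta name="robots" content="noindex, nofollow">'
RETENTION = timedelta(days=5 * 365)

def robots_meta(published: date, today: date) -> str:
    """Return a noindex tag once a sanction page is older than five years."""
    if today - published > RETENTION:
        # Compliant crawlers drop the page from their indexes on the next visit.
        return NOINDEX_TAG
    return ""  # page remains indexable while still recent

# A sanction published in 2005 would carry the tag from 2010 onward.
print(robots_meta(date(2005, 3, 1), today=date(2012, 1, 1)))
```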

The Spanish data-protection agency (AEPD) has ardently recognized and extended the right to be forgotten. With a firm constitutional foundation in article 10 of the Spanish constitution, which provides a right to the free development of personality, the AEPD established the Spanish right to be forgotten in the principles of collection limitation, purpose specification, and data quality. However, Spain expanded the concept on the basis of its particular privacy position. The AEPD had no problem granting data subjects who were not public figures or part of a news story the right to correct or respond to personal data online.69 More so than any other country, Spain relies on the right to withdraw consent, found in article 6.3 of its data-protection law.

In an action that gained substantial U.S. attention, the AEPD brought the initial suit against Google to remove from its index URLs pointing to personal information that the agency had determined should appropriately be forgotten.70 More than ninety citizens filed formal complaints with the AEPD about online information, including a domestic-violence victim’s address and an old college arrest.71 After assessing the privacy concerns of each complaint and failing to persuade the source of the content to take action, the AEPD ordered Google to stop indexing the information. Google challenged the order, saying that editing the index “would have a profound chilling effect on free expression without protecting people’s privacy”72 and would violate the “objectivity” of the Internet.73

The dispute involved information like a notice of home repossession for nonpayment of social security and a reference to a plastic surgeon’s alleged botched operation that was settled out of court; both concern information produced and maintained by traditional news sources and retrieved by Google’s search engine when the individuals’ names are entered.74 Google appealed five of the determinations to the Audiencia Nacional, which in turn referred the matter to the Court of Justice of the European Union (CJEU) for clarification.75

In the midst of these distinct national efforts, the European Commission announced in 2009 that it would be reviewing and updating the 1995 Data Protection Directive.76 After more than three years of reflection, consultation, and debates, the European Commission published its proposal for a new European Data Protection Regulation (DP Regulation) in January 2012.77 As promised, the draft included a right to be forgotten. All of the aforementioned laws are relevant to the right to be forgotten in one way or another, but the DP Regulation contains an explicit, formalized, and mechanical right to erase personal data held by another party. The proposed DP Regulation, discussed more fully shortly, was met with great resistance and skepticism in the U.S.

But then, in the spring of 2014, something unexpected happened. It turned out that every European Union citizen had a right to be forgotten of sorts all along—well, since 1995 at least. The decision in the Google Spain and Google Inc. v. AEPD and González (Google v. AEPD) case that had been appealed through the Spanish court system and referred to the CJEU was handed down May 13, 2014; it interpreted the 1995 DP Directive to grant a right to be forgotten across the European Union.78 The named plaintiff in the case was Mario Costeja González, who had complained about a 1998 La Vanguardia notice regarding a real estate auction related to his social security debt. Although the matter had long since been resolved, the notice came up when people searched for him on Google. González first contacted Google Spain with a request to remove the notice from search results but was told that all search results are handled at Google headquarters in California. Unsuccessful, he took his issue to the AEPD. The AEPD pursued his claim against both Google Spain and Google but rejected González’s claim against the newspaper on the grounds that it had “lawfully published” the content.

When the Spanish court referred the question to the EU’s high court, the CJEU decided that Google’s search activity met the definition of a “data controller” under article 2(d) of the DP Directive because its collection, storage, analysis, and ranking of content were acts that “determined the purpose and means” of the personal data in its system. The court also found that, as a data controller, Google was “processing” personal data under article 2(b), which defines the term as “any operation or set of operations which is performed upon personal data, whether or not by automatic means, such as collection, recording, organisation, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, blocking, erasure, or destruction.”79

The CJEU next determined that the right to access in article 12 and the right to object in article 14 enable a data subject to address the search engine directly in order to prevent indexing. Article 12(b) grants each data subject the right to rectify, erase, or block data processing that does not comply with the Directive in articles 6 and 7. Under article 6, data processing is incompatible with the Directive if the data is inadequate, irrelevant, or no longer relevant; excessive in relation to the purposes of the processing; or kept after it has been used for the purpose for which it was collected. Article 7 provides the acceptable justifications for processing.

Additionally, article 14 provides a right to object to data processing. Article 14 requires member countries to provide a right to object for at least those data-processing practices that are legitimized as tasks of public interest (article 7(e)) or necessary for the legitimate interest of the data controller (article 7(f)) and when the data subject has “compelling and legitimate grounds.” This means that member states may not be required to provide a right to object when data processing is legitimized under the other justifications in article 7: the data subject has given his or her unambiguous consent, the processing is necessary to perform a contract, a legal obligation exists, or a vital interest of the data subject is being protected (article 7 (a)–(d)). Although it may read as a fairly limited right, the CJEU pointed out that “in each case the processing of personal data must be authorized . . . for the entire period during which it is carried out.”80

Substantively, the court offered very little guidance on implementing the rights it uncovered in the Directive. It did, however, find that data subjects cannot just delete something they dislike—they do not even need to dislike it. While González and the Spanish and Italian governments argued that data subjects should be able to enforce their right when search results are prejudicial against them (overriding the interest of the operator and general interest of freedom of information),81 the CJEU agreed with Google, the Greek, Austrian, and Polish governments, and the European Commission, which all argued for a seemingly more limited interpretation: in order to force removal, an individual’s personal data need only have lost its adequacy, relevance, or original purpose, or its processing lost its authorization.82 The court was frank but not clear:

As the data subject may, in the light of his fundamental rights under Articles 7 and 8 of the Charter, request that the information in question no longer be made available to the general public by its inclusion in such a list of results, it should be held . . . that those rights override, as a rule, not only the economic interest of the operator of the search engine but also the interest of the general public in finding that information upon a search relating to the data subject’s name. However, that would not be the case if it appeared, for particular reasons, such as the role played by the data subject in public life, that the interference with his fundamental rights is justified by the preponderant interest of the general public in having, on account of inclusion in the list of results, access to the information in question.83

Google argued—supported by Austria and Greece—that the publisher of the website is in the best position to assess the lawfulness of the content, to address its importance to the public, and to easily make the content inaccessible by making small technical adjustments that would prevent Google’s crawlers from indexing the content at issue.84 According to the opinion, Poland agreed that if the content is lawfully and truthfully published on the original page, a search engine is released from obligations to remove links to it.85 The CJEU did not see it this way, insisting that reaching search engines was vital to ensuring a high level of protection for fundamental rights in the Digital Age.86
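The “small technical adjustments” referred to are conventionally the robots exclusion standard: a robots.txt file at a site’s root telling crawlers which paths not to fetch. A brief sketch, with a hypothetical archive path standing in for a disputed page:

```python
# Sketch of a robots.txt entry blocking Google's crawler from one archive page.
# The path is hypothetical; a publisher would list the actual disputed URL.
robots_txt = (
    "User-agent: Googlebot\n"
    "Disallow: /archives/1998/auction-notice.html\n"
)

with open("robots.txt", "w", encoding="utf-8") as f:
    f.write(robots_txt)
```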

Procedurally, having categorized Google’s search activities as those of a data controller under the Directive, the court established a direct line between EU users and Google. It created a subject-to-controller takedown system for personal information, similar to that of the U.S. Digital Millennium Copyright Act (DMCA), discussed further in chapter 2. A takedown-notice system is more efficient than having to go through an administrative or judicial proceeding of some kind. The takedown system also has the benefit of avoiding the unwanted publicity that filing a privacy claim can bring. We can conclude that a takedown regime for the right to erasure would be (and is proving to be, as Google receives more takedown requests every day from the EU) relatively cheap, efficient, and privacy preserving. All a data subject must do is contact a data controller that would exercise removal of the information, and the data controller must comply unless retention would fall under one of the exceptions. Efficiency comes at a cost, and the DMCA has offered a lesson when it comes to the threat of litigation that hinges on data controllers’ and site operators’ interpretation of and reaction to uncertain legal exceptions. One study from 2006 found that a third of the DMCA takedown notices in the Chilling Effects database presented obvious questions for a court, such as fair-use determination or the legitimacy of the copyright.87 Additionally, 57 percent of notices were sent to target content of a competitor.88 Safeguards to prevent abuse, such as penalties for misrepresenting content or activity as infringing (17 U.S.C. § 512(f)), are included in the DMCA, though they have not been substantially effective. The right to erasure has no such safeguards.
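The decision sequence just described can be schematized roughly as follows. This is only a sketch of the flow the judgment implies (identity verification, the relevance grounds, and the role-in-public-life exception); the names and structure are hypothetical, not Google’s actual policy.

```python
from dataclasses import dataclass

@dataclass
class RemovalRequest:
    requester_verified: bool            # identity proof, as on Google's web form
    data_outdated_or_irrelevant: bool   # the article 6 grounds the CJEU relied on
    preponderant_public_interest: bool  # e.g., the subject's role in public life

def decide(req: RemovalRequest) -> str:
    """Schematic subject-to-controller takedown decision, per the judgment's logic."""
    if not req.requester_verified:
        return "reject: identity not verified"
    if req.preponderant_public_interest:
        return "retain: public interest in access prevails"
    if req.data_outdated_or_irrelevant:
        return "delist: remove the link from name-search results"
    return "retain: data still adequate, relevant, and authorized"

# A verified private individual objecting to an outdated notice is delisted.
print(decide(RemovalRequest(True, True, False)))
```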

The potential for abuse in user-initiated takedown systems is already incredibly high, but the added element of international uncertainty regarding the interpretation of numerous digital rights to be forgotten across the EU and their exceptions makes widespread abuse inevitable. Arguably, every right-to-be-forgotten takedown request would involve a substantive legal question related to the underlying claim. As a data controller, Google is obligated to consider each and every removal request from data subjects. It must do so with only the aforementioned minimal guidance and across twenty-eight different legal cultures. Each country will find different answers as to when and why the interference with a data subject’s fundamental rights under the Directive is justified in light of the general public interest to access the information.

Most importantly, no country has a richly developed digital right to be forgotten. No data controller could receive a right-to-be-forgotten request from any EU country, find clear guidance in legislation, regulation, or jurisprudence, and know how to respond legally. In The Future of Reputation, Daniel Solove argues, “The goal of the law should be to encourage the development of norms and to spur people to work out their disputes informally.”89 However, informality must take into account a lack of legal certainty and chilling effects. Attempting to knock Google down a few pegs, the CJEU instead further empowered the search giant by essentially putting the right to be forgotten in the company’s hands to be shaped. Google’s policies, instead of a data-protection agency or court, now serve as the jurisprudence for the right to be forgotten in Europe.

The decision was surprising because both the Directive and Google have been around for some time without this determination having been made. The opinion of the advocate general, a preliminary opinion generally relied on by the CJEU, had come to an almost completely opposite conclusion less than a year earlier. Advocate General Jääskinen highlighted the many changes that had occurred since the 1995 Directive and stated that search engines were still “at their nascent stage. . . . The provisions of the Directive simply do not take into account the fact that enormous masses of decentrally hosted electronic documents and files are accessible from anywhere on the globe and that their contents can be copied and analysed and disseminated by parties having no relation whatsoever to their authors or those who have uploaded them onto a host server connected to the internet.”90 Jääskinen found that the objectives of the Directive required an interpretation that had balanced and reasonable outcomes91 and that to find Google a data controller would be a “blind literal interpretation” of the controller concept.92 Additionally, he found no right to be forgotten in the Directive. He stated that “a subjective preference alone does not amount to a compelling legitimate ground within the meaning of Art. 14 (a),” and he was not inclined to agree that a data subject is entitled to restrict or delete disclosed personal data “that he considers to be harmful or contrary to his interests.”93 This was the general understanding of data subjects’ right to object in article 14 and right to erasure in article 12. In fact, when the European Commissioner discussed the Commission’s intention to revamp the Directive with a Regulation in 2010, she stated that the DP Regulation would introduce the right to be forgotten.94

Although doubts about the advisability of a right to be forgotten exist after the CJEU opinion, there can be no doubt about the demand for digital redemption after the flood of requests that followed. Once Google created an online form for EU citizens to fill out describing their claim to be forgotten and verifying their identity, the company received 12,000 requests in the first day, 40,000 requests over the first four days, and 70,000 requests to remove 250,000 links by the end of the first month. The surge of requests settled into about 1,000 requests per day, with France submitting the most requests at 14,086, followed by Germany with 12,678 and the United Kingdom with 8,497 (half of which reportedly related to criminal convictions).95 Faced with mounting compliance costs, Google’s team granted the removal of 70,000 links within the first month and created a committee to help the company “be more European.”96

The decision does not mean that the inclusion of the right to be forgotten in the DP Regulation is now irrelevant—far from it. The Directive required each member country to provide protection for certain rights within its own national legal framework. There is a great deal of variation, as the input from various countries in the Google v. AEPD case illustrates. As the foregoing cursory look at only a few EU countries reveals, each European country has a different history when it comes to data, speech, access, public interest, public figures, privacy, identity, reputation, self-determination, and self-presentation. Some of these countries may be similar and their treatment appropriately converged into one, but others will not merge seamlessly. The point of the DP Regulation is to update the DP Directive from 1995 to meet the technological realities of today and the future, as well as to harmonize data-protection regimes throughout the EU in order to streamline compliance. The DP Regulation intends to set forth a single set of rules, with national data-protection agencies coordinated by a European Data Protection Board to resolve fragmentation and incoherence.

As mentioned, the European Commission has proposed its long-expected draft revision of the European data-protection legal framework.97 Although a final version is not expected to be adopted until the end of 2015 or early 2016, it is useful to evaluate the current DP Regulation proposal, which already represents four years of reflection and three stages of EU legislative decision making, in order to assess how it may resolve the issues that have already presented themselves since the Google v. AEPD case.

The DP Regulation’s objectives can largely be divided into two main categories: (1) the proposal tries to strengthen individuals’ control over their personal data, and (2) it tries to provide legal certainty and minimize administrative burdens for businesses. One of the elements included to achieve the first objective is the introduction of an explicit “right to be forgotten and to erasure,” building on the existing right of erasure in the DP Directive. In the Commission’s words, this provision is intended “to ensure that when an individual no longer wants their personal data to be processed, and if there is no legitimate reason for an organisation to keep it, it should be removed.”98

When the Commission’s proposal was sent to the next phase in the EU legislative process, the title of article 17 was changed to just “the right to erasure” by the Civil Liberties, Justice, and Home Affairs (LIBE) committee.99 The language I quote shortly is the language in the LIBE draft accepted by the EU Parliament. It will proceed to the EU Council, which may make additional alterations, but an agreed-on text must be established between Parliament and the Council in order to be enacted.

It is important to consider the overall scope of application of the DP Regulation itself. The right does not create obligations for activities that fall within the “household” exception, which covers processing by “a natural person without any gainful interest in the course of its own exclusively personal or household activity” (art. 2(d)). Besides this personal use or household exemption, the Regulation is also not applicable to data processing in the context of national security issues or criminal investigations. Otherwise, article 17 is far reaching.

The first paragraph describes four situations in which the data subject has “the right to obtain from the controller the erasure of personal data relating to them and the abstention from further dissemination of such data”:

(a) the data are no longer necessary in relation to the purposes for which they were collected or otherwise processed; (b) the data subject withdraws consent on which the processing is based . . . , or when the storage period consented to has expired, and where there is no other legal ground for the processing of the data; (c) the data subject objects to the processing of personal data pursuant to Article 19 [right to object]; (c)(a) a court or regulatory authority based in the Union has ruled as final and absolute that the data concerned must be erased; (d) the data has been unlawfully processed.

Thus, dependent on verification of the data subject’s identity, the right can be invoked whenever (a) the purpose-limitation principle is breached, (b) consent is withdrawn or the legitimate storage period has expired, (c) the right to object to data processing has been legally exercised or erasure has been ordered by a court or regulatory authority, or (d) the processing of the data is illegal (i.e., does not comply with the Regulation).

The purpose-limitation principle is not a new concept and can be found in the current DP Directive (art. 6), although it is rarely invoked in practice. The third and fourth grounds, too, can be found in the current framework. When data is processed illegally, the controller must remove it, and when an individual exercises his or her right to object, further processing is rendered illegal (and the data thus removable) as well. However, the second ground is more novel: consent withdrawal and expiration. The new Regulation tries to deemphasize reliance on a one-time consent framework, for example, by explicitly allowing for its withdrawal (art. 7(3)). The proposed right to erasure also follows this trend, seeking to establish a balanced environment where individuals can continually and effectively reevaluate their consent.100

The second paragraph,101 however, is the source of significant dispute, bringing to the forefront drastic differences between European and U.S. legal and cultural treatment of information privacy. This paragraph grants a right to the data subject that extends to circumstances in which the data controller has disclosed the personally identifiable information to the public (e.g., by publishing it on a website) or when publication is delegated to a third party. It reads, “Where the controller referred to in paragraph 1 has made the personal data public without a justification based on Article 6(1), it shall take all reasonable steps to have the data erased, including by third parties, without prejudice to Article 77. The controller shall inform the data subject, where possible, of the action taken by the relevant third parties” (art. 17(2)). The vagueness of the obligation to take “all reasonable steps to have data erased” is challenging. But the LIBE committee did edit the language that previously stated, “Where the controller has authorised a third party publication of personal data, the controller shall be considered responsible for that publication.” It was unclear when a publication would be considered “authorised” and what “responsible” duties and liabilities entailed—the nature of data sharing today made that obligation unworkable. The second paragraph is also puzzling when read in conjunction with article 13 of the proposed Regulation. This article already provides for a more general rule on data subjects’ rights to trace their data. This provision states that the controller has to communicate erasure requests to all recipients to whom personal data was disclosed, unless this proves to be impossible or disproportionate. The relationship between article 13 and article 17(2) is unclear. The current draft leaves much open to interpretation.

Most wanting for clarity is article 17(3), which outlines the exceptions to the right to be forgotten. In cases where the erasure request is based on a withdrawal of consent, personal data should not be erased when other legal grounds for processing the data exist. Furthermore, paragraph 3 determines that the data controller may retain the personal data at issue if it is necessary (a) to protect the right of freedom of expression, (b) for reasons of public interest in the area of public health, (c) for historical, statistical, and scientific research purposes, or (d) for compliance with a legal obligation to retain the personal data by Union or Member State law.102 The exceptions will define the right to be forgotten. It is not hard to establish a historical, statistical, or scientific justification for keeping any data. However, no further guidance on how to balance these vital, numerous, and circumstance-specific interests is provided. Personal data can also be retained—although the controller does have to restrict its processing103—(a) when its accuracy is contested by the data subject (for a period enabling the controller to verify the accuracy of the data), (b) for purposes of proof, (c) when the data subject is opposed to erasure (even though the processing is unlawful) and requests that the use of the data be restricted instead, and (d) for data-portability purposes. Although the applications of all these exceptions are not clear, their interpretations will make or break the right to be forgotten.
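Read together, paragraphs 1 and 3 of article 17 form a two-stage test: is at least one erasure ground made out, and if so, does any exception preserve the data? The following schematic sketch paraphrases the draft text; the identifiers are hypothetical, and it is a reading aid, not an implementation of any authority’s procedure.

```python
# Schematic two-stage test of draft article 17, paraphrased from the text above.
GROUNDS = {
    "purpose_fulfilled",       # 17(1)(a): data no longer necessary for original purpose
    "consent_withdrawn",       # 17(1)(b): consent withdrawn or storage period expired
    "objection_upheld",        # 17(1)(c)/(ca): objection exercised or court-ordered erasure
    "unlawful_processing",     # 17(1)(d): processing violates the Regulation
}

EXCEPTIONS = {
    "freedom_of_expression",                       # 17(3)(a)
    "public_health_interest",                      # 17(3)(b)
    "historical_statistical_scientific_research",  # 17(3)(c)
    "legal_retention_obligation",                  # 17(3)(d)
}

def erasure_required(ground: str, applicable_exceptions: set[str]) -> bool:
    """True when a 17(1) ground is made out and no 17(3) exception preserves the data."""
    return ground in GROUNDS and not (applicable_exceptions & EXCEPTIONS)

# Consent is withdrawn, but a statutory retention duty applies: no erasure.
print(erasure_required("consent_withdrawn", {"legal_retention_obligation"}))  # False
```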

All of this vagueness is particularly troublesome in light of the chilling effects associated with user takedown systems and because of the Regulation’s stern penalties, which the Commission had set at a fine of up to €1 million or up to 2 percent of a company’s annual worldwide turnover and which the LIBE committee raised to a fine of up to €100 million or up to 5 percent of annual worldwide turnover, whichever is greater (art. 79(2a)(c)). At this point, article 17(9) may be the most important part of the entire section; in it, the Commission reserves the right to adopt “delegated acts” in order to specify criteria and requirements for the application of the right in specific sectors and situations but also to specify conditions with regard to the enigmatic second paragraph.

There are also international measures designed to guide all countries that participate in the councils and groups that craft the measures. The Fair Information Practice Principles (FIPPs) lay some foundation on which a right to be forgotten may find footing. The FIPPs have gone through many iterations and vary by sector. The Organization for Economic Cooperation and Development (OECD) proposed privacy guidelines104 similar to those developed by the Council of Europe,105 both in 1980. The principles in the OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data are as follows: collection limitation, data quality, purpose specification, use limitation, security safeguards, openness, individual participation (including the right to challenge data relating to oneself and, if the challenge is successful, to have the data erased, rectified, completed, or amended), and accountability.106

This version of the FIPPs explicitly gives the individual user a right to delete personal data, but only when denial of a request is successfully challenged. The OECD’s principles have been widely embraced, including by the Asia-Pacific Economic Cooperation (APEC) Privacy Framework. The FIPPs were most recently utilized by the Obama White House in a report by the National Strategy for Trusted Identities in Cyberspace (NSTIC).107 However, the NSTIC set of FIPPs does not include user-initiated deletion in its individual-participation or data-minimization principles, although, as we have seen, the combination of the two can offer users a claim to delete data that has fulfilled its original purpose, depending on how the principles are enforced. Again, though, these are principles to be incorporated and enforced (or not) by various legal cultures around the world. All of the mentioned FIPPs have been offered to members as guidelines.

There is no international law currently used to implement digital redemption, but there may be one on the books. The 1966 International Covenant on Civil and Political Rights (ICCPR) has since its adoption included a right to privacy in article 17. It reads,

  1. No one shall be subject to arbitrary or unlawful interference with his privacy, family, home, or correspondence, nor to unlawful attacks upon his honour and reputation.
  2. Everyone has the right to the protection of the law against such interference or attacks.108

Although this reads similarly to the ECHR’s article 8 and the Charter’s article 7, General Comment 16 of 1988 distinguishes the UN’s focus from these two European documents, which are much more encompassing. The ICCPR’s article 17 refers to the need for limits on collection, security measures, and access to collected data but little else relative to the principles that had already been stated prominently elsewhere. The UN is not perceived as a strong source of data privacy; even so, the General Comment on article 17 explicitly states, “Every individual should also be able to ascertain which public authorities or private individuals or bodies control or may control their files. If such files contain incorrect personal data or have been collected or processed contrary to the provisions of the law, every individual should have the right to request rectification or elimination.”109 There may be a right to be forgotten written into the ICCPR, making it an international human right, but the right to be forgotten is not likely the UN’s highest priority in terms of protecting information privacy, family privacy, or individual honor; nor has the UN taken the lead on data privacy issues. Furthermore, the ICCPR also mandates the freedom of expression in article 19, and the U.S. has made an official reservation that none of the articles should restrict the freedom of expression.