1 Bender on Privacy and Data Protection § 31.03[1] (2020)
Note: Effective May 25, 2018, the Directive was replaced by the General Data Protection Regulation.
1 Bender on Privacy and Data Protection § 31.03[1][a] (2020)
The Directive sets out the framework for EU data protection law. It speaks to the principles required for maintaining the quality of personal data: personal data must be processed fairly and lawfully; collected only for legitimate stated purposes; not excessive in relation to those purposes; kept accurate and updated, with reasonable steps for modifying inaccurate data; and kept in a form that permits identification of persons for no longer than necessary for the purposes. Moreover, personal data may be processed only if one of the following conditions is met: the subject has given consent; processing is necessary to perform a contract to which the subject is party; processing is necessary for the controller to meet a legal obligation; processing is necessary to protect vital interests of the subject; processing is necessary for the public interest; or where the legitimate interests pursued by the controller (or by the third party to whom the data is disclosed) outweigh the interests of the subject.
The Directive deals with the special nature of “sensitive” data. It also mandates the information that must be given a data subject regarding his or her data, and the subject’s right to demand that inaccurate or incomplete data be rectified. Further, it deals with the confidentiality and security that must attend processing, and with notification that must be given data subjects. In one of the more basic provisions, it requires generally a notification to the pertinent DPA before undertaking processing; the collection of these notifications comprises a public record. Moreover, aside from any administrative remedy, every person must have a right to a judicial remedy for a breach of these rights. And each Member State must provide that any person injured by unlawful processing is entitled to compensation from the data controller.
Important sections of the Directive deal with the requirements of exporting personal data from the EU. One section requires creation of an independent governmental entity responsible for monitoring the application of the national data protection law.
The Directive also creates the “Article 29 Working Party,” an independent advisory organ of the EU, comprising a member from the DPA of each Member State, a representative of the EU privacy agency, and a representative of the European Commission. At regular intervals, a report must be made to the EU on the implementation of the Directive, with any recommended amendments.
Articles 1–4 constitute the “General Provisions.”
Object. Article 1 notes that the object of the Directive is to protect fundamental rights of natural persons (especially the right to privacy with respect to processing personal data), and also states that the Member States are not to restrict the free flow of personal data between and among Member States for reasons connected with that protection.
Definitions. Article 2 defines several terms. “Personal Data” is information concerning an identified or identifiable individual.2 A “controller” is the person or entity that determines the purposes and means for processing personal data.
Scope. Article 3 specifies that the Directive applies to the processing of data partly or entirely by automatic means, and to other processing where the data is intended to form part of a filing system. Further, the Directive does not apply to processing involved with state security or defense, to state activities in connection with criminal law, and to processing by an individual for personal or household purposes.
National Law. Article 4 states that the national law enacted pursuant to this Directive applies to the activities of an establishment of the controller of the data in that jurisdiction or, where the controller is not established in the EU, where it makes use of equipment situated in the jurisdiction to process data.
Articles 5–21 make up the “General Rules on the Lawfulness of Processing of Personal Data.”
- Article 5: Specificity of National Laws
- Article 6: Principles Relating to Data Quality
- Article 7: Criteria for Making Data Processing Legitimate
- Articles 8 and 9: Special Categories of Processing
- Articles 10 and 11: Information to be given to the Data Subject
- Article 12: The Data Subject’s Right of Access to Data
- Article 13: Exemptions and Restrictions
- Articles 14 and 15: The Data Subject’s Right to Object
- Articles 16 and 17: Confidentiality and Security of Processing
- Articles 18–21: Notification
- Articles 22–24: Judicial Remedies, Liability and Sanctions
- Article 25: Conditions for Third-Country Transfers
- Article 26: Exemptions for Third-Country Transfers
- Article 27: Codes of Conduct
- Articles 28–30: Supervisory Authority and Working Party on the Protection of Individuals with Regard to the Processing of Personal Data
- Article 31: Community Implementing Measures; and
- Articles 32–34: Final Provisions
More Detail. Article 5 states that the Member State national laws will, consistently with the principles of the Directive, determine more specifically than the Directive the conditions under which processing of personal data is lawful.
Data Quality. Article 6 requires that personal data must be: processed fairly and lawfully; collected only for legitimate stated purposes; not excessive in relation to those purposes; kept accurate and updated, with reasonable steps for modifying inaccurate data; and kept in a form that permits identification of persons for no longer than necessary for the purposes.
Legitimacy. Article 7 provides that personal data may be processed only if one of the following conditions is met: the subject has given consent; processing is necessary to perform a contract to which the subject is party; processing is necessary for the controller to meet a legal obligation; processing is necessary to protect vital interests of the subject; processing is necessary for the public interest; or where the legitimate interests pursued by the controller (or by the third party to whom the data is disclosed) outweigh the interests of the subject.
Processing of Special Data Categories. Article 8 states that, with certain exemptions, there shall be no processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade-union membership, or data regarding health or sex life. The exemptions comprise: situations where a person has consented, where employment law requires it, where processing is necessary to protect the vital interests of a subject incapable of consenting, where done in the course of legitimate activities by a non-profit organization, where the subject has made the data public, or where necessary for preventive medicine. The Member States are free to add exemptions. Processing of data related to the criminal justice system is subject to additional strictures, and each Member State may require that data regarding the civil justice system also be subject to additional strictures.
Journalistic, Artistic or Literary Expression. Pursuant to Article 9, the Member States may provide exemptions for data that is processed solely for journalistic purposes, or for purposes of artistic or literary expression.
Information Available to Subject. Article 10 requires that where data is collected from the subject, the controller must give the subject certain information: the controller’s identity; the purposes of the processing; and, where necessary to guarantee fair processing, the identity of the recipients (or categories of recipients); whether replies are obligatory and the possible consequences of failure to reply; and the existence of a right to access and rectify data.
Disclosure of Purpose. Article 11 states that, where the data has not been obtained from the subject, the controller must, at the time of recording or disclosure to a third party, except where (in particular for historical or statistical purposes) providing the information proves impossible or would involve disproportionate effort, disclose to the subject the controller’s identity, the purposes of the processing, and further information necessary to “guarantee fair processing,” such as categories of data concerned, recipients of data and the existence of a right of access and rectification.
Right of Access. Article 12 sets forth a right of access. The subject has the right to obtain from the controller confirmation of whether data regarding him or her is being processed (with information on the purposes, categories of data and recipients), a copy of the data and available information as to source, and a description of the logic involved in processing any data regarding him or her that results in a decision that has legal effect on the subject. The subject also has the right to obtain rectification, erasure or blocking of data whose processing does not comply with the Directive, and (unless impractical) notification to third party recipients of any such rectification, erasure or blocking.
Exemptions and Restrictions. Article 13 gives the Member States the right to restrict the scope of obligations imposed by most of the provisions of the Directive when necessary to safeguard national security, conduct criminal investigations, undertake public security matters, safeguard important economic interests of a Member State or the EU, or monitor regulatory functions. Subject to “adequate safeguards,” and where there is no risk of breaching the subject’s privacy, Member States may restrict the data subject’s rights under Article 12 when data is processed solely for purposes of scientific research, or kept in personal form for no more than the period necessary for that purpose.
Subject’s Right to Object. Article 14 sets forth the subject’s right to object. The subject has a right to object to processing of his or her data necessary for the public interest or for legitimate interests of the controller or a third party, based on the subject’s “particular situation.” The subject may also object to the processing of his or her data which the controller expects to process for direct marketing purposes, or to be informed before that data is initially disclosed to third parties or used on their behalf for purposes of direct marketing, and the subject must be offered the right to object, free of charge, to such disclosures or uses. In addition, the Member States are required to take steps to ascertain that subjects are aware of these rights regarding direct marketing data.
Automated Decisions. Article 15 requires that, in general, no person may be subject to a decision producing a legal effect on him, where that decision is based solely on automated data processing that evaluates his or her work performance, creditworthiness, reliability, conduct, etc., unless that decision is taken in the course of entering into or performing a contract, or is authorized by a law that provides safeguards for the subject’s interests.
Authority to Process. Article 16 requires that any person acting under the authority of the controller or the processor who has access to personal data must not process it except on instructions from the controller, unless required to do so by law.
Processing Security. Article 17 deals with security, and provides that the controller is required to implement appropriate measures to protect the data against accidental or unlawful destruction or loss, alteration, unauthorized disclosure or access (especially where transmission is involved), and against other forms of unlawful processing. The security level must be appropriate to the risks represented, with regard to state of the art, and cost. Where the controller does not itself do the processing, it must choose a processor that provides sufficient security guarantees, and must ensure compliance. Further, there must be a written contract between controller and processor stating that the processor shall act only on instructions from the controller, and the security obligations set forth immediately above shall apply to the processor.
Agency Notification. Pursuant to Article 18, the controller must notify the Member State agency responsible for compliance with privacy laws before undertaking any partially or entirely automatic processing (except for processing whose sole purpose is to keep a register intended to provide public information, or for processing by a non-profit organization in the course of its activities). Member States may provide a simplification or exemption only for processing categories that are unlikely adversely to affect rights of the subjects, where there is a description of the purposes of the processing, the data or categories thereof, the recipients or categories thereof, and the length of storage time, and/or where the controller appoints a data protection official responsible for ensuring independent application of the privacy law, and maintaining a register of processing operations. And the Member States may provide that non-automatic processing may have a notification requirement as stated above, or one that is more simplified.
Notification Contents. Article 19 describes the content of the notice required in Article 18. At a minimum, it must contain: controller’s name and address; purpose of processing; description of categories of data and of subjects; recipients or categories thereof; proposed transfers to nations outside the EU; and a general description for the purpose of making a preliminary assessment of the sufficiency of security measures. Further, the Member States shall specify procedures under which any change affecting any of this information must be communicated to the supervising agency.
Prior Checking. Article 20 requires the Member States to identify those processing operations likely to present risks to individual rights, and requires the supervisory agency, after receipt of the notification from the controller, to ascertain that those operations are examined before they start. This obligation on the part of the Member States may be met by legislation or regulations defining the type of processing at issue, and defining appropriate safeguards.
Publication of Processing. Under Article 21, each Member State must provide a public register of processing operations as to which notice is given pursuant to Article 18, with the information listed in Article 19 (except for the general description). In addition, for processing as to which Article 18 imposes no duty to notify, an organization must provide similar information to any person who requests it. However, this provision does not apply to processing whose sole purpose is to maintain a public register.
Remedies. Article 22 provides that, without prejudice to any administrative remedy, every person must have a right to a judicial remedy for a breach of the rights guaranteed him or her by national law.
Liability. Under Article 23, each Member State must provide that any person injured by unlawful processing, or any act incompatible with national law adopted pursuant to this Directive, is entitled to compensation from the controller (unless the controller can show it was not responsible for causing the injury).
Sanctions. Pursuant to Article 24, the Member States are required to adopt suitable measures to ensure full implementation of this Directive, and to ensure sanctions for violation of provisions adopted pursuant to it.
Principles of Third-Country Transfers. According to Article 25, transfer to a country outside the EU of personal data that is undergoing processing or intended to be processed after transfer, is permitted only if the third country “ensures an adequate level of protection.”3 The Member States and the EU must inform each other as to the identity of third countries they believe fall short of the necessary level of protection. Where the EU finds (under Article 31(1)) that a third country falls short, the Member States must take measures necessary to prevent any transfer of data of the same type to that country. The EU will negotiate to remedy such deficiencies. The EU may find that a third country ensures an adequate level of protection by reason of its domestic law or its international commitments.
Exceptions for Third-Country Transfers.4 Article 26 creates exceptions to Article 25. Transfer to a third country lacking adequate protection may take place if: the subject has consented; transfer is necessary to perform a contract between the subject and the controller; transfer is necessary to perform a contract concluded in the interest of the subject between the controller and a third party; transfer is necessary on important public interest grounds; it is necessary to protect the vital interests of the subject; or it is made from a public register. Also, transfers to third countries lacking adequate protection are authorized where the controller “adduces adequate safeguards,” which safeguards may result from contracts. All such authorizations that a Member State so grants shall be notified to the EU. The EU may determine that certain standard contractual clauses offer sufficient safeguards, in which case Member States shall take necessary measures to comply.
Codes of Conduct. Pursuant to Article 27, the Member States and the EU shall encourage codes of conduct directed to proper implementation of national provisions pursuant to this Directive. The Member State privacy agencies shall comment on draft codes submitted to them by trade associations or other bodies representing categories of controllers. Draft Community Codes may be submitted to the Working Party pursuant to Article 29 for a determination of whether they comply with national law enacted pursuant to this Directive.
National Privacy Agencies. According to Article 28, each Member State must have an agency for monitoring the application of provisions enacted by that Member State pursuant to this Directive. These agencies shall have the power to: collect information necessary to their duties; intervene, ensure publication, order blocking, erasure or destruction of data, and enjoin processing; and engage in legal proceedings for violation of national implementing provisions, or call such violations to the attention of judicial authorities. Each such agency shall hear claims of persons or associations concerning the protection of personal data rights. Each such agency shall regularly issue a public report on its activities. And each such agency may be asked to exercise its powers by an agency of another Member State. The privacy agencies of the Member States shall cooperate with each other. Agency staff shall be subject to confidentiality obligations extending beyond their periods of employment.
Working Party. Article 29 establishes the “Working Party,” an independent organ of the EU with advisory status. That organ will comprise a member from the privacy agency of each Member State, a representative of the EU privacy agency, and a representative of the European Commission. Decisions shall result from a simple majority of the members.
Functions of the Working Party. According to Article 30, the Working Party shall examine questions of application of national implementing law so as to contribute to uniform application; give the EU opinions on protection levels in the EU and third countries; advise the EU on proposed amendments to this Directive; and opine on codes of conduct drafted at the EU level. Where the Working Party finds divergences likely to affect equivalence of protection between the laws or practices of Member States, it is to inform the Commission. The Working Party is to issue an annual public report on the protection of individuals as to processing of personal data in the EU and third countries.
The Committee. Article 31 establishes a Committee composed of representatives of the Member States and chaired by a representative of the EU. The EU representative shall submit to the Committee a draft of measures to be taken, and the Committee shall opine on the draft. Voting shall be weighted as set forth in the Treaty of Rome. Where the Committee concurs with a measure, the measure shall apply immediately. Where the Committee does not concur, the Council shall be advised, in which event application of the measure shall be deferred for three months, unless the Council takes a different decision prior to the end of the three month period.
Final Provisions. The Member States were required to have their implementing legislation in place three years after the date of the Directive (i.e., by 24 October 1998). Processing under way at the time implementing national legislation is adopted must comply within three years of this date. However, the legislation may provide that the processing of data held in manual files on the effective date of national legislation shall conform with Articles 6, 7, and 8 of this Directive within twelve years of the date on which the legislation is adopted, provided that the subject has the right to obtain rectification, erasure or destruction of data that is incomplete, inaccurate or stored in a way incompatible with the legitimate purposes of the controller. And Member States may provide that data kept only for historical research need not conform with Articles 6, 7, and 8.
1 Bender on Privacy and Data Protection § 31.03[1][b] (2020)
The Data Protection Directive applies to “personal data.”4.1 The Directive defines personal data as “any information relating to an identified or identifiable natural person.”4.2 And it defines an identifiable person as “one who can be identified, directly or indirectly, in particular by reference to an identification number or to one or more factors specific to his physical, physiological, mental, economic, cultural or social identity.”4.3 It follows that if personal data can be anonymized (i.e., the resulting data does not relate to an identified or identifiable person), the Directive does not apply to the anonymous data. The primary EU authority on anonymization is the Article 29 Working Party’s paper on Anonymization Techniques, released in 20144.4 and summarized below. Creating an anonymous data set from a rich set of personal data, while retaining much of the underlying information (as may be required for specific purposes), is not easy.
Data Protection Directive Recital 26 states that “to determine whether a person is identifiable, account should be taken of all the means likely reasonably to be used either by the controller or by any other person to identify said person.” That Recital further states that “the principles of protection shall not apply to data rendered anonymous in such a way that the data subject is no longer identifiable.” When data is de-identified to be anonymous, that process must be irreversible. “ … the outcome of anonymization as a technique applied to personal data should be, in the current state of technology, as permanent as erasure.”
According to WP29, four key features attend anonymization:
- It can result from processing personal data with the aim of irreversibly preventing identification;
- Several anonymization techniques may be envisaged, as there is no prescriptive standard;
- Context is important, and account must be taken of all means likely reasonably to be used for identification “by the controller and third parties,” with special attention to what has lately become “likely reasonably” in the current state of technology; and
- There is an inherent risk factor that must be considered in assessing validity of any anonymization technique, including possible use of resulting data and its severity and likelihood.
Moreover, there is an obligation to retain personal data in an identifiable format to enable, for example, providing access rights to data subjects.4.5 ECJ4.6 precedent holds that the Directive requires “a right of access to information on the recipients or categories of recipients of personal data and on the content of the data disclosed not only in respect of the present but also in respect of the past.” The controller’s legitimate interest must be balanced against the data subject’s right to fundamental freedoms.
Because research tools and computational power evolve, it is not possible to provide an enumeration of circumstances where identification is no longer possible. But there are a few key factors. First, controllers should focus on concrete means necessary to reverse the anonymization technique, regarding cost, know-how, and assessment of likelihood and severity. Identification risk may increase over time. Also, “means likely to be used” are those to be used “by the controller or by any other person.” Thus, according to WP29, when a controller maintains the original (identifiable) data, and transmits data that in and of itself is non-identifiable, the transmitted data is identifiable. In its discussion in this regard, WP29 does not seem to consider whether it is “likely reasonable” that the controller or any third party could render the transmitted data identifiable. Rather, WP29 seems to conclude that, because the original identifiable data still is in the controller’s possession, the transmitted data is identifiable. It is submitted that this conclusion substitutes in Recital 26 the notion of “conceivable” for that of “likely reasonable.”
Moreover, WP29 espouses that removing directly identifying elements in itself is insufficient to ensure that identification is impossible. Anonymization “prevents all parties from singling out an individual in a dataset, from linking two records within a dataset (or between two separate datasets), and from inferring any information in such dataset.”
WP29 lists three risks of using anonymous data. One risk is to consider pseudonymized data to be equivalent to anonymized data; pseudonymity is likely to allow for identifiability. A second mistake is to assume that, once data is anonymized, individuals have no remaining safeguards, in particular because other legislation may still apply to that data. And finally, there is the risk of not considering the impact on individuals of anonymized data, especially where it is used for profiling.
WP29 identified three risks essential to anonymization: singling out (the possibility of isolating some records that identify an individual); linkability (the ability to link at least two records concerning the data subject in the same or multiple databases); and inference (the possibility of deducing, with significant probability of being correct, the value of an attribute from the values of a set of other attributes). And there are two different approaches to anonymization: randomization and generalization.
Randomization techniques alter the veracity of the data to remove the strong link between it and the individual. One form of randomization is noise addition, where attributes in the dataset are modified so that they are less accurate, while retaining the overall distribution. If height is reported in centimeters, randomization might involve reporting it only to within ±10 centimeters. With this technique, an individual cannot be identified. It will still be possible to single out the records of an individual,4.7 and to link records of the same individual; inference may be possible, but with a lower success rate. Common mistakes include adding inconsistent noise, and assuming that noise addition is sufficient to anonymize.
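By way of illustration only (WP216 itself contains no code), the following Python sketch shows what noise addition might look like on a hypothetical height attribute: each value is perturbed within ±10 centimeters, so individual values become less accurate while the overall distribution is roughly preserved. The record layout, field names, and noise magnitude are assumptions chosen for the example.

```python
import random

# Hypothetical records; the attribute names are illustrative only.
records = [
    {"id": 1, "height_cm": 172},
    {"id": 2, "height_cm": 185},
    {"id": 3, "height_cm": 160},
]

def add_noise(rows, attribute, magnitude=10):
    """Return a copy of rows with uniform noise of +/- `magnitude` added to one attribute."""
    noisy = []
    for row in rows:
        perturbed = dict(row)
        perturbed[attribute] = row[attribute] + random.randint(-magnitude, magnitude)
        noisy.append(perturbed)
    return noisy

print(add_noise(records, "height_cm"))
```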
Another randomization technique is permutation, where values of attributes are shuffled so that some are artificially linked to different data subjects. This assures that range and distribution of values remain the same. With permutation, it is still possible to single out an individual; correct linking is not possible (although incorrect linking is); and inference may still be drawn. Common mistakes in permutation include selecting the wrong attribute (permuting non-sensitive attributes won’t increase data protection); permuting attributes randomly (randomly permuting two strongly correlated attributes won’t provide a strong guarantee); and assuming that permutation suffices.
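A minimal sketch of permutation, again purely illustrative: one attribute is shuffled across records, so its range and distribution are unchanged while its link to any particular individual becomes artificial. The dataset and attribute names are hypothetical.

```python
import random

# Hypothetical records; only the "salary" attribute is permuted.
records = [
    {"name": "A", "salary": 30000},
    {"name": "B", "salary": 45000},
    {"name": "C", "salary": 52000},
]

def permute_attribute(rows, attribute):
    """Shuffle one attribute across records; range and distribution stay the same."""
    values = [row[attribute] for row in rows]
    random.shuffle(values)
    return [{**row, attribute: value} for row, value in zip(rows, values)]

print(permute_attribute(records, "salary"))
```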
Yet another randomization technique is differential privacy, where the controller generates anonymized views of a dataset while retaining the original. Differential privacy informs the controller how much noise it must introduce, and the form, to obtain the necessary privacy. Differential privacy can guard against singling out, but not necessarily against linkability, and not against inference. A common mistake is not injecting sufficient noise.
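The sketch below shows one common way differential privacy is achieved in practice, the Laplace mechanism applied to a count query; WP216 does not prescribe any particular mechanism, and the dataset, the epsilon value, and the function names here are assumptions for illustration. Because a count changes by at most one when any single person is added or removed (sensitivity 1), Laplace noise with scale 1/epsilon provides the guarantee.

```python
import random

def laplace_noise(scale):
    """Laplace(0, scale) noise, drawn as the difference of two exponential samples."""
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def noisy_count(values, predicate, epsilon=1.0):
    """Release a count query with epsilon-differential privacy (sensitivity of a count is 1)."""
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_noise(1.0 / epsilon)

ages = [23, 37, 41, 19, 52, 33]
print(noisy_count(ages, lambda a: a >= 40, epsilon=0.5))
```

Repeated queries consume the privacy budget, which is consistent with WP29’s later recommendation that queries be tracked because intrusiveness is cumulative.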
Generalization is the dilution of the attributes of data subjects by modifying the scale or order of magnitude (e.g., using a region instead of a city). Aggregation and K-anonymity aim to prevent singling out by grouping individuals with at least K others. To permit this, attribute values are generalized to an extent such that each individual shares the same value. It is not possible to single out, but it is possible to link records by groups of K users, and inferencing is possible. Common mistakes include missing some quasi-identifiers (the greater K is, the greater the privacy, but it is a mistake artificially to inflate K by reducing the set of quasi-identifiers); using a small value of K (the weight of any individual in a cluster is then too significant, and inference becomes easier); and not grouping individuals with the same weight (K must be sufficiently high so that no individuals represent too important a fraction of the entries in a cluster). The main failing of K-anonymity is that it will not prevent inferencing.
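As a purely illustrative sketch of generalization and K-anonymity (the records, quasi-identifiers, and generalization rules are assumptions), the code below coarsens a postal code to a region and an exact age to a ten-year band, then reports the smallest group size K over those generalized quasi-identifiers.

```python
from collections import Counter

# Hypothetical records with two quasi-identifiers (zip, age) and one sensitive attribute.
records = [
    {"zip": "10115", "age": 34, "diagnosis": "flu"},
    {"zip": "10117", "age": 36, "diagnosis": "asthma"},
    {"zip": "10119", "age": 38, "diagnosis": "flu"},
    {"zip": "20095", "age": 52, "diagnosis": "diabetes"},
    {"zip": "20097", "age": 57, "diagnosis": "flu"},
    {"zip": "20099", "age": 59, "diagnosis": "asthma"},
]

def generalize(row):
    """Coarsen the quasi-identifiers: two-digit region, ten-year age band."""
    decade = (row["age"] // 10) * 10
    return {
        "region": row["zip"][:2] + "***",
        "age_band": f"{decade}-{decade + 9}",
        "diagnosis": row["diagnosis"],
    }

generalized = [generalize(r) for r in records]
groups = Counter((r["region"], r["age_band"]) for r in generalized)
k = min(groups.values())
print(f"The generalized dataset is {k}-anonymous over (region, age_band).")
```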
Another form of generalization is L-diversity/T-closeness, wherein, in each equivalence class, every attribute has at least L different values. T-closeness seeks to create equivalence classes similar to the initial distribution of attributes. L-diversity and T-closeness preclude singling out; they do not prevent linkability; they preclude inferencing. Common mistakes include protecting sensitive attribute values by mixing them with other sensitive attributes.
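The sketch below, again using a hypothetical, already-generalized dataset, computes L-diversity as the smallest number of distinct sensitive values found in any equivalence class; it is meant only to make the definition concrete, not to reflect any prescribed method.

```python
from collections import defaultdict

# Hypothetical equivalence classes, already generalized over (region, age band).
generalized = [
    {"region": "10***", "age_band": "30-39", "diagnosis": "flu"},
    {"region": "10***", "age_band": "30-39", "diagnosis": "asthma"},
    {"region": "10***", "age_band": "30-39", "diagnosis": "flu"},
    {"region": "20***", "age_band": "50-59", "diagnosis": "diabetes"},
    {"region": "20***", "age_band": "50-59", "diagnosis": "flu"},
    {"region": "20***", "age_band": "50-59", "diagnosis": "asthma"},
]

def l_diversity(rows, quasi_identifiers, sensitive):
    """Smallest count of distinct sensitive values across all equivalence classes."""
    classes = defaultdict(set)
    for row in rows:
        key = tuple(row[q] for q in quasi_identifiers)
        classes[key].add(row[sensitive])
    return min(len(values) for values in classes.values())

# The first class has two distinct diagnoses and the second has three,
# so this dataset is 2-diverse.
print(l_diversity(generalized, ("region", "age_band"), "diagnosis"))
```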
WP216 deals also with pseudonymization, which it defines as the replacement of one attribute (typically unique) by another attribute. Pseudonymization alone does not result in anonymization.4.8 The most frequently used pseudonymization techniques are:
- Encryption with secret key—the key holder can easily re-identify each data subject through decryption; decryption of state-of-the-art encryption is possible only with the key.
- Hash function—this returns a fixed-size output from any input and is irreversible. But if the range of input values is known, they can be played through the hash function to determine the correct value for any particular record. A salted-hash function, where a random (usually known) value—the salt—is added to the attribute being hashed, reduces the likelihood of deriving the input value (see the sketch following this discussion).
- Keyed-hash function with stored key—this is a hash function that uses a secret key as an additional input.
- Deterministic encryption or keyed-hash function with deletion of the key—this is equivalent to selecting a random number as a pseudonym for each attribute, and deleting the corresponding table. With a state-of-the-art algorithm, it is extremely difficult to replay the function.
- Tokenization—Common in the financial sector, this replaces card ID numbers with values that have reduced usefulness for an attacker.
Pseudonymization does not eliminate singling out; linkability is trivial; inference attacks are possible. Common mistakes include believing that a pseudonymized dataset is anonymized; using the same key in different databases; using different keys for different users; and keeping the key in a compromising location.
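The sketch below illustrates the hashing variants listed above using Python’s standard library; the identifier, salt, and key are hypothetical, and the point is only that an unsalted hash of a value drawn from a small, known input space can be reversed by replaying candidates, whereas salting and keyed hashing (HMAC) make that replay harder or, if the key is later deleted, impracticable.

```python
import hashlib
import hmac
import secrets

national_id = "1234567890"  # hypothetical identifier to be pseudonymized

# Plain hash: deterministic, so anyone able to enumerate the (small) input
# space can rebuild the mapping and re-identify the record.
plain = hashlib.sha256(national_id.encode()).hexdigest()

# Salted hash: a random salt defeats precomputed tables, but whoever knows
# the salt can still replay the function over candidate inputs.
salt = secrets.token_hex(16)
salted = hashlib.sha256((salt + national_id).encode()).hexdigest()

# Keyed hash (HMAC): replay requires the secret key; deleting the key after
# use makes re-identification by replay impracticable.
key = secrets.token_bytes(32)
keyed = hmac.new(key, national_id.encode(), hashlib.sha256).hexdigest()

print(plain, salted, keyed, sep="\n")
```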
By way of recommendations, WP29 suggests:
- Keeping in mind the inherent limitations of some anonymization techniques;
- The fact that each technique described above fails to meet with certainty the criteria of effective anonymization;
- That an optimal solution should be sought on a case by case basis; and
- That whenever a proposal fails to meet one of the criteria, an evaluation of identification risk should be conducted.
To reduce identification risk, the following should be considered:
- Don’t assume that once-anonymized data will stay anonymized indefinitely;
- Consider the identification potential of the non-anonymized portion of the dataset, especially when combined with the anonymized portion;
- Set out purposes clearly, as they play a role in determining identification risk;
- Consider context, such as nature of original data, control mechanisms in place, sample size, availability of public information, and contemplated release of data to third parties;
- Consider appeal of data for targeted attacks;
- Disclosure of anonymization techniques;
- Removal of rare attributes or quasi-identifiers from dataset;
- Determination of any noise level as function of value of an attribute, impact for data subjects of protected attributes, and sparseness of dataset;
- In differential privacy, need to track queries so as to detect privacy intrusive queries, as intrusiveness is cumulative; and
- If generalization techniques are used, use multiple generalization criteria.
WP216 also contains an Annex comprising a primer on anonymization techniques.
1 Bender on Privacy and Data Protection § 31.03[1][c] (2020)
The EU’s highest court4.9 ruled4.9.1 in 2017 on the issue of whether a “legitimate interest,” as that term is used in Article 7(f)4.10 of the EU Data Protection Directive, was involved in the case before it. This provision was closely transposed into Art. 7(f) of the Latvian data protection law. Art. 12 of the Latvian data protection law provides that personal data relating to criminal offenses and judicial files may be processed only by persons authorized by law and in cases provided for by law.
The facts. As a trolley was passing a taxi, the taxi passenger opened the taxi door, damaging the trolley. An administrative offense was found to have taken place. The trolley company, intending to recover damages from the passenger, whose identity was unknown to it, sought the following information from the police, stating it would be used only to bring civil proceedings: statements made by the taxi driver and passenger, passenger’s name, identity document number, and address. The police provided only the passenger’s name, because documents in an administrative proceeding case file may be provided only to parties to the proceeding; the trolley company was not a party. Under Latvian law, a person who so requests may be given “victim” status in administrative proceedings that lead to sanctions. Victims have the right to consult the case file and seek compensation. The trolley company made no such request. But it brought an administrative law action against the police for failure to disclose certain requested information, and the court ordered the police to produce it.
The police appealed, and the court sought an opinion from the Latvian DPA on whether the police were obligated to disclose. The DPA opined that Art. 7(6) of the Latvian data protection law provided no legal basis to compel production, because Latvian administrative law lists the persons to which the police may disclose information relating to a case. Art. 7 does not require the controller to disclose the information; it merely permits it. Further, the ECJ noted that the DPA had indicated that the trolley company had two other avenues for obtaining the information: it could request it from the Civil Registry, or it could apply to the court for the court to request it from the police pursuant to the civil procedure law. But the ECJ observed that the referring court doubted the effectiveness of these methods. The Civil Registry might not be able to identify the passenger just from his/her name, and for the court to request the information from the police the court would have to know the passenger’s address.4.11 Accordingly, the referring court had doubts about the interpretation of “necessity” in Directive Art. 7(f).
The question presented. One question referred by the Latvian court to the ECJ was:
- Must the phrase “is necessary for the purposes of the legitimate interests pursued by the … third party … to whom the data are disclosed” in Directive Art. 7(f) be interpreted to mean that the police must disclose the data sought that is necessary to bring a civil action?
Rationale. The court took it as a given that the data sought constituted personal data, and that the police were its controller. Art. 7 of the Latvian DP law specifies the legitimacy principles for processing, and 7(f) specifies necessity for purposes of the legitimate interests of the controller, or the party/ies to whom the data is disclosed, except where those interests are overridden by the interests or fundamental rights of the data subject. This is an authorization, not an obligation. Directive Art. 7(f) lays down three conditions for this authorization: the pursuit of a legitimate interest of the controller or third party; need to process for the legitimate interests pursued; and that the fundamental rights of the data subject not take precedence.
The interest of a person in identifying the individual who damaged its property in order to sue for damages can qualify as legitimate. Indeed, under Directive Art. 8(2)(e), even sensitive information may be disclosed where necessary to establish, exercise, or defend legal claims. Since first and last name alone do not identify a person with sufficient precision as to bring an action, it is “necessary” to obtain address and/or ID number. The balancing depends on the circumstances of the particular case. The seriousness of a violation of a data subject’s rights can vary depending on the possibility of accessing the data from public sources. The age of the data subject (who was a minor here) may be a factor in the balancing. But on the facts here, it was not relevant.
Holding. Directive Art. 7(f) does not impose an obligation to disclose personal data to a third party in order to enable it to bring a damages action for harm caused by the data subject. However, that article does permit disclosure. Bringing a legal claim was a legitimate interest, disclosure here was “necessary” for that legitimate interest, and there were no overriding rights of the data subject.
1 Bender on Privacy and Data Protection § 31.03[2] (2020)
In 2002 the EU released its “Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications)”5 (referred to in this Treatise as the “e-Privacy Directive”).
The preamble recites that advancing technology has rendered outmoded an older directive covering e-Privacy,6 which is being replaced by this Directive. Provisions should be made to protect fundamental rights as to public communications networks, and to harmonize these protections. The EU Data Protection Directive7 applies to all matters concerning protection of fundamental rights not specifically covered by this Directive. The e-Privacy Directive does not address public or national security, defense, or criminal law. User terminal equipment and data stored thereon are part of the private sphere of users requiring protection under the European Convention for the Protection of Human Rights and Fundamental Freedoms (ECHR). Invasive means such as spyware and web bugs should be allowed only for legitimate purposes with user knowledge. But devices such as cookies can be a useful tool and should be permitted where users are given clear and concise information about them and have the opportunity to refuse them. Electronic communications systems should be designed to minimize the amount of personal data necessary.
Scope and Aim. Article 1 recites that the e-Privacy Directive harmonizes Member State law as to processing of personal data in the electronic communications sector and ensures the free movement of such data and equipment.8 This Directive complements the EU Data Protection Directive and provides for protection of the legitimate interests of legal persons.
Definitions. Article 2 sets forth definitions, incorporating those in the Data Protection Directive, and defining “user,”9 “traffic data,”10 “location data,”11 “communication,” “personal data breach,”12 “consent,”13 “value added service,” and “electronic mail.”14
Services concerned. Article 3 specifies that the Directive applies to the processing of personal data in connection with the provision of publicly available electronic communications services in public communications networks in the EU.
Security of Processing. Article 4 imposes on a provider of a publicly available electronic communication service an obligation to take appropriate technical and organizational measures to ensure an appropriate level of security. In case of a particular risk of breach of network security, the provider must inform subscribers of the risk and, where it is outside the scope of said measures, of possible remedies and likely costs involved.
Without prejudice to the Data Protection Directive, these measures shall at least ensure that personal data can be accessed only by authorized personnel for legally authorized purposes; protect personal data against accidental or unlawful destruction, accidental loss or alteration, and unauthorized or unlawful storage, processing, access or disclosure, and ensure implementation of security policy with respect to the processing of personal data.
In case of a personal data breach, a provider of publicly available electronic communications services must without undue delay notify the breach to the DPA. When the breach is likely adversely to affect an individual, the provider must also notify the individual, unless the data was encrypted. Providers must maintain an inventory of breaches with the pertinent facts. The Commission may adopt implementing measures regarding notification requirements.
Confidentiality of communications. Under Article 5, Member State legislation must ensure confidentiality of communications. This shall not affect legally authorized recording of communications and related traffic data in the course of lawful business practice to provide evidence of any business communication. Pursuant to Article 5(3), Member States must ensure that use of electronic communications networks to store data in subscriber terminal equipment is allowed only “on condition that the subscriber or user is provided with clear and comprehensive information in accordance with the [Data Protection Directive], inter alia, about the purposes of the processing, and is offered a right to refuse such processing by the data controller.” This provision deals with the use of cookies, and has provoked much debate as to its interpretation.
Traffic data. Article 6 provides that traffic data relating to subscribers must be erased or anonymized when no longer needed for transmission of a communication. Traffic data necessary for billing and interconnection may be processed up to the end of the period during which the bill could be challenged. For marketing electronic communications services or providing value added services, the provider may process traffic data for the duration necessary to the services or marketing with the user’s consent, which may be withdrawn at any time. The provider must inform the user of the types of traffic data processed and the duration. The provisions of Article 6 apply without prejudice to the ability of competent bodies to be informed of traffic data in order to settle disputes.
Itemized billing. Article 7 requires providing subscribers with a right to receive non-itemized bills, and national legislation must reconcile the rights of subscribers receiving itemized bills with the right to privacy of calling users and called subscribers.
Presentation and restriction of calling and connected line identification. Article 8 prescribes that, where caller ID is offered, the provider must make the following offers and must inform the public of these options:
- callers must be offered a simple, free means of preventing caller ID on a per-call basis;
- called parties must be offered a simple, free means of preventing the presentation of caller ID for incoming calls; and
- where caller ID is presented prior to the call being established, the provider must offer the called party a simple means of rejecting incoming calls from callers who have prevented presentation of their ID.
Location data other than traffic data. Article 9 states that location data other than traffic data may be processed only after being anonymized or with informed consent, which may be withdrawn at any time, for the duration necessary to furnish a value added service. Where consent has been given, the user must have a simple free means of temporarily refusing such processing for each network connection or each transmission.
Exceptions. Article 10 requires transparent procedures governing the way a provider may override: elimination of caller ID temporarily for a subscriber requesting tracing of malicious calls, and elimination of caller ID and temporary absence of consent of a subscriber for processing location data on a per-line basis for organizations dealing with emergency calls.
Automatic call forwarding. Article 11 mandates that Member States ensure that subscribers have a simple free means of stopping automatic call forwarding to them by a third party.
Directories of subscribers. Article 12 requires that, before including subscribers in any printed or electronic directory of subscribers (including directory assistance) containing personal information, subscribers must be informed about the purposes and possibilities for use. Subscribers must be given the opportunity to decide whether their personal data will be included, and to verify and correct same, all free of charge. The foregoing applies to natural persons, but Member States must also protect the legitimate interests of other subscribers regarding their entry in public directories. Member States may require that additional subscriber consent be required for any search beyond that on last name.
Unsolicited communications. Under Article 13, automated calling, fax, or email for direct marketing is permitted only with prior consent. But a company acquiring contact information in the context of a business transaction may market similar products or services if customers have the opportunity to object, free and easily, to such use, on collection and upon each usage. Direct marketing email that disguises the identity of the true sender is prohibited. Each Member State must provide a right to sue for violation of this provision.
Technical features and standardization. Article 14 provides that, in implementing this Directive, no mandatory requirements of specific technical features may be imposed on electronic communications equipment that could impede placing equipment on the market and free circulation of same. However, where a provision may be implemented only by requiring specific technical features, Member States must inform the European Commission and undertake a standards procedure.
Application of certain provisions of Directive 95/46/EC [the EU Data Protection Directive]. Under Article 15, Member States may restrict rights and obligations recited in Articles 5 (confidentiality of communications), 6 (traffic data), most of 8 (presentation and restriction of caller ID), and 9 (location data other than traffic data) of this Directive when necessary to safeguard national security, defense, public security, and prevent, investigate, and prosecute crimes or unauthorized use of an electronic communication system. “Toward this end, Member States may, inter alia, adopt legislative measures providing for the retention of data for a limited period justified on the grounds laid down in this paragraph.”15 The Article 29 Working Party is empowered to carry out tasks set out in Article 30 of the Data Protection Directive as to matters covered by this Directive.
Implementation and enforcement. Article 15a requires Member States to adopt rules on penalties, including criminal sanctions, for violations of this Directive. Penalties must be effective, proportionate and dissuasive. DPAs must have the power to order cessation of violations, and the power and resources to investigate. After giving the Commission sufficient notice, DPAs may adopt measures to ensure effective cross-border cooperation in enforcing national laws adopted under this Directive.
The e-Privacy Directive15.1 also contains provisions for grandfathering certain directories (Art. 16), latest date for enacting transposing national legislation (Art. 17—31 Oct. 2003), review (Art. 18—by 31 Oct. 2006), repeal of Directive 97/66/EC [the initial e-Privacy Directive] (Art. 19), and entry into force (Art. 20—immediate).
Tele2 Sverige ECJ Case. In Tele2 Sverige AB v. Post-och telestyrelsen,15.2 the EU’s highest court dealt with three issues, viz.:
- whether government retention of, and access to, traffic and location data to combat crime, falls within the scope of Directive 2002/58 [held: yes]; and
- whether Article 15(1) of Directive 2002/58, read in the light of Articles 7, 8, and 52(1) of the Charter of Fundamental Rights of the EU, precludes national legislation that:
provides, for the purpose of fighting crime, for general and indiscriminate retention of all traffic and location data of all subscribers as to all means of electronic communications [held: yes]; or
governs the protection and security of traffic and location data, and government access to retained data, where (i) that legislation does not restrict access solely to the objective of fighting serious crime, (ii) access is not subject to prior review by a court or an independent administrative authority, and (iii) there is no requirement that the data concerned should be held within the EU [held: yes].
Scope of Directive. The preliminary issue was whether government retention of, and access to, traffic and location data to combat crime, falls within the scope of Directive 2002/58 (hereinafter, the “Directive”).15.3 Directive Article 1(3) excludes from the Directive’s scope “activities of the State” in specified fields, including those in areas of criminal law and public security, defense and State security. And Directive Article 15(1) states that Member States may adopt “legislative measures to restrict the scope of the rights and obligations provided for in Article 5 [Confidentiality of communications], Article 6 [Traffic data], Article 8(1), (2), (3) and (4) [Presentation and restriction of caller ID], and Article 9 [Location data other than traffic data].” As an example of such measures, Article 15(1) identifies those “providing for the retention of data.”
The legislative measures referred to in Article 15(1) concern activities characteristic of government. Moreover, the specified objectives of those measures, such as safeguarding national security, defense and public security and the prevention, investigation, detection and prosecution of criminal offenses or of unauthorized use of the electronic communications system, overlap substantially with those referred to in the Directive Article 1(3) exclusion.
But the court held that the factors identified in the preceding paragraph did not suggest that the laws referred to in Article 15(1) were excluded from the scope of the Directive, because otherwise Article 15(1) would be deprived of any purpose. Indeed, Article 15(1) necessarily presupposed that the laws it referred to, such as those relating to the retention of data to combat crime, fell within the scope of the Directive, since it expressly authorized the Member States to adopt them only if the conditions laid down in the Directive were met. In addition, the laws referred to in Article 15(1) govern, for the purposes mentioned therein, the activity of electronic communications services providers (“providers”). Thus, Article 15(1), read together with Directive Article 3,15.4 must be interpreted as meaning that such laws fall within the scope of the Directive. The Directive’s scope extends to a law that requires such providers to retain traffic and location data, because to do so necessarily involves processing by those providers of personal data, and to a law relating to government access to the data retained by providers.
So, for example, protecting the confidentiality of electronic communications, guaranteed in Article 5(1), applies to the measures taken. As confirmed in recital 21, the Directive’s aim is to prevent unauthorized access to communications, including “any data related to such communications,” to protect the confidentiality of electronic communications. In those circumstances, a law that, on the basis of Article 15(1), requires providers, for the purposes set out in that provision, to grant the government, on the conditions laid down in the measure, access to retained data, concerns the processing of personal data by those providers, which falls within the scope of the Directive.
Further, since data is retained only to make it accessible, when necessary, to the government, national legislation that imposes data retention necessarily entails the existence of provisions relating to government access to data retained by the providers. That interpretation is confirmed by Article 15(1b), which provides that providers are to establish internal procedures for responding to requests for access to users’ personal data, based on provisions of national law adopted pursuant to Article 15(1). It therefore follows that laws such as that at issue here fall within the scope of Directive.
General and Indiscriminate Retention. The court next turned to the interpretation of Article 15(1), in the light of Articles 7, 8, 11, and 52(1) of the Charter of Fundamental Rights of the EU (hereinafter, the “Charter”).15.5 Under Directive Article 6, processing and storage of traffic data are permitted only to the extent and for the time necessary for billing and marketing of services and the provision of value-added services. Directive Article 9(1) provides that location data other than traffic data may be processed only subject to certain conditions, and after it has been anonymized or the consent of the subscribers has been obtained. The scope of Directive Articles 5, 6 and 9(1), which seek to ensure confidentiality of communications and minimize the risks of misuse, must be assessed in the light of recital 30: “Systems for the provision of electronic communications networks and services should be designed to limit the amount of personal data necessary to a strict minimum.”
Although Directive Article 15(1) enables Member States to restrict the scope of certain obligations, that provision must be interpreted strictly. Otherwise, the exception to that obligation and, in particular, to the prohibition on data storage, in Article 5, would become the rule, and Article 5 would be rendered largely meaningless. Further, Article 15(1) provides that ‘[a]ll the measures referred to [in Article 15(1)] shall be in accordance with the general principles of [European Union] law, including those referred to in Article 6(1) and (2) [EU]’, which include the general principles and fundamental rights now guaranteed by the Charter. Article 15(1) must, therefore, be interpreted in the light of the fundamental rights guaranteed by the Charter.
In that regard, the obligation imposed on providers to retain traffic data in order, when necessary, to make it available to the government, raises questions relating to compatibility not only with Charter Articles 7 and 8, but also with the freedom of expression guaranteed in Charter Article 11. Accordingly, the right to privacy in Charter Article 7, and the right to protection of personal data, in Charter Article 8, must be considered in interpreting Directive Article 15(1). The same is true of the right to freedom of expression, guaranteed in Charter Article 11.
In that regard, under Charter Article 52(1), any limitation on the exercise of the rights recognized by the Charter must be provided by law and must respect the essence of those rights. With due regard to proportionality, limitations may be imposed on the exercise of those rights only if necessary and if they genuinely meet objectives of general interest recognized by the EU, or the need to protect the rights of others. On this last point, Directive Article 15(1) provides that Member States may adopt a measure that derogates from confidentiality where it is a “necessary, appropriate and proportionate measure within a democratic society,” in view of the objectives laid down in that provision. And Directive Recital 11 states that such a measure must be “strictly” proportionate to the intended purpose. In particular, the requirement laid down in Directive Article 15(1) is that data be retained “for a limited period” and be for one of the objectives stated in Directive Article 15(1).
Due regard to proportionality also derives from the rule that protection of the fundamental right to respect for private life requires that derogations from and limitations on the protection of personal data apply only in so far as strictly necessary. The legislation here provides for a general and indiscriminate retention of all traffic and location data of all subscribers relating to all means of electronic communication, and it imposes on providers an obligation to retain that data systematically and continuously, without exception. The categories of data covered by that law correspond to the data whose retention was required by Directive 2006/24.
15.6 The data that providers must therefore retain makes it possible to trace and identify the source and destination of a communication; the date, time, duration and type of communication; users’ equipment; and the location of mobile communication equipment. That data includes subscriber name and address, caller’s phone number, number called, and IP address. It makes it possible to identify the person with whom a subscriber has communicated, by what means, the time of the communication, and the place from which it took place. It also makes it possible to know how often the subscriber communicated with certain persons in a given period.
That data, taken as a whole, might allow very precise conclusions concerning the private lives of persons whose data has been retained, such as everyday habits, permanent or temporary places of residence, daily or other movements, activities carried out, social relationships, and social environments. It provides the means for establishing a profile of the individual, information that is no less sensitive than the content of communications. The interference entailed by such legislation with the fundamental rights enshrined in Charter Articles 7 and 8 is far-reaching and particularly serious. The fact that the data is retained without notice to the subscriber is likely to cause persons to feel that their private lives are subject to constant surveillance.
Even if that legislation does not permit retention of content and does not, therefore, affect adversely the essence of those rights, retention of traffic and location data could nonetheless have an effect on the use of means of electronic communication and, consequently, on the exercise by the users thereof of their freedom of expression, guaranteed in Charter Article 11. Given the seriousness of the interference in the fundamental rights concerned represented by national legislation which, for the purpose of fighting crime, provides for retention of traffic and location data, only the objective of fighting serious crime is capable of justifying such a measure.
Further, while the effectiveness of the fight against serious crime, including terrorism, may depend largely on the use of modern investigation techniques, such an objective cannot in itself justify national legislation requiring general and indiscriminate retention of all traffic and location data. The effect of such legislation, in light of its characteristic features, is that retention of traffic and location data is the rule, whereas the system put in place by the Directive requires retention to be the exception. In addition, national legislation such as that here, which covers all subscribers, all means of electronic communication, and all traffic data, provides for no differentiation, limitation or exception according to the objective pursued. It is comprehensive in that it affects all persons using electronic communication services, even though those persons are not, even indirectly, in a situation liable to give rise to criminal proceedings. Further, it provides for no exception, and consequently applies even to persons whose communications are subject under national law to an obligation of professional secrecy.
Such legislation requires no relationship between the data to be retained and a threat to public security. It is not restricted as to time period, geographical area, or group of persons likely to be involved in a serious crime, or to persons who could, for other reasons, contribute, through the retention of their data, to fighting crime. National legislation such as this therefore exceeds the limits of what is strictly necessary and cannot be considered to be justified, within a democratic society, as required by Directive Article 15(1), read in the light of Charter Articles 7, 8, 11 and 52(1).
However, Directive Article 15(1), read in that light, does not prevent a Member State from adopting legislation permitting targeted retention of traffic and location data to fight serious crime, provided that the retention is limited, as to (i) the categories of data, (ii) the means of communication affected, (iii) the persons concerned, and (iv) the retention period, to what is strictly necessary. To satisfy these requirements, that national legislation must, first, lay down clear and precise rules governing scope and application and imposing minimum safeguards, so that affected persons have sufficient guarantees of protection of their personal data against misuse. That legislation must indicate in what circumstances and under which conditions a data retention measure may, as a preventive measure, be adopted, thereby ensuring that it is limited to what is strictly necessary.
In addition, as to substantive conditions that must be satisfied by national legislation that authorizes retention as a preventive measure for fighting crime, the retention must continue to meet objective criteria that establish a connection between the data and the objective. And as to limits with respect to the public and the situations that may be affected, the national legislation must be based on objective evidence that makes it possible to identify a public whose data is likely to reveal some link with serious criminal offences, and to contribute in some way to fighting serious crime or preventing a serious risk to public security. Such limits may be geographical if the government, on the basis of objective evidence, concludes that there exists, in certain geographical areas, a high risk of offences.
Thus, Directive Article 15(1), read in the light of Charter Articles 7, 8, 11 and 52(1) precludes national legislation that, for the purpose of fighting crime, provides for the general and indiscriminate retention of all traffic and location data of all subscribers relating to all means of electronic communication.
Government Access and Location Requirement. The next issue was whether Directive Article 15(1), read in the light of Charter Articles 7, 8, and 52(1), precludes national legislation governing the protection and security of traffic and location data, and government access to retained data, where (i) that legislation does not restrict access solely to the objective of fighting serious crime, (ii) access is not subject to prior review by a court or an independent administrative authority, and (iii) there is no requirement that the data be held within the EU.
With regard to objectives capable of justifying national legislation that derogates from confidentiality of electronic communications, the list of objectives set out in Directive Article 15(1) is exhaustive. Accordingly, access to the retained data must correspond, genuinely and strictly, to one of those objectives. And since the objective pursued by that legislation must be proportionate to the seriousness of the interference in fundamental rights that access entails, when it comes to prevention, investigation, detection and prosecution of criminal offences, only the objective of fighting serious crime can justify access to retained data. As to proportionality, the national legislation must ensure that such access does not exceed the limits of what is strictly necessary.
Further, since the legislation referred to in Directive Article 15(1) must, in accordance with Directive Recital 11, “be subject to adequate safeguards,” a data retention measure must lay down clear and precise rules indicating in what circumstances and under which conditions providers must grant access. Likewise, such a measure must be legally binding under domestic law. To ensure that government access is limited to what is strictly necessary, national law should determine the conditions under which providers must grant such access. However, the national legislation cannot be limited to requiring that access be only for one of the objectives referred to in Directive Article 15(1). Rather, that legislation must also lay down substantive and procedural conditions governing access.
General access to all retained data, regardless of whether there is any link with the intended purpose, cannot be regarded as limited to what is strictly necessary. Accordingly, the national legislation must be based on objective criteria to define the circumstances and conditions for access. As a general rule, access can be granted in relation to the objective of fighting crime, only to the data of individuals suspected of planning, committing or having committed a serious crime or of being implicated in one. However, in particular situations, where for example vital national security, defense or public security interests are threatened by terrorist activities, access to the data of other persons might also be granted where there is objective evidence suggesting that such data might contribute to combating such activities. To ensure, in practice, that those conditions are fully respected, government access should generally, except in cases of validly established urgency, be subject to prior review carried out by a court or an independent administrative body, whose decision is made following a reasoned request by those authorities submitted within the framework of procedures for the prevention, detection or prosecution of crime.
Likewise, the government must notify the persons affected as soon as notification would not jeopardize the investigations. Notification is necessary to enable those persons to exercise their right to a legal remedy under Directive Article 15(2), read together with Directive 95/46 Article 22. As to the rules relating to the security of retained data, Directive Article 15(1) does not allow Member States to derogate from Directive Article 4(1) and Article 4(1a).
15.7 Those provisions require providers to take appropriate technical and organizational measures to ensure the effective protection of retained data against risks of misuse and unlawful access. Given the quantity of retained data, its sensitivity and the risk of unlawful access, to ensure integrity and confidentiality, providers must guarantee a particularly high level of security by means of technical and organizational measures. The national legislation must provide for the data to be retained within the EU, and for its irreversible destruction at the end of the data retention period.
Member States must ensure independent review of compliance with the level of protection guaranteed by EU law for individuals in the processing of personal data, as required by Charter Article 8(3) and constituting an essential element of respect for the protection of individuals in relation to the processing of personal data. Were it not so, persons would be deprived of the right, guaranteed in Charter Articles 8(1) and (3), to lodge with the DPAs a claim seeking protection of their data. Referring courts must determine whether and to what extent the national legislation here satisfies the requirements stemming from Directive Article 15(1), read in the light of Charter Articles 7, 8, 11 and 52(1) with respect to both government access to the retained data and its protection and level of security.
Thus, Directive Article 15(1), read in the light of Charter Articles 7, 8, 11, and 52(1), precludes national legislation governing the protection and security of traffic and location data and government access to retained data, where the objective pursued by that access, in the context of fighting crime, is not restricted solely to fighting serious crime, where access is not subject to prior review by a court or independent administrative authority, and where there is no requirement that the data be held in the EU.
In closing, the court spoke to the issue of whether Charter Articles 7 and/or 8 expand the scope conferred on Article 8 of the European Convention on Human Rights (the “Convention”) by the case law of the European Court of Human Rights. Fundamental rights recognized by the Convention constitute general principles of EU law, but as long as the European Union has not acceded to it, the Convention does not constitute a legal instrument formally incorporated into EU law. Accordingly, interpretation of Directive 2002/58 must be undertaken solely in the light of the Charter. Further, Charter Article 52(3) is intended to ensure the necessary consistency between the Charter and the Convention, “without thereby adversely affecting the autonomy of Union law and … that of the Court of Justice of the European Union.” In particular, as stated in Charter Article 52(3), that provision does not preclude Union law from providing protection that is more extensive than the Convention. Also, Charter Article 8 concerns a fundamental right distinct from that enshrined in Charter Article 7, one that has no equivalent in the Convention.
However, the court noted that preliminary rulings15.8 are not for advisory opinions on general or hypothetical questions, but rather for the effective resolution of a dispute concerning EU law. In this case, the question whether the protection conferred by Charter Articles 7 and 8 is broader than that guaranteed in Convention Article 8 does not affect the interpretation of Directive 2002/58, read in the light of the Charter, which is the matter in dispute in the proceedings here. Accordingly, an answer to this question cannot provide any interpretation of points of EU law required for resolution of that dispute. This question, then, is “inadmissible.”
Ministerio Fiscal. In Ministerio Fiscal,15.9 Spanish police sought metadata from a telecom service provider in investigating the robbery of a mobile phone and wallet (deemed not to be a particularly serious offense). The CJEU held that criminal offenses that are not particularly serious may nevertheless justify police access to personal data retained by providers if that access does not amount to a serious privacy violation.
Here, Spanish police asked the magistrate for access to data identifying the users of phone numbers activated with the stolen phone for a period of 12 days after the robbery. The magistrate denied the request, ruling that access to identification data was possible only with regard to “serious” offenses, i.e., those punishable by a term of imprisonment exceeding five years, and that the crime here did not constitute such an offense.
The Ministerio Fiscal (Spanish Public Prosecutor’s Office) appealed to the Provincial Court in Tarragona, Spain. The EU law at issue was Articles 3 and 15(1) of the e-Privacy Directive,15.10 under which Member States may restrict citizens’ rights when that restriction constitutes a necessary, appropriate and proportionate measure within a democratic society in order to safeguard national security, defense, public security, and the prevention, investigation, detection, and prosecution of criminal offenses or of unauthorized use of the electronic communication system. In particular, this provision is not limited to “serious” offenses. The Provincial Court stated that, following adoption of the magistrate’s decision, the Spanish legislature introduced two alternative criteria for determining in this context the degree of seriousness of an offense. The first was a substantive criterion, relating to specific and serious criminal offenses that were particularly harmful; the second was a criterion setting a threshold of three years’ imprisonment. In addition, the Provincial Court ruled that the State’s interest in repressing criminal conduct could not justify disproportionate interferences with the fundamental rights in the Charter. That court therefore sought guidance from the CJEU on determining the threshold of seriousness of offenses above which an interference with fundamental rights, such as national authorities’ access to personal data retained by providers, may be justified.
In its decision, the CJEU observed that, in connection with a criminal investigation, national authorities’ access to personal data retained by providers fell within the scope of the Directive. In addition, such access for the purpose of identifying the owners of SIM cards activated with a stolen mobile telephone (their surnames, first names, and addresses) constituted interference with their fundamental rights under the Charter. Nevertheless, the Court ruled that such interference was not sufficiently serious to require that access be restricted, in the area of prevention, investigation, detection, and prosecution of criminal offenses, to the objective of fighting serious crime. The Court indicated that police access to personal data retained by providers constituted an interference with the fundamental rights to respect for private life and to protection of personal data under the Charter, even if (i) the interference was not “serious,” (ii) the information was not sensitive, and (iii) no one was inconvenienced in any way. But the Directive listed the objectives capable of justifying national legislation governing police access to such data and thereby derogating from the confidentiality of electronic communications. That list of objectives is exhaustive, so any lawful access must fit within one of those objectives. The Court noted that, with regard to preventing, investigating, detecting and prosecuting criminal offenses, the wording of the Directive did not limit that objective to the battle against serious crime, but referred to “criminal offences” generally.
The Court observed that, in Tele2 Sverige, it had ruled that only the objective of fighting serious crime could justify public authority access to personal data retained by providers which, taken as a whole, permitted precise conclusions concerning the private lives of data subjects. But, the Ministerio Fiscal Court explained, that interpretation was based on the proposition that the objective of legislation governing that access must be proportionate to the seriousness of the interference with fundamental privacy rights. Thus, in accordance with the principle of proportionality, serious interference can be justified in that field only by the objective of fighting crime “that must also be defined as ‘serious.’” Nevertheless, when the interference is not serious, access may be justified by the objective of preventing, investigating, detecting, and prosecuting “criminal offences” generally. The Court further found that access to only the data referred to here was not a “serious” interference with fundamental privacy rights, because that data did not allow precise conclusions to be drawn concerning the private lives of the persons concerned. The interference that access to such data engendered was therefore justified by the objective of preventing, investigating, detecting and prosecuting “criminal offences” generally, without it being necessary that those offences be defined as “serious.”
1 Bender on Privacy and Data Protection § 31.03[3] (2020)
1 Bender on Privacy and Data Protection § 31.03[3][a] (2020)
In 2006 the EU released the Data Retention Directive.16 As indicated infra in this subsection, in 2014 the Court of Justice of the European Union, the EU’s highest court, held17 this directive invalid retroactively, from the day the directive went into effect. Nevertheless, that directive is discussed here for whatever historical value and insight into EU thinking it may provide. The preamble notes that, in accord with prior Directives, some Member States have enacted laws dealing with data retention by service providers, and that variation in those laws presented obstacles to the internal market for electronic communications. Because data retention is a necessary and effective tool for law enforcement, it was necessary to ensure that retained data is made available to law enforcement for a certain period. However, the retention provisions must comply with Art. 8 of the European Convention on Human Rights and Fundamental Freedoms (ECHR), which permits government to interfere with an individual’s right to respect for his private life and correspondence only in accordance with law and where necessary in a democratic society. Given the importance of tracking and location data for investigating crime, there was a need to ensure that data generated or processed in providing communications is retained for a certain period. This Directive does not deal with the content of communications. The EU Data Protection Directive and the EU e-Privacy Directive apply to the retained data. The Member States retain power to adopt laws concerning government’s right to access retained data for public security, defense, national security, and criminal law purposes, so long as those laws respect fundamental rights as recognized in the Member State constitutions and the ECHR.
Article 1 recites that the aim is to harmonize the obligations of providers of publicly available electronic communications to retain certain data they generate or process, and to ensure that this data is available for investigating and prosecuting serious crime. It applies to “traffic and location data on both legal entities and natural persons and to the related data necessary to identify the subscriber … .” It does not apply to content.
Article 2 sets forth definitions, incorporating those in the Data Protection Directive and the e-Privacy Directive, and defining the terms “data,” “user,” “telephone service,” “user ID,”18 “cell ID,” and “unsuccessful call attempt.” Article 3 imposes an obligation to retain certain categories of data, to the extent generated or processed by providers of publicly available electronic communications services or public communications networks. This obligation extends to unsuccessful call attempts to the extent the data is stored or logged by the carriers. Article 4 requires that this data be provided only to government, in specific cases and in accordance with national law.
Article 5 defines the categories of data subject to the Directive: data necessary to trace and identify communication source;19 data necessary to identify destination;20 data necessary to identify date, time and duration;21 data necessary to identify type of communication;22 data necessary to identify users’ equipment;23 and data necessary to identify location of mobile communications equipment.24 “No data revealing content of the communications may be retained pursuant to this Directive.”25
Article 6 states that the retention periods may be as short as six months and as long as two years. Article 7 requires providers to use security for retained data that is of the same quality as that used for network data, and to use appropriate technical and organizational measures to protect the data against destruction, alteration, and unauthorized storage, processing, access, or disclosure. Article 7 also requires destruction of the data at the end of the retention period.
Article 8 requires storage such that data can be transmitted to governmental agencies on request without delay. Article 9 requires each Member State to designate an independent governmental agency (which may be its DPA) to be responsible for monitoring application of its law regarding security of the retained data. Article 10 requires annually providing the European Commission with retention statistics, which must not contain personal data. Article 11 amends Article 15 of the e-Privacy Directive by inserting a new paragraph stating that the e-Privacy Directive does not apply to data subject to the Data Retention Directive. Article 12 sets out a procedure whereunder a Member State “facing particular circumstances” may seek to extend the maximum retention period beyond two years. Article 13 requires the Member States to ensure that the remedies, liabilities, and sanctions imposed pursuant to the Data Protection Directive apply to processing of data under the Data Retention Directive.
In 2014 the Court of Justice of the European Union, the EU’s highest court, handed down a decision26 holding the retention directive invalid retroactively, from the day the directive went into effect. The Court characterized the main objective of the directive as harmonization of Member State provisions concerning retention of certain data generated or processed by public electronic communications services. The directive seeks to ensure that this data is available to prevent, investigate, detect and prosecute serious crime, such as organized crime and terrorism. It requires these providers to retain traffic, location, and related data necessary to identify the subscriber, and does not permit retention of communication content.
The High Court of Ireland and the Constitutional Court of Austria asked the CJEU to examine the validity of the directive in light of two fundamental rights under the Charter of Fundamental Rights of the EU,27 i.e., the fundamental right to respect for private life28 and the fundamental right to protection of personal data.29 The Court observed that the data made it possible (1) to identify the person with whom a subscriber communicated and by what means, (2) to identify the time of a communication and the place from which it occurred, and (3) to know the frequency of the subscriber’s communications with certain persons during a given period. Taken as a whole, this data may provide precise information on the private lives of the persons whose data are retained, such as habits of everyday life, places of residence, daily or other movements, activities carried out, social relationships and the social environments frequented.
The Court opined that, by requiring this retention and allowing national authorities to access the data, the directive interfered in a particularly serious manner with the fundamental rights to respect for private life, and to the protection of personal data. Furthermore, the fact that data is retained and subsequently used without the subscriber being informed is likely to generate a feeling that private lives are the subject of constant surveillance. The Court then examined whether such an interference with these fundamental rights was justified.
It held that the required retention is not such as adversely to affect the essence of the fundamental rights to respect for private life and to the protection of personal data. The directive does not permit the acquisition of knowledge of the content of the electronic communications as such, and provides that service providers must respect certain principles of data protection. Further, retention for the purpose of possible transmission to national authorities genuinely satisfies an objective of general interest, namely the fight against serious crime and, ultimately, public security. However, the Court opined that, by adopting the directive, the EU legislature exceeded the limits imposed by compliance with the principle of proportionality.
30 The Court observed that, in view of the important role played by the protection of personal data in light of the fundamental right to respect for private life, and the extent and seriousness of the interference with that right caused by the directive, the EU legislature’s discretion is reduced, with the result that review of that discretion should be strict. Although the retention required may be appropriate for attaining the directive’s objective, the directive’s wide-ranging and particularly serious interference with the fundamental rights at issue is not sufficiently circumscribed to ensure that that interference is limited to that strictly necessary.
- The directive covers, in a generalized manner, all individuals, all means of electronic communication and all traffic data without any differentiation, limitation or exception in light of the objective of fighting serious crime.
- The directive lacks any objective criterion to ensure that national authorities have access to the data and can use it only to prevent, detect, or prosecute offenses that, in view of the extent and seriousness of the interference with the fundamental rights in question (Charter, Arts. 7, 8), are sufficiently serious to justify such interference. On the contrary, the directive (Art. 1(1)) simply refers generally to ‘serious crime’ as defined by each Member State. In addition, the directive does not lay down substantive and procedural conditions under which the national authorities may have access to and use the data. In particular, access is not made dependent on prior judicial or administrative review.
- So far as concerns the data retention period, the directive (Art. 6) imposes a period of at least six months, without distinguishing between categories of data on the basis of the persons concerned or the possible usefulness of the data in relation to the objective pursued. Furthermore, that period is set at between six months and 24 months, but the directive states no objective criteria for determining the period in order to ensure that it is limited to that strictly necessary. Accordingly, the directive does not lay down clear rules governing the extent of interference with the fundamental rights enshrined in Articles 7 and 8 of the Charter, and entails a wide-ranging and serious interference with those fundamental rights, without that interference being precisely circumscribed by provisions to ensure that it is limited to that strictly necessary. The Court also found that the directive did not provide for sufficient safeguards to ensure effective protection of the data against the risk of abuse and unlawful access and use. The Court noted that, inter alia, the directive permits service providers to consider economic considerations when determining the level of security they apply (particularly as regards the costs of implementing security measures) and that it does not ensure the irreversible destruction of the data at the end of the retention period. Therefore, the legislature exceeded the limits imposed by compliance with the principle of proportionality in light of Charter Arts. 7, 8 and 52(1).
- The Court also stated that the directive does not require that the data be retained within the EU. Therefore, the directive does not fully ensure the control of compliance with the requirement of protection by an independent authority, as is explicitly required by the Charter (Art. 8(3)). Such control, on the basis of EU law, is an essential component of the protection of individuals as to the processing of personal data.
Thus, the Court held: “Directive 2006/24/EC of the European Parliament and of the Council of 15 March 2006 on the retention of data generated or processed in connection with the provision of publicly available electronic communications services or of public communications networks and amending Directive 2002/58/EC is invalid.”
1 Bender on Privacy and Data Protection § 31.03[3][b] (2020)
In 2011, pursuant to Art. 14 of the Directive, the European Commission transmitted its report on this Directive.
31 The Report examined the Directive’s application by Member States, its impact on economic operators and consumers, the implications of the Directive for fundamental rights, and whether modification was necessary to address concerns associated with the criminal use of anonymous SIM cards. It concluded that: data retention is a valuable tool for EU criminal justice systems and law enforcement; harmonization of data retention has been limited; the EU should continue through common rules to ensure high standards for storage, retrieval and use of traffic and location data. The Commission announced its intent to propose amendments to the Directive.
Although the Directive obliges Member States to ensure that data is retained to investigate, detect and prosecute “serious crime,” as defined by national law, the definition of “serious crime” varies across the Member States. Ten refer to a minimum prison sentence, the possibility of a custodial sentence, or a list of offenses. Eight require data retention in relation to all criminal offenses and for crime prevention, or on general grounds of national or state and/or public security. And four simply refer to “serious crime” or “serious offense” without defining it.
32 Most transposing Member States allow access and use for purposes beyond those covered by the Directive, including preventing and combating crime generally. While this is permitted under the e-Privacy Directive, the degree of harmonization achieved by EU legislation in this area remains limited. The Commission announced an intention to assess the need for more harmonization.
Communication entities covered. The Directive applies to “the providers of publicly available electronic communications services or of public communications networks.” Two Member States (for cost reasons) do not require small operators to retain data, and four others have alternative administrative arrangements. While large operators present in several Member States benefit from economies of scale, smaller operators tend to set up joint ventures or outsource to reduce costs. The Commission announced an intention to examine the impact of security issues on small- and medium-sized enterprises.
Right of access. Member States are required “to ensure that [retained data] are provided only to the competent national authorities in specific cases and in accordance with national law,” and must define in their national law the procedures to be followed and the conditions to be fulfilled. In all Member States, the national police forces and, except in common law jurisdictions (Ireland and the United Kingdom), prosecutors may access retained data. Fourteen Member States list security or intelligence services or the military among the competent authorities. Six list tax and/or customs authorities, and three list border authorities. One allows other public authorities to access the data if authorized for specific purposes under secondary legislation. Eleven require judicial authorization for each access. In three, judicial authorization is required in most cases. Four others require authorization from a senior authority but not a judge. In two Member States, the only condition appears to be a written request. The Commission announced an intention to assess the need for more harmonization with respect to the authorities having, and the procedure for obtaining, access.
Categories of data covered. The Directive applies to fixed network telephony, mobile telephony, Internet access, Internet email and Internet telephony. Article 5 specifies the categories of data to be retained, namely data necessary for identifying: source; destination; date, time and duration; type of communication; users’ communication equipment; and location of mobile communication equipment. Content may not be retained. Search queries, i.e., server logs generated through the offering of a search engine service, are outside the scope of the Directive, as content.
Twenty-one Member States provide for the retention of each category. Belgium has not provided for the types of telephony data to be retained, nor does it provide for Internet-related data. Member States did not see a need to amend the categories of data. In its report on the second enforcement action, the Article 29 Working Party argued that the Directive’s categories should be exhaustive. The Commission stated it would assess the necessity of all of these data categories.
Appropriate retention period. Member States must ensure that the required categories are retained for six months to two years. The maximum retention period may be extended by a Member State “facing particular circumstances that warrant an extension for a limited period.” All Member States that have transposed the Directive, except one, apply retention periods within these bounds. However, there is no consistent approach across the EU. Fifteen Member States specify a single period for all categories: one specifies two years, one specifies 1.5 years, ten specify one year, and three specify six months. Five Member States have defined different retention periods for different categories of data. One retains all data for one year except for data on unsuccessful call attempts (six months). One Member State has not specified any data retention period for the categories specified in the Directive.
While the Directive permits this diversity, it provides only limited legal certainty and foreseeability across the EU. The Commission is considering further harmonizing retention periods, and will consider applying different periods for different categories of data, for different categories of serious crimes, or a combination of the two. At the time access is requested, 90 percent of the data are no more than six months old, and 70 percent are no more than three months old.
Security and supervisory authorities. The Directive requires Member States to mandate that operators ensure that retained data shall be: of the same quality and subject to the same security as data on the public communications network; subject to appropriate technical and organizational measures to protect the data against accidental or unlawful destruction, accidental loss or alteration, or unauthorized or unlawful storage, processing, access or disclosure; subject to appropriate technical and organizational measures to ensure access by specially authorized personnel only; and destroyed at the end of the retention period, except for those preserved for purposes set down in the Directive. Operators are prohibited from processing under the Directive data retained for other purposes that would not otherwise have been retained. Member States are required to designate a governmental agency (which may be the DPA) responsible for monitoring, with complete independence, the application of these principles. Fifteen Member States have transposed all of these principles. Four have transposed two or three of these principles but do not explicitly provide for data destruction. Two provide for destruction. Twenty-two Member States have a supervisory authority responsible for monitoring.
Transposition of Article 7 (data protection and data security) is inconsistent. The Commission announced it would consider options for strengthening data security and data protection standards, including privacy-by-design, to ensure these standards are met for both storage and transmission.
Statistics. Member States are required to provide the Commission with annual statistics on data retention, including:
- cases where information was provided to competent authorities in accordance with national law;
- time elapsed between retention and request; and
- cases where requests could not be met.
The Commission asked Member States to supply details on instances of individual requests for data. Nevertheless, statistics provided differed in scope and detail: some Member States distinguished between different types of communication, some indicated the age of the data at the moment of request, while others provided only annual statistics without any detailed breakdown. Nineteen Member States provided statistics on the number of data requests for 2009 and/or 2008; this included three where there was no transposing legislation at the time, and two where data retention legislation had been annulled. Seven Member States that transposed the Directive did not provide statistics, although one provided an estimate of the volume of annual requests for telephony data.
Reliable data is crucial in demonstrating the necessity and value of data retention. It has not been possible to acquire adequate data, given that most Member States only fully transposed in the two years prior to the Report and used different interpretations for the source of statistics. The Commission will aim to develop feasible metrics and reporting procedures.
Decisions of Constitutional Courts. The Romanian, German, and Czech Constitutional Courts annulled their transposition laws because they were unconstitutional, and those Member States were considering how to re-transpose. Cases on data retention were also brought before constitutional courts in Bulgaria (resulting in a revision of the transposing law), Cyprus (court orders issued under the transposing law were held unconstitutional), and Hungary (case pending at time of the Report). The Commission will consider the issues raised by national case law in its future proposal on revising the data retention framework.
Ongoing enforcement. The Commission expects Member States who lack enforceable transposing legislation to enact it as soon as possible. Two Member States that had not transposed (Austria and Sweden) were found by the European Court of Justice to have violated their obligations. In April 2011, after the Swedish Parliament postponed transposing legislation for a year, the Commission decided to refer Sweden for a second time to the Court for failure to comply with the judgment in Case C-185/09, and sought financial penalties. The Commission continues to monitor the situation in Austria, which provided a timetable for the imminent adoption of transposing legislation.
Volume of retained data accessed. The volume of traffic and requests for traffic data is increasing. Over 2 million data requests were submitted in each of 2008 and 2009, with significant variance among Member States, from less than 100/yr. (Cyprus) to over 1 million (Poland). The most frequently requested type of data was related to mobile phones. For mobile telephony, authorities had to submit the same request to each main mobile operator, so that the actual numbers of requests were much lower than statistics suggested.
The use of retained data older than three or six months is less frequent but can be crucial; its use has tended to fall into three categories: Internet-related data; investigations of particularly serious crimes, a series of crimes, organized crime and terrorist incidents; and data held in another Member State.
33 Member States generally reported data retention to be valuable. The aggregate volume of requests reported by the 19 Member States who supplied 2009 and/or 2008 data was 2.6 million (all crimes reported, not only serious crimes), amounting to two requests/police officer per year, or 11 requests/100 recorded crimes.
Retained data enables the construction of trails of evidence leading up to an offence. It is used to discern, or to corroborate other forms of evidence on, the activities and links between suspects. Location data in particular has been used, both by law enforcement and defendants, to exclude suspects from crime scenes and to verify alibis. Certain crimes involving Internet or phone communication can be investigated only via data retention.
34 While law enforcement and courts in most Member States keep no statistics on what type of evidence proved crucial, retained data is integral to criminal investigation and prosecution. Law enforcement needs to keep pace with technological developments which are used to commit crime, and data retention is among the necessary criminal investigation tools.
Operators and consumers. In a joint statement to the Commission, five major industry associations stated that the economic impact of the Directive was substantial for smaller service providers. Eight operators submitted widely varying estimates of compliance costs. A study carried out before the transposition estimated the cost of establishing a retention system for an ISP with half a million customers at €375 240, with €9 870/mo. thereafter, and of setting up a data retrieval system at €131 190, with €28 960/mo. thereafter. Per-unit data retention costs are inversely related to operator size. Most operators were unable to quantify the impact of the Directive on competition, consumer prices, or investment in new infrastructure and services. The Commission intends to assess the impact of future changes to the Directive on industry and consumers.
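By way of illustration only, and not a figure appearing in the Report, combining the study’s estimates over a hypothetical first year of operation for such a half-million-customer ISP gives an aggregate on the order of:
\[
\underbrace{375\,240 + 12 \times 9\,870}_{\text{retention} \,\approx\, \text{€}493\,680} \;+\; \underbrace{131\,190 + 12 \times 28\,960}_{\text{retrieval} \,\approx\, \text{€}478\,710} \;\approx\; \text{€}972\,000.
\]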
Reimbursement of costs. The Directive does not regulate reimbursement of operator data retention costs, comprising:
- operational expenditure (operating costs or recurring expenses for operation of the business, a device, component, piece of equipment or facility); and
- capital expenditure (expenditures creating future benefits, or the cost of developing or providing non-consumable parts for the product or system).
The Directive has not achieved its aim of establishing a level playing field for operators. The Commission will consider how to minimize obstacles to the functioning of the internal market by ensuring that operators are consistently reimbursed for retention compliance costs, with particular attention to small- and medium-sized operators.
Fundamental rights to privacy. Data retention limits the right to private life and the protection of personal data, fundamental rights in the EU. Such a limitation must be “provided for by law and respect the essence of those rights, subject to the principle of proportionality,”35 and justified as necessary. Any limitation must:
- be formulated in a clear and predictable manner;
- be necessary to achieve an objective of general interest or protect rights and freedoms of others;
- be proportionate to the desired aim; and
- preserve the essence of the fundamental rights concerned.
Article 8(2) of the European Convention on Human Rights recognizes that governmental interference with a person’s right to privacy may be justified as necessary for national security, public safety or crime prevention. Any limits on the right to privacy must be precise and enable foreseeability.
Any limits on the right to privacy must be necessary and accompanied by minimum safeguards. In Copland v. the United Kingdom, the European Court of Human Rights held that government monitoring of phone calls, email, and Internet usage could be justified only if based on relevant domestic legislation. In S. and Marper v. the United Kingdom, the Court held that retention of DNA profiles or fingerprints of any person acquitted of a crime could be justified only if it answered a pressing social need, was proportionate to the aim pursued, and the government’s justification was relevant and sufficient. Core principles of data protection required retention to be proportionate to the purpose of collection, and the retention period to be limited.
Any limits on the right to privacy must be proportionate to the general interest. The European Court of Justice, in Schecke & Eifert concerning the Internet publication of recipients of agricultural subsidies, found that the EU legislature had not taken appropriate steps to balance the right to privacy with the general interest (transparency) as recognized by the EU. Lawmakers had not considered other methods consistent with the objective that would have interfered less with the privacy right of subsidy recipients. Consequently, the Court held, lawmakers exceeded the limits of proportionality, as “limitations in relation to the protection of personal data must apply only insofar as is strictly necessary.”
Criticisms of the principle of data retention. Some civil society organizations argued that data retention is an unjustified and unnecessary restriction of individuals’ right to privacy. They consider the non-consensual “blanket and indiscriminate” retention of individuals’ telecommunications traffic, location and subscriber data to be an unlawful restriction of fundamental rights. Following a case brought in Ireland by a civil rights group, the question of the legality of the Directive is expected to be referred to the European Court of Justice. Also, the European Data Protection Supervisor expressed doubts about the necessity of the measure.
Calls for stronger data security and data protection rules. The Article 29 Working Party’s report on the second enforcement action argued that risks of confidentiality breach were inherent in the storage of traffic data. It criticized certain aspects of national implementation. The Working Party called for a clarification that the recited categories are exhaustive. The European Data Protection Supervisor asserted that the Directive “has failed to harmonize” and that use of retained data is not limited to combating serious crime. He urged the EU to adopt a comprehensive legislative framework that obligates operators to retain data, and regulates how Member States use it for law enforcement. DPAs argued that data retention implies a risk of privacy breaches, which the Directive does not address at an EU level. While there are no concrete examples of serious breaches, the risk of data security breaches remains, and may grow with developments in technology and forms of communications, irrespective of why data is stored, unless further safeguards are put in place.
Conclusions and Recommendations.
The EU should support and regulate data retention as a security measure. Most Member States believe that EU rules on data retention remain necessary for law enforcement. Harmonized rules should ensure that data retention is an effective tool in combating crime, that industry has legal certainty in a smoothly functioning internal market, and that the high levels of respect for privacy are applied consistently throughout the EU.
Transposition has been uneven. Transposing legislation is in force in 22 Member States. The considerable leeway left to Member States to adopt data retention measures under Article 15(1) of the e-Privacy Directive renders assessment of the Directive problematic. There are considerable differences between transposing legislation in the areas of purpose limitation, access to data, periods of retention, data protection and data security and statistics. The Commission will continue in its role of enforcing EU law, ultimately using infringement proceedings if required.
The Directive has neither harmonized the approach to data retention nor created a level playing field for operators. Data retention now takes place in most Member States. The Directive does not guarantee that retained data is being stored, retrieved and used in compliance with the right to privacy. Responsibility for enforcing these rights lies with Member States. The Directive sought only partial harmonization of approaches to data retention; therefore it is unsurprising that there is no common approach. However, beyond the variation explicitly provided for by the Directive, differences in national application of data retention present difficulties for operators.
Operators should be consistently reimbursed for the costs they incur. There is a lack of legal certainty for industry. The obligation to retain and retrieve data represents a substantial cost, especially for smaller operators, and they are reimbursed to different degrees in the various Member States, although there is no evidence that the telecommunications sector has been adversely affected by the Directive. The Commission will consider ways of providing consistent reimbursement.
Ensuring proportionality in the end-to-end process of storage, retrieval and use. The Commission will ensure that any future data retention proposal respects proportionality, is appropriate for combating serious crime and terrorism, and does not go beyond what is necessary to achieve that objective. It will recognize that exemptions or limitations in the protection of personal data should apply only insofar as necessary. The following areas in particular should be examined:
- consistency in limiting purpose of data retention and types of crime for which retained data may be used;
- more harmonization of, and possibly shortening, retention periods;
- ensuring independent supervision of access requests and the overall data retention regime;
- limiting authorities authorized to access;
- reducing data categories to be retained;
- guidance on technical and organizational security measures for access;
- guidance on use including prevention of data mining; and
- developing feasible metrics and reporting procedures to facilitate comparisons.
Next steps. The Commission will propose a revision of the current data retention framework, and will devise a number of options in consultation with other stakeholders. It will also research further public perceptions of data retention and its impact on behavior.
1 Bender on Privacy and Data Protection § 31.03[3][c] (2020)
However, in December 2013 the Advocate-General of the Court of Justice of the EU issued an opinion finding that the retention requirement in this Directive conflicted with the right to privacy guaranteed by the EU Charter.
36 The opinion was issued in response to requests from the High Court of Ireland and the Constitutional Court of Austria for a ruling on the circumstances in which it is constitutionally possible for the EU to limit the exercise of fundamental rights, within the specific meaning of Article 52(1) of the Charter of Fundamental Rights of the European Union (the “Charter”), by means of a directive and national transposition measures. The Advocate-General was concerned that the retained data could “create an accurate and exhaustive map of a large portion of a person’s private conduct, or even a complete and accurate picture of his private identity.”
37 Moreover, the Advocate-General found two factors that enhanced the risk that this data might be used for unlawful purposes: This data was retained by telecom companies, rather than by government (or even under direct control of government), and might be stored at unknown locations (because the Directive does not require storage within an EU Member State).
38 Because of the major conflict with the Charter’s right to privacy, the Advocate-General concluded that the Directive should itself have prescribed the principles necessary for controlling access to the data and its use, instead of delegating that task to the Member States. “The European Union legislature cannot, when adopting an act imposing obligations which constitute serious interference with the fundamental rights of citizens of the Union, entirely leave to the Member States the task of defining the guarantees capable of justifying that interference. It cannot content itself either with assigning the task of defining and establishing those guarantees to the competent legislative and/or administrative authorities of the Member States called upon, where appropriate, to adopt national measures implementing such an act or with relying entirely on the judicial authorities responsible for reviewing its practical application. It must, if it is not to render the provisions of Article 51(1) of the Charter meaningless, fully assume its share of responsibility by defining at the very least the principles which must govern the definition, establishment, application and review of observance of those guarantees.”
39 The Advocate-General therefore proposed that the Court should find:
“(1) Directive 2006/24/EC of the European Parliament and of the Council of 15 March 2006 on the retention of data generated or processed in connection with the provision of publicly available electronic communications services or of public communications networks and amending Directive 2002/58/EC is as a whole incompatible with Article 52(1) of the Charter of Fundamental Rights of the European Union, since the limitations on the exercise of fundamental rights which that directive contains because of the obligation to retain data which it imposes are not accompanied by the necessary principles for governing the guarantees needed to regulate access to the data and their use.”
“(2) Article 6 of Directive 2006/24 is incompatible with Articles 7 and 52(1) of the Charter of Fundamental Rights of the European Union in that it requires Member States to ensure that the data specified in Article 5 of that directive are retained for a period whose upper limit is set at two years.”
40 The Advocate-General suggested suspending, for a reasonable time, the effects of his finding so that the EU legislature could adopt measures necessary to remedy the Directive’s invalidity.41