Europe’s new privacy law, the General Data Protection Regulation (GDPR), will enter into force in May 2018. Understandably, given that data breaches and privacy violations have been in the headlines lately, and given that the GDPR will reshuffle privacy protection in Europe and beyond, many in the United States are looking to the GDPR for ideas of what to do and what not to do. We think it would be impractical and ineffective to copy and paste the GDPR into U.S. law; the institutions and legal systems are simply too different.
However, here are some aspects of the GDPR that we think Congress should pay attention to when thinking about how to protect Americans’ privacy. Two things are important to remember when thinking about the GDPR and internet platforms. First, the GDPR has not been implemented yet; it will likely take years to see the full extent of its consequences and effects. Second, a companion piece of legislation, the ePrivacy Regulation, which clarifies the consequences of the GDPR for electronic communications, is still working its way through the European legislative process. We expect the ePrivacy Regulation to be approved before the end of 2019.
(Please also keep in mind that this blog post does not intend to be an exhaustive guide to the GDPR — it’s a long law! — and that elsewhere, we published our long read on Principles for Privacy Legislation.)
Here are the aspects of the GDPR that we think Congress should consider:
–A definition of personal data (article 4). Even if transposing the GDPR’s definition of personal data into U.S. law verbatim might be impractical, U.S. legislation should start from the assumption that personal data is “any information relating to an identified or identifiable natural person.” Naming the most relevant pieces of information that constitute personal data (the GDPR lists a name and “an identification number,” among others) is also a useful baseline for the U.S. Article 9 of the GDPR establishes a stricter set of conditions for the processing of “special categories of personal data,” including but not limited to personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, trade union membership, or sexual orientation.
–Emphasis on consent. Article 4 of the GDPR defines consent as a “freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her.”
Article 7 of the GDPR outlines the conditions for consent. Explicit, clear, granular, and informed consent should be a key part of any U.S. privacy law. Consumers should be clearly informed of what is going to happen to their personal data. If the GDPR is implemented as expected, users will have to explicitly and in a clearly informed way agree to share their data with third parties, especially data brokers. Many will probably agree to share their data. But they will have a clear option to say no.
Consent can be withdrawn at any time. In addition, Recital 43 specifies that consent is not freely given, and therefore invalid, if the provision of a service is made contingent on consent to the processing of personal data that the service does not need. In other words, organizations cannot condition a service on consent to data sharing that the service does not require in order to function.
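To make the consent properties above concrete (purpose-specific, affirmative, withdrawable at any time), here is a minimal illustrative sketch. The GDPR prescribes no data model; the class, field, and purpose names below are invented for illustration.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

# Hypothetical purpose-bound consent record; names are illustrative only.
@dataclass
class ConsentRecord:
    user_id: str
    purpose: str                        # e.g. "third_party_marketing"
    granted_at: datetime
    withdrawn_at: Optional[datetime] = None

    def withdraw(self) -> None:
        # Article 7(3): consent can be withdrawn at any time.
        self.withdrawn_at = datetime.now(timezone.utc)

    def is_active(self) -> bool:
        return self.withdrawn_at is None

def may_process(records: list, user_id: str, purpose: str) -> bool:
    # Consent must be specific: a record for one purpose never covers another.
    return any(r.user_id == user_id and r.purpose == purpose and r.is_active()
               for r in records)
```

The key design point is that consent is keyed to a single purpose, so agreeing to a newsletter can never silently authorize sharing with data brokers.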
–Non-consent-based lawful processing. Organizations should sometimes be allowed to process data without consent, in narrow and specific circumstances (article 6). The GDPR allows an organization to collect and process personal data without consent in certain other cases, such as when processing is necessary for the performance of a contract or for a legitimate interest like guaranteeing network security. Recitals 47, 48, and 49 deal with legitimate interest.
–Pseudonymous data (article 4 and recital 28). The GDPR defines pseudonymization as the “processing of personal data in such a manner that the personal data can no longer be attributed to a specific data subject without the use of additional information” and recognizes that pseudonymization can help protect the rights of individuals while preserving data utility. Pseudonymization may facilitate data processing beyond original collection purposes, as well as scientific research. It is also important to know that organizations need not guarantee individual user rights over data that no longer identifies a specific individual: anonymous data falls outside the scope of the GDPR (recital 26). Following the GDPR’s lead, Congress should consider incentivizing the pseudonymization and anonymization of personal data.
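One common pseudonymization technique is keyed hashing, sketched below under the assumption that the secret key (the “additional information” in the GDPR’s definition) is stored separately under access control. The key and field names are illustrative, not a prescribed method.

```python
import hmac
import hashlib

# Illustrative secret; in practice this would live in a separate,
# access-controlled key store, apart from the pseudonymized data set.
SECRET_KEY = b"store-this-separately-under-access-control"

def pseudonymize(value: str) -> str:
    # Keyed hash: the same input always yields the same token, so records
    # stay linkable for analysis without exposing the underlying identity.
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"user": "alice@example.com", "purchases": 3}
safe_record = {"user": pseudonymize(record["user"]), "purchases": record["purchases"]}
```

Without the key, the tokens cannot be reversed or regenerated, which is what distinguishes this from a plain unkeyed hash of an email address.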
–Data minimization (article 5). The GDPR establishes that data processing should only use as much data as is required to successfully accomplish a given task. In addition, the GDPR clarifies that data collected for one purpose cannot be repurposed without further consent.
Congress should encourage incentives to change data collection in America from “collect first, think of uses later” to “think uses first, design collection mechanisms later.”
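The “think uses first” approach above can be sketched as a purpose-bound allowlist: each processing purpose declares in advance the only fields it may collect, and everything else is dropped at intake. The purposes and field names here are invented for illustration.

```python
# Hypothetical mapping from processing purpose to the fields it requires.
ALLOWED_FIELDS = {
    "shipping": {"name", "street", "city", "postal_code"},
    "newsletter": {"email"},
}

def minimize(submitted: dict, purpose: str) -> dict:
    # Data minimization (article 5): keep only what the declared
    # purpose actually requires; discard the rest before storage.
    allowed = ALLOWED_FIELDS[purpose]
    return {k: v for k, v in submitted.items() if k in allowed}
```

The design choice is that collection is impossible without first naming a purpose, inverting the “collect first, think of uses later” default.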
–Listing user rights (chapter 3). In the digital era, it is important that consumers know their rights in regards to their personal data. We do not think that all the rights outlined in the GDPR are desirable, and we are particularly wary of the Right to be Forgotten (article 17), as we fear it can easily be abused against the public interest (for example, by a corrupt politician running for reelection who wants to eliminate public information about his or her wrongdoing).
There are other rights from chapter 3 that Congress should consider importing, with appropriate adaptation, particularly:
1) The right to data portability (article 20) would increase transparency and user control by obliging platforms to provide users, in a machine-readable format, the personal data collected about them. It is important to keep in mind that while the right to data portability might increase platform competition, that would be a by-product of the GDPR. In fact, in their guidelines on the right to data portability, EU privacy authorities write that “the GDPR is regulating personal data and not competition.”
2) The right to transparent information (article 12) establishes that organizations must inform individuals “in a concise, transparent, intelligible and easily accessible form, using clear and plain language, in particular for any information addressed specifically to a child” when their data is being collected. Empowering users starts with clear information.
3) The right to be informed and to access (articles 13, 14, 15): individuals have the right to be informed about the collection and use of their personal data. If an organization obtains data from other sources, it must provide individuals with information regarding the uses and management of that data within one month.
4) Rights related to automated decision-making, including profiling, (article 22) are designed to protect individuals when organizations are undertaking automated decision-making processes that carry legal or other significant effects on them. The GDPR asks organizations to give individuals information about the processing, allow consumers to request human intervention or challenge a decision, and regularly verify that the systems are working as intended.
5) The right to object to data processing (article 21) allows Europeans to object to certain uses of their data, for example for direct marketing purposes.
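The right to data portability (item 1 above) requires a “structured, commonly used and machine-readable” export, though the GDPR names no specific format; JSON and CSV are the usual candidates. Here is a minimal sketch assuming an invented in-memory data store.

```python
import json

def export_user_data(user_id: str, store: dict) -> str:
    # Article 20 sketch: return everything held on one user as JSON,
    # a structured, commonly used, machine-readable format.
    return json.dumps(
        {"user_id": user_id, "data": store.get(user_id, {})},
        indent=2,
        sort_keys=True,
    )
```

A real export would also cover data spread across multiple internal systems; the point here is simply that the output must be parseable by another service, not a PDF or a screenshot.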
–Mandating independent supervisory authorities (article 51). Each EU country has one or more independent Data Protection Agencies (DPAs) in charge of overseeing the implementation and enforcement of the GDPR. The European DPAs have been central to developing a culture of data protection in Europe and to updating privacy laws across the continent.
The U.S. does not necessarily need to copy the European institutional framework. But one thing is clear: one or more agencies must be definitively empowered and equipped to oversee, enforce, and advocate for Americans’ privacy rights. The EU institutional framework offers the U.S. another valuable lesson, this one about decentralization: there is no single EU-wide DPA. Instead, there are many national DPAs. The U.S. should likewise allow states to explore different ways to guarantee privacy rights. Preempting state action is not a precondition for enforcing privacy rights; a federal framework and specialized institutions are.
–Privacy by design and by default (article 25). Privacy by design establishes that privacy has to be a fundamental consideration from the initial design stages of new products, services, or processes that involve personal data and throughout development. Privacy by default means that the default privacy settings of products or services should be set to the most privacy-friendly levels. These are not new or even European-only principles: Canada is moving towards privacy by design too.
If Congress wishes to be forward looking and encourage innovation that is respectful of privacy, mandating privacy by design and by default is a step in the right direction.
–Data Protection Impact Assessments (DPIAs, article 35). Organizations are required to assess and mitigate privacy risks in new data processing activities. This is especially important when a new product is launched or a new technology is used, and it would help organizations rethink their products before launching them. The GDPR clarifies that DPIAs “shall in particular be required” in the case of “systematic monitoring of a publicly accessible area on a large scale,” “[p]rocessing on a large scale of special categories of data or of personal data relating to criminal convictions and offences,” or profiling. Given ongoing complaints about racial and other biases in algorithms and machine learning, this kind of risk mitigation could be even more important in the multicultural U.S. than in comparatively homogeneous Europe.
As with privacy by design and by default, DPIAs are safeguards to guarantee that technological innovation will be respectful of privacy. Congress should consider adopting DPIAs in the U.S.
–Data Protection Officer (DPO, articles 37–39). The DPO is a privacy and security leadership role that the GDPR requires of a broad range of organizations that process, control, or collect personal data. DPOs are in charge of overseeing an organization's compliance with GDPR requirements and of cooperating with Data Protection Agencies when necessary.
The idea behind the DPO is to oblige organizations processing personal data to have someone with knowledge of the law who can educate and train data-processing employees, think about data protection, and liaise with the authorities. Any company that processes and collects data on a regular basis will have to appoint a DPO. Congress should study whether all small and medium-sized enterprises should be required to have DPOs. But DPOs could help create a pervasive culture of privacy protection in the U.S.
–Certification mechanisms (article 42). The GDPR encourages the creation of EU-level certifications, seals, and marks that companies can earn to demonstrate compliance with the GDPR for some or all of their services or products. This would be particularly useful for consumers, since it would allow them to quickly identify the most privacy protective products and services.
Certifications, seals, and marks would not be a novelty in the U.S. consumer protection landscape. They already exist for chemicals and food. Congress should consider easing consumer understanding of privacy policies, data protection, and security practices by encouraging or mandating certification mechanisms.
–Data breach notification (article 33). The GDPR establishes that organizations that collect consumers’ data and determine the purposes and means of processing it (data controllers) must notify the relevant DPA of a data breach within 72 hours of becoming aware of it, unless the breach is unlikely to result in a risk to the rights and freedoms of natural persons. Organizations processing data on behalf of other organizations (data processors) are required to notify the controller without undue delay after becoming aware of a personal data breach.
Congress should include data breach notification in any comprehensive privacy bill.
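A subtle point in the breach rule above is when the 72-hour clock starts: it runs from the moment the controller becomes aware of the breach, not from the breach itself. A minimal sketch of that clock, with invented function names:

```python
from datetime import datetime, timedelta, timezone

# Article 33 window: 72 hours from awareness, not from the breach event.
NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(became_aware: datetime) -> datetime:
    return became_aware + NOTIFICATION_WINDOW

def is_overdue(became_aware: datetime, now: datetime) -> bool:
    # True once the regulator should already have been notified.
    return now > notification_deadline(became_aware)
```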
–Security of processing (article 32). The GDPR mandates that all organizations handling or requesting personal data “implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk” of the data processing, including using encryption. Congress should also consider security and risk of data breach when thinking of privacy.
–Penalties (chapter 8). Here the principle is simple: any good law needs teeth. The GDPR backs its requirements with administrative fines of up to 20 million euros or 4 percent of a company’s worldwide annual turnover, whichever is higher.
Americans deserve privacy protection. The GDPR outlines some interesting elements for a strong privacy bill, and we suggest Congress critically study and understand them. Copying and pasting EU law would not be an efficient or reasonable way to protect Americans’ privacy. But there is significant value in looking at how other jurisdictions address policy issues, and in this case it is particularly important for a nation in urgent need of data protection reform.
Image credit: Flickr user thedescrier