Federal Privacy Legislation Is [TBD]: Congress Has Made Progress, but Still Has a Ways to Go

    Late last year, while folks were carving up turkeys and shopping for holiday gifts, the Commerce Committees on both sides of Capitol Hill were busy introducing long-awaited privacy legislation. For years now, Public Knowledge has been calling for comprehensive federal privacy legislation that protects our fundamental right to privacy online and includes (among other things) data minimization requirements, prohibitions on discriminatory data use, and strong federal, state, and individual enforcement. So, the arrival of legislative language was welcome. First came the Consumer Online Privacy Rights Act (COPRA), introduced by Senate Commerce Committee Ranking Member Maria Cantwell along with Senators Schatz, Klobuchar, and Markey. Next up was the United States Consumer Data Privacy Act (CDPA), a discussion draft released by Commerce Committee Chairman Roger Wicker on Thanksgiving Day. Finally, in December, the House Energy and Commerce Committee released an untitled draft of legislation that was the product of bipartisan staff discussions (House Draft) and requested stakeholder feedback on the proposed language.

    Notably, significant portions of the House Draft contained language in brackets, a legislative drafting tool used to indicate provisions that remain subject to negotiation. This signals that plenty of areas of uncertainty and disagreement still exist on important details. Similar differences emerge when the details of COPRA and CDPA are compared. At the State of the Net conference here in D.C., Representative Jan Schakowsky, who is leading the effort on the House Draft, noted that dozens of stakeholders submitted feedback on the House Draft and that “a lot of people on all sides are not happy.” On one hand, this is a good thing because it indicates that lawmakers are now tackling the tough and contentious issues. On the other hand, these are tough and contentious issues, and it’s critical that they are resolved in a way that benefits consumers and the public interest. In this blog post, we highlight just a few of the many sticky details in privacy legislation and consider which (if any) of the current Committee proposals offers the best approach.

    Sensitive Information and Persistent User Tracking 

    When it comes to the foundational question of which types of information are covered, the three bills take the same general approach: they define covered information broadly. This is a good thing. Data of all kinds — not merely traditional “personally identifiable information” like government-issued IDs — is constantly being collected, compiled, mixed, and matched by a vast ecosystem of data brokers, ad-tech firms, and other entities, often without user knowledge or consent, to create comprehensive and highly detailed consumer profiles and consumer scores. These data sets can be used in ways that further the public interest, like public health research, but can also be used in harmful ways, like digital redlining or voter suppression. Because of this, we have long advocated that all data should be protected equally. Unfortunately, the Committee bills reject this approach and instead distinguish between covered information and “sensitive” information. Representatives Eshoo and Lofgren wisely avoid this distinction in their strong Online Privacy Act.

    To the extent that distinctions among data are going to be drawn in legislation, the definition of sensitive data must be highly inclusive and not subject to narrow readings. The data generated by persistent online tracking of individuals is particularly privacy-invasive and needs to be given the heightened protections of sensitive information. Online tracking is hard to shake; unique identifiers persist even when a user resets operating system-level advertiser IDs. And tracking is often carried out, using tracking pixels or device fingerprinting techniques for example, by entities that may have no intuitive relationship with the product or service that you are using. Importantly, unique identifiers are not anonymous, because the data attached to them remains identifiable. We also note that tracking doesn’t end when you go offline: a person’s device can be passively tracked anywhere by sensors that detect the device’s ID, which is why it is important that information relating to devices be covered by privacy legislation.
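    To illustrate why fingerprint-based tracking is so hard to shake, here is a minimal, hypothetical sketch of the technique in Python. The attribute names and values below are invented for illustration; real fingerprinting scripts draw on dozens of browser and device signals. The point is that no cookie or advertiser ID is involved, so the same device attributes always hash to the same identifier and resetting an ID accomplishes nothing.

        # Hypothetical sketch of device fingerprinting: ordinary device
        # attributes are hashed into a stable identifier. No cookie or
        # advertiser ID is used, so there is nothing for the user to reset.
        import hashlib

        def fingerprint(attributes: dict) -> str:
            # Serialize attributes in a fixed key order so the same device
            # always produces the same hash.
            canonical = "|".join(f"{key}={attributes[key]}" for key in sorted(attributes))
            return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

        device = {
            "user_agent": "Mozilla/5.0 (X11; Linux x86_64)",  # invented values
            "screen": "1920x1080x24",
            "timezone": "America/New_York",
            "language": "en-US",
            "fonts": "Arial,Helvetica,Times New Roman",
        }

        print(fingerprint(device))  # Same attributes, same ID, every visit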

    If consent is going to be used as grounds for processing data, including sensitive data, it must be meaningful. That means it must be informed, affirmative, and express. The continued use of a product or service should not be construed as consent. Companies must also be prohibited from using dark patterns and other manipulative user interfaces when obtaining user consent. Of course, reasonable and proper exceptions need to be made for uses of sensitive information that further the public interest through, for example, scientific or historical research. COPRA’s creation in Section 110 of an institutional review board or similar oversight entity to monitor and govern such research provides a promising approach.

    Of the three bills, COPRA provides the best protections against pervasive online tracking. All three Committee bills define a category of sensitive information that can only be processed after an individual has given affirmative consent, and each includes information related to online activities over time in that definition. The bills differ, however, in what kinds of data related to online activities over time are considered sensitive. Sen. Wicker’s CDPA is the narrowest, covering only online activities that relate to sensitive data. The House Draft includes a similar category of “online browsing history [with respect to sensitive information]” in its definition of sensitive information. As mentioned before, the brackets here indicate language that remains under negotiation. Even with the narrow, bracketed language removed, “online browsing history” could be read to include only a list of the website names or URLs that an individual has visited. It should be noted that Section 6 of the House Draft helpfully requires express consent for cross-site tracking, but that critical provision only extends to first-party tracking and is also bracketed for potential exclusion. COPRA’s definition of “sensitive information,” by contrast, includes much broader coverage for “information revealing online activities over time and across third-party websites or online services.” Lawmakers should follow the approach taken in COPRA.

    Further, the federal regulator should have the authority to revisit the definition to include other categories of sensitive data as needed to protect consumer privacy and marketplace competition. All three bills give the Federal Trade Commission rulemaking authority under the Administrative Procedure Act to modify the definition of sensitive data to include additional categories, but the House Draft has bracketed the entire provision, and CDPA only permits additional categories if the FTC determines that processing or transferring such data “would be likely to be highly offensive to a reasonable individual.” This requirement is too restrictive and would likely not permit many of the categories that are already included in CDPA’s definition.

    Data Minimization and Waivers of Rights

    Comprehensive privacy legislation should establish a baseline of robust consumer rights and protections. Again, we will focus on a couple of examples of rights that appear at risk of exclusion. The first issue is data minimization. Data minimization is a core component of many privacy laws and principles, including the European Union’s General Data Protection Regulation, which requires that personal data shall be “…limited to what is necessary in relation to the purposes for which they are processed.” A data minimization requirement helps ensure that companies subject to the law don’t collect and stockpile needless or extraneous personal information, creating privacy and security risks for individuals and groups. This is an important “backstop” legal protection to include in laws based on notice and consent, which can place a heavy burden on individuals to be their own privacy managers.

    A key detail in the GDPR definition is that the minimization is “tied” to each processing purpose, which is independent of the broader provision of a product or service. This requirement ensures that data collection, use, and retention are focused on relevant and necessary information, and it makes it easier for companies to keep track of and manage data flows by keeping information siloed according to its processing purpose. Data minimization has also proved to be good for companies’ bottom lines, saving them money on storage costs and expenses related to data breaches.
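    As a concrete illustration, here is a minimal sketch in Python of what purpose-tied minimization can look like in practice. The purposes, field names, and retention periods are invented for illustration and are not drawn from any bill: each processing purpose declares the fields it actually needs and how long it may keep them, and anything outside that list is never collected.

        # Hypothetical sketch of purpose-tied data minimization: each
        # processing purpose declares the fields it needs and a retention
        # limit; fields outside the declared set are dropped at collection.
        from datetime import timedelta

        PURPOSES = {
            "order_fulfillment": {
                "fields": {"name", "shipping_address", "email"},
                "retention": timedelta(days=90),
            },
            "fraud_detection": {
                "fields": {"payment_token", "ip_address"},
                "retention": timedelta(days=30),
            },
        }

        def collect(purpose: str, submitted: dict) -> dict:
            # Keep only the fields necessary for the stated purpose.
            allowed = PURPOSES[purpose]["fields"]
            return {k: v for k, v in submitted.items() if k in allowed}

        record = collect("order_fulfillment", {
            "name": "Jane Doe",
            "shipping_address": "123 Main St",
            "email": "jane@example.com",
            "browsing_history": ["..."],  # unnecessary for fulfillment; dropped
        })
        print(sorted(record))  # ['email', 'name', 'shipping_address']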

    Both COPRA (Section 106) and the House Draft (Section 7) provide good data minimization language that is tied to processing purposes, but only COPRA extends beyond retention to cover data collection and use as well. Federal privacy legislation needs to take a comprehensive approach to data minimization, one that requires that the collection, use, and retention of data be necessary, proportionate, and limited to carrying out the specific purpose for which the data is collected, used, or retained. Reps. Eshoo and Lofgren helpfully provide comprehensive minimization requirements for the collection, processing, disclosure, and maintenance of data in their Online Privacy Act.

    It’s worth noting that the important privacy rights of access, correction, and deletion can also provide a secondary benefit: an incentive to minimize data. This is proving true in California, where companies are subject to the California Consumer Privacy Act, which grants people the right to request access to the information that companies have on them and to request that such data be deleted. As a result, companies are opting to proactively delete unnecessary sensitive data or avoid collecting it altogether. We’re glad to see that all three Committee bills provide individuals with these important access, correction, and deletion rights.

    A second issue is the extent to which the law allows companies to force consumers to waive their privacy rights or to consent to unnecessary data collection, use, or sharing in order to use a product or service. Many apps today will not allow people to use the product or service without agreeing to a very permissive privacy policy. In practice, this means that users don’t actually have a choice about how their data is used. If they don’t like the company’s data practices, they usually aren’t able to use the app at all.

    Under CDPA Section 101(b), consumers can waive nearly all of their privacy rights under the law in an agreement with a company. Again, in practice this is not going to be an arm’s-length negotiation; it will be a take-it-or-leave-it deal. COPRA Section 109 prohibits the waiver of some of the bill’s protections, but allows individuals to affirmatively consent to waive their access, correction, and deletion rights, as well as their rights to grant and withdraw consent to the processing of data that is “strictly necessary to provide a product or service.” This is much less problematic than the CDPA approach, but could be read expansively. Section 12 of the House Draft offers a clear and strong prohibition on conditioning the provision of a product or service, or the “quality of a customer experience,” on a waiver of any of the rights in the bill or on the individual’s consent to the processing of covered information other than what is necessary to provide the product or service. Once again, however, this provision is bracketed, meaning this important consumer protection could be removed altogether.

    In our view, the House Draft approach provides the clearest and most robust consumer protection, but it is disappointing to see that portions remain bracketed. Unless the functionality of a product or service is logically impossible without a certain data processing purpose, the offering of that product or service must not be conditioned on user consent to other data collection, use, or retention. Companies should not be permitted to “guilt” users into consenting away their privacy when it is unnecessary.

    Accountability and Enforcement

    If anything should be non-controversial in the privacy debate, it is the need for the law to be vigorously enforced and for companies to be held properly accountable for the way they protect (or don’t protect) the privacy of their customers. Unfortunately, a lack of consensus on enforcement and accountability continues to plague the process of legislative development.

    For example, the Committee bills take differing approaches to holding company executives personally accountable for their companies’ privacy practices. CDPA contains no requirement for senior management officials to certify annual privacy filings to the FTC. COPRA (Section 201) requires senior management officers of large companies, including the highest-ranking officer of the entity, to certify to the FTC that the company maintains both adequate internal controls for legal compliance and reporting structures to ensure that the certifying officers are involved in and responsible for compliance decisions. The core of the House Draft officer certification requirement (Section 3(a)(3)) is quite strong: it mandates that senior officers of large companies (notably, “principal executive officer” is in brackets) certify that they have reviewed the filing and that, based on the officers’ knowledge, the filing doesn’t contain false statements of material fact (or omit material facts) and fairly presents the company’s privacy practices.

    But such accountability is wiped away by the following section, which is written so broadly that executive officers can rely on “other sources” to be deemed in compliance with the certification requirements. We note that the Cambridge Analytica scandal was facilitated in part by Facebook’s reliance on Alexander Nix and SCL’s “certification” that they had deleted the user data at issue. Corporate practices of the kind that precipitated one of history’s biggest privacy scandals must not be enshrined into federal statute as lawful behavior, whether through the statute itself or through rules of construction or other exceptions or limitations.

    We reiterate the need to include a private right of action (along with a ban on forced arbitration agreements) in federal privacy legislation. State attorneys general provide important enforcement at the state level, but they are limited in their capacity. The California Attorney General’s office has gone on record to say that it will only have the ability to bring three enforcement actions per year under the CCPA. Private suits brought under the forward-thinking Illinois Biometric Information Privacy Act continue to hold companies accountable for using biometric information without user knowledge or consent, and such suits can have a prophylactic effect on industry best practices. This underscores the critical role that individual enforcement plays in protecting consumer and user privacy.

    Further, comprehensive federal privacy legislation must serve as a “floor” of privacy protections that allows for states to pass stronger consumer protection laws as the need arises. Federal privacy law should not preempt, displace, or supplant critical state-level protections, including consumer protection laws, civil rights laws, and laws that govern the privacy rights of employees or students.

    Conclusion

    Privacy and personal autonomy in our online, digital, and connected world are not abstract academic or philosophical matters. They are fundamental human rights. And the misuse and abuse of personal information through commercial data practices generates real harms that affect real people, both individually and collectively. Progress has been made on the creation of comprehensive federal privacy legislation, but much work remains to be done on the important details. Public Knowledge will continue to fight to ensure that the law is crafted in a way that best protects our right to privacy and personal autonomy online.

    “Privacy – Privacy Online” by perspec_photo88 is licensed under CC BY-SA 2.0