Back in 2011, the Federal Trade Commission alleged that Facebook deceived consumers by failing to keep its promises to protect user privacy. The two parties agreed to settle the charges through something called an “agreement containing consent order,” and the FTC issued a final Decision and Consent Order regarding the Facebook allegations in 2012. (A consent order is an FTC enforcement tool that operates like a legal settlement; the Commission signed a similar consent agreement with Google in 2011.) Without admitting to the complaint’s counts, the parties signed a document that essentially says, “we both agree to enter this agreement to resolve the allegations in the complaint, so now you have to do the following things, and if you fail to do any of them, the FTC is going to impose financial penalties.”
Although consent orders sound good in theory, recent revelations about Facebook’s behavior have left consumers doubting that they work in practice. While consent orders remain an important tool in the FTC’s enforcement toolkit, the Commission lacks the resources to properly administer them. Further, even if consent orders were fully and consistently enforced, the FTC’s ex post enforcement can only address consumer privacy violations after they have occurred. These problems must be resolved through comprehensive federal privacy legislation that provides the Commission with both additional administrative support and ex ante rulemaking authority.
The Current System Isn’t Working
Consent orders and FTC privacy enforcement are topics typically confined to wonky D.C. policy panels and corporate conference rooms, but they have now become the stuff of dinner table conversation as the public learns more about the shady machinations of the “big data” economy. Facebook’s sharing of user information with at least 60 device companies without user consent is merely the latest illustration of an online platform conducting business in a manner that is at best out of step with consumer expectations and preferences and at worst part of an ongoing bad faith effort to amass monopoly profits at the expense of consumer privacy. Although Facebook has been in the headlines, its behavior leaves little reason to assume that other platforms and entities in the big data ecosystem that profit from consumer data collection are adequately protecting our privacy either. Consumers are in fact assuming quite the opposite. In a 2014 Pew study, only 9 percent of those polled believed they had “a lot of control” over the information that is collected about them, despite a significant majority (74 percent) responding that control over who can get information about them is “very important.”
This widespread consumer distrust points to the urgent need to hold entities meaningfully accountable when they fail to fulfill their obligations to steward our personal information. As discussed in more detail below, the FTC’s handling of the Facebook Consent Order calls into question whether the agency’s current enforcement capabilities are adequate to police individual privacy settlements, let alone the entire tech market. So, what can be done? Here are a couple of things: (1) give the Commission sufficient funding and staffing to protect consumer privacy in the digital age; (2) grant the Commission rulemaking authority to prevent consumer privacy harms before they occur.
The FTC Needs More Operational Support
When the Commission announced its consent order with Facebook, then-Chairman Jon Leibowitz said, “Facebook’s innovation does not have to come at the expense of consumer privacy. The FTC action will ensure it will not.” Cambridge Analytica gained access to user data in 2014, but the FTC only took action after the news became public earlier this year. What happened? Something went wrong with the Commission’s consent order enforcement, and numerous questions remain unresolved. Were the order’s required third party audits completed? If so, did the FTC ever request them? Were the audits inaccurate? If so, were the inaccuracies inadvertent errors or deliberate misinformation? And was the Commission itself vigilant? It failed to identify the issues in real time and must now go back and investigate after consumers’ privacy was once again compromised.
Some of these questions have surfaced because the Facebook Consent Order’s mandated independent privacy assessments have not been readily accessible to the public. Transparency concerns notwithstanding, the Commission’s handling of the Facebook Order demonstrates a larger issue with FTC enforcement capabilities. As former FTC Commissioner Terrell McSweeny recently noted, the FTC is not currently strong enough to enforce consumer privacy in the digital age. The Commission should be empowered in a couple of ways. First, it should receive more funding. Since 2010, FTC funding has fallen 5 percent. We can’t expect the Commission to be an effective internet privacy “cop on the beat” while cutting its funding during an era in which consumers cannot meaningfully participate in modern society without sharing their data with myriad known and unknown entities.
Second, the Commission needs more technical expertise. Congress could achieve this end through a variety of means. McSweeny has suggested, for example, creating a dedicated Bureau of Technology that is fully staffed with experts and/or enhancing the Commission’s ability to contract with outside experts. Jessica Rich, former Director of the FTC’s Bureau of Consumer Protection, has noted that the Commission lacks the funding to lure technologists away from the private sector with competitive salary offers. Whatever means it chooses to adopt, it’s time for Congress to take action and provide the FTC with sufficient resources to properly utilize its current enforcement tools.
The FTC Needs Rulemaking Authority to Prevent Consumer Privacy Harms
A more fundamental concern with FTC enforcement is that it can only address privacy violations after they have already occurred. The FTC should be in the business of preventing privacy harms, not just remedying them. As we have noted in our “Principles for Privacy Legislation” white paper, existing laws are poorly designed to protect consumers in the digital age. The Commission lacks specific statutory authority over consumer data protection. In addition, the recent LabMD v. FTC ruling raises questions about the FTC’s ability to bring future deceptive trade practice enforcement actions solely on the basis of substantial consumer injury from a data breach. Congress should provide the Commission with the authority to issue rules that protect consumers from future privacy and security harms and to impose substantial liability on companies that violate those rules. These rules could, for example, limit the quantity of data collected through data minimization requirements. Far from burdening the dynamic digital ecosystem, such ex ante regulation would provide certainty to companies that currently must rely on interpreting vague agency guidance and past enforcement actions, while giving the FTC the flexibility to adapt the rules to market developments.