Google Plus Demonstrates We Can’t Trust Companies to Do the Right Thing

    Last March, Google discovered a bug in Google Plus that permitted developers to access the private information of as many as 500,000 Google Plus users – but told no one. We are learning about it now not because Google decided to tell those 500,000 users, but because some intrepid Wall Street Journal reporters uncovered internal Google memos discussing the bug and broke the story. Instead of coming clean, Google had opted to quietly fix the problem while all eyes were on Facebook and Cambridge Analytica. There is no evidence that any of the 400-plus developers with access to the API actually accessed the private information, but Google’s decision to remain mum holds an important lesson for the overarching privacy and security policy debate.

    One of the issues up for debate on Capitol Hill is whether entities that maintain our personal information should be required to tell us when they experience a data breach, or when our data are exposed or accessed in an unauthorized way – and whether that obligation should follow what’s called a harm standard or an occurrence standard. The harm standard, which industry favors, requires an entity to disclose a breach, unauthorized access, or exposure only when there is good reason to believe the incident has resulted in, or will result in, legally cognizable harm (think financial loss or physical injury). By contrast, the occurrence standard requires entities to disclose a breach, unauthorized access, or exposure whenever it occurs.

    The occurrence standard is the more consumer-friendly standard, because it permits consumers to take measures to prevent harm from the breach, exposure, or unauthorized access. Furthermore, it accounts for harms that may not be legally cognizable, but that are no less real – such as embarrassment, re-endangering a domestic violence victim, or Cambridge Analytica-style “psychographics.”

    Codifying the harm standard, in contrast, would let the entity that has already failed to sufficiently protect private information decide, in its sole discretion, whether consumers have been or will be harmed – and thus whether consumers should be informed at all – even though that entity has every financial incentive to keep a breach, exposure, or unauthorized access secret. The Google Plus case illustrates the problem with this proposition. Google likely concluded that there was no evidence consumers had been or would be harmed, and so declined to tell the 500,000 users whose private information was implicated about the vulnerability.

    Our personal information is just that – personal. We should know when that information is breached, accessed, exposed, or vulnerable so that we can take measures – and make product choices – to protect ourselves. Because we cannot count on companies to do the right thing out of the goodness of their hearts, it is imperative that Congress require them to do the right thing by codifying the occurrence standard in any comprehensive privacy law.