“Facebook and Big Tech are facing a Big Tobacco moment, a moment of reckoning. The parallel is striking. I sued Big Tobacco as Connecticut’s attorney general; I helped to lead the states in that legal action and I remember very, very well the moment in the course of our litigation when we learned of those files that showed not only that Big Tobacco knew that its product caused cancer, but that they had done the research and they concealed the files. Big Tech now faces that Big Tobacco, jaw-dropping moment of truth.”
– Senator Richard Blumenthal, Chairman of the Commerce Committee’s Subcommittee on Consumer Protection, Product Safety and Data Security, and one of the members of Congress most empowered to regulate Big Tech, at the hearing, “Protecting Kids Online: Testimony from a Facebook Whistleblower” on October 5, 2021.
Big Tech and Big Tobacco: it’s an analogy that has been made a lot lately (also here, and here) thanks to the emergence of a whistleblower, Frances Haugen, with thousands of pages of internal research showing Facebook is aware of some of its platforms’ worst impacts on teens, political polarization, and ethnic violence. But the analogy goes back to at least 2018, when Salesforce’s Marc Benioff drew it in an interview (a theme he has repeated since). It has also become a rallying cry for some academics (and Public Knowledge used the analogy earlier this year, too).
So is it true? Does the analogy hold up? And if it does, what do we do about it now?
The Parallels Between Big Tech and Big Tobacco
The Big Tech/Big Tobacco comparison is being made all over the place now because Haugen’s receipts show that Facebook knows about the harms it causes and conceals them, much as the biggest tobacco companies waged a decades-long campaign to mislead the public about the carcinogenic and habit-forming effects of cigarettes. Both industries undermined and tried to cast doubt on public health research (even their own) while denying researchers access to their data, the best way to study users’ habits and the outcomes related to their products. The Surgeon General’s office released its first report linking smoking to cancer and other diseases in 1964. This year, the Surgeon General released a report identifying misinformation on social media platforms as a “significant public health challenge” and asking technology companies to “take responsibility for addressing the harms of their products.”
It wasn’t until Jeffrey Wigand, the tobacco whistleblower, gave his own “60 Minutes” interview in 1995 that long-standing suspicions were confirmed: the tobacco companies had been fully aware of smoking’s catastrophic health outcomes for decades. Only then did regulation of the tobacco industry gain momentum.
Both industries applied their knowledge of their products’ harms not to mitigate the damage but to evolve their business strategies. For example, both recruited new, younger users to ensure their continued dominance. In the tobacco industry, this meant cultivating “replacement smokers” and using cartoon characters like Joe Camel in marketing aimed at teenagers. For Facebook, it has meant using Messenger Kids (for 6- to 12-year-olds) and Instagram (for users 13 and older, despite knowledge of its impact on teen girls’ body image and eating disorders), plus the now-paused development of Instagram Kids, to attract younger users. Both industries also exploited underdeveloped markets in the quest for growth, under-resourcing the safety mechanisms that ostensibly keep users safe in countries where the risk of regulation is lower. (They may have fleeing their toxic brands in common, too.)
Both industries also worked to make their products more addictive. You don’t have to take our word for it: In his September 2020 testimony to the House Committee on Energy and Commerce, Tim Kendall, a former Facebook executive charged with early monetization of the platform, acknowledged that Facebook “took a page from Big Tobacco’s playbook, working to make our offering addictive at the outset.” He compared Facebook’s status updates, photo tagging, and likes to Big Tobacco’s bronchodilators, which expanded smokers’ airways so that smoke could cover more surface area of the lungs, and compared “extreme, incendiary content” to the ammonia added to cigarettes to speed nicotine’s delivery to the brain.
(What does Facebook have to say about the analogy now? In an interview with CNN, Nick Clegg, Facebook’s Vice President for Global Affairs and Communications, called the comparison “profoundly false” because, after all, “a third of the world’s population enjoys using” Facebook’s family of apps. It was a bad day for Nick: an NPR correspondent quickly pointed out that when Jeffrey Wigand appeared on “60 Minutes,” about 30 percent of the world’s population smoked.)
And it isn’t just Facebook and Instagram. The reliance on “engagement-based ranking” to maximize users’ time on the platform is shared by Twitter, TikTok, Pinterest, YouTube, and other platforms whose business models depend on advertising. For example, research has shown that YouTube’s algorithmic recommendation engine often leads to radicalization and extremism, and employees have described how their proposals to change the recommendation engine were ignored in the pursuit of user engagement. And this week, executives from Snapchat, TikTok, and YouTube were called to a Senate hearing examining how their companies treat young audiences.
Policy Implications of Big Tech’s Big Tobacco Moment
The analogy may be imperfect. Most notably, in an ideal world the tobacco industry would be gone altogether, while even Big Tech’s biggest critics admit that it provides substantial benefits. But in 1995, an industry whistleblower “crystallized and confirmed years of suspicions and helped compel U.S. government authorities to act” on tobacco regulation. This past month, Frances Haugen did the same thing. Right now, this is the only part of the analogy that matters, and only Congress can make it happen.
Frances Haugen outlined a list of potential regulatory solutions in her testimony to the Consumer Protection Subcommittee, and experts have proposed many more, both before and since. Congress has plenty of legislative options for acting on her revelations. But unfortunately, and as tempting as it may be, this complex set of issues is not conducive to a single, simple, headline-grabbing regulatory fix. That would be the equivalent of regulating the symptoms rather than the underlying causes of the industry’s behavior.
In fact, other than the documents she provided to substantiate her claims, nothing in Haugen’s testimony was completely new. Certainly it reinforced the idea that self-regulation will never be an effective antidote to Big Tech’s ills, just as it wasn’t for the tobacco industry; there are too many clashing incentives. But it also reinforced the need for broader, more durable policy solutions, some of which Public Knowledge has advocated for years.
For example, we need a dedicated digital regulator. Congress was ultimately forced to provide explicit power to the FDA to regulate tobacco in addition to food and pharmaceuticals. A specialized regulator may be even more important for the digital platforms due to the dynamic and diverse nature of the industry; individual pieces of legislation run the risk of obsolescence or may not pertain to all the relevant platforms. An agency with a watchdog capability would ensure more transparency for government, researchers, and the public. Frances Haugen wants one. Public Knowledge wants one (in fact, we were first, as my colleague Harold Feld recently pointed out). Even Facebook wants one, for heaven’s sake. Let’s do this!
We also need antitrust enforcement, and new laws and rules to promote competition against Facebook and the other dominant digital platforms. This would make it easier for users to switch when they are frustrated or fed up with a platform’s missteps, and help create an environment where real alternatives to today’s dominant platforms can thrive. If users have a real choice, Facebook, YouTube, and other platforms may feel more pressure to get these tricky decisions right.
Federal data privacy regulation that goes beyond notice-and-consent regimes would limit how user data is collected and used. The optimal federal privacy law would impose substantive use limitations, include data minimization requirements, make no distinction between so-called “sensitive” and “non-sensitive” information, penalize companies that use dark patterns to circumvent user choice, and offer consumers meaningful redress.
One of the breakthroughs in the regulation of tobacco — after years of industry efforts to position smoking, and acceptance of its risks, as a “personal choice” — came from public campaigns to communicate the dangers of secondhand smoke; that is, when it became understood that individual choices to engage in certain behaviors could harm other people. We have offered an analogous policy proposal: a “Superfund for the Internet” would account for the societal harms associated with disinformation on digital platforms and mandate that their content moderation serve the public interest. It calls for the creation of an independent trust fund, financed by a federal user fee on the platforms, and a mandate to use fact-checking along with labeling and other friction strategies (including some Haugen described in her testimony) to slow the virality of harmful disinformation.
“Instagram is that first childhood cigarette, meant to get teens hooked early, exploiting the peer pressure of popularity and ultimately endangering their health. Facebook is just like Big Tobacco, pushing a product that they know is harmful to the health of young people, pushing it to them early, all so Facebook can make money. IG stands for Instagram, but it also stands for ‘Insta-greed.’”
– Senator Edward Markey, member of the Commerce Committee’s Subcommittee on Consumer Protection, Product Safety and Data Security, and one of the architects of the Children’s Online Privacy Protection Act, at the hearing, “Protecting Kids Online: Testimony from a Facebook Whistleblower” on October 5, 2021.
Congress — and Congress alone — now has the ability to deliver on its own claims about Big Tech’s Big Tobacco moment. If it doesn’t, this will just be like Big Tech’s Cambridge Analytica moment, or its $170 million fine for violating kids’ privacy rights moment, or any one of many other seemingly breakthrough moments that have turned into…nothing much.