You know the old saying about “old wine in new bottles”? An inversion of a biblical parable about new wine in old wineskins, it refers to an existing concept or idea being offered as though it were a new one.
In our view, the EARN IT Act of 2022 is just…old. Despite overwhelming objections that the original version would threaten free expression and jeopardize access to encrypted services, its sponsors have brought the act back with virtually every one of its flaws intact. And it still will not accomplish its stated goal: encouraging digital platforms to report and remove more child sexual abuse material, or CSAM, by threatening their Section 230 protections for hosting it.
First, some background: The EARN IT Acts of both 2020 and 2022 establish a “national commission” to develop “best practices” for interactive computer services to moderate material that sexually exploits children, including CSAM. The original EARN IT Act conditioned Section 230 liability protections on platforms following these “best practices.” We warned that this bill could threaten user privacy and security when it was originally introduced back in 2020 (fact sheet here, and blog posts here and here). The new EARN IT Act simply eliminates Section 230 protections for any interactive computer service provider facing a claim derived from child exploitation law.
Here’s the truth about the bill as it was reintroduced last month:
The “best practices” of the “national commission” are still likely to discourage platforms from using privacy-protecting technologies such as encryption. End-to-end encryption is one of the best technological tools for protecting user privacy and safety. It ensures that no one except the sender of a communication and its intended recipient or recipients can read it. Combined with device security, end-to-end encryption protects users from bad actors and cybersecurity threats. It benefits journalists, activists, domestic violence survivors, military personnel, and children. However, one of the bill’s principal sponsors, Senator Lindsey Graham, has been an outspoken critic of tech companies’ use of encryption; another sponsor, Senator Richard Blumenthal, has argued that technology companies might use a blanket exemption for encryption as a “get out of jail free card” when it comes to platforms monitoring what users say to each other; members of the Judiciary Committee have held multiple hearings emphasizing the challenges encryption creates for law enforcement; and the national commission will be dominated by law enforcement leaders. It seems highly unlikely that the commission’s recommendations will leave room for end-to-end encryption. It is also possible that the commission will adopt a “best practice” requiring platforms to monitor content, in which case offering encryption would de facto put a platform in breach of that duty.
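To make the underlying idea concrete, here is a minimal sketch of end-to-end encryption using the open-source PyNaCl library (chosen here purely for illustration; it is not any particular platform’s implementation, and the keys and message are hypothetical). Each person holds a private key that never leaves their device, so the service relaying the message only ever sees ciphertext:

```python
# Illustrative sketch of end-to-end encryption with PyNaCl (libsodium bindings).
# Not any specific platform's protocol; key names and message are made up.
from nacl.public import PrivateKey, Box

# Each party generates a key pair; the private halves never leave their devices.
sender_private = PrivateKey.generate()
recipient_private = PrivateKey.generate()

# The sender encrypts with their own private key and the recipient's public key.
sending_box = Box(sender_private, recipient_private.public_key)
ciphertext = sending_box.encrypt(b"See you at the shelter at 6.")

# Only the recipient's private key can decrypt. The platform carrying
# `ciphertext` between them sees only random-looking bytes.
receiving_box = Box(recipient_private, sender_private.public_key)
assert receiving_box.decrypt(ciphertext) == b"See you at the shelter at 6."
```

The point is structural: a provider that only ever handles ciphertext has nothing it can scan or hand over, which is exactly why a monitoring-focused “best practice” would put encrypted services in the commission’s crosshairs.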
The new version of EARN IT still holds the same perils for online safety. It includes a specious new “carve-out” for encryption, designed to address the overwhelming pushback from digital rights organizations, activists, and academics about the likely impact of the bill. But read closely: it says only that the use of encryption cannot serve as “an independent basis for liability.” That means a platform’s use of end-to-end encryption (or its inability to decrypt its users’ communications) can still be used as evidence against it in court, and if a plaintiff or a judge can find just one more fact to support the claim, the carve-out no longer applies. This will have the same effect as the 2020 version: targeting platforms that use end-to-end encryption to protect the content and communications of their users.
This risk still extends to both federal and state civil cases. Federal criminal law already requires platforms to report any CSAM they discover. It bears repeating: Section 230 has never protected platforms from federal criminal law related to CSAM. However, without Section 230, state criminal law and state and federal civil law could impose new duties on platforms not just to report the CSAM they discover, but also to more actively scan and monitor their users to uncover more. Platforms could be liable for having designed their services with privacy in mind, so the bill effectively encourages them to drop user privacy and security features.
In sum, the true goal of the EARN IT Act of 2022 is still to push digital platforms to more actively monitor user communications, even if that means they can no longer offer secure, encrypted communications to their users. It does so by removing their Section 230 protections for CSAM. What the standard of liability would be for a provider without Section 230 protection for this material, and what role the best practices would play (if any), is necessarily uncertain, because the bill simply removes a liability protection. The best practices themselves are not legally binding. The bill does not specify what the new standard of liability would be, and it could vary from state to state and over time as new laws are passed. Nevertheless, a clear goal of this legislation is to use the threat of massive liability to push platforms to change how they are designed and how they monitor user communications, changes that would necessarily affect all users all the time.
We had, and have, other concerns about the EARN IT Act relative to the principles we have articulated to ensure proposals for Section 230 reform protect free expression online. You can see these on our Section 230 Principles scorecard for this bill (although the goal of EARN IT has more to do with law enforcement than content moderation).
Lastly, others have made the case that the EARN IT Act is unconstitutional under the First Amendment, the Fourth Amendment, or both.
A Lesson in Unanticipated Consequences
As we write this, Congress has also just reintroduced a bill calling for an assessment of “the unintended impacts” of SESTA-FOSTA, the combined package of the Stop Enabling Sex Traffickers Act (SESTA) and the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA) that passed Congress in early 2018. SESTA-FOSTA holds websites liable for user-generated content that facilitates sex trafficking, but overwhelming evidence indicates that it has had profoundly negative effects on the health and safety of sex workers and people engaged in consensual, transactional sex. It should serve as a proof point and a lesson that in the face of uncertainty about what legal standards will apply to their content moderation and business practices, platforms will necessarily over-moderate, silence protected speech, and shut down accounts and pages in order to minimize legal and financial exposure. In the meantime, the bad actors move to offshore sites and the dark web, making enforcement even more difficult.
We do not have to sacrifice the privacy and security of our online communications to stop the exploitation of children. There are ways Congress can truly address the scourge of CSAM, such as providing more enforcement resources and victim assistance, or addressing the poverty and housing instability that make children more vulnerable. Congress could also adopt and pass the various bills designed to directly address real-life violence and abuse of women and children. Unfortunately, the “new” EARN IT Act still distracts from achieving that result and instead harms CSAM victims by giving the broader public the impression that EARN IT will actually do something to help them and future victims.
Any way you look at it, the “new” EARN IT Act is an old idea. Don’t buy it.