Frustrated With Its Own Failure, Congress Issues an Ultimatum to Big Tech

A new bill that proposes to "sunset" Section 230 could work against Congress' very own goals of reining in Big Tech.

Earlier this week, House Energy and Commerce Committee Chair Cathy McMorris Rodgers and Ranking Member Frank Pallone introduced a discussion draft of a bill to “sunset” Section 230 of the Communications Act at the end of 2025 unless Big Tech “works with Congress… to enact a new legal framework.” In their accompanying opinion commentary in the Wall Street Journal (note: paywalled), the House leaders noted “Congress’s failure to revisit this law” and said that “lawmakers have tried to no avail” to address the concerns they have with Big Tech “putting profit ahead of the health of our society.” Two days later, the House leaders scheduled a hearing on the proposal. 

Ironically, this proposal to repeal Section 230 follows an April 11 hearing of the House Energy and Commerce Subcommittee on Communications and Technology – which Representative McMorris Rodgers also chairs – titled, “Where Are We Now: Section 230 of the Communications Decency Act of 1996.” The stated purpose of that hearing was to “provide an opportunity to reexamine the purpose of Section 230 and discuss what Congress can do to bring this law into the 21st Century.” After the legislators asked lots of questions and listened and nodded thoughtfully at ideas brought forward by their hand-picked witnesses, strong bipartisan agreement emerged that thoughtful reform, not repeal, was the optimal path.

That was a month ago. Apparently, ensuing internal discussions of “what Congress can do” went nowhere. 

Section 230 of the Communications Act shields websites and online platforms from lawsuits over content produced by individual users of their services, and over those platforms’ and websites’ own moderation of that third-party content. It is designed to encourage good faith content moderation. Though discussions of Section 230 are usually framed in terms of dominant social media platforms, the law also covers small companies, startup platforms, newspapers with comment sections, review sites like Yelp, dating apps, and every other individual website or online service that accepts material from users. Repealing the law would mean that every company that relies on user content would be motivated to avoid the risk of liability by aggressively moderating, downranking, or deleting user content. History shows that the greatest impact would fall on communities most marginalized or outside the mainstream. The wrong kind of reform of Section 230, and certainly its repeal, would have enormous negative consequences for our ability to freely express ourselves online. That’s because, while it benefits a wide range of digital services, Section 230’s primary beneficiaries are users: it shields users from liability for their retweets, shares, and forwards of others’ content.

Repealing Section 230 would also help turn Big Tech into Even Bigger Tech, since heightened legal risk and moderation costs would be barriers to entry for new players that might better align their content moderation approaches with Americans’ personal values. The right kind of reform, expressed in Public Knowledge’s principles to protect free expression on the internet, would focus on the platforms’ own conduct, not user content, and/or concentrate any new liability on content they are paid to publish – that is, their ad-based business model.

Some of the things McMorris Rodgers and Pallone decry in their opinion commentary – like lack of platform transparency and accountability, dangerous product design, and an uneven technology playing field – are in fact addressable through legislation. Proposals for every one of them have crossed their desks – the Digital Services Oversight and Safety Act (DSOSA) and the American Innovation and Choice Online Act (AICOA), to name a few.

Oddly, McMorris Rodgers and Pallone point out that “the First Amendment is the basis for our free speech protections in the U.S.” They’re right – which means that even if Section 230 were repealed tomorrow, the vast majority of attempts to assign liability for harms associated with user content would be rejected on constitutional grounds anyway. Also oddly, McMorris Rodgers and Pallone say Section 230 means platforms can’t be held responsible for posts “selling drugs or illegal weapons” or posts from “criminals.” But Section 230 does NOT shield platforms from liability for content that violates federal criminal law.

We favor national privacy legislation, competition policy, accountability and transparency for platforms’ moderation choices, a dedicated digital regulator, and other means to “ensure the internet is a safe, healthy place,” as Chair McMorris Rodgers and Ranking Member Pallone describe. But sunsetting 230 is not the answer. Tell Congress to protect Section 230.