The Senate Commerce Section 230 Hearing that Could Have Been

    Yesterday’s Senate Commerce Committee hearing on Section 230 — six days before the national election — could have been about what responsibility platforms have in addressing foreign attempts to interfere in our election. It could have been about the degradation of local news. It could even have been a good faith debate about the merits and flaws of those 26 words that created the internet. Instead, it was an opportunity for Senate Commerce Republicans to aggressively assert the myth of anti-conservative bias on Twitter, Facebook, and Google that has been repeatedly proven false.

    Many of the Senate Republicans serving on the Commerce Committee have rebranded legitimate content moderation as censorship, stretching the idea of censorship far beyond its reasonable limits and asking 69 questions about it, according to the New York Times. Senator Mike Lee, for example, articulated an Orwellian definition of censorship (he called it a “term of art”), claiming that it includes: “blocking content, fact checks, labelling content, or demonetized websites.” What he described is not censorship but content moderation, and it is protected by Section 230 for a reason. Section 230 gives platforms a legal shield to take down “otherwise objectionable” content, and Google CEO Sundar Pichai pointed out that this ambiguity allows platforms to respond to unforeseeable events, from the seemingly lighthearted but eventually dangerous “Tide Pod challenge” to the horrific live-streamed Christchurch shooting. Platforms have used Section 230 to remove antisemitic content like Holocaust denial as well as conspiracy theories like QAnon. And, of course, when a platform engages in its own counter-speech, such as through fact-check labels, that speech is not protected by Section 230 but by the First Amendment, in the same way traditional media outlets like newspapers and TV channels are protected when they fact-check misinformation.

    Rather than doubling down on accusations of censorship, it would be more useful to engage with a marketplace of ideas. There are other platforms, like Parler, that employ more lax content moderation, and there are bipartisan, pro-competition ideas like promoting interoperability between platforms. Competition between platforms with different policies would allow users to choose the environment they want online. If users are dissatisfied with the platform options currently on the market, this hearing would have been an ideal opportunity to expand on the findings of anti-competitive monopoly power just laid out in a report by the House’s Antitrust Subcommittee.

    Hauling the CEOs of Twitter, Facebook, and Google to a committee hearing that was functionally a campaign event six days before voting ends is not a coincidence. It functions as a threat against legitimate content moderation. The hearing is a perfect complement to efforts that link Section 230 protections to a plethora of dubious goals, from the President’s calls to “revoke 230” to the Federal Communications Commission’s recent decision that it can interpret Section 230 however it sees fit, despite an obvious lack of authority to do so.

    With voting disinformation rampant and election results expected to be delayed, Senator Schatz’s urging to these social media CEOs is crucial: “Do not let the United States Senate bully you into carrying the water for those who want to advance misinformation.” In other words, do not “bend over backwards” to accommodate absurd demands just because of accusations that you are being liberal. Honest, good faith content moderation and transparent policies will speak for themselves. We look forward to, as Sen. Tester said, “getting past this political garbage” and returning to the bipartisan efforts, led by Senators Thune and Schatz, to discuss reasonable measures promoting greater transparency, accountability, and due process in content moderation — measures that can empower users to choose between these dominant platforms and smaller alternatives.