A Supreme Court Ruling in Murthy v. Missouri Could Help – or Hinder – Democracy Next Year

In August, Public Knowledge published a perspective on Missouri v. Biden, a lawsuit brought by the attorneys general of Louisiana and Missouri arguing that the Biden administration improperly coordinated with social media companies to censor conservative viewpoints under the guise of anti-misinformation campaigns. Our position is that while we must ensure that platforms are not under undue or inappropriate pressure from the government, decisions that restrict informed, independent, and responsible content moderation will likely lead to the further deterioration of our information ecosystem.  

The case has taken several twists and turns since then, and they reveal the plaintiffs’ intent more clearly. As background: Based on an analysis colorfully described by a prominent First Amendment scholar as “a dog’s breakfast,” in early September the Fifth U.S. Circuit Court of Appeals upheld, but significantly narrowed, a lower court’s injunction freezing communications between social media companies and large swaths of the federal government. The Fifth Circuit decision meant, among other things, that only the White House, the Surgeon General, the CDC, and the FBI were subject to the preliminary injunction.

The plaintiffs quickly asked the Fifth Circuit to revisit its decision. In particular, they wanted the court to maintain the preliminary injunction against the Cybersecurity and Infrastructure Security Agency (CISA) and the State Department’s Global Engagement Center, as well as the prohibition against federal officials collaborating with the Election Integrity Partnership (EIP). What do these particular organizations have in common? They all work to protect the integrity of election information. The appeals court issued a substitute opinion in which CISA – and only CISA – was additionally made subject to the Louisiana court’s preliminary injunction. It did not explain why it had changed its mind.

The Supreme Court, having already issued and extended a stay on the previous version of the modified order, granted a temporary stay on this one as well. The stay came in response to an application from the Solicitor General, on behalf of Surgeon General Vivek H. Murthy, that described the lower court’s actions as “novel,” “unbounded,” “startling,” “radical,” and “ill-defined” (among other things). Justices Alito, Thomas, and Gorsuch dissented.

The Supreme Court subsequently agreed to hear what is now Murthy v. Missouri during its 2023–24 term. The Court also lifted the injunction, which means all federal agencies can continue to collaborate with social media platforms to inform their content moderation policies and procedures. We hope the Supreme Court’s eventual decision will preserve these freedoms. Again, although we agree platforms must not be inappropriately pressured by the government, collaboration with specialized agencies with distinct expertise and information is necessary to ensure informed content moderation in the public interest. 

In our view, this case is about much more than past content moderation decisions regarding the pandemic and Hunter Biden’s laptop. One year out from the next national election, motivated plaintiffs are determinedly asking courts to bar government national security agencies from collaborating with researchers whose stated aims are to “strengthen platform standards for combating election-related misinformation.” We believe the plaintiffs’ aim is to use every political and legal tool at their disposal to preserve the ability to use networked disinformation as a political strategy.

In fact, the Supreme Court case is only one manifestation of an orchestrated effort to cast government collaboration with platforms as “censorship,” in the interest of chilling platform content moderation and academic research about it. Documents from the discovery process in Missouri v. Biden have been the basis of hearings and angry letters from legislators making the discredited point that informed platform content moderation represents targeted suppression of conservative political viewpoints. The impact on election information integrity efforts has been well documented: stalled coordination, reduced research investment, and chilled platform content moderation, all of which increase the likelihood of foreign or domestic manipulation of U.S. elections. Meanwhile, new philanthropic funds are being launched to provide legal advice to researchers who receive subpoenas for information related to their research.

Taken together, these developments support the conclusion of one recent research paper: “…it is difficult to avoid the realization that one side of politics – mainly in the U.S. but also elsewhere – appears more threatened by research into misinformation than by the risks to democracy arising from misinformation itself.” The Supreme Court’s decision in Murthy v. Missouri could help us avoid those risks to democracy.