In July, the Supreme Court issued its decision in the NetChoice cases, Moody v. NetChoice (Florida) and NetChoice v. Paxton (Texas). (Because they raised such similar issues, the Court considered them together and wrote one opinion deciding both.) Both laws sought to counter the perceived liberal bias of major social media platforms by restricting how platforms moderate user-submitted content and forcing them to host content that violates their policies. The Eleventh U.S. Circuit Court of Appeals found Florida’s law unconstitutional, but the Fifth Circuit upheld Texas’ law, and the Supreme Court stepped in to resolve this circuit split. Public Knowledge filed a brief in the cases. We agree with the Court’s decision with regard to platform moderation and appreciate that nothing in the Court’s First Amendment analysis prevents reasonable public interest regulation of internet platforms – potential examples of which are discussed below.
Going into the cases, there were two main worries. The first was that the Court might simply uphold the laws entirely, allowing state governments to decide what speech social media users are allowed to see. This was unlikely to begin with, as it would have required ignoring decades of First Amendment precedent, and after the Court’s oral argument it was almost certainly not going to be the outcome. But nowadays, who knows.
The other worry was that the Court would overreach: finding that Texas and Florida acted unconstitutionally, but making statements so broad that other forms of online consumer protection law, ones that do not raise the obvious First Amendment issues of the Texas and Florida laws, would also be found unconstitutional. This was a more realistic possibility.
Thankfully, the Court rejected Texas and Florida’s attempts to override the content moderation policies of social media platforms, and it did so without overreaching. Applying decades of precedent, Justice Kagan, writing for the majority, explained that social media companies have the same First Amendment rights as other private actors, such as newspapers, to select, edit, and remove the content on their platforms. This is a win for free expression and for social media users. As we explained in our brief, the Texas and Florida laws “would have deleterious effects on the functionality and usefulness of social media platforms, including requiring or incentivizing them to publish pro-terrorist content, hate speech, spam, Holocaust denial, snake-oil ‘medical’ claims, lies about the time and place of elections, and fraud.”
Requiring that social media companies carry content of this kind does not promote free expression – it corrodes it. At the same time, the First Amendment protects platforms that choose to adopt more hands-off policies in some areas, such as Elon Musk’s X, formerly Twitter. Users should be able to choose among social media platforms that take different approaches to content moderation, not be subject to one-size-fits-all policies imposed by politicians who openly state a wish to force platforms to host and promote conservative viewpoints and to punish companies whose editorial standards they disagree with.
The Court, however, did not strike down the laws entirely. NetChoice, a trade association representing tech platforms and the plaintiff in these cases, brought what is known as a “facial” challenge to the laws, asking the Court to invalidate them completely – to find them unconstitutional in every respect. If NetChoice had prevailed on this point, the litigation would be over. But the Court declined that invitation. While it thoroughly explained how applying the state laws to social media feeds, curation, and content moderation would be unconstitutional, the laws themselves are written very broadly and might constitutionally apply in contexts other than social media, or to other kinds of tech platforms. Because of this, the Court sent the cases back to the lower courts to explore these issues.
Instead of ruling that all forms of “nondiscrimination” law are unconstitutional as applied to tech platforms – a far too broad ruling that would have threatened many basic consumer protections – the Court managed to thread the needle, addressing the important free expression issues while leaving room for other forms of tech platform regulation. This is not a guarantee that any given tech regulation would be upheld, but at least some justices appear open to some kinds of regulation – including ones that organizations like NetChoice might not like. To be clear, the Texas and Florida laws themselves are poorly drafted and confusing, and the Court explained how their primary intended purpose violates the First Amendment. But a broad ruling from the Court could have threatened other, better laws and policies.
For example, net neutrality rules, which prevent internet service providers from discriminating against certain types of traffic, have been upheld against constitutional challenges and, moreover, are good policy. Our brief to the Supreme Court elaborated on why net neutrality is beneficial while similar regulations for social media would be harmful and unconstitutional.
Beyond net neutrality, there are other areas where platform regulation, including nondiscrimination laws, might be both constitutional and beneficial. For instance, regulations around data privacy, competition, and transparency in advertising and other practices could be enforced without infringing on First Amendment rights. The Court did not give a free pass to any and every other kind of tech regulation, but consumer protection and pro-competition laws that do not target expressive activity should be on safe ground.
Some of the kinds of laws that may still be upheld after the Court’s ruling include:
- Nondiscrimination Requirements for Digital Payments, Ride-Hailing, or Other Tech-Enabled Services: Many activities that once were conducted offline – and thoroughly regulated – should not escape consumer protection requirements just because they now happen online. For example, regulations ensuring fair treatment in digital payments and ride-hailing or taxi services may be both beneficial and constitutional, and in keeping with how we have long regulated offline activities. At oral argument, Justice Barrett observed of the Florida law that “it looks to me like it could cover Uber,” and Justice Sotomayor speculated that it might apply to online marketplaces like Etsy. Even if the Florida law itself is not the ideal approach, it’s easy to see how rules preventing these kinds of platforms from arbitrarily discriminating against users or service providers could be justified. Ride-hailing services could be required to offer equitable access across regions and demographics, preventing discriminatory practices against certain groups of users or drivers. And digital payment platforms could be required to process transactions from all legitimate businesses, or not to hold up people’s money based on political disagreements.
- Product Safety: Tech platforms are products like any other, and like any other product, their creators and sellers should be liable for foreseeable harms. Of course, many platforms are speech platforms, and we should be wary of proposals that say, in effect, that a platform is liable for an unsafe design if the safety concerns amount to objections to the content platforms carry, how they moderate it, or whether they “promote” it. (The NetChoice decision, in fact, would rule out most such proposals.) But though some in the tech industry would argue otherwise, product design and safety considerations can be consistent with the First Amendment. A ride-hailing platform that connects riders with dangerous drivers, for instance, is as defective as a faulty car.
- Data Privacy Regulations: Laws that require tech platforms to protect (or not collect) user data and provide transparency about data usage could be upheld. These regulations focus on user rights and the responsible handling of information rather than on content itself.
- Due Process and Transparency Rights for Users: Platforms have the right to set their own content moderation policies – but users have a right to fairness and consistency. While it would be a bad idea to let a judge or regulator fault a platform over differing interpretations of what constitutes, for example, “hate speech,” it does not impinge on a platform’s independent editorial judgment to require it to provide users an appeal process or to explain its decisions.
- Competition Laws: Antitrust regulations that prevent monopolistic practices and promote competition in the tech industry are likely to withstand constitutional challenges. These laws ensure a fair marketplace and protect consumers from the dominance of a few large players – and they tend to benefit, not harm, free expression by ensuring that public discourse isn’t controlled by that same handful of companies.
- Transparency in Advertising and Other Practices: Regulations that demand transparency in how advertisements and other content are displayed and targeted could be beneficial. Such laws would ensure that users are aware of how their data is used without infringing on the platforms’ editorial discretion.
- Interconnection Requirements: Obligations for platforms to interconnect with others – such as interoperability between different messaging services, or compatibility between major platforms and third-party developers – can promote competition and consumer choice. They also offer regulators a path to limit gatekeeper control of major platforms without attempting to regulate content moderation decisions.
- Nondiscrimination Requirements for Broadband, SMS, and Other Telecom Services: Public Knowledge has long argued that broadband, SMS, and some internet voice services should be classified as telecommunications services under Title II, subjecting them to nondiscrimination and accessibility requirements. This classification would ensure that all users, regardless of their device or service provider, have equal access to essential communication services. Title II doesn’t just allow specific nondiscrimination rules like net neutrality; it also provides the legal basis for the FCC to oversee the telecommunications services at the core of its jurisdiction.
The Supreme Court’s decision in the NetChoice cases reaffirms the importance of protecting editorial discretion on social media platforms while leaving the door open for other forms of consumer protection rules. Policymakers looking to protect users, open markets, and promote, rather than suppress, free expression should take note.