Digital platforms can sometimes undermine the public’s faith in facts by amplifying lies and conspiracy theories. Algorithms programmed to maximize user engagement cannot distinguish between content that is true, false, or somewhere in between, and outrageous content gets the most clicks regardless of its veracity. Misinformation now runs so rampant on digital platforms that many people don’t know what to believe anymore.
Misinformation, harassment, and hate speech can have dire real-world consequences. At the same time, they pose a Gordian knot for social media platforms and would-be regulators who must decide how to moderate content. Section 230 of the Communications Decency Act, one of the most important and far-reaching laws affecting the internet, protects social media companies from liability for hosting, editing, or taking down third-party content. Because platforms are not liable for user-posted content they carry and can remove objectionable content without fear of lawsuits, they are free to rid their services of misinformation, hate speech, and other objectionable material.
A diverse and competitive media and online information ecosystem is the best way to ensure that all voices are heard. That is why those who disagree with the choices popular platforms make should support policies designed to empower users and increase competition. With greater competition and interoperability, users will not be tied to a single platform. Different platforms have different policies, and users should be able to choose which platforms they support.
One possible solution is Public Knowledge’s proposed “Internet Superfund.” The Internet Superfund proposal tackles misinformation by calling on dominant social media companies to adopt a content moderation process that serves the public interest: partnering with independent, qualified fact-checking organizations and authoritative sources of information, including local news organizations. This would have the dual benefit of mitigating misinformation and helping to fund local journalism. Fact-checking already exists on platforms such as YouTube, Twitter, and Facebook, but their processes tend to be confusing or opaque. The Internet Superfund would elevate reliable sources of trusted information and make moderation more transparent.
Policymakers should focus on solutions that promote greater transparency, accountability, and due process in content moderation practices, as well as solutions that improve competition in order to give users more choices in the platform marketplace. Section 230 enables content moderation that protects free speech, and we evaluate proposed reforms to ensure that any changes to this communications law encourage content moderation in the public interest.
View our resource page on defeating harmful legislation like the Journalism Competition and Preservation Act, which threatens news outlets and competition in news.