In 2016, Facebook, Twitter, and other digital platforms were used exactly as they were designed, to help content go viral, but for destructive purposes. The misinformation that ramped up during the election cycle did not just come from within the United States; much of it came from a coordinated influence campaign in which Russian bots produced and disseminated disinformation: false information distributed with the intent to mislead. Some of these campaigns went so far as to recruit Americans to write articles or organize protests. It took a full year after the election for Facebook even to admit that the platform had a misinformation problem. The aim of foreign election interference via disinformation, both now and in 2016, is not to support one candidate over another but to undermine Americans’ faith in democratic institutions. As The New York Times explains, “Chaos is the point.”
Genuine efforts by platforms to address disinformation began in 2018, two years after the devastating risk of disinformation was laid bare. Facebook now looks for things it didn’t before, from “fake ads paid for in rubles” to “self-proclaimed Texas secessionists logging in from St. Petersburg.” The disinformation campaigns from foreign actors certainly haven’t stopped just because social media platforms have begun to defend against them. Russian groups that pumped out disinformation in 2016 have reactivated, launching far-right and far-left websites intended to stoke division and mistrust. The fact remains that content moderation is a massive challenge. The difference between a Russian-backed call to disavow election results and a genuine call to action from an American citizen is narrower than ever, and the stakes couldn’t be higher.
Social media platforms are still struggling to protect the integrity of the electoral process. Applying close scrutiny to paid political ads, as the reforms called for in the bipartisan “Honest Ads Act” would require, is essential: if platforms are going to profit from an ad, they should ensure it meets their code of conduct, and they should maintain public databases of all online political advertisements, regardless of whether those ads mention specific candidates. Platforms should also work to identify and stop trolls, foreign and domestic, that produce organic content and pay to promote it heavily to micro-targeted groups. Additionally, platforms should avoid algorithms that maximize engagement by promoting the most extreme, outrageous content. These changes won’t come from self-regulation alone; platforms are loath to admit that their business model damages democracy. Public policy actions, such as a superfund for the internet and a specialized regulatory authority, are needed to rein in the disinformation campaigns that plague social media platforms.
Digital platforms are not the only ones responsible for defending and upholding American democracy. Elections and intelligence officials need to adapt to the post-2016 reality of foreign interference, and in some ways, they have. The FBI has taken tentative action, warning that “foreign actors” may attempt to spread disinformation even after November 3, and urging Americans to be wary of any new websites and social media content that “discredit the electoral process.”
Foreign adversaries who aim to sow strife in the American electorate are bolstered by domestic actors who spread misinformation that benefits their political agenda. Platforms are carriers of mis- and disinformation, but lies can have an equally destructive impact on trust in democratic institutions when news sources present them as legitimate. Cable news is a potent culprit within the right-wing media ecosystem. Fox News, in particular, is an active proponent of disinformation, serving as a dangerous “transmission vector” for right-wing conspiracy theories that often originate online. Conspiracy theories like Pizzagate develop in fringe right-wing communities online, spread outward to a broader audience via more mainstream social media platforms, and become touchstones of a right-wing media ecosystem that is isolated from center and left-leaning media.
A Harvard study, for example, found that the debunked narrative of voter fraud is amplified by President Trump and Republican officials. The impact on American democratic institutions is all too real. A Pew Research poll from September found that almost half of Republicans believe fraudulent mail-in ballots have been a major problem. Among Republicans who get their news only from conservative talk radio and/or Fox News, 61% say that mail-in ballots pose a major problem. The disinformation campaign to erode public trust in mail-in ballots may have a material impact on the 2020 election, and by then it will be too late to look back on what could have been done to prevent lies from becoming commonly held beliefs among many Americans.
Without strong, thoughtful policy action, much of the work of combating disinformation campaigns falls to individuals without the resources of platforms, the intelligence community, partisan politicians, or election officials. Citizens need to stay vigilant in protecting democratic institutions, even after election day. Consumers who want to combat misinformation should learn to distinguish between genuine profiles and trolls. Real social media accounts usually cover a variety of topics important to their communities rather than just parroting partisan messages. New tools, like Clemson University’s “Spot the Troll” quiz, are emerging to help users learn to distinguish troll accounts meant to spread misinformation from genuine ones.
It’s equally important for consumers to differentiate between legitimate news stories and propaganda or rumor articles meant to sow social discord. This is why experts recommend doing a quick search to see if a controversial event is being reported in other credible sources before sharing a news story. Our democracy will be especially vulnerable to misinformation campaigns this election, as vote counting is expected to stretch past election day. This could present bad actors with the perfect opportunity to spread disinformation, potentially undermining public faith in democratic institutions when we need them most.