By Parmy Olson
Not long after Hurricane Helene wrought destruction across the southern US, a more bewildering storm blew through: Officials with the Federal Emergency Management Agency (FEMA) bumped up against angry residents and armed militia in Tennessee and North Carolina, people who’d been riled up by rumors that the officials were there to take their homes. FEMA evacuated its teams, leaving behind communities that desperately needed help.
A cursory search of X (formerly Twitter) brought up several viral videos suggesting that FEMA was bulldozing bodies under the rubble, but press reports, including one in the Washington Post, were unclear about exactly where and how the rumors were spreading. They were just… spreading. That raised an even more troubling question: How could you hold online platforms accountable for conspiracy theories if you didn’t know where they were being shared?
The answer is “You can’t,” because the people studying the flow of disinformation are being sued by those who seem to benefit from the spread of “alternative facts.”
A raft of lawsuits and congressional investigations against several groups studying disinformation in the US, coming largely from Republican lawmakers and tech billionaire Elon Musk, has had a chilling effect on the broader effort to tackle viral falsehoods. These research groups study how lies spread online and alert the public when they find coordinated campaigns to mislead people. They analyze networks of accounts, map viral posts and document who creates and shares misleading content.
Why the aggro? In part because some of the disinformation campaigns tracked by these groups have aligned with conservative positions.
Take the Covid-19 pandemic. A number of Republican leaders and influencers, including Donald Trump himself, questioned many of the social-distancing measures and mask mandates, and fueled the vaccine skepticism that became part of conservative messaging. When disinformation groups called on social media platforms to remove posts with Covid misinformation (which the platforms did), Republicans saw that as a partisan attack. When the groups did the same with posts about the “stolen” 2020 election, that was seen as yet another attack on conservatives.
The tech giants have attracted some ire for this, but it’s the small disinformation groups that are most vulnerable, especially if Trump gets voted in on Nov. 5. There has already been a noticeable decline in their research output over the past year — hence the lack of information about how the FEMA rumors were spreading. They’re too busy defending against lawsuits.
A standout example was the unwinding in June of the Stanford Internet Observatory, founded in 2019 by Alex Stamos, the former chief security officer of Facebook, out of frustration that the social network wasn’t more transparent about Russian influence operations on its platform during the 2016 US presidential election. His new group went on to uncover large networks of fake Facebook accounts being used to warp political discourse. But that work came with a price.
The Observatory found itself having to pay millions of dollars in lawyers’ fees to defend itself against several lawsuits; one 2023 suit from Trump adviser Stephen Miller claimed that the Observatory and other research groups “conspired with the federal government to conduct a mass surveillance and censorship operation targeting the political speech of millions of Americans.” (Stanford University denied in June that the group had been dismantled but admitted its founding grants would “soon be exhausted.” It didn’t respond to a request for comment.)
Lawsuits, congressional subpoenas and probes have hit similar organizations: Graphika, the University of Washington Disinformation Lab, the Atlantic Council’s Digital Forensic Research Lab, the Global Disinformation Index, NewsGuard, the Institute for Strategic Dialogue and the Center for Countering Digital Hate.
The latter is fighting a lawsuit from Musk over a report it published in September 2023, which claimed Musk’s X was profiting from neo-Nazi accounts. Musk has also sued Media Matters, a liberal media watchdog group, for reporting in November 2023 that ads from major brands on X appeared next to Nazi-related posts, a case that is still ongoing.
Even some government initiatives have been targeted, including the State Department’s Global Engagement Center, which tackled foreign disinformation but now faces a shutdown.
Shining a spotlight on how disinformation spreads isn’t illegal, yet these groups’ critics have dubbed them a “censorship industrial complex,” a sentiment that plays dangerously into Trump’s comments about Americans having an “enemy within.”
Trump has pledged to “shatter the left-wing censorship regime” if reelected, while the Heritage Foundation’s Project 2025 proposes ending all government funding for disinformation research. Doing so would leave America more vulnerable to manipulation and confusion, particularly at a time when social media firms have, partly in response to the growing pressure, cut back on their trust and safety teams and closed off access to researchers, most notably with Facebook’s shutdown in August of its trend-monitoring tool CrowdTangle.
In early October, the head of the US intelligence community warned of a serious threat from foreign actors, including Russia, Iran and China, aimed at “undermining trust” in polls and the US democratic process, ostensibly through social media.
The coming election is set to be one of the closest in decades, inviting a wave of new conspiracy theories about a rigged vote. Calling disinformation research “censorship” erodes the already-scant checks and balances we have on large technology platforms. It leaves Americans more exposed to the next storm.