    I study disinformation. This election will be grim.


    By Renee DiResta

    NEW YORK: In 2020, the Stanford Internet Observatory, where I was until recently the research director, helped lead a project that studied election rumors and disinformation. As part of that work, we frequently encountered conspiratorial thinking from Americans who had been told the 2020 presidential election was going to be stolen.

The way theories of “the steal” went viral was eerily routine. First, an image or video, such as a photo of a suitcase near a polling place, was posted as evidence of wrongdoing. The poster would tweet the purported evidence, tagging partisan influencers or media accounts with large followings. Those accounts would promote the rumor, often claiming, “Big if true!” Others would join in, and the algorithms would push it out to potentially millions more. Partisan media would follow.

If the rumor was found to be false, as it usually was, corrections were rarely made, and even then they were little noticed. The belief that “the steal” was real led directly to the events of Jan. 6, 2021.

Within a couple of years, the same online rumor mill turned its attention to us, the very researchers who had documented it. This spells trouble for the 2024 election.

    For us, it started with claims that our work was a plot to censor the right. The first came from a blog related to the Foundation for Freedom Online, the project of a man who said he “ran cyber” at the State Department. This person, an alt-right YouTube personality who’d gone by the handle Frame Game, had been employed by the State Department for just a couple of months.

    Using his brief affiliation as a marker of authority, he wrote blog posts styled as research reports contending that our project, the Election Integrity Partnership, had pushed social media networks to censor 22 million tweets. He had no firsthand evidence of any censorship, however: his number was based on a simple tally of viral election rumors that we’d counted and published in a report after the election was over. Right-wing media outlets and influencers nonetheless called it evidence of a plot to steal the election, and their followers followed suit.

Here’s what we actually did: Teams of student analysts identified social media posts that potentially misled the public about voting procedures or tried to delegitimize the outcome of an election. Sometimes a nonprofit clearinghouse that included state and local election officials shared with us posts that concerned them. In some cases, if a post we examined appeared to be going viral and appeared to violate a social media platform’s election policies, we let the companies know. Most of the time, the platforms took no action; when they did act, it was primarily to label the post as disputed or to attach a fact check.

The real impact of the rumors about us came offline. After the House flipped to Republican control in 2022, the investigations began. The “22 million tweets” claim was entered into the congressional record by witnesses during a March 2023 hearing of a House Judiciary subcommittee. Two Republican members of the subcommittee, Jim Jordan and Dan Bishop, sent letters demanding our correspondence with the executive branch and with technology companies as part of an investigation into our role in a Biden “censorship regime.” Subpoenas soon followed, and the investigations eventually expanded to requests that our staff submit to closed-door, video-recorded testimony. That included students who worked on the project.

    It was obvious to us what would happen next: The documents we turned over would be leaked and sentences cherry-picked to fit a pre-existing narrative. This supposed evidence would be fodder for hyperpartisan influencers, and the process would begin again. Indeed, this is precisely what happened, albeit with a wrinkle. Material the subcommittee obtained under subpoena or in closed-door hearings ended up in the hands of a right-wing group that had sued us, which was led by Mr. Jordan’s longtime ideological ally Stephen Miller. We do not know how.

This brings us to the present, when another election looms. The 2024 rematch is already being viciously fought. Since 2020, the technological landscape has shifted. There are new social media platforms in the mix, such as Bluesky, Threads and Truth Social. Election integrity policies and enforcement priorities are in flux at some of the biggest platforms. What used to be Twitter is under new ownership, and most of the team that focused on trust and safety has been let go.

Fake audio generated by artificial intelligence has already been deployed in a European election, and A.I.-powered chatbots are posting on social media platforms. Overseas players continue to run influence operations to interfere in American politics; in recent weeks, OpenAI has confirmed that Russia, China and others have begun to use generative text tools to improve the quality and quantity of their efforts.

Offline, trust in institutions, government, media and fellow citizens is at or near record lows, and polarization continues to increase. Election officials are concerned about the safety of poll workers and election administrators, perhaps the most terrible illustration of the cost that lies impose on our politics.

As we enter the final stretch of the 2024 campaign, it is not other countries that are likely to have the greatest impact. Rather, it will once again be the domestic rumor mill. The networks spreading misleading notions remain stronger than ever, while the networks of researchers and observers who worked to counter them are being dismantled.

    Universities and institutions have struggled to understand and adapt to lies about their work, often remaining silent and allowing false claims to ossify. Lies about academic projects are now matters of established fact within bespoke partisan realities.

Costs, both financial and psychological, have mounted. Stanford is refocusing the work of the Observatory and has ended the Election Integrity Partnership’s rapid-response election observation work. Employees, including me, did not have their contracts renewed.

    This is disappointing, though not entirely surprising. The investigations have led to threats and sustained harassment for researchers who find themselves the focus of congressional attention. Misleading media claims have put students in the position of facing retribution for an academic research project. Even technology companies no longer appear to be acting together to disrupt election influence operations by foreign countries on their platforms.

    Republican members of the House Judiciary subcommittee reacted to the Stanford news by saying their “robust oversight” over the center had resulted in a “big win” for free speech. This is an alarming statement for government officials to make about a private research institution with First Amendment rights.

    The work of studying election delegitimization and supporting election officials is more important than ever. It is crucial that we not only stand resolute but speak out forcefully against intimidation tactics intended to silence us and discredit academic research. We cannot allow fear to undermine our commitment to safeguarding the democratic process.
