
Misinformation is more than just bad facts: How and why people spread rumors is key to understanding how false information travels and takes root

On Sept. 20, 2024, a newspaper in Montana reported an issue with ballots provided to overseas voters registered in the state: Kamala Harris was not on the ballot. Election officials were able to quickly remedy the problem but not before accusations began to spread online, primarily among Democrats, that the Republican secretary of state had purposefully left Harris off the ballot.

This false rumor emerged from a common pattern: Some people view evidence, such as good-faith errors in election administration, through a mindset that elections are untrustworthy or “rigged,” which leads them to misinterpret that evidence.

As the U.S. approaches another high-stakes and contentious election, concerns about the pervasive spread of falsehoods about election integrity are again front of mind. Some election experts worry that false claims may be mobilized – as they were in 2020 – into efforts to contest the election through tactics such as lawsuits, protests, disruptions to vote-counting and pressure on election officials to not certify the election.

Our team at the University of Washington has studied online rumors and misinformation for more than a decade. Since 2020, we have focused on rapid analysis of falsehoods about U.S. election administration, from sincere confusion about when and where to vote to intentional efforts to sow distrust in the process. Our motivations are to help quickly identify emerging rumors about election administration and analyze the dynamics of how these rumors take shape and spread online.

Over the course of this research, we have learned that, despite all the discussion of misinformation as a problem of bad facts, most misleading election rumors stem not from false or manipulated evidence but from misinterpretations and mischaracterizations of real evidence. In other words, the problem is not just bad facts but also faulty frames, or the mental structures people rely on to interpret those facts.

Misinformation may not be the best label for addressing the problem – it’s more an issue of how people make sense of the world, how that sensemaking process is shaped by social, political and informational dynamics, and how it begets rumors that can lead people to a false understanding of events.

There is a long history of research on rumors going back to World War II and earlier. From this perspective, rumors are unverified stories that spread through informal channels and serve informational, psychological and social purposes. We are applying this knowledge to the study of online falsehoods.

Though many rumors are false, some turn out to be true or partially true. Even when false, rumors can contain useful indications of real confusions or fears within a community.

Rumors can be seen as a natural byproduct of collective sensemaking – that is, efforts by groups of well-meaning people to make sense of uncertain and ambiguous information during dynamic events. But rumors can also emerge from propaganda and disinformation campaigns that lead people to misinterpret or mischaracterize their own and others’ experiences.

Prior research describes collective sensemaking as a process of interactions between evidence and frames. Evidence includes the things people see, read and hear in the world. Frames are the mental schemas that shape how people interpret that evidence.

The relationship between evidence and frames flows in two directions. When people encounter novel events or new evidence, they try to select the best frame from their mental filing cabinets. The selected frame then determines what evidence they focus on and what evidence they exclude in their interpretations. This evidence-frame view of collective sensemaking can help researchers understand rumors and disinformation.

Everyone has their own ways of interpreting events based on their unique experiences. But your frames are not yours alone. Frames are shaped, sometimes intentionally, by information from media, political leaders, communities, colleagues, friends, neighbors and family. Framing – the process of using, building, reinforcing, adapting, challenging and updating frames – can be a deliberate strategy of political communication.

Frames play a role in generating rumors, shaping how people interpret emerging events and novel evidence. False rumors occur when sensemaking goes awry, often due to people focusing on the wrong piece of evidence or applying the wrong frame. And disinformation, from this perspective, is the intentional manipulation of the sensemaking process, either by introducing false evidence or distorting the frames through which people interpret that evidence.

In 2020, we saw these dynamics at work in a rumor about Sharpie pens in Arizona. In the lead-up to the election, President Donald Trump and his allies repeatedly alleged that the election would be rigged – setting a powerful frame for his followers. When voters noted that the Sharpie pens provided by election officials were bleeding through their ballots, many interpreted their experiences through the frame of a “rigged election” and became concerned that their ballots would not be counted.

Some people shared those experiences online, where they were soon amplified and given meaning by others, including online influencers. Concerns and suspicions grew. Soon, members of Trump’s family were repeating false claims that the bleed-through was systematically disenfranchising Republican voters. The effect was circular and mutually reinforcing. The strategic frame inspired misinterpretations of evidence – real bleed-through falsely seen as affecting ballot counting – that were shared and amplified, strengthening the frame.

Collective sensemaking is increasingly taking place online, where it is profoundly shaped by social media platforms, from features such as repost and like buttons to algorithmic recommendations to the connections between accounts.

Not so long ago, many people hoped that the internet would democratize information flows by removing the historical gatekeepers of information and disrupting their ability to set the agenda – and the frames – of conversation. But the gatekeepers have not been erased; they have been replaced by a group of newsbrokering influencers who have risen, in part, by gaming the ways online platforms allocate attention.

Many of these influencers work by systematically seeking out and amplifying content that aligns with prevailing political frames set by elites in politics and media. This gives creators the incentive to produce content that resonates with those frames, because that content tends to be rewarded with attention, the primary commodity of social media.

These dynamics were at work in February 2024, when an aspiring creator produced a man-on-the-street video interviewing migrants to the U.S. The video was selectively edited and captioned to falsely suggest that it showed undocumented migrants planning to vote illegally in U.S. elections. It resonated with two prominent frames: the same rigged-election frame from 2020 and another that casts immigration as harmful to the U.S.

The video was shared across multiple platforms and exploded in views after being amplified by a series of accounts with large followings on X, formerly Twitter. X owner Elon Musk commented with an exclamation point on one post with the embedded video. The creator soon found himself on Fox News. He currently has hundreds of thousands of followers on TikTok and Instagram and continues to produce similar content.

Interactions between influencers and online audiences result in content that fits strategic frames. Emerging events provide new evidence that people can twist to fit prevailing frames, both intentionally and unintentionally. Rumors are the byproducts of this process, and online attention dynamics fuel their spread.

Heading into the 2024 election, false and misleading claims about election integrity remain widespread. Our team has tracked more than 100 distinct rumors since the beginning of September. The machinery for quickly converting perceived evidence from elections into widely shared rumors and conspiracy theories is increasingly well oiled.

One concerning development is an increase in so-called election integrity organizations that recruit volunteers who share the rigged-election frame. These groups aim to equip volunteers with tools that streamline the collection and amplification of evidence supporting that frame.

One worry is that these volunteers may misinterpret what they see and hear on Election Day, generating additional rumors and false claims about election integrity that reinforce that increasingly distorted frame. Another is that these false claims will feed lawsuits and other attempts to contest election results.

However, we hope that by shedding light on some of these dynamics, we can help researchers, journalists, election officials and other decision-makers better diagnose and respond to rumors about election integrity in this cycle. Most importantly, we believe that this collective sensemaking lens can help us all to both empathize with well-meaning people who get caught up in sharing false rumors and see how propagandists manipulate these processes for their gain.

This article is republished from The Conversation, a nonprofit, independent news organization bringing you facts and trustworthy analysis to help you make sense of our complex world. It was written by Kate Starbird and Stephen Prochaska, University of Washington.

Read more:
7 ways to avoid becoming a misinformation superspreader when the news is shocking
Online rumors sparked by the Trump assassination attempt spread rapidly, on both ends of the political spectrum
Disinformation campaigns are murky blends of truth, lies and sincere beliefs – lessons from the pandemic

Kate Starbird receives funding from the National Science Foundation, Knight Foundation, Hewlett Foundation, and Craig Newmark Philanthropies.

Stephen Prochaska has received funding from the National Science Foundation, Knight Foundation, and Hewlett Foundation.
