The following essay is reprinted with permission from The Conversation, an online publication covering the latest research.
We are increasingly aware of how misinformation can influence elections. About 73% of Americans report seeing misleading election news, and about half struggle to discern what is true or false.
When it comes to misinformation, “going viral” appears to be more than just a catchphrase. Scientists have found a close analogy between the spread of misinformation and the spread of viruses. In fact, how misinformation moves can be effectively described using mathematical models designed to simulate the spread of pathogens.
Concerns about misinformation are widespread, with a recent UN survey suggesting that 85% of people worldwide are worried about it.
These concerns are well founded. Foreign disinformation has grown in sophistication and reach since the 2016 US election. The 2024 election cycle saw dangerous conspiracy theories about “weather manipulation” undermining proper hurricane management, fake news about immigrants eating pets inciting violence against the Haitian community, and misleading election conspiracy theories amplified by the world’s richest man, Elon Musk.
Recent studies have used mathematical models derived from epidemiology (the study of how and why diseases occur in populations). These models were originally developed to study the spread of viruses, but they can be applied just as effectively to the spread of misinformation on social networks.
One class of epidemiological models that works for misinformation is the susceptible-infectious-recovered (SIR) model. These models simulate the dynamics between susceptible (S), infectious (I) and recovered or resistant (R) individuals.
These models are built from differential equations (which help mathematicians understand rates of change) and apply readily to the spread of misinformation. For example, on social media, false information is passed from person to person, some of whom become infected while others remain immune. Others act as asymptomatic vectors (disease carriers), spreading misinformation unknowingly and without being harmed by it themselves.
These models are especially useful because they allow us to predict and simulate population dynamics and to derive measures such as the basic reproduction number (R0), the average number of new cases generated by a single “infected” individual.
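To make this concrete, here is a minimal Python sketch of an SIR model applied to a spreading rumour. The parameter values are illustrative assumptions, not estimates from any study:

```python
# Minimal SIR sketch for a spreading rumour. All parameters are
# illustrative assumptions, not fitted values from any study.

beta = 0.5   # transmission rate: new "infections" per infectious sharer per day (assumed)
gamma = 0.2  # recovery rate: 1 / average number of days spent sharing (assumed)
r0 = beta / gamma  # basic reproduction number; > 1 means the rumour can take off

s, i, r = 0.99, 0.01, 0.0  # population fractions: susceptible, infectious, recovered
dt = 0.1                   # time step (days) for a simple Euler integration

for _ in range(int(60 / dt)):           # simulate 60 days
    new_infections = beta * s * i * dt  # dS/dt = -beta * S * I
    recoveries = gamma * i * dt         # dR/dt = gamma * I
    s -= new_infections
    i += new_infections - recoveries
    r += recoveries

print(f"R0 = {r0:.1f}; day 60: {i:.1%} still spreading, {r:.1%} recovered/resistant")
```

With these assumed values, R0 = 2.5: each sharer passes the falsehood to 2.5 others on average, so it spreads epidemically.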
As a result, there has been growing interest in applying such epidemiological approaches to our information ecosystem. Most social media platforms have an estimated R0 greater than 1, indicating that these platforms have the potential to spread misinformation epidemically.
Looking for solutions
Mathematical modeling usually involves either phenomenological research (in which researchers describe observed patterns) or mechanistic work (which involves making predictions based on known relationships). These models are particularly useful because they allow us to explore how possible interventions might help reduce the spread of misinformation on social media.
We can illustrate this process with the simple model shown in the graph below, which lets us analyze how the system might evolve under a range of hypothetical assumptions that can then be verified.
Prominent social media figures with large followings can become “superspreaders” of election disinformation, blasting falsehoods to potentially hundreds of millions of people. This reflects the current situation, in which election officials report being outmatched in their attempts to fact-check the information.
In our model, if we conservatively assume that people have only a 10% chance of infection after exposure, debunking the misinformation has only a small effect, according to simulations. Under the 10% chance of infection scenario, the population infected by election misinformation grows rapidly (orange line, left panel).

A “compartment” model of the spread of misinformation through a cohort of users over one week, where misinformation has a 10% chance of infecting a susceptible, unvaccinated individual on exposure. Debunking efficiency is assumed to be 5%. If prebunking is introduced and is twice as effective as debunking, the dynamics of the contagion change significantly.
Sander van der Linden/David Robert Grimes
A psychological “vaccine”
The analogy between the viral spread of disinformation and that of a pathogen is apt precisely because it allows scientists to simulate ways of combating that spread. These interventions include an approach called “psychological inoculation,” also known as prebunking.
This is when researchers preemptively introduce a falsehood and then refute it, so that people build up immunity to misinformation in the future. It is similar to vaccination, in which people are given a (weakened) dose of the virus to prime their immune systems against future exposure.
For example, a recent study used AI chatbots to prebunk common election fraud myths. This involved warning people in advance that political operatives can manipulate their opinion with sensational stories, such as the false claim that “massive ballot dumping is overturning elections overnight”, along with key tips on spotting such misleading rumours. These “inoculations” can be incorporated into population models of the spread of misinformation.
You can see in our graph that if prebunking is not used, it takes people much longer to build immunity to misinformation (left panel, orange line). The right panel shows how, if prebunking is deployed at scale, it can contain the number of people who become misinformed (orange line).
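The sketch below mirrors the compartment structure of the figure using the caption’s headline parameters (a 10% infection chance per exposure, 5% debunking efficiency, prebunking twice as effective as debunking); the contact rate and the exact equations are illustrative assumptions, not the published model:

```python
# Toy compartment model echoing the figure: susceptible (s), infected (i),
# resistant (r). Parameters from the caption: 10% infection chance per
# exposure and 5% debunking efficiency; prebunking is twice as effective.
# The contact rate and functional form are assumptions for illustration.

def simulate(prebunk_rate: float, days: int = 7, dt: float = 0.01) -> float:
    s, i, r = 0.99, 0.01, 0.0
    contacts_per_day = 10   # assumed exposures per person per day
    p_infect = 0.10         # 10% chance a single exposure infects
    debunk_rate = 0.05      # 5% of infected users debunked per day
    for _ in range(int(days / dt)):
        infections = contacts_per_day * p_infect * s * i * dt
        debunked = debunk_rate * i * dt
        prebunked = prebunk_rate * s * dt  # susceptibles inoculated per day
        s -= infections + prebunked
        i += infections - debunked
        r += debunked + prebunked
    return i

print(f"without prebunking: {simulate(0.0):.1%} misinformed after a week")
print(f"with prebunking:    {simulate(0.10):.1%} misinformed after a week")
```

Raising the prebunking rate further flattens the curve of misinformed users, matching the qualitative pattern described in the figure.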
The purpose of these models is not to make the problem seem scary or to suggest that people are gullible disease vectors. But there is clear evidence that some fake news does spread like a simple contagion, infecting users immediately.
Meanwhile, other stories play out more like a complex contagion, where people need repeated exposure to misleading sources of information before they become “infected.”
The fact that individual susceptibility to misinformation can vary does not negate the usefulness of approaches drawn from epidemiology. For example, the models can be adjusted depending on how easy or difficult it is for misinformation to “infect” different subpopulations.
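One way to capture that adjustment is a threshold rule in which different subpopulations need different numbers of exposures before becoming “infected.” The groups, thresholds and sharing probability in the sketch below are hypothetical, chosen only to illustrate the idea:

```python
# Hedged sketch of a complex contagion with heterogeneous susceptibility:
# each (hypothetical) subpopulation needs a different number of exposures
# to a rumour before becoming "infected". All values are illustrative.

import random

random.seed(1)

EXPOSURE_THRESHOLDS = {"highly susceptible": 1, "average": 3, "sceptical": 8}

def day_of_infection(group: str, p_exposed: float = 0.5, horizon: int = 30):
    """Return the day the group's exposure threshold is crossed, else None."""
    exposures = 0
    for day in range(1, horizon + 1):
        if random.random() < p_exposed:  # a contact shares the rumour today
            exposures += 1
        if exposures >= EXPOSURE_THRESHOLDS[group]:
            return day
    return None

for group in EXPOSURE_THRESHOLDS:
    day = day_of_infection(group)
    print(group, "->", f"infected on day {day}" if day else "stayed immune")
```

In the simple-contagion models above, a single exposure can suffice; here, the most sceptical group may never cross its threshold at all.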
While it may be psychologically uncomfortable for some to think of people this way, most misinformation is spread by a small number of highly influential superspreaders, just as with viruses.
Taking an epidemiological approach to analyzing fake news allows us to predict its spread and to model the effectiveness of interventions like prebunking.
Some recent work has validated this viral approach using social media dynamics from the 2020 US presidential election. That research suggests that a combination of interventions can be effective in reducing the spread of misinformation.
Models are never perfect. But if we want to stop the spread of misinformation, we need to understand how it spreads in order to deal effectively with its harms to society.
This article was originally published on The Conversation. Read the original article.