The enduring allure of conspiracies
Conspiracy theories seem to meet psychological needs and can be almost impossible to eradicate. One remedy: Keep them from taking root in the first place.
The United States of America was founded on a conspiracy theory. In the lead-up to the War of Independence, revolutionaries argued that a tax on tea or stamps was not just a tax, but the opening gambit in a sinister plot of oppression. The signers of the Declaration of Independence were convinced — based on “a long train of abuses and usurpations” — that the king of Great Britain was conspiring to establish “an absolute Tyranny” over the colonies.
“The document itself is a written conspiracy theory,” says Nancy Rosenblum, a political theorist emerita at Harvard University. It suggests that there’s more going on than meets the eye, that someone with bad intentions is working behind the scenes.
If conspiracy theories are as old as politics, they’re also — in the era of Donald Trump and QAnon — as current as the latest headlines. Earlier this month, the American democracy born of an eighteenth-century conspiracy theory faced its most severe threat yet — from another conspiracy theory: that, all evidence to the contrary, the 2020 presidential election was rigged. Are conspiracy theories truly more prevalent and influential today, or does it just seem that way?
The research isn’t clear. Rosenblum and others see evidence that belief in conspiracy theories is increasing and taking dangerous new forms. Others disagree. But scholars generally do agree that conspiracy theories have always existed and always will. They tap into basic aspects of human cognition and psychology, which may help explain why they take hold so easily — and why they’re seemingly impossible to kill.
Once someone has fully bought into a conspiracy theory, “there’s very little research that actually shows you can come back from that,” says Sander van der Linden, a social psychologist at the University of Cambridge whose research focuses on ways to combat misinformation. “When it comes to conspiracy theories, prevention is better than cure.”
When Joseph Uscinski began studying conspiracy theories a decade ago, he was one of only a handful of scholars — mostly psychologists and political scientists — interested in the topic. “No one cared about this at the time,” says Uscinski, a political scientist at the University of Miami in Florida. American Conspiracy Theories, the 2014 book he cowrote with political scientist Joseph Parent, became a landmark in conspiracy theory research.
To investigate how conspiracy beliefs have changed with time, Uscinski, Parent and a small army of research assistants analyzed more than 100,000 letters to the editors of the New York Times printed between 1890 and 2010. Among these, they identified 875 letters that dabbled in conspiracy talk — that some group was acting in secret to steal power, or bury the truth, or reap some other benefit at the expense of the common good.
Many of the letters alleged geopolitical conspiracies: In 1890, it was England and Canada conspiring to take back territory from the United States, and in 1906, Japan was supposedly sending soldiers in disguise to prepare to seize Hawaii. Others focused on domestic political conspiracies, such as President Harry Truman covering up Communist infiltration of the government in the 1950s, and the idea that the 9/11 attacks were coordinated by the US to smear the Saudis. Still others were just bizarre, such as a 1973 letter claiming that lesbianism is a CIA-inspired plot.
When Uscinski and Parent graphed the prevalence of such newspaper conspiracy-theory letters between 1890 and 2010, the result was a very jagged line that showed, if anything, a slight downward trend over time (the most prominent peak marks McCarthyism and the Red Scare of the early 1950s). More recent polling research by Uscinski suggests that this overall picture has remained the same — with belief in specific conspiracy theories rising and falling over time, but no evidence for an overall increase. “The general hypothesis that’s put out there in the media is [that] everyone’s becoming conspiracists, and now is the golden age of conspiracy theory,” Uscinski says. “We find no such thing whatsoever.”
Uscinski’s research suggests that conspiracy thinking is more or less evenly distributed across the political spectrum, with Democrats becoming more vocal about conspiracy theories when Republicans are in power, and vice versa. Democrats tend to be suspicious of corporations and conservatives. Republicans are more likely to be suspicious of communists and liberals. In a chapter memorably titled “Conspiracy Theories Are for Losers,” Uscinski and Parent write that conspiracies are a way for those who’ve lost or lack power to explain their losses, channel their anger, close ranks and regroup.
During his presidency, Donald Trump was the exception that proves the rule, Uscinski says. It’s not easy for one of the most powerful people in the world to claim they’re the victim of a conspiracy (it didn’t work, for instance, when allies of Bill Clinton blamed a “vast right-wing conspiracy” for the president’s troubles during his impeachment trial in the late 1990s). Trump, however, cast himself as a political outsider from the beginning, Uscinski says: “He sets himself up, not only as a victim of the other side, but of both parties and the entire system and what he calls the deep state … so everything is rigged against him.” The Russia inquiry and his 2019 impeachment, Uscinski adds, helped to feed this narrative, which has continued through the chaotic aftermath of the 2020 election.
A new — and dangerous — form
Rosenblum argues that Trump epitomizes a new type of “conspiracy without theory” that relies on sheer assertion and repetition rather than evidence and reason. (Rosenblum is coeditor of the Annual Review of Political Science.) Trump’s baseless tweets that the election was rigged, she says, stand in contrast to Kennedy assassination conspiracists obsessing over bullet trajectories or 9/11 conspiracists diving into data on the temperature at which jet fuel burns. “This conspiracy thinking that’s going on today takes a very different and novel and dangerous form,” she says, because it seeks to delegitimize political rivals, government agencies, the press and others who might stand in the way. “It unsettles the ground on which we argue, negotiate, and even disagree,” she and coauthor Russell Muirhead wrote in their 2019 book, A Lot of People Are Saying: The New Conspiracism and the Assault on Democracy. “It makes democracy unworkable — and ultimately, it makes democracy seem unworthy.”
One of the most influential ideas in conspiracy theory scholarship is that people who identify themselves as politically conservative are more likely to believe in conspiracy theories. In a widely cited 1964 essay in Harper’s Magazine, Columbia University historian Richard Hofstadter argued that a “paranoid style” runs through conservative political movements of the twentieth century, fed by distrust of “cosmopolitans and intellectuals.” Uscinski says his polling research finds no evidence that conservatives are more prone to believe in conspiracy theories than liberals, but other researchers still think there’s something to this idea.
In a recent series of studies, van der Linden and colleagues conducted online surveys of more than 5,000 Americans from across the political spectrum, asking them to rate their political preferences and respond to questions that were developed by psychologists to measure conspiratorial thinking and paranoia. One survey item, for example, asked participants to rate on a scale of 0 to 100 their agreement with the statement: “I think that events which superficially seem to lack a connection are often the result of secret activities.”
People at both extremes of the political spectrum were more prone to conspiracy thinking than those in the middle, but conservatives tended to be more conspiratorial than liberals, the researchers reported in Political Psychology last year. “We think this is convincing evidence … of these differences between liberals and conservatives,” van der Linden says. “I wouldn’t say it’s a large effect, but it wasn’t tiny, either.”
This difference, he thinks, may be rooted in group psychology. “There’s a lot of research that shows that, whereas the liberals are a bit more extroverted and rebellious and so on, conservatives tend to be focused on managing uncertainty and threat and in-group values,” he says. Conspiracy theories are one way to make sense of events that seem overwhelming and may feel as though they threaten the groups and values that people most identify with, he says. “It’s definitely a mechanism to try to restore a sense of agency and control over the narrative.”
Van der Linden is quick to note, however, that liberals aren’t immune from conspiracy thinking. Conspiracy theories about technology seem more popular among liberals, for example, including ones involving pharmaceutical companies and genetically modified crops.
One reason that conspiracy theories find fertile ground in the human mind has to do with epistemology — the philosophy of how we know what we know (or think we do). Because any individual can know only a tiny sliver of the world firsthand, we have no choice but to accept a great deal of information we can’t verify for ourselves. Most people believe (correctly) that Antarctica is very cold and populated with penguins, despite never having been there. The assumptions and cognitive shortcuts we use to decide what’s true make sense most of the time, but they also leave the door open for bad information, including conspiracy theories.
Since most of the information we encounter in everyday life (at least outside of social media) is true, that creates a bias toward accepting new information, says Nadia Brashier, a psychologist and neuroscientist at Harvard. And hearing a claim multiple times makes it seem even more true. “One of the most insidious influences on our judgment involves repetition,” Brashier says.
Dozens of studies have documented this “illusory truth effect,” mainly by asking participants to rate the veracity of trivia, rumors, product claims, fake news reports and other bits of information, Brashier and Duke University psychologist and neuroscientist Elizabeth Marsh write in a recent Annual Review of Psychology paper about how people determine what’s true. Even people who recognize a statement as false the first time they see it are more likely to judge it as probably true after seeing it multiple times, Brashier says.
Ordinarily, it’s rational to assume that the more times you hear something, the more likely it is to be true, she says. “But we’re seeing bad actors hijack these shortcuts that we use that make sense in a lot of situations [but] that can lead us astray in others.”
Conspiracy theories also take advantage of our tendency to look for patterns and explanations, says Karen Douglas, a psychologist who studies conspiracy thinking at the University of Kent in the United Kingdom. Pattern detection serves us well in everyday life, Douglas says: It’s how we piece together how people typically behave in given situations, for example. Believing in a bogus conspiracy theory amounts to seeing a pattern that’s not really there.
In a 2018 paper, Douglas and colleagues recruited hundreds of volunteers online and quizzed them about their belief in various conspiracy theories, some well-known ones and some invented by the researchers. Participants who agreed more strongly with a sample of well-known conspiracy theories were more likely than others to also see meaningful patterns in a series of random coin tosses and in the chaotic, splotchy paintings of abstract expressionist artist Jackson Pollock. “It seems that seeing patterns in random phenomena such as coin tosses and abstract paintings relates to the tendency to see patterns in political and social events that are happening in the world,” Douglas says.
Such studies reveal a human tendency to attribute events to the intentional actions of others rather than to pure chance, Douglas says. Work by others has shown that we also tend to assume that when something huge happens, something huge must have caused it. This also feeds into conspiracy thinking, Douglas says. The assassination of John F. Kennedy was too momentous an event to have been pulled off by a lone gunman, conspiracists argue. Surely the US government was involved — or the KGB, or the Mafia.
Social and emotional factors are likely at play as well. “People are most susceptible to conspiracy theories when particular psychological needs are frustrated,” Douglas says. “Specifically, people need knowledge and certainty to feel safe, secure and in control, and to feel good about themselves and the social groups they belong to.” When these needs are unmet — say, amidst the fear and uncertainty of a global pandemic — conspiracy theories might seem to offer consolation, Douglas says.
But her research suggests that they might actually do the opposite. “Reading about conspiracy theories, instead of making people feel more powerful, makes people feel less powerful,” she says. It may even make people less likely to take actions that would give them more control over their situation. In experiments where volunteers read about conspiracy theories before answering questionnaires about their likelihood to engage in various behaviors, Douglas and others have found evidence that conspiracy theories reduce people’s inclination to vote, to vaccinate their children, or to help fight climate change. The people in such studies also express greater prejudice and a greater inclination to commit petty crime, at least in their responses to researchers.
“Our reasoning is that if people perceive that others are conspiring and doing antisocial things, then it seems OK for people to do these things too,” Douglas says. “Also, if they feel that the world is run by a select few and that everything is determined, why bother to go out and vote or engage with a corrupt system?” She adds, however, that more work is needed to determine whether these responses in conspiracy belief studies actually translate to antisocial behaviors in the real world.
Talking a true believer out of their belief in a conspiracy can be nearly impossible. (The believer will assume you’re hopelessly naïve or, worse, part of the cover-up.) Even when conspiracy theories make bold predictions that don’t come true, such as QAnon’s claim that Trump would win reelection, followers twist themselves in logical knots to cling to their core beliefs. “These beliefs are important to people, and letting them go means letting go of something important that has determined the way they see the world for some time,” Douglas says.
As a result, some researchers think that preventing conspiracy theories from taking hold in the first place is a better strategy than fact-checking and debunking them after they do — and they have been hard at work developing and testing such strategies. Van der Linden sees inoculation as a useful metaphor here. “I think one of the best solutions we have is to actually inject people with a weakened dose of the conspiracy … to help people build up mental or cognitive antibodies,” he says.
One way he and his colleagues have been trying to do that (no needles required) is by developing online games and apps. In a game called Bad News, for example, players assume the role of a fake news creator trying to attract followers and evolve from a social media nobody into the head of a fake-news empire. The 15-minute game is meant to teach people how fake news spreads so that they can recognize it more readily. (In one of the activities, players create and promote their own conspiracy theory.)
To assess the game’s effects, van der Linden and colleagues recruited more than 14,000 people to play Bad News. Before and after playing, participants were asked to identify misinformation within a selection of real and made-up tweets and headlines. Playing the game improved players’ resistance to fake news, the researchers reported in 2019: When presented with dubious tweets and news headlines, they were more likely to rate them as unreliable. The researchers termed the improvement “small to moderate.” A follow-up study found that it persisted for at least three months after the game was played.
More recently, the researchers created a game based on Bad News that specifically tackles conspiracies and other misinformation related to Covid-19. Called Go Viral!, it was developed with support from the UK government and released in October. The World Health Organization and the United Nations have promoted the game as a resource for fighting misinformation, “so that we can hopefully reach millions of people around the world,” van der Linden says.
Stopping the spread
The critical question — pushing the vaccine metaphor to its limits — is how to achieve herd immunity, the point at which enough of the population is immune so that conspiracy theories can’t go viral. It might be difficult to do that with games because they require people to take the time to engage, says Gordon Pennycook, a behavioral scientist at the University of Regina in Canada. So Pennycook has been working on interventions that he believes will be easier to scale up.
His research suggests that people are pretty good at spotting fake news, including bogus conspiracy theories — but that doesn’t mean they don’t share fake stuff on social media. “People are sharing headlines that they could identify as being false if they bothered to think about it,” he says.
To counter this, Pennycook and colleagues have been developing ways to nudge people to think more critically about the information they share without explicitly telling them to do so. In one recent study conducted online, they asked 856 volunteers to rate how likely they would be to share various Covid-19 news headlines — some true ones from credible sources, others that were bogus or debunked — if they saw them on social media. Before doing this, roughly half the participants were asked to rate the accuracy of a single, politically neutral headline unrelated to Covid-19 (one concerned a neutron star discovery, another Seinfeld coming to Netflix). Taking a moment to contemplate accuracy made participants nearly three times more discerning in what they decided to share, the researchers reported in Psychological Science last year.
Social media companies have started to deploy similar strategies: An example is Twitter’s recent rollout of a prompt that advises users to read an article before sharing it. Pennycook thinks that such moves are worthwhile. In a recent study, yet to be published, he and colleagues found that a 30-second video prompting people to think about accuracy cut viewers’ willingness to share fake news in half (at least as reported on a survey — the researchers weren’t able to track actual social media behavior).
Even as researchers push to develop such measures, they acknowledge that eradicating bogus conspiracy theories may not be possible. Conspiracy theories flourished as far back as the Roman Empire, and they inspired an angry mob to storm the US Capitol just last week. Specific theories may come and go, but the allure of conspiracy theories for people trying to make sense of events beyond their control seems more enduring. For better — and of late, very much for worse — they appear to be a permanent part of the human condition.