Voters with a 'party-over-reality' bias may play a decisive role in the election
Geoffrey Cohen is the author of “Belonging: The Science of Creating Connection and Bridging Divides,” a professor of organizational studies in education and business and a professor of psychology at Stanford University. Michael Schwalbe is a postdoctoral scholar at Stanford’s department of psychology.
Donald Trump’s outrageous claims about immigrants, election fraud and more seem absurd to many of us, especially on the left — outright fabrications that no reasonable person could believe. But new research conducted with our colleagues reveals something unsettling: Gullibility and delusion are not confined to Trump’s supporters, his opponents or any other group. Confronting the assault on truth begins with recognizing that everyone, including the educated and well-informed, can fall prey to misinformation. This is especially important now because voters’ incorrect beliefs may play a decisive role in the election.
That’s why we have to be aware of the power of misinformation and know how to combat it. But our research suggests that our understanding of the assault on truth — and of strategies to counter it — is hindered by three blind spots.
The first blind spot, surprising as it may seem, is doubt about the extent of misinformation’s effects on the general population. Many scholars believe the problem is relatively minor and that most errors in judgment are logical mistakes unmotivated by partisanship. If that were so, misinformation would do little to deepen political divides.
Our research tells a different story.
We presented true and fake news stories to American voters that either supported or challenged their political allegiances. We found a stark party-over-reality bias: Participants were more than twice as likely to believe and share inaccurate stories that supported their political views as they were to believe and share factually accurate news that challenged their ideologies. This bias persisted even when the headlines were blatantly false. For example, conservative voters were more willing to accept the fabricated story “Donald Trump ‘Serious Contender’ for the Nobel Prize in Economics,” while liberal voters were more likely to accept an invented story with the headline “Trump Attended Private Halloween Gala with Sex Orgies Dressed as Pope.” Political allegiance overshadowed the truth.
The second blind spot: We are blind not just to the power of misinformation but also to its broad appeal. Many of us tend to believe that others are more credulous because of their partisan leanings or lack of education or intelligence. However, our research shows that anyone, regardless of party affiliation, education level or cognitive ability, can easily fall victim to misinformation. Even people with advanced degrees and strong reasoning skills exhibited a party-over-reality bias. In fact, participants who excelled at reasoning often used that skill selectively, scrutinizing false stories only when they contradicted their political beliefs. When the misinformation aligned with their views — such as supporting their preferred presidential candidate — they shut down their critical thinking and accepted falsehood as truth.
The third blind spot is the misconception that the assault on truth arises solely from external misinformation. Many wrongly believe the issue could be resolved by controlling the flow of misinformation through fact-checking and establishing policies that would curb fake news. While these measures can help, they are insufficient because our own minds also contribute to the problem. Even if all misinformation from traditional and social media were eliminated, our cognitive filters would still lead us to resist truths that challenge our beliefs.
Indeed, our study found that the tendency to disbelieve and avoid sharing accurate news that contradicts our political views was more powerful than the tendency to accept and promote fake news that confirms our opinions. In other words, the problem isn’t just belief in misinformation. It’s resistance to truth.
This means that the problem goes beyond the supply of lies. It also comes from our willingness to believe them — and our reluctance to accept inconvenient truths. We often seek news that reassures us that we are right, and this need for validation is at the root of our own contributions to the misinformation machine.
So what can be done? Intellectual humility is one antidote. The small number of respondents in our study who prioritized truth over politics were more likely to acknowledge that their political side was just as vulnerable to misinformation and propaganda as the opposing side. Recognizing this danger seemed to allow them to question their perceptions and check their biases. Our research also found that those who prioritized truth consumed less politically one-sided media.
The real divide appears to be between those who believe they know the truth and those who remain open to the possibility that they might be mistaken. To address our role in the problem, we can encourage people to become critical consumers of media, beginning with the practice of being critical of their own thinking. A key part of this is diversifying our news consumption and disconnecting from the media echo chamber.
Another solution is to cultivate community. When people feel connected to each other in ways outside of partisanship, they are less likely to accept false political narratives, even those that confirm their beliefs, and are more open to information that challenges long-held ideas.
It’s ironic that shared needs — for certainty and tribal connection — separate us. Recognizing and addressing these needs and the biases they trigger will help us bridge the divides that our own minds conspire to create.