
Cognitive Bias and How We Vote

Published on 1/6/2024
Happy New Year! This may be an election year with more high drama than most. As various outlets report on polls, we will hear leads like, “It's the top issue for Americans, one that often decides presidential elections - the economy” (NPR), and such framing will work on what some brain scientists call our “lizard brain” - the limbic system, the part of the brain that taps into our instincts and emotions and can override the rational faculties we would otherwise use to challenge them.

Voting, it seems, “is not a mindful process. Wired for physical survival, our brains are vigilantly reactive to threat, social status, and ingroup-outgroup affiliations,” writes Melanie Greenberg of Psychology Today. “Humans are, at heart, and by history, social and territorial beings,” Greenberg adds, making us more socially influenced than we often realize.

We are vulnerable to cognitive biases at the intersection of our limbic system and our social nature. A cognitive bias is “the way a particular person understands events, facts, and other people, which is based on their own particular set of beliefs and experiences and may not be reasonable or accurate” (Cambridge Dictionary). Researchers also define cognitive biases as errors in rational thinking that affect our decisions. Because our ways of understanding have deep roots in our socialization and emotional lives, we can be susceptible to “the candidate that most effectively targets these aspects of our natures,” Greenberg notes.
Numerous studies point to at least 11 cognitive biases to be aware of, starting with two “top Cs”: confirmation bias and concision bias. Confirmation bias’s roots go back to ancient Greece; the historian Thucydides described it in The History of the Peloponnesian War as “a habit of mankind to entrust to careless hope what they long for and to use sovereign reason to thrust aside what they do not want.” In 1960, psychologist Peter Wason documented the phenomenon and gave it the name we use today.

We demonstrate confirmation bias when we favor or seek out information that affirms our pre-existing beliefs. It causes us to downplay the weight and amount of evidence contrary to those beliefs, or to reject, forget, or ignore that evidence altogether. Its influence has been confirmed in numerous studies, including a 1979 Stanford study on capital punishment in which researchers gave students with opposing views two studies built on fabricated data - one concluding that capital punishment deters crime, the other that it has no noticeable effect on deterrence. Participants were far more skeptical of whichever study contradicted their pre-existing opinions.

A more recent line of research documents the effect of the algorithms used to deliver content on the internet. Because those algorithms serve users more of the content they already prefer rather than balancing what they see, we tend to fall into “filter bubbles,” shielded from information that challenges our beliefs and fed information that confirms our existing opinions (The Decision Lab).

SIDEBAR: From 1949 to 2011, the FCC’s Fairness Doctrine required broadcasters to both “devote some of their programming to controversial issues of public importance and allow the airing of opposing views on those issues,” meaning that programs covering politics had to include opposing opinions on any topic discussed and to seek out the speakers who best represented the spectrum of views. “Additionally, the rule mandated that broadcasters alert anyone subject to a personal attack in their programming and give them a chance to respond, and required any broadcasters who endorse political candidates to invite other candidates to respond,” reported Dylan Matthews of the Washington Post shortly after the doctrine was eliminated. A number of experts attribute polarization in U.S. politics to the elimination of the Fairness Doctrine.

Concision bias “occurs when the focus is placed mainly on surface-level knowledge or only one small aspect of an issue. The rest of the conversation and nuance is lost because of the attention given to those minor aspects,” writes Shiksha Sharma for Callhub.io. In short, we get only talking points or soundbites without thoughtful pushback. Other documented biases work in much the same way, and they often interact.

The World Economic Forum and the Visual Capitalist educate voters about eight more interacting biases that affect our voting - authority bias, the Dunning-Kruger effect, availability cascade, halo effect, declinism, framing effect, groupthink, and false consensus.

With authority bias, we trust or are influenced by authority figures even on topics beyond their specialization - for instance, when a doctor trained in cardiothoracic medicine offers advice on epidemiology. Such experts may be skilled at reading medical journals and studies, but they lack the specialized knowledge to spot gaps in the reporting. They may also be as susceptible as any of us to the overconfidence known as the Dunning-Kruger effect: a reasoning error that makes us think we understand the nuances and depths of a complex topic precisely because our knowledge of it is limited.

1997’s Wag the Dog, starring Robert De Niro and Dustin Hoffman, illustrates the availability cascade. In the movie, political operatives scheme to distract voters from a candidate’s sex scandal by fabricating a war. It echoes legends of William Randolph Hearst’s yellow journalists instigating the Spanish-American War, but the operatives in Wag the Dog rely on constant hints at a non-existent bomber. What starts as “there is no such thing as the B-3 bomber” becomes “what about the B-3 bomber?” The idea accumulates credibility simply by being repeated as it spreads - the availability cascade in action.

Then there’s the halo effect, where what we think of a person makes us more or less inclined to believe them. As reality TV boomed in the early 2000s, The Apprentice became a massive money-maker for Donald Trump, who was on the financial brink. The faltering businessman had previously been the model for Back to the Future Part II’s Biff Tannen - the short-tempered, vindictive mogul who destroyed the marriage of Marty McFly’s parents and turned McFly’s mother into a miserable plastic-surgery trophy wife. The show’s portrayal of Trump as a decisive, successful executive lent him a halo of credibility.

Back to the Future was originally a nostalgia movie, released in a period when declinism was becoming a common appeal in political rhetoric. Declinism conjures an idealized past - the way we wish things had been. Back to the Future dared to jump back into the world of “Ozzie and Harriet” and “Leave It to Beaver” and show its flaws, with Marty McFly’s baby uncle in his barred playpen foreshadowing the man’s future imprisonment - and, by extension, our own imperfect present.

Declinism taps into negativity bias, in which the negative energizes the populace more than the positive. Fear, violence, disgust, and negativity are known to have far more immediate power over our “lizard brains,” triggering instant reactions. The fear of decline, the spectre of violence, and the disgust and discomfort that fuel negativity are easily tapped, according to Callhub.io, an organization that helps political candidates leverage human psychology to win campaigns.

“Here’s how you can leverage negativity bias,” Callhub.io’s blog educates. “Find loopholes and flaws in your opponent’s stance on issues. Run attack advertisements. Draw comparisons between their failures and your achievements in particular areas. For example, if they failed to be more empathetic to a situation you empathize with.”

Callhub.io also advises candidates that “people who are easily disgusted are more likely to describe themselves as politically conservative.” Their research reports that “people who rate high on disgust sensitivity were more likely to oppose LGBTQ+ folks. People lower on the disgust sensitivity scale were likelier to be more tolerant of differences.” They provide guidance on how to leverage disgust sensitivity: “Understand where your voter base falls on the disgust sensitivity scale average. Use this information to strategize campaigns that either trigger their disgust sensitivity in your favor or against your opponent. Use disgust sensitivity to sway swing voters. Test out campaigns aimed at swing voters in your constituency and understand how they respond to your messaging.” In short, they are teaching candidates to exploit the framing effect - the bias by which we draw conclusions based on how an idea is presented - and groupthink, the tendency of like-minded people to conform to a shared way of thinking about an issue or about other people.

Our struggle as human beings is to recognize that our rational minds do not always balance our social nature, our emotions, and our instincts. Aware of that, we can be open to the notion that we are shaped by which messages dominate our attention, which messages we never see, and how political operatives craft their messages. We can approach voting - our most democratic, patriotic act - with the gravity it requires, both educating ourselves with opposing viewpoints and staying alert to when we are being manipulated.