I’m Right Because I Believe I Am: How To Combat Confirmation Bias

We spend too much time looking for information that confirms what we think. It's time to let contending ideas flourish.

When faced with contradictory evidence, our urge to escape the discomfort of holding conflicting ideas – cognitive dissonance – and our habit of reasoning towards the conclusions we would prefer – motivated reasoning – produce a very strong tendency to do little more than seek confirmation of our existing beliefs.

This is the basis of one of our most powerful mental biases. If I had to single out the aspect of our mental processes that most deserves attention for its potential to lead us into error, more than half a century of research suggests that this one would be a prime candidate.

The name given to it is “confirmation bias”, and it is inseparable from cognitive dissonance and motivated reasoning.

Confirmation bias refers to the tendency of decision makers to actively seek out and assign more weight to evidence that confirms some pre-existing expectation or hypothesis or that justifies their decisions, while at the same time ignoring or underweighting information that could disconfirm their ideas.

Its purpose is to frame the world so that it makes sense to us, but the price we pay is that we find it easier to believe propositions we would like to be true than those we would prefer to be false. As a result, we are very often wrong while remaining completely oblivious to the fact.

There are three ways in which our confirmation bias can distort even the best decision-making process:

· Selection bias: It skews our choices about what evidence is relevant, restricting the information that gets selected for evaluation.

· Interpretation bias: It leads to a biased interpretation of the information that does get selected.

· Hindsight bias: It distorts our memory, so that we recall events as more predictable, and our past views as more consistent with the evidence, than they really were.

To examine selection bias, researchers from Ohio State University secretly recorded how long participants spent reading articles in an online forum(1). The articles offered opposing views on a range of subjects, and participants spent, on average, 36% more time reading articles whose perspective they agreed with.

This selection bias becomes more polarised as people become more committed to their beliefs. An analysis by Valdis Krebs of political book-buying patterns during the 2008 United States presidential election campaign provides a great example(2).

In the five years that he had been analysing these patterns there had always been some books that were purchased by supporters of both parties.

However, as polling day approached, the overlap became smaller and smaller until, by October 2008, for the first time in several years of research, Krebs found no overlap – not a single book that appealed to the mindsets of both sides!

This mirrored the polarisation and animosity in the campaign rallies of the opponents. People who already supported Obama bought books that had a positive message about him, and vice versa. Since then, we have seen Donald Trump’s shock victory in a race that was even more intense and vitriolic.

It is hardly surprising, given what we know about confirmation bias, that the 14.2 million Donald Trump supporters who signed up to receive his unfiltered message via Twitter were entirely uninterested in attempts by his opponents to discredit him.

[Image caption: Donald Trump's opponents failed to convince his supporters to turn on him]

Such criticism would only appeal to those who already supported Hillary Clinton.

Given this awareness, I hope it is starting to become obvious why “birds of a feather flock together”. Think about it: how often do you buy a newspaper or book that you disagree with?

The classic study on how confirmation bias affects our interpretation of new information was done in 1979 by researchers from Stanford University(3).

They exposed two groups of volunteers – one that supported capital punishment and one that opposed it – to two studies (both fictional, though participants didn't know this): one confirming and one disconfirming their existing beliefs about whether the death penalty deters violent crime.

It would be natural to expect that, if we process information rationally, the additional evidence presented in the articles would lead to a greater understanding of the complexity and trade-offs involved in the decision.

It might then seem likely that each group would move closer to the centre-ground. What actually happened was startling. Firstly, when asked to rate the studies, each group had a strong bias towards the one that matched their initial opinion.

This led the researchers to conclude that “people of opposing views can each find support for those views in the same body of evidence”.

Secondly, rather than moving closer together, the two groups were even more polarised in their attitudes at the end of the experiment than they had been at the start. It seems that, because of our desire to avoid dissonance, we can read the same information and reach opposite conclusions. Both sides, of course, remain absolutely convinced that they are right.

Other research shows how powerfully this confirmation bias can lead to the persistence of beliefs even when all supporting evidence has been discredited.

[Image caption: Same evidence, different views]

Counter-intuitively, information that goes against our point of view can actually make us even more convinced that we are right (hence the media's repeated denunciations of Donald Trump would likely have made his supporters even more certain of their belief in him).

As the old saying goes: “a man convinced against his will is of the same opinion still.”

So the problem with confirmation bias isn't really what it might look like on the surface: stubborn people consciously ignoring the evidence. If it were, it would be rather easy to deal with for anyone with a genuine intention to improve their decision making.

Rather, the problem is the automatic, unconscious result of how our brains process information.

The brain is designed to help us to filter the enormous amount of information that we have to contend with daily, and it does so by paying attention to that which confirms what we already know or believe to be important. And that makes dealing with it a considerable challenge.


NOTES

1. Silvia Knobloch-Westerwick and Jingbo Meng, “Looking the Other Way: Selective Exposure to Attitude-Consistent and Counterattitudinal Political Information” (2009) Communication Research, 36(3).

2. Valdis Krebs, “Your Choices Reveal Who You Are: Mining and Visualizing Social Patterns”, in Beautiful Visualization, O’Reilly Media, 2010.

3. Charles G. Lord, Lee Ross and Mark R. Lepper, “Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence” (1979) Journal of Personality and Social Psychology, 37(11): 2098–2109.


This is an edited extract from The Little Black Book of Decision Making: Making Complex Decisions with Confidence in a Fast-Moving World by Michael Nicholas (Published by Capstone, A Wiley Brand, July 2017).
