This is a big one. Confirmation bias is the tendency to selectively notice or seek out new information that confirms one's existing beliefs, while failing to notice or actively avoiding information that would contradict them.
The effect is also known as belief bias, belief preservation, belief overkill, hypothesis locking, polarization effect, positive bias, the Tolstoy syndrome, selective thinking, myside bias and Morton’s demon.
Cognitive psychologist Peter Cathcart Wason studied this phenomenon in the 1960s:
Subjects were presented with the numbers 2-4-6 and told that they conformed to a rule. They were then asked to try to discover what the rule was by generating their own sequences of three numbers and submitting them to the experimenter. Each time they did, they would get feedback on whether or not the sequence conformed to the rule. They were also instructed that once they were certain they had found the rule, they should announce it to the experimenter.
While the actual rule was simply “any ascending sequence”, the subjects seemed to have a great deal of difficulty in inducing it, often announcing rules that were far more complex than the correct rule. More interestingly, the subjects seemed to test only “positive” examples — triples that subjects believed would conform to their rule and confirm their hypothesis. What they did not do was attempt to challenge or falsify their hypotheses by testing triples that they believed would not conform to their rule. Wason referred to this phenomenon as confirmation bias, whereby subjects systematically seek only evidence that confirms their hypotheses.
Basically, most subjects generated a hypothesis almost immediately after looking at the numbers (“even numbers”, “add two to the last number”) and then tried a few sequences to see whether their hypothesis worked. They didn’t think to generate sequences that purposefully didn’t conform, to see if they could falsify their hypothesis, and that’s why they reached a wrong conclusion. There was no trick and the rule was simple. It was really a cognitive bias at work.
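The logic of the experiment can be sketched as a small simulation. This is a minimal illustration, not a reconstruction of Wason’s actual protocol: the specific triples and the “add two each time” hypothesis are hypothetical examples chosen to show why positive tests alone never refute an overly narrow guess, while a single negative test does.

```python
def true_rule(triple):
    """The experimenter's actual rule: any strictly ascending sequence."""
    a, b, c = triple
    return a < b < c

def subject_hypothesis(triple):
    """A subject's overly specific guess: 'add two each time'."""
    a, b, c = triple
    return b == a + 2 and c == b + 2

# Positive tests: triples the subject expects to CONFORM to their guess.
positive_tests = [(10, 12, 14), (1, 3, 5), (100, 102, 104)]

# Every positive test also satisfies the true rule, so the feedback is
# always "yes" and the (wrong) hypothesis is never challenged.
assert all(true_rule(t) for t in positive_tests)

# One negative test - a triple the subject expects to FAIL - would have
# falsified the hypothesis immediately: the subject predicts "no", but
# the experimenter answers "yes".
negative_test = (1, 2, 3)
assert not subject_hypothesis(negative_test)
assert true_rule(negative_test)
```

Because the narrow hypothesis is a subset of the true rule, confirming instances can never distinguish the two; only an attempted falsification reveals the difference.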
Eliezer Yudkowsky on confirmation bias in politics:
Politics is an extension of war by other means. Arguments are soldiers. Once you know which side you’re on, you must support all arguments of that side, and attack all arguments that appear to favor the enemy side; otherwise it’s like stabbing your soldiers in the back – providing aid and comfort to the enemy. People who would be level-headed about evenhandedly weighing all sides of an issue in their professional life as scientists, can suddenly turn into slogan-chanting zombies when there’s a Blue or Green position on an issue.
Partisans filter information, noticing and seeking out what supports their side while avoiding or ignoring what supports the other. This doesn’t mean that partisans are always wrong, but it certainly increases the chances that their positions are not rational.
Russian writer Leo Tolstoy famously said:
“I know that most men, including those at ease with problems of the greatest complexity, can seldom accept the simplest and most obvious truth if it be such as would oblige them to admit the falsity of conclusions which they have proudly taught to others, and which they have woven, thread by thread, into the fabric of their lives.”
We need to be vigilant, because without active effort it’s very easy to fall prey to this bias without noticing it. The negative results of that can be seen all around us.
- Confirmation Bias at Wikipedia
- Confirmation bias at the Skeptic’s Dictionary
- Politics is the Mind-Killer by Eliezer Yudkowsky
See also: Rationality Resources