Confirmation Bias: The Mind's Immune System

In the book Lila, Robert Pirsig describes a time he was sailing into port at Cleveland. He thought he had reached a particular harbor when in fact he had unknowingly come in many miles away. For quite a while he did not notice his mistake; the landscape seemed to match up with the chart. Only later did he realize that he had been explaining away every discrepancy between the chart and what he saw, figuring the shoreline must simply have changed since the chart was made. Writing of himself in the third person:

"Wherever the chart disagreed with his observations he rejected the observations and followed the chart. Because of what his mind thought it knew, it had built up a static filter, an immune system, that was shutting out all information that did not fit. Seeing is not believing. Believing is seeing.

"If this were just an individual phenomenon it would not be so serious. But it is a huge cultural phenomenon too and it is very serious. We build up whole cultural intellectual patterns based on past 'facts' which are extremely selective. When a new fact comes in that does not fit the pattern we don't throw out the pattern. We throw out the fact. A contradictory fact has to keep hammering and hammering, sometimes for centuries, before maybe one or two people will see it."

The thing is, we all do this, and not just in science. It is a psychological mechanism built into our brains. Throughout our lives we unconsciously develop assumptions about how the world works: a framework in which to integrate new facts and against which to test others' claims. That framework includes various filters to pick out useful information and ignore noise. But we tend to become practiced at unconsciously biasing which information we let through to consciousness. This tendency is known as "confirmation bias", and cognitive psychologists have studied it extensively.

It is useful to have a basic view of the world, so that you can respond with appropriate incredulity when someone claims that, for example, elephants can fly. But this same mechanism can unconsciously make us overconfident in our views, and can cause us to ignore (by filtering out before we even consciously consider them) crucial data points that might help us refine our mental framework.

Confirmation bias is likely at play in a lot of pseudoscience and parapsychology. For example, someone who believes in ESP and is trying to prove it scientifically might take any successes (even those explainable by chance alone) as evidence for their hypothesis, and yet ignore, explain away, or simply fail to remember the non-successes. Studies have shown that when we search our memories for data bearing on some issue, we tend to recall the confirming instances (e.g. Gilovich, 1993).
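
To see how this works in practice, here is a small simulation (an illustrative sketch in Python; the card-guessing setup, the trial count, and the 50% forgetting rate are all hypothetical assumptions, not drawn from any actual study). A subject guesses the suit of a randomly drawn card over and over. Chance alone produces hits about 25% of the time, but if the believer keeps every hit while half of the misses quietly fade from memory, the hit rate they "remember" climbs to around 40%:

    import random

    # Hypothetical ESP "experiment": guess the suit of a randomly drawn
    # card, 10,000 times. Chance gives a 1-in-4 hit rate; no ESP is
    # simulated anywhere in this code.
    TRIALS = 10000
    SUITS = ["hearts", "spades", "diamonds", "clubs"]

    random.seed(42)
    hits = 0
    remembered_hits = 0
    remembered_misses = 0

    for _ in range(TRIALS):
        guess = random.choice(SUITS)
        actual = random.choice(SUITS)
        if guess == actual:
            hits += 1
            remembered_hits += 1       # every success is vivid and kept
        elif random.random() < 0.5:    # assume half the failures simply fade
            remembered_misses += 1

    remembered = remembered_hits + remembered_misses
    print("True hit rate:       %.1f%%" % (100.0 * hits / TRIALS))
    print("Remembered hit rate: %.1f%%" % (100.0 * remembered_hits / remembered))

Nothing paranormal happens in that code: the inflated "remembered" rate comes entirely from discarding misses, which is exactly the kind of selective recall described above.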

What this means is that we have to remain vigilant against our own biases and sometimes step back and question even our basic assumptions. By recognizing that confirmation bias exists, hopefully we will be more likely to seek out ways to mitigate its effects (and to control for confounds in scientific experiments). In the end, this is what is needed if we are to navigate our world successfully and not sail ourselves into the wrong mental harbors, as Pirsig did.

Originally Written: 02-05-07
Last Updated: 05-23-07