How Our Brains Filter Information

Our brains are constantly processing information. Unfortunately, to cope with such large volumes, they often take shortcuts without informing us.

All brains process information in a biased way. For example, we are hardwired to discount information that does not conform to our beliefs. What you might not know is that reading information that goes against your point of view can make you all the more convinced you are right. As someone who regularly tries to read the other side of an argument, this struck me as incredibly interesting.

How is it that two groups of people with polarizing opinions on a social issue can read the same information and both come away with stronger opinions than before?

In one experiment (abstract below), researchers had participants on both sides of the capital punishment debate read two scholarly articles on the issue: one favored capital punishment and the other opposed it. One might expect reasonable, rational people to discover that the issue was more complex than they had initially suspected, and therefore to move slightly off the edges and closer to one another (rather than becoming more polarized). At worst, you might expect the disconfirming evidence to simply be ignored. Confirmation bias and dissonance theory, however, predict that people will filter the information and find every possible flaw in the paper that does not support their view.

Through such biased assimilation of information, people strengthen their entrenched positions, and both sides of a given issue can have their views bolstered by the same evidence.

* * *

Biased Assimilation and Attitude Polarization: The Effects of Prior Theories on Subsequently Considered Evidence

People who hold strong opinions on complex social issues are likely to examine relevant empirical evidence in a biased manner. They are apt to accept “confirming” evidence at face value while subjecting “disconfirming” evidence to critical evaluation, and as a result to draw undue support for their initial positions from mixed or random empirical findings. Thus, the result of exposing contending factions in a social dispute to an identical body of relevant empirical evidence may be not a narrowing of disagreement but rather an increase in polarization. To test these assumptions and predictions, subjects supporting and opposing capital punishment were exposed to two purported studies, one seemingly confirming and one seemingly disconfirming their existing beliefs about the deterrent efficacy of the death penalty. As predicted, both proponents and opponents of capital punishment rated those results and procedures that confirmed their own beliefs to be the more convincing and probative ones, and they reported corresponding shifts in their beliefs as the various results and procedures were presented. The net effect of such evaluations and opinion shifts was the postulated increase in attitude polarization.

Source: Biased Assimilation and Attitude Polarization: The Effects of Prior Theories on Subsequently Considered Evidence. Journal of Personality and Social Psychology. 1979, Vol. 37, No. 11, 2098-2109