What the human being is best at doing is interpreting all new information so that their prior conclusions remain intact.
Yes, of course, journalists are biased (and not always for conflict-of-interest reasons!).
The story below is interesting because it basically renames confirmation bias to journalism bias. (Few people ever talk about the laziness and stupidity biases in journalists.)
We are all subject to confirmation bias because of the way our minds are wired (literally, how our brains search for and retrieve information).
…But the reporters' questions weren't geared toward getting a better understanding of those points. They were narrowly focused on one or two aspects of the story. And from the questions that were being asked, I realized, because I had so much more information on the subject, that the reporters were missing a couple of really important pieces of understanding about the product and its use. And as the event progressed, I also realized that the questions that might have uncovered those pieces weren't being asked because the reporters already had a story angle in their heads and were focused only on getting the necessary data points to flesh out and back up what they already thought was the story.
There is always a tension, as a journalist, between asking open-ended questions that allow an interview subject to explain something and pressing or challenging them on accuracy or details. But if you think you already know the subject, or already have a story angle half-formed in your head, it's easy to overlook the first part.
The journalists at the press conference didn't have a bias as the term is normally used; that is, I didn't get the sense that they were inherently for or against the company or its product. They just appeared to think they knew the subject well enough, or had a set enough idea in their heads as to what this kind of story was about, that they pursued only the lines of questioning necessary to fill in the blanks of that presumed story line. As a result, they left the press conference with less knowledge and understanding than they otherwise might have had. And while nobody could have said the resulting stories were entirely wrong, they definitely suffered from that lapse, especially, as might be expected, when it came to the predictions they made about the product's evolution or future.
In his new book, How We Decide, Jonah Lehrer cites a research study done by U.C. Berkeley professor Philip Tetlock. Tetlock questioned 284 people who made their living “commenting or offering advice on political and economic trends,” asking them to make predictions about future events. Over the course of the study, Tetlock collected quantitative data on over 82,000 predictions, as well as information from follow-up interviews with the subjects about the thought processes they'd used to come to those predictions. …