
BP, the Ash Cloud, and Anchoring

Jonah Lehrer explains why it was in BP's best interest to come out with incredibly low estimates of the flow rate, going so far as to suggest it wasn't a very bad leak, and why the company then failed to update that estimate in light of new evidence.

This is a form of anchoring… What does this have to do with the ash cloud and the oil spill? While anchoring is typically interpreted as a consumer mistake – we anchor to the wrong price – I think the bias can also be applied to our beliefs. Consider the ash cloud: After the cloud began drifting south, into the crowded airspace of Western Europe, officials did the prudent thing and canceled all flights. They wanted to avoid a repeat of the near crash of a Boeing 747 in 1989, when the KLM aircraft flew, at night, into the ash cloud of an Alaskan volcano. This led to a complete engine failure and an emergency landing.

Given the limited amount of information, anchoring to this previous event (and trying to avoid a worst-case scenario) was the only reasonable reaction. The problems began, however, when these initial beliefs about the risk of the ash cloud proved resistant to subsequent updates. Over the next few days, numerous test flights were sent up into the atmosphere. Much of the data collected by these flights suggested that the ash might not be such a severe danger, at least at the periphery. …

My point is absolutely not that the ash cloud wasn’t dangerous, or that the aviation agencies were wrong to cancel thousands of flights, at least initially. Instead, I think we simply need to be more aware that our initial beliefs about a crisis – those opinions that are most shrouded in ignorance and uncertainty – will exert an irrational influence on our subsequent actions, even after we have more (and more reliable) information. The end result is a kind of epistemic stubbornness, in which we’re irrationally anchored to an outmoded assumption.

The same thing happened with the BP oil spill. Initial reports suggested that the leak was relatively minor, less than a thousand barrels a day. As a result, BP and the government were slow to launch into crisis mode. Days were squandered; little was done. In fact, it took BP seventeen days before the company attempted the first fix of the underwater leak. Because BP and the government were anchored to a false belief – the spill wasn’t supposed to be that bad – they spent weeks thinking the wrong thing in a crisis.

The only way to avoid anchoring is to know about it. We need to be more aware that anchoring is a fundamental flaw of human decision making, and that our first reaction to an event will continue to shape our ensuing thoughts, even after that reaction is no longer relevant or valid. Our old beliefs might be wrong, but their influence lingers on, an intellectual anchor holding us back.

Read the full article @ ScienceBlogs

Jonah Lehrer is the author of How We Decide and Proust Was a Neuroscientist.