Just In Time Information Gathering

We’re becoming more like factories.

Just-in-time is a production strategy aimed at, among other things, reducing the need for excess inventory. Parts are supplied only when needed in the amount needed. While it makes a business more capital efficient, it also makes it more fragile.

We’ve adopted a similar strategy for information gathering. We’re so consumed by noise and busy work that the only time we really seek out signal is when we need it the most: right before we make a decision.

This creates a host of problems.

The worst time to look for information is when we need it to make a decision. When we do that, we’re more likely to see what’s unique and miss the historical context. We’re also more likely to be biased by what is available. And searching for information only at the moment of need is an indication that you have no idea what you’re doing.

“Pattern recognition,” says Alice Schroeder, “creates an impulse always to connect new knowledge to old and to primarily be interested in new knowledge that genuinely builds on the old.” It helps knowledge snowball.

If we can’t connect the current situation to something we already understand, we might reason that it is not in our circle of competence and thus we shouldn’t be drawing conclusions. If we can, however, connect it to something we previously understood, then we’re less likely to draw conclusions on the basis of “this time is different.”

There is merit to thinking about information gathering as a logistics problem.

This Time is Different

When we look at situations we’re always looking for what’s unique. We should, however, give more thought to similarities.

“This time is different” could be the four most costly words ever spoken. It’s not the words that are costly so much as the conclusions they encourage us to draw.

We incorrectly think that differences are more valuable than similarities.

After all, anyone can see what’s the same but it takes true insight to see what’s different, right? We’re all so busy trying to find differences that we forget to pay attention to what is the same.

Imagine sitting in a meeting where people are about to make the same mistake they made last year on the same decision. Let’s say, for example, Jack has a history of messing up the annual tax returns. He’s a good guy. He’s nice to everyone. In fact, he buys everyone awesome Christmas presents. But the last three years—the period he’s been in charge of tax returns—have been nothing short of a disaster, causing more work for you and your department.

The assignment for the tax return comes up and Jack is once again nominated.

Before you have a chance to voice your concerns, one of your co-workers speaks up: “I know Jack has dropped the ball on this assignment in the past but I think this time is different. He’s been working hard to make sure he’s better organized.”

That’s all it takes. Conversation over — everyone is focused on what’s unique about this time and it’s unlikely, despite ample evidence, that you’ll be able to convince them otherwise.

In part, people want to believe in Jack because he’s a nice guy. In part, we’re all focused on why this time is different and we’ll ignore evidence to the contrary.

Focusing on what’s different makes it easy to forget historical context. We lose touch with the base rate. We only see the evidence that supports our view (confirmation bias). We become optimistic and overconfident.

History provides context. And what history shows us is that no matter how unique things are today there are a lot of similarities with the past.

Consider investors and the dotcom bubble. Collectively, people saw it as unprecedented: a massive, unparalleled transformation.

That reasoning, combined with a blindness to what was the same about this situation and previous ones, encouraged us to draw conclusions that proved costly. We reasoned that everything would change, that everyone who owned internet companies would prosper, and that the old non-internet companies would quickly go bankrupt.

All of a sudden profits didn’t matter. Nor did revenue. They would come in time, we hoped. Market share mattered no matter how costly it was to acquire.

More than that, if you didn’t buy now you’d miss out. These companies would take over the world and you’d be left out.

We got so caught up in what was different that we forgot to see what was the same.

And there were historical parallels: automobiles, radio, television, and airplanes, to name a few. In their day, these innovations transformed the world just as completely. You can consider them the dotcoms of yesteryear.

And how did these massively transformational industries end up for investors?

At one time there were allegedly over 70 different auto manufacturing operations in the United States. Only three of them survived (and even some of those required government funds).

If you catch yourself reasoning based on “this time is different” remember that you are probably speculating. While you may be right, odds are, this time is not different. You just haven’t looked for the similarities.


Avoiding Ignorance

This is a continuation of two types of ignorance.

You can’t deal with ignorance if you can’t recognize its presence. If you’re suffering from primary ignorance, it means you probably failed to consider the possibility of being ignorant, or you found ways not to see that you were ignorant.

You’re ignorant and unaware, which is worse than being ignorant and aware.

The best way to avoid this, suggest Joy and Zeckhauser, is to raise self-awareness.

Ask yourself regularly: “Might I be in a state of consequential ignorance here?”

They continue:

If the answer is yes, the next step should be to estimate base rates. That should also be the next step if the starting point is recognized ignorance.

Of all situations such as this, how often has a particular outcome happened? Of course, this is often totally subjective and its underpinnings are elusive. It is hard to know what the sample of relevant past experiences has been, how to draw inferences from the experience of others, etc. Nevertheless, it is far better to proceed to an answer, however tenuous, than to simply miss (primary ignorance) or slight (recognized ignorance) the issue. Unfortunately, the assessment of base rates is challenging and substantial biases are likely to enter.

When we don’t recognize ignorance, we drastically underestimate the base rate. When we do recognize ignorance, we face “duelling biases; some will lead to underestimates of base rates and others to overestimates.”

Three biases come into play while estimating base rates: overconfidence, salience, and selection biases.

So we are overconfident in our estimates. We estimate things that are salient – that is, “states with which (we) have some experience or that are otherwise easily brought to mind.” And “there is a strong selection bias to recall or retell events that were surprising or of great consequence.”

Our key lesson is that as individuals proceed through life, they should always be on the lookout for ignorance. When they do recognize it, they should try to assess how likely they are to be surprised—in other words, attempt to compute the base rate. In discussing this assessment, we might also employ the term “catchall” from statistics, to cover the outcomes not specifically addressed.
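
To make the idea of a base rate (and the “catchall”) concrete, here is a minimal sketch in Python. The function name, the outcome labels, and the history are all hypothetical; the point is simply to count how often each outcome occurred in comparable past situations and to lump everything you never specifically considered into a catchall bucket.

```python
from collections import Counter

def estimate_base_rates(past_outcomes, known_outcomes):
    """Estimate base rates from comparable past situations.

    Outcomes we never specifically considered are lumped into a single
    'catchall' bucket, in the spirit of Joy and Zeckhauser's suggestion.
    """
    counts = Counter(o if o in known_outcomes else "catchall"
                     for o in past_outcomes)
    total = sum(counts.values())
    return {outcome: count / total for outcome, count in counts.items()}

# Hypothetical record of ten similar past decisions
history = ["success", "failure", "success", "lawsuit", "failure",
           "success", "regulatory block", "failure", "success", "failure"]

print(estimate_base_rates(history, known_outcomes={"success", "failure"}))
# {'success': 0.4, 'failure': 0.4, 'catchall': 0.2}
```

However tenuous the numbers, writing them down forces the question “how often has this actually happened before?” rather than leaving it unasked.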

It’s incredibly interesting to view literature through the lens of human decision making.

Crime and Punishment is particularly interesting as a study of primary ignorance. Raskolnikov deploys his impressive intelligence to plan the murder, believing, in his ignorance, that he has left nothing to chance. In a series of descriptions not for the squeamish or the faint-hearted, the murderer’s thoughts are laid bare as he plans the deed. We read about his skills in strategic inference and his powers of prediction about where and how he will corner his victim; his tactics at developing complementary skills (what is the precise manner in which he will carry the axe? what strategies will help him avoid detection?) are revealed.

But since Raskolnikov is making decisions under primary ignorance, his determined rationality is tightly “bounded.” He “construct[s] a simplified model of the real situation in order to deal with it; … behaves rationally with respect to this model, [but] such behavior is not even approximately optimal with respect to the real world” (Simon 1957). The second-guessing, fear, and delirium at the heart of Raskolnikov’s thinking as he struggles to gain a foothold in his inner world show the impact of a cascade of Consequential Amazing Developments (CADs), none predicted, none even contemplated. Raskolnikov anticipated an outcome in which he would dispatch the pawnbroker and slip quietly out of her apartment. He could not possibly have predicted that her sister would show up, a characteristic CAD that challenges what Taleb (2012) calls our “illusion of predictability.”

Joy and Zeckhauser argue we can draw two conclusions.

First, we tend to downplay the role of unanticipated events, preferring instead to expect simple causal relationships and linear developments. Second, when we do encounter a CAD, we often counter with knee-jerk, impulsive decisions, the equivalent of Raskolnikov committing a second impetuous murder.


Reference: Joy and Zeckhauser, Ignorance: Lessons from the Laboratory of Literature.

A Decision-Making Magic Trick

Two important nuggets from an interview with Chip Heath, co-author of Decisive, on how to make better decisions:

A decision-making magic trick

The closest thing to a decision-making magic trick that I’ve found is the question, “What would you advise your best friend to do if they were in your situation?” So often when I ask that question, people blurt out an answer and their eyes get wide. They’re shocked at how easy it is when you just imagine you’re advising someone else.

This time isn’t so different

businesses make decisions about mergers and acquisitions that are hundreds of millions of dollars, and to the senior leader it seems like, “Well this is a different situation than the last acquisition we made.” And yet in that room making the decision is a set of people who have probably seen a dozen acquisitions, but they don’t take the time to do even the equivalent of the three-out-of-five-stars rating that we would get from Amazon.com.

The kind of decisions that senior people make always present themselves as though they are completely different than anything else. Big decisions are subtle in a way because they all seem to come one at a time. The advantage of smaller decisions is we realize we are in a repeated situation where we’re going to see the same thing a lot.

Yet lots of big decisions end up having that property as well. If we take the time to move our mental spotlight around, we can always find other decisions that are similar to this one. We think the chief financial officer we’re trying to hire is in a unique position in the company, and that this is a unique position in time with unique demands. But the fact is, we’ve made other senior hires and we know how that process goes. Stepping back and taking in the broader context is just as useful for senior leaders as it is for the frontline worker who’s making a decision on the 35th mortgage application or the 75th customer complaint.

The 12 cognitive biases that prevent you from being rational

io9 produced a great overview of some cognitive biases.

First, the difference between cognitive biases and logical fallacies:

A logical fallacy is an error in logical argumentation (e.g. ad hominem attacks, slippery slopes, circular arguments, appeal to force, etc.). A cognitive bias, on the other hand, is a genuine deficiency or limitation in our thinking — a flaw in judgment that arises from errors of memory, social attribution, and miscalculations (such as statistical errors or a false sense of probability).

Confirmation Bias

We love to agree with people who agree with us. It’s why we only visit websites that express our political opinions, and why we mostly hang around people who hold similar views and tastes.

Ingroup Bias

Ultimately, the ingroup bias causes us to overestimate the abilities and value of our immediate group at the expense of people we don’t really know.

Gambler’s Fallacy

We tend to put a tremendous amount of weight on previous events, believing that they’ll somehow influence future outcomes.
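
A fair coin has no memory, and a quick simulation makes that concrete. This is only an illustrative sketch (the function name and parameters are mine, not from the io9 piece): it estimates the chance of heads on the flip immediately after a run of tails, which stays near 0.5 no matter how long the run.

```python
import random

def prob_heads_after_streak(num_flips=1_000_000, streak_len=3):
    """Estimate P(heads) on the flip right after `streak_len` tails in a row."""
    random.seed(42)  # fixed seed so the illustration is reproducible
    flips = [random.random() < 0.5 for _ in range(num_flips)]  # True = heads
    after_streak = [flips[i] for i in range(streak_len, num_flips)
                    if not any(flips[i - streak_len:i])]       # preceded by all tails
    return sum(after_streak) / len(after_streak)

print(round(prob_heads_after_streak(), 3))  # ~0.5: the streak changes nothing
```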

Post-Purchase Rationalization (aka commitment and consistency bias)

[A] kind of built-in mechanism that makes us feel better after we make crappy decisions, especially at the cash register. Also known as Buyer’s Stockholm Syndrome, it’s a way of subconsciously justifying our purchases — especially expensive ones.

Neglecting Probability (aka Insensitivity To Base Rates)

It’s the same phenomenon that makes us worry about getting killed in an act of terrorism as opposed to something far more probable, like falling down the stairs or accidental poisoning.

Observational Selection Bias (Availability Bias?)

This is that effect of suddenly noticing things we didn’t notice that much before — but we wrongly assume that the frequency has increased.

Status-Quo Bias

We humans tend to be apprehensive of change, which often leads us to make choices that guarantee that things remain the same, or change as little as possible.

Negativity Bias

People tend to pay more attention to bad news — and it’s not just because we’re morbid. Social scientists theorize that it’s on account of our selective attention and that, given the choice, we perceive negative news as being more important or profound.

Bandwagon Effect (aka social proof)

Though we’re often unconscious of it, we love to go with the flow of the crowd.

Projection Bias

We tend to assume that most people think just like us — though there may be no justification for it. This cognitive shortcoming often leads to a related effect known as the false consensus bias where we tend to believe that people not only think like us, but that they also agree with us.

The Current Moment Bias

We humans have a really hard time imagining ourselves in the future and altering our behaviors and expectations accordingly.

Anchoring Effect

Also known as the relativity trap, this is the tendency we have to compare and contrast only a limited set of items. It’s called the anchoring effect because we tend to fixate on a value or number that in turn gets compared to everything else.

Still curious? Check out the Farnam Street Latticework of Mental Models.