
Why Fiddling With Prices Doesn’t Work

The fact is, if you don’t find it reasonable that prices should reflect relative scarcity,
then fundamentally you don’t accept the market economy,
because this is about as close to the essence of the market as you can find.

— Joseph Heath

***

Inevitably, when the price of a good or service rises rapidly, there follows an accusation of price-gouging. The term carries a strong moral admonition against the price-gouger, in favor of the price-gougee. Gas shortages are a classic example. With a local shortage of gasoline, gas stations will tend to mark up the price of gasoline to reflect the supply issue. This is usually rewarded with cries of unfairness. But does that really make sense?

In his excellent book Economics Without Illusions, Joseph Heath argues that it doesn’t.

In fact, this very scenario is market pricing reacting just as it should. With gasoline in short supply, the market price rises so that those who need gasoline have it available, and those who simply want it do not. The price system ensures that everyone makes their choice accordingly. If you’re willing to pay up, you pay up. If you’re not, you make alternative arrangements – drive less, use less heat, etc. This is exactly what market pricing is for – to give us a reference as we make our choices. But it’s still hard for many well-intentioned people to accept. Let’s think it through a little, with Heath’s help.

***


Garrett Hardin on the Three Filters Needed to Think About Problems

One of the best parts of Garrett Hardin’s wonderful Filters Against Folly is when he explores the three filters that help us interpret reality. No matter how much we’d like it to, the world does not operate only inside our circle of competence. Thus we must learn ways to make sense of reality in areas where we lack even so much as a map.

Hardin’s genius reminds us of this insight from Sports Illustrated’s Andy Benoit: most geniuses “prosper not by deconstructing intricate complexities but by exploiting unrecognized simplicities.”

Mental Tools

We need not be a genius in every area, but we should understand the big ideas of most disciplines and try to avoid fooling ourselves. That’s the core of the mental models approach. When you’re not an expert in a field, often the best approach is one that avoids stupidity. There are few better ways of avoiding stupidity than understanding how the world works.

Hardin begins by outlining his goal: to understand reality and understand human nature as it really is, removing premature judgment from the analysis.

He appropriately quotes Spinoza, who laid out his principles for political science as follows:

That I might investigate the subject matter of this science with the same freedom of spirit we generally use in mathematics, I have labored carefully not to mock, lament, or execrate human actions, but to understand them; and to this end I have looked upon passions such as love, hatred, anger, envy, ambition, pity, and other perturbations of the mind, not in the light of vices of human nature, but as properties just as pertinent to it as are heat, cold, storm, thunder, and the like to the nature of the atmosphere.

The goal of these mental filters, then, is to understand reality by improving our ability to judge the statements of experts, promoters, and persuaders of all kinds. As the saying goes, we are all laymen in some field.

Hardin writes:

What follows is one man’s attempt to show that there is more wisdom among the laity than is generally concluded, and that there are some rather simple methods of checking on the validity of the statements of experts.

Three Filters Needed to Think About Problems

The Literate Filter

The first filter through which we must interpret reality, says Hardin, is the literate filter: What do the words really mean? The key thing to remember is that language is action. Language is not just a way to communicate or interpret; it acts as a call to action or, just as importantly, an inhibitor of action.

The first step is to try to understand what is really being said. What do the words and the labels actually mean? If a politician proposes a “Poverty Assistance Plan,” that sounds almost inarguably good, no? Many a pork-barrel program has passed based on such labels alone.

But when you examine the rhetoric, you must ask what those words are trying to do: promote understanding, or inhibit it? If the program had a rational method of assisting the deserving poor, the label might be appropriate. If it were simply a way for the politician to reward undeserving constituents for their votes, the label would be just a way to fool us. The literate filter asks whether we understand the true intent behind the words.

In a chapter called “The Sins of the Literate,” Hardin discusses the misuse of language by examining literate, but innumerate, concepts like “indefinite” or “infinite”:

He who introduces the words “infinity” or any of its derivatives (“forever” or “never” for instance) is also trying to escape discussion. Unfortunately he does not honestly admit the operational meaning of the high-flown language used to close off discussion. “Non-negotiable” is a dated term, no longer in common use, but “infinity” endures forever.

Like old man Proteus of Greek mythology, the wish to escape debate disguises itself under a multitude of verbal forms: infinity, non-negotiable, never, forever, irresistible, immovable, indubitable, and the recent variant “not meaningfully finite.” All these words have the effect of moving discussion out of the numerate realm, where it belongs, and into a wasteland of pure literacy, where counting and measuring are repudiated.

Later, in the final chapter, Hardin repeats:

The talent for handling words is called “eloquence.” Talent is always desirable, but the talent may have an unfair, even dangerous, advantage over those with less talent. More than a century ago Ralph Waldo Emerson said, “The curse of this country is eloquent men.” The curse can be minimized by using words themselves to point out the danger of words. One of their functions is to act as inhibitors of thought. People need to be made allergic to such thought-stoppers as infinity, sacred, and absolute. The real world is a world of quantified entities: “infinity” and its like are no words for quantities but utterances used to divert attention from quantities and limits.

It is not just innumerate exaggeration we are guarding against, but the literate tendency to replace actors with abstractions, as Hardin calls it. He uses the example of donating money to a poor country (Country X), which on its face sounds noble:

Country X, which is an abstraction, cannot act. Those who act in its name are rich and powerful people. Human nature being what it is, we can be sure that these people will not voluntarily do anything to diminish either their power or their riches…

Not uncommonly, the major part of large quantities of food sent in haste to a poor country in the tropics rot on the docks or is eaten up by rats before it can be moved to the people who need it. The wastage is seldom adequately reported back to the sending country…(remember), those who gain personally from the shipping of food to poor nations gain whether fungi, rats, or people eat the food.

The Numerate Filter

Hardin is clear on his approach to numerical fluency. The ability to count, weigh, and compare values in a general or specific way is essential to understanding the claims of experts and to assessing any problem rationally:

The numerate temperament is one that habitually looks for approximate dimensions, ratios, proportions, and rates of change in trying to grasp what is going on in the world. Given effective education–a rare commodity, of course–a numerate orientation is probably within the reach of most people.

Just as “literacy” is used here to mean more than merely reading and writing, so also will “numeracy” be used to mean more than measuring and counting. Examination of the origins of the sciences shows that many major discoveries were made with very little measuring and counting. The attitude science requires of its practitioners is respect, bordering on reverence, for ratios, proportions, and rates of change.

Rough and ready back-of-the-envelope calculations are often sufficient to reveal the outline of a new and important scientific discovery…. In truth, the essence of many of the major insights of science can be grasped with no more than a child’s ability to measure, count, and calculate.

***

To explain the use of the literate and numerate filters together, Hardin uses the example of the Delaney Amendment, passed in 1958 to restrict food additives. This example should be familiar to us today:

Concerned with the growing evidence that many otherwise useful substances can cause cancer, Congress decreed that henceforth, whenever a chemical at any concentration was found to cause cancer–in any fraction of any species of animal–that substance must be totally banned as an additive to human food.

From a literate standpoint, this sounds logical. The Amendment sought to eradicate harmful food additives that the free market had allowed to surface. However, says Hardin:

The Delaney Amendment is a monument to innumerate thought. “Safe” and “unsafe” are literate distinctions; nature is numerate. Everything is dangerous at some level. Even molecular oxygen, essential to human life, becomes lethal as the concentration approaches 100 percent.

Sensitivity is ordinarily expressed as “1 part per X,” where X is a large number. If a substance probably increases the incidence of cancer at a concentration of 1 part per 10,000, one should probably ban it at that concentration in food, and perhaps at 1 in 100,000. But what about 1 part per million?…In theory there is no final limit to sensitivity. What about 1 milligram per tank car? Or 1 milligram per terrestrial globe?

Obviously, some numerical limits must be applied. This is the usefulness of the numerate filter. As Charlie Munger says, “Quantify, always quantify.”
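
To make Hardin’s point concrete, here is a rough back-of-the-envelope calculation (a Python sketch) of what “1 milligram per tank car” works out to as a concentration. The tank-car capacity is an assumed round number for illustration; it does not come from Hardin.

```python
# Back-of-the-envelope: what concentration is "1 milligram per tank car"?
# ASSUMPTION: a rail tank car holds roughly 100 tonnes (100,000 kg).

TANK_CAR_KG = 100_000               # assumed capacity, a round number
MG_PER_KG = 1_000_000               # milligrams in a kilogram

total_mg = TANK_CAR_KG * MG_PER_KG  # about 100 billion mg in the tank car
concentration = 1 / total_mg        # 1 mg of contaminant in the whole car

print(f"1 mg per tank car = 1 part per {total_mg:,}")  # 1 part per 100,000,000,000
print(f"as a fraction: {concentration:.0e}")           # 1e-11
```

At roughly one part per hundred billion, “contains a carcinogen” is a literate statement with no numerate content; the numerate filter forces the question “at what concentration?”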

The Ecolate Filter

Hardin introduces his final filter by requiring that we ask the question “And then what?” There is perhaps no better question to prompt second-order thinking.

Even if we understand what is truly being said and have quantified the effects of a proposed policy or solution, it is imperative that we consider the second layer of effects or beyond. Hardin recognizes that this opens the door for potentially unlimited paralysis (the poorly understood and innumerate “Butterfly Effect”), which he boxes in by introducing his own version of the First Law of Ecology:

We can never merely do one thing.

This is to say, all proposed solutions and interventions will have a multitude of effects, and we must try our best to consider them in their totality.

In proposing this filter, Hardin is very careful to guard against the Slippery Slope argument, or the idea that one step in the wrong direction will lead us directly to Hell. This, he says, is a purely literate but wholly innumerate approach to thinking.

Those who take the wedge (Slippery Slope) argument with the utmost seriousness act as though they think human beings are completely devoid of practical judgment. Countless examples from everyday life show the pessimists are wrong…If we took the wedge argument seriously, we would pass a law forbidding all vehicles to travel at any speed greater than zero. That would be an easy way out of the moral problem. But we pass no such law.

In reality, the ecolate filter helps us understand the layers of unintended consequences. Take inflation:

The consequences of hyperinflation beautifully illustrate the meaning of the First Law of Ecology. A government that is unwilling or unable to stop the escalation of inflation does more than merely change the price of things; it turns loose a cascade of consequences the effects of which reach far into the future.

Prudent citizens who have saved their money in bank accounts and government bonds are ruined. In times of inflation people spend wildly with little care for value, because the choice and price of an object are less important than that one put his money into material things. Fatalism takes over as society sinks down into a culture of poverty….

To Conclude

In the end, the filters must be used wisely together. They are ways to understand reality, and cannot be divorced from one another. Hardin’s general approach to thinking sums up much like that of his fellow multi-disciplinary thinker, Charlie Munger:

No single filter is sufficient for reaching a reliable decision, so invidious comparisons between the three are not called for. The well-educated person uses all of them.



The Two Types of Knowledge: The Max Planck/Chauffeur Test

Charlie Munger, the billionaire business partner of Warren Buffett, frequently tells the story below to illustrate how to distinguish real knowledge from pretend knowledge.

In his 2007 commencement address at the USC Law School, Munger explained it this way:

I frequently tell the apocryphal story about how Max Planck, after he won the Nobel Prize, went around Germany giving the same standard lecture on the new quantum mechanics.

Over time, his chauffeur memorized the lecture and said, “Would you mind, Professor Planck, because it’s so boring to stay in our routine, if I gave the lecture in Munich and you just sat in front wearing my chauffeur’s hat?” Planck said, “Why not?” And the chauffeur got up and gave this long lecture on quantum mechanics. After which a physics professor stood up and asked a perfectly ghastly question. The speaker said, “Well I’m surprised that in an advanced city like Munich I get such an elementary question. I’m going to ask my chauffeur to reply.”

The point of the story is not the quick-wittedness of the protagonist, but rather — to echo Richard Feynman — it’s about making a distinction between the two types of knowledge.

Two Kinds of Knowledge

In this world we have two kinds of knowledge. One is Planck knowledge, the people who really know. They’ve paid the dues, they have the aptitude. And then we’ve got chauffeur knowledge. They’ve learned the talk. They may have a big head of hair, they may have fine timbre in the voice, they’ll make a hell of an impression.

But in the end, all they have is chauffeur knowledge. I think I’ve just described practically every politician in the United States.

And you are going to have the problem in your life of getting the responsibility into the people with the Planck knowledge and away from the people with the chauffeur knowledge.

And there are huge forces working against you. My generation has failed you a bit… but you wouldn’t like it to be too easy now would you?

Real knowledge comes when people do the work. This is so important that Elon Musk tries to tease it out in interviews.


On the other hand, we have the people who don’t do the work — they pretend. While they’ve learned to put on a good show, they lack understanding. They can’t answer questions that don’t rely on memorization. They can’t explain things without using jargon or vague terms. They have no idea how things interact. They can’t predict consequences.

The problem is that it’s difficult to separate the two.

One way to tease out the difference between Planck and chauffeur knowledge is to ask them why.

In The Art of Thinking Clearly, Rolf Dobelli offers some commentary on distinguishing fake from real knowledge:

With journalists, it is more difficult. Some have acquired true knowledge. Often they are veteran reporters who have specialized for years in a clearly defined area. They make a serious effort to understand the complexity of a subject and to communicate it. They tend to write long articles that highlight a variety of cases and exceptions. The majority of journalists, however, fall into the category of chauffeur. They conjure up articles off the tops of their heads or, rather, from Google searches. Their texts are one-sided, short, and— often as compensation for their patchy knowledge— snarky and self-satisfied in tone.

The same superficiality is present in business. The larger a company, the more the CEO is expected to possess “star quality.” Dedication, solemnity, and reliability are undervalued, at least at the top. Too often shareholders and business journalists seem to believe that showmanship will deliver better results, which is obviously not the case.

One way to guard against this is to understand your circle of competence.


Dobelli concludes with some advice worth taking to heart.

Be on the lookout for chauffeur knowledge. Do not confuse the company spokesperson, the ringmaster, the newscaster, the schmoozer, the verbiage vendor, or the cliché generator with those who possess true knowledge. How do you recognize the difference? There is a clear indicator: True experts recognize the limits of what they know and what they do not know. If they find themselves outside their circle of competence, they keep quiet or simply say, “I don’t know.” This they utter unapologetically, even with a certain pride. From chauffeurs, we hear every line except this.

***

If you liked this, you’ll love these other Farnam Street articles:

Circle of Competence — Knowing your Circle of Competence helps intelligent people like Charlie Munger and Warren Buffett stay out of trouble.

Learn Anything Faster with the Feynman Technique — The Feynman Technique helps you learn anything faster by quickly identifying gaps in your understanding. It’s also a versatile thinking tool.

How Using a Decision Journal can Help you Make Better Decisions

"Odds are you’re going to discover two things right away. First, you’re right a lot of the time. Second, it’s often for the wrong reasons."
“Odds are you’re going to discover two things. First, you’re right a lot of the time. Second, it’s often for the wrong reasons.”

 

One question I’m often asked is: what should a decision journal look like?

You should care enormously about whether you’re making good decisions or bad ones. After all, in most knowledge organizations, your product is decisions.

A good decision process matters far more than analysis – by a factor of six, according to one study. A process or framework for making decisions, however, is only one part of an overall approach to making better decisions.

The way to test the quality of your decisions, whether individually or organizationally, is by testing the process by which they are made. And one way to do that is to use a decision journal.

You can think of a decision journal as quality control — something like what we’d find in a manufacturing plant.

Conceptually this is pretty easy, but it requires some discipline and humility to implement and maintain. In consulting with various organizations on how to make better decisions, I’ve seen everything from people who take great strides to improve their decision making to people who doctor decision journals for optics over substance.

“The idea,” says Michael Mauboussin, “is whenever you are making a consequential decision, write down what you decided, why you decided as you did, what you expect to happen, and if you’re so inclined, how you feel mentally and physically.”

Whenever you’re making a consequential decision, either individually or as part of a group, take a moment and write down the following (a rough sketch of what an entry might look like follows the list):

  1. The situation or context;
  2. The problem statement or frame;
  3. The variables that govern the situation;
  4. The complications or complexity as you see it;
  5. The alternatives that were seriously considered, and why they were not chosen (think: the work required to have an opinion);
  6. A paragraph explaining the range of outcomes;
  7. A paragraph explaining what you expect to happen and, importantly, the reasoning and actual probabilities you assign to each outcome (the degree of confidence matters, a lot);
  8. The time of day the decision was made and how you feel physically and mentally (if you’re tired, for example, write it down).
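
To make the template concrete, here is a minimal sketch of a journal entry as a structured record. The field names simply mirror the eight items above; the class and the example values are hypothetical, and the advice below about writing by hand still stands – this only shows the shape of an entry.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class DecisionEntry:
    """One journal entry, mirroring the eight-item template above."""
    situation: str                  # 1. the situation or context
    problem: str                    # 2. the problem statement or frame
    variables: list[str]            # 3. variables that govern the situation
    complications: str              # 4. complications or complexity as you see it
    alternatives: dict[str, str]    # 5. alternative -> why it was not chosen
    outcome_range: str              # 6. the range of outcomes
    expectations: dict[str, float]  # 7. outcome -> probability you assign
    physical_state: str = ""        # 8. how you feel physically and mentally
    recorded_at: datetime = field(default_factory=datetime.now)

# A hypothetical entry:
entry = DecisionEntry(
    situation="Considering acquiring a smaller competitor",
    problem="Buy now at a premium, or wait and risk a rival bidding first",
    variables=["integration cost", "key-staff retention", "segment growth"],
    complications="Their revenue is concentrated in two customers",
    alternatives={"wait a year": "risk of a competing bid judged too high"},
    outcome_range="From writing off the premium to doubling the segment",
    expectations={"smooth integration": 0.6, "partial": 0.3, "failure": 0.1},
    physical_state="end of a long week; somewhat tired",
)
```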

Of course, this can be tailored to the situation and context. Specific decisions might include tradeoffs, weighting criteria, or other relevant factors.

One point worth noting: don’t spend too much time on the brief and obvious insights. Often these first thoughts are System 1, not System 2. Any decision you’re journaling is inherently complex (and may involve non-linear systems). In such systems, small effects can cause disproportionate responses, while bigger ones can have no impact at all. Remember that causality is hard to pin down, especially in complex domains.

Review

One tip I’ve learned from helping organizations implement this is that there are two common ways people wiggle out of their own decisions: hindsight bias and jargon.

I know we live in an age of computers, but you simply must do this by hand, because that will help reduce the odds of hindsight bias. It’s easy to look at a print-out and say, “I didn’t see it that way.” It’s a lot harder to look at your own handwriting and say the same thing.

Another thing to avoid is vague and ambiguous wording. If you’re talking in abstractions and fog, you’re not ready to make a decision, and you’ll find it easy to change the definitions to suit new information. This is where writing down the probabilities as you see them comes into play.

These journals should be reviewed on a regular basis—every six months or so. The review is an important part of the process. This is where you can get better. Realizing where you make mistakes, how you make them, what types of decisions you’re bad at, etc. will help you make better decisions if you’re rational enough. This is also where a coach can help. If you share your journal with someone, they can review it with you and help identify areas for improvement.

And keep in mind it’s not all about outcome. You might have made the right decision (which, in our sense means a good process) and had a bad outcome. We call that a bad break.

Odds are you’re going to discover two things right away. First, you’re right a lot of the time. Second, it’s often for the wrong reasons.

This can be somewhat humbling.

Let’s say you buy a stock and it goes up, but it goes up for reasons that are not the ones you anticipated. You’re probably thinking high-fives all around, right? But in a very real sense you were wrong. This feedback is incredibly important.

If you let it, the information provided by this journal will help you identify cases where you think you know more than you do but are in fact operating outside your circle of competence.

It will also show you how your views change over time, when you tend to make better decisions, and how serious the deliberations were.


Decisions Under Uncertainty

"We confuse risk and uncertainty."
“We confuse risk and uncertainty.”

If you’re a knowledge worker, you make decisions every day. In fact, whether you realize it or not, decisions are your job.

Decisions are how you make a living. Of course, not every decision is easy, and the way we approach a decision should vary with its category.

Here are a few basic categories that decisions fall into; a toy sketch contrasting the last two follows the list. There are decisions where:

  1. Outcomes are known. In this case the range of outcomes is known and the individual outcome is also known. This is the easiest category to decide in. If I hold out my hand and drop a ball, it will fall to the ground. I know this with near certainty.
  2. Outcomes are unknown, but probabilities are known. In this case the range of outcomes is known but the individual outcome is unknown. This is risk. Think of this as going to Vegas and gambling: before you set foot at the table, all of the outcomes are known, as are the probabilities of each. No outcome surprises an objective third party.
  3. Outcomes are unknown and probabilities are unknown. In this case the distribution of outcomes is unknown and the individual outcomes are necessarily unknown. This is uncertainty.
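
The difference between #2 and #3 is easy to see in code: under risk, the outcome table is known, so an expected value can be computed before you ever play; under uncertainty, that table does not exist, so the computation cannot even be written down. A toy Python sketch:

```python
import random

# Category 2 -- risk: all outcomes and their probabilities are known up
# front, so the expected value is computable before the first bet.
roulette = {35: 1 / 38, -1: 37 / 38}  # $1 single-number bet, American wheel
ev = sum(payoff * p for payoff, p in roulette.items())
print(f"expected value per $1 bet: {ev:.4f}")  # about -0.0526

# Simulating it shows the realized average converging toward the known EV.
payoffs = list(roulette)
weights = list(roulette.values())
trials = random.choices(payoffs, weights=weights, k=100_000)
print(f"simulated average:         {sum(trials) / len(trials):.4f}")

# Category 3 -- uncertainty: for "will there be an Arab Spring next year?"
# neither the outcome set nor the weights are knowable, so no analogous
# table -- and no expected-value line -- can be written at all.
```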

We often think we’re making decisions in #2 but we’re really operating in #3. The difference may seem trivial but it makes a world of difference.

Decisions Under Uncertainty

Ignorance is a state of the world where some possible outcomes are unknown: when we’ve moved from #2 to #3.

One way to realize how ignorant we are is to look back, read some old newspapers, and see how often the world did something that wasn’t even imagined.

Some examples include the Arab Spring, the collapse of the Soviet Union, and the financial meltdown.

We’re prepared for a world much like #2 — the world of risk, with known outcomes and probabilities that can be estimated — yet we live in a world that more closely resembles #3.

Read part two of this series: Two types of ignorance.

References: Ignorance: Lessons from the Laboratory of Literature (Roy and Zeckhauser).

Charlie Munger: A Two-Step Process for Making Effective Decisions and Improving Your Thinking

A simple and lightweight approach to decision making that prevents us from being manipulated.

  1. Understand the forces at play.
  2. Understand how your subconscious might be leading you astray.

***

While most of us make decisions daily, few of us have an effective framework that protects us as we make them. We’re going to explore Munger’s two-step process for making effective decisions and reducing human misjudgment.

In A Lesson on Elementary Worldly Wisdom, Charlie Munger offers a simple two-step filter for making effective decisions:

Personally, I’ve gotten so that I now use a kind of two-track analysis. First, what are the factors that really govern the interests involved, rationally considered? And second, what are the subconscious influences where the brain at a subconscious level is automatically doing these things – which by and large are useful, but which often misfunction.

One approach is rationality – the way you’d work out a bridge problem: by evaluating the real interests, the real probabilities and so forth. And the other is to evaluate the psychological factors that cause subconscious conclusions – many of which are wrong.

Let’s take a closer look.

1. The Forces at Play

The key to the first step is knowing what you know and what you don’t know: understanding your circle of competence. It’s just as important to know what you don’t know as it is to know what you do.

If you know what you don’t know, you might still have to make a decision, but your approaches for making that decision will change. For example, if you’re forced to make a decision in an area that you know is well outside your circle of competence, one tool you can use is inversion.

While millions of factors go into any decision, a few variables will always carry the bulk of the weight. If you’re operating within your circle of competence, it should be relatively easy to figure out the relevant variables and the forces at play.

I can’t tell you what the relevant variables are; there is no magic formula. To make consistently good decisions, you need to develop deep fluency in the area in which you’re deciding, and you need to pull in the big ideas from multiple disciplines to make sure you’re exercising good judgment.

2. The Psychological Factors

There are many causes of human misjudgment — our mental models page lists a few. These are the subtle ways that your mind might be leading you astray at a subconscious level. Your subconscious mind is larger than your conscious mind and yet we rarely pay attention to how we might be tricking ourselves. One way to mislead yourself, for instance, is to make decisions based on a small sample size and extrapolate the results to a larger population. Another way we fool ourselves is to remain committed to something we’ve said in the past. We might rely on an authority figure or default to what everyone else is doing. You get the idea.

Usually when we have extreme success or failure, there are four or five factors working in the same direction. The same goes for psychology: the more misjudgment factors working against us, the more likely we are to make an ill-informed decision.
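
As a closing illustration, here is a hypothetical sketch that turns the two-track analysis into a pre-decision checklist. The prompts are a small sample paraphrased from this article; they are not Munger’s canonical list.

```python
# Hypothetical sketch: Munger's two-track analysis as a checklist.
# The prompts below paraphrase points from this article, nothing more.

RATIONAL_TRACK = [
    "Which few variables carry the bulk of the weight here?",
    "Am I inside my circle of competence? If not, can I invert?",
    "What do the real interests and real probabilities look like?",
]

PSYCHOLOGICAL_TRACK = [
    "Small sample: am I extrapolating from a handful of cases?",
    "Commitment: am I defending something I said earlier?",
    "Authority or social proof: am I deferring to a figure or a crowd?",
]

def two_track_review(decision: str) -> None:
    """Print both tracks as prompts to answer before deciding."""
    print(f"Decision: {decision}")
    print("Track 1 - the forces at play:")
    for prompt in RATIONAL_TRACK:
        print("  *", prompt)
    print("Track 2 - how the subconscious may mislead:")
    for prompt in PSYCHOLOGICAL_TRACK:
        print("  *", prompt)

two_track_review("Approve the proposed acquisition")
```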