
The Art and Science of High-Stakes Decisions

How can anyone make rational decisions in a world where knowledge is limited, time is pressing, and deep thought is often unattainable?

Some decisions are more difficult than others, yet we often make them the same way we make the easy ones: on autopilot.

We have difficulty contemplating low-probability, high-stakes threats and taking protective action against them. It seems almost perverse that we are least prepared to make the decisions that matter most.

Sure, we can pick between the store brand of peanut butter and the Kraft label, and we can no doubt surf the internet with relative ease. Yet life offers few opportunities to prepare for decisions where the consequences of a poor choice are catastrophic. If we pick the wrong peanut butter, we are generally not penalized too harshly. If we fail to purchase flood insurance, on the other hand, we can be financially and emotionally wiped out.

Shortly after the planes crashed into the towers in Manhattan, some well-known academics got together to discuss how skilled people are at making choices that involve a low and ambiguous probability of a high-stakes loss.

High-stakes decisions have two distinctive properties: 1) the existence of a possible large loss (financial or emotional), and 2) high costs to reverse the decision once it is made. More importantly, these professors wanted to determine whether prescriptive guidelines for improving the decision-making process could be created to help people make better decisions.

Whether we're buying something at the grocery store or deciding to purchase earthquake insurance, we operate in the same way. The presence of potentially catastrophic costs of error does little to reduce our reliance on heuristics (rules of thumb). Such heuristics serve us well on a daily basis. For simple decisions, not only are heuristics generally right, but the costs of errors are small, such as being caught without an umbrella or regretting not picking up the Kraft peanut butter after discovering the store brand doesn't taste as you remember. In high-stakes decisions, however, heuristics are often a poor method of forecasting.

To make better high-stakes decisions, we need a better understanding of why we generally make poor ones.

Here are several causes.

Poor understanding of probability.
Several studies show that people either make insufficient use of probability information when it is available to them or ignore it altogether. In one study, 78% of subjects failed to seek out probability information when evaluating several risky managerial decisions.

In the context of high-stakes decisions, the probability of an event causing loss may seem so low that organizations and individuals consider it not worth worrying about. In doing so, they effectively treat the probability as zero, or close to it.
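A quick back-of-the-envelope calculation shows why rounding a small probability down to zero can be costly. The numbers below (a 1-in-100 annual flood risk, a $500,000 loss, a $3,000 premium) are invented for illustration:

```python
# Hypothetical numbers, for illustration only.
p_loss = 0.01      # annual probability of a damaging flood
loss = 500_000     # size of the loss if the flood happens

expected_annual_loss = p_loss * loss
print(expected_annual_loss)  # 5000.0

# Treating p_loss as zero makes any premium look like pure waste,
# even one priced well below the expected loss it removes.
premium = 3_000
print(premium < expected_annual_loss)  # True
```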

An excessive focus on short time horizons.
Many high-stakes decisions are not obvious to the decision-maker, in part because people tend to focus on the immediate consequences rather than the long-term ones.

A CEO near retirement has an incentive to skimp on insurance in order to report slightly higher profits before leaving (shareholders are unaware of the increased risk and appreciate the increased profits). Governments tend to under-invest in less visible things like infrastructure because election cycles are short. The long-term consequences of short-term thinking can be disastrous.

The focus on the short term is one of the most widely documented failings of human decision making. People have difficulty considering the future consequences of current actions over long periods of time. Garrett Hardin, author of Filters Against Folly, suggests we look at things through three filters: literacy, numeracy, and ecolacy. In ecolacy, the key question is “and then what?” Asking “and then what?” helps us avoid focusing solely on the short term.

Excessive attention to what's available
Decisions that require difficult trade-offs between attributes, or that are ambiguous about what a right answer looks like, often lead people to resolve the choice by focusing on the information most easily brought to mind. And some things are difficult to bring to mind.

Constant exposure to low-risk events that never materialize leads us to be less concerned than the probability would warrant (it makes these events less available) and “proves” that our past decisions to ignore them were right.

People refuse to buy flood insurance even when it is heavily subsidized and priced far below an actuarially fair value. Kunreuther et al. (1993) suggest that underreaction to threats of flooding may arise from “the inability of individuals to conceptualize floods that have never occurred… Men on flood plains appear to be very much prisoners of their experience… Recently experienced floods appear to set an upward bound to the size of loss with which managers believe they ought to be concerned.” Paradoxically, we feel more secure even as the risk may have increased.

Distortions under stress
Most high-stakes decisions are made under perceived (or real) stress. A large number of empirical studies find that stress focuses decision-makers on a selective set of cues when evaluating options and leads to greater reliance on simplifying heuristics. When we're stressed, we're less likely to think things through.

Over-reliance on social norms
Most individuals have little experience with high-stakes decisions and are highly uncertain about how to resolve them (procedural uncertainty). In such cases—and combined with stress—the natural course of action is to mimic the behavior of others or follow established social norms. This is based on the psychological desire to fail conventionally.

The tendency to prefer the status-quo
What happens when people are presented with difficult choices and no obvious right answer? We tend to prefer making no decision at all; that is, we default to the status quo.

In high-stakes decisions, many options are better than the status quo, but choosing them requires trade-offs. Yet when faced with decisions that involve life-and-death trade-offs, people frequently remark, “I'd rather not think about it.”

Failures to learn
Although individuals and organizations are eager to derive intelligence from experience, the inferences stemming from that eagerness are often misguided. The problems lie partly in errors in how people think, but even more so in properties of experience that confound learning from it. Experience may possibly be the best teacher, but it is not a particularly good teacher.

As an illustration, one study found that participants in an earthquake simulation tended to over-invest in mitigation when it was normatively ineffective but under-invest when it was normatively effective. The reason was misinterpretation of feedback: when mitigation was ineffective, respondents attributed the persistence of damage to the fact that they had not invested enough. By contrast, when it was effective, they attributed the absence of damage to a belief that earthquakes posed limited damage risk.

Gresham's Law of Decision Making
Over time, bad decisions will tend to drive out good decisions in an organization.

Improving
What can you do to improve your decision-making?

A few things: 1) learn more about judgment and decision making; 2) encourage decision makers to see events through alternative frames, such as gains versus losses and changes to the status quo; 3) adjust the time frame of decisions—while the probability of an earthquake at your plant may be 1/100 in any given year, the probability over the 25-year life of the plant is roughly 1/5; and 4) read Farnam Street!
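The time-frame adjustment in point 3 is simple arithmetic: if years are independent and the annual probability is p, the chance of at least one occurrence in n years is 1 - (1 - p)^n. A minimal sketch of the earthquake numbers above:

```python
# Chance of at least one earthquake over the plant's life,
# assuming independent years and a constant annual probability.
p_annual = 1 / 100
years = 25

p_lifetime = 1 - (1 - p_annual) ** years
print(round(p_lifetime, 3))  # 0.222, roughly 1 in 5
```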

Mental Model: Raising prices and increasing consumption

Other things being equal, when the price of a good rises, the quantity demanded should fall. Economists refer to this as the law of demand. Because there are many counter-examples, the law is better thought of as a quick rule of thumb.

Giffen goods are goods that people consume more of as the price rises, violating the law of demand. The name comes from Sir Robert Giffen, who first proposed the paradox based on his observations of the purchasing habits of the Victorian-era poor.

Alfred Marshall explains it like this:

a rise in the price of bread makes so large a drain on the resources of the poorer labouring families and raises so much the marginal utility of money to them, that they are forced to curtail their consumption of meat and the more expensive farinaceous foods: and, bread being still the cheapest food which they can get and will take, they consume more, and not less of it.
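Marshall's logic can be made concrete with a toy model. Assume (all numbers here are invented for illustration) a household with a fixed food budget and a minimum-calorie requirement that buys as much meat as it can afford and fills the remaining calories with bread:

```python
def food_basket(budget, calories_needed, price_bread, price_meat):
    """Buy as much meat as the budget allows while still meeting the
    calorie floor; bread, the cheap staple, fills the remaining calories."""
    meat = (budget - price_bread * calories_needed) / (price_meat - price_bread)
    meat = max(0.0, meat)
    return calories_needed - meat, meat  # (bread, meat)

# Hypothetical: $10 budget, 20 calorie-units needed, meat at $1 per unit.
print(food_basket(10, 20, price_bread=0.20, price_meat=1.00))  # (12.5, 7.5)
print(food_basket(10, 20, price_bread=0.30, price_meat=1.00))  # (~14.3, ~5.7)
# When bread's price rises, bread consumption *increases*: the household
# cuts meat and backfills the lost calories with bread.
```

The bread here behaves as a Giffen good only because it absorbs so much of the budget that the income effect of its price increase swamps the substitution effect.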

But these exceptions to the “law of demand” can be broader than food or even inferior goods. Often a price can act as a quality indicator. More mischievous sellers can raise prices and use part of the profits to bribe the purchasing agent (like Visa). Amazon does this. Brian Zen does this. Mutual funds do this. Bribing the purchasing agent is actually fairly common once you open your mind to it.

In his speech “Academic Economics: Strengths and Faults After Considering Interdisciplinary Needs,” Charlie Munger explained:

One of the most extreme examples is in the investment management field. Suppose you're the manager of a mutual fund, and you want to sell more. People commonly come to the following answer: You raise the commissions, which of course reduces the number of units of real investments delivered to the ultimate buyer, so you're increasing the price per unit of real investment that you're selling the ultimate customer. And you're using that extra commission to bribe the customer's purchasing agent. You're bribing the broker to betray his client and put the client's money into the high-commission product. This has worked to produce at least a trillion dollars of mutual fund sales.

* * *

A real-world example of what Munger was talking about:

Sanjay Bakshi offers a great real-world example of how Brian Zen, of Zenway, bribes the purchasing agent.

(Brian) wants me to recommend to you, as my student, his online investment course which has helped “people with limited means to become millionaires and multi-millionaires.”

Dr. Zen will charge you $800 to sign up. He promises to pay me $400, if you do.

You will, I hope, recall the connection between Dr. Zen's offer and Mr. Charlie Munger's example of “bribing the purchasing agent”.

…How, then, does one go about selling a high-priced product derived out of something so cheap? That's simple. One uses the reward super-response tendency and the associated incentive-caused bias (whose bread I eat, his song I sing) which it produces – Mr. Munger's terms – by pricing the product high and offering a very significant part of the sales proceeds to people like me having access to a “captive audience” like you.

There is nothing illegal about Dr. Zen designing his business model in this manner. After all, seeking profits is the essence of capitalism, isn't it? But I doubt it very much – if the fathers of value investing – Mr. Graham and Mr. Buffett – would approve of the marketing strategies used by Dr. Zen, for promoting products created out of their knowledge, which they generously shared with the world, without any profit motive involved.

When Wal-Mart pushes its suppliers to lower their prices, and then passes on these low prices to its customers, and yet is able to earn a respectable return on capital, it's an example of a positive-sum game which benefits civilization as a whole. Wal-Mart does not make money off its customers – it makes money with them. But when someone pushes a high-priced product using as ammunition mouth-watering commissions offered to people who are in a position to influence others, it largely becomes, at least in my view, a zero-sum game. You're not making money with your clients anymore; you're making money off them. And this aspect of capitalism is not very good for civilization.

You can read more on Giffen goods at Wikipedia.

The Art and Science of Asking Better Questions

At the recommendation of Warren Buffett's biographer, Alice Schroeder, I've been reading The Craft of Interviewing.

Schroeder seems pretty crafty at knowing when, what, and how to ask.

I want to ask better questions. I want to learn to suppress my ego and stop thinking about what I want to say when the other person is talking.

I've never been taught how to ask questions, which makes me wonder if I'm getting the most out of the questions I do ask.

If you think about it, asking better questions is really just a clever way to steal from the rich and give to the poor. In this case I'm stealing knowledge.

I have a lot of smart friends — by smart, I mean incredibly smart, not just plain smart — and I want to maximize the knowledge I gain from this privilege when we're together.

Here are some of the things I dog-eared while reading this book that you might be interested in:

  • The interview, generally, may take two shapes: one like a funnel, the other like an inverted funnel. The funnel-shaped interview opens with generalities – “What are the benefits of nuclear warfare, Mr. President?” – then pins down the generalizations – “When and where has it produced those spectacular sunsets that you mention?” The funnel allows the subject some say in the direction of the interview.
  • Sherlock Holmes would have been fond of the inverted-funnel; it opens with hard, fast, specific questions, then ascends to a more general ground. Used appropriately this form can help put people at ease. Another way to put people at ease is to start with the easy questions. (Learn to think more like Holmes.)
  • Don't ever make someone feel as if he can't get his point across, no matter how hard he tries.
  • Far too many people ask questions that try to put the spotlight on themselves rather than the person with the information.
  • Avoid two-part, hypothetical, and leading questions.
  • People won't confess their inner thoughts unless they have proof the person asking those questions is sympathetic.
  • Mike Wallace says “The single most interesting thing you can do in television, I find, is to ask a good question and then let the answer hang there for two or three seconds or four seconds as though you're expecting more.”
  • Envelop tough questions in “people are saying,” which keeps the person answering from thinking the questioner is attacking them. (Blame someone else for the question.) Another technique is to imply the question is a playful one: “I'd like to play the devil's advocate for a moment.” You can also preface the question with praise.

If anyone knows of other books on asking better questions, shoot me an email.

Buy The Craft of Interviewing.

Choice Under Uncertainty

Among the general heuristics—rules of thumb—that people use in making judgments are several that produce biases: toward classifying situations according to their representativeness, toward judging frequencies according to the availability of examples in memory, or toward interpretations warped by the way in which a problem has been framed. These heuristics have important implications for individuals and society.

Insensitivity to Base Rates
When people are given information about the probabilities of certain events (e.g., how many lawyers and how many engineers are in a population that is being sampled), and then are given some additional information as to which of the events has occurred (which person has been sampled from the population), they tend to ignore the prior probabilities in favor of incomplete or even quite irrelevant information about the individual event. Thus, if they are told that 70 percent of the population are lawyers, and if they are then given a noncommittal description of a person (one that could equally well fit a lawyer or an engineer), half the time they will predict that the person is a lawyer and half the time that he is an engineer–even though the laws of probability dictate that the best forecast is always to predict that the person is a lawyer.
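In Bayesian terms, a description that fits lawyers and engineers equally well has a likelihood ratio of 1, so the posterior probability should simply equal the base rate. A minimal sketch:

```python
def posterior_lawyer(prior_lawyer, p_desc_if_lawyer, p_desc_if_engineer):
    """Bayes' rule for the two-category lawyer/engineer problem."""
    numer = p_desc_if_lawyer * prior_lawyer
    denom = numer + p_desc_if_engineer * (1 - prior_lawyer)
    return numer / denom

# A noncommittal description fits both professions equally well,
# so it should leave the 70 percent base rate untouched:
print(posterior_lawyer(0.70, 0.5, 0.5))  # 0.7 -- still bet on "lawyer"
```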

Insensitivity to Sample Size
People commonly misjudge probabilities in many other ways. Asked to estimate the probability that 60 percent or more of the babies born in a hospital during a given week are male, they ignore information about the total number of births, although it is evident that the probability of a departure of this magnitude from the expected value of 50 percent is smaller if the total number of births is larger (the standard error of a percentage varies inversely with the square root of the population size).
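You can check that intuition directly with the binomial distribution. The weekly birth counts below (15 for a small hospital, 45 for a large one) are hypothetical:

```python
from math import ceil, comb

def prob_share_or_more(n_births, share=0.60, p_boy=0.5):
    """Exact binomial probability that boys make up at least
    `share` of n_births."""
    k_min = ceil(share * n_births)
    return sum(comb(n_births, k) * p_boy**k * (1 - p_boy)**(n_births - k)
               for k in range(k_min, n_births + 1))

print(round(prob_share_or_more(15), 3))  # ~0.304 at the small hospital
print(round(prob_share_or_more(45), 3))  # ~0.12 at the large hospital
# The same 60 percent deviation is far less likely in the larger sample.
```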

Availability
There are situations in which people assess the frequency of a class by the ease with which instances can be brought to mind. In one experiment, subjects heard a list of names of persons of both sexes and were later asked to judge whether there were more names of men or women on the list. In lists presented to some subjects, the men were more famous than the women; in other lists, the women were more famous than the men. For all lists, subjects judged that the sex that had the more famous personalities was the more numerous.

Framing and Loss Aversion
The way in which an uncertain possibility is presented may have a substantial effect on how people respond to it. When asked whether they would choose surgery in a hypothetical medical emergency, many more people said that they would when the chance of survival was given as 80 percent than when the chance of death was given as 20 percent.

Source: Decision Making and Problem Solving, Herbert A. Simon

The Anatomy of a Decision: An Introduction to Decision Making


“The only proven way to raise your odds of making a good decision is
to learn to use a good decision-making process—one that can
get you the best solution with a minimal
loss of time, energy, money, and composure.”
— John Hammond

***

This is an introduction to decision making.

A good decision-making process can literally change the world.

Consider the following example from Predictable Surprises: In 1962, when spy planes spotted Soviet missiles in Cuba, U.S. military leaders urged President Kennedy to authorize an immediate attack. Fresh from the bruising failure of the Bay of Pigs, Kennedy instead set up a structured decision-making process to evaluate his options. In a precursor of the Devil's Advocacy method, Kennedy established two groups, each including government officials and outside experts, to develop and evaluate the two main options: attack Cuba, or set up a blockade to prevent more missiles from reaching its shores. Based on the groups' analysis and debate, Kennedy decided to establish a blockade. The Soviets backed down, and nuclear war was averted. Recently available documents suggest that if the United States had invaded Cuba, the consequences would have been catastrophic: Soviet missiles that had not been located by U.S. intelligence could still have struck several U.S. cities.

The concept of a decision-making process can be found early in the history of thought, where decisions were expected to be the result of rational and deliberate reasoning. Plato argued that human knowledge can be derived on the basis of reason alone, using deduction and self-evident propositions. Aristotle formalized logic with proofs by which someone could reasonably determine whether a conclusion was true or false. However, as we will discover, not all decisions are perfectly rational. Often we let our System 1 thinking–intuition–make decisions for us. Our intuition is based on long-term memory acquired over years of learning, and it allows our mind to process and judge without conscious awareness. System 1 thinking, however, does not always lead to optimal solutions and often tricks our mind into thinking that consequences and second-order effects are either non-existent or less probable than reality would indicate.

In Predictable Surprises Max Bazerman writes:

Rigorous decision analysis combines a systematic assessment of the probabilities of future events with a hard-headed evaluation of the costs and benefits of particular outcomes. As such, it can be an invaluable tool in helping organizations overcome the biases that hinder them in estimating the likelihood of unpleasant events. Decision analysis begins with a clear definition of the decision to be made, followed by an explicit statement of objectives and explicit criteria for assessing the “goodness” of alternative courses of action, by which we mean the net cost or benefit as perceived by the decision-maker. The next steps involve identifying potential courses of action and their consequences. Because these elements often are laid out visually in a decision tree, this technique is known as “decision tree analysis.” Finally, the technique instructs decision-makers to explicitly assess and make trade-offs based on the potential costs and benefits of different courses of action.

To conduct a proper decision analysis, leaders must carefully quantify costs and benefits, their tolerance for accepting risk, and the extent of uncertainty associated with different potential outcomes. These assumptions are inherently subjective, but the process of quantification is nonetheless extremely valuable: it forces participants to express their assumptions and beliefs, thereby making them transparent and subject to challenge and improvement.

From Judgment in Managerial Decision Making by Max Bazerman:

The term judgment refers to the cognitive aspects of the decision-making process. To fully understand judgment, we must first identify the components of the decision-making process that require it.

Let's look at six steps you should take, either implicitly or explicitly, when applying a “rational” decision-making process to each scenario.

1. Define the problem. (M)anagers often act without a thorough understanding of the problem to be solved, leading them to solve the wrong problem. Accurate judgment is required to identify and define the problem. Managers often err by (a) defining the problem in terms of a proposed solution, (b) missing a bigger problem, or (c) diagnosing the problem in terms of its symptoms. Your goal should be to solve the problem, not just eliminate its temporary symptoms.

2. Identify the criteria. Most decisions require you to accomplish more than one objective. When buying a car, you may want to maximize fuel economy, minimize cost, maximize comfort, and so on. The rational decision maker will identify all relevant criteria in the decision-making process.

3. Weight the criteria. Different criteria will vary in importance to a decision maker. Rational decision makers will know the relative value they place on each of the criteria identified. The value may be specified in dollars, points, or whatever scoring system makes sense.

4. Generate alternatives. The fourth step in the decision-making process requires identification of possible courses of action. Decision makers often spend an inappropriate amount of search time seeking alternatives, thus creating a barrier to effective decision making. An optimal search continues only until the cost of the search outweighs the value of added information.

5. Rate each alternative on each criterion. How well will each of the alternative solutions achieve each of the defined criteria? This is often the most difficult stage of the decision-making process, as it typically requires us to forecast future events. The rational decision maker carefully assesses the potential consequences on each of the identified criteria of selecting each of the alternative solutions.

6. Compute the optimal decision. Ideally, after all of the first five steps have been completed, the process of computing the optimal decision consists of (a) multiplying the ratings in step 5 by the weight of each criterion, (b) adding up the weighted ratings across all of the criteria for each alternative, and (c) choosing the solution with the highest sum of weighted ratings.
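Steps 5 and 6 reduce to a weighted-sum computation. Here is a minimal sketch using the car-buying criteria from step 2, with weights and 1-to-10 ratings invented for illustration:

```python
# Hypothetical criterion weights (summing to 1) and 1-10 ratings.
weights = {"fuel_economy": 0.5, "cost": 0.3, "comfort": 0.2}

ratings = {
    "sedan":     {"fuel_economy": 7, "cost": 8, "comfort": 6},
    "hatchback": {"fuel_economy": 9, "cost": 6, "comfort": 5},
    "suv":       {"fuel_economy": 4, "cost": 5, "comfort": 9},
}

def weighted_score(car_ratings, weights):
    """Step 6(a)-(b): multiply each rating by its criterion weight, then sum."""
    return sum(weights[c] * r for c, r in car_ratings.items())

scores = {car: weighted_score(r, weights) for car, r in ratings.items()}
print(scores)                       # sedan 7.1, hatchback 7.3, suv 5.3
print(max(scores, key=scores.get))  # step 6(c): "hatchback" wins
```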

Hammond, Keeney, and Raiffa suggest 8 steps in their book Smart Choices:

1. Work on the right problem.
2. Identify all criteria.
3. Create imaginative alternatives.
4. Understand the consequences.
5. Grapple with your tradeoffs.
6. Clarify your uncertainties.
7. Think hard about your risk tolerance.
8. Consider linked decisions.

* * *

People, however, are not always perfectly logical machines. In Judgment in Managerial Decision Making, the distinction between System One and System Two thinking becomes clear:

System 1 thinking refers to our intuitive system, which is typically fast, automatic, effortless, implicit, and emotional. We make most decisions in life using System 1 thinking. For instance, we usually decide how to interpret verbal language or visual information automatically and unconsciously. By contrast, System 2 refers to reasoning that is slower, conscious, effortful, explicit, and logical. System 2 thinking can be broken down into (1) define the problem; (2) identify the criteria; (3) weigh the criteria; (4) generate alternatives; (5) rate each alternative on each criterion; (6) compute the optimal decision.

In most situations, our System 1 thinking is quite sufficient; it would be impractical, for example, to logically reason through every choice we make while shopping for groceries. But System 2 logic should preferably influence our most important decisions.

* * *

When making a decision, we are psychologically influenced, either consciously or unconsciously. By exploring these biases and other elementary worldly wisdom, I hope to make you a better decision maker.

Following a rational decision process can help us focus on outcomes that are low in probability but high in potential cost. Without easily quantifiable costs, we often dismiss low-probability events or fall prey to biases. We don't want to be the fragilista.

Even rational decision-making processes like the one presented above rest on several assumptions. The first is that the decision maker is completely informed, meaning they know about all the possible options and outcomes. The second is that the decision maker does not fall prey to any biases that might impact the rational decision.

In researching decision-making processes, it struck me as odd that few people question the information on which the criteria are measured. For instance, if you are purchasing a car and use fuel efficiency as the sole criterion, you would need to make sure that fuel consumption was tested and measured in the same way for all the cars under consideration. This kind of second-order thinking can help you make better decisions.

If you want to make better decisions, you should read Judgment in Managerial Decision Making. Hands down that is the best book I've come across on decision making. If you know of a better one, please send me an email.

Stanovich's book, What Intelligence Tests Miss: The Psychology of Rational Thought, proposes a whole range of cognitive abilities and dispositions, independent of intelligence, that have at least as much to do with whether we think and behave rationally as intelligence itself does.

 

Follow your curiosity to The best books on the psychology behind human decision making and Problem Solving 101.