How Situations Influence Decisions

In Think Twice: Harnessing the Power of Counterintuition, Michael Mauboussin, the first guest on my podcast, The Knowledge Project, explains how enormously our situations influence our decisions.

Mistakes born out of situations are difficult to avoid, in part because the influences on us are operating at a subconscious level. “Making good decisions in the face of subconscious pressure,” Mauboussin writes, “requires a very high degree of background knowledge and self-awareness.”

How do you feel when you read the word “treasure”? Do you feel good? What images come to mind? If you are like most people, just ruminating on “treasure” gives you a little lift. Our minds naturally make connections and associate ideas. So if someone introduces a cue to you— a word, a smell, a symbol— your mind often starts down an associative path. And you can be sure the initial cue will color a decision that waits at the path’s end. All this happens outside of your perception.

People around us also influence our decisions, often with good reason. Social influence arises for a couple of reasons. The first is asymmetric information, a fancy phrase meaning someone knows something you don’t. In those cases, imitation makes sense because the information upgrade allows you to make better decisions.

Peer pressure, or the desire to be part of the in-group, is a second source of social influence. For good evolutionary reasons, humans like to be part of a group— a collection of interdependent individuals— and naturally spend a good deal of time assessing who is “in” and who is “out.” Experiments in social psychology have repeatedly confirmed this.

We explain behaviour based on an individual’s choices and disposition, not the situation. That is, we associate bad behaviour with the person and not the situation. Unless, of course, we’re talking about ourselves. This is the “fundamental attribution error,” a phrase coined by Lee Ross, a social psychologist at Stanford University.

This sword cuts both ways: the power of situations can work for good and for evil. “Some of the greatest atrocities known to mankind,” Mauboussin writes, “resulted from putting normal people into bad situations.”

We believe our choices are independent of circumstance; the evidence, however, points in another direction.

***
Some Wine With Your Music?

Consider how something as simple as the music playing in a store influences what wine we purchase.

Imagine strolling down the supermarket aisle and coming upon a display of French and German wines, roughly matched for price and quality. You do some quick comparisons, place a German wine in your cart, and continue shopping. After you check out, a researcher approaches and asks why you bought the German wine. You mention the price, the wine’s dryness, and how you anticipate it will go nicely with a meal you are planning. The researcher then asks whether you noticed the German music playing and whether it had any bearing on your decision. Like most, you would acknowledge hearing the music and avow that it had nothing to do with your selection.

But this isn’t a hypothetical; it’s an actual study, and the results affirm that the environment influences our decisions.

In this test, the researchers placed the French and German wines next to each other, along with small national flags. Over two weeks, the scientists alternated playing French accordion music and German Bierkeller pieces and watched the results. When French music played, French wines represented 77 percent of the sales. When German music played, consumers selected German wines 73 percent of the time. (See the image below.) The music made a huge difference in shaping purchases. But that’s not what the shoppers thought.

While the customers acknowledged that the music made them think of either France or Germany, 86 percent denied the tunes had any influence on their choice.

[Image: wine purchases by nationality of in-store music]

This is an example of priming, which psychologists formally define as “the incidental activation of knowledge structures by the current situational context.”1 Priming happens all the time, and it is most effective when the prime has a strong connection to our situation’s goals.

Another example of how situations influence us is the default. In a fast-moving world of non-stop bits and bytes, the default is the path of least resistance; it’s the system one option. Moving away from the default is labour-intensive for our brains. Studies have repeatedly shown that most people go with defaults.

This applies to a wide array of choices, from insignificant issues like the ringtone on a new cell phone to consequential issues like financial savings, educational choice, and medical alternatives. Richard Thaler, an economist, and Cass Sunstein, a law professor, call the relationship between choice presentation and the ultimate decision “choice architecture.” They convincingly argue that we can easily nudge people toward a particular decision based solely on how we arrange the choices for them.

One context for decision making is how choices are structured. Knowing that many people opt for the default option, we can influence (for better or worse) large groups of people.

Mauboussin relates a story about a prominent psychologist popular on the speaking circuit, one that “underscores how underappreciated choice architecture remains.”

When companies call to invite him to speak, he offers them two choices. Either they can pay him his set fee and get a standard presentation, or they can pay him nothing in exchange for the opportunity to work with him on an experiment to improve choice architecture (e.g., redesign a form or Web site). Of course, the psychologist benefits by getting more real-world results on choice architecture, but it seems like a pretty good deal for the company as well, because an improved architecture might translate into financial benefits vastly in excess of his speaking fee. He noted ruefully that so far not one company has taken him up on his experiment offer.

(As a brief aside, I engage in public speaking on a fairly regular basis. I’ve toyed with similar ideas. Once I even went as far as offering to speak for no pre-set fee, only “value added” as judged by the client. They opted for the fee.)

Another great example of how environments affect behaviour is Stanley Milgram’s famous experiment on obedience to authority. “Ordinary people, simply doing their jobs, and without any particular hostility on their part, can become agents in a terrible destructive process,” Milgram wrote. The Stanford Prison Experiment is yet another example.

***

The key point is that situations are generally more powerful than we think, and we can do things to resist the pull of “unwelcome social influence.” Mauboussin offers four tips.

1. Be aware of your situation.

You can think of this in two parts. There is the conscious element, where you can create a positive environment for decision making in your own surroundings by focusing on process, keeping stress to an acceptable level, being a thoughtful choice architect, and making sure to diffuse the forces that encourage negative behaviors.

Then there is coping with the subconscious influences. Control over these influences requires awareness of the influence, motivation to deal with it, and the willingness to devote attention to address possible poor decisions. In the real world, satisfying all three control conditions is extremely difficult, but the path starts with awareness.

2. Consider the situation first and the individual second.

This concept, called attributional charity, insists that you evaluate the decisions of others by starting with the situation and then turning to the individuals, not the other way around. While easier for Easterners than Westerners, most of us consistently underestimate the role of the situation in assessing the decisions we see others make. Try not to make the fundamental attribution error.

3. Watch out for the institutional imperative.

Warren Buffett, the celebrated investor and chairman of Berkshire Hathaway, coined the term institutional imperative to explain the tendency of organizations to “mindlessly” imitate what peers are doing. There are typically two underlying drivers of the imperative. First, companies want to be part of the in-group, much as individuals do. So if some companies in an industry are doing mergers, chasing growth, or expanding geographically, others will be tempted to follow. Second are incentives. Executives often reap financial rewards by following the group. When decision makers make money from being part of the crowd, the draw is nearly inescapable.

One example comes from a Financial Times interview with the former chief executive officer of Citigroup Chuck Prince in 2007, before the brunt of the financial crisis. “When the music stops, things will be complicated,” offered Prince, demonstrating that he had some sense of what was to come. “But as long as the music is playing, you’ve got to get up and dance.” The institutional imperative is rarely a good dance partner.

4. Avoid inertia.

Periodically revisit your processes and ask whether they are serving their purpose. Organizations sometimes adopt routines and structures that become crystallized, impeding positive change. Efforts to reform education in the United States, for example, have been met with resistance from teachers and administrators who prefer the status quo.

We like to think that we’re better than the situation, that we follow the decision making process and rationally weigh the facts, consider alternatives, and determine the best course of action. Others are easily influenced; we are not. That is exactly where we go wrong.

Decision making is fundamentally a social exercise, something I cover in my Re:Think Decision Making workshop.

1. “Automaticity of Social Behavior: Direct Effects of Trait Construction and Stereotype Activation on Action”

The Wisdom of Crowds and The Expert Squeeze

As networks harness the wisdom of crowds, the ability of experts to add value in their predictions is steadily declining. This is the expert squeeze.

In Think Twice: Harnessing the Power of Counterintuition, Michael Mauboussin, the first guest on my podcast, The Knowledge Project, explains the expert squeeze and its implications for how we make decisions.

As networks harness the wisdom of crowds and computing power grows, the ability of experts to add value in their predictions is steadily declining. I call this the expert squeeze, and evidence for it is mounting. Despite this trend, we still pine for experts— individuals with special skill or know-how— believing that many forms of knowledge are technical and specialized. We openly defer to people in white lab coats or pinstripe suits, believing they hold the answers, and we harbor misgivings about computer-generated outcomes or the collective opinion of a bunch of tyros.

The expert squeeze means that people stuck in old habits of thinking are failing to use new means to gain insight into the problems they face. Knowing when to look beyond experts requires a totally fresh point of view, and one that does not come naturally. To be sure, the future for experts is not all bleak. Experts retain an advantage in some crucial areas. The challenge is to know when and how to use them.

The Value of Experts

So how can we manage this in our role as decision maker? The first step is to classify the problem.

The figure above (The Value of Experts) helps guide this process. The second column from the left covers problems that have rules-based solutions with limited possible outcomes. Here, someone can investigate the problem based on past patterns and write down rules to guide decisions. Experts do well with these tasks, but once the principles are clear and well defined, computers are cheaper and more reliable. Think of tasks such as credit scoring or simple forms of medical diagnosis. Experts agree about how to approach these problems because the solutions are transparent and for the most part tried and true.

[…]

Now let’s go to the opposite extreme, the column on the far right that deals with probabilistic fields with a wide range of outcomes. Here there are no simple rules. You can only express possible outcomes in probabilities, and the range of outcomes is wide. Examples include economic and political forecasts. The evidence shows that collectives outperform experts in solving these problems.

[…]

The middle two columns are the remaining province for experts. Experts do well with rules-based problems with a wide range of outcomes because they are better than computers at eliminating bad choices and making creative connections between bits of information.

Once you’ve classified the problem, you can turn to the best method for solving it.
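
The mapping just described can be summarized in a simple lookup. The sketch below is my own illustration of that classification, not something from the book, and the example strings are only shorthand for the cases Mauboussin mentions:

```python
def suggested_approach(rules_based: bool, wide_range_of_outcomes: bool) -> str:
    """Rough encoding of the problem classification described above (illustrative only)."""
    if rules_based and not wide_range_of_outcomes:
        # Clear rules, limited outcomes: once the rules are written down,
        # computers are cheaper and more reliable than experts.
        return "computers (e.g., credit scoring, simple medical diagnosis)"
    if not rules_based and wide_range_of_outcomes:
        # Probabilistic field with a wide range of outcomes: collectives tend to beat experts.
        return "collectives (e.g., economic and political forecasts)"
    # The middle columns remain the province of experts, who are better at
    # eliminating bad choices and making creative connections.
    return "experts"


print(suggested_approach(rules_based=True, wide_range_of_outcomes=False))
```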

… computers and collectives remain underutilized guides for decision making across a host of realms including medicine, business, and sports. That said, experts remain vital in three capacities. First, experts must create the very systems that replace them. … Of course, the experts must stay on top of these systems, improving the market or equation as need be.

Next, we need experts for strategy. I mean strategy broadly, including not only day-to-day tactics but also the ability to troubleshoot by recognizing interconnections as well as the creative process of innovation, which involves combining ideas in novel ways. Decisions about how best to challenge a competitor, which rules to enforce, or how to recombine existing building blocks to create novel products or experiences are jobs for experts.

Finally, we need people to deal with people. A lot of decision making involves psychology as much as it does statistics. A leader must understand others, make good decisions, and encourage others to buy in to the decision.

So what practical steps can you take to make the expert squeeze work for you instead of against you? Mauboussin offers three tips.

1. Match the problem you face with the most appropriate solution.

What we know is that experts do a poor job in many settings, suggesting that you should try to supplement expert views with other approaches.

2. Seek diversity.

(Philip) Tetlock’s work shows that while expert predictions are poor overall, some are better than others. What distinguishes predictive ability is not who the experts are or what they believe, but rather how they think. Borrowing from Archilochus— through Isaiah Berlin— Tetlock sorted experts into hedgehogs and foxes. Hedgehogs know one big thing and try to explain everything through that lens. Foxes tend to know a little about a lot of things and are not married to a single explanation for complex problems. Tetlock finds that foxes are better predictors than hedgehogs. Foxes arrive at their decisions by stitching “together diverse sources of information,” lending credence to the importance of diversity. Naturally, hedgehogs are periodically right— and often spectacularly so— but do not predict as well as foxes over time. For many important decisions, diversity is the key at both the individual and collective levels.

3. Use technology when possible.

Leverage technology to sidestep the squeeze when possible.

Flooded with candidates and aware of the futility of most interviews, Google decided to create algorithms to identify attractive potential employees. First, the company asked seasoned employees to fill out a three-hundred-question survey, capturing details about their tenure, their behavior, and their personality. The company then compared the survey results to measures of employee performance, seeking connections. Among other findings, Google executives recognized that academic accomplishments did not always correlate with on-the-job performance. This novel approach enabled Google to sidestep problems with ineffective interviews and to start addressing the discrepancy.
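
As a rough sketch of that kind of analysis (my own illustration, not Google’s actual method; the survey items, column names, and numbers below are hypothetical), you could correlate each survey item with a performance measure and rank the items:

```python
import pandas as pd

# Hypothetical data: one row per seasoned employee,
# survey answers plus a measure of on-the-job performance.
df = pd.DataFrame({
    "gpa":                  [3.9, 3.2, 3.7, 2.9, 3.5, 3.1],
    "years_of_tenure":      [1, 4, 2, 6, 3, 5],
    "programming_contests": [3, 0, 2, 1, 0, 1],
    "performance_score":    [62, 81, 70, 88, 74, 79],
})

# Correlate each survey item with performance and rank the items by strength.
correlations = (
    df.drop(columns="performance_score")
      .corrwith(df["performance_score"])
      .sort_values(ascending=False)
)
print(correlations)
```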

Learning the difference between when experts help and when they hurt can go a long way toward avoiding stupidity. This starts with identifying the type of problem you’re facing and then considering the various approaches to solving it, with their pros and cons.

Still curious? Follow up by reading Generalists vs. Specialists, Think Twice: Harnessing the Power of Counterintuition, and reviewing the work of Philip Tetlock on why how you think matters more than what you think.

Countering the Inside View and Making Better Decisions

You can reduce the number of mistakes you make by thinking about problems more clearly.

In his book Think Twice: Harnessing the Power of Counterintuition, Michael Mauboussin discusses how we can “fall victim to simplified mental routines that prevent us from coping with the complex realities inherent in important judgment calls.” One of those routines is the inside view, which we’re going to talk about in this article. But first, let’s get a bit of context.

No one wakes up thinking, “I am going to make bad decisions today.” Yet we all make them. What is particularly surprising is some of the biggest mistakes are made by people who are, by objective standards, very intelligent. Smart people make big, dumb, and consequential mistakes.

[…]

Mental flexibility, introspection, and the ability to properly calibrate evidence are at the core of rational thinking and are largely absent on IQ tests. Smart people make poor decisions because they have the same factory settings on their mental software as the rest of us, and that software isn’t designed to cope with many of today’s problems.

We don’t spend enough time thinking about the process and learning from it. Generally, we’re pretty ambivalent about the process by which we make decisions.

… typical decision makers allocate only 25 percent of their time to thinking about the problem properly and learning from experience. Most spend their time gathering information, which feels like progress and appears diligent to superiors. But information without context is falsely empowering.

That reminds me of what Daniel Kahneman wrote in Thinking, Fast and Slow:

A remarkable aspect of your mental life is that you are rarely stumped … The normal state of your mind is that you have intuitive feelings and opinions about almost everything that comes your way. You like or dislike people long before you know much about them; you trust or distrust strangers without knowing why; you feel that an enterprise is bound to succeed without analyzing it.

So we’re not really gathering information so much as seeking support for our existing intuition, which is the very thing a good decision process should help root out.

***
Ego-Induced Blindness

One prevalent error we make is that we tend to favour the inside view over the outside view.

An inside view considers a problem by focusing on the specific task and by using information that is close at hand, and makes predictions based on that narrow and unique set of inputs. These inputs may include anecdotal evidence and fallacious perceptions. This is the approach that most people use in building models of the future and is indeed common for all forms of planning.

[…]

The outside view asks if there are similar situations that can provide a statistical basis for making a decision. Rather than seeing a problem as unique, the outside view wants to know if others have faced comparable problems and, if so, what happened. The outside view is an unnatural way to think, precisely because it forces people to set aside all the cherished information they have gathered.

When the inside view is more positive than the outside view, you are effectively arguing against the base rate. You’re saying (knowingly or, more likely, unknowingly) that this time is different. Our brains are all too happy to help us construct this argument.

Mauboussin argues that we embrace the inside view for a few primary reasons. First, we’re optimistic by nature. Second, there is the “illusion of optimism” (we see our future as brighter than that of others). Finally, there is the illusion of control (we think that chance events are subject to our control).

One interesting point is that while we’re bad at looking at the outside view when it comes to ourselves, we’re better at it when it comes to other people.

In fact, the planning fallacy embodies a broader principle. When people are forced to look at similar situations and see the frequency of success, they tend to predict more accurately. If you want to know how something is going to turn out for you, look at how it turned out for others in the same situation. Daniel Gilbert, a psychologist at Harvard University, ponders why people don’t rely more on the outside view, “Given the impressive power of this simple technique, we should expect people to go out of their way to use it. But they don’t.” The reason is most people think of themselves as different, and better, than those around them.

So it’s mostly ego: we assume we’re better than the people who tackled this problem before us. We see the differences between situations and use them as rationalizations for why things are different this time.

Consider this:

We incorrectly think that differences are more valuable than similarities.

After all, anyone can see what’s the same but it takes true insight to see what’s different, right? We’re all so busy trying to find differences that we forget to pay attention to what is the same.

***
How to Incorporate the Outside View into your Decisions

In Think Twice, Mauboussin distills the work of Kahneman and Tversky into four steps and adds some commentary.

1. Select a Reference Class

Find a group of situations, or a reference class, that is broad enough to be statistically significant but narrow enough to be useful in analyzing the decision that you face. The task is generally as much art as science, and is certainly trickier for problems that few people have dealt with before. But for decisions that are common—even if they are not common for you— identifying a reference class is straightforward. Mind the details. Take the example of mergers and acquisitions. We know that the shareholders of acquiring companies lose money in most mergers and acquisitions. But a closer look at the data reveals that the market responds more favorably to cash deals and those done at small premiums than to deals financed with stock at large premiums. So companies can improve their chances of making money from an acquisition by knowing what deals tend to succeed.

2. Assess the distribution of outcomes.

Once you have a reference class, take a close look at the rate of success and failure. … Study the distribution and note the average outcome, the most common outcome, and extreme successes or failures.

[…]

Two other issues are worth mentioning. The statistical rate of success and failure must be reasonably stable over time for a reference class to be valid. If the properties of the system change, drawing inference from past data can be misleading. This is an important issue in personal finance, where advisers make asset allocation recommendations for their clients based on historical statistics. Because the statistical properties of markets shift over time, an investor can end up with the wrong mix of assets.

Also keep an eye out for systems where small perturbations can lead to large-scale change. Since cause and effect are difficult to pin down in these systems, drawing on past experiences is more difficult. Businesses driven by hit products, like movies or books, are good examples. Producers and publishers have a notoriously difficult time anticipating results, because success and failure is based largely on social influence, an inherently unpredictable phenomenon.

3. Make a prediction.

With the data from your reference class in hand, including an awareness of the distribution of outcomes, you are in a position to make a forecast. The idea is to estimate your chances of success and failure. For all the reasons that I’ve discussed, the chances are good that your prediction will be too optimistic.

Sometimes when you find the right reference class, you see the success rate is not very high. So to improve your chance of success, you have to do something different than everyone else.

4. Assess the reliability of your prediction and fine-tune.

How good we are at making decisions depends a great deal on what we are trying to predict. Weather forecasters, for instance, do a pretty good job of predicting what the temperature will be tomorrow. Book publishers, on the other hand, are poor at picking winners, with the exception of those books from a handful of best-selling authors. The worse the record of successful prediction is, the more you should adjust your prediction toward the mean (or other relevant statistical measure). When cause and effect is clear, you can have more confidence in your forecast.
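
To make the four steps concrete, here is a minimal sketch of the mechanics, assuming you can express the reference-class outcomes and your own inside-view estimate as numbers. It is my own illustration rather than anything prescribed in the book; the "predictability" weight stands in for how clear cause and effect are in the domain:

```python
import statistics

def outside_view_forecast(reference_outcomes, inside_estimate, predictability):
    """Blend an inside-view estimate with the reference-class base rate.

    reference_outcomes: outcomes observed in comparable situations (step 2)
    inside_estimate:    your own forecast for this particular case (step 3)
    predictability:     0.0 (mostly luck) to 1.0 (cause and effect are clear);
                        the lower it is, the more we adjust toward the mean (step 4)
    """
    base_rate = statistics.mean(reference_outcomes)   # average outcome in the class
    spread = statistics.pstdev(reference_outcomes)    # how wide the distribution is
    forecast = predictability * inside_estimate + (1 - predictability) * base_rate
    return forecast, base_rate, spread

# Hypothetical reference class: first-year revenue growth (%) of comparable product launches.
past_launches = [2, 5, -1, 4, 3, 6, 0, 2]
forecast, base_rate, spread = outside_view_forecast(past_launches, inside_estimate=12, predictability=0.3)
print(f"base rate {base_rate:.1f}%, spread {spread:.1f}%, adjusted forecast {forecast:.1f}%")
```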

***

The main lesson we can take from this is that we tend to focus on what’s different, whereas the best decisions often focus on just the opposite: what’s the same. While a situation may seem a little different, it’s almost always the same.

As Charlie Munger has said: “if you notice, the plots are very similar. The same plot comes back time after time.”

Particulars may vary, but unless those particulars are the variables that govern the outcome of the situation, the pattern remains. If you’re going to focus on what’s different rather than what’s the same, you’d best be sure the variables you’re clinging to matter.

Michael Mauboussin on Intuition, Experts, Technology, and Making Better Decisions

Michael Mauboussin, Credit Suisse

Welcome to The Knowledge Project, an experimental podcast aimed at acquiring wisdom through interviews with key luminaries from across the globe to gain insights into how they think, live, and connect ideas. The core themes will seem familiar to readers: Decision Making, Leadership, Innovation. But it also touches on questions about what it means to live a good life.

***

The first episode of The Knowledge Project features Michael Mauboussin, the head of Global Financial Strategies at Credit Suisse. He’s also written numerous books, including More Than You Know: Finding Financial Wisdom in Unconventional Places, Think Twice: Harnessing the Power of Counterintuition, and most recently The Success Equation: Untangling Skill and Luck in Business, Sports, and Investing. More importantly, Mauboussin spends more time thinking about thinking than most people.

In this episode we explore parenting, daily routines, reading, and how to make better decisions.

***

Here is a list of books mentioned in the podcast:

***

In this excerpt from the podcast, Mauboussin comments on the role of intuition in the decision making process:

The way I like to think about this, and by the way there’s a great book by David Myers on this, called “Intuition.” It’s a book I really would recommend. It’s one of the better treatments of this, and more thoughtful treatments of this.

The way I think about this is, intuition is very domain-specific. Specifically, I would use the language of Danny Kahneman – System one, System two. System one is our experiential system. It’s fast, it’s automatic, but it’s not very malleable. It’s difficult to train.

Our System two, of course, our analytical system, is slower, more purposeful, more deliberate but more trainable. Intuition applies when you participate in a particular activity to a sufficient amount that you effectively train your System one.

So that things become, go from your slow system to your fast system. Where would this work, for instance? It would work in things like, obviously, with things like chess. Chess masters, we know, they chunk. They can see the board very quickly, know who’s at advantage, who’s not at advantage.

But it’s not going to work… So, the key characteristic is it’s going to work in what I would call stable linear environments. Stable linear environments. Athletics would be another example. For long parts of history, it was in warfare. Certain elements of warfare would work.

But if you get into unstable, non-linear environments, all bets are going to be off. There is a great quote from Greg Northcraft, which I love, when he says you have to differentiate between experience and expertise. Intuition relates to this.

He said expertise… An expert is someone who has a predictive model that works, and so just because you’ve been doing something for a long time doesn’t mean that you have a predictive model that works.

I would say intuition should be used with a lot of caution.

The key is to have disciplined intuition.

(Danny Kahneman) said, “You know, you’re going to have these base rates, or statistical ways of thinking about things, and then you’re going to have your intuition. How do you use those two things, and in what order?”

The argument he made was you should always start with the base rate, the statistical approach, and then layer in your intuition. He called it “disciplined intuition.” Otherwise, if you go with your intuition first, you’re going to seek out, right, you’re going to seek out things that support your point of view.

I always think about it that way. I know that a lot of people make decisions using their gut or their intuition, but I don’t know that that’s the best way to do it in most settings. Some settings, yes, but most settings, no.

How Using a Decision Journal can Help you Make Better Decisions

"Odds are you’re going to discover two things right away. First, you’re right a lot of the time. Second, it’s often for the wrong reasons."
“Odds are you’re going to discover two things. First, you’re right a lot of the time. Second, it’s often for the wrong reasons.”

One question I’m often asked is what should a decision journal look like?

You should care enormously about whether you’re making good decisions or bad ones. After all, in most knowledge organizations, your product is decisions.

A good decision process matters more than analysis by a factor of six. A process or framework for making decisions, however, is only one part of an overall approach to making better decisions.

The way to test the quality of your decisions, whether individually or organizationally, is by testing the process by which they are made. And one way to do that is to use a decision journal.

You can think of a decision journal as quality control — something like we’d find in a manufacturing plant.

Conceptually this is pretty easy, but it requires some discipline and humility to implement and maintain. In consulting with various organizations on how to make better decisions, I’ve seen everything from people who take great strides to improve their decision making to people who doctor decision journals for optics over substance.

“The idea,” says Michael Mauboussin, “is whenever you are making a consequential decision, write down what you decided, why you decided as you did, what you expect to happen, and if you’re so inclined, how you feel mentally and physically.”

Whenever you’re making a consequential decision, either individually or as part of a group, take a moment and write down:

  1. The situation or context;
  2. The problem statement or frame;
  3. The variables that govern the situation;
  4. The complications or complexity as you see it;
  5. Alternatives that were seriously considered and why they were not chosen (think: the work required to have an opinion);
  6. A paragraph explaining the range of outcomes;
  7. A paragraph explaining what you expect to happen and, importantly, the reasoning and actual probabilities you assign to each (the degree of confidence matters, a lot);
  8. The time of day the decision was made and how you feel physically and mentally (if you’re tired, for example, write it down).

Of course, this can be tailored to the situation and context. Specific decisions might include tradeoffs, weighting criteria, or other relevant factors.
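
If you want to capture these fields in a structured way, a minimal sketch of an entry might look like the following. The field names and the review helper are my own illustration, not a template Mauboussin provides:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import Dict, List

@dataclass
class DecisionJournalEntry:
    """One entry per consequential decision; the fields mirror the list above."""
    situation: str                            # 1. the situation or context
    problem_statement: str                    # 2. the problem statement or frame
    variables: List[str]                      # 3. variables that govern the situation
    complications: str                        # 4. complications or complexity as you see it
    alternatives_considered: str              # 5. alternatives considered and why they were rejected
    range_of_outcomes: str                    # 6. the range of outcomes
    expected_outcome: str                     # 7. what you expect to happen, and the reasoning
    outcome_probabilities: Dict[str, float]   # 7. e.g. {"success": 0.6, "partial": 0.3, "failure": 0.1}
    physical_mental_state: str                # 8. tired, stressed, calm, and so on
    decided_at: datetime = field(default_factory=datetime.now)

    def review_date(self, months: int = 6) -> datetime:
        """Entries should be revisited regularly; default to roughly six months out."""
        return self.decided_at + timedelta(days=30 * months)
```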

One point worth noting is not to spend too much time on the brief and obvious insights. Often these first thoughts are system one, not system two. Any decision you’re journaling is inherently complex (and may involve non-linear systems). In such a world, small effects can cause disproportionate responses, whereas bigger ones can have no impact. Remember that causality is complex, especially in complex domains.

Review

One tip I’ve learned from helping organizations implement this is that there are two common ways people wiggle out of their own decision: hindsight bias and jargon.

I know we live in an age of computers, but you simply must do this by hand, because writing by hand helps reduce the odds of hindsight bias. It’s easy to look at a print-out and say, “I didn’t see it that way.” It’s a lot harder to look at your own writing and say the same thing.

Another thing to avoid is vague and ambiguous wording. If you’re talking in abstractions and fog, you’re not ready to make a decision, and you’ll find it easy to change the definitions to suit new information. This is where writing down the probabilities as you see them comes into play.

These journals should be reviewed on a regular basis—every six months or so. The review is an important part of the process. This is where you can get better. Realizing where you make mistakes, how you make them, what types of decisions you’re bad at, etc. will help you make better decisions if you’re rational enough. This is also where a coach can help. If you share your journal with someone, they can review it with you and help identify areas for improvement.

And keep in mind it’s not all about outcome. You might have made the right decision (which, in our sense means a good process) and had a bad outcome. We call that a bad break.

Odds are you’re going to discover two things right away. First, you’re right a lot of the time. Second, it’s often for the wrong reasons.

This can be somewhat humbling.

Let’s say you buy a stock and it goes up, but it goes up for reasons that are not the same as the ones you thought. You’re probably thinking high-fives all around, right? But in a very real sense you were wrong. This feedback is incredibly important.

If you let it, the information provided by this journal will help identify cases where you think you know more than you do, when in fact you’re operating outside your circle of competence.

It will also show you how your views change over time, when you tend to make better decisions, and how serious the deliberations were.

A Wonderfully Simple Heuristic to Recognize Charlatans

Nassim Nicholas Taleb

“For the Arab scholar and religious leader Ali Bin Abi-Taleb (no relation), keeping one’s distance from an ignorant person is equivalent to keeping company with a wise man.”

The idea of inversion isn’t new.

While we can learn a lot from what successful people do in the mornings, as Nassim Taleb points out, we can learn a lot from what failed people do before breakfast too.

Inversion is actually one of the most powerful mental models in our arsenal. Not only does inversion help us innovate but it also helps us deal with uncertainty.

“It is in the nature of things,” says Charlie Munger, “that many hard problems are best solved when they are addressed backward.”

Sometimes we can’t articulate what we want. Sometimes we don’t know. Sometimes there is so much uncertainty that the best approach is to attempt to avoid certain outcomes rather than attempt to guide towards the ones we desire. In short, we don’t always know what we want but we know what we don’t want.

Avoiding stupidity is often easier than seeking brilliance.

The “apophatic,” writes Nassim Taleb in Antifragile, “focuses on what cannot be said directly in words, from the Greek apophasis (saying no, or mentioning without meaning).”

The method began as an avoidance of direct description, leading to a focus on negative description, what is called in Latin via negativa, the negative way, after theological traditions, particularly in the Eastern Orthodox Church. Via negativa does not try to express what God is— leave that to the primitive brand of contemporary thinkers and philosophasters with scientistic tendencies. It just lists what God is not and proceeds by the process of elimination.

Statues are carved by subtraction.

Michelangelo was asked by the pope about the secret of his genius, particularly how he carved the statue of David, largely considered the masterpiece of all masterpieces. His answer was: “It’s simple. I just remove everything that is not David.”

Where Is the Charlatan?

Recall that the interventionista focuses on positive action—doing. Just like positive definitions, we saw that acts of commission are respected and glorified by our primitive minds and lead to, say, naive government interventions that end in disaster, followed by generalized complaints about naive government interventions, as these, it is now accepted, end in disaster, followed by more naive government interventions. Acts of omission, not doing something, are not considered acts and do not appear to be part of one’s mission.

I have used all my life a wonderfully simple heuristic: charlatans are recognizable in that they will give you positive advice, and only positive advice, exploiting our gullibility and sucker-proneness for recipes that hit you in a flash as just obvious, then evaporate later as you forget them. Just look at the “how to” books with, in their title, “Ten Steps for—” (fill in: enrichment, weight loss, making friends, innovation, getting elected, building muscles, finding a husband, running an orphanage, etc.).

We learn the most from the negative.

[I]n practice it is the negative that’s used by the pros, those selected by evolution: chess grandmasters usually win by not losing; people become rich by not going bust (particularly when others do); religions are mostly about interdicts; the learning of life is about what to avoid. You reduce most of your personal risks of accident thanks to a small number of measures.

Skill doesn’t always win.

In anything requiring a combination of skill and luck, the most skillful don’t always win. That’s one of the key messages of Michael Mauboussin’s book The Success Equation: Untangling Skill and Luck in Business, Sports, and Investing. This is hard for us to swallow because we intuitively feel that success implies skill, for the same reason that we assume a good outcome implies a good decision. We can’t predict whether a person who has skills will succeed, but Taleb argues that we can “pretty much predict” that a person without skills will eventually have their luck run out.

Subtractive Knowledge

Taleb argues that the greatest “and most robust contribution to knowledge consists in removing what we think is wrong—subtractive epistemology.” He continues that “we know a lot more about what is wrong than what is right.” What does not work, that is, negative knowledge, is more robust than positive knowledge. This is because it’s a lot easier for something we think we know to be proven wrong than it is for something we know to be wrong to turn out to be right.

There is a whole book on the half-life of what we consider to be ‘knowledge or fact’ called The Half-Life of Facts. Basically, because of our partial understanding of the world, which is constantly evolving, we believe things that are not true. That’s not the only reason that we believe things that are not true but it’s a big one.

The thing is, we’re not so smart. If I’ve only seen white swans, saying “all swans are white” may be accurate given my limited view of the world, but I can never be sure there are no black swans until I’ve seen everything.

Or as Taleb puts it: “since one small observation can disprove a statement, while millions can hardly confirm it, disconfirmation is more rigorous than confirmation.”

Most people attribute this philosophical argument to Karl Popper, but Taleb dug up evidence that it goes back to the “skeptical-empirical” medical schools of the post-classical era in the Eastern Mediterranean.

Being antifragile isn’t about what you do, but rather what you avoid. Avoid fragility. Avoid stupidity. Don’t be the sucker. …