
The Generalized Specialist: How Shakespeare, Da Vinci, and Kepler Excelled

“What do you want to be when you grow up?” Do you ever ask kids this question? Did adults ask you this when you were a kid?

Even if you managed to escape this question until high school, by the time you got there you were probably expected to answer it, if only to choose a college and a major. Maybe you took aptitude tests alongside the standard academic ones. This is when the pressure to pick a path to a job begins. Increasingly, the education system seems to want to shorten the time it takes for us to become productive members of the workforce, so instead of exploring more options, we are encouraged to start narrowing them.

Any field you go into, from finance to engineering, requires some degree of specialization. Once you land a job, the process of specialization only intensifies. You become a specialist in certain aspects of the organization you work for.

Then something happens. Maybe your specialty is no longer needed or gets replaced by technology. Or perhaps you get promoted. As you go up the ranks of the organization, your specialty becomes less and less important, and yet the tendency is to hold on to it longer and longer. If it’s the only subject or skill you know better than anything else, you tend to see it everywhere. Even where it doesn’t exist.

Every problem is a nail and you just happen to have a hammer.

Only this approach doesn’t work. Because you don’t know the big ideas, you start making decisions that don’t take into account how the world really works. These decisions ripple outward, and you have to spend time correcting your mistakes. If you’re not careful about self-reflection, you won’t learn, and you’ll keep making versions of the same mistake over and over.

Should we become specialists or polymaths? Is there a balance we should pursue?

There is no single answer.

The decision is personal. And most of the time we fail to see the life-changing implications of it. Whether we’re conscious of this or not, it’s also a decision we have to make and re-make over and over again. Every day, we have to decide where to invest our time — do we become better at what we do or learn something new?

If you can’t adapt, changes become threats instead of opportunities.

There is another way to think about this question, though.

Around 2700 years ago, the Greek poet Archilochus wrote: “the fox knows many things; the hedgehog one big thing.” In the 1950s, philosopher Isaiah Berlin used that sentence as the basis of his essay “The Hedgehog and the Fox.” In it, Berlin divides great thinkers into two categories: hedgehogs, who have one perspective on the world, and foxes, who have many different viewpoints. Although Berlin later claimed the essay was not intended to be serious, it has become a foundational part of thinking about the distinction between specialists and generalists.

Berlin wrote that “…there exists a great chasm between those, on one side, who relate everything to a single central vision, one system … in terms of which they understand, think and feel … and, on the other hand, those who pursue many ends, often unrelated and even contradictory, connected, if at all, only in some de facto way.”

A generalist is a person who is a competent jack of all trades, with lots of divergent useful skills and capabilities. This is the handyman who can fix your boiler, unblock the drains, replace a door hinge, or paint a room. The general practitioner doctor whom you see for any minor health problem (and who refers you to a specialist for anything major). The psychologist who works with the media, publishes research papers, and teaches about a broad topic.

A specialist is someone with distinct knowledge and skills related to a single area. This is the cardiologist who spends their career treating and understanding heart conditions. The scientist who publishes and teaches about a specific protein for decades. The developer who works with a particular program.

In his original essay, Berlin writes that foxes, the generalists, “lead lives, perform acts and entertain ideas that are centrifugal rather than centripetal; their thought is scattered or diffused, moving on many levels, seizing upon the essence of a vast variety of experiences and objects … without … seeking to fit them into, or exclude them from, any one unchanging, all embracing … unitary inner vision.”

The generalist and the specialist are on the same continuum; there are degrees of specialization in a subject. There’s a difference between someone who specializes in teaching history and someone who specializes in teaching the history of the American Civil War, for example. Likewise, there is a spectrum for how generalized or specialized a certain skill is.

Some skills — like the ability to focus, to read critically, or to make rational decisions — are of universal value. Others are a little more specialized but can be used in many different careers. Examples of these skills would be design, project management, and fluency in a foreign language.

The distinction between generalization and specialization comes from biology. Species are referred to as either generalists or specialists, as with the hedgehog and the fox.

A generalist species can live in a range of environments, utilizing whatever resources are available. Often, these critters eat an omnivorous diet. Raccoons, mice, and cockroaches are generalists. They live all over the world and can eat almost anything. If a city is built in their habitat, then no problem; they can adapt.

A specialist species needs particular conditions to survive. In some cases, they are able to live only in a discrete area or eat a single food. Pandas are specialists, needing a diet of bamboo to survive. Specialist species can thrive if the conditions are correct. Otherwise, they are vulnerable to extinction.

A specialist who is outside of their circle of competence and doesn’t know it is incredibly dangerous.

The distinction between generalist and specialist species is useful as a point of comparison. Generalist animals (including humans) can be less efficient, yet they are less fragile amidst change. If you can’t adapt, changes become threats instead of opportunities.

While it’s not very glamorous to take career advice from a raccoon or a panda, we can learn something from them about the dilemmas we face. Do we want to be like a raccoon, able to survive anywhere, although never maximizing our potential in a single area? Or like a panda, unstoppable in the right context, but struggling in an inappropriate one?

Costs and Benefits

Generalists have the advantage of interdisciplinary knowledge, which fosters creativity and a firmer understanding of how the world works. They have a better overall perspective and can generally perform second-order thinking in a wider range of situations than the specialist can.

Generalists often possess transferable skills, allowing them to be flexible with their career choices and adapt to a changing world. They can do a different type of work and adapt to changes in the workplace. Gatekeepers tend to cause fewer problems for generalists than for specialists.

Managers and leaders are often generalists because they need a comprehensive perspective of their entire organization. And an increasing number of companies are choosing to have a core group of generalists on staff, and hire freelance specialists only when necessary.

The métiers at the lowest risk of future automation tend to be those requiring a diverse, nuanced skill set, including construction vehicle operators, blue-collar tradespeople, therapists, dentists, and teachers.

When their particular skills are in demand, specialists experience substantial upsides. The scarcity of their expertise means higher salaries, less competition, and more leverage. Nurses, doctors, programmers, and electricians are currently in high demand where I live, for instance.

Specialists get to be passionate about what they do — not in the usual “follow your passion!” way, but in the sense that they can go deep and derive the satisfaction that comes from expertise. Garrett Hardin offers his perspective on the value of specialists: 

…we cannot do without experts. We accept this fact of life, but not without anxiety. There is much truth in the definition of the specialist as someone who “knows more and more about less and less.” But there is another side to the coin of expertise. A really great idea in science often has its birth as apparently no more than a particular answer to a narrow question; it is only later that it turns out that the ramifications of the answer reach out into the most surprising corners. What begins as knowledge about very little turns out to be wisdom about a great deal.

Hardin cites the development of probability theory as an example. When Blaise Pascal and Pierre de Fermat sought to devise a means of dividing the stakes in an interrupted gambling game, their expertise created a theory with universal value.
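To make Hardin’s example concrete, here is a minimal Python sketch of the “problem of points” that Pascal and Fermat solved: how to split the pot fairly when a fair game is interrupted. The pot size and score below are invented for illustration.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def win_probability(a: int, b: int) -> float:
    """Chance the first player wins the match from here, where the
    first player still needs `a` points, the opponent needs `b`,
    and each round is a fair 50/50 toss."""
    if a == 0:
        return 1.0
    if b == 0:
        return 0.0
    return 0.5 * win_probability(a - 1, b) + 0.5 * win_probability(a, b - 1)

# Interrupted game: the first player needs 1 more point, the opponent needs 3.
# Pascal and Fermat's insight: split the pot in proportion to win probability.
pot = 100
share = pot * win_probability(1, 3)
print(f"First player's fair share: {share:.1f} coins")  # 87.5
```

The narrow gambling question generalizes exactly as Hardin describes: the same recursion prices any staged contest under uncertainty.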

The same goes for many mental models and unifying theories. Specialists come up with them, and generalists make use of them in surprising ways.

The downside is that specialists are vulnerable to change. Many specialist jobs are disappearing as technology changes. Stockbrokers, for example, face the possibility of replacement by AI in coming years. That doesn’t mean no one will hold those jobs, but demand will decrease. Many people will need to learn new work skills, and starting over in a new field will put them back decades. That’s a serious knock, both psychologically and financially.

Specialists are also subject to “man with a hammer” syndrome. Their area of expertise can become the lens through which they see everything.

As Michael Mauboussin writes in Think Twice:

…people stuck in old habits of thinking are failing to use new means to gain insight into the problems they face. Knowing when to look beyond experts requires a totally fresh point of view and one that does not come naturally. To be sure, the future for experts is not all bleak. Experts retain an advantage in some crucial areas. The challenge is to know when and how to use them.

Understanding and staying within their circle of competence is even more important for specialists. A specialist who is outside of their circle of competence and doesn’t know it is incredibly dangerous.

Philip Tetlock performed an 18-year study to look at the quality of expert predictions. Could people who are considered specialists in a particular area forecast the future with greater accuracy than a generalist? Tetlock tracked 284 experts from a range of disciplines, recording the outcomes of 28,000 predictions.

The results were stark: predictions coming from generalist thinkers were more accurate. Experts who stuck to their specialized areas and ignored interdisciplinary knowledge fared worse. The specialists tended to be more confident in their erroneous predictions than the generalists. The specialists made definite assertions — which we know from probability theory to be a bad idea. It seems that generalists have an edge when it comes to Bayesian updating, recognizing probability distributions, and long-termism.
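As a rough illustration of the Bayesian updating that served the generalists well, here is a small sketch (the numbers are invented, not Tetlock’s): a forecaster revises a probability in light of new evidence instead of making a definite assertion.

```python
def bayes_update(prior: float, p_evidence_if_true: float, p_evidence_if_false: float) -> float:
    """Posterior probability of a hypothesis after one piece of evidence,
    via Bayes' rule."""
    numerator = prior * p_evidence_if_true
    return numerator / (numerator + (1 - prior) * p_evidence_if_false)

# Start at 30%, then observe evidence twice as likely if the event is real.
p = bayes_update(prior=0.30, p_evidence_if_true=0.80, p_evidence_if_false=0.40)
print(f"Updated forecast: {p:.0%}")  # 46% -- a revision, not a certainty
```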

Organizations, industries, and the economy need both generalists and specialists. And when we lack the right balance, it creates problems. Millions of jobs remain unfilled, while millions of people lack employment. Many of the empty positions require specialized skills. Many of the unemployed have skills which are too general to fill those roles. We need a middle ground.

The Generalized Specialist

The economist, philosopher, and writer Henry Hazlitt sums up the dilemma:

In the modern world knowledge has been growing so fast and so enormously, in almost every field, that the probabilities are immensely against anybody, no matter how innately clever, being able to make a contribution in any one field unless he devotes all his time to it for years. If he tries to be the Rounded Universal Man, like Leonardo da Vinci, or to take all knowledge for his province, like Francis Bacon, he is most likely to become a mere dilettante and dabbler. But if he becomes too specialized, he is apt to become narrow and lopsided, ignorant on every subject but his own, and perhaps dull and sterile even on that because he lacks perspective and vision and has missed the cross-fertilization of ideas that can come from knowing something of other subjects.

What’s the safest option, the middle ground?

By many accounts, it’s being a specialist in one area while retaining a few broadly applicable skills. That might sound like a contradiction, but it isn’t: being a specialist and being a generalist are not mutually exclusive.

A generalizing specialist has a core competency which they know a lot about. At the same time, they are always learning and have a working knowledge of other areas. While a generalist has roughly the same knowledge of multiple areas, a generalizing specialist has one deep area of expertise and a few shallow ones. We have the option of developing a core competency while building a base of interdisciplinary knowledge.

“The fox knows many things, but the hedgehog knows one big thing.”

— Archilochus

As Tetlock’s research shows, for us to understand how the world works, it’s not enough to home in on one tiny area for decades. We need to pull ideas from everywhere, remaining open to having our minds changed, always looking for disconfirming evidence. Joseph Tussman put it this way: “If we do not let the world teach us, it teaches us a lesson.”

Many great thinkers are (or were) generalizing specialists.

Shakespeare specialized in writing plays, but his experiences as an actor, poet, and part owner of a theater company informed what he wrote. So did his knowledge of Latin, agriculture, and politics. Indeed, the earliest known reference to his work comes from a critic who accused him of being “an absolute Johannes factotum” (jack of all trades).

Leonardo da Vinci was a famous generalizing specialist. As well as the art he is best known for, da Vinci dabbled in engineering, music, literature, mathematics, botany, and history. These areas informed his art — note, for example, the rigorous application of botany and mathematics in his paintings. Some scholars consider da Vinci the first person to combine interdisciplinary knowledge in this way, or the first to recognize that a person can branch out beyond their defining trade.

Johannes Kepler revolutionized our knowledge of planetary motion by combining physics and optics with his main focus, astronomy. Military strategist John Boyd designed aircraft and developed new tactics, using insights from divergent areas he studied, including thermodynamics and psychology. He could think in a different manner from his peers, who remained immersed in military knowledge for their entire careers.

Shakespeare, Da Vinci, Kepler, and Boyd excelled by branching out from their core competencies. These men knew how to learn fast, picking up the key ideas and then returning to their specialties. Unlike their forgotten peers, they didn’t continue studying one area past the point of diminishing returns; they got back to work — and the results were extraordinary.

Many people seem to do work which is unrelated to their area of study or their prior roles. But dig a little deeper and it’s often the case that knowledge from the past informs their present. Marcel Proust put it best: “the real act of discovery consists not in finding new lands, but in seeing with new eyes.”

Interdisciplinary knowledge is what allows us to see with new eyes.

When Charlie Munger was asked at the Daily Journal’s 2017 shareholders meeting whether it’s better to become a polymath or a specialist, his answer surprised a lot of people. Many expected it to be obvious: of course he would recommend becoming a generalist. Only that is not what he said.

Munger remarked:

I don’t think operating over many disciplines, as I do, is a good idea for most people. I think it’s fun, that’s why I’ve done it. And I’m better at it than most people would be, and I don’t think I’m good at being the very best at handling differential equations. So, it’s been a wonderful path for me, but I think the correct path for everybody else is to specialize and get very good at something that society rewards, and then to get very efficient at doing it. But even if you do that, I think you should spend 10 to 20% of your time [on] trying to know all the big ideas in all the other disciplines. Otherwise … you’re like a one-legged man in an ass-kicking contest. It’s not going to work very well. You have to know the big ideas in all the disciplines to be safe if you have a life lived outside a cave. But no, I think you don’t want to neglect your business as a dentist to think great thoughts about Proust.

In his comments, we can find the underlying approach most likely to yield exponential results: Specialize most of the time, but spend time understanding the broader ideas of the world.

This approach isn’t what most organizations and educational institutions provide. Branching out isn’t in many job descriptions or in many curricula. It’s a project we have to undertake ourselves, by reading a wide range of books, experimenting with different areas, and drawing ideas from each one.

Still curious? Check out the biographies of Leonardo da Vinci and Ben Franklin.



Daniel Kahneman in Conversation with Michael Mauboussin on Intuition, Causality, Loss Aversion and More

Ever want to be the fly on the wall for a fascinating conversation? Well, here's your chance. Santa Fe Institute Board of Trustees Chair Michael Mauboussin interviews Nobel laureate Daniel Kahneman. The wide-ranging conversation covers disciplined intuition, causality, base rates, loss aversion, and much more. You don't want to miss this.

Here's an excerpt from Kahneman I think you'll enjoy. You can read the entire transcript here.

[Gary Klein's] The Sources of Power is a very eloquent book on expert intuition with magnificent examples, and so he is really quite hostile to my point of view, basically.

We spent years working on that, on the question of when can intuitions be trusted? What's the boundary between trustworthy and untrustworthy intuitions?

I would summarize the answer as saying there is one thing you should not do. People's confidence in their intuition is not a good guide to their validity. Confidence is something else entirely, and maybe we can talk about confidence separately later, but confidence is not it.

What there is, if you want to know whether you can trust intuition, it really is like deciding on a painting, whether it's genuine or not. You can look at the painting all you want, but asking about the provenance is usually the best guide about whether a painting is genuine or not.

Similarly for expertise and intuition, you have to ask not how happy the individual is with his or her own intuitions, but first of all, you have to ask about the domain. Is the domain one where there is enough regularity to support intuitions? That's true in some medical domains, it certainly is true in chess, it is probably not true in stock picking, and so there are domains in which intuition can develop and others in which it cannot. Then you have to ask whether, if it's a good domain, one in which there are regularities that can be picked up by the limited human learning machine. If there are regularities, did the individual have an opportunity to learn those regularities? That primarily has to do with the quality of the feedback.

Those are the questions that I think should be asked, so there is a wide domain where intuitions can be trusted, and they should be trusted, and in a way, we have no option but to trust them because most of the time, we have to rely on intuition because it takes too long to do anything else.

Then there is a wide domain where people have equal confidence but are not to be trusted, and that may be another essential point about expertise. People typically do not know the limits of their expertise, and that certainly is true in the domain of finances, of financial analysis and financial knowledge. There is no question that people who advise others about finances have expertise about finance that their advisees do not have. They know how to look at balance sheets, they understand what happens in conversations with analysts.

There is a great deal that they know, but they do not really know what is going to happen to a particular stock next year. They don't know that, that is one of the typical things about expert intuition in that we know domains where we have it, there are domains where we don't, but we feel the same confidence and we do not know the limits of our expertise, and that sometimes is quite dangerous.

***

Making Decisions in a Complex Adaptive System


In Think Twice: Harnessing the Power of Counterintuition, Mauboussin does a good job adding to the work we've already done on complex adaptive systems:

You can think of a complex adaptive system in three parts (see the image at the top of this post). First, there is a group of heterogeneous agents. These agents can be neurons in your brain, bees in a hive, investors in a market, or people in a city. Heterogeneity means each agent has different and evolving decision rules that both reflect the environment and attempt to anticipate change in it. Second, these agents interact with one another, and their interactions create structure— scientists often call this emergence. Finally, the structure that emerges behaves like a higher-level system and has properties and characteristics that are distinct from those of the underlying agents themselves. … The whole is greater than the sum of the parts.
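Mauboussin’s three parts are easy to see in a toy model. The sketch below is not from the book; it is a threshold cascade in the spirit of Granovetter’s classic riot model, with heterogeneous agents, interaction, and an emergent system-level outcome that no single agent’s rule predicts.

```python
import random

random.seed(42)

# 1. Heterogeneous agents: each has its own threshold and joins a
#    behavior once that many others are already doing it.
N = 100
thresholds = [random.randint(0, N) for _ in range(N)]

def cascade_size(thresholds: list[int]) -> int:
    adopters = sum(1 for t in thresholds if t == 0)  # unconditional starters
    while True:
        # 2. Interaction: every agent reacts to the current number of adopters.
        new_total = sum(1 for t in thresholds if t <= adopters)
        if new_total == adopters:
            # 3. Emergence: where the cascade settles is a property of the
            #    whole population, not of any individual decision rule.
            return adopters
        adopters = new_total

print(f"{cascade_size(thresholds)} of {N} agents end up participating")
```

Shift a few thresholds and the final cascade can swing from a handful of participants to nearly everyone, which is exactly why studying a single agent tells you so little about the system.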

***

The inability to understand a system based on its components prompted Nobel Prize winner and physicist Philip Anderson to draft the essay “More Is Different.” Anderson wrote, “The behavior of large and complex aggregates of elementary particles, it turns out, is not to be understood in terms of the simple extrapolation of the properties of a few particles. Instead, at each level of complexity entirely new properties appear.”

Mauboussin comments that we are fooled by randomness:

The problem goes beyond the inscrutable nature of complex adaptive systems. Humans have a deep desire to understand cause and effect, as such links probably conferred humans with evolutionary advantage. In complex adaptive systems, there is no simple method for understanding the whole by studying the parts, so searching for simple agent-level causes of system-level effects is useless. Yet our minds are not beyond making up a cause to relieve the itch of an unexplained effect. When a mind seeking links between cause and effect meets a system that conceals them, accidents will happen.

***
Misplaced Focus on the Individual

One mistake we make is extrapolating the behaviour of an individual component (say, a single person) to explain the entire system. Yet when we have to solve a problem involving a complex system, we often address an individual component. In doing so, we ignore Garrett Hardin's First Law of Ecology ("You can never do merely one thing") and become fragilistas.

That unintended system-level consequences arise from even the best-intentioned individual-level actions has long been recognized. But the decision-making challenge remains for a couple of reasons. First, our modern world has more interconnected systems than before. So we encounter these systems with greater frequency and, most likely, with greater consequence. Second, we still attempt to cure problems in complex systems with a naïve understanding of cause and effect.

***

When I speak with executives from around the world going through a period of poor performance, it doesn't take long for them to mention they want to hire a star from another company. “If only we had Kate,” they'll say, “we could smash the competition and regain our footing.”

At first, poaching stars from competitors or even teams within the same organization seems like a winning strategy. But once the star comes over the results often fail to materialize.

What we fail to grasp is that a star's performance is part of an ecosystem, and isolating that individual performance from the ecosystem is incredibly hard without considering the ecosystem as a whole. (Reversion to the mean likely accounts for some of the star's fading too.)

Three Harvard professors concluded, “When a company hires a star, the star’s performance plunges, there is a sharp decline in the functioning of the group or team the person works with, and the company’s market value falls.”

If it sounds like a lot of work to think this through at many levels, it should be. Why should it be easy?

Another example of this at an organizational level has to do with innovation. Most people want to solve the innovation problem. Setting aside for a second that this is the wrong framing, how do most organizations go about it? They copy what the most successful organizations do. I can't count the number of times the solution to an organization's “innovation problem” is to be more like Google. Well-intentioned executives blindly copy approaches such as 20% innovation time without giving an ounce of thought to the role the ecosystem plays.

Isolating and focusing on an individual part of a complex adaptive system without an appreciation and understanding of that system itself is sure to lead to disaster.

***
What Should We Do?

So this raises the question: what should we do when we find ourselves dealing with a complex adaptive system? Mauboussin provides three pieces of advice:

1. Consider the system at the correct level.

Remember the phrase “more is different.” The most prevalent trap is extrapolating the behavior of individual agents to gain a sense of system behavior. If you want to understand the stock market, study it at the market level. Consider what you see and read from individuals as entertainment, not as education. Similarly, be aware that the function of an individual agent outside the system may be very different from that function within the system. For instance, mammalian cells have the same metabolic rates in vitro, whether they are from shrews or elephants. But the metabolic rate of cells in small mammals is much higher than the rate of those in large mammals. The same structural cells work at different rates, depending on the animals they find themselves in.

2. Watch for tightly coupled systems.

A system is tightly coupled when there is no slack between items, allowing a process to go from one stage to the next without any opportunity to intervene. Aircraft, space missions, and nuclear power plants are classic examples of complex, tightly coupled systems. Engineers try to build in buffers or redundancies to avoid failure, but frequently don’t anticipate all possible contingencies. Most complex adaptive systems are loosely coupled, where removing or incapacitating one or a few agents has little impact on the system’s performance. For example, if you randomly remove some investors, the stock market will continue to function fine. But when the agents lose diversity and behave in a coordinated fashion, a complex adaptive system can behave in a tightly coupled fashion. Booms and crashes in financial markets are an illustration.

3. Use simulations to create virtual worlds.

Dealing with complex systems is inherently tricky because the feedback is equivocal, information is limited, and there is no clear link between cause and effect. Simulation is a tool that can help our learning process. Simulations are low cost, provide feedback, and have proved their value in other domains like military planning and pilot training.
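For a feel of what such a virtual world can look like, here is a minimal Monte Carlo sketch (my own illustration, not Mauboussin’s): simulate an uncertain three-phase project many times and read the overrun risk off the resulting distribution instead of guessing at cause and effect.

```python
import random

random.seed(1)

# Each phase's duration in weeks as (low, high, most likely) for a
# triangular distribution; the estimates here are invented.
PHASES = [(2, 6, 3), (1, 5, 2), (3, 9, 4)]
DEADLINE = 12

def overrun_probability(n_trials: int = 100_000) -> float:
    overruns = 0
    for _ in range(n_trials):
        total = sum(random.triangular(low, high, mode) for low, high, mode in PHASES)
        if total > DEADLINE:
            overruns += 1
    return overruns / n_trials

print(f"Estimated chance of missing the deadline: {overrun_probability():.0%}")
```

The simulation is cheap, gives unambiguous feedback, and lets you rehearse decisions before the real system punishes you for them.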

Still Curious? Think Twice: Harnessing the Power of Counterintuition.

How Situations Influence Decisions

In Think Twice: Harnessing the Power of Counterintuition, Michael Mauboussin, the first guest on my podcast, The Knowledge Project, explains how enormously our situations influence our decisions.

Mistakes born out of situations are difficult to avoid, in part because the influences on us are operating at a subconscious level. “Making good decisions in the face of subconscious pressure,” Mauboussin writes, “requires a very high degree of background knowledge and self-awareness.”

How do you feel when you read the word “treasure”? Do you feel good? What images come to mind? If you are like most people, just ruminating on “treasure” gives you a little lift. Our minds naturally make connections and associate ideas. So if someone introduces a cue to you— a word, a smell, a symbol— your mind often starts down an associative path. And you can be sure the initial cue will color a decision that waits at the path’s end. All this happens outside of your perception.

People around us also influence our decisions, often with good reason. Social influence arises for a couple of reasons. The first is asymmetric information, a fancy phrase meaning someone knows something you don’t. In those cases, imitation makes sense because the information upgrade allows you to make better decisions.

Peer pressure, or the desire to be part of the in-group, is a second source of social influence. For good evolutionary reasons, humans like to be part of a group— a collection of interdependent individuals— and naturally spend a good deal of time assessing who is “in” and who is “out.” Experiments in social psychology have repeatedly confirmed this.

We explain behavior based on an individual's choices and disposition, not the situation. That is, we associate bad behaviour with the person rather than the circumstances. Unless, of course, we're talking about ourselves. This is the “fundamental attribution error,” a phrase coined by Lee Ross, a social psychologist at Stanford University.

This sword cuts both ways: the power of situations can work for good and for evil. “Some of the greatest atrocities known to mankind,” Mauboussin writes, “resulted from putting normal people into bad situations.”

We believe our choices are independent of circumstance; the evidence, however, points in another direction.

***

Some Wine With Your Music?

Consider how something as simple as the music playing in a store influences what wine we purchase.

Imagine strolling down the supermarket aisle and coming upon a display of French and German wines, roughly matched for price and quality. You do some quick comparisons, place a German wine in your cart, and continue shopping. After you check out, a researcher approaches and asks why you bought the German wine. You mention the price, the wine’s dryness, and how you anticipate it will go nicely with a meal you are planning. The researcher then asks whether you noticed the German music playing and whether it had any bearing on your decision. Like most, you would acknowledge hearing the music and avow that it had nothing to do with your selection.

But this isn't a hypothetical; it's an actual study, and the results affirm that the environment influences our decisions.

In this test, the researchers placed the French and German wines next to each other, along with small national flags. Over two weeks, the scientists alternated playing French accordion music and German Bierkeller pieces and watched the results. When French music played, French wines represented 77 percent of sales. When German music played, consumers selected German wines 73 percent of the time. The music made a huge difference in shaping purchases. But that’s not what the shoppers thought.

While the customers acknowledged that the music made them think of either France or Germany, 86 percent denied the tunes had any influence on their choice.


This is an example of priming, which psychologists formally define as “the incidental activation of knowledge structures by the current situational context.”1 Priming happens all the time. For priming to be most effective, it must have a strong connection to our situational goals.

Another example of how situations influence us is the default. In a fast-moving world of non-stop bits and bytes, the default is the path of least resistance; it's the System 1 option. Moving away from the default is labor-intensive for our brains. Studies have repeatedly shown that most people go with defaults.

This applies to a wide array of choices, from insignificant issues like the ringtone on a new cell phone to consequential issues like financial savings, educational choice, and medical alternatives. Richard Thaler, an economist, and Cass Sunstein, a law professor, call the relationship between choice presentation and the ultimate decision “choice architecture.” They convincingly argue that we can easily nudge people toward a particular decision based solely on how we arrange the choices for them.

One context for decision making is how choices are structured. Knowing that many people opt for the default option, we can influence (for better or worse) large groups of people.

Mauboussin relates a story about a prominent psychologist popular on the speaking circuit that “underscores how underappreciated choice architecture remains.”

When companies call to invite him to speak, he offers them two choices. Either they can pay him his set fee and get a standard presentation, or they can pay him nothing in exchange for the opportunity to work with him on an experiment to improve choice architecture (e.g., redesign a form or Web site). Of course, the psychologist benefits by getting more real-world results on choice architecture, but it seems like a pretty good deal for the company as well, because an improved architecture might translate into financial benefits vastly in excess of his speaking fee. He noted ruefully that so far not one company has taken him up on his experiment offer.

(As a brief aside, I engage in public speaking on a fairly regular basis. I've toyed with similar ideas. Once I even went as far as offering to speak for no pre-set fee, only “value added” as judged by the client. They opted for the fee.)

Another great example of how environments affect behavior is Stanley Milgram's famous experiment on obedience to authority. “Ordinary people, simply doing their jobs, and without any particular hostility on their part, can become agents in a terrible destructive process,” Milgram wrote. The Stanford Prison Experiment is yet another example.

***

Situations are generally more powerful than we think

The key point is that situations are generally more powerful than we think. Fortunately, we can do things to resist the pull of “unwelcome social influence.”

Mauboussin offers four tips:

1. Be aware of your situation.

You can think of this in two parts. There is the conscious element, where you can create a positive environment for decision making in your own surroundings by focusing on process, keeping stress to an acceptable level, being a thoughtful choice architect, and making sure to diffuse the forces that encourage negative behaviors.

Then there is coping with the subconscious influences. Control over these influences requires awareness of the influence, motivation to deal with it, and the willingness to devote attention to address possible poor decisions. In the real world, satisfying all three control conditions is extremely difficult, but the path starts with awareness.

2. Consider the situation first and the individual second.

This concept, called attributional charity, insists that you evaluate the decisions of others by starting with the situation and then turning to the individuals, not the other way around. While easier for Easterners than Westerners, most of us consistently underestimate the role of the situation in assessing the decisions we see others make. Try not to make the fundamental attribution error.

3. Watch out for the institutional imperative.

Warren Buffett, the celebrated investor and chairman of Berkshire Hathaway, coined the term institutional imperative to explain the tendency of organizations to “mindlessly” imitate what peers are doing. There are typically two underlying drivers of the imperative. First, companies want to be part of the in-group, much as individuals do. So if some companies in an industry are doing mergers, chasing growth, or expanding geographically, others will be tempted to follow. Second are incentives. Executives often reap financial rewards by following the group. When decision makers make money from being part of the crowd, the draw is nearly inescapable.

One example comes from a Financial Times interview with the former chief executive officer of Citigroup Chuck Prince in 2007, before the brunt of the financial crisis. “When the music stops, things will be complicated,” offered Prince, demonstrating that he had some sense of what was to come. “But as long as the music is playing, you’ve got to get up and dance.” The institutional imperative is rarely a good dance partner.

4. Avoid inertia.

Periodically revisit your processes and ask whether they are serving their purpose. Organizations sometimes adopt routines and structures that become crystallized, impeding positive change. Efforts to reform education in the United States, for example, have been met with resistance from teachers and administrators who prefer the status quo.

We like to think that we're above the situation: that we follow a decision-making process, rationally weigh the facts, consider alternatives, and determine the best course of action. Others are easily influenced; we are not. This is where we're wrong.

Decision making is fundamentally a social exercise, something I cover in my Re:Think Decision Making workshop.

1. “Automaticity of Social Behavior: Direct Effects of Trait Construction and Stereotype Activation on Action”

The Wisdom of Crowds and The Expert Squeeze

As networks harness the wisdom of crowds, the ability of experts to add value in their predictions is steadily declining. This is the expert squeeze.

In Think Twice: Harnessing the Power of Counterintuition, Michael Mauboussin, the first guest on my podcast, The Knowledge Project, explains the expert squeeze and its implications for how we make decisions.

As networks harness the wisdom of crowds and computing power grows, the ability of experts to add value in their predictions is steadily declining. I call this the expert squeeze, and evidence for it is mounting. Despite this trend, we still pine for experts — individuals with special skill or know-how — believing that many forms of knowledge are technical and specialized. We openly defer to people in white lab coats or pinstripe suits, believing they hold the answers, and we harbor misgivings about computer-generated outcomes or the collective opinion of a bunch of tyros.

The expert squeeze means that people stuck in old habits of thinking are failing to use new means to gain insight into the problems they face. Knowing when to look beyond experts requires a totally fresh point of view, and one that does not come naturally. To be sure, the future for experts is not all bleak. Experts retain an advantage in some crucial areas. The challenge is to know when and how to use them.

The Value of Experts (figure)

So how can we manage this in our role as decision maker? The first step is to classify the problem.

(The figure above — The Value of Experts) helps to guide this process. The second column from the left covers problems that have rules-based solutions with limited possible outcomes. Here, someone can investigate the problem based on past patterns and write down rules to guide decisions. Experts do well with these tasks, but once the principles are clear and well defined, computers are cheaper and more reliable. Think of tasks such as credit scoring or simple forms of medical diagnosis. Experts agree about how to approach these problems because the solutions are transparent and for the most part tried and true.

[…]

Now let’s go to the opposite extreme, the column on the far right that deals with probabilistic fields with a wide range of outcomes. Here there are no simple rules. You can only express possible outcomes in probabilities, and the range of outcomes is wide. Examples include economic and political forecasts. The evidence shows that collectives outperform experts in solving these problems.

[…]

The middle two columns are the remaining province for experts. Experts do well with rules-based problems with a wide range of outcomes because they are better than computers at eliminating bad choices and making creative connections between bits of information.
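The grid Mauboussin describes can be captured in a few lines. The sketch below is only a toy encoding of the prose above (the categories are his; the code is mine):

```python
def best_solver(rules_based: bool, wide_range_of_outcomes: bool) -> str:
    """Map a problem type to the solver suggested by the 'expert squeeze' grid."""
    if rules_based and not wide_range_of_outcomes:
        return "computers: cheaper and more reliable once the rules are clear"
    if not rules_based and wide_range_of_outcomes:
        return "collectives: crowds beat experts on probabilistic, wide-open problems"
    return "experts: good at eliminating bad choices and making creative connections"

print(best_solver(rules_based=True, wide_range_of_outcomes=False))   # e.g., credit scoring
print(best_solver(rules_based=False, wide_range_of_outcomes=True))   # e.g., economic forecasts
```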

Once you've classified the problem, you can turn to the best method for solving it.

… computers and collectives remain underutilized guides for decision making across a host of realms including medicine, business, and sports. That said, experts remain vital in three capacities. First, experts must create the very systems that replace them. … Of course, the experts must stay on top of these systems, improving the market or equation as need be.

Next, we need experts for strategy. I mean strategy broadly, including not only day-to-day tactics but also the ability to troubleshoot by recognizing interconnections as well as the creative process of innovation, which involves combining ideas in novel ways. Decisions about how best to challenge a competitor, which rules to enforce, or how to recombine existing building blocks to create novel products or experiences are jobs for experts.

Finally, we need people to deal with people. A lot of decision making involves psychology as much as it does statistics. A leader must understand others, make good decisions, and encourage others to buy in to the decision.

So what can you do to make the expert squeeze work for you instead of against you? Here Mauboussin offers three tips.

1. Match the problem you face with the most appropriate solution.

What we know is that experts do a poor job in many settings, suggesting that you should try to supplement expert views with other approaches.

2. Seek diversity.

(Philip) Tetlock’s work shows that while expert predictions are poor overall, some are better than others. What distinguishes predictive ability is not who the experts are or what they believe, but rather how they think. Borrowing from Archilochus— through Isaiah Berlin— Tetlock sorted experts into hedgehogs and foxes. Hedgehogs know one big thing and try to explain everything through that lens. Foxes tend to know a little about a lot of things and are not married to a single explanation for complex problems. Tetlock finds that foxes are better predictors than hedgehogs. Foxes arrive at their decisions by stitching “together diverse sources of information,” lending credence to the importance of diversity. Naturally, hedgehogs are periodically right— and often spectacularly so— but do not predict as well as foxes over time. For many important decisions, diversity is the key at both the individual and collective levels.
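The arithmetic behind “diversity is key” is worth seeing once. The sketch below illustrates Scott Page’s diversity prediction theorem with made-up numbers (it is my example, not Tetlock’s data): the crowd’s squared error equals the average individual error minus the diversity of the estimates, so a herd that is wrong in the same direction loses the whole benefit.

```python
from statistics import mean

def decompose(estimates: list[float], truth: float) -> tuple[float, float, float]:
    crowd = mean(estimates)
    crowd_error = (crowd - truth) ** 2
    avg_individual_error = mean((e - truth) ** 2 for e in estimates)
    diversity = mean((e - crowd) ** 2 for e in estimates)
    # Diversity prediction theorem: crowd_error == avg_individual_error - diversity
    return crowd_error, avg_individual_error, diversity

truth = 80
foxes = [40, 60, 75, 95, 120]          # wrong in different directions
hedgehogs = [99, 100, 101, 100, 100]   # confidently wrong together

for label, estimates in [("foxes", foxes), ("hedgehogs", hedgehogs)]:
    crowd_err, avg_err, div = decompose(estimates, truth)
    print(f"{label}: crowd error {crowd_err:.0f} = avg {avg_err:.0f} - diversity {div:.0f}")
```

The foxes are individually far off yet collectively close; the hedgehogs are individually “close” yet collectively stuck, which is the squeeze in miniature.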

3. Use technology to sidestep the squeeze when possible.

Flooded with candidates and aware of the futility of most interviews, Google decided to create algorithms to identify attractive potential employees. First, the company asked seasoned employees to fill out a three-hundred-question survey, capturing details about their tenure, their behavior, and their personality. The company then compared the survey results to measures of employee performance, seeking connections. Among other findings, Google executives recognized that academic accomplishments did not always correlate with on-the-job performance. This novel approach enabled Google to sidestep problems with ineffective interviews and to start addressing the discrepancy.

Learning the difference between when experts help or hurt can go a long way toward avoiding stupidity. This starts with identifying the type of problem you're facing and then considering the various approaches to solve the problem with pros and cons.

Still curious? Follow up by reading Generalists vs. Specialists, Think Twice: Harnessing the Power of Counterintuition, and reviewing the work of Philip Tetlock on why how you think matters more than what you think.

Countering the Inside View and Making Better Decisions


You can reduce the number of mistakes you make by thinking about problems more clearly.

In his book Think Twice: Harnessing the Power of Counterintuition, Michael Mauboussin discusses how we can “fall victim to simplified mental routines that prevent us from coping with the complex realities inherent in important judgment calls.” One of those routines is the inside view, which we're going to talk about in this article. But first, let's get a bit of context.

No one wakes up thinking, “I am going to make bad decisions today.” Yet we all make them. What is particularly surprising is some of the biggest mistakes are made by people who are, by objective standards, very intelligent. Smart people make big, dumb, and consequential mistakes.

[…]

Mental flexibility, introspection, and the ability to properly calibrate evidence are at the core of rational thinking and are largely absent on IQ tests. Smart people make poor decisions because they have the same factory settings on their mental software as the rest of us, and that software isn’t designed to cope with many of today’s problems.

We don't spend enough time thinking about our decisions and learning from the process. Generally, we're pretty indifferent to the process by which we make decisions.

… typical decision makers allocate only 25 percent of their time to thinking about the problem properly and learning from experience. Most spend their time gathering information, which feels like progress and appears diligent to superiors. But information without context is falsely empowering.

That reminds me of what Daniel Kahneman wrote in Thinking, Fast and Slow:

A remarkable aspect of your mental life is that you are rarely stumped … The normal state of your mind is that you have intuitive feelings and opinions about almost everything that comes your way. You like or dislike people long before you know much about them; you trust or distrust strangers without knowing why; you feel that an enterprise is bound to succeed without analyzing it.

So we're not really gathering information so much as seeking support for our existing intuition, which is the very thing a good decision process should help root out.

***
Ego Induced Blindness

One prevalent error we make is that we tend to favour the inside view over the outside view.

An inside view considers a problem by focusing on the specific task and by using information that is close at hand, and makes predictions based on that narrow and unique set of inputs. These inputs may include anecdotal evidence and fallacious perceptions. This is the approach that most people use in building models of the future and is indeed common for all forms of planning.

[…]

The outside view asks if there are similar situations that can provide a statistical basis for making a decision. Rather than seeing a problem as unique, the outside view wants to know if others have faced comparable problems and, if so, what happened. The outside view is an unnatural way to think, precisely because it forces people to set aside all the cherished information they have gathered.

When your inside view is more optimistic than the outside view, you are effectively making an argument against the base rate. You're saying (knowingly or, more likely, unknowingly) that this time is different. Our brains are all too happy to help us construct this argument.

Mauboussin argues that we embrace the inside view for a few primary reasons. First, we're optimistic by nature. Second, there's the “illusion of optimism” (we see our future as brighter than that of others). Finally, there's the illusion of control (we think that chance events are subject to our control).

One interesting point is that while we're bad at looking at the outside view when it comes to ourselves, we're better at it when it comes to other people.

In fact, the planning fallacy embodies a broader principle. When people are forced to look at similar situations and see the frequency of success, they tend to predict more accurately. If you want to know how something is going to turn out for you, look at how it turned out for others in the same situation. Daniel Gilbert, a psychologist at Harvard University, ponders why people don’t rely more on the outside view, “Given the impressive power of this simple technique, we should expect people to go out of their way to use it. But they don’t.” The reason is most people think of themselves as different, and better, than those around them.

So it's mostly ego: I'm better than the people who tackled this problem before me. We see the differences between situations and use them to rationalize why things are different this time.

Consider this:

We incorrectly think that differences are more valuable than similarities.

After all, anyone can see what’s the same but it takes true insight to see what’s different, right? We’re all so busy trying to find differences that we forget to pay attention to what is the same.

***
How to Incorporate the Outside View into your Decisions

In Think Twice, Mauboussin distills the work of Kahneman and Tversky into four steps and adds some commentary.

1. Select a Reference Class

Find a group of situations, or a reference class, that is broad enough to be statistically significant but narrow enough to be useful in analyzing the decision that you face. The task is generally as much art as science, and is certainly trickier for problems that few people have dealt with before. But for decisions that are common—even if they are not common for you— identifying a reference class is straightforward. Mind the details. Take the example of mergers and acquisitions. We know that the shareholders of acquiring companies lose money in most mergers and acquisitions. But a closer look at the data reveals that the market responds more favorably to cash deals and those done at small premiums than to deals financed with stock at large premiums. So companies can improve their chances of making money from an acquisition by knowing what deals tend to succeed.

2. Assess the distribution of outcomes.

Once you have a reference class, take a close look at the rate of success and failure. … Study the distribution and note the average outcome, the most common outcome, and extreme successes or failures.

[…]

Two other issues are worth mentioning. The statistical rate of success and failure must be reasonably stable over time for a reference class to be valid. If the properties of the system change, drawing inference from past data can be misleading. This is an important issue in personal finance, where advisers make asset allocation recommendations for their clients based on historical statistics. Because the statistical properties of markets shift over time, an investor can end up with the wrong mix of assets.

Also keep an eye out for systems where small perturbations can lead to large-scale change. Since cause and effect are difficult to pin down in these systems, drawing on past experiences is more difficult. Businesses driven by hit products, like movies or books, are good examples. Producers and publishers have a notoriously difficult time anticipating results, because success and failure is based largely on social influence, an inherently unpredictable phenomenon.

3. Make a prediction.

With the data from your reference class in hand, including an awareness of the distribution of outcomes, you are in a position to make a forecast. The idea is to estimate your chances of success and failure. For all the reasons that I’ve discussed, the chances are good that your prediction will be too optimistic.

Sometimes when you find the right reference class, you see that the success rate is not very high. So to improve your chances of success, you have to do something different from everyone else.

4. Assess the reliability of your prediction and fine-tune.

How good we are at making decisions depends a great deal on what we are trying to predict. Weather forecasters, for instance, do a pretty good job of predicting what the temperature will be tomorrow. Book publishers, on the other hand, are poor at picking winners, with the exception of those books from a handful of best-selling authors. The worse the record of successful prediction is, the more you should adjust your prediction toward the mean (or other relevant statistical measure). When cause and effect is clear, you can have more confidence in your forecast.
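The adjustment in step 4 can be read as a weighted average, in the spirit of Kahneman and Tversky's correction for intuitive predictions. This sketch is an illustration with made-up numbers, not a formula from the book:

```python
def outside_view_forecast(inside_estimate: float,
                          reference_class_mean: float,
                          predictability: float) -> float:
    """Shrink an inside-view estimate toward the reference-class mean.

    predictability runs from 0.0 (forecasts in this domain have a poor
    record, so lean on the base rate) to 1.0 (cause and effect are
    clear, so trust the specific estimate).
    """
    return predictability * inside_estimate + (1 - predictability) * reference_class_mean

# You expect your project to take 6 months; comparable projects
# averaged 12; forecasting in this domain has a weak track record.
print(outside_view_forecast(6, 12, predictability=0.3))  # 10.2 months
```

The worse the forecasting record, the smaller the weight on your own estimate, which is exactly the direction Mauboussin recommends.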

***

The main lesson we can take from this is that we tend to focus on what's different whereas the best decisions often focus on just the opposite: what's the same. While this situation seems a little different, it's almost always the same.

As Charlie Munger has said: “if you notice, the plots are very similar. The same plot comes back time after time.”

Particulars may vary, but unless those particulars are the variables that govern the outcome of the situation, the pattern remains. If we're going to focus on what's different rather than what's the same, we'd best be sure the variables we're clinging to actually matter.