Changing How We Think


What kind of thinking leads to better outcomes? That’s the question that Roger Martin addresses in his wonderful book Diaminds: Decoding the Mental Habits of Successful Thinkers.

The world is awash in complexity. Nearly every decision we make is uncertain. There is no one way to look at uncertainty. There are as many ways of seeing, experiencing and representing problems as there are people. Each person, in turn, brings their own mental models.

Successful thinking integrates several radically different models while preserving the thinker’s ability to act decisively. The successful thinker is an integrator who can quickly and effectively abstract the best qualities of radically different ways of seeing and representing; in doing so, that person develops ‘a better lens’ on the bewildering phenomenon we call the ‘world.’

Integrators attempt to hold two, often contradictory, ways of seeing the world. Rather than fearing the ensuing tension they embrace it.

This is reminiscent of F. Scott Fitzgerald, who said:

The test of a first-rate intelligence is the ability to hold two opposing ideas in mind at the same time and still retain the ability to function. One should, for example, be able to see that things are hopeless yet be determined to make them otherwise.

But that ability is not out of reach for the layperson. It’s also not new. Thomas C. Chamberlin, President of the University of Wisconsin from 1887 to 1892, proposed the idea of “multiple working hypotheses.” In an article published in Science, he wrote:

In following a single hypothesis, the mind is presumably led to a single explanatory conception. But an adequate explanation often involves the co-ordination of several agencies, which enter into the combined result in varying proportions. The true explanation is therefore necessarily complex. Such complex explanations of phenomena are specially encouraged by the method of multiple hypotheses, and constitute one of its chief merits.

Martin believes that “thinkers who exploit opposing ideas to construct a new solution enjoy a built-in advantage over thinkers who can consider only one model at a time.”

This is not either/or thinking; it’s using a broad-based education to push past the limits of binary thinking and into new ways of combining things. Another of Martin’s books, The Opposable Mind: Winning Through Integrative Thinking, defines integrative thinking as:

The ability to face constructively the tension of opposing ideas and, instead of choosing one at the expense of the other, generate a creative resolution of the tension in the form of a new idea that contains elements of the opposing ideas but is superior to each.

Martin argues that this type of thinking is both identifiable and learnable.

Thinking is Habitual

Much of our thinking is made up of habits or, as Martin calls them, “repetitive and recurrent units of mental behaviour that occur on very short time scales.” Most of our mental models and internal stories work by matching patterns — they close off ideas as soon as one fits, without attempting to falsify it. In an evolutionary context this makes a lot of sense. If you see a lion, you run. This is fairly closed. You don’t ask a lot of questions or attempt to discern whether the lion is friendly. Our minds are optimized to think and act quickly. This is System 1 thinking at its evolutionary finest, and it’s fairly inexpensive as far as mental habits go.

Think about how we respond when asked a question of the sort “Why did you do X?” Our inclination is to respond with a “because” answer, which rarely includes the real cause. If you made a decision, you would offer a reason (“this was the best decision given the information”) and not the cause (“it let me avoid someone I’ve been trying to avoid”).

Habits are not all burdens. Often, as Martin points out, they “make thinking bearably simple.” But we need a way to describe them if we are to understand them. In Diaminds: Decoding the Mental Habits of Successful Thinkers, Martin writes:

Mental habits work in conjunction with one another to make up patterns of thinking, which are analogous to patterns of behaviour in that they are reliably reproducible and yield predictable results in similar circumstances. It is a commonplace that ‘marketing people’ and ‘engineering people’ process information in different ways. But in what ways are they different? And what difference do their differences make?

This is where the language of cognitive science comes in handy as a language for describing patterns of thought. Thus one finds that ‘marketing people’ pay attention to a lot of information – they are informationally broad in their thinking patterns. They are constantly foraging the world for new bits of information and comparing that new information with parts of their existing database – which they keep around in working memory – in order to arrive at action prescriptions or decision rules. But they spend less time than engineers thinking about each piece of information and about how the various pieces fit together. In other words, compared to engineers, marketers are logically shallower in their thinking approach.

By contrast, one finds that ‘engineering people’ are informationally narrower but logically deeper in their thinking styles. They seek out far less information than do their marketing counterparts; then, having gathered it, they strive for logical consistency among the various pieces of information they deem relevant. For instance, they look for ways in which what they believe connects to what they know. They look for the logical implications of what they already know or believe in order to decide what new beliefs to hold out for testing. They look for connections of the logical and causal type among facts and quasi-facts, rather than just associations and correlations.

Aware of this difference, we can ask: In what circumstances should logical depth dominate informational breadth and, vice versa? In what situations is more thinking better than more foraging or more asking, given that one can only think (or forage) more if one forages (or thinks) less?

The point is to get marketing people and engineering people to think together to create better thinkers. We want to combine the broad thinking the marketers bring to bear on the problem with the logical depth the engineers bring to come up with better solutions.

Mental Habits: A Deep(er) Dive

Martin defines a mental habit as “a pattern of thought that is so entrenched and feels so natural that it has become unconscious and therefore goes unnoticed.” Elaborating on this, he writes:

What we look for when looking for a mental habit is a consistently recurring way in which these explanations are produced, a guiding rule or principle that remains unchanged no matter what the specific explanation.

These thoughts are unconscious and feel natural.

Consider the ‘place responsibility for negative effects elsewhere’ habit.

You arrive late to a meeting. While the reasons vary from situation to situation, you likely never tell the truth but rather blame something else. Traffic was bad. My alarm clock didn’t go off. You almost never hear anyone say, ‘I was late because I forgot to check the room for the meeting.’

This, in Martin’s words, is the difference between responsible and defensive behaviours.

Systematically producing responsible (rather than defensible) causes for one’s behaviour – even when it would be easy to produce irresponsible ones – is likely to feel effortful and to require an exercise of the will.

Consider the mental habit of ‘certainty entails truth’, also known as over-influence from precision.

A CEO asks his CFO if he is sure about the data in the report. If the CFO responds with a no, the CEO thinks he is incompetent. Or maybe the CEO asks the lead on an important project what the odds are they can hit the deadline. The person responds with 80% and the CEO’s next question is “how can we hit 100%?”

As unconscious and intuitive habits, these are often hidden from us and often come from our desire to feel good. Over time, however, these habits become our reality. We believe in the validity of these explanations and this affects how we think about ourselves and the world around us, and how we approach problems. The words we speak become the way we think.

Eminem had it wrong when he said, “I am whatever you say I am.” He should have said, “I am whatever I say I am.”

Tinker With Your Thinker

Designing and controlling our mental habits is possible because thinking, when done deliberately, is a sequence of words and sentences. The Panopticon Effect impacts how we design our minds.

Thinking – especially thinking in words and sentences – is a form of internal communication. In thinking, you-in-the-present communicates with you-in-the-future. But though thinking is a private and covert activity, it is influenced by external interactions – in particular, by how you communicate with others. Communicative patterns become mental habits. The implication is that counterproductive – closed, oblivious, disconnected, narrow, hermetic, rigid – ways of communicating are thereby internalized and become counterproductive ways of thinking.

The key to changing how we think, then, is to switch from intuitive to deliberate thought, observe our patterns of communication, and then change the way in which we communicate. As Heinz von Foerster put it, ‘If you want to think differently, first learn to act differently.’

Communicating differently with others and yourself is the key to changing your mind.

What does this look like in practice?

… how one thinks is an internalized version of how one communicates – indeed, it sheds light on how one communicates. The systematic placing of responsibility elsewhere for ills and mishaps is a locally effective social strategy – one that is often rewarded by nods of understanding and (potentially false, but who looks that closely?) expressions of sympathy. It also helps one avoid being placed on the spot by difficult ‘why?’ questions. It feels good to get understanding from others and to avoid such difficult moments. In time, this way of communicating becomes a script.

There is one problem: it is difficult to produce the message convincingly without at least half-believing it. Most humans are reasonably good at identifying liars and dissimulators, even if they are not professionally trained to do so. … But a solution is at hand: make a habit of the way you communicate part of the very fabric of thinking. ‘Place responsibility for ills elsewhere’ thus becomes a mental habit, not just a social and communicative habit.

In the end, changing our patterns of thinking becomes about changing the language we use for internal and external communication. We need to move to ‘I was late for the meeting because I forgot to check the room number’ instead of ‘I was late because of something outside my control.’ By addressing problems honestly (especially when they are ambiguous) we change how we think.

Our ‘mind design principle’ for new and more successful mental habits is thus a simple one: because thinking is self-talk, talk and thought are linked. To change patterns of thinking, change patterns of talking.

Diaminds: Decoding the Mental Habits of Successful Thinkers is a fascinating exploration of how we think and offers a new way to improve our ability to think.


Charlie Munger on the Value of Thinking Backward and Forward

One of the five simple notions to solve problems is the concept of inversion. To solve problems we need to look at them both forward and backward.

But how does this look in practice? Let me give you an example that Charlie Munger gave during a speech.

Munger liked to give his family little puzzles. One of them went like this:

There’s an activity in America, with one-on-one contests, and a national championship. The same person won the championship on two occasions about 65 years apart.

“Now,” I said, “name the activity.”

Any ideas? How would you answer this?

“In my family,” Munger said, “not a lot of light bulbs were flashing.” Except for one.

But I have a physicist son who has been trained more in the type of thinking I like. And he immediately got the right answer, and here’s the way he reasoned:

It can’t be anything requiring a lot of hand-eye coordination. Nobody 85 years of age is going to win a national billiards tournament, much less a national tennis tournament. It just can’t be. Then he figured it couldn’t be chess, which this physicist plays very well, because it’s too hard. The complexity of the system, the stamina required are too great. But that led into checkers. And he thought, “Ah ha! There’s a game where vast experience might guide you to be the best even though you’re 85 years of age.”

And sure enough that was the right answer.

Flipping one’s thinking both forward and backward is a powerful sort of mental trickery that will help improve your thinking.

Ruth Chang: How to Make Hard Choices

“A world full of only easy choices would enslave us to reasons.”

Ruth Chang is a philosopher at Rutgers University with an interesting background. After graduating with a J.D. from Harvard Law School and dipping her toe into the legal world, she went off to Oxford University to study philosophy. Her work focuses on how we make the decisions that shape our lives.

In her recent TED talk (video below), she talks about how we make hard choices and in the process offers a framework for making decisions consistent with who we truly are.

What makes a hard choice hard is the way alternatives relate.

In any easy choice, one alternative is better than the other. In a hard choice, one alternative is better in some ways, the other alternative is better in other ways, and neither is better than the other overall. You agonize over whether to stay in your current job in the city or uproot your life for more challenging work in the country because staying is better in some ways, moving is better in others, and neither is better than the other overall. We shouldn’t think that all hard choices are big. Let’s say you’re deciding what to have for breakfast. You could have high fiber bran cereal or a chocolate donut. Suppose what matters in the choice is tastiness and healthfulness. The cereal is better for you, the donut tastes way better, but neither is better than the other overall, a hard choice. Realizing that small choices can also be hard may make big hard choices seem less intractable. After all, we manage to figure out what to have for breakfast, so maybe we can figure out whether to stay in the city or uproot for the new job in the country.

In hard choices we tend to prefer the safest option.

… I can tell you that fear of the unknown, while a common motivational default in dealing with hard choices, rests on a misconception of them. It’s a mistake to think that in hard choices, one alternative really is better than the other, but we’re too stupid to know which, and since we don’t know which, we might as well take the least risky option. Even taking two alternatives side by side with full information, a choice can still be hard. Hard choices are hard not because of us or our ignorance; they’re hard because there is no best option.

Now, if there’s no best option, if the scales don’t tip in favor of one alternative over another, then surely the alternatives must be equally good, so maybe the right thing to say in hard choices is that they’re between equally good options. That can’t be right. If alternatives are equally good, you should just flip a coin between them, and it seems a mistake to think, here’s how you should decide between careers, places to live, people to marry: Flip a coin. There’s another reason for thinking that hard choices aren’t choices between equally good options.

Our search for physics-like exactitude and our desire to quantify everything lead us astray.

We unwittingly assume that values like justice, beauty, kindness, are akin to scientific quantities, like length, mass and weight. Take any comparative question not involving value, such as which of two suitcases is heavier? There are only three possibilities. The weight of one is greater, lesser or equal to the weight of the other. Properties like weight can be represented by real numbers — one, two, three and so on — and there are only three possible comparisons between any two real numbers. One number is greater, lesser, or equal to the other. Not so with values. As post-Enlightenment creatures, we tend to assume that scientific thinking holds the key to everything of importance in our world, but the world of value is different from the world of science. The stuff of the one world can be quantified by real numbers. The stuff of the other world can’t.

Another way to see things is that they are in the same ballpark. This is what happens in hard choices: the alternatives are “on a par.”

When alternatives are on a par, it may matter very much which you choose, but one alternative isn’t better than the other. Rather, the alternatives are in the same neighborhood of value, in the same league of value, while at the same time being very different in kind of value. That’s why the choice is hard.
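Chang’s taxonomy of comparisons can be sketched in code. Here is a minimal model (the dimensions and scores below are invented for illustration): score each alternative on its relevant value dimensions, and a fourth verdict appears alongside “better,” “worse,” and “equally good.”

```python
def compare(a, b):
    """Compare two alternatives scored on the same value dimensions.

    a, b: dicts mapping a value dimension (e.g. 'tastiness') to a score.
    Returns 'better', 'worse', 'equally good', or 'on a par'.
    """
    assert a.keys() == b.keys(), "alternatives must share dimensions"
    a_wins = any(a[k] > b[k] for k in a)  # a beats b on some dimension
    b_wins = any(b[k] > a[k] for k in a)  # b beats a on some dimension
    if a_wins and not b_wins:
        return "better"
    if b_wins and not a_wins:
        return "worse"
    if not a_wins and not b_wins:
        return "equally good"
    return "on a par"  # each is better in some way: a hard choice

cereal = {"tastiness": 3, "healthfulness": 9}
donut = {"tastiness": 9, "healthfulness": 2}
print(compare(cereal, donut))  # on a par
```

The point of the sketch is that value comparisons are not trichotomous the way real-number comparisons are: a mixed result is not a tie, and no amount of extra precision in the scores resolves it.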

From the Independent on Sunday, Feb 19, 1995

We create reasons.

Understanding hard choices in this way uncovers something about ourselves we didn’t know. Each of us has the power to create reasons. Imagine a world in which every choice you face is an easy choice, that is, there’s always a best alternative. If there’s a best alternative, then that’s the one you should choose, because part of being rational is doing the better thing rather than the worse thing, choosing what you have most reason to choose. … A world full of only easy choices would enslave us to reasons. … (However) when alternatives are on a par, the reasons given to us, the ones that determine whether we’re making a mistake, are silent as to what to do. It’s here, in the space of hard choices, that we get to exercise our normative power, the power to create reasons for yourself …

When we choose between options that are on a par, we can do something really rather remarkable. We can put our very selves behind an option. … This response in hard choices is a rational response, but it’s not dictated by reasons given to us. Rather, it’s supported by reasons created by us. When we create reasons for ourselves to become this kind of person rather than that, we wholeheartedly become the people that we are. You might say that we become the authors of our own lives.

When you face hard choices you need to look inside yourself.

… Instead of looking for reasons out there, we should be looking for reasons in here: Who am I to be? You might decide to be a pink sock-wearing, cereal-loving, country-living banker, and I might decide to be a black sock-wearing, urban, donut-loving artist. What we do in hard choices is very much up to each of us.

If you don’t exercise your normative powers you become a drifter.

Drifters allow the world to write the story of their lives. They let mechanisms of reward and punishment — pats on the head, fear, the easiness of an option — determine what they do. So the lesson of hard choices: reflect on what you can put your agency behind, on what you can be for, and, through hard choices, become that person.

Hard choices are part of what makes us human.

Far from being sources of agony and dread, hard choices are precious opportunities for us to celebrate what is special about the human condition, that the reasons that govern our choices as correct or incorrect sometimes run out, and it is here, in the space of hard choices, that we have the power to create reasons for ourselves to become the distinctive people that we are. And that’s why hard choices are not a curse but a godsend.

Here is Ruth’s full TED talk:

5 Simple Notions that Help Solve Problems

Here are five simple notions, found in Damn Right!: Behind the Scenes with Berkshire Hathaway Billionaire Charlie Munger, that Charlie Munger, the billionaire business partner of Warren Buffett, finds helpful in solving problems.

***
1. Simplify

My first helpful notion is that it is usually best to simplify problems by deciding big “no-brainer” questions first.

***
2. Numerical Fluency

The second helpful notion mimics Galileo’s conclusion that scientific reality is often revealed only by math, as if math was the language of God. Galileo’s attitude also works well in messy practical life. Without numerical fluency, in the part of life most of us inhabit, you are like a one-legged man in an ass-kicking contest.

***
3. Invert

Inverting the problem won’t always solve it, but it will help you avoid trouble. Call it the avoiding stupidity filter.

The third helpful notion is that it is not enough to think problems through forward. You must also think in reverse, much like the rustic who wanted to know where he was going to die so that he’d never go there. Indeed, many problems can’t be solved forward. And that is why the great algebraist, Carl Jacobi, so often said: “invert, always invert.” And why Pythagoras thought in reverse to prove that the square root of two was an irrational number.
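The Pythagorean proof Munger mentions is the classic instance of thinking in reverse: rather than showing directly that the square root of two is irrational, assume the opposite and follow the assumption to a contradiction. A standard reconstruction of the argument:

```latex
\text{Assume } \sqrt{2} = \frac{p}{q}, \text{ where } p, q \text{ are integers with no common factor.}
\]
\[
\sqrt{2} = \frac{p}{q} \;\Rightarrow\; p^2 = 2q^2 \;\Rightarrow\; p^2 \text{ is even} \;\Rightarrow\; p = 2r
\]
\[
p = 2r \;\Rightarrow\; 4r^2 = 2q^2 \;\Rightarrow\; q^2 = 2r^2 \;\Rightarrow\; q \text{ is even.}
\]
\[
\text{Both } p \text{ and } q \text{ even contradicts "no common factor," so } \sqrt{2} \text{ is irrational.}
```

The forward question ("what is √2?") has no finite answer; inverting it ("what would follow if √2 were a fraction?") settles the matter in three lines.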

***
4. Study The Basics

The basics are something that keeps coming up. The first of the five elements of effective thinking is to understand deeply.

Munger also believes in the basics:

The fourth helpful notion is that the best and most practical wisdom is elementary academic wisdom. But there is one extremely important qualification: you must think in a multidisciplinary manner. You must routinely use all the easy-to-learn concepts from the freshman course in every basic subject. Where elementary ideas will serve, your problem solving must not be limited, as academia and many business bureaucracies are limited, by extreme balkanization into disciplines and subdisciplines, with strong taboos against any venture outside assigned territory. …

If, in your thinking, you rely on others, often through purchase of professional advice, whenever outside a small territory of your own, you will suffer much calamity.

This happens in part because professional advisors are often undone not by conscious malfeasance but by subconscious bias.

His cognition will often be impaired, for your purposes, by financial incentives different from yours. And he will also suffer from the psychological defect caught by the proverb: to a man with a hammer, every problem looks like a nail.

***
5. Lollapalooza Effects

And you need to combine really big things.

The fifth helpful notion is that really big effects, lollapalooza effects, will often come only from large combinations of factors. For instance, tuberculosis was tamed, at least for a long time, only by routine combined use in each case of three different drugs. And other lollapalooza effects, like the flight of an airplane, follow a similar pattern.

***
Still Curious?

See how Munger applies these in this essay. Learn more about the wit and wisdom of Charlie Munger by picking up a copy of Poor Charlie’s Almanack and Damn Right!: Behind the Scenes with Berkshire Hathaway billionaire Charlie Munger.

Winning An Argument

We spend a lot of our lives trying to persuade others.

This is one of the reasons that Daniel Pink says that we’re all in sales.

Some of you, no doubt, are selling in the literal sense— convincing existing customers and fresh prospects to buy casualty insurance or consulting services or homemade pies at a farmers’ market. But all of you are likely spending more time than you realize selling in a broader sense—pitching colleagues, persuading funders, cajoling kids. Like it or not, we’re all in sales now.

There are many ways to change minds. We often try to convince people. In the difference between persuading and convincing, Seth Godin writes:

Marketers don’t convince. Engineers convince. Marketers persuade. Persuasion appeals to the emotions and to fear and to the imagination. Convincing requires a spreadsheet or some other rational device.

It’s much easier to persuade someone if they’re already convinced, if they already know the facts. But it’s impossible to change someone’s mind merely by convincing them of your point.

But what do we do when this doesn’t work?

Kathryn Schulz, in her book Being Wrong: Adventures in the Margin of Error, explains:

… The first thing we usually do when someone disagrees with us is that we just assume they are ignorant. You know, they don’t have access to the same information we do and when we generously share that information with them, they are going to see the light and come on over to our team.

When that doesn’t work — when it turns out those people have all the same information and they still don’t agree with us — we move on to a second assumption. They’re idiots …

This is what we normally do. We try to convince them that we’re right and they are wrong. (Most people, however, are not idiots.)

In many cases this is just us being overconfident about what we think we know — the illusion of explanatory depth. We really believe that we understand how something works when we don’t. In a small study about a decade ago, Yale professors Leonid Rozenblit and Frank Keil asked students to explain how simple things work: a flush toilet, a sewing machine, piano keys, a zipper, a cylinder lock. When their knowledge was put to the test, their familiarity with these things turned out to have produced an unwarranted overconfidence about how they worked.

Most of the time people don’t put us to the test. When they do, the results don’t match our confidence. (Interestingly, one of the best ways to really learn how something works is to flip this around. It’s called the Feynman Technique.)

***
The Era of Fake Knowledge

It’s never been easier to fake what you know: to yourself and others.

It’s about energy conservation. Why put in the effort to learn something if we can get by most of the time without learning it?

Unable to discern between what we know and what we pretend to know, we ultimately become victims of our own laziness and intellectual dishonesty.

We end up fooling ourselves.

In a lecture at the Galileo Symposium in Italy in 1964, the future Nobel laureate Richard Feynman said, “The first principle is that you must not fool yourself, and you are the easiest person to fool.”

***
How to Win an Argument

Research published last year and brought to my attention by Mind Hacks shows how this effect might help you convince people they are wrong.

Mind Hacks summarizes the work:

One group was asked to give their opinion and then provide reasons for why they held that view. This group got the opportunity to put their side of the issue, in the same way anyone in an argument or debate has a chance to argue their case.

Those in the second group did something subtly different. Rather than provide reasons, they were asked to explain how the policy they were advocating would work. They were asked to trace, step by step, from start to finish, the causal path from the policy to the effects it was supposed to have.

The results were clear. People who provided reasons remained as convinced of their positions as they had been before the experiment. Those who were asked to provide explanations softened their views, and reported a correspondingly larger drop in how they rated their understanding of the issues.

***

This simple technique is one to add to our tool belt.

If you want to win an argument, ask the person trying to convince you of something to explain how it would work. Odds are they have not done the work required to hold an opinion. If they can explain why they are correct and how things would work, you’ll learn something. If they can’t, you’ll soften their views, perhaps nudging them ever so gently toward yours.

It is worth bearing in mind, however, that someone might do the same to you.

Miracles Happen — The Simple Heuristic That Saved 150 Lives

“In an uncertain world, statistical thinking and risk communication alone are not sufficient. Good rules of thumb are essential for good decisions.”

Three minutes after taking off from LaGuardia Airport in New York City, US Airways Flight 1549 ran into a flock of Canada geese. At 2,800 feet, passengers and crew heard loud bangs as the geese collided with the engines, rendering both of them inoperable.

Gerd Gigerenzer picks up the story in his book Risk Savvy: How to Make Good Decisions:

When it dawned on the passengers that they were gliding toward the ground, it grew quiet on the plane. No panic, only silent prayer. Captain Chesley Sullenberger called air traffic control: “Hit birds. We’ve lost thrust in both engines. We’re turning back towards LaGuardia.”

But landing short of the airport would have catastrophic consequences for passengers, crew, and the people living below. The captain and the copilot had to make a good judgment. Could the plane actually make it to LaGuardia, or would they have to try something more risky, such as a water landing in the Hudson River? One might expect the pilots to have measured speed, wind, altitude, and distance and fed this information into a calculator. Instead, they simply used a rule of thumb:

Fix your gaze on the tower: If the tower rises in your windshield, you won’t make it.

No estimation of the trajectory of the gliding plane is necessary. No time is wasted. And the rule is immune to calculation errors. In the words of copilot Jeffrey Skiles: “It’s not so much a mathematical calculation as visual, in that when you are flying in an airplane, things that—a point that you can’t reach will actually rise in your windshield. A point that you are going to overfly will descend in your windshield.” This time the point they were trying to reach did not descend but rose. They went for the Hudson.

In the cabin, the passengers were not aware of what was going on in the cockpit. All they heard was: “This is the captain: Brace for impact.” Flight attendants shouted: “Heads down! Stay down!” Passengers and crew later recalled that they were trying to grasp what death would be like, and the anguish of their kids, husbands, and wives. Then the impact happened, and the plane stopped. When passengers opened the emergency doors, sunlight streamed in. Everyone got up and rushed toward the openings. Only one passenger headed to the overhead bin to get her carry-on but was immediately stopped. The wings of the floating but slowly sinking plane were packed with people in life jackets hoping to be rescued. Then they saw the ferry coming. Everyone survived.

All this happened within the three minutes between the geese hitting the plane and the ditch in the river. During that time, the pilots began to run through the dual-engine failure checklist, a three-page list designed to be used at thirty thousand feet, not at three thousand feet: turn the ignition on, reset flight control computer, and so on. But they could not finish it. Nor did they have time to even start on the ditching checklist. While the evacuation was underway, Skiles remained in the cockpit and went through the evacuation checklist to safeguard against potential fire hazards and other dangers. Sullenberger went back to check on passengers and left the cabin only after making sure that no one was left behind. It was the combination of teamwork, checklists, and smart rules of thumb that made the miracle possible.
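The tower rule has simple geometry behind it: a point you will fall short of sits above your glide path, so the angle looking down to it keeps shrinking and the point drifts up the windshield. Here is a toy numerical sketch of that geometry; the function name and all numbers are invented for illustration and are not actual flight parameters or procedures.

```python
import math

def will_reach(altitude, distance, sink_rate, airspeed, dt=1.0):
    """Numerical version of the pilots' rule of thumb (toy model).

    Watch the angle looking down from the aircraft to the target point.
    If the angle shrinks as the glide progresses, the point is rising
    in the windshield and the glider will fall short; if it grows,
    the point is descending and the glider will overfly it.
    """
    def depression_angle(t):
        h = altitude - sink_rate * t   # altitude remaining at time t
        d = distance - airspeed * t    # ground distance remaining
        return math.atan2(h, d)
    # a non-shrinking angle means the point descends (or holds): reachable
    return depression_angle(dt) >= depression_angle(0.0)

# From 2,800 ft at a 15:1 glide (feet and feet-per-second throughout),
# a point ~5.7 miles out is makeable; one ~9.5 miles out is not.
print(will_reach(2800, 30000, sink_rate=20, airspeed=300))  # True
print(will_reach(2800, 50000, sink_rate=20, airspeed=300))  # False
```

Note what the rule buys: no trajectory is ever computed, only the sign of a change in one observable angle, which is exactly why it works under time pressure.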

***

Say what? They used a heuristic?

Heuristics enable us to make fast, highly (but not perfectly) accurate decisions without spending too much time searching for information. They allow us to focus on a few pieces of information and ignore the rest.

“Experts,” Gigerenzer writes, “often search for less information than novices do.”

We do the same thing, intuitively, to catch a baseball — the gaze heuristic.

Fix your gaze on an object, and adjust your speed so that the angle of gaze remains constant.

Professionals and amateurs alike rely on this rule.

… If a fly ball comes in high, the player fixates his eyes on the ball, starts running, and adjusts his running speed so that the angle of gaze remains constant. The player does not need to calculate the trajectory of the ball. To select the right parabola, the player’s brain would have to estimate the ball’s initial distance, velocity, and angle, which is not a simple feat. And to make things more complicated, real-life balls do not fly in parabolas. Wind, air resistance, and spin affect their paths. Even the most sophisticated robots or computers today cannot correctly estimate a landing point during the few seconds a ball soars through the air. The gaze heuristic solves this problem by guiding the player toward the landing point, not by calculating it mathematically. That’s why players don’t know exactly where the ball will land, and often run into walls and over the stands in their pursuit.

The gaze heuristic is an example of how the mind can discover simple solutions to very complex problems.
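A minimal simulation makes this concrete. Assuming a drag-free parabola (real balls, as the excerpt notes, do not fly in parabolas), a player who always stands where the gaze angle matches its initial value arrives at the landing point without ever computing a trajectory. The launch velocities and the 45-degree starting gaze are illustrative values, and the sketch ignores limits on the player’s running speed.

```python
import math

def simulate_catch(vx, vy, gaze_deg, g=9.8, dt=0.01):
    """Ball launched from the origin with velocity (vx, vy) m/s.
    At each instant the player stands at the spot where the upward angle
    of gaze to the ball equals the fixed angle gaze_deg -- the gaze
    heuristic.  Returns (final player position, ball landing position)."""
    gaze = math.radians(gaze_deg)
    t, player_x = 0.0, 0.0
    while True:
        t += dt
        ball_x = vx * t
        ball_y = vy * t - 0.5 * g * t * t
        if ball_y <= 0:                      # ball has landed
            return player_x, ball_x
        # Keeping the gaze angle constant pins the player to this spot:
        player_x = ball_x - ball_y / math.tan(gaze)

player_x, landing_x = simulate_catch(vx=20, vy=30, gaze_deg=45)
print(round(player_x, 1), round(landing_x, 1))  # the two positions coincide
```

No trajectory is ever estimated: holding one angle constant steers the player to the catch, which is exactly the “guiding, not calculating” point Gigerenzer makes.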

***

The broader point of Gigerenzer’s book is that while rational thinking works well for risks, you need a combination of rational and heuristic thinking to make decisions under uncertainty.

Certainty Is an Illusion

We all try to avoid uncertainty, even if it means being wrong. We take comfort in certainty and we demand it of others, even when we know it’s impossible.

Gerd Gigerenzer argues in Risk Savvy: How to Make Good Decisions that life would be pretty dull without uncertainty.

If we knew everything about the future with certainty, our lives would be drained of emotion. No surprise and pleasure, no joy or thrill— we knew it all along. The first kiss, the first proposal, the birth of a healthy child would be about as exciting as last year’s weather report. If our world ever turned certain, life would be mind-numbingly dull.

***
The Illusion of Certainty

We demand certainty of others. We ask our bankers, doctors, and political leaders (among others) to give it to us. What they deliver, however, is the illusion of certainty. We feel comfortable with this.

Many of us smile at old-fashioned fortune-tellers. But when the soothsayers work with computer algorithms rather than tarot cards, we take their predictions seriously and are prepared to pay for them. The most astounding part is our collective amnesia: Most of us are still anxious to see stock market predictions even if they have been consistently wrong year after year.

Technology changes how we see things – it amplifies the illusion of certainty.

When an astrologer calculates an expert horoscope for you and foretells that you will develop a serious illness and might even die at age forty-nine, will you tremble when the date approaches? Some 4 percent of Germans would; they believe that an expert horoscope is absolutely certain.

But when technology is involved, the illusion of certainty is amplified. Forty-four percent of people surveyed think that the result of a screening mammogram is certain. In fact, mammograms fail to detect about ten percent of cancers, and the younger the women being tested, the more error-prone the results, because their breasts are denser.

“Not understanding a new technology is one thing,” Gigerenzer writes, “believing that it delivers certainty is another.”

It’s best to remember Ben Franklin: “In this world nothing can be said to be certain, except death and taxes.”

***
The Security Blanket

Where does our need for certainty come from?

People with a high need for certainty are more prone to stereotypes than others and are less inclined to remember information that contradicts their stereotypes. They find ambiguity confusing and have a desire to plan out their lives rationally. First get a degree, a car, and then a career, find the most perfect partner, buy a home, and have beautiful babies. But then the economy breaks down, the job is lost, the partner has an affair with someone else, and one finds oneself packing boxes to move to a cheaper place. In an uncertain world, we cannot plan everything ahead. Here, we can only cross each bridge when we come to it, not beforehand. The very desire to plan and organize everything may be part of the problem, not the solution. There is a Yiddish joke: “Do you know how to make God laugh? Tell him your plans.”

To be sure, illusions have their function. Small children often need security blankets to soothe their fears. Yet for the mature adult, a high need for certainty can be a dangerous thing. It prevents us from learning to face the uncertainty pervading our lives. As hard as we try, we cannot make our lives risk-free the way we make our milk fat-free.

At the same time, a psychological need is not entirely to blame for the illusion of certainty. Manufacturers of certainty play a crucial role in cultivating the illusion. They delude us into thinking that our future is predictable, as long as the right technology is at hand.

***
Risk and Uncertainty

Two magnificently dressed young women sit upright on their chairs, calmly facing each other. Yet neither takes notice of the other. Fortuna, the fickle, wheel-toting goddess of chance, sits blindfolded on the left while human figures desperately climb, cling to, or tumble off the wheel in her hand. Sapientia, the calculating and vain deity of science, gazes into a hand-mirror, lost in admiration of herself. These two allegorical figures depict a long-standing polarity: Fortuna brings good or bad luck, depending on her mood, but science promises certainty.

Fortuna, the wheel-toting goddess of chance (left), facing Sapientia, the divine goddess of science (right).

This sixteenth-century woodcut was carved a century before one of the greatest revolutions in human thinking, the “probabilistic revolution,” colloquially known as the taming of chance. Its domestication began in the mid-seventeenth century. Since then, Fortuna’s opposition to Sapientia has evolved into an intimate relationship, not without attempts to snatch each other’s possessions. Science sought to liberate people from Fortuna’s wheel, to banish belief in fate, and replace chances with causes. Fortuna struck back by undermining science itself with chance and creating the vast empire of probability and statistics. After their struggles, neither remained the same: Fortuna was tamed, and science lost its certainty.

I explain more on the difference between risk and uncertainty here, but this diagram helps simplify things.
Diagram: certainty, risk, and uncertainty.

***
The value of heuristics

The twilight of uncertainty comes in different shades and degrees. Beginning in the seventeenth century, the probabilistic revolution gave humankind the skills of statistical thinking to triumph over Fortuna, but these skills were designed for the palest shade of uncertainty, a world of known risk, in short, risk. I use this term for a world where all alternatives, consequences, and probabilities are known. Lotteries and games of chance are examples. Most of the time, however, we live in a changing world where some of these are unknown: where we face unknown risks, or uncertainty. The world of uncertainty is huge compared to that of risk. … In an uncertain world, it is impossible to determine the optimal course of action by calculating the exact risks. We have to deal with “unknown unknowns.” Surprises happen. Even when calculation does not provide a clear answer, however, we have to make decisions. Thankfully we can do much better than frantically clinging to and tumbling off Fortuna’s wheel. Fortuna and Sapientia had a second brainchild alongside mathematical probability, which is often passed over: rules of thumb, known in scientific language as heuristics.

***
How decisions change based on risk/uncertainty

When making decisions, two sets of mental tools are required:
1. RISK: If risks are known, good decisions require logic and statistical thinking.
2. UNCERTAINTY: If some risks are unknown, good decisions also require intuition and smart rules of thumb.

Most of the time we need a combination of the two.

***

Risk Savvy: How to Make Good Decisions is a great read throughout.

Work in Pulses

We’re not designed to multitask and we’re certainly not designed to work continuously without a break.

We’re designed to pulse, that is, to alternate between expending energy and recovering.

***
Pulses

(via Overwhelmed: Work, Love, and Play When No One Has the Time)

The heart beats. The lungs breathe in and out. The brain makes waves. We wake and sleep. Even digestion is rhythmic.

We’re built the same way, according to Tony Schwartz, author of The Way We’re Working Isn’t Working. Schwartz told Brigid Schulte, author of Overwhelmed: Work, Love, and Play When No One Has the Time, that we’re not built for the modern environment.

***
Ignoring the Obvious

(via Overwhelmed: Work, Love, and Play When No One Has the Time)

[Because the ideal worker is measured in hours] we tend to put in long ones, [Schwartz] said. We ignore the signs of fatigue, boredom, and distraction and just power through. But we’re hardly doing our best work.

“We’ve lost touch,” Schwartz says, “with the value of rest, renewal, recovery, quiet time, and downtime.” The pressure of long hours, in a face time world, combined with the constant bombardment of modern interruptions (think email, phone calls, texts, meetings, etc.) means that increasingly we’re not doing our best thinking at work. Maybe we should heed the advice of some famous philosophers and take a walk.

***
Sleep

We sleep in 90-minute cycles, our brain waves slowing and speeding, only to begin again.

Schwartz’s thinking was influenced by Anders Ericsson, the researcher behind the 10,000-hour rule.

Here is Schulte explaining in Overwhelmed: Work, Love, and Play When No One Has the Time:

Ericsson studied young violinists at the prestigious Academy of Music in Berlin to see what it takes to be the best. Ericsson is widely credited for coming up with the theory that it takes ten thousand hours of deliberate practice in anything to become an expert.

“That led to the assumption that the best way to get things done is to just work more hours,” Schwartz said. But that’s only part of it.

Ericsson’s study found that not only did the best violinists practice more, they also practiced more deliberately: They practiced first thing in the morning, when they were freshest, they practiced intensely without interruption in typically no more than ninety-minute increments for no more than four hours a day.

Most important, the top violinists rested more — napping more during the day and sleeping longer at night. Sleep is actually more important than food. “Great performers,” Schwartz wrote in Be Excellent at Anything, “work more intensely than most of us do but also recover more deeply.”

Three-hour meetings? That’s a recipe for disaster, leading to subpar work and poor decisions, not to mention that meeting marathons drive people to hate work.

***
Attention Deficit Disorder

A lot of adults I know think they suffer from ADD. These are the people who, when they get out of a three-hour meeting, talk on the phone, send an email, and write the grocery list to “make up time.” Well, you can’t really make up time, and working like this is incredibly ineffective. But before we get to that, is all of this multitasking driving us to disorder? Could Attention Deficit Disorder be driven by our always-on environment?

Ed Hallowell believes so.

He’s a psychiatrist with ADD, and he spent years working on practical solutions to help people overloaded by too many demands on their time and energy.

I read his book, CrazyBusy: Overstretched, Overbooked, and About to Snap! Strategies for Handling Your Fast-Paced Life, a few years back. He claims we have “culturally generated ADD.”

Having treated ADD since 1981, I began to see an upsurge in the mid-1990s in the number of people who complained of being chronically inattentive, disorganized, and overbooked. Many came to me wondering if they had ADD. While some did, most did not. Instead, they had what I called a severe case of modern life.

***
Breaks Inspire Creativity

Scientists have found that people who take time to daydream score higher on tests of creativity. And there’s a very good biochemical reason why your best ideas and those flashes of insight tend to come not when you’ve got your nose to the grindstone, oh ideal worker, but in the shower.

In a series of tests using brain imaging and electroencephalography, psychologists John Kounios and Mark Beeman have actually mapped what happens in the brain during the aha! moment, when the brain suddenly makes new connections and imagines, Kounios has said, “new and different ways to transform reality creatively into something better.” When the brain is solving a problem in a deliberate and methodical way, Kounios and Beeman found that the visual cortex, the part of the brain controlling sight, is most active. So the brain is outwardly focused. But just before a moment of insight, the brain suddenly turns inward, what the researchers called a “brain blink.” Alpha waves in the right visual cortex slow, just as when we often close our eyes in thought. Milliseconds before the insight, Kounios and Beeman recorded a burst of gamma activity in the right hemisphere in the area of the brain just above the ear, believed to be linked to our ability to process metaphors.

A positive mood heightens the chances for creative insight, as does taking time to relax, as Archimedes did in his bathtub before his eureka! moment about water displacement and as Einstein did when working out his Theory of Relativity while reportedly tootling around on his bicycle.

***
Working in Pulses

Terry Monaghan, a self-described productivity expert whom we met in Work, Play, Love, encouraged Brigid Schulte to work in pulses. The idea is to chunk your time. This is why one of the most effective changes you can make to your work day is to move your creative work to the start of the day: you give yourself a chunk of time.

Discussing this in Overwhelmed: Work, Love, and Play When No One Has the Time, Schulte writes:

The idea was to chunk my time to minimize the constant multitasking, “role switching,” and toggling back and forth between work and home stuff like a brainless flea on a hot stove. The goal was to create periods of uninterrupted time to concentrate on work— the kind of time I usually found in the middle of the night—during the day. And to be more focused and less distracted with my family.

When it was time to work, I began to shut off e-mail and turn off the phone. When it was time to be with family, I tried to do the same. I began to gather home tasks in a pile and block off one period of time every day to do them. It was easier to stay focused on work knowing I’d given myself a grace period to get to the pressing home stuff later.

The Thirty Minute Pulse
When you find yourself procrastinating, avoiding something, or otherwise stuck in a state of ambivalence, try a timer. Monaghan recommends working for thirty minutes, then taking a break. “Your brain,” she says, “can stay focused on anything, even an unpleasant task, if it knows it will last only thirty minutes.”

I find this useful. I have a 15-minute hourglass sitting on my desk.

***
Putting It All Together

Work in pulses. Chunk your time. Do a daily brain dump to get things off your mind. Keep a notebook with you. If you feel worried or stressed, write it out in your worry journal. Add more of a routine to your day to help avoid decision fatigue. When things are automatic, they don’t consume as much energy.

Don’t wake up and check your email, get to the office and check your email, and then check your email hourly throughout the day. Check your email in batches: late morning and late afternoon.

Most importantly, make time to pause and think about what is most important to you. Narrow your focus and make 80% of your time on the three big things that are important to you. Let everything else fit in the 20% of time left. Let the truly sucky stuff fit in 5% of the time. If leisure is important to you and you can’t find time for it, schedule it in. When you wake up, do one thing that’s important to you right away.