Tag: Checklists

Atul Gawande: The Building Industry’s Strategy for Getting Things Right in Complexity

Checklists establish a higher level of baseline performance.

***

A useful reminder from Atul Gawande, in The Checklist Manifesto:

In a complex environment, experts are up against two main difficulties. The first is the fallibility of human memory and attention, especially when it comes to mundane, routine matters that are easily overlooked under the strain of more pressing events. (When you’ve got a patient throwing up and an upset family member asking you what’s going on, it can be easy to forget that you have not checked her pulse.) Faulty memory and distraction are a particular danger in what engineers call all-or-none processes: whether running to the store to buy ingredients for a cake, preparing an airplane for takeoff, or evaluating a sick person in the hospital, if you miss just one key thing, you might as well not have made the effort at all.

A further difficulty, just as insidious, is that people can lull themselves into skipping steps even when they remember them. In complex processes, after all, certain steps don’t always matter. … “This has never been a problem before,” people say. Until one day it is.

Checklists seem to provide protection against such failures. They remind us of the minimum necessary steps and make them explicit. They not only offer the possibility of verification but also instill a kind of discipline of higher performance.

***

How you employ the checklist is also important. In the face of complexity, most organizations tend to centralize decisions, which reduces the risk of egregious error. But the costs of this approach are high, too. Most employees loathe feeling like they need a hall pass to use the washroom. That's why these next comments were so inspiring.

There is a particularly tantalizing aspect to the building industry’s strategy for getting things right in complex situations: it’s that it gives people power. In response to risk, most authorities tend to centralize power and decision making. That’s usually what checklists are about—dictating instructions to the workers below to ensure they do things the way we want. Indeed, the first building checklist I saw, the construction schedule on the right-hand wall of O’Sullivan’s conference room, was exactly that. It spelled out to the tiniest detail every critical step the tradesmen were expected to follow and when—which is logical if you’re confronted with simple and routine problems; you want the forcing function.

But the list on O’Sullivan’s other wall revealed an entirely different philosophy about power and what should happen to it when you’re confronted with complex, nonroutine problems—such as what to do when a difficult, potentially dangerous, and unanticipated anomaly suddenly appears on the fourteenth floor of a thirty-two-story skyscraper under construction. The philosophy is that you push the power of decision making out to the periphery and away from the center. You give people the room to adapt, based on their experience and expertise. All you ask is that they talk to one another and take responsibility. That is what works.

The strategy is unexpectedly democratic, and it has become standard nowadays, O’Sullivan told me, even in building inspections. The inspectors do not recompute the wind-force calculations or decide whether the joints in a given building should be bolted or welded, he said. Determining whether a structure like Russia Wharf or my hospital’s new wing is built to code and fit for occupancy involves more knowledge and complexity than any one inspector could possibly have. So although inspectors do what they can to oversee a building’s construction, mostly they make certain the builders have the proper checks in place and then have them sign affidavits attesting that they themselves have ensured that the structure is up to code. Inspectors disperse the power and the responsibility.

“It makes sense,” O’Sullivan said. “The inspectors have more troubles with the safety of a two-room addition from a do-it-yourselfer than they do with projects like ours. So that’s where they focus their efforts.” Also, I suspect, at least some authorities have recognized that when they don’t let go of authority they fail.

Miracles Happen — The Simple Heuristic That Saved 150 Lives

“In an uncertain world, statistical thinking and risk communication alone are not sufficient. Good rules of thumb are essential for good decisions.”

Three minutes after taking off from LaGuardia Airport in New York City, US Airways Flight 1549 ran into a flock of Canada geese. At 2,800 feet, passengers and crew heard loud bangs as the geese collided with the engines, rendering both inoperable.

Gerd Gigerenzer picks up the story in his book Risk Savvy: How to Make Good Decisions:

When it dawned on the passengers that they were gliding toward the ground, it grew quiet on the plane. No panic, only silent prayer. Captain Chesley Sullenberger called air traffic control: “Hit birds. We’ve lost thrust in both engines. We’re turning back towards LaGuardia.”

But landing short of the airport would have catastrophic consequences, for passengers, crew, and the people living below. The captain and the copilot had to make a good judgment. Could the plane actually make it to LaGuardia, or would they have to try something more risky, such as a water landing in the Hudson River? One might expect the pilots to have measured speed, wind, altitude, and distance and fed this information into a calculator. Instead, they simply used a rule of thumb:

Fix your gaze on the tower: If the tower rises in your windshield, you won’t make it.

No estimation of the trajectory of the gliding plane is necessary. No time is wasted. And the rule is immune to calculation errors. In the words of copilot Jeffrey Skiles: “It’s not so much a mathematical calculation as visual, in that when you are flying in an airplane, things that— a point that you can’t reach will actually rise in your windshield. A point that you are going to overfly will descend in your windshield.” This time the point they were trying to reach did not descend but rose. They went for the Hudson.

In the cabin, the passengers were not aware of what was going on in the cockpit. All they heard was: “This is the captain: Brace for impact.” Flight attendants shouted: “Heads down! Stay down!” Passengers and crew later recalled that they were trying to grasp what death would be like, and the anguish of their kids, husbands, and wives. Then the impact happened, and the plane stopped. When passengers opened the emergency doors, sunlight streamed in. Everyone got up and rushed toward the openings. Only one passenger headed to the overhead bin to get her carry-on but was immediately stopped. The wings of the floating but slowly sinking plane were packed with people in life jackets hoping to be rescued. Then they saw the ferry coming. Everyone survived.

All this happened within the three minutes between the geese hitting the plane and the ditching in the river. During that time, the pilots began to run through the dual-engine failure checklist, a three-page list designed to be used at thirty thousand feet, not at three thousand feet: turn the ignition on, reset flight control computer, and so on. But they could not finish it. Nor did they have time to even start on the ditching checklist. While the evacuation was underway, Skiles remained in the cockpit and went through the evacuation checklist to safeguard against potential fire hazards and other dangers. Sullenberger went back to check on passengers and left the cabin only after making sure that no one was left behind. It was the combination of teamwork, checklists, and smart rules of thumb that made the miracle possible.

***

Say what? They used a heuristic?

Heuristics enable us to make fast, highly (though not perfectly) accurate decisions without spending too much time searching for information. They allow us to focus on only a few pieces of information and ignore the rest.

“Experts,” Gigerenzer writes, “often search for less information than novices do.”

We do the same thing, intuitively, to catch a baseball — the gaze heuristic.

Fix your gaze on an object, and adjust your speed so that the angle of gaze remains constant.

Professionals and amateurs alike rely on this rule.

… If a fly ball comes in high, the player fixates his eyes on the ball, starts running, and adjusts his running speed so that the angle of gaze remains constant. The player does not need to calculate the trajectory of the ball. To select the right parabola, the player’s brain would have to estimate the ball’s initial distance, velocity, and angle, which is not a simple feat. And to make things more complicated, real-life balls do not fly in parabolas. Wind, air resistance, and spin affect their paths. Even the most sophisticated robots or computers today cannot correctly estimate a landing point during the few seconds a ball soars through the air. The gaze heuristic solves this problem by guiding the player toward the landing point, not by calculating it mathematically. That’s why players don’t know exactly where the ball will land, and often run into walls and over the stands in their pursuit.

The gaze heuristic is an example of how the mind can discover simple solutions to very complex problems.
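To see why the rule works, here is a minimal simulation (my own sketch, not from Gigerenzer's book): a ball follows a parabola, and a "fielder" never computes the trajectory. At each instant they simply stand wherever the ball appears at the same gaze angle as a moment after contact. When the ball lands, they are underneath it.

```python
G = 9.81  # gravity, m/s^2

def gaze_heuristic_landing(vx, vy, dt=0.01):
    """Chase a fly ball by holding the gaze angle constant.

    vx, vy: the ball's initial horizontal/vertical speed (m/s).
    Returns the fielder's position when the ball lands. The fielder
    never solves for the trajectory; they only keep the elevation
    angle to the ball fixed at its initial value.
    """
    t = dt
    bx, by = vx * t, vy * t - 0.5 * G * t * t  # ball a moment after contact
    tan_theta = by / bx                        # the angle to hold constant
    fielder = 0.0
    while True:
        t += dt
        bx = vx * t
        by = vy * t - 0.5 * G * t * t
        if by <= 0:  # ball has landed
            return fielder
        # Move to the spot where the ball sits at the original gaze angle
        fielder = bx - by / tan_theta
```

With vx = 10 and vy = 20, the true landing point is vx · 2vy/G ≈ 40.8 m, and the heuristic walks the fielder there without ever computing it.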

***

The broader point of Gigerenzer's book is that while rational thinking works well for risks, you need a combination of rational and heuristic thinking to make decisions under uncertainty.

The Checklist Manifesto: How to Get Things Right

It's no secret that I'm a huge fan of Atul Gawande. A reader recently pointed out that I hadn't covered his most recent book, The Checklist Manifesto: How to Get Things Right. I had only covered an interesting subset of the book—why we fail.

To put us in the proper context: we're smart. Not scary smart, but smart enough. We have know-how, and we generally put it in the hands of the most highly trained and hardworking people we can find. We have the most educated society in history. And we've accomplished a lot. Nevertheless, sometimes success escapes us for avoidable reasons. Not only are these failures common—across everything from medicine to finance—but they are also frustrating. We should know better, but we don't. The reason, Gawande argues, is evident:

the volume and complexity of what we know has exceeded our individual ability to deliver its benefits correctly, safely, or reliably. Knowledge has both saved us and burdened us.

To overcome this we need a strategy. Something that “builds on experience and takes advantage of the knowledge people have but somehow also makes up for our inevitable human inadequacies.” We need a checklist.

In response to increasing complexity, we've become more specialized. We divide the problem up. It's not just the growing breadth and quantity of knowledge that make things more complicated, though these are certainly significant contributors. It is also execution. In every field from medicine to construction there is a slew of practical procedures, policies, and best practices. Gawande breaks this down for the modern medical case:

[Y]ou have a desperately sick patient and in order to have a chance of saving him you have to get the knowledge right and then you have to make sure that the 178 daily tasks that follow are done correctly—despite some monitor’s alarm going off for God knows what reason, despite the patient in the next bed crashing, despite a nurse poking his head around the curtain to ask whether someone could help “get this lady’s chest open.” There is complexity upon complexity. And even specialization has begun to seem inadequate. So what do you do?

The response of the medical profession, like most others, is to move from specialization to super-specialization. Gawande argues that these super-specialists have two advantages over ordinary specialists: greater knowledge of the things that matter and “a learned ability to handle the complexities of that particular job.” But even for these super-specialists, avoiding mistakes is proving impossible.

Modern professions, like medicine, with their dazzling successes and spectacular failures, pose a significant challenge: “What do you do when expertise is not enough? What do you do when even the super-specialists fail?”

The origins of the checklist.

On October 30, 1935, at Wright Air Field in Dayton, Ohio, the U.S. Army Air Corps held a competition for airplane manufacturers vying to build the next generation of long-range bomber. Only it wasn't supposed to be much of a competition at all. The Boeing Corporation's gleaming aluminum-alloy Model 299 was expected to steal the show, its design far superior to those of the competition. In other words, it was just a formality.

As the Model 299 test plane taxied onto the runway, a small group of army brass and manufacturing executives watched. The plane took off without a hitch. Then suddenly, at about 300 feet, it stalled, turned on one wing, and crashed, killing two of the five crew members, including the pilot, Major Hill.

Of course everyone wanted to know what had happened. An investigation revealed nothing mechanically wrong with the plane. It was “pilot error.” The problem with the new plane, if there was one, was that it was substantially more complex than previous aircraft. Among other things, there were four engines, each with its own fuel mix; wing flaps; trim that needed constant adjustment; and propellers requiring pitch adjustment. While trying to keep up with the increased complexity, Hill had forgotten to release a new locking mechanism on the rudder controls. The new plane was simply too much for anyone to fly. The unexpected winner was the smaller Douglas design.

Here is where it really gets interesting. The army, convinced of the plane's technical superiority, ordered a few anyway. If you're thinking they'd just put the pilots through more training, you'd be wrong. Major Hill, the chief of flight testing, was an experienced pilot; more training was unlikely to help. Instead, they created a pilot's checklist.

The pilots made the list simple and short. It fit on an index card, with step-by-step checks for takeoff, flight, landing, and taxiing. It was as if someone had handed an experienced automobile driver a checklist of the obvious. There was nothing on it the pilots didn't already know: check that the instruments are set, that the door is closed. Basics. That checklist changed the course of history and quite possibly the war. The pilots went on to fly the Model 299 a “total of 1.8 million miles” without a single accident, and the army ultimately ordered almost 13,000 of them.

In The Checklist Manifesto, Gawande argues that most of today's work has entered a checklist phase.

Substantial parts of what software designers, financial managers, firefighters, police officers, lawyers, and most certainly clinicians do are now too complex for them to carry out reliably from memory alone.

Yet no one wants to use a checklist. We believe “our jobs are too complicated to reduce to a checklist.” After all, we don't work at McDonald's, right?

In a complex environment, experts are up against two main difficulties. The first is the fallibility of human memory and attention, especially when it comes to mundane, routine matters that are easily overlooked under the strain of more pressing events. (When you’ve got a patient throwing up and an upset family member asking you what’s going on, it can be easy to forget that you have not checked her pulse.) Faulty memory and distraction are a particular danger in what engineers call all-or-none processes: whether running to the store to buy ingredients for a cake, preparing an airplane for takeoff, or evaluating a sick person in the hospital, if you miss just one key thing, you might as well not have made the effort at all.

A further difficulty, just as insidious, is that people can lull themselves into skipping steps even when they remember them. In complex processes, after all, certain steps don’t always matter. … “This has never been a problem before,” people say. Until one day it is.

Checklists seem to provide protection against such failures. They remind us of the minimum necessary steps and make them explicit. They not only offer the possibility of verification but also instill a kind of discipline of higher performance.

In news that would shock bureaucracies and governments alike, the strategy a lot of industries use to get things right in complex environments is to give employees power. Most authorities, in response to risk, tend to centralize power and decision making.

Sometimes that's even the point of a checklist—to make sure the people below you do things the way you want. These McDonald's-type checklists spell out every critical step in the tiniest detail. They serve their purpose, but they also create a group of employees no longer able to adapt. When things change, as they always do, you're faced with a non-routine problem. “The philosophy,” writes Gawande, “is that you push the power of decision making out to the periphery and away from the center. You give people the room to adapt, based on their experience and expertise. All you ask is that they talk to one another and take responsibility. That is what works.”

…the real lesson is that under conditions of true complexity—where the knowledge required exceeds that of any individual and unpredictability reigns—efforts to dictate every step from the center will fail. People need room to act and adapt. Yet they cannot succeed as isolated individuals, either—that is anarchy. Instead, they require a seemingly contradictory mix of freedom and expectation—expectation to coordinate, for example, and also to measure progress toward common goals.

This was the understanding people in the skyscraper-building industry had grasped. More remarkably, they had learned to codify that understanding into simple checklists. They had made the reliable management of complexity a routine.

That routine requires balancing a number of virtues: freedom and discipline, craft and protocol, specialized ability and group collaboration. And for checklists to help achieve that balance, they have to take two almost opposing forms. They supply a set of checks to ensure the stupid but critical stuff is not overlooked, and they supply another set of checks to ensure people talk and coordinate and accept responsibility while nonetheless being left the power to manage the nuances and unpredictabilities the best they know how.

I came away from Katrina and the builders with a kind of theory: under conditions of complexity, not only are checklists a help, they are required for success. There must always be room for judgment, but judgment aided—and even enhanced—by procedure.

The Checklist Manifesto: How to Get Things Right is a fascinating read about an interesting subject by an amazing writer. Nuff said.

The Psychology Of The To-Do List

to-do-list

Ten years after David Allen’s bestselling productivity book Getting Things Done, scientific research caught up. We now know why the system is so popular and so effective.

The key behind GTD is writing everything down and sorting it effectively. This act of planning reduces the burden on the brain, which is struggling to hold the mental list of all the things we have to do. Releasing the burden of unfinished tasks on the mind frees it up to become more effective.

Tom Stafford explores this further in his BBC Future column.

“Filing effectively”, in Allen’s sense, means a system with three parts: an archive, where you store stuff you might need one day (and can forget until then), a current task list in which everything is stored as an action, and a “tickler file” of 43 folders in which you organise reminders of things to do (43 folders because that’s one for the next thirty-one days plus the next 12 months).

The current task list is a special kind of to-do list because all the tasks are defined by the next action you need to take to progress them. This simple idea is remarkably effective in helping resolve the kind of inertia that stops us resolving items on our lists. …

Breaking everything down into its individual actions allows the system to take hold, freeing you to either do something or forget about it, knowing the knowledge has been captured in the system. The system does the remembering and monitoring for you.
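The mechanics of the 43-folder tickler file are concrete enough to sketch in code. This is my own illustrative model, not anything from Allen's book: 31 day folders cover the current month, 12 month folders hold everything further out, and the daily ritual is to open and empty today's folder.

```python
from datetime import date

class TicklerFile:
    """A minimal sketch of GTD's 43-folder tickler file:
    31 day folders for the current month plus 12 month folders."""

    def __init__(self, today: date):
        self.today = today
        self.days = {d: [] for d in range(1, 32)}    # folders 1-31
        self.months = {m: [] for m in range(1, 13)}  # folders Jan-Dec

    def file(self, due: date, note: str) -> None:
        # Reminders due this month go into a day folder; later ones
        # wait in the folder of their month.
        if (due.year, due.month) == (self.today.year, self.today.month):
            self.days[due.day].append(note)
        else:
            self.months[due.month].append(note)

    def todays_folder(self) -> list:
        """The daily ritual: open today's folder and empty it."""
        notes, self.days[self.today.day] = self.days[self.today.day], []
        return notes
```

(A real tickler file also rolls month folders forward into day folders as each new month begins; this sketch only shows the filing side.)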

So what’s the psychology that backs this up?

Roy Baumeister and EJ Masicampo at Florida State University were interested in an old phenomenon called the Zeigarnik Effect, which is what psychologists call our mind’s tendency to get fixated on unfinished tasks and forget those we’ve completed. You can see the effect in action in a restaurant or bar – you can easily remember a drinks order, but then instantly forget it as soon as you’ve put the drinks down. …

A typical way to test for the Zeigarnik Effect is to measure if an unfulfilled goal interferes with the ability to carry out a subsequent task. Baumeister and Masicampo discovered that people did worse on a brainstorming task when they were prevented from finishing a simple warm-up task – because the warm-up task was stuck in their active memory. What Baumeister and Masicampo did next is the interesting thing; they allowed some people to make plans to finish the warm-up task. They weren’t allowed to finish it, just to make plans on how they’d finish it. Sure enough, those people allowed to make plans were freed from the distracting effect of leaving the warm-up task unfinished.

Our attention has a limited capacity. The GTD system frees up the attention used to keep track of our mental to-do list and acts as a plan for how we will do things, freeing our mind for more effective uses. You don't actually need to do the things on your list; you only need a plan for when and how to do them.

There is some tension here though. While to-do lists might reduce the burden on your brain, the most productive people rarely use them.

Michael Mauboussin: Two Tips to Improve The Quality of Your Decisions

Michael Mauboussin, chief investment strategist at Legg Mason and our first interview on the podcast, offers two simple techniques to improve the quality of your decision making: a decision journal and a checklist.

1. Create a decision journal and start using it.

Many years ago when I first met Danny Kahneman, and Kahneman is one of the preeminent psychologists in the world who won a Nobel Prize for economics in 2002, even though he's never taught an economics class.

When I posed him the question, what is a single thing an investor can do to improve his or her performance, he said almost without hesitation, go down to a local drugstore and buy a very cheap notebook and start keeping track of your decisions. And the specific idea is whenever you're making a consequential decision, something going in or out of the portfolio, just take a moment to think, write down what you expect to happen, why you expect it to happen and then actually, and this is optional, but probably a great idea, is write down how you feel about the situation, both physically and even emotionally. Just, how do you feel? I feel tired. I feel good, or this stock is really draining me. Whatever you think.

The key to doing this is that it prevents something called hindsight bias, which is that no matter what happens in the world, we tend to look back on our decision-making process and tilt it in a way that looks more favorable to us, right? So we have a bias to explain what has happened.

When you've got a decision-making journal, it gives you accurate and honest feedback of what you were thinking at that time. And so there can be situations, by the way, you buy a stock and it goes up, but it goes up for reasons very different than what you thought was going to happen. And having that feedback in a way to almost check yourself periodically is extremely valuable. So that's, I think, a very inexpensive; it's actually not super time consuming, but a very, very valuable way of giving yourself essential feedback because our minds won't do it normally.
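Kahneman's drugstore notebook needs no software, but the structure of an entry is worth making explicit. A minimal sketch, with field names that are my own illustrative choices rather than anything Kahneman or Mauboussin prescribes:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DecisionEntry:
    """One decision-journal entry: the expectation and the reasoning
    are written down before the outcome is known, then never edited."""
    decision: str          # e.g. "add a position to the portfolio"
    expected_outcome: str  # what you expect to happen
    reasoning: str         # why you expect it to happen
    feelings: str = ""     # optional: physical and emotional state
    recorded_on: date = field(default_factory=date.today)
    actual_outcome: str = ""  # filled in later, when the result is known
```

Reviewing old entries against `actual_outcome` supplies exactly the honest feedback that hindsight bias would otherwise erase.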

2. Use a checklist. 

Mauboussin: So the best work on this I've seen is by Atul Gawande, who is a surgeon in Boston who wrote a book a couple of years ago called The Checklist Manifesto, and one of the points he makes in there is that when you go from field to field, wherever checklists have been used correctly and with fidelity, they've been extremely effective in improving outcomes. So we all know none of us would step on an airplane today without the pilot having gone through the checklist. It's been a big move into medicine, especially for example, in surgery where checklists have really made substantial inroads in reducing infections, for example, and hence mortality, and other areas like construction elsewhere.

So the question is, how do you become more systematic in applying what you know? And I'll just mention one other thing on this. There are two; Gawande talks about two kinds of checklists. By the way, this branch is right out of aviation. One is called a do-confirm checklist, a do-confirm, and that just basically says, Hey, just do your normal analysis the way you've always done it and been trained to do that, but stop periodically just to confirm that you've covered all the bases. So as an analyst that might say, hey, I'm going to do a really thorough evaluation work. I might look very carefully at return on capital trends. I might study the competitive strategy position. You are just going to do all that stuff, but you're going to stop every now and then, just to check to make sure you've done everything.

The second kind of checklist is called a read-do checklist. This is for when you get into a difficult situation, for example, you're a pilot and one of your engines goes out; the read-do will guide how you should approach that problem. So you don't have to think about it so much, you just sort of go through it systematically. And so for an investor that might be, hey, what happens when a company misses a quarter? What happens when they have a negative announcement or an executive departure? Sometimes that means sell the stock. Sometimes that means buy more. Sometimes it means do nothing, and a read-do checklist can help guide some of that thinking as well. So it's really a way to be structured and consistent in your analysis.
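The two checklist styles differ in when you consult them: after the work (do-confirm) or step by step during it (read-do). A toy sketch of the distinction, with function names and steps of my own invention:

```python
def do_confirm(work_done: set, required: list) -> list:
    """DO-CONFIRM: perform the job from memory and experience, then
    pause and confirm nothing critical was skipped. Returns the
    required steps that were missed."""
    return [step for step in required if step not in work_done]

def read_do(steps: list) -> list:
    """READ-DO: in an abnormal situation, read each step and carry it
    out immediately before moving on. `steps` is a list of
    (name, action) pairs; returns the names in execution order."""
    executed = []
    for name, action in steps:
        action()
        executed.append(name)
    return executed
```

A do-confirm run against the Model 299 story might flag the one forgotten item: confirming `{"flaps set", "trim set"}` against a required list ending in "rudder lock released" returns exactly that missed step.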

A Simple Checklist to Improve Decisions

We owe thanks to the publishing industry. Their ability to take a concept and fill an entire category with a shotgun approach is the reason that more people are talking about biases.

Unfortunately, talk alone will not eliminate them, but it is possible to take steps to counteract them. Reducing biases can make a huge difference in the quality of any decision, and it is easier than you think.

In a recent article for Harvard Business Review, Daniel Kahneman and his coauthors describe a simple way to detect bias and minimize its effects in the most common type of decision people make: determining whether to accept, reject, or pass on a recommendation.

The Munger two-step process for making decisions is a more complete framework, but Kahneman's approach is a good way to help reduce biases in our decision-making.

If you're short on time here is a simple checklist that will get you started on the path towards improving your decisions:

Preliminary Questions: Ask yourself

1. Check for Self-interested Biases

  • Is there any reason to suspect the team making the recommendation of errors motivated by self-interest?
  • Review the proposal with extra care, especially for overoptimism.

2. Check for the Affect Heuristic

  • Has the team fallen in love with its proposal?
  • Rigorously apply all the quality controls on the checklist.

3. Check for Groupthink

  • Were there dissenting opinions within the team?
  • Were they explored adequately?
  • Solicit dissenting views, discreetly if necessary.

Challenge Questions: Ask the recommenders

4. Check for Saliency Bias

  • Could the diagnosis be overly influenced by an analogy to a memorable success?
  • Ask for more analogies, and rigorously analyze their similarity to the current situation.

5. Check for Confirmation Bias

  • Are credible alternatives included along with the recommendation?
  • Request additional options.

6. Check for Availability Bias

  • If you had to make this decision again in a year’s time, what information would you want, and can you get more of it now?
  • Use checklists of the data needed for each kind of decision.

7. Check for Anchoring Bias

  • Do you know where the numbers came from? Can there be unsubstantiated numbers, extrapolation from history, or a motivation to use a certain anchor?
  • Reanchor with figures generated by other models or benchmarks, and request new analysis.

8. Check for Halo Effect

  • Is the team assuming that a person, organization, or approach that is successful in one area will be just as successful in another?
  • Eliminate false inferences, and ask the team to seek additional comparable examples.

9. Check for Sunk-Cost Fallacy, Endowment Effect

  • Are the recommenders overly attached to a history of past decisions?
  • Consider the issue as if you were a new CEO.

Evaluation Questions: Ask about the proposal

10. Check for Overconfidence, Planning Fallacy, Optimistic Biases, Competitor Neglect

  • Is the base case overly optimistic?
  • Have the team build a case taking an outside view; use war games.

11. Check for Disaster Neglect

  • Is the worst case bad enough?
  • Have the team conduct a premortem: Imagine that the worst has happened, and develop a story about the causes.

12. Check for Loss Aversion

  • Is the recommending team overly cautious?
  • Realign incentives to share responsibility for the risk or to remove risk.
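The twelve checks lend themselves to a data-driven form, so a review can walk through them mechanically. A sketch with three of the checks encoded; the keys and condensed phrasing are mine, not the HBR article's:

```python
# Three of the twelve checks, encoded as (key, question, remedy).
CHECKS = [
    ("self-interest",
     "Is there reason to suspect errors motivated by self-interest?",
     "Review the proposal with extra care, especially for overoptimism."),
    ("groupthink",
     "Were dissenting opinions missing or left unexplored?",
     "Solicit dissenting views, discreetly if necessary."),
    ("disaster-neglect",
     "Is the worst case not bad enough?",
     "Conduct a premortem: imagine the worst has happened, then explain why."),
]

def triggered_remedies(answers: dict) -> list:
    """Given yes/no answers keyed by check, return the remedies to apply."""
    return [remedy for key, _question, remedy in CHECKS if answers.get(key)]
```

Answering "yes" to only the groupthink question, for instance, surfaces only the groupthink remedy; the point of the encoding is that no check can be silently skipped.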

If you're looking to dramatically improve your decision making here is a great list of books to get started:

Nudge: Improving Decisions About Health, Wealth, and Happiness by Richard H. Thaler and Cass R. Sunstein

Think Twice: Harnessing the Power of Counterintuition by Michael J. Mauboussin

Think Again: Why Good Leaders Make Bad Decisions and How to Keep It from Happening to You by Sydney Finkelstein, Jo Whitehead, and Andrew Campbell

Predictably Irrational: The Hidden Forces That Shape Our Decisions by Dan Ariely

Thinking, Fast and Slow by Daniel Kahneman

Judgment and Managerial Decision Making by Max Bazerman