To put us in the proper context: we're smart. Not scary smart, but smart enough. We have skills, and we generally place our trust in the most highly trained and hardworking people we can find. We have the most educated society in history. And we've accomplished a lot. Nevertheless, sometimes success escapes us for avoidable reasons. These failures are common, in fields from medicine to finance, and they are frustrating. We should know better, but we don't. The reason we don't learn, Atul Gawande argues, is evident:
the volume and complexity of what we know has exceeded our individual ability to deliver its benefits correctly, safely, or reliably. Knowledge has both saved us and burdened us.
To overcome this, we need a strategy. Something that “builds on experience and takes advantage of the knowledge people have but somehow also makes up for our inevitable human inadequacies.” We need a checklist.
In response to increasing complexity, we've become more specialized. We divide the problem up. It's not just the growing breadth and quantity of knowledge that makes things more complicated, although both certainly contribute. It is also execution. In every field, from medicine to construction, there is a slew of practical procedures, policies, and best practices. Gawande breaks this down for the modern medical case:
[Y]ou have a desperately sick patient and in order to have a chance of saving him you have to get the knowledge right and then you have to make sure that the 178 daily tasks that follow are done correctly—despite some monitor’s alarm going off for God knows what reason, despite the patient in the next bed crashing, despite a nurse poking his head around the curtain to ask whether someone could help “get this lady’s chest open.” There is complexity upon complexity. And even specialization has begun to seem inadequate. So what do you do?
The response of the medical profession, like most others, is to move from specialization to super-specialization. Gawande argues that these super-specialists have two advantages over ordinary specialists: greater knowledge of the things that matter and “a learned ability to handle the complexities of that particular job.” But even for these super-specialists, avoiding mistakes is proving impossible.
Modern professions, like medicine, with their dazzling successes and spectacular failures, pose a significant challenge: “What do you do when expertise is not enough? What do you do when even the super-specialists fail?”
The origins of the checklist.
On October 30, 1935, at Wright Air Field in Dayton, Ohio, the U.S. Army Air Corps held a competition for airplane manufacturers vying to build the next generation of long-range bomber. Only it wasn't supposed to be much of a competition at all. The Boeing Corporation's gleaming aluminum-alloy Model 299 was expected to steal the show, its design far superior to those of the competition. In other words, the flyoff was just a formality.
As the Model 299 test plane taxied onto the runway, a small group of army brass and manufacturing executives watched. The plane took off without a hitch. Then suddenly, at about 300 feet, it stalled, turned on one wing, and crashed, killing two of the five crew members, including the pilot, Major Hill.
Of course, everyone wanted to know what had happened. An investigation revealed nothing mechanically wrong with the plane. It was “pilot error.” The problem with the new plane, if there was one, was that it was substantially more complex than previous aircraft. Among other things, it had four engines, each with its own fuel mix, wing flaps, trim that needed constant adjustment, and propellers requiring pitch adjustment. While trying to keep up with the increased complexity, Hill had forgotten to release a new locking mechanism on the rudder controls. The new plane was too much for anyone to fly. The unexpected winner was Douglas's smaller design.
Here is where it really gets interesting. The army, convinced of the technical superiority of the plane, ordered a few anyway. If you're thinking they'd just put the pilots through more training to fly the plane, you'd be wrong. Major Hill, the chief of flight testing, was an experienced pilot, so longer training was unlikely to result in improvement. Instead, they created a pilot's checklist.
The pilots made the list simple and short. It fit on an index card, with step-by-step instructions for takeoff, flying, landing, and taxiing. It was as if someone had suddenly handed an experienced automobile driver a checklist of things that would be obvious to them. There was nothing on the checklist they didn't know. Stuff like checking that the instruments are set and the door is closed. Basics. That checklist changed the course of history and quite possibly the war. The pilots went on to fly the Model 299 a “total of 1.8 million miles” without a single accident, and as a result the army ordered over 13,000 of them.
In The Checklist Manifesto, Gawande argues that most of today's work has entered a checklist phase.
Substantial parts of what software designers, financial managers, firefighters, police officers, lawyers, and most certainly clinicians do are now too complex for them to carry out reliably from memory alone.
Yet no one wants to use a checklist. We believe “our jobs are too complicated to reduce to a checklist.” After all, we don't work at McDonald's, right?
In a complex environment, experts are up against two main difficulties. The first is the fallibility of human memory and attention, especially when it comes to mundane, routine matters that are easily overlooked under the strain of more pressing events. (When you’ve got a patient throwing up and an upset family member asking you what’s going on, it can be easy to forget that you have not checked her pulse.) Faulty memory and distraction are a particular danger in what engineers call all-or-none processes: whether running to the store to buy ingredients for a cake, preparing an airplane for takeoff, or evaluating a sick person in the hospital, if you miss just one key thing, you might as well not have made the effort at all.
A further difficulty, just as insidious, is that people can lull themselves into skipping steps even when they remember them. In complex processes, after all, certain steps don’t always matter. … “This has never been a problem before,” people say. Until one day it is.
Checklists seem to provide protection against such failures. They remind us of the minimum necessary steps and make them explicit. They not only offer the possibility of verification but also instill a kind of discipline of higher performance.
In news that would shock bureaucracies and governments alike, the strategy many industries use to get things right in complex environments is to give employees power. Most authorities respond to risk by doing the opposite: centralizing power and decision making.
Sometimes that's even the point of a checklist: to make sure the people below you are doing things the way you want. These checklists, the McDonald's type, spell out the tiniest detail of every critical step. They serve their purpose, but they also create a group of employees no longer able to adapt. When things change, as they always do, you're faced with a non-routine problem. “The philosophy,” writes Gawande, “is that you push the power of decision making out to the periphery and away from the center. You give people the room to adapt, based on their experience and expertise. All you ask is that they talk to one another and take responsibility. That is what works.”
…the real lesson is that under conditions of true complexity—where the knowledge required exceeds that of any individual and unpredictability reigns—efforts to dictate every step from the center will fail. People need room to act and adapt. Yet they cannot succeed as isolated individuals, either—that is anarchy. Instead, they require a seemingly contradictory mix of freedom and expectation—expectation to coordinate, for example, and also to measure progress toward common goals.
This was the understanding people in the skyscraper-building industry had grasped. More remarkably, they had learned to codify that understanding into simple checklists. They had made the reliable management of complexity a routine.
That routine requires balancing a number of virtues: freedom and discipline, craft and protocol, specialized ability and group collaboration. And for checklists to help achieve that balance, they have to take two almost opposing forms. They supply a set of checks to ensure the stupid but critical stuff is not overlooked, and they supply another set of checks to ensure people talk and coordinate and accept responsibility while nonetheless being left the power to manage the nuances and unpredictabilities the best they know how.
I came away from Katrina and the builders with a kind of theory: under conditions of complexity, not only are checklists a help, they are required for success. There must always be room for judgment, but judgment aided—and even enhanced—by procedure.