
Risky Business: James Bagian—NASA astronaut turned patient safety expert—on Being Wrong

A must-read interview with James Bagian (h/t Joe). James is, among other things, an engineer, an anesthesiologist, a NASA astronaut (he was originally scheduled to be on the fatal Challenger mission), a private pilot, an Air Force-qualified freefall parachutist, and a mountain rescue instructor. And then there’s his current job: director of the Veterans Administration’s National Center for Patient Safety. In that capacity, Bagian is responsible for overseeing the reduction and prevention of harmful medical mistakes at the VA’s 153 hospitals.

Some extracts from the interview.

How does the healthcare industry compare to engineering and aeronautics when it comes to dealing with human error?

Not favorably. Much of my background is in what’s called high-reliability industries—the ones that operate under conditions of high hazard yet seldom have a bad event—and people in those fields tend to have a systems perspective. We’re not terribly interested in what some individual did. We want to know what led up to a bad event and what changes we need to make to reduce the likelihood of that event ever happening again.

When I got into healthcare, I felt like I’d stepped into an entirely different world. It was all about, “Let’s figure out who screwed up and blame them and punish them and explain to them why they’re stupid.” To me, it’s almost like whistling past the graveyard. When we demonize the person associated with a bad event, it makes us feel better. It’s like saying, “We’re not stupid, so it won’t happen to us.” Whereas in fact it could happen to us tomorrow.

That’s true, but if at the end of the day all you can say is, “So-and-so made a mistake,” you haven’t solved anything. Take a very simple example: A nurse gives the patient in Bed A the medicine for the patient in Bed B. What do you say? “The nurse made a mistake”? That’s true, but then what’s the solution? “Nurse, please be more careful”? Telling people to be careful is not effective. Humans are not reliable that way. Some are better than others, but nobody’s perfect. You need a solution that’s not about making people perfect.

So we ask, “Why did the nurse make this mistake?” Maybe there were two drugs that looked almost the same. That’s a packaging problem; we can solve that. Maybe the nurse was expected to administer drugs to ten patients in five minutes. That’s a scheduling problem; we can solve that. And these solutions can have an enormous impact. Seven to 10 percent of all medicine administrations involve either the wrong drug, the wrong dose, the wrong patient, or the wrong route. Seven to 10 percent. But if you introduce bar coding for medication administration, the error rate drops to one tenth of one percent. That’s huge.
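To put those numbers in perspective, here is a quick back-of-the-envelope sketch. The 1,000-doses-per-day volume is an assumption chosen purely for illustration; the error rates are the ones Bagian cites.

```python
# Back-of-the-envelope: how big is the drop from a 7-10% medication error rate
# to 0.1% with bar coding, at a hypothetical hospital administering 1,000 doses a day?

doses_per_day = 1_000                       # assumed volume, purely illustrative
baseline_low, baseline_high = 0.07, 0.10    # 7-10% of administrations involve an error
with_barcoding = 0.001                      # one tenth of one percent

errors_before_low = doses_per_day * baseline_low
errors_before_high = doses_per_day * baseline_high
errors_after = doses_per_day * with_barcoding

print(f"Errors per day before bar coding: {errors_before_low:.0f}-{errors_before_high:.0f}")
print(f"Errors per day with bar coding:   {errors_after:.0f}")
print(f"Reduction factor: {baseline_low / with_barcoding:.0f}x-{baseline_high / with_barcoding:.0f}x")
```

On those assumptions, bar coding takes a hospital from roughly 70-100 erroneous administrations a day down to about one, a 70- to 100-fold reduction.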

What kinds of tools have you introduced that do work?

One thing we do that’s unusual is we look at close calls. In the beginning, nobody did that in healthcare. Even today probably less than 10 percent of hospital facilities require that close calls be reported, and an even smaller percentage do root cause analyses on them. At the VA, 50 percent of all the root cause analyses we do are on close calls. We think that’s hugely important. So does aviation. So does engineering. So does nuclear power. But you talk to most people in healthcare, they’ll say, “Why bother? Nothing really happened. What’s the big deal?”

How do you get people to tell you about their close calls, or for that matter about their actual errors? Getting people to report problems has always been tricky in medicine.

Yeah, reporting is a huge issue, because obviously you can’t fix a problem if you don’t know about it. Back in 1998, we conducted a huge cultural survey on patient safety, and one of the questions we asked was, “Why don’t you report?” And the major reason—most people think it’s going to be fear of malpractice or punishment, but it wasn’t those. It was embarrassment, humiliation. So the question became, How do you get people to not be afraid of that? We talked about it a lot, and we devised what we called a blameworthy act, which we defined as possessing one of the following three characteristics: it involves assault, rape, or larceny; the caregiver was drunk or on illicit drugs; or he or she did something that was purposefully unsafe. If you commit a blameworthy act, that’s not a safety issue, although it might manifest as one. That’s going to get handled administratively, and probably you should be embarrassed. But we made it clear that blameworthiness was a very narrow case.

At the time that we conducted this survey, we were already considered to be a good reporting healthcare organization; our people reported more than in most places. But in the ten months after we implemented this definition, our reporting went up 30-fold. That’s 3,000 percent. And it has continued to go up ever since—not as dramatically, but a couple of percentage points every year.

It’s pretty sobering that the reporting rate can go up so much. I realize that that’s good news, but it also suggests that there was (and to a lesser extent presumably still is) a lot of bad stuff going on out there that we never hear about.

That’s true. But the only reason to have reporting is to identify vulnerabilities, not to count the number of incidents. Reports are never good for determining incidence or prevalence, because they’re essentially voluntary. Even if you say “You must report,” people will only report when they feel like it’s in their interest to do so.

Do you punish people for failing to report serious medical issues?

No. In theory, punishment sounds like a good idea, but in practice, it’s a terrible one. All it does is create a system where it’s not in people’s interest to report a problem.

Source: http://www.slate.com/blogs/blogs/thewrongstuff/archive/2010/06/28/risky-business-james-bagian-nasa-astronaut-turned-patient-safety-expert-on-being-wrong.aspx
