Noting how different medicine is today from the era when the iconic Marcus Welby, MD, was practicing (on television, 1969 to 1976), Dr. Wachter says: For one thing, "it’s been estimated the average physician in Welby’s era had to master the use of approximately 10 medicines. Today, that number is close to 500!"
The fact is, most prescribing doctors do not know the hazardous effects of the drugs they prescribe. Most doctors rely on the drug myths delivered by sales reps and reinforced by colleagues whose reputation as "opinion leaders" is measured by their income from pharmaceutical companies.
The surgical arena is not much better:
"If you could walk into an intensive care unit or observe a modern heart surgery, you’d quickly appreciate that increasingly what determines whether a patient lives or dies is not whether the doctor is as smart as current television’s Dr. Gregory House, but rather whether there is a functioning computer system (which not only prevents handwriting errors but suggests the right medicine at the right time and reminds the doctor a certain two drugs shouldn’t be given together). And whether there are policies and procedures (such as strict hand-washing guidelines, and a protocol for the surgeon to sign the surgical site to prevent operating on the wrong leg) that are thoughtfully developed and rigorously enforced. And whether the doctors, nurses, technicians and hospital administrators work together as a team. "
Imagine hospitals being licensed without even basic functioning computer systems; imagine doctors needing "strict hand-washing guidelines"; imagine surgeons operating without verifying that they are performing the surgery on the right organ!
"This final issue may be the most important of all. At my hospital (UCSF Medical Center) and several other centers around the United States, we have enlisted the help of commercial airline pilots to teach us how to communicate better, how to dampen down hierarchies (so that a young nurse feels comfortable questioning a senior doctor when something seems awry), and how to debrief participants after an operation, just as crew members are debriefed after a flight."
Yes, and the FDA should also turn to pilots to help develop a meaningful drug safety evaluation protocol.
Contact: Vera Hassner Sharav
veracare@ahrp.org
http://www.newsday.com/news/health/ny-opwac274866145aug27,0,5285311.story?coll=ny-health-print
Newsday August 27, 2006
Curing medical mistakes
The answer to any Stony Brook hospital problems likely rests with the culture of care, not a few ‘bad apples’
BY ROBERT M. WACHTER
Robert M. Wachter, M.D., is associate chairman of medicine at the University of California, San Francisco, and co-author of "Internal Bleeding: The Truth Behind America’s Terrifying Epidemic of Medical Mistakes."
The recent reports about unexpected deaths of children after cardiac surgery at Stony Brook University Hospital are tragic, and they raise important questions about how we measure and safeguard the quality of care. Although we will have to let the investigation take its course to find out whether medical errors or violations of care standards led to these deaths, the events at Stony Brook help frame the challenges we face in ensuring patients’ safety.
This is an important issue. A 1999 report by the Institute of Medicine, the medical branch of the National Academy of Sciences, estimated that 44,000 to 98,000 Americans die each year because of medical errors, the equivalent of a jumbo jet crashing every day. Research from the Rand Corp. think tank has demonstrated that medical care comports with recommended guidelines slightly more than half the time in the United States. And a recent report also by the Institute of Medicine found that, on average, a hospital patient experiences one medication error – not necessarily severe – every single day he or she is in the hospital.
How can this be? The answer, as we’ve come to learn, is not completely intuitive. Whenever we hear about bad things happening to patients, we naturally try to point fingers – to find (and then to sue and pull the license from) the "bad doctor" or "bad nurse" responsible for the problem.
And there are some bad health care providers – people who are poorly trained, don’t keep up with advances in their field or abuse drugs or alcohol – who shouldn’t be caring for patients. We should do more to identify these individuals and get them the help they need to improve or make sure they never hurt another patient.
But think about your own doctor, and the nurses you’ve met. Is it possible they are all lazy, poorly trained and careless? Of course not. Understanding this helps make the point that the quality and safety of health care is not just about the quality of individual practitioners; it is also about the systems of care in which we health care professionals work and in which you receive your care.
The "system-ness" of care is a relatively new phenomenon. When Marcus Welby, MD, was practicing (on television, 1969 to 1976), it may well have been possible for a well-trained, careful and competent doctor to prevent most medical errors. After all, it’s been estimated the average physician in Welby’s era had to master the use of approximately 10 medicines. Today, that number is close to 500!
If you could walk into an intensive care unit or observe a modern heart surgery, you’d quickly appreciate that increasingly what determines whether a patient lives or dies is not whether the doctor is as smart as current television’s Dr. Gregory House, but rather whether there is a functioning computer system (which not only prevents handwriting errors but suggests the right medicine at the right time and reminds the doctor a certain two drugs shouldn’t be given together). And whether there are policies and procedures (such as strict hand-washing guidelines, and a protocol for the surgeon to sign the surgical site to prevent operating on the wrong leg) that are thoughtfully developed and rigorously enforced. And whether the doctors, nurses, technicians and hospital administrators work together as a team.
This final issue may be the most important of all. At my hospital (UCSF Medical Center) and several other centers around the United States, we have enlisted the help of commercial airline pilots to teach us how to communicate better, how to dampen down hierarchies (so that a young nurse feels comfortable questioning a senior doctor when something seems awry), and how to debrief participants after an operation, just as crew members are debriefed after a flight.
This kind of team training does not come naturally to most doctors. But it did not come naturally to pilots either when it was introduced 20 years ago. After all, the old pilot personality (remember "The Right Stuff") was as macho as any surgeon’s. In fact, many pilots dismissed early teamwork training programs as "charm school."
It was only after several horrible plane crashes (including the 1977 collision of two 747s in the Canary Islands that killed nearly 600 people) that aviation learned the price to be paid when one member of a cockpit crew suspected something might be terribly wrong but didn’t feel comfortable challenging the boss’ judgment.
Since that time, aviation has required all its flight personnel to participate in teamwork and simulator training to create a more collaborative environment, with better communication among all the workers. Judging by the breathtaking decrease in commercial aviation accidents over the past generation, these efforts to create a safer culture have worked astonishingly well.
As we try to learn how to improve medical systems and culture – drawing on lessons from other industries where appropriate – it is critical to remember that caring for a sick patient is, of course, far more complex than flying a jumbo jet. It’s equally important we never forget we are not working with machines, but caring for sick human beings. Even as we embrace "systems thinking" to improve quality and make our care more reliable and less glitchy, medicine is, and will always be, a uniquely human undertaking.
It is too early to judge whether the problems at Stony Brook Hospital are widespread and systemic or represent an awful statistical fluke. At this point, the decision to suspend pediatric heart surgeries while this is sorted out seems prudent. If recent history is a predictor, the answer to any problems that may be uncovered at Stony Brook is more likely to be found in trying to improve the systems of care and the culture of safety than in trying to find and punish one or two bad apples.
Today’s medicine is so complicated that trying to mint the flawless doctor or nurse is a fool’s errand. Instead, we need to create a system that anticipates that human beings – even very well trained, hardworking and compassionate ones – will blow it from time to time, and that catches these errors before they cause more tragedies.
FAIR USE NOTICE: This may contain copyrighted (©) material the use of which has not always been specifically authorized by the copyright owner. Such material is made available for educational purposes, to advance understanding of human rights, democracy, scientific, moral, ethical, and social justice issues, etc. It is believed that this constitutes a "fair use" of any such copyrighted material as provided for in Title 17 U.S.C. Section 107 of the US Copyright Law. This material is distributed without profit.