
Patient Safety Issues



Presentation Transcript


  1. Patient Safety Issues Where Does the Lab Professional Fit In? Mary Ann McLane, PhD, CLS(NCA) Region II Director

  2. The patient must come first!

  3. Objectives At the conclusion of this seminar, the participant will be able to: • Describe the components of the Institute of Medicine’s 1999 “To Err Is Human” document which relate to the clinical lab. • Compare and contrast the programs offered by JCAHO’s “Speak Up” initiative. • List at least 5 examples of errors involving patient safety and pre-analytical/post-analytical error.

  4. Unsafe acts are like mosquitoes… You can try to swat them one at a time, but there will always be others to take their place. The only effective remedy is to drain the swamps in which they breed. In the case of errors and violations, the "swamps" are equipment designs that promote operator error, bad communications, high workloads, budgetary and commercial pressures…

  5. Unsafe acts are like mosquitoes… …procedures that necessitate their violation in order to get the job done, inadequate organization, missing barriers, and safeguards . . . the list is potentially long but all of these latent factors are, in theory, detectable and correctable before a mishap occurs. (James Reason, quoted in To Err Is Human)

  6. Americans harmed by medical error • Two studies of large samples of hospital admissions • New York, using 1984 data • Colorado and Utah, using 1992 data • adverse events (injuries caused by medical management) occurred in 3.7 and 2.9 percent of admissions, respectively • the proportion of adverse events attributable to errors (i.e., preventable adverse events) was 58 percent in New York and 53 percent in Colorado and Utah

  7. Extrapolated to the over 33.6 million admissions to U.S. hospitals in 1997 • 44,000 to 98,000 Americans die in hospitals each year as a result of medical errors • exceeding the number attributable to the 8th-leading cause of death • exceeding the deaths attributable to motor vehicle accidents (43,458), breast cancer (42,297), or AIDS (16,516)
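
  As a rough back-of-the-envelope check, the 98,000 upper bound can be approximately reproduced from figures cited elsewhere in this presentation: the 33.6 million annual admissions, the 3.7% New York adverse event rate, the 13.6% of adverse events that resulted in death (Harvard Medical Practice Study, slide 21), and the 58% preventable fraction:

  \[ 33{,}600{,}000 \times 0.037 \times 0.136 \times 0.58 \approx 98{,}000 \text{ deaths per year} \]

  The 44,000 lower bound follows the same logic using the Colorado and Utah rates, which are not all quoted in this deck.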

  8. Total national costs • lost income, lost household production, disability, health care costs • $37.6 billion to $50 billion for adverse events • $17 billion to $29 billion for preventable adverse events • slightly higher than the direct and indirect costs of caring for people with HIV and AIDS.

  9. Lives lost • more than 6,000 Americans die from workplace injuries every year • in 1993, medication errors were estimated to have accounted for about 7,000 deaths • one out of 131 outpatient deaths • one out of 854 inpatient deaths

  10. Medication-related errors occur frequently in hospitals; not all result in actual harm, but those that do are costly. • 2% of admissions at two large hospitals involved a preventable adverse drug event • average increased hospital costs of $4,700 per admission • about $2.8 million annually for a 700-bed teaching hospital.
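
  As a rough consistency check, these per-admission and per-hospital figures hang together if one assumes on the order of 30,000 admissions per year for a 700-bed teaching hospital (an assumed figure, not given in the slides):

  \[ 30{,}000 \times 0.02 \times \$4{,}700 \approx \$2.8 \text{ million per year} \]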

  11. Medication-related errors • not all result in actual harm • those that do are costly • Preventable adverse drug events: about $2 billion for the nation as a whole.

  12. Not just hospital patients • In 1998: ~2.5 billion prescriptions were dispensed by U.S. pharmacies at a cost of about $92 billion. • errors in • prescribing medications • dispensing by pharmacists • unintentional nonadherence on the part of the patient.

  13. Definitions • Adverse event • injury caused by medical management rather than the underlying condition of the patient. • Preventable adverse event • adverse event attributable to error

  14. Definitions • Error • the failure of a planned action to be completed as intended (i.e., error of execution) • the use of a wrong plan to achieve an aim (i.e., error of planning)

  15. Definitions • Negligent adverse event • the care provided failed to meet the standard of care reasonably expected of an average physician qualified to take care of the patient Discussion point: expected of an “average physician” only?

  16. Why focus on medication-related error? • One of the most common types of error • Substantial numbers of individuals are affected • Accounts for a sizable increase in health care costs

  17. Why focus on medication-related error? • Easy to identify an adequate sample of patients who experience adverse drug events • The drug prescribing process provides good documentation of medical decisions, residing in automated, easily accessible databases • Case of Comfort and Caring, Inc • Deaths attributable to medication errors are recorded on death certificates.

  18. Important note! • “There are probably other areas of health care delivery that have been studied to a lesser degree but may offer equal or greater opportunity for improvement in safety.” • That is us!!

  19. What the literature shows 1. How frequently do errors occur? 2. What factors contribute to errors? 3. What are the costs of errors? 4. Are public perceptions of safety in health care consistent with the evidence?

  20. Harvard Medical Practice Study • >30,000 randomly selected discharges • 51 randomly selected hospitals in New York State in 1984 • Adverse events, manifested by prolonged hospitalization or disability at the time of discharge or both = 3.7% • Preventable adverse events = 58% • Negligence = 27.6%

  21. Harvard Medical Practice Study • 13.6% resulted in death • 2.6% caused permanently disabling injuries • Type of adverse event • drug complications = 19% • wound infections = 14% • technical complications = 13%

  22. First instinct? • Blame someone! However… • due most often to the convergence of multiple contributing factors • blaming an individual does not change these factors and the same error is likely to recur • Case of Charles Thompson, death-row inmate from TX

  23. What would work better? • Preventing errors and improving safety for patients requires a systems approach • to modify the conditions that contribute to errors • which recognizes that people working in health care are among the most educated and dedicated workforce in any industry

  24. What would work better? • The problem is not bad people • The problem is that the system needs to be made safer.

  25. Hindsight bias • things that were not seen or understood at the time of the accident seem obvious in retrospect • misleads a reviewer into simplifying the causes of an accident • highlighting a single element as the cause • overlooking multiple contributing factors

  26. Hindsight bias • things that were not seen or understood at the time of the accident seem obvious in retrospect • information about an accident is spread over many participants • no one may have complete information • easy to arrive at a simple solution or to blame an individual, but difficult to determine what really went wrong.

  27. More definitions • Slips • action conducted is not what was intended • observable • Mistakes • the planned action is wrong

  28. More definitions • Slips • physician chooses an appropriate medication, writes 10 mg when the intention was to write 1 mg • Mistakes • selecting the wrong drug because the diagnosis is wrong • Important not to equate slip with "minor." Patients can die from slips as well as mistakes.

  29. Lab definitions? • Slips (action conducted is not what was intended): physician chooses an appropriate medication, writes 10 mg when the intention was to write 1 mg • Mistakes (the planned action is wrong)

  30. Safety = absence of errors? • More! • Multiple dimensions • an outlook: health care is complex and risky and solutions are found in the broader systems context; • a set of processes: identify, evaluate, and minimize hazards and continuously improve • an outcome: manifested by fewer medical errors and minimized risk or hazard

  31. Safety definition • Freedom from accidental injury • from the patient's perspective, the primary safety goal is to prevent accidental injuries • Safe environment = low risk of accidents • reduce defects in the process or departures from the way things should have been done • establish operational systems and processes that increase the reliability of patient care.

  32. Active vs. latent error • Active errors • occur at the level of the frontline operator • their effects are felt almost immediately • Latent errors • removed from the direct control of the operator • poor design, incorrect installation, faulty maintenance, bad management decisions, and poorly structured organizations

  33. Active vs. latent error • Active errors • the pilot crashed the plane • Latent errors • a previously undiscovered design malfunction caused the plane to roll unexpectedly in a way the pilot could not control and the plane crashed

  34. Active vs. latent error • Latent error • greatest threat to safety in a complex system • often unrecognized • have the capacity to result in multiple types of active errors • in the Challenger accident, contributing events were traced back nine years • in the Three Mile Island accident, latent errors were traced back two years

  35. Active vs. latent error • Latent error • difficult for the people working in the system to notice • errors may be hidden • in the design of routine processes • in computer programs • in the structure or management of the organization • people become accustomed to design defects and learn to work around them, so they are often not recognized

  36. Active vs. latent error • Latent error • "normalization of deviance" • small changes in behavior became the norm • additional deviations became acceptable • the potential for errors is created • signals are overlooked or misinterpreted • signals accumulate without being noticed

  37. Active vs. latent lab error • Active errors • Latent errors

  38. First instinct? • focus on the active errors by punishing individuals (e.g., firing or suing them) • retraining or other responses aimed at preventing recurrence of the active error • punitive response may be appropriate in some cases (e.g., deliberate malfeasance) • it is not an effective way to prevent recurrence

  39. First instinct? • Large system failures • latent failures coming together in unexpected ways • appear to be unique in retrospect • Same mix of factors is unlikely to occur again • efforts to prevent specific active errors are not likely to make the system any safer

  40. Focus on active errors • lets the latent failures remain in the system • their accumulation actually makes the system more prone to future failure

  41. Focus on latent errors • Discovering and fixing latent failures, and decreasing their duration, are likely to have a greater effect on building safer systems than efforts to minimize active errors at the point at which they occur

  42. High reliability theory • accidents can be prevented through good organizational design and management • an organizational commitment to safety • high levels of redundancy in personnel and safety measures • strong organizational culture for continuous learning and willingness to change

  43. Correct performance and error • “two sides of the same coin”

  44. Complexity and tight-coupling • Systems that are more complex and tightly coupled are more prone to accidents and have to be made more reliable • complex and tightly coupled systems can “spring nasty surprises” • Guess what type of system healthcare is????!!!

  45. Two cases of success • Aviation • Occupational health • growing awareness of safety concerns and the need to improve performance • comprehensive strategies • creation of a national focal point for leadership • development of a knowledge base • dissemination of information throughout the industry

  46. Two cases of success • Aviation • Occupational health • designated government agency with regulatory responsibility for safety • carefully constructed research agenda • substantial resources devoted to these initiatives

  47. Third case of success? • Healthcare • no cohesive effort to improve safety in health care • resources devoted to enhancing and disseminating the knowledge base are wholly inadequate • “health care is not likely to make significant safety improvements without a more comprehensive, coordinated approach.”

  48. Center for Patient Safety • provide leadership for safety improvements throughout the industry • establish goals and track progress in achieving results • expand the knowledge base for improving safety in health care • provide visibility to safety concerns

  49. Role of professionals • Become active leaders in encouraging and demanding improvements in patient safety. • Setting standards, convening and communicating with members about safety • Incorporating attention to patient safety into training programs • Collaborating across disciplines • Contribute to creating a culture of safety. As patient advocates, health care professionals owe their patients nothing less.

  50. Center for Patient Safety should… • 4. Define feasible prototype systems (best practices) and tools for safety in key processes, including both clinical and managerial support systems for… • management of diagnostic tests, screening, and information…
