
If you think quality and safety are the same...think again

Karen Cardiff, School of Population and Public Health, University of BC; Samuel B Sheps, School of Population and Public Health, University of BC; Jim Nyce, Department of Anthropology, College of Sciences and Humanities, Ball State University, Muncie, Indiana; Sidney Dekker, Lund School of Aviation, Lund University, Lund, Sweden


Presentation Transcript


  1. If you think quality and safety are the same...think again. Karen Cardiff, School of Population and Public Health, University of BC; Samuel B Sheps, School of Population and Public Health, University of BC; Jim Nyce, Department of Anthropology, College of Sciences and Humanities, Ball State University, Muncie, Indiana; Sidney Dekker, Lund School of Aviation, Lund University, Lund, Sweden. June 10, 2010. System Safety Society Canada Chapter, Spring Symposium, Ottawa, Ontario.

  2. The findings presented today are based on thesis work completed for an MSc in Human Factors and System Safety with the School of Aviation, Lund University.

  3. Also informed by work that began a decade ago: • Canadian Adverse Events Project (Canadian Institutes of Health Research, 2000-2003) • Management and Regulation of Safety in Risk Critical Sectors (Health Canada, 2004-2007) • Creating High Reliability Organizations in the Canadian Health Care System (Canadian Patient Safety Institute, 2007-2008) • Current work co-funded by the Canadian Health Services Research Foundation and the Canadian Patient Safety Institute: a four-year, multi-partner project focused on building capacity within acute care hospitals to do critical incident investigation (Sheps and Cardiff, 2009-2013) • Interaction with international opinion leaders and experts in system safety on the emerging ideas of resilience and safety (Sidney Dekker, Erik Hollnagel, René Amalberti, Richard Cook, etc.)

  4. Outline of presentation • Background • Why the topic is important • Theories and models of why things go wrong • Findings from the thesis work

  5. Background When patients entrust themselves to our care, we make two implicit but key professional and organizational promises: ‘we promise to do everything possible to help patients—to provide good (possibly excellent) care; and, we promise not to hurt them’ (Reinertsen & Clancy, 2006). However, there are many instances where people do not get the care that they need (McGlynn et al, 2003) or are inadvertently harmed through the process of care (Kohn et al, 1999; Vincent et al, 2001; Davis et al, 2002; Norton et al, 2004; Leape et al, 2005).

  6. While healthcare has been concerned with quality for many years, it did not, in general, think systematically about patient safety until the magnitude of the problem became very clear and could no longer be ignored. Now, after more than a decade of activity that has measured, tracked and in many instances investigated adverse events in acute care, no one doubts that enhancing patient safety (i.e. reducing harm) is an important and necessary goal. Although there continues to be widespread agreement that the healthcare system must take steps to reduce patient harm, the struggle over how to achieve this, and make it sustainable, remains. Even as the number of activities to improve patient safety and quality increases, along with the range of tools available to assist with the process, the healthcare system has not yet achieved the status of a high reliability or resilient industry.

  7. While emerging theoretical and applied work acknowledges that quality and safety are distinct concepts, the majority of activities in healthcare to date have focused on improving adherence to accepted standards of care, such as hand-washing and appropriate use of antibiotics (i.e. telling people what they already know they should be doing). These activities are more aligned with the classic quality model than with safety: safety by constraint, marked by barriers, regulations, procedures, training and standardization. The current efforts are largely based on “reductionistic” thinking that attempts to “trouble shoot and fix things”. Even though many of these efforts have been met with success, they have largely ignored what it means to create safety in complex, dynamic settings, such as preparing frontline staff to cope with the complexity they face daily, supporting them to become more experienced at anticipating what might go wrong (requisite imagination), and knowing when and how to adapt their performance under conditions of uncertainty.

  8. Understanding adverse events in healthcare ‘A model often used in health care is that accidents are thought to occur when individual components or processes fail to meet criteria... this model of risk and safety builds on the assumption that safety, once established, can be maintained by keeping the performance of a system’s parts (human and technical) within certain bounds (e.g. people should not violate rules and procedures)’ Sidney Dekker, Past the edge of chaos, 2006

  9. Why does the classic quality model distort efforts to achieve safety? The classic quality model was developed to ensure that the system meets pre-specified criteria. Quality is viewed as the set of characteristics of a service or product that must be present to meet needs or expectations, such as high professional standards, effective use of resources and high patient satisfaction. The goal of quality assurance activity is to keep performance variability under control: organizations develop policies, rules and protocols to keep performance within a particular bandwidth (a minimal sketch of this logic follows below).
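To make the “bandwidth” idea concrete, quality assurance is essentially a control-limit check on a measured process. The sketch below is my illustration, not from the presentation; the metric (door-to-needle time) and the limits are invented for the example.

```python
# Sketch of classic quality-assurance logic: flag any measurement that
# drifts outside a pre-specified bandwidth (control limits).
LOWER, UPPER = 45.0, 75.0  # hypothetical acceptable band, in minutes

door_to_needle_minutes = [52, 61, 58, 83, 49, 71, 90, 55]

for case, value in enumerate(door_to_needle_minutes, start=1):
    if not LOWER <= value <= UPPER:
        print(f"case {case}: {value} min is outside [{LOWER}, {UPPER}] -- review")
```

The model equates staying inside the band with safety; the slides that follow argue that in complex systems this equation breaks down.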

  10. In linear production systems (e.g. Toyota, McDonalds) quality may have safety as an additive result. This is because: • the systems can be decomposed into meaningful elements • the failure probability of individual components can be described and analyzed individually • the order or sequence of events is predetermined and fixed • when combinations of events occur, they can be described as non-interacting • the influence of context is limited or quantifiable (Dekker, 2007) However, since the process of “producing” healthcare is neither linear nor fixed, the classic quality model is not adequate. (The toy calculation below shows how the decomposition works when these assumptions do hold.)
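The decomposability claim can be shown with a few lines of arithmetic. This is a toy sketch under the slide's stated assumptions (fixed sequence, independent components); the step names and reliabilities are invented for illustration.

```python
# Toy linear production line: a fixed sequence of independent steps.
# Under independence, system reliability is the product of per-step
# reliabilities, so each component can be analyzed (and fixed) in isolation.
steps = {
    "receive_order": 0.999,  # hypothetical per-step success probabilities
    "assemble":      0.995,
    "inspect":       0.998,
    "deliver":       0.997,
}

system_reliability = 1.0
for name, p_success in steps.items():
    system_reliability *= p_success

print(f"P(defect-free run) = {system_reliability:.4f}")  # ~0.9890
```

Because the product factorizes, improving any one step improves the whole: exactly the “trouble shoot and fix things” logic. The slide's point is that healthcare violates every one of these assumptions, so the factorization, and the quality model built on it, does not carry over.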

  11. Complex systems ‘in complex systems (such as healthcare), unpredictability and paradox are ever present, and some things will remain unknowable... new conceptual frameworks that incorporate a dynamic, emergent, creative and intuitive view of the world must replace traditional “reduce and resolve” approaches to clinical care and service organization’ (Plsek, 2001)

  12. Examples of complexity in healthcare • Demand frequently pushes against performance goals; • Rapid introduction of new, complex technology; • Many discontinuities and transitions in care (e.g. multiple care providers caring for the same patient, often in multiple settings: emergency department to operating room to surgical ward, each unit managed independently); • Strong autonomous and semi-autonomous professional cultures with concomitant power struggles; • Often rapid turnover in staff; • Continual influx of new patients, each with their own inherent biological variability and, in many instances, language and cultural differences.

  13. The key challenge of creating safety in complex non-linear systems is that the knowledge base is inherently and permanently incomplete. In healthcare, work situations are always underspecified (i.e. the conditions of work rarely match what has been specified or prescribed); with this comes an unpredictable component, and thus adaptation is often necessary (Hollnagel et al, 2006; Grote, 2006). Performance variability is both normal and necessary in complex non-linear systems (Dekker, 2007; Hollnagel, 2008).

  14. Performance variability, adaptation & safety If keeping things safe equals keeping performance within a particular bandwidth, why do complex, dynamic organizations value experience or seniority? • It is because experience brings a broader bandwidth: people at the sharp end of care can judge whether what worked before will work in the current situation. • There is always tension around when and how to adapt rules and protocols to the circumstances one finds oneself in, but the experienced person is more likely to know when and how to do this. • Complex systems are generally well protected from vulnerabilities by “barriers”, but safety is created through practice, i.e. practitioners recognize local pitfalls and forestall them, and this often involves adaptation.

  15. Theories and models of why things go wrong: linear views (Dekker, 2007; Hollnagel, 2008) Assumption and consequence: • Accidents are the natural outcome of a series of events or circumstances which occur in a specific and recognizable manner (e.g. Domino Model, Heinrich, 1930) → accidents are prevented by finding and eliminating possible causes: component failures, whether human, technical or organizational. Assumption and consequence: • More recently, accidents have been seen to result from a combination of active failures (unsafe acts) and latent conditions (hazards) → accidents are prevented by strengthening barriers and defenses (e.g. Swiss Cheese Model, Reason, 1990), and safety is ensured by measuring performance indicators; in particular, people rely on understanding past events (e.g. Root Cause Analysis) to develop solutions for the future.

  16. Reason’s Swiss Cheese Model: layers of defense (J. Reason, Managing the Risks of Organizational Accidents, 1997). A toy version of the “holes lining up” arithmetic follows below.
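The model is often paraphrased numerically: an accident trajectory must find a “hole” in every layer of defense at the same moment. The sketch below is a toy paraphrase, not Reason's own formulation; the per-layer hole probabilities, and the independence assumption, are mine.

```python
# Toy Swiss Cheese arithmetic: an accident requires a hole in every layer.
# If layers were independent, the alignment probability would be the
# product of per-layer hole probabilities.
p_hole = [0.01, 0.05, 0.02, 0.10]  # hypothetical per-layer hole probabilities

p_alignment = 1.0
for p in p_hole:
    p_alignment *= p

print(f"P(all layers breached at once) = {p_alignment:.0e}")  # 1e-06
```

The independence assumption is precisely what latent conditions undermine: shared production pressure, staffing shortfalls or design flaws correlate the holes across layers, so real systems fail far more often than the naive product suggests.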

  17. Linear models make it easy for investigators to take the position of retrospective outsider: looking back on a sequence of events that seems to lead to an inevitable outcome, and pointing out where people went wrong or where individual components of the system failed. Although this perspective is often adequate in linear systems, the approach seriously limits what the investigator can learn about failure in non-linear systems (i.e. the complex and unexpected combination of system interactions) and may not help prevent recurrence.

  18. Understanding safety in non-linear systems (Dekker, 2007; Hollnagel, 2008) Assumption and consequence: • Accidents result from unexpected combinations (resonance) of normal performance variability: hazards emerge from expected (and necessary) variability within the system, and accidents are prevented by monitoring and damping that variability. • The variability of any single function’s normal performance is rarely sufficient to result in an accident, but the variability of multiple functions may combine in unexpected ways to produce a non-linear effect; thus safety is an emergent property of the system and cannot be explained by simply examining individual components and/or trying to identify a root cause (a toy simulation below illustrates the idea). • Complex socio-technical systems (such as healthcare) are dynamic: they change and develop in response to competing demands, production pressures and changes in technology and knowledge. Resilience exists when operators in the system are able to recognize, absorb and adapt to disruptions and changes that fall outside of their design base.
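The resonance claim can be caricatured with a short Monte Carlo run. This is my toy illustration, not Hollnagel's FRAM method: several functions each vary within “normal” bounds, yet occasionally their variabilities align and push the combined outcome past a threshold none could reach alone.

```python
# Toy resonance simulation: every function stays within its normal bounds
# in every trial, yet rare "events" still emerge when the variabilities of
# several functions happen to align in the same direction.
import random

random.seed(1)
N_FUNCTIONS = 6      # interacting functions (hypothetical)
NORMAL_BOUND = 1.0   # each function's deviation never exceeds this
THRESHOLD = 4.0      # combined deviation that counts as an "event"
TRIALS = 100_000

events = 0
for _ in range(TRIALS):
    deviations = [random.uniform(-NORMAL_BOUND, NORMAL_BOUND)
                  for _ in range(N_FUNCTIONS)]
    if abs(sum(deviations)) > THRESHOLD:
        events += 1

print(f"{events} events in {TRIALS} trials, with no component ever 'failing'")
```

No single trial contains a broken component or a violated rule, which is why inspecting components one at a time, or hunting for a root cause, cannot explain (or predict) the event.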

  19. Functional Resonance Accident Model (Hollnagel, 2004): an emerging model that attempts to understand the dynamics of “normal” organizational activity.

  20. Non-linear accident models give investigators a basis for finding out why people’s actions and assessments made sense to them at the time, rather than identifying what rule, protocol or process the person violated (people often adapt their actions given the context at hand, and this is part of the “normal performance variability” that takes place as part of “normal work” in dynamic, complex systems). ‘Human error is not an explanation, but demands an explanation’ Sidney Dekker, 2006

  21. In contrast, linear models make it easy for investigators to take the position of retrospective outsider, looking back on a sequence of events that seems to lead to an inevitable outcome, and pointing out where people went wrong or where individual components of the system failed. • Although this perspective is often adequate in linear systems, the approach seriously limits what the investigator can learn about failure in non-linear systems (i.e. the complex and unexpected combination of system interactions) and may not help prevent recurrence. • Hindsight bias!

  22. Accident models: the challenge of hindsight bias • “Hindsight bias” is always present when the outcome is known: a retrospective outsider can easily confuse post hoc reality with the actual reality surrounding people during the event. • Hindsight bias is a powerful driver of “old view” explanations of human error and accidents, which tend to look for individual components of the system that need fixing, people deficient in skills, or egregious mistakes. • Hindsight bias makes it difficult to objectively judge behavior leading up to the outcome. In particular, past complexity is transformed into a linear string of “bad” decisions, missed opportunities, flawed assessments and faulty perceptions. • Thus, recommendations too often focus on protecting the system from “unreliable humans” through procedures, automation, training and discipline.

  23. Thesis work The aim of the thesis work was to begin to explore to what extent, and in what ways, safety and quality are conflated in healthcare, at both the sharp and blunt ends of care, in an acute care institutional setting within a large health authority in Canada. The key questions this research sought to answer were: • How are notions of patient safety operationalized in the local context? • How is safety thought about and constructed? • How is it discussed? • Is it neglected?

  24. Methods Survey work: interviews with key informants (registered nurses working in acute care; nurse managers responsible for acute care units; and senior decision makers). Semi-structured face-to-face interviews were conducted, supported by an interview guide containing a series of open-ended questions, to find out how the key informants thought about safety and how they felt they contributed to safety on a day-to-day basis.

  25. Interview guide (registered nurses) • How long have you worked as a registered nurse? • How do you define the term safety? • What factors and activities help contribute to patient safety at your institution, in general, and in particular, on your unit? • What do you think would improve patient safety in acute care hospitals, in general, and in particular, on your unit?

  26. Interview guide (nurse managers and senior decision makers) • What is your role in the organization? • How long have you worked in healthcare? • How do you define the term safety? • What factors and activities help contribute to patient safety at your institution? • What do you think would improve patient safety in acute care hospitals? • Does the work you do contribute to safety? If yes, in what ways?

  27. Survey results: major themes • Designing robust organizations (prescriptive practice) • Designing robust organizations (compliance) • Designing robust organizations (rules and procedures are important but insufficient to create safety) • Expertise and experience • Adaptation of work, depending on the context and competing priorities • Efficiency-thoroughness trade-off • The unpredictable nature of safety • Learning from near misses and critical incidents • Storytelling as a form of learning • Communication and teamwork • Leadership • Competing system challenges • Vigilance and troubleshooting

  28. There were notable differences regarding the emphasis that each group placed on the respective themes.

  29. Thesis findings (cont’d) • Safety is important, but people are still looking for standard fixes and are influenced by conventional opinion leaders (e.g. the Safer Healthcare Now! campaign and the 100,000 Lives Campaign). • Confusion about the difference between safety and quality exists, and the confusion is greater at more senior levels of the organization (i.e. people continue to think that if you improve quality through standardization, guidelines, procedures, etc., safety will automatically follow). • People at senior levels focus on the need to develop robust systems marked by guidelines, protocols and rules, and on training, technology and enforcing compliance as solutions. • People at the front lines (the practitioners) understand the need to adapt their behaviour and practice in unusual situations, but are tentative in how they discuss this with both their peers and managers, aware of potential negative consequences or sanctions if things don’t work out well.

  30. Thesis findings (cont’d) • A set of ingrained attitudes about how work is performed, i.e. an assumption that there is no gap between work-as-imagined and work-as-done and that work can be performed in a high-quality manner regardless of context. This is an easy perspective for senior management to adopt, since it feeds off the sense, amongst most professional groups in healthcare, that they are, or should be, perfect and can provide high quality care under a range of conditions. • Lack of deep understanding of the sources of failure in complex organizations. • Superficial understanding of hindsight bias and its impact on what one looks for when doing critical incident investigations. • Accountability remains a thorny issue, particularly at the senior management and governance level, with little consideration of the accountability/authority dynamic.

  31. In practical terms, conflating the concepts of quality and safety in a complex, dynamic setting such as healthcare can result in investing effort to solve the wrong problem, and thus potentially misdirects limited human and financial resources. Beyond that misdirection of resources, if quality and safety are conflated it is far too easy to assume that improving quality will automatically improve safety, and thus the system, unfortunately, continues doing “more of the same”, neither fully understanding nor adequately tackling the problem.

  32. Conclusions Safety and quality are often conflated in health care, and this may limit progress on both creating safety AND enhancing quality. Safety is the attribute of being able to respond to surprise or instability in the system; creating safety involves anticipating what could go wrong. Non-linear accident models, based on an understanding of both high reliability and resilience theories, and on empirical evidence from high risk, dynamic settings, can help us appreciate why safety and quality need separate strategies. There were noticeable differences in how the key informants (from the thesis work) talked about safety, and the perspectives of the people at the sharp end of the system (point of care) are fairly consistent with what the thought leaders in system safety are telling us about creating safety in complex, dynamic environments.

  33. Making progress in safety may be supported by system leaders developing a better understanding of how work at the frontline actually gets done (normal work), as well as of the necessity and value of performance variation in complex, dynamic systems. There needs to be discussion of the tension between developing robust systems (marked by rules, procedures, etc.) while, at the same time, supporting performance variation. A dedicated interdisciplinary Safety Management System, with a broad mandate, is one structural tool that may help healthcare organizations make progress on safety: • Develops an analytical framework for critically monitoring safety using a non-linear perspective; • Keeps the discussion of risk alive within the organization; • Enables people at the sharp end to actively look for the things that could go wrong and understand how to keep these at bay; • Supports ongoing surveillance activity.

  34. References
Bagian, J., Director, VA National Center for Patient Safety. Personal communication, July 2007.
Cook, R. I., Woods, D. D., & Miller, C. (1998). A tale of two stories: Contrasting views of patient safety. Report from a workshop on assembling the scientific basis for progress on patient safety. National Patient Safety Foundation.
Dekker, S. W. A. (2001). Reconstructing human contributions to accidents: The new view on error and performance (Technical Report 2001-01). Ljungbyhed, Sweden: Lund University School of Aviation.
Dekker, S. W. A. (2002). The field guide to human error investigations. Bedford, UK: Cranfield University Press / Aldershot, UK: Ashgate.
Dekker, S. W. A. (2003). Errors in understanding of human error: The real lessons from aviation for healthcare (Technical Report 2003-01). Ljungbyhed, Sweden: Lund University School of Aviation.
Dekker, S. W. A. (2005). Ten questions about human error: A new view of human factors and system safety. Mahwah, NJ: Lawrence Erlbaum Associates.
Dekker, S. W. A. (2005). Why we need new accident models (Technical Report 2005-02). Ljungbyhed, Sweden: Lund University School of Aviation.
Dekker, S. W. A. (2006). Past the edge of chaos (Technical Report 2006-03). Ljungbyhed, Sweden: Lund University School of Aviation.
Dekker, S. W. A., Professor, Lund University School of Aviation. Personal communication, October 2007, Winnipeg, Manitoba.
Hollnagel, E. (2004). Barriers and accident prevention. Aldershot, UK: Ashgate.
Hollnagel, E., Woods, D. D., & Leveson, N. (2006). Resilience engineering: Concepts and precepts. Aldershot, UK: Ashgate.
Hollnagel, E. Theory W and theory Z: Contrasting views on safety. Presentation, École des Mines, Paris.
Leveson, N. (2004). A new approach to systems safety engineering. MIT.
Perrow, C. (1999). Normal accidents: Living with high-risk technologies. Princeton, NJ: Princeton Paperbacks.
Reason, J. T. (1997). Managing the risks of organizational accidents. Aldershot, UK: Ashgate.
Reinertsen, J. L., & Clancy, C. (2006). Foreword to: Keeping our promises: Research, practice, and policy issues in health care reliability. Health Services Research, 41(4, Part II).
Resar, R., Consultant, Institute for Healthcare Improvement. Personal communication, July 2007.
Roberts, K. H., & Bea, R. G. (2001). When systems fail. Organizational Dynamics, 29, 179-191.
Sagan, S. D. (1993). The limits of safety: Organizations, accidents and nuclear weapons. Princeton, NJ: Princeton Paperbacks.
Snook, S. (2000). Friendly fire: The accidental shootdown of two US Black Hawks over Northern Iraq. Princeton, NJ: Princeton University Press.
Vaughan, D. (1996). The Challenger launch decision: Risky technology, culture, and deviance at NASA. Chicago, IL: University of Chicago Press.
Vaughan, D. (1999). The dark side of organizations: Mistake, misconduct, and disaster. Annual Review of Sociology, 25, 271-305.
Vaughan, D. (2005). On slippery slopes, repeating negative patterns, and learning from mistake. In W. H. Starbuck & M. Farjoun (Eds.), Organization at the limit: Lessons from the Columbia disaster (pp. 41-59). Malden, MA: Blackwell.
Weick, K. E. (1995). Sensemaking in organizations. Thousand Oaks, CA: Sage.
Weick, K. E. (2005). Organizing and failures of imagination. International Public Management Journal, 8, 425-438.
Weick, K. E., & Sutcliffe, K. M. (2001). Managing the unexpected: Assuring high performance in an age of complexity. San Francisco, CA: Jossey-Bass.
Weick, K. E., & Sutcliffe, K. M. (2005). Organizing and the process of sensemaking. Organization Science, 16, 409-421.
Woods, D. D., & Cook, R. I. (2003). Mistaking error. In M. J. Hatlie & B. J. Youngberg (Eds.), Patient safety handbook. Jones and Bartlett.
