Usability and Human Factors. Unit 10: Designing for Safety. Lecture b.


Presentation Transcript


  1. Usability and Human Factors Unit 10: Designing for Safety Lecture b This material (Comp 15 Unit 10) was developed by Columbia University, funded by the Department of Health and Human Services, Office of the National Coordinator for Health Information Technology under Award Number 1U24OC000003. This material was updated by The University of Texas Health Science Center at Houston under Award Number 90WT0006. This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nc-sa/4.0/.

  2. Designing for Safety Lecture b – Learning Objectives • Apply the cognitive taxonomy of errors (Lecture b) • Define “workflow analysis” and methods for examining and addressing human errors (Lecture b)

  3. Woods and Colleagues: Resilience Engineering • The Challenger disaster is an example of: • Drift toward failure as defenses erode in the face of production pressure • An organization that takes past success as a reason for confidence instead of investing in anticipating the changing potential for failure • A fragmented problem-solving process that clouds the big picture • Failure to revise assessments as new evidence accumulates • Breakdowns at the boundaries of organizational units that impede communication and coordination (Hollnagel, E., Woods, D.D., and Leveson, N., 2006)

  4. Woods and Colleagues: Challenger Analysis • Interpretation of past “success”: the absence of failure is taken as a positive indication that hazards are not present or that countermeasures are effective. • An organization usually is unable to change its model of itself unless and until overwhelming evidence accumulates that demands revising the model. (Hollnagel, E., Woods, D.D., and Leveson, N., 2006)

  5. ‘Failure of Foresight’ • Focus on differences → people see no lessons for their own operations and respond in narrow, well-bounded ways • The crux is to notice information that changes past models, even without clear-cut evidence • Provide a ‘fresh’ view through: new people, interactions across diverse groups, knowledge, tools, and new visualizations that capture the big picture and reorganize data into different perspectives

  6. Woods and Colleagues: Detecting Danger • Cross-checking, collaborations • Display safety margin indicators • “Errors will always be there” – 100% perfection is impossible, but risk situations can be anticipated and avoided, or handled appropriately

  7. Woods – Resilience Engineering (Cont’d – 1) • Based on insights from the five patterns above • Assessing organizational risk, i.e., the risk that holes in decision making will produce unrecognized drift toward the failure boundary • Assessing technical hazards, but with the goal of monitoring decision making • Balancing production pressures with protection pressures • Management commitment to the above • Truly open and encouraged reporting

  8. Resilience Engineering • Learning culture v. culture of denial reflected in incident response • Preparedness / Anticipation • Opacity / Observability • Flexibility / Stiffness • Revise / Fixated

  9. Resilience Engineering – 3 Basics • Detecting signs of increasing organizational risk, especially when production pressures are intense or increasing; • Having the resources and authority to make extra investments in safety at precisely these times when it appears least affordable; • Having a means to recognize when and where to make targeted investments to control rising signs of organizational risk and re-balance the safety and production tradeoff.

  10. Reason’s Swiss Cheese Model Duke University (“Anatomy of an Error”, 2016)

  11. Catastrophes Duke University (“Anatomy of an Error”, 2016)

  12. Failure Factors and Recovery Zhang, J., Patel, V.L., Johnson, T.R., & Shortliffe, E.H. (2004).

  13. Patel, Cohen – Error in Critical Care • Distribution of cognition can result in vulnerabilities • Increased complexity and interruptions (Laxmisan: interruptions every 9 minutes for attendings, every 14 minutes for residents) Patel, V.L., & Cohen, T. (2008).

  14. Time Course of Medical Errors Cohen, T., Blatter, B., Almeida, C., & Patel, V.L. (2007).

  15. Error Detection and Correction • Poorly understood, but integral to all cognitive work • Bounds of acceptable practice are violated and usually corrected (e.g., prescribing a neglected medication) • Failure to detect such a violation → crossing another boundary → adverse event • Maximal productivity → strains the system and shifts the balance between error commission and correction (due to strain on cognitive capacity)

  16. Workflow Analysis and Modeling (Malhotra and Colleagues: 2006) • Detailed characterization of individual workflows • Identify critical events • Reconstruct the collective workflow from events connected in space or time, done collaboratively (see the sketch below) • Delineate the workflow, role players, devices, protocols, and communications so that we can identify and focus on areas where cognitive aids, technology, or interventions may be of assistance • Develop a generalizable cognitive model to represent the intricate workflow, applicable to other health care settings
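
To make the “events connected in space or time” step concrete, here is a minimal Python sketch. It is purely illustrative and not Malhotra and colleagues’ actual method; the Event fields, the link_events helper, and the five-minute window are all assumptions.

```python
# Illustrative sketch only: reconstructing a collective workflow by linking
# individually observed events that share a location or occur close in time.

from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Event:
    actor: str        # role player, e.g. "nurse", "attending"
    location: str     # e.g. "bedside 3", "central station"
    time: datetime
    activity: str     # observed critical event, e.g. "ventilator alarm"

def link_events(events, window=timedelta(minutes=5)):
    """Group events that share a location and occur within `window` of each other."""
    events = sorted(events, key=lambda e: e.time)
    episodes, current = [], []
    for event in events:
        if current and (event.location != current[-1].location
                        or event.time - current[-1].time > window):
            episodes.append(current)   # close the current collective-workflow segment
            current = []
        current.append(event)
    if current:
        episodes.append(current)
    return episodes

# Example: two observers' logs merged into one collective timeline
log = [
    Event("nurse", "bedside 3", datetime(2007, 1, 5, 9, 0), "ventilator alarm"),
    Event("attending", "bedside 3", datetime(2007, 1, 5, 9, 2), "assesses patient"),
    Event("clerk", "central station", datetime(2007, 1, 5, 9, 30), "order entry"),
]
for episode in link_events(log):
    print([(e.actor, e.activity) for e in episode])
```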

  17. Schematic Layout of the Cardiothoracic Intensive Care Unit (CTICU) & Key Activities (Malhotra et al., 2007) Malhotra, S., Jordan, D., Shortliffe, E., & Patel, V.L. (2007).

  18. CTICU Critical Zones – Examples Zhang, J., Patel, V.L., Johnson, T.R., & Shortliffe, E.H. (2004).

  19. Intensive Care Unit (ICU) and Critical Care • Almost all patients admitted to the ICU suffer adverse events • 10% average mortality • 5 million ICU admissions/year (US) • 30% of hospital costs, $180B/year • A priority for the Joint Commission and Leapfrog

  20. Factors in ICU Care • Staffing: intensivists reduce mortality (3x reduction; Pronovost, JAMA 1999) • Fewer ICU nurses → increased length of stay (LOS) and pulmonary complications (Pronovost, ECP 2001) • Pharmacists on daily rounds: 66% reduction in adverse drug events (ADEs), from 10.4 to 3.5 per 1,000 patient-days • Principles: hold staff accountable, reduce complexity, build independent redundancies for key processes

  21. Care Goal Sheet (Pronovost) • Instituting care goals dramatically reduced length of stay from 2.2 days to 1 day, increasing revenue from new admissions. Pronovost (2005).

  22. Critical Care Environments • Methods include ethnographic data collection, observations, surveys, and questionnaires, coupled with cognitive task analysis of the processes • Vankipuram (2010): RFID tags, a dual method akin to the ‘black box’ of aviation • Data are used to train models, then analyzed and visualized using virtual world replay • Israeli study: the average patient is subject to 178 actions/day with an error rate of about 1%, i.e., roughly 2 errors per patient per day (Gawande, 2007)

  23. Virtual World Replay Vankipuram, M., Kahol, K., Cohen, T., & Patel, V.L. (2010).

  24. Cognitive Taxonomy of Error (Zhang and Colleagues: 2004) • Systematically categorize medical errors (at the individual level) • Better understand the cognitive mechanisms of medical error • Provide a framework to guide future studies • Suggest interventions to decrease errors • Provide a foundation for a reporting system (to categorize, identify, and generate solutions)

  25. Errors • Reason: “an error is a failure of achieving the intended outcome in a planned sequence of mental or physical activities when that failure is not due to chance” • Not all adverse events are caused by error, e.g.: • Device malfunction • System problems (e.g., delays in care caused by organizational policies), not caused by individuals • Non-preventable events, e.g., an unpredictable drug reaction

  26. Cognitive Taxonomy of Error Zhang, J., Patel, V.L., Johnson, T.R., & Shortliffe, E.H. (2004).

  27. Example of an Error and Questions It Raises (from Zhang, 2004) • A nurse tries to program an infusion pump to deliver 130.1 ml/h and presses “1 3 0 . 1” • The nurse is unaware that the decimal point works only for numbers up to 99.9 • The pump ignores the decimal point key press and is programmed to deliver 1301 ml/h • The error is blamed on the user and ‘solved’ by more training

  28. Error Example (Cont’d – 1) • Questions: why did the decimal point work only up to 99.9 (a design flaw)? • Why does the device silently ignore the decimal key press rather than alerting the user? • Why was the nurse unaware of this flaw? • Was this problem covered in training? • Why was the order written for 130.1, which the pump cannot deliver? • Why did the nurse not see 1301 on the display? • Tiredness? A hard-to-read display? Understaffing?
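
The silent-failure behavior described on these two slides can be sketched in a few lines of Python. This is purely illustrative and not the actual pump firmware; flawed_parse and safer_parse are hypothetical names, and the exact keypad logic is an assumption based only on the behavior the slide reports.

```python
# Hypothetical sketch of the keypad-parsing flaw: once three whole-number
# digits precede the decimal point, the decimal key press is silently dropped,
# so "1 3 0 . 1" becomes 1301 ml/h.

def flawed_parse(keys):
    """Silently ignores a decimal point after three whole-number digits."""
    digits_before_point = 0
    accepted = []
    for key in keys:
        if key == ".":
            if digits_before_point > 2:   # rate > 99.9: decimal not supported
                continue                  # key press ignored, no feedback to the user
            accepted.append(key)
        else:
            if "." not in accepted:
                digits_before_point += 1
            accepted.append(key)
    return float("".join(accepted))

def safer_parse(keys):
    """Rejects the entry and alerts the user instead of guessing."""
    digits_before_point = 0
    accepted = []
    for key in keys:
        if key == ".":
            if digits_before_point > 2:
                raise ValueError("Rates above 99.9 ml/h cannot include a decimal "
                                 "point -- please re-enter the rate")  # explicit alert
            accepted.append(key)
        else:
            if "." not in accepted:
                digits_before_point += 1
            accepted.append(key)
    return float("".join(accepted))

print(flawed_parse(list("130.1")))   # 1301.0 -- a tenfold overdose, no warning
# safer_parse(list("130.1"))         # raises ValueError, forcing correction
```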

  29. Error Taxonomy • Two types: • slips, which result from the incorrect execution of a correct action sequence, and • mistakes, which result from the correct execution of an incorrect action sequence • Zhang et al.: based on Norman’s 7-stage theory of action; errors can occur on the evaluation side as well as the execution side
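
As a rough illustration of the two axes on this slide (execution vs. evaluation side, slip vs. mistake), here is a small Python sketch; the ErrorClass structure and the specific examples are assumptions for illustration, not entries from Zhang et al.’s published tables.

```python
# Illustrative sketch only: classifying errors along the two axes described
# above. The example descriptions are hypothetical, not from Zhang et al. (2004).

from dataclasses import dataclass

@dataclass
class ErrorClass:
    side: str   # "execution" (acting on the world) or "evaluation" (assessing it)
    kind: str   # "slip" (right plan, wrong execution) or "mistake" (wrong plan)

EXAMPLES = {
    "pressed the key next to the intended one":        ErrorClass("execution", "slip"),
    "formed the wrong dosing plan from a faulty rule": ErrorClass("execution", "mistake"),
    "misread 1301 on the display as 130.1":            ErrorClass("evaluation", "slip"),
    "interpreted an abnormal lab value as normal":     ErrorClass("evaluation", "mistake"),
}

for description, cls in EXAMPLES.items():
    print(f"{cls.side:<10} {cls.kind:<8} {description}")
```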

  30. Taxonomy Zhang, J., Patel, V.L., Johnson, T.R., & Shortliffe, E.H. (2004).

  31. Examples From Zhang, 2004 1.1 Table: Patel, V.L., & Cohen, T. (2008).

  32. Examples From Zhang, 2004 (Cont’d – 1) 1.2 Table: Patel, V.L., & Cohen, T. (2008).

  33. Examples From Zhang, 2004 (Cont’d – 2) 1.3 Table: Patel, V.L., & Cohen, T. (2008).

  34. Examples From Zhang, 2004 (Cont’d – 3) 1.4 Table: Patel, V.L., & Cohen, T. (2008).

  35. Cognitive Interventions • Depend on the type of slip or mistake • E.g., education, decision support, representational aids, information reduction, display design, device redesign • Aids for perceptual systems • Example: if an intention slip is due to loss of activation in memory, provide a memory aid (‘Press Volume to enter volume to be infused’)
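
One way to read this slide is as a lookup from the cognitive source of an error to a class of intervention. The sketch below is an assumption for illustration only; apart from the memory-aid prompt, which comes from the slide, the mapping and the suggest_intervention helper are hypothetical.

```python
# Hypothetical illustration: picking an intervention class from the cognitive
# source of an error. Only the memory-aid prompt comes from the slide; the
# rest of the mapping is an assumption, not a table from the lecture.

INTERVENTIONS = {
    "intention slip (loss of activation in memory)":
        "memory aid, e.g. an on-screen prompt: "
        "'Press Volume to enter volume to be infused'",
    "evaluation slip (misread display)": "display design / representational aid",
    "knowledge-based mistake": "education and decision support",
    "information overload": "information reduction",
}

def suggest_intervention(error_type: str) -> str:
    """Return a candidate intervention, or a default if the error type is unknown."""
    return INTERVENTIONS.get(error_type, "start with a cognitive task analysis")

print(suggest_intervention("intention slip (loss of activation in memory)"))
```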

  36. Errors - Context • Contrary to expectation, most errors can happen at times of low productivity • Personnel will arrange items for maximal functioning during peak times (Xiao, 1995)

  37. Designing for Safety Summary – Lecture b • This unit examined the cognitive taxonomy of errors and reviewed studies on the sources of error

  38. Designing for Safety References – Lecture b
  References
  Cohen, T., Blatter, B., Almeida, C., & Patel, V.L. (2007). Reevaluating recovery: perceived violations and preemptive interventions on emergency psychiatry rounds. J Am Med Inform Assoc. 2007 May-Jun;14(3):312-9.
  Gawande, A. (2007). The checklist. The New Yorker, December 10, 2007. Retrieved September 10, 2010, from http://www.newyorker.com/reporting/2007/12/10/071210fa_fact_gawande?currentPage=2
  Hollnagel, E., Woods, D.D., & Leveson, N. (2006). Resilience engineering: concepts and precepts. Burlington, VT: Ashgate Publishing Limited.
  Malhotra, S., Jordan, D., Shortliffe, E., & Patel, V.L. (2007). Workflow modeling in critical care: piecing together your own puzzle. J Biomed Inform. 2007 Apr;40(2):81-92.
  Patel, V.L., & Cohen, T. (2008). Error in critical care. Curr Opin Crit Care. 2008 Aug;14(4):456-9.
  Pronovost, P.J., Jenckes, M.W., Dorman, T., Garrett, E., Breslow, M.J., Rosenfeld, B.A., Lipsett, P.A., & Bass, E. (1999). Introduction to patient safety research. JAMA. 1999;281(14):1310-1317. http://www.slideshare.net/changezkn/pronovost-ppt-918kb
  Vankipuram, M., Kahol, K., Cohen, T., & Patel, V.L. (2010). Toward automated workflow analysis and visualization in clinical environments. J Biomed Inform (2010).
  Xiao, Y. (2005). Artifacts and collaborative work in healthcare: methodological, theoretical, and technological implications of the tangible. Journal of Biomedical Informatics. 2005 Feb;38(1):26-33.
  Zhang, J., Patel, V.L., Johnson, T.R., & Shortliffe, E.H. (2004). A cognitive taxonomy of medical errors. Journal of Biomedical Informatics 37:193–204.

  39. Designing for Safety References – Lecture b (Cont’d – 1)
  Images
  Slides 10, 11: Anatomy of an Error. (2016). Patientsafetyed.duhs.duke.edu. Retrieved 30 June 2016, from http://patientsafetyed.duhs.duke.edu/module_e/swiss_cheese.html
  Slide 12: Zhang, J., Patel, V.L., Johnson, T.R., & Shortliffe, E.H. (2004). A cognitive taxonomy of medical errors. Journal of Biomedical Informatics 37:193–204.
  Slide 13: Patel, V.L., & Cohen, T. (2008). Error in critical care. Curr Opin Crit Care. 2008 Aug;14(4):456-9.
  Slide 14: Cohen, T., Blatter, B., Almeida, C., & Patel, V.L. (2007). Reevaluating recovery: perceived violations and preemptive interventions on emergency psychiatry rounds. J Am Med Inform Assoc. 2007 May-Jun;14(3):312-9.
  Slide 17: Malhotra, S., Jordan, D., Shortliffe, E., & Patel, V.L. (2007). Workflow modeling in critical care: piecing together your own puzzle. J Biomed Inform. 2007 Apr;40(2):81-92.
  Slide 18: Zhang, J., Patel, V.L., Johnson, T.R., & Shortliffe, E.H. (2004). A cognitive taxonomy of medical errors. Journal of Biomedical Informatics 37:193–204.
  Slide 21: Pronovost, P.J., Jenckes, M.W., Dorman, T., Garrett, E., Breslow, M.J., Rosenfeld, B.A., Lipsett, P.A., & Bass, E. (1999). Introduction to patient safety research. JAMA. 1999;281(14):1310-1317. http://www.slideshare.net/changezkn/pronovost-ppt-918kb
  Slide 23: Vankipuram, M., Kahol, K., Cohen, T., & Patel, V.L. (2010). Toward automated workflow analysis and visualization in clinical environments. J Biomed Inform (2010).
  Slides 26, 30: Zhang, J., Patel, V.L., Johnson, T.R., & Shortliffe, E.H. (2004). A cognitive taxonomy of medical errors. Journal of Biomedical Informatics 37:193–204.

  40. Usability and Human Factors. Designing for Safety. Lecture b. This material was developed by Columbia University, funded by the Department of Health and Human Services, Office of the National Coordinator for Health Information Technology under Award Number 1U24OC000003. This material was updated by The University of Texas Health Science Center at Houston under Award Number 90WT0006.
