Research in Human-Computer Interaction for VA Clinical Decision Support




Presentation Transcript


  1. Research in Human-Computer Interaction for VA Clinical Decision Support July 15th, 2008 Jason J. Saleem, PhD VA HSR&D Center on Implementing Evidence-Based Practice; IU Center for Health Services & Outcomes Research, Regenstrief Institute

  2. Presentation Overview • Brief overview of human factors and HCI • Human factors studies involving VA CPRS and computerized clinical reminders • Current work: AHRQ grant (Doebbeling, PI); “Improving Integration of CDS into Workflow” • Human-Computer Interaction (HCI) / IT Lab • Questions / Discussion

  3. Human Factors Engineering • Study of human physical and cognitive capabilities and limitations, and application of that knowledge to system design • Design of interfaces between people and technology • Human-machine interface technology • Human-environment interface technology • Human-job interface technology • Human-software interface technology • Human-computer interaction • Human-organization interface technology

  4. Common Human Factors Methods • Usability study • Performance-based (time on task, error rates) • Scenario-based (think aloud technique) • Simulation study • Heuristic evaluation • Cognitive task analysis • Card sorting • Ethnography / naturalistic observation • Kushniruk and Patel – methods review in Journal of Biomedical Informatics, 2004

  5. Human Factors Studies – VA Computerized Clinical Reminders • Observational Field Study • Barriers and facilitators to clinical reminder use • Saleem JJ, Patterson ES, Militello L, Render ML, Orshansky G, Asch SM. Exploring barriers and facilitators to the use of computerized clinical reminders. J Am Med Inform Assoc. 2005;12(4):438-47. • Follow-up Simulation Study • A vs B comparison study of redesign recommendations • Saleem JJ, Patterson ES, Militello L, et al. Impact of clinical reminder redesign on learnability, efficiency, usability, and workload for ambulatory clinic nurses. J Am Med Inform Assoc. 2007;14(5):632-40. • This work was funded by VA HSR&D Merit Review grant (TRX 02-216), "Human Factors and the Effectiveness of Computerized Clinical Reminders"; Principal Investigators: Emily S. Patterson, PhD and Steven M. Asch, MD, MPH; 7/03 – 6/06

  6. Observational Field Study • Ethnographic, "naturalistic" observation • Non-intrusive; shadowing of nurses and physicians • 3 observers, 4 VA hospitals, 2 days/site • Capture observable activities and verbalizations • Self-report data about how artifacts (tools) support or hinder performance • Qualitative field data

  7. Field Study Participants

  8. Field Study Results • Barrier 1: Coordination Between Nurses and Providers • Barrier 2: Satisfying Reminders While Not With the Patient • Barrier 3: Workload • Barrier 4: Lack of Flexibility • Barrier 5: Poor Usability

  9. Facilitators • Limit the number of reminders • Position computer workstations strategically • Improve integration of reminders into clinical workflow • Feedback mechanism

  10. Simulation Study • A vs B comparison study • 16 non-VA nurses • Hypotheses: A redesigned interface compared to the current design will: • Have greater learnability • Have increased efficiency • Have a lower perceived workload • Have greater perceived user satisfaction Saleem JJ, Patterson ES, Militello L, et al. Impact of clinical reminder redesign on learnability, efficiency, usability, and workload for ambulatory clinic nurses. J Am Med Inform Assoc. 2007;14(5):632-640.

  11. *Fictitious patient record*

  12. Design changes *Prototype* *Fictitious patient record*

  13. Dependent Measures

  14. NASA Task Load Index (TLX)
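  The TLX detail from this slide is not preserved in the transcript. As background, the NASA-TLX rates workload on six subscales (mental demand, physical demand, temporal demand, performance, effort, frustration), each on a 0-100 scale; in the weighted version, subscale weights come from 15 pairwise comparisons. The sketch below, with made-up numbers, shows how the weighted and "raw" overall scores are conventionally computed; it is illustrative only and not taken from the study materials.

```python
# Minimal sketch of NASA-TLX scoring (illustrative; not from the slide).
# Six subscales rated 0-100; in the weighted ("classic") TLX, each subscale's
# weight is the number of times it was chosen across 15 pairwise comparisons.

SUBSCALES = ["Mental Demand", "Physical Demand", "Temporal Demand",
             "Performance", "Effort", "Frustration"]

def tlx_weighted(ratings: dict, weights: dict) -> float:
    """Overall workload = sum(rating * weight) / 15 (weights sum to 15)."""
    assert sum(weights.values()) == 15, "pairwise-comparison weights must total 15"
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15.0

def tlx_raw(ratings: dict) -> float:
    """'Raw TLX' variant: unweighted mean of the six subscale ratings."""
    return sum(ratings[s] for s in SUBSCALES) / len(SUBSCALES)

# Hypothetical ratings (0-100) and weights for one participant:
ratings = {"Mental Demand": 40, "Physical Demand": 10, "Temporal Demand": 35,
           "Performance": 20, "Effort": 45, "Frustration": 30}
weights = {"Mental Demand": 4, "Physical Demand": 1, "Temporal Demand": 3,
           "Performance": 2, "Effort": 3, "Frustration": 2}

print(tlx_weighted(ratings, weights))  # -> 34.0
print(tlx_raw(ratings))                # -> 30.0
```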

  15. Results - Learnability • A = current CR design; B = redesigned prototype • Time limit = 300 sec (5 min) • The learnability difference is statistically significant: time to satisfy a clinical reminder with design B (redesign) was significantly less than with design A (current system) • t-test: t = 4.365; t.05, 14 = 1.761; p < 0.001
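  For readers who want to see how a comparison like the ones on slides 15-18 is computed, here is a minimal sketch of a one-tailed paired t-test on task-completion times using hypothetical data (the study's actual data are not on the slide). It uses scipy.stats; the `alternative` argument requires SciPy 1.6 or later.

```python
# Minimal sketch (hypothetical data): does design A (current) take longer than
# design B (redesign) to satisfy a clinical reminder? Values are illustrative only.
from scipy import stats

# Hypothetical completion times in seconds, paired by participant (15 participants).
time_design_a = [210, 185, 240, 199, 260, 230, 175, 220, 245, 205, 190, 235, 215, 225, 200]
time_design_b = [150, 140, 170, 155, 190, 160, 130, 165, 180, 150, 145, 175, 160, 170, 150]

# Paired t-test, one-tailed (A > B); requires SciPy >= 1.6 for `alternative`.
result = stats.ttest_rel(time_design_a, time_design_b, alternative="greater")

df = len(time_design_a) - 1            # degrees of freedom (here 14)
t_crit = stats.t.ppf(0.95, df)         # one-tailed critical value at alpha = .05 (~1.761 for df = 14)
print(f"t({df}) = {result.statistic:.3f}, p = {result.pvalue:.4f}, critical t = {t_crit:.3f}")
print("Significant" if result.statistic > t_crit else "Not significant")
```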

  16. Results - Efficiency • Paired t-tests revealed that users completed 2 of the 5 patient scenarios in significantly less time with Design B (p < 0.05).

  17. Results – Workload: Mental Demand Subscale • Mental demand with design B was rated significantly lower than with design A • Means: A = 39.6, B = 34.2 • Paired t-test: t = 1.840; t.05, 15 = 1.753; p = 0.043

  18. Results – Workload: Frustration Subscale • Frustration with design B was rated significantly lower than with design A • Means: A = 33.8, B = 27.7 • Paired t-test: t = 2.040; t.05, 15 = 1.753; p = 0.030

  19. Usability Questionnaire • Likert-type scale (1 = strongly disagree; 7 = strongly agree) • Sample questions: • The organization of the information on the system screens is clear. • The display layouts simplify tasks. • The sequence of displays is confusing. • See Saleem et al. JAMIA. 2007;14(5):632-640 for complete results
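  Note that the third sample item is negatively worded, so in analysis such items are typically reverse-scored before averaging. The sketch below illustrates this convention with hypothetical item names and responses; it is not the scoring procedure from the cited JAMIA paper.

```python
# Minimal sketch (not from the paper) of scoring 1-7 Likert-type usability items.
# Negatively worded items (e.g., "The sequence of displays is confusing") are
# reverse-scored so that higher always means better usability.

ITEMS = {
    "organization_clear": {"reverse": False},
    "layouts_simplify":   {"reverse": False},
    "sequence_confusing": {"reverse": True},   # negatively worded item
}

def usability_score(responses: dict, scale_max: int = 7) -> float:
    """Mean item score after reverse-scoring negatively worded items."""
    adjusted = []
    for item, value in responses.items():
        if ITEMS[item]["reverse"]:
            value = (scale_max + 1) - value    # 1 <-> 7, 2 <-> 6, ...
        adjusted.append(value)
    return sum(adjusted) / len(adjusted)

# Hypothetical responses from one participant:
print(usability_score({"organization_clear": 6, "layouts_simplify": 5, "sequence_confusing": 2}))
# -> (6 + 5 + 6) / 3 = 5.67 (approximately)
```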

  20. Implications for Design • Modest design modifications to better integrate clinical reminders with CPRS decreased the time to reach proficiency in reminder use • This could potentially impact the willingness of new users to adopt and consistently use clinical reminders

  21. Conclusions • Human factors methods should be routinely used to rapidly collect data to support design decisions formatively (i.e., prior to implementation) • improve user performance and usability • reduce cost by addressing design issues pre-implementation • This model is still not widely adopted in healthcare

  22. AHRQ Grant – Improving Integration of CDS into Workflow (Doebbeling, PI) • AHRQ ACTION Collaborative • Improving Quality Through Health IT (RFTO#8) • CDS for Colorectal Cancer Screening • Team: • Brad Doebbeling, MD, MSc • David Haggstrom, MD, MAS • Jason Saleem, PhD • Laura Militello, MA • Heather Hagg, MS • Jeff Linder, MD, MPH • Paul Dexter, MD • Brian Dixon, MPA • et al.

  23. Study Objectives • Identify key approaches for effective integration of CDS for colorectal cancer screening into clinical workflow. • Test alternatives through controlled simulation analysis. • Evaluate improved CDS design after subsequent implementation (at a VA test-site).

  24. Project Overview (Figure) • Phase 1: Key Informant Interviews on site-specific best practices for integration of colorectal cancer screening CDS into workflow; Direct Observation of colorectal cancer screening CDS for barriers and facilitators to workflow integration • Phase 2: Rapid Prototyping of CDS design alternatives based on Phase 1 findings; Simulation Study to test the impact of CDS design alternatives on efficiency, usability, and workload • Phase 3: Implementation in a primary care clinic after the simulation study; Evaluation in a primary care clinic after the simulation study

  25. Phase 1 - Observational Sites • VAMC – West Haven, CT • CRC screening computerized clinical reminder • VAMC – Columbia, SC • OncWatch CDS (clinical reminder and management tool) • Divides patients (based on the data) into four different cohorts, defined by their risk/needs • Creates a “fail safe” system to identify patients and ensure follow-up recommendations are being fulfilled • Regenstrief Institute – 2 Indianapolis clinics • Encounter form reminders for CRC screening • Partners Healthcare – Brigham & Women’s Hospital • Previous failed attempts to implement CRC screening CDS: “no easy way to feed back that an adequate colonoscopy was done and was normal” • New CRC screening tool being piloted this year • Phases 2 & 3 – 2009
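  The slide describes OncWatch as dividing patients into four risk/needs cohorts and acting as a fail-safe for follow-up, but it does not give the rules. The sketch below is purely hypothetical (the field names, intervals, and cohort labels are assumptions, not the actual OncWatch logic); it only illustrates the general idea of cohort assignment plus a fail-safe flag for unfulfilled follow-up.

```python
# Hypothetical sketch only -- cohort rules and field names are assumptions
# for illustration, not the actual OncWatch implementation.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Patient:
    age: int
    last_colonoscopy: Optional[date]        # None if never screened
    positive_fobt_pending_followup: bool    # positive stool test without follow-up
    high_risk_history: bool                 # e.g., family history or prior polyps

def crc_cohort(p: Patient, today: date = date(2008, 7, 15)) -> str:
    """Assign a patient to one of four illustrative screening cohorts."""
    if p.positive_fobt_pending_followup:
        return "needs follow-up"            # "fail safe": positive result, no follow-up yet
    if p.last_colonoscopy is None:
        return "never screened"
    interval_years = 5 if p.high_risk_history else 10
    years_since = (today - p.last_colonoscopy).days / 365.25
    return "overdue" if years_since > interval_years else "up to date"

# Example: an average-risk patient last scoped in 1995 is overdue.
print(crc_cohort(Patient(62, date(1995, 6, 1), False, False)))  # -> "overdue"
```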

  26. Summary • Clinical reminder studies and current work with AHRQ on CDS workflow integration: • Qualitative field observation followed by scenario-driven, comparative usability testing of experimental prototypes in a simulated setting • Clinical software development may benefit from this paradigm if more widely followed • Acknowledgments: • Emily S. Patterson, PhD • Steven M. Asch, MD, MPH • Marta L. Render, MD • Bradley N. Doebbeling, MD, MSc

  27. Human-Computer Interaction (HCI) / IT Lab (Indianapolis VAMC) • Operational – May 2008 • Rapid prototyping of design alternatives • Usability testing / simulation study • Specialized software and data collection equipment • VA network & University network • Grad students specializing in usability and human-computer interaction

  28. Perfect Timing? Just published: • Sittig DF, Wright A, Osheroff JA, Middleton B, Teich JM, Ash JS, Campbell E, Bates DW. "Grand Challenges in Clinical Decision Support", Journal of Biomedical Informatics, 2008;41(2):387-92. • #1 ranked challenge in clinical decision support: "Improve the human-computer interface" • The paper was written to influence funders and policy makers

  29. Human-Computer Interaction / Information Technology Laboratory (HCI / IT Lab), Room D-5014 (lab layout diagram) • 47-inch monitor; 17-inch monitor and player for researcher's office; portable DigiCam w/WA adapter • VA desktops (Visual Basic) and VA laptop (Morae, Snag-it); VA LAN outlet • IU desktops (Visual Basic, Azure X 1PC) and IU laptop (Morae, Snag-it); IU LAN outlet

  30. Current Project Portfolio for HCI / IT Lab; affiliated projects • Funded: • VA STP: "Factors Influencing Effective Implementation of My HealtheVet" (Chumbler, PI) • AHRQ RFTO#8 HIT: "Implementing and Improving the Integration of Decision Support into Outpatient Clinical Workflow" (Doebbeling, PI) • VA CDA: "Colorectal Cancer Survivor Surveillance Care and Personal Health Records" (Haggstrom) • Purdue–Lilly Seed Grant: "Integrating Pharmacogenomic-guided Dosing into Clinical Practice" (Overholser, PI) • Doctoral Dissertation: "Impact of Information Flow and Prioritization on the Use of Computerized Clinical Reminders" (Wu) • In Process: • VA IIR for June '08: "The Effects of Exam Room Computing on Patient-Centered Communication" (Frankel, PI) • VA IIR for Dec '08: "Barriers and Facilitators to Providers' Adoption of My HealtheVet" (Chumbler, PI) • VA CDA or IIR for Dec '08: "Circumventing Health IT with Paper: Identifying Patient Safety Risks" (Saleem)

  31. Focus Areas • Rapid prototyping of CDS and other Health IT • Applied usability studies • Research experiments through simulation (as part of a broader research agenda, including field study) • Knowledge Management capabilities • Prototype KM strategies

  32. Goals • Become a warehouse for user-centered design • Provide early input through rigorous human factors methods for health IT software and hardware design • Become a resource for healthcare informatics research • Design research studies to improve health IT prior to implementation • Become a bridge for collaboration VHA-wide and external to VA • Partnerships with RI, IU School of Informatics, Purdue, etc. • Partnership with Roudebush VA and VISN 11 CIO on local projects; • Partnership with VA Office of Information & Technology on national-level projects • Partnership with non-VA research groups on broad health IT projects (e.g., CDS Consortium with Partners Healthcare, Regenstrief, etc.) • Actively seek strategic teaming opportunities for collaborative research to transfer knowledge out of the Lab to meet real world challenges

  33. Opportunities / Partnerships • Resource for Center Investigators to conduct studies as part of an overall research proposal or CDA • Pilot studies for rapid data collection on usability and piloting of new technology to support future proposals • Demonstration of Center’s IT research capabilities to visitors / collaborators • Graduate students, internships to support work in the Lab • Potential partnership with VA Office of Information & Technology to provide input during clinical applications development, usability testing and assess impact on patients and providers • Potential partnership with Roudebush VA and VISN 11 CIO on local projects

  34. Contact Jason J. Saleem, PhD Human Factors Engineer VA Center on Implementing Evidence-Based Practice IU Center for Health Services & Outcomes Research, Regenstrief Institute jsaleem@iupui.edu
