
Presentation Transcript


  1. Measuring QI Intervention Implementation: Helping the Blind Men See? EQUIP (Evidence-Based Practice in Schizophrenia) QUERI National Meeting Working Group, December 12, 2008

  2. QI Intervention Example • EQUIP (Enhancing QUality of care In Psychosis) • Evidence-based quality improvement (EBQI) to implement effective care in specialty mental health • Alex Young, MD & Amy Cohen, PhD (Co-PIs)

  3. EQUIP Effective Schizophrenia Care • Evidence base: TMAP, EQUIP-1 • EBQI QI "infrastructure": provider/patient education, quality manager, informatics support, performance feedback • EBQI "priority-setting": leadership support

  4. Context Matters: Design for It • EQUIP • 4 VISNs: one intervention and one control site in each VISN • Sites chosen collaboratively based on interest • Each VISN asked to select evidence-based care targets for intervention: all selected Wellness & Supported Employment • Availability & quality of these care targets vary across sites • Structure of care for patients with schizophrenia varies across sites • Formative evaluation methods are used to understand variable implementation

  5. What is Formative Evaluation? • Formative evaluation = an assessment process designed to identify potential and actual influences on the progress and effectiveness of implementation efforts • Data collection occurs before, during, and after implementation • Need to be able to answer questions about context, adaptations, and responses to change

  6. Four Stages of Formative Evaluation • Developmental evaluation • Implementation-focused evaluation (process evaluation) • Progress-focused evaluation • Interpretive evaluation

  7. Simpson Transfer Model

  8. Stages of FE (STM) & EQUIP FE Measures • Pre-Implementation (STM: Exposure & Adoption): Developmental FE (field notes; documents, e.g. minutes; ORC & Burnout Inventory; key stakeholder interviews) • Implementation (STM: Implementation): Implementation-Focused FE (field notes; Quality Coordinator logs; documents; key stakeholder interviews) and Progress-Focused FE (QI tools) • Post-Implementation (STM: Practice): Interpretive FE (field notes; key stakeholder interviews; ORC & Burnout Inventory)

  9. Multiple Data Sources: Measuring Implementation

  10. Multiple Data Sources: Strengths and Challenges

  11. EQUIP Organizational Climate Measures • Organizational Readiness for Change (ORC): Staff and Administrator versions • Maslach Burnout Inventory • On-line measure • Pre- and post-implementation

  12. Organizational Readiness for Change • Using scales related to: • Motivation for change (program needs, training needs, pressures for change) • Staff attributes (growth, adaptability) • Organizational climate (mission, cohesion, autonomy, communication, change) • Purpose is descriptive & to assess change in readiness from pre- to post-implementation
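A minimal sketch of how ORC-style subscale scores and pre-to-post change in readiness could be computed, assuming item-level Likert responses averaged into subscales; the item names and subscale groupings below are illustrative stand-ins, not the official TCU ORC scoring key.

    # Sketch of ORC-style subscale scoring: each subscale is the mean of its
    # Likert items, compared pre- vs. post-implementation. Item IDs and the
    # subscale map are hypothetical, not the actual ORC instrument.
    from statistics import mean

    SUBSCALES = {
        "program_needs": ["pn1", "pn2", "pn3"],   # motivation for change
        "cohesion": ["co1", "co2", "co3"],        # organizational climate
        "communication": ["cm1", "cm2"],          # organizational climate
    }

    def subscale_scores(responses: dict) -> dict:
        """responses maps item id -> Likert rating (1-5)."""
        return {scale: mean(responses[i] for i in items)
                for scale, items in SUBSCALES.items()}

    def readiness_change(pre: dict, post: dict) -> dict:
        """Per-subscale change from pre- to post-implementation."""
        pre_s, post_s = subscale_scores(pre), subscale_scores(post)
        return {scale: post_s[scale] - pre_s[scale] for scale in SUBSCALES}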

  13. EQUIP Semi-Structured Interviews • Conducted pre-, mid-, and post-implementation • Versions for providers, administrators, and VISN leaders • Covered in consent • Face-to-face recorded interviews • Professionally transcribed • Analyzed after each round

  14. EQUIP Participant Observation: Field Journal • Primary method of capturing data from observant participation • “If you didn’t write it down in your field notes, then it didn’t happen.” (at least in terms of data analysis) • 3 kinds of notes • Records of events observed and information given • Records of prolonged activities • Chronological daily diary
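One hypothetical way to structure field journal entries so the three kinds of notes stay distinguishable at analysis time; the types and fields below are assumptions for illustration, not EQUIP's actual journal format.

    # Sketch of a field journal record keeping the slide's three note kinds
    # separate; field names are illustrative.
    from dataclasses import dataclass
    from datetime import date
    from enum import Enum

    class NoteKind(Enum):
        EVENT = "record of events observed / information given"
        PROLONGED = "record of a prolonged activity"
        DIARY = "chronological daily diary"

    @dataclass
    class FieldNote:
        kind: NoteKind
        recorded: date   # when the note was written down
        site: str        # which EQUIP site it concerns
        text: str        # the note itself

    journal: list[FieldNote] = []  # if it isn't written down, it didn't happen
    journal.append(FieldNote(NoteKind.DIARY, date(2008, 12, 12), "Site A",
                             "QC met with clinic leadership re: wellness group."))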

  15. EQUIP Quality Coordinator Logs • Submitted monthly by RN Quality Coordinator • Records what % of time was spent on each aspect of the clinical intervention • Enables looking across sites at variation in time spent on clinical activities, and at whether this relates qualitatively to implementation at each site (see the sketch below)
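A small sketch of tabulating the monthly logs across sites, assuming each log row is (site, month, activity, percent of time); the row format and example values are assumptions, since the slides do not show EQUIP's actual log layout.

    # Average % of Quality Coordinator time per activity at each site, across
    # months, for cross-site comparison. Example rows are hypothetical.
    from collections import defaultdict

    logs = [
        ("Site A", "2008-10", "wellness", 40.0),
        ("Site A", "2008-11", "wellness", 30.0),
        ("Site B", "2008-10", "supported employment", 15.0),
    ]

    def mean_pct_by_site_activity(rows):
        """Mean % of QC time per (site, activity) across monthly logs."""
        totals, counts = defaultdict(float), defaultdict(int)
        for site, _month, activity, pct in rows:
            totals[(site, activity)] += pct
            counts[(site, activity)] += 1
        return {key: totals[key] / counts[key] for key in totals}

    # Site-to-site variation in these means can then be read against the
    # qualitative record of implementation at each site.
    print(mean_pct_by_site_activity(logs))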

  16. Critical Measures of Implementation • Integrity of innovation • Fidelity to planned implementation strategy • Dose of intervention delivery, when variability is possible • Requires clear operational definitions of intervention components • Exposure to innovation • Degree to which intervention is experienced by targeted users • Dose of exposure, when variability is possible • Requires clear operational definitions for measuring intervention exposure • Intensity of implementation • e.g., implementation or intensity scores for multifaceted interventions • e.g., "goal attainment scaling" when the strategy allows local adaptation or choice of alternative interventions across sites (see the sketch below)
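A hedged sketch of goal attainment scaling as an implementation-intensity score, using the standard Kiresuk-Sherman T-score formulation; the equal weights and the 0.3 inter-goal correlation are the conventional defaults, and the example ratings are invented.

    # Goal attainment scaling: each locally chosen intervention component is
    # rated on the usual -2..+2 attainment scale and combined into a T-score.
    import math

    def gas_t_score(attainment, weights=None, rho=0.3):
        """attainment: list of -2..+2 ratings, one per local goal/component."""
        w = weights or [1.0] * len(attainment)
        num = 10 * sum(wi * xi for wi, xi in zip(w, attainment))
        den = math.sqrt((1 - rho) * sum(wi**2 for wi in w)
                        + rho * sum(w) ** 2)
        return 50 + num / den   # 50 = all goals met exactly as expected

    # A site that exceeded one target and fell short on another:
    print(gas_t_score([+1, -1, 0]))   # 50.0: gains and shortfalls balance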

  17. Triangulation • Critical to collect information about implementation from multiple sources • Be prepared for disagreement • Perspectives and opportunities for observation differ among managers, providers, and patients • Recognize differences between "exposed" sample and practice population • Does the "enrolled" group represent the practice? • Did the intervention penetrate among all providers?
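A small sketch of the two checks the slide raises, under the assumption that simple rosters of providers and patients are available; the function and variable names are hypothetical.

    # Two simple penetration checks: provider reach and enrollment coverage.
    def provider_penetration(exposed_providers, all_providers):
        """Share of the site's providers who delivered the intervention."""
        return len(set(exposed_providers)) / len(set(all_providers))

    def enrollment_coverage(enrolled_patients, practice_patients):
        """Share of the practice population in the 'exposed' sample."""
        enrolled, practice = set(enrolled_patients), set(practice_patients)
        return len(enrolled & practice) / len(practice)

    print(provider_penetration(["dr_a", "dr_b"], ["dr_a", "dr_b", "dr_c"]))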

  18. Telling the story of variable implementation • Examine range of data sources as a team • Throughout course of data collection • Discuss which data sources answer which questions • Examine which data sources are complementary • Which data sources should be triangulated? • What questions are raised or what answers are provided?

  19. Telling the story of variable implementation • Use qualitative data analysis software to facilitate mixed methods analysis • Multiple data sources • Multiple grouping options (e.g., by site, by stakeholder, by data collection time points) • Team-based analysis • Ongoing, iterative analysis informing implementation efforts
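One way the slide's grouping options could look in practice, assuming coded excerpts are exported from the QDA software as (site, stakeholder, timepoint, code, text) records; ATLAS.ti's actual export format will differ, and the example excerpts are invented.

    # Group coded excerpts by any mix of site, stakeholder, and timepoint.
    from collections import defaultdict

    excerpts = [
        ("Site A", "provider", "pre", "barriers", "No time for wellness groups."),
        ("Site A", "administrator", "mid", "leadership", "VISN backs the QC role."),
        ("Site B", "provider", "pre", "barriers", "Referral process unclear."),
    ]

    def group_by(records, *fields):
        """Group excerpt texts by the requested fields."""
        index = {"site": 0, "stakeholder": 1, "timepoint": 2, "code": 3}
        grouped = defaultdict(list)
        for rec in records:
            grouped[tuple(rec[index[f]] for f in fields)].append(rec[4])
        return grouped

    by_site_time = group_by(excerpts, "site", "timepoint")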

  20. Software Support: ATLAS.ti

  21. Telling the story of variable implementation • Audience considerations • Throughout course of data collection • Which data sources answer which questions, for whom • Issue of providing feedback to sites • Product considerations • Which data sources should be triangulated? • What questions are raised or what answers are provided? • How much and what should go into which products?
