
Evaluation of Health Canada’s Hepatitis C Program: Engaging Stakeholders in the Evaluation Process
Presented by: Gail V. Barrington, PhD, CMC, Barrington Research Group, Inc.
National Hepatitis Coordinators’ Conference, Program Evaluation for Viral Hepatitis Integration Projects Workshop


Presentation Transcript


  1. Evaluation of Health Canada’s Hepatitis C Program: Engaging Stakeholders in the Evaluation Process Presented by: Gail V. Barrington, PhD, CMC Barrington Research Group, Inc. National Hepatitis Coordinators’ Conference Program Evaluation for Viral Hepatitis Integration Projects Workshop San Antonio, Texas January 30, 2003

  2. Key Principles of the Evaluation Process
  • Wide stakeholder engagement, from the evaluation design through to Program recommendations
  • Consultation with hepatitis C experts
  • Guided by a Program Logic Model and a Data Collection Matrix
  • Use of traditional social science and applied research methods
  • Peer reviews of the final report

  3. Initial Consultations
  • Preliminary interviews with all regional Program staff
  • Informal telephone interviews conducted by the Project Director
  • The Project Director “listened in” on Program teleconferences early in the evaluation
  • Purpose:
    • To build rapport and acclimatize regional staff to the evaluation and the Evaluators
    • To understand the regional perspective on the evaluation
  • Two hepatitis C experts were invited to educate Barrington Research staff on hepatitis C issues

  4. Instrument Design
  • Each survey/interview question was designed to address an evaluation question from the Data Collection Matrix (DCM); a simple, hypothetical sketch of such a matrix appears after this list
  • The DCM was based on the Program Logic Model
  • This use of the Logic Model and DCM helped to focus the questions asked in the surveys/interviews
  • Survey/interview tools were also reviewed by stakeholders to ensure the questions were relevant, appropriate, and comprehensive:
    • Health Expert Survey
    • Various case study tools
    • Community-Based Support Implementation and Outcome Achievement Survey
    • Researcher Survey
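The traceability described above, where every instrument question maps back to an evaluation question, can be sketched in code. The following is a minimal, hypothetical illustration only: the evaluation questions, indicators, data sources, and linkages shown are placeholders, not the content of the actual Health Canada DCM.

```python
# Hypothetical sketch of a Data Collection Matrix (DCM) structure.
# Rows, indicators, and sources are illustrative placeholders only.

from dataclasses import dataclass, field


@dataclass
class DCMRow:
    evaluation_question: str                      # drawn from the Program Logic Model
    indicators: list = field(default_factory=list)
    data_sources: list = field(default_factory=list)
    instruments: list = field(default_factory=list)


dcm = [
    DCMRow(
        evaluation_question="To what extent has the Program reached priority populations?",
        indicators=["Number of clients served, by priority group"],
        data_sources=["Community-based project staff", "Project records"],
        instruments=["Community-Based Support Implementation and Outcome Achievement Survey"],
    ),
    DCMRow(
        evaluation_question="Has the Program strengthened hepatitis C research capacity?",
        indicators=["Number of funded studies", "Researcher-reported barriers"],
        data_sources=["Funded researchers"],
        instruments=["Researcher Survey"],
    ),
]

# Trace each instrument back to the evaluation question it serves; this is
# how a DCM keeps survey/interview questions focused on the evaluation design.
for row in dcm:
    for instrument in row.instruments:
        print(f"{instrument} -> {row.evaluation_question}")
```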

  5. Sample Selection
  • Because this was a formative evaluation, participants from many groups were to be surveyed to gain a broad overview of the Program
  • Purposive sampling best addressed the need to collect data from populations of unknown size (e.g., “health experts”)
  • Because resources were limited, this sampling approach was appropriate
  • Key informants were used to identify potential participants:
    • Health Canada – national and regional Program staff and other stakeholders at the national/regional levels
    • Hepatitis C experts – Health Expert Survey
    • Canadian Institutes of Health Research (research fund manager) – Researcher Survey

  6. Sample Selection (cont’d)
  • Local level: case studies in 7 sites across Canada to:
    • Access those infected with or affected by hepatitis C, community-based support projects funded by the Program, and other hepatitis C service providers in the community
    • Explore Program implementation at the micro level
  • Methodology based on the work of case study theorists (Yin & Chelimsky)
  • Sites were selected in consultation with Health Canada staff, based on criteria such as regional representation, service to a priority population, project type, and willingness to participate

  7. Data Collection
  • Reviewed Program documents
  • Survey/interview data collection proceeded smoothly due to:
    • Support of stakeholders
    • Respectful treatment of participants
    • Understanding of community agencies
    • Multiple response options for each instrument (in-person or phone interviews; e-mail, mail, or fax surveys)
    • Thank-you cards, gifts to the participating case study sites, and incentives (grocery coupons) for the clients interviewed

  8. Analysis & Write-Up • Used SPSS and N-Vivo for data analysis • Utility of the findings kept in mind—”What does this Program need?” • Data Collection Matrix used to guide qualitative analysis and the “triangulation” of all data sources • All data sources that addressed an evaluation question were compiled in a Data Summary • Similarities and differences across groups of respondents were compared • Findings that are included in the final report represent themes that stood out across all groups of respondents and/or documents

  9. Analysis & Write-Up (cont’d) • Case studies: • Research team met to draft a report template (series of questions) • Team leaders organized case study data using the report template • Completed template given to Project Director for case study write-up • Preliminary review of each case by research team • Review of each case by site coordinator • Case study signed-off by site prior to distribution to Health Canada • Health Canada regional staff reviewed and made final changes • Process increased stakeholder ownership of the findings • Final Report: • Guided by the Logic Model for report structure • Peer reviewers helped clarify & focus the report • Various national and regional stakeholders reviewed the draft report

  10. Lessons Learned
  • The evaluation identified lessons learned and highlighted Program strengths and weaknesses
  • Through these insights, the Evaluators were able to propose 19 recommendations
  • Health Canada staff were consulted on the wording of the recommendations to facilitate their implementation
