
Evidence Based Chronic Disease Prevention Module 8: Evaluating The Program Or Policy


Presentation Transcript


1. Evidence Based Chronic Disease Prevention Module 8: Evaluating The Program Or Policy Presented by: Karen Peters, DrPH

2. Objectives • Understand the basic components of program evaluation. • Describe the differences and unique contributions of quantitative and qualitative evaluation. • Understand the concepts of measurement validity and reliability.

3. Develop statement of the issue • Quantify the issue • Determine what is known in the literature • Develop program or policy options • Develop an action plan • Evaluate the program or policy • Dissemination

4. Overview • The “whats” and “whys” of evaluation • Study designs and measurement issues • Types of evaluation • Quantitative • Qualitative • Organizational issues in evaluation • Infeasible to provide in-depth discussion of mechanics • e.g., sampling/instrument development • Can show basic components in an evaluation, what to look for, where to turn for help

5. What is Evaluation? • “A process that attempts to determine as systematically and objectively as possible the relevance, effectiveness, and impact of activities in light of their objectives.” • From: Last JM. A Dictionary of Epidemiology. Third Edition. New York: Oxford University Press; 1995. • Complex and diverse field

6. Some questions to consider • What are the most significant challenges you face in program evaluation? • In program planning, when should you begin planning an evaluation?

7. Why Evaluate? • Public health agencies need to be accountable • Basis for choices when resources are limited • Helps to determine costs and benefits • Source of information for improving programs and policies • Increasingly mandated by funders/policy makers • Convincing funders is not always easy • Leads to research questions that can be tested in other studies/programs

8. New Directions • Social programs have become broader, more complex and interactive • Seek to bring about changes in community capacity, social support, decision-making, control over resources and individual behavior • Time to supplement traditional strategies with new approaches reflecting the complexity of community-based initiatives

9. New Directions • More flexible evaluation approaches can play a role in building community capacity and self-determination • Need to re-direct program evaluation toward community-based public health values • Traditional evaluation conducted by ‘experts’ to determine whether program objectives were met, strengths/weaknesses/replicability, and contribution to scientific knowledge

10. New Directions • Some evaluators believe communities lack skills to design, engage in and interpret evaluations • However, ‘experts’ may lack the insight/flexibility needed to capture the ‘essence’ of community projects or to answer questions raised by communities and CBOs

11. New Directions • A community-based evaluation perspective involves a more participatory and inclusive process that incorporates the values, knowledge, expertise and interests of the community and uses evaluation as a tool for community capacity building • Involving the community as a full/equal partner allows for development of more ‘relevant’ program success measures and produces data that are useful in community settings

12. New Directions • Evaluation is one part of a broader planning process but can help in: • reflecting on progress • documenting where the program is going and where it has come from • sharing what worked and what did not with other communities • demonstrating the need for targeted resources to address community issues • illustrating the impact of community-based initiatives to decision/policy makers • providing information on developing meaningful community-based indicators

13. Research Phases Model (Greenwald and Cullen, 1985) • Phase 1: Hypothesis Development • Ex: Link between sedentary behavior and obesity • Phase 2: Methods Development • Pilot test of intervention to increase physical activity and validation of measures • Phase 3: Controlled Intervention Trial (Efficacy) • Small-scale randomized trial of physical activity • Phase 4: Defined Population (Effectiveness) • Larger-scale trial of PA in populations of interest • Phase 5: Demonstration (Dissemination) • Evaluation of results of PA program

14. Evaluation Models • For Practitioners: • Models identify key factors to consider when developing/selecting health behavior programs • Factors to focus on when reading the literature • For Researchers: • Models identify important dimensions to be included in program evaluations

15. Evaluation Models • 2 Major Frameworks (there are many others) • For Process and Formative Evaluation • CDC Framework for Public Health Programs • For Impact and Outcome Evaluation • RE-AIM Framework

16. Framework for Program Evaluation • 2-year process by CDC • designed as a framework for ongoing, practical program evaluation • can be integrated with routine program operations • involves input from program staff, community members, other stakeholders, not just evaluation experts • Involves 6 basic steps and 4 broad evaluation standards

17. Broad Evaluation Framework (à la CDC)* • Engage stakeholders • Describe the program • Focus the evaluation design • Gather and analyze evidence • Justify conclusions • Ensure use and share lessons learned *CDC Framework for Program Evaluation in Public Health, 9/17/99 and Center for Advancement of Community Based Public Health, June 2000

18. CDC Framework for Evaluation • 4 Evaluation standards • Guidelines that can help assess whether an evaluation is well designed • Utility: Is the evaluation useful? • Does the evaluation answer questions that are relevant to the stakeholders? • Feasibility: Is the evaluation practical? • Is the evaluation realistic and cost-effective?

19. CDC Framework for Evaluation • Propriety: Is the evaluation ethical? • Does the evaluation consider the rights and interests of those involved and affected? • Accuracy: Is the evaluation correct? • Do the evaluation findings convey information that is correct and technically adequate?

20. CDC Framework for Evaluation • Step 1: Engage the stakeholders • Stakeholders - those involved in implementing the program and those served or affected by it, including decision-makers who can do something with the results • Standards for Step 1: Utility and Propriety • Utility: Have you identified individuals and organizations affected by the evaluation? • Are those involved in the evaluation trustworthy and competent?

21. CDC Framework for Evaluation • Propriety: • Is there an explicit, written agreement about what is to be done, how, by whom and when? • Does the evaluation design protect the rights and welfare of those involved? • Are the individuals who are conducting the evaluation interacting respectfully with stakeholders? • Have you discussed conflicts of interest openly and honestly?

22. CDC Evaluation Framework • Step 2: Describe the program • Summarize the program being evaluated with a statement of need that includes expectations, a logic model, the resources available to conduct program activities, and how the program fits into the larger organizational/community context • A good description allows the program to be compared to similar efforts and makes it easier to figure out what parts brought about what effects
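
As an illustration of Step 2, a program description and logic model can be written down as a simple data structure. The sketch below is only one hypothetical way to do this; the CDC framework does not prescribe a format, and all field names and program details here are invented.

    from dataclasses import dataclass, field

    @dataclass
    class LogicModel:
        """Minimal logic-model sketch for describing a program (Step 2)."""
        need: str                                             # statement of need/expectations
        inputs: list[str] = field(default_factory=list)       # resources available
        activities: list[str] = field(default_factory=list)   # what the program does
        outputs: list[str] = field(default_factory=list)      # direct products
        outcomes: list[str] = field(default_factory=list)     # intended effects
        context: str = ""                                     # organizational/community fit

    # Hypothetical walking program
    walking = LogicModel(
        need="High prevalence of physical inactivity among adults",
        inputs=["2 health educators", "park district partnership"],
        activities=["Weekly guided walks", "Pedometer distribution"],
        outputs=["Walks held", "Participants enrolled"],
        outcomes=["Increased minutes of weekly physical activity"],
        context="Delivered through an existing community coalition",
    )

Writing the description down this explicitly is what makes the program comparable to similar efforts, as the slide notes.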

23. CDC Evaluation Framework • Standards for Step 2: Accuracy/Propriety • Accuracy: Have you clearly and accurately described the program? • Have you documented the program context? • Propriety: Is the evaluation complete and fair? • Does it assess program strengths and weaknesses?

24. CDC Evaluation Framework • Step 3: Focus the evaluation design • Specify the evaluation's overall intent • Determine who and what the evaluation is for (users and uses) • What questions the evaluation should examine • What methods are best to answer the questions

25. CDC Evaluation Framework • Purpose of evaluation depends on program’s stage of development • New or developing program - feasibility of intervention approach • Implementation - fine tuning or changes needed • Established program - assess program effects • Standards for Step 3: • Feasibility • Propriety • Accuracy

26. CDC Evaluation Framework • Feasibility • Have you considered the political interests/needs of groups and obtained ‘buy-in’? • Does the information produced justify the costs? • Are evaluation procedures practical? • Propriety • Does the evaluation design help to identify needs? • Are costs guided by sound/ethical accountability procedures? • Accuracy • Is there an accurate description of evaluation purposes and procedures?

27. CDC Evaluation Framework • Step 4: Gather credible evidence • Need a well-rounded picture of the program • Develop indicators that translate program concepts into specific measures • Use multiple sources of evidence that reflect different perspectives about the program • Techniques used to gather and handle evidence should be compatible with cultural conditions in each program setting

28. CDC Evaluation Framework • Standards in Step 4: Utility and Accuracy • Utility: Are you collecting information that addresses pertinent program issues and is responsive to stakeholder needs? • Accuracy: Have you adequately described your sources of information? • Do data collection procedures address internal validity and reliability issues? • Is there a system in place for identifying and correcting errors?

29. CDC Evaluation Framework • Step 5: Justify Conclusions • Involves making claims about a program based on evidence gathered • Stakeholder values provide the basis for making judgments about program merits/performance • Conclusions are based on analysis, synthesis and interpretation of information to detect patterns and result in recommendations • Reaching good conclusions requires a variety of stakeholder perspectives

30. CDC Evaluation Framework • Standards for Step 5: Accuracy/Utility • Accuracy: Has the data analysis process been effective in answering key evaluation questions? • Can you explicitly justify your conclusions? • Utility: Have you carefully described the perspectives, procedures and rationale used to interpret the findings?

31. CDC Evaluation Framework • Step 6: Ensure use, share lessons learned • Make sure stakeholders understand the evaluation procedure and findings • All participants should have the opportunity to provide feedback • Evaluators should provide any needed follow-up • Use a variety of communication strategies to disseminate evaluation results in a timely and unbiased manner

32. CDC Evaluation Framework • Standards for Step 6: Utility, Propriety, Accuracy • Utility: Do evaluation reports describe the program context, purpose, procedures and findings clearly? • Propriety: Have evaluators made sure findings (including limitations) are accessible to everyone affected by the program? • Accuracy: Do evaluation reports reflect the findings fairly and impartially?

33. RE-AIM Evaluation Framework (Glasgow, Vogt, Boles, 1999; Glasgow, McKay, Piette, Reynolds, 2001) • Reach, Efficacy or Effectiveness (depending on research phase), Adoption, Implementation, Maintenance • RE-AIM relies on 2 comprehensive models • PRECEDE-PROCEED (Green & Kreuter 1999) • Diffusion Theory (Rogers 1995; Nutbeam 1996)

34. Dimensions of RE-AIM • Reach • Individual Level • What % of potentially eligible participants will take part? • How representative are they? • Efficacy or Effectiveness • Individual Level • What was the impact on all who began? • What was the impact on intermediate & primary outcomes? • What were the positive/negative (unintended) outcomes, including quality of life?

35. Dimensions of RE-AIM • Adoption • Setting Level • What % of settings/intervention agents will participate? (worksites, schools, educators, nurses) • How representative are they? • Implementation • Setting or Agent Level • To what extent were the intervention components delivered as intended (in the protocol) when conducted in applied settings by non-researchers?

36. Dimensions of RE-AIM • Maintenance • Individual and Setting Levels • Individual Level: What are the long-term effects (minimum 6-12 months following intervention)? • Setting Level: To what extent are different intervention components continued or institutionalized?
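
The core RE-AIM dimensions reduce to simple proportions once the counts are in hand. A minimal sketch with invented counts (representativeness and maintenance need comparisons over groups and time, which are not shown here):

    def proportion(numerator: int, denominator: int) -> float:
        """Percentage, guarding against a zero denominator."""
        return 100 * numerator / denominator if denominator else 0.0

    # Hypothetical counts from a worksite physical activity program
    eligible_individuals = 1200   # potentially eligible participants
    enrolled = 420                # participants who took part
    eligible_sites = 30           # worksites approached
    adopting_sites = 18           # worksites that ran the program
    protocol_components = 10      # components specified in the protocol
    delivered_components = 8      # components delivered as intended

    print(f"Reach:          {proportion(enrolled, eligible_individuals):.0f}%")            # 35%
    print(f"Adoption:       {proportion(adopting_sites, eligible_sites):.0f}%")            # 60%
    print(f"Implementation: {proportion(delivered_components, protocol_components):.0f}%") # 80%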

37. Common Challenges and Suggested Strategies Using RE-AIM Framework • Reach • Challenge: Not including a relevant, high-risk, or representative sample • Strategy: Use population-based recruitment or oversample high-risk groups, reduce exclusion criteria • Efficacy or Effectiveness • Challenge: Ambiguous outcomes • Strategy: Assess a broader set of outcomes, conduct subgroup analyses, use different assessment points

38. Common Challenges and Suggested Strategies Using RE-AIM Framework • Adoption • Challenge: Program never adopted or endorsed - used only in academic settings • Strategy: Involve participants in all phases, approach numerous settings early on while revision is still possible • Implementation • Challenge: Protocols not delivered as intended (Type III error) • Strategy: Assess whether the treatment is too complicated, intensive, or incompatible; involve non-researchers

39. Common Challenges and Suggested Strategies Using RE-AIM Framework • Maintenance • Challenge: Program or effects not maintained over time • Strategy: Include a maintenance phase in protocol and evaluation plans; leave the treatment behind after the study and plan for institutionalization

40. Common Study/Evaluation Designs • Experimental/randomized • Quasi-experimental • Time-series • Use of existing data

41. Study/Evaluation Designs • Quasi-experimental • Increasing attention • At least one intervention and one comparison group, without randomization • Appeal of intervening through intact social groups • See Koepsell chapter in readings (in Module 4)

42. Study/Evaluation Designs • Use the best designs feasible • Pre-/post-data • Comparison groups • Complete program records • Reliable and valid measures • Proper analytic techniques • Review principles/tools from Goodman, page 39 (in Module 8)
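
When pre-/post-data and a comparison group are both available, the simplest contrast is a difference-in-differences estimate; this is a generic technique, not one the slides prescribe, and the means below are hypothetical.

    # Hypothetical mean minutes of weekly physical activity
    intervention_pre, intervention_post = 92.0, 128.0
    comparison_pre, comparison_post = 95.0, 103.0

    intervention_change = intervention_post - intervention_pre  # +36.0
    comparison_change = comparison_post - comparison_pre        # +8.0

    # Program effect net of the secular trend seen in the comparison group
    did = intervention_change - comparison_change               # +28.0
    print(f"Estimated program effect: {did:.1f} minutes/week")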

43. Challenges in Community-Wide Studies • Varying degrees of intervention “exposure” • Running programs in multiple locations • Accounting for community-level variance • Lack of sensitivity of the “community” • Concepts of participatory research • Equity, collective decision making • High-quality, ethical research • Addressing social inequalities • Maximize learning opportunities • See Goodman article on community capacity

44. Challenges in Community-Wide Studies • Community-level variance • Individuals in communities, neighborhoods, schools, worksites are correlated • ICC (intra-class correlation coefficient) • People have related characteristics (not independent) • In practical terms, this means an increased chance of Type I error (saying there is a difference when there really is not)
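
The practical cost of intra-class correlation is usually summarized by the design effect, DEFF = 1 + (m - 1) x ICC, where m is the average cluster size. A minimal sketch with assumed numbers:

    def design_effect(cluster_size: float, icc: float) -> float:
        """Variance inflation from clustering: DEFF = 1 + (m - 1) * ICC."""
        return 1 + (cluster_size - 1) * icc

    # Hypothetical survey: 20 communities, 50 respondents each, modest ICC
    m, icc = 50, 0.02
    n_total = 20 * m
    deff = design_effect(m, icc)      # 1.98
    effective_n = n_total / deff      # ~505

    print(f"Design effect: {deff:.2f}")
    print(f"Effective sample size: {effective_n:.0f} of {n_total} respondents")
    # Treating the 1,000 respondents as independent overstates precision,
    # which is where the inflated Type I error rate comes from.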

45. Measurement Issues • Components of a “good” evaluation • Adequate sample size • High validity • High reliability • Sample size considerations • Number of communities • Number of individuals per community • Increasing number of communities versus number of individuals per community • Can rely on simple, accessible programs like Epi Info
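
Building on the design-effect sketch above, the trade-off between adding communities and adding individuals per community can be made explicit; the ICC value is again an assumption:

    def effective_n(n_clusters: int, per_cluster: int, icc: float) -> float:
        """Effective sample size after dividing by the design effect."""
        total = n_clusters * per_cluster
        return total / (1 + (per_cluster - 1) * icc)

    icc = 0.02  # assumed intra-class correlation

    print(f"10 communities x 50 people:  effective n = {effective_n(10, 50, icc):.0f}")   # ~253
    print(f"10 communities x 100 people: effective n = {effective_n(10, 100, icc):.0f}")  # ~336
    print(f"20 communities x 50 people:  effective n = {effective_n(20, 50, icc):.0f}")   # ~505
    # The last two options cost the same 1,000 interviews, but adding
    # communities buys far more effective sample size than adding people.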

46. Concepts of Validity and Reliability and their Importance • Measurement Issues • Evaluation “threats”: • Validity • Is the instrument or design measuring exactly what was intended? (Self-report vs. biologic test) • Reliability • Is the measurement being conducted consistently? (Face-to-face vs. telephone, different interviewers)

47. Measurement Issues • Validity: best available approximation to the “truth” • Internal Validity • The extent of causality (the effects are really attributable to the program) • External Validity • The extent of generalizability • Importance? (So what?)

48. Measurement Issues: • Major threats to validity* • Low statistical power • Violated assumptions in statistical tests • Reliability of measures • Reliability of treatment implementation • Random confounders in the experiment • Random heterogeneity of respondents * adapted from Cook and Campbell, 1979

49. Measurement Issues: • Reliability (repeatability) • Consistency in measurement • Multiple types • Inter-observer • Test-retest • Internal consistency
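
Two of the listed reliability types lend themselves to quick computation: Cronbach's alpha for internal consistency and a simple correlation for test-retest reliability. A minimal sketch using invented data:

    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """Internal consistency; rows = respondents, columns = scale items."""
        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_var / total_var)

    # Hypothetical 5-item attitude scale answered by 6 respondents
    scale = np.array([
        [4, 3, 4, 5, 4],
        [2, 2, 3, 2, 2],
        [5, 4, 5, 4, 5],
        [3, 3, 2, 3, 3],
        [4, 4, 4, 4, 5],
        [1, 2, 1, 2, 1],
    ])
    print(f"Cronbach's alpha: {cronbach_alpha(scale):.2f}")

    # Test-retest: the same measure administered on two occasions
    time1 = np.array([12, 8, 15, 10, 14, 6])
    time2 = np.array([11, 9, 14, 10, 15, 7])
    print(f"Test-retest r: {np.corrcoef(time1, time2)[0, 1]:.2f}")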

50. Measurement Issues: Examples • Validity • Self-reported rate of having a health professional check hemoglobin A1c among diabetics in an intervention program, compared with clinic records • Reliability • Test-retest data from the BRFSS on self-report of seeing a health care professional in the last year for diabetes among diabetics in Illinois
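
Both examples on this slide reduce to comparing two measurements of the same thing. A sketch with invented counts: validity of self-report against clinic records (treated as the gold standard), and test-retest agreement via Cohen's kappa.

    # Validity: self-reported A1c checks vs. clinic records (hypothetical 2x2)
    tp, fp = 72, 18   # self-report yes: record yes / record no
    fn, tn = 8, 52    # self-report no:  record yes / record no

    sensitivity = tp / (tp + fn)   # 0.90: true checks the self-report catches
    specificity = tn / (tn + fp)   # 0.74: correct "no" answers
    print(f"Sensitivity: {sensitivity:.2f}, Specificity: {specificity:.2f}")

    # Reliability: test-retest agreement on a yes/no item (Cohen's kappa)
    a, b, c, d = 60, 10, 8, 42     # yes/yes, yes/no, no/yes, no/no
    n = a + b + c + d
    p_obs = (a + d) / n
    p_exp = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2
    kappa = (p_obs - p_exp) / (1 - p_exp)
    print(f"Test-retest kappa: {kappa:.2f}")   # ~0.69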
