
Improving Local Evaluation through Training and Technical Assistance: Wisconsin’s Strategy


Presentation Transcript


  1. Improving Local Evaluation through Training and Technical Assistance: Wisconsin’s Strategy Mary D. Michaud, MPP Ellen Taylor-Powell, PhD Bonita Westover, MSPH University of Wisconsin-Extension

  2. Acknowledgements The authors wish to thank the staff and members of the following organizations, who make Wisconsin’s strategy for providing local program development and evaluation support possible: • Members of Wisconsin’s 77 Tobacco-Free Coalitions • Wisconsin Tobacco Control Board • State of WI Department of Public Health • UW-Comprehensive Cancer Center • UW-Center for Health Policy and Program Evaluation • UW-Center for Tobacco Research and Intervention • UW-Cooperative Extension

  3. What we will cover • Overview of our Wisconsin initiative • Typical questions we receive • Real examples of incorporating evaluation into coalition activities • Using a logic model to guide long-range planning as a prerequisite for useful evaluation

  4. Background Wisconsin Tobacco Control Board • Comprehensive program • 5-year goals • Commitment to evaluation • Monitoring and Evaluation Program (MEP)

  5. Local Program Evaluation GOAL: Build capacity in program development and evaluation, enabling local coalitions to effectively design, implement, and assess tobacco control programs • What are you doing? • What difference is it making for reducing tobacco use? • How do you know?

  6. How we define evaluation capacity Evaluation capacity is having the resources and ability to engage in evaluation that leads to learning, program improvement, and enhanced accountability. Prerequisites: committed leadership; resources, both technical and financial; an attitude that values evaluation.

  7. How we do this: Operating principles Empowerment/participatory approach • Community members can learn and use planning and evaluation concepts, techniques and findings to evaluate themselves and their programs to improve practice. • Coalitions conduct their own evaluation; our professional role is one of trainer, consultant, facilitator, coach. • Local advisory group of coalition members will help provide direction and feedback.

  8. Operating principles… Evaluation value • Important learning occurs during the process of ‘doing’ evaluation; it affects those involved and leads to more effective programs and enhanced outcomes. • Evaluation is more than measurement, findings, and external reporting. • Value lies in learning and continuous improvement.

  9. Operating principles… Practical approach • Approaches and methods will be used that are practical, innovative and appropriate for cultural and low-resource contexts. • Participatory adult education principles will be applied.

  10. Operating principles… Mixed approach • There are no “cookie cutter” approaches or answers • Heterogeneity of coalitions and local contexts demands mixed approaches and mixed methods. • Innovation and creativity are key

  11. Operating principles Research base • We will use research and best practices in program planning and evaluation. • We hold ourselves to the same standards of accountability in learning and use of evaluation. • We will apply the evaluation standards: utility, feasibility, propriety, accuracy

  12. Our logic model
  Inputs: UWEX staff; coalitions (facilitator, members); grant $$; research; local DH officer; Evaluation Advisory Group; partners (DPH, CTRI, WTCB, MEP, Smokefree)
  Activities: assess needs and assets; develop tobacco-specific planning and evaluation materials; provide training and technical assistance; facilitate cross-site sharing; work with partners to create an environment that understands and values evaluation
  Short-term outcomes: increased valuing of evaluation; increased involvement in planning and evaluation; increased knowledge and ability to collect and use data; increased confidence and motivation to engage in evaluation
  Medium-term outcomes: increased number of coalitions that demonstrate actions of effective planning and evaluation; increased resources committed to planning and evaluation; integration of evaluation into coalition operations; increased local evaluation capacity
  Long-term outcomes: more effective programs; improved outcomes (WTCB goals)

  13. Our structure and process Structure • Regional collaboration • Statewide coordinator • 5 regional specialists • Local advisory group Process • Training: small- and large-group, face-to-face, distance • TA: customized, individualized • Resource development and distribution • Partnering

  14. Regional Collaborative Model

  15. Wisconsin Regions

  16. Content of our T and TA • Demystify evaluation • Logic models • Long-range planning: Focus on WTCB goals; integrate evaluation • Stakeholder engagement • Writing outcomes (SMART objectives) • Evaluation planning: process and outcomes • Components of evaluation: Focus, data collection, analysis and interpretation, use

  17. www.uwex.edu/ces/pdande — the logic model framework
  Inputs (what we invest): staff; volunteers; time; money; research base; materials; equipment; technology; partners
  Outputs – Activities (what we do): conduct workshops, meetings; deliver services; develop products, curriculum, resources; train; provide counseling; assess; facilitate; partner; work with media
  Outputs – Participation (whom we reach): participants; clients; agencies; decision-makers; customers
  Outcomes/Impact – Short term (learning): awareness; knowledge; attitudes; skills; opinions; aspirations; motivations
  Outcomes/Impact – Medium term (action): behavior; practice; decision-making; policies; social action
  Outcomes/Impact – Long term (ultimate impact: conditions): social; economic; civic; environmental
  Underlying the model: assumptions and external factors

  18. www.uwex.edu/ces/tobaccoeval/

  19. Assessing strengths and barriers to evaluation capacity building • Attitudes • Involvement • Leadership • Resources • Knowledge-skills • Logic model • Planning • Focus • Data collection • Analysis and interpretation • Use

  20. Lessons learned/learning • Multiple partners with different contexts and cultures require plenty of time for relationship building • Great variation in coalitions: type, functioning, resources, history, interest, and abilities. We must start where each coalition is • Staffing: need for technical expertise, adult education, facilitation, relationship building, and political savvy • Budget uncertainties created an even greater need for constant communication and support

  21. Lessons… • Annual deliverables resulted in an “evaluation frenzy” • Need for long-range planning • What are tobacco “best practices”? • Not all coalitions need or can use evaluation T and TA; analysis may be contracted out • General T and TA provides foundation but need is for practical, “real” applications • And the learning continues…

  22. Typical Evaluation Technical Assistance Requests “There is no such thing as ‘typical’.”

  23. A Breadth of Examples of Evaluation Technical Assistance Requests

  24. Q: How should I evaluate the effects of TATU (Teens Against Tobacco Use) on middle school kids?

  25. A: “Let’s consider some possibilities…” • Work with High School kids to identify learning objectives. • Develop evaluation questions based upon the learning objectives. • Engage HS kids in developing a pre/post survey instrument and/or group interviews.
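If a pre/post survey is used, the analysis itself can stay simple. Below is a minimal sketch in Python, assuming paired pre and post knowledge scores for the same students; all values and names are hypothetical, not data from the program.

```python
# Hypothetical paired pre/post knowledge scores (same students, same order)
pre = [3, 4, 2, 5, 3, 4, 2, 3]
post = [5, 6, 4, 6, 4, 5, 3, 5]

# Per-student change, then simple summaries a coalition can report
changes = [after - before for before, after in zip(pre, post)]
mean_change = sum(changes) / len(changes)
improved = sum(1 for c in changes if c > 0)

print(f"Mean change in score: {mean_change:.1f}")
print(f"Students who improved: {improved} of {len(changes)}")
```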

  26. Q: Could you please review this survey and give me feedback for improvement?

  27. A: “Could we start first with… What is the purpose of the survey? What do you want to learn? Who will use the results, and for what? Are you sure a survey is the right method?”

  28. “Let’s cover some tips for improving surveys…” • Use agency letterhead to improve credibility. • Introduce the survey – Who are you? Why are you conducting the survey, and why does it matter? How will the information be used? • Keep it concise! • If possible, provide incentives for responding. • Make time for follow-ups.

  29. Improving Surveys… • Local program evaluation web site • http://www.uwex.edu/ces/tobaccoeval/ • Search on “surveys”. • Program Development & Evaluation web site • http://www.uwex.edu/ces/pdande • Go to “Evaluation Publications” • Go to “Quick Tips”

  30. Q: We have conducted so many surveys in the past year. Are there some other evaluation things I can be doing?

  31. A: “YES!” • Consider using qualitative methods to gain greater depth. • Not everything needs to be evaluated.

  32. Q (frantic): We have all these data and I don’t know what to do with them!

  33. A: “I’ll walk you through what to do.” • First, what did you want to know or learn when you collected this information? • What do you hope to learn from these data? • Are your data of good enough quality to merit analysis? Who and how many responded? What was the sample?

  34. I’ll walk you through what to do… • Are all the data together? In one place? Ready for analysis? • Code • Enter into a data management program • Clean • Run frequencies and percentages • Call me as you have more questions.
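As a concrete sketch of the “enter, clean, run frequencies and percentages” steps, the snippet below uses Python with pandas, assuming responses have already been coded and entered into a CSV file; the file name and column name are hypothetical.

```python
import pandas as pd

# Hypothetical file of coded survey responses, one row per respondent
df = pd.read_csv("coalition_survey.csv")

# Clean: drop respondents who skipped the item of interest
df = df.dropna(subset=["smoke_free_support"])

# Run frequencies and percentages for one coded item
counts = df["smoke_free_support"].value_counts()
percents = (counts / counts.sum() * 100).round(1)
print(pd.DataFrame({"n": counts, "percent": percents}))
```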

  35. Q: Excel is making me crazy! How can I…?

  36. A: Contact Jenny – she is our Excel guru and can walk you through specific issues with the program.
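For recurring Excel headaches, one alternative worth knowing about is pulling the worksheet into pandas and doing the tabulation there. A minimal sketch, assuming a workbook with one respondent per row; the file, sheet, and column names are hypothetical, and reading .xlsx files also requires the openpyxl package.

```python
import pandas as pd

# Hypothetical workbook exported from a coalition's data entry sheet
df = pd.read_excel("coalition_data.xlsx", sheet_name="responses")

# Cross-tabulate two coded items, shown as row percentages
table = pd.crosstab(df["age_group"], df["ever_smoked"], normalize="index") * 100
print(table.round(1))
```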
