

  1. CESSR Workshops in Methods Introduction to Program Evaluation September 24, 2010 Mindy Hightower King, Ph.D. Research Scientist Center for Evaluation & Education Policy Indiana University

  2. CEEP… • Promotes and supports rigorous program evaluation and education policy research primarily, but not exclusively, for educational, human services, and nonprofit organizations. • Takes a dynamic approach to evaluation and education policy research, using both quantitative and qualitative methodologies, including experimental designs. • Represents a merger of the Indiana Center for Evaluation and the Indiana Education Policy Center.

  3. CEEP’s Mission • Improve education by providing nonpartisan information, research, and evaluation on education issues to policymakers and other education stakeholders • Encourage rigorous program evaluation across a variety of settings by providing evaluation expertise and services to diverse agencies, organizations, and businesses • Expand knowledge of effective strategies in evaluation and policy research by developing, modeling, and disseminating innovative approaches to program evaluation and policy research

  4. Current Projects • CEEP researchers currently manage over 60 projects in the following areas: • Charter Schools • Professional Learning Communities • After School Programming • Literacy • Education Policy • Science, Technology, Engineering and Math • School Wellness • Public Health

  5. Presentation Overview • 1. What is Evaluation? • 2. Tools of the Trade • 3. A Few Words on Program Goals/Objectives

  6. What is Evaluation? • The use of social science research activities directed at collecting, analyzing, interpreting, and communicating information about the workings and effectiveness of programs. • Differentiated from social science research in that: • Interpretation and communication of results are essential, but less standardized • The engagement of stakeholders and the political nature of evaluation require additional skill sets

  7. Why Conduct Evaluation? • To assess the utility of new programs or initiatives; • To satisfy accountability requirements of program sponsors; • To increase the effectiveness of program management and administration; • To aid in decisions regarding whether programs should be continued, expanded, or curtailed.

  8. Who Commissions Evaluation? • Program funders • Program managers • Research organizations • Program designers • Any combination of the above

  9. Some Background on the Field of Evaluation The following factors have contributed to the rise and professionalization of the field: • Public health programs targeting infectious disease • Post-WWII boom of federally and privately funded social programs • The 1960s War on Poverty • More recently, the driving force behind evaluation has shifted from social science researchers to consumers of the research

  10. Challenges of Evaluation • Dynamic nature of social programs • Programs may change as they are implemented, and the evaluation will often need to change in response. • Scientific versus pragmatic concerns • The challenge is to select the most rigorous techniques the available resources allow while maintaining utility. • Diversity of perspectives and approaches • There are rarely absolutes in evaluation. Most often, it involves finding the approach that best fits the situation.

  11. Formative v. Summative Evaluation

  12. General Steps in Conducting a Program Evaluation (The Evaluation Center, 2001) • Evaluation Assessment • Involves determination of purpose, key questions, intended use, culture/environment, and research design. • Evaluation Study • Involves instrument design, data collection and analysis, report development and dissemination, and utilization.

  13. Evaluation Assessment 1. Who are the clients of the evaluation? 2. What are the questions and issues driving the evaluation? 3. What resources are available to do the evaluation? 4. What has been done previously? 5. What is the program all about?

  14. Evaluation Assessment - Continued 6. In what kind of environment does the program operate? 7. Which research designs are desirable and appropriate? 8. What information sources are available and appropriate, given the evaluation issues and environment, and the program structure? 9. Given all the issues raised in 1-8, which evaluation strategy is least problematic? 10. Should the program evaluation be undertaken?

  15. Evaluation Study • Develop the measures and collect the data • Analyze the data • Write the report • Disseminate the report/results • Make changes based on the data / utilize the results

  16. Quick & Dirty Evaluation Design • Who wants to know what and why? • What resources are available to do the evaluation? • What do I need to keep in mind about the context in which the program operates? • What information sources are available and appropriate? • Which evaluation strategy is most feasible and least problematic?

  17. Evaluation Tools and Strategies • Logic Models • Stakeholder Interest Matrices • Data Collection Plans

  18. What is a Logic Model? • A simplified picture of a program, initiative, or intervention. • Shows logical relationships among the resources that are invested, the activities that take place, and the benefits or changes that result. • (This is often called program theory or the program's theory of action) • It is a "plausible, sensible model of how a program is supposed to work" (Bickman, 1987).

  19. Logic Model Basics [Diagram: INPUTS (Program Investments: what is invested) → OUTPUTS (Activities: what is done; Participation: who is reached) → OUTCOMES (Short Term: learning; Intermediate: action/performance; Long Term (Impacts): conditions)] • Inputs - the resources invested that allow a program to achieve the desired outputs. • Outputs - activities conducted or products created that reach targeted participants or populations. Outputs lead to outcomes. • Outcomes - changes or benefits for individuals, families, groups, businesses, organizations, and communities.
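For readers who find structured notation easier to scan, here is a minimal sketch of the diagram above as a Python data structure. The LogicModel class and its field names are illustrative inventions for this transcript, not part of any standard evaluation toolkit.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """A program logic model: inputs feed outputs, which lead to outcomes."""
    inputs: list[str] = field(default_factory=list)         # what is invested
    activities: list[str] = field(default_factory=list)     # outputs: what is done
    participation: list[str] = field(default_factory=list)  # outputs: who is reached
    short_term: list[str] = field(default_factory=list)     # outcomes: learning
    intermediate: list[str] = field(default_factory=list)   # outcomes: action/performance
    long_term: list[str] = field(default_factory=list)      # outcomes: conditions (impacts)
```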

  20. Teaching American History Program • Funded by the U.S. Department of Education • Provides grants to schools and districts • Purpose of the program: increase student and teacher knowledge of American History • Program funds used to provide professional development to teachers, purchase materials, support collaborative efforts

  21. PRACTICE EXERCISE: Developing a Logic Model (Articulate the desired long-term outcomes and work backwards) [Diagram: the INPUTS → OUTPUTS → OUTCOMES template, filled in backwards: STEP 1 is Outcomes (Short Term, Intermediate, Long Term), STEP 2 is Outputs (Activities, Participation), STEP 3 is Inputs (Program Investments)]

  22. Teaching American History Logic Model • Inputs (Program Investments): Staff, Volunteers, Money, Time, Materials, Technology, Partners • Outputs (Activities): Teacher Professional Development, Peer Mentoring, Curriculum Development • Outputs (Participation): # of teachers who attend workshops; # of students impacted by trained teachers • Outcomes (Short Term): Increased Teacher Knowledge in American History • Outcomes (Long Term): Increased Student Achievement in American History
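Continuing the illustrative LogicModel sketch from slide 19, this slide can be written as data (the field values are copied from the slide, which lists no intermediate outcomes):

```python
tah = LogicModel(
    inputs=["Staff", "Volunteers", "Money", "Time",
            "Materials", "Technology", "Partners"],
    activities=["Teacher Professional Development", "Peer Mentoring",
                "Curriculum Development"],
    participation=["# of teachers who attend workshops",
                   "# of students impacted by trained teachers"],
    short_term=["Increased Teacher Knowledge in American History"],
    long_term=["Increased Student Achievement in American History"],
)
```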

  23. Logic Models: Easy as pie…or cookies?

  24. Stakeholder Interest Matrix • Used to identify individuals/groups who may be interested and/or involved in an evaluation. • Clarifies stakeholder interests in evaluation results, potential concerns and/or roadblocks, and opportunities to increase buy-in. • Particularly useful for participatory evaluations. • May also help in identifying potential evaluation resources.
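The example matrices on the next two slides appear only as images in the original deck. As a hypothetical stand-in, a matrix of the kind these bullets describe might look like the following; every stakeholder, column heading, and cell value here is invented for illustration.

```python
# Hypothetical stakeholder interest matrix: one row per stakeholder,
# recording their interest in results, likely concerns, and buy-in levers.
stakeholder_matrix = [
    {
        "stakeholder": "Program funder",
        "interest_in_results": "Evidence of impact to justify continued funding",
        "potential_concerns": "Unfavorable findings made public",
        "buy_in_opportunities": "Invite review of draft evaluation questions",
    },
    {
        "stakeholder": "Program staff",
        "interest_in_results": "Feedback for improving day-to-day practice",
        "potential_concerns": "Evaluation activities add to workload",
        "buy_in_opportunities": "Involve staff in piloting instruments",
    },
]
```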

  25. Stakeholder Interest Matrix

  26. Stakeholder Interest Matrix

  27. Data Collection Plans • Used to summarize evaluation methodology in grant proposals or for stakeholders. • Illustrates the connection between project goals/objectives and data collection strategies. • Particularly useful when space is limited in proposals/applications or when evaluation strategies are multi-dimensional.

  28. Sample Data Collection Plan
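The sample plan on this slide is likewise an image in the original deck. As a hypothetical reconstruction of its shape, using the Teaching American History example from slides 20-22, a plan might pair each objective with its evidence; all cell values below are invented for illustration.

```python
# Hypothetical data collection plan: each row ties a project objective
# to the data source, responsible party, and timing that will evidence it.
data_collection_plan = [
    {
        "objective": "Increase teacher knowledge of American History",
        "data_source": "Pre/post teacher content assessments",
        "collected_by": "External evaluator",
        "timing": "Fall and spring of each project year",
    },
    {
        "objective": "Increase student achievement in American History",
        "data_source": "State history assessment scores",
        "collected_by": "District data office",
        "timing": "Annually, after spring testing",
    },
]
```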

  29. A few words on goals and objectives…

  30. Goals – Objectives – Performance Measures PROGRAM GOAL • Project Objectives: What your project is doing to support the overall program goal. • Performance Measures: How you measure your progress toward meeting your objectives.

  31. PRACTICE EXERCISE: Developing a Logic Model (Articulate the desired long-term outcomes and work backwards) [Diagram: the same INPUTS → OUTPUTS → OUTCOMES template as slide 21, with STEP 3, STEP 2, and STEP 1 under the respective columns and a second row aligning Program Objectives, Process Measures, and Outcome Measures beneath them]

  32. Performance Measures • A performance measure is a measurable indicator used to determine how well objectives are being met. • How will you assess progress? • How much progress will constitute success? • How will you know if your objective or part of your objective has been achieved?

  33. Relevance of Performance Measures [Diagram: Objective 1 branching into Performance Measure 1a, Performance Measure 1b, and Performance Measure 1c]

  34. Components of Performance Measures • What will change (or happen)? • How much change is expected? (What is the expected quantity?) • Who will achieve the change (or who will the events involve)? • When will the change take place (or happen)?
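A minimal sketch of these four components as structured data, assuming Python; the PerformanceMeasure class, its field names, and the statement() helper are illustrative only. The sample values restate the example from slide 36.

```python
from dataclasses import dataclass

@dataclass
class PerformanceMeasure:
    what: str      # what will change (or happen)
    how_much: str  # how much change is expected
    who: str       # who will achieve the change (or be involved)
    when: str      # when the change will take place

    def statement(self) -> str:
        """Assemble the four components into one measurable statement."""
        return f"{self.how_much} {self.who} will {self.what} {self.when}."

pm = PerformanceMeasure(
    what="attend the Fiscal Review Workshop",
    how_much="100% of",
    who="charter school leaders and CFOs",
    when="during years one and two of their grant period",
)
print(pm.statement())
# 100% of charter school leaders and CFOs will attend the Fiscal Review
# Workshop during years one and two of their grant period.
```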

  35. Performance Measure Examples Five (how much) charter schools will be developed in geographic areas with a concentration of high-priority schools (as defined by state standards) (who/what) throughout the state each year between 2010 and 2012 (when).

  36. Performance Measure Examples 100% of charter school leaders and CFOs (expected quantity) will attend the Fiscal Review Workshop (what will happen/who will be involved) during years one and two of their grant period (when will it happen).

  37. Objectives/Performance Measures • Objective: • To encourage dissemination of best practices within charter schools to the broader public. • Performance Measures: • On an annual basis, 100% of charter schools will submit their best practices to the SEA for inclusion in a catalogue of innovative methods. • During each year of the grant, at least two venues/partner organizations will disseminate collected charter school data. • Follow-up surveys of those attending partner organization training events will show that at least 75% of those attending dissemination workshops will implement new practices based on charter school innovations.

  38. For more information… • Mindy Hightower King, Ph.D. • Center for Evaluation and Education Policy • 1900 E. Tenth Street, Room 918 • Bloomington, Indiana 47401 • 812-855-4438 • minking@indiana.edu

  39. CESSR Workshops in Methods Introduction to Program Evaluation September 24, 2010 Mindy Hightower King, Ph.D. Research Scientist Center for Evaluation & Education Policy Indiana University
