
How’s it Working? Evaluating Your Program


Presentation Transcript


  1. How’s it Working? Evaluating Your Program. MAAPS Conference, 7 May 2010. Debra Smith & Judah Leblang, Program Evaluation & Research Group, School of Education, Lesley University

  2. PERG • Founded 1976 • Over 600 program evaluation and research studies in various educational settings • Also offers professional development and consultation

  3. Session participants will: • Be introduced to the basics of program evaluation through an example • Define a question or questions about their own program • Identify methods for collecting data that would help to answer their question/s • Discuss next steps

  4. What is program evaluation? • A type of applied research focused on systematically collecting and analyzing data to help answer questions about a program, or some aspect of a program, in order to make decisions about it.

  5. Purposes • Accountability • Program development • Generating knowledge

  6. Formative vs. Summative • Formative evaluation offers feedback along the way to improve a program. • Summative evaluation “sums up” the results of a program at the end of a period of development or implementation.

  7. Audiences • Funders • Program leaders • Program participants • Organizational partners • Others

  8. Evaluation process (a cycle around the PROGRAM): 1. Goals/logic model → 2. Questions → 3. Evaluation plan → 4. Data collection → 5. Data analysis → 6. Reporting

  9. An example: Evolutions (EVO) • After-school program begun in 2005, connected with the Peabody Museum of Natural History at Yale University; initially involved approximately 40 low-SES/minority students

  10. Evolutions program goals. To provide opportunities for students to: • Prepare for post-secondary (college) education; • Learn about scientific and other careers; • Expand their knowledge of and interest in science (science literacy); • Develop transferable skills for the future; and • Learn about the Peabody Museum and museum careers.

  11. Logic models • Map a coherent chain of connections between goals, resources, activities and what you expect (short term), want (over an intermediate period) and hope (in the long term) to happen. • They also reflect your assumptions and theory of action or change.

  12. Logic Model Key Concepts

  13. An EVO example

  14. Logic models may look different, but typically include: Goal • Rationale • Assumptions • Resources • Activities • Outputs • Short-term outcomes • Mid-term outcomes • Long-term outcomes
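
To make these components concrete, below is a minimal sketch of how a logic model like EVO’s might be laid out as structured data. The specific entries are illustrative assumptions drawn loosely from the program goals above, not the actual EVO logic model.

```python
# Minimal sketch of a logic model as structured data.
# Entries are hypothetical illustrations, not the actual EVO logic model.
evo_logic_model = {
    "goal": "Expand students' interest in science and preparation for college",
    "assumptions": ["Sustained after-school engagement builds science identity"],
    "resources": ["Peabody Museum exhibits and staff", "Program mentors"],
    "activities": ["After-school sessions", "Exhibit development", "College preparation"],
    "outputs": ["Sessions held", "Student-developed exhibit", "College applications started"],
    "short_term_outcomes": ["Increased interest and confidence in doing science"],
    "mid_term_outcomes": ["Awareness of science careers and the college process"],
    "long_term_outcomes": ["College enrollment", "Science literacy"],
}

# Walk the chain from goal to long-term outcomes.
for component, entries in evo_logic_model.items():
    print(component, "->", entries)
```

Laying the model out this way makes the chain from resources to long-term outcomes easy to review with stakeholders before drafting your own (next slide).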

  15. Develop a logic model for your own program/ project

  16. Evaluation process (a cycle around the PROGRAM): 1. Goals/logic model → 2. Questions → 3. Evaluation plan → 4. Data collection → 5. Data analysis → 6. Reporting

  17. Questions: Think Goldilocks • Specific but not too detailed • Important but not too broad in scope

  18. Key Questions: Part One • How does EVO prepare students for college or high school? • How are EVO students involved in developing an exhibit at the museum? • Do students develop increased “science literacy,” as defined by EVO staff?

  19. Key Questions: Part Two • How (if at all) do students express more confidence about and interest in doing science? • Are students more aware of careers in science? • How (if at all) do students demonstrate increased knowledge of the college application process, and develop criteria for choosing a college that meets their needs?

  20. What questions do you want to answer about your program?

  21. Evaluation process (a cycle around the PROGRAM): 1. Goals/logic model → 2. Questions → 3. Evaluation plan → 4. Data collection → 5. Data analysis → 6. Reporting

  22. Data collection methods • Observation • Interviews/ focus groups • Surveys • Document/artifact review

  23. PERG Evaluation Matrix

  24. Technical considerations: Validity • Will the data answer the questions? • Are we asking the right questions?

  25. Triangulation • Is there adequate triangulation (use of multiple methods and/or data sources) to ensure validity?

  26. Drafting your own matrix: What data will help you answer your questions?
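
As a starting point for drafting your own matrix, here is a minimal sketch of one common layout: evaluation questions as rows, with the data sources and methods expected to provide evidence for each. The layout and entries are illustrative assumptions, not a reproduction of the PERG matrix shown on slide 23.

```python
# Hypothetical evaluation matrix: each row links a question to the methods
# and sources expected to provide evidence for it. Entries are illustrative only.
evaluation_matrix = [
    {
        "question": "Do students express more confidence about and interest in doing science?",
        "observation": "Program session and exhibit-work observations",
        "interviews_focus_groups": "Student focus groups; staff interviews",
        "surveys": "Pre/post survey items on confidence and interest",
        "documents_artifacts": "Student journals and exhibit materials",
    },
    {
        "question": "Are students more aware of careers in science?",
        "observation": "Career-related sessions",
        "interviews_focus_groups": "Student focus groups",
        "surveys": "Career-awareness survey items",
        "documents_artifacts": "Informational-interview write-ups",
    },
]

# Print a simple text version of the matrix for review.
for row in evaluation_matrix:
    print(row["question"])
    for method in ("observation", "interviews_focus_groups", "surveys", "documents_artifacts"):
        print(f"  {method}: {row[method]}")
```

Filling in one row per question also makes gaps in triangulation visible: a question supported by only one method or source is a flag.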

  27. Evaluation process (a cycle around the PROGRAM): 1. Goals/logic model → 2. Questions → 3. Evaluation plan → 4. Data collection → 5. Data analysis → 6. Reporting

  28. Collecting data • Make sure your plan is doable given time and resources available. • Design instruments to focus your data collection, ensure consistency and avoid bias. • Be organized: take notes, develop a system for tracking/ filing your data.

  29. Collecting data • Communicate clearly about what you are doing, why, and how the findings will be shared and used. • Be mindful of human subjects protections. Does your organization have an institutional review board (IRB)?

  30. The First Year: site visit • On-site data collection • Focus groups with students • Interviews with director, project staff • Observation of end of year event • Parent interviews

  31. Evaluation process (a cycle around the PROGRAM): 1. Goals/logic model → 2. Questions → 3. Evaluation plan → 4. Data collection → 5. Data analysis → 6. Reporting

  32. Analyzing data • What stands out? • What are the patterns? • What are the similarities? • What are the differences? • Is more information needed?
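
For modest data sets such as coded focus-group notes, pattern-finding can begin with simple tallies of how often each code appears. The codes and comments below are hypothetical, just to show the mechanics.

```python
from collections import Counter

# Hypothetical codes assigned to student focus-group comments; illustrative only.
coded_comments = [
    "confidence_in_science", "interest_in_science_careers", "confidence_in_science",
    "college_awareness", "confidence_in_science", "interest_in_science_careers",
]

# Tally the codes to see what stands out and where the patterns are.
tallies = Counter(coded_comments)
for code, count in tallies.most_common():
    print(f"{code}: {count}")
```

Counts like these only suggest patterns; comparing them across methods and sources (interviews vs. surveys vs. observations) is what supports the reliability and validity checks on the next slides.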

  33. Reliability • Are the patterns in the data, or judgments about the data, consistent?

  34. Validity, again • Is the data helping you answer the questions? • Is the data credible?

  35. Evaluation process (a cycle around the PROGRAM): 1. Goals/logic model → 2. Questions → 3. Evaluation plan → 4. Data collection → 5. Data analysis → 6. Reporting

  36. Reporting • Consider purpose and audience/s • Report relevant findings, questions, and recommendations • Engage stakeholders in discussion • Use findings to inform next steps

  37. Results of the first-year evaluation • The impact of the evaluation on EVO: a more focused program, clearer objectives, and suggestions for sustainability. • Evidence of program success: retention, student engagement, and positive changes in students’ views of doing science and of scientists.

  38. The ongoing evaluation: shaping the program • Implementation of evaluator suggestions, for example: informational interviewing, developing a smaller exhibit, and refining requirements for students

  39. EVO: 2006 to today • Continued development and expansion of the program from approximately 40 to more than 80 students, and introduction of internships and Sci Corps. • Different areas of science focus (environmental awareness, geoscience), depending on funding sources.

  40. Evaluation resources • W.K. Kellogg Foundation Evaluation Handbook www.wkkf.org/Pubs/Tools/Evaluation/Pub770.pdf • Kellogg Logic Model Development Guide www.wkkf.org/Pubs/Tools/Evaluation/Pub3669.pdf • Basic Guide to Program Evaluation www.managementhelp.org/evaluatn/fnl_eval.htm

  41. Evaluation resources • Program Evaluation & Research Group Lesley University 29 Everett St. Cambridge, MA 02138 www.lesley.edu/perg.htm 617-349-8172 perg@lesley.edu
