
Evaluation 101






Presentation Transcript


  1. Evaluation 101 NETA Education Center Webinar January 22, 2015 3:00pm EST

  2. Charles Gasper Nine Network Janice Fuld WNET Michelle Kosmicki NET Jon Rubin WNET John Chambers NETA

  3. Impact, Outcomes & Evaluation Charles Gasper Director of Evaluation

  4. What are We Going to do Today? Discussion about project design Discussion about evaluation Nine’s program theory-based methodology See how evaluation can be implemented Not try to take over the world!

  5. What is a Project? • Impact/Outcomes • Activities/Outputs • Inputs/Resources • Context or Environment

  6. What is Impact?

  7. What are Outcomes?

  8. Why Impact & Outcomes? • Pressure • Funders • Individuals • Government • Partners

  9. What is a Logic Model?

  10. What’s In a Logic Model? • Assumptions • Inputs/Resources • Activities • Outputs • Outcomes (Short, Medium, Long-Term) • Context or Environment

  11. Basic Form

  12. Theory-Based Evaluation: It’s About the Journey & Destination

  13. Theory Driven Program Planning/Evaluation • Connecting Dots to Understand Programs

  14. Why Theory Driven Program Planning/Evaluation? • Consistent framework • Forces consideration • Conversation and discovery • Avoid pitfalls • Clarity of focus

  15. How it Works: Connect the Dots to the Outcome

  16. What Does It Look Like? Community Cinema

  17. How To Do It? Discuss: • Why • Outcomes • Activities

  18. Let’s Talk about Why

  19. Associated Outcomes • What are the Goals? • What are we trying to accomplish? • How do the Outcomes link together?

  20. Make the Linkages

  21. Activities

  22. Make the Linkages

  23. Drive the Golden Spike(s) Connect the Activities to Outcomes Everything Must Link! • A to A • O to O • A to O
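The "everything must link" rule can be sketched as a small connectivity check: treat each box in the logic model as a node, each linkage as a directed edge, and flag any box that appears in no link. This is an illustrative sketch only; the activity and outcome names are hypothetical, not from the webinar.

```python
# Illustrative sketch: a logic model as a tiny directed graph,
# checking the "everything must link" rule (A to A, O to O, A to O).
# All box names below are hypothetical examples.

links = {
    ("Host screening", "Facilitate discussion"),       # A to A
    ("Facilitate discussion", "Increased awareness"),  # A to O
    ("Increased awareness", "Community action"),       # O to O
}

activities = {"Host screening", "Facilitate discussion"}
outcomes = {"Increased awareness", "Community action"}


def unlinked(boxes, links):
    """Return the boxes that appear in no link at all."""
    connected = {node for pair in links for node in pair}
    return boxes - connected


orphans = unlinked(activities | outcomes, links)
print("Unlinked boxes:", orphans or "none -- everything links!")
```

Running a check like this during planning surfaces "orphan" activities (work that feeds no outcome) and unsupported outcomes (goals no activity drives) before the evaluation design is locked in.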

  24. Assumptions

  25. Context • Influence on activities/outcomes • Stakeholders to consider

  26. So You Have All These Boxes, Now What? • Definitions • Measurements • Evaluation Plan (Timeline & Integration) • Analysis Plan

  27. Definitions • What do the boxes mean? • Helpful questions: • What does “success” look like? • What is an example of this working well? • What is an example of this failing? • Why is this important to your project?

  28. Group Time - Definitions

  29. Measurement • How do we choose the right measure? • What are the measurement options? • Key questions: • What is your definition? • How does this relate to the rest of the model? • What do you want to know? • What do you want to say?

  30. Choosing the Right Measure • There is no one right measure! • There is no one right way to measure! • Key things to think about: (Credible Evidence) • Importance of the box in the model • Minimum level of rigor (benefit) • Maximum level of effort (cost/discomfort) • Who the information will be reported to

  31. Measurement Options • How? • Survey • Interview • Data collected for other reasons • Accounting • Observation • Test scores • Video • Audio • Quantitative • Qualitative • When? • Retrospective • Prospective • Pre/Post Test

  32. Which Boxes to Measure? • What are the Most Important boxes? • What is the Time Line of the project? • What do we Need to Know?

  33. Questions? Charles Gasper Director of Evaluation Nine Network of Public Media Email: cgasper@ketc.org Phone: 314-512-9010 Twitter: karcsig Blog: evaluationevangelist.blogspot.com

  34. Janice Fuld Associate Director, Education

  35. What we do… • online resources • educational broadcasts • national community engagement initiatives • multimedia materials • professional development • live events • local and national work

  36. Who we do it for…

  37. So what?????

  38. We’ve got lots of questions.... • Why are some resources more popular than others? • What happens after teachers leave our PD sessions? • How are our resources being used? • What conversations are our events sparking? • Are we giving educators what they need? • What is happening in the classrooms? • How are parents using our materials at home? • Are students more engaged?

  39. We want answers.... • Evaluation and Impact Strategic Plan • Multipronged Approach • Surveys • Focus groups • Observations • Metrics • In-depth studies • Qualitative and quantitative measures

  40. Formative and Summative Research • Formative Research - Helps answer questions while “forming”/improving something. • Summative Research - Looks at the results. What happened? How effective was the intervention? Did we achieve our goals?

  41. Next Steps

  42. Questions? Janice Fuld Associate Director, Education WNET New York Public Media Email: fuldj@wnet.org Phone: 212-560-2088

  43. Michelle Kosmicki Research Manager

  44. Design Thinking

  45. Three Evaluation Tips Tip #1 Get the evaluator in the room ASAP!

  46. Three Evaluation Tips Tip #2 Keep the evaluator in the loop.

  47. Three Evaluation Tips Tip #3 Ask Questions. Lots of questions.

  48. Tales of Evaluation This is a tale of two evaluations. It is not for the faint of heart, but for those courageous enough to Evaluate.
