
Begin at the Beginning: Introduction to Evaluation


Presentation Transcript


  1. Begin at the Beginning: Introduction to Evaluation

  2. It all depends… • Educational evaluation methods differ depending upon… • The mission of the program • The stakeholders • Money available to perform the evaluation • The purpose of the evaluation • The target audience for the report

  3. Effective Evaluations Consider: • Who is served • Target Population • What services are provided • The treatment / The program / The intervention • Who has an interest in the success of the program • Stakeholders

  4. Effective Evaluations Consider: • The purpose of the program • Mission / Goals / Measurable Objectives • How services are typically delivered • Treatment Implementation Fidelity • Service Delivery Cycle

  5. Effective Evaluations Consider: • Why the evaluation is being conducted • Purpose of Evaluation • The target audience for the evaluation • Target Audience • Final Report

  6. The Essence of Evaluation • Determining the worth or value of a program • Conducting a context-specific interpretation of what is happening with a program in a real-world setting. • Making causal attributions about effects.

  7. Thinking and Doing… • There are many evaluation models. • Independent of the model, evaluation is an intervention in itself. • Evaluators have an impact on the program.

  8. The Impact of Evaluation • Help define the purpose of the evaluation and the target audience of stakeholders for the results. • Conduct process evaluations that document delivery of the program as well as implementation fidelity.

  9. The Impact of Evaluation • Help programs recognize the usefulness of evaluation as a source of feedback and guidance for program improvement and development purposes, including making programs aware of national quality standards. • Help identify the stage of development of a program / organizational maturity.

  10. The Impact of Evaluation • Help programs recognize the role of evaluation in measuring program impact – and selling program impact. • Propose reasonable methods that fit the purpose and target audience. Examples could include surveys, observational measures, analysis of test scores (see the sketch after this slide), focus groups, etc.
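As one concrete illustration of "analysis of test scores," here is a minimal sketch in Python, assuming hypothetical pre/post scores for the same students; the data, the variable names, and the choice of a paired t-test are illustrative assumptions, not part of the slides.

```python
# A hedged sketch of "analysis of test scores": a paired t-test on
# hypothetical pre/post scores for the same students (made-up data).
from scipy import stats

pre_scores = [62, 70, 55, 68, 74, 60, 65, 71]   # before the program
post_scores = [68, 75, 61, 70, 80, 66, 70, 74]  # after the program

# Did scores change significantly after the intervention?
t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```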

  11. The Impact of Evaluation • Propose reasonable use of comparative strategies where appropriate, such as control groups, multiple measures over time, comparison conditions, etc. • Randomization (see the sketch after this slide). • Help programs recognize the importance of setting specific objectives by which the program can be evaluated.
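To make "randomization" concrete, the sketch below shows one simple way to randomly assign a hypothetical participant roster to treatment and control groups; the IDs, the fixed seed, and the even split are assumptions for illustration only.

```python
import random

# Hypothetical participant IDs; the roster and seed are illustrative only.
participants = [f"p{i:02d}" for i in range(1, 9)]

random.seed(42)  # fixed seed so the assignment is reproducible
random.shuffle(participants)

# Split the shuffled roster in half: treatment vs. control.
half = len(participants) // 2
treatment, control = participants[:half], participants[half:]
print("treatment:", sorted(treatment))
print("control:  ", sorted(control))
```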

  12. The Impact of Evaluation • Mission – Goals – Measurable Objectives • Help programs outline how specific indicators can be tied to each objective (a sketch follows this slide). • Help programs understand how to fully specify the desired outcomes in terms of how they would be measured.
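One way to read the Mission – Goals – Measurable Objectives chain is as a simple mapping from each objective to its indicators and a measurable target. The sketch below, with entirely invented objectives, indicators, and targets, illustrates the structure the slide describes.

```python
# Hypothetical objectives, indicators, and targets, invented for illustration.
objectives = [
    {
        "objective": "Raise 4th-grade reading proficiency",
        "indicators": ["state reading test", "classroom fluency probe"],
        "target": "75% of students at or above proficient by year end",
    },
    {
        "objective": "Improve daily attendance",
        "indicators": ["average daily attendance rate"],
        "target": "95% average attendance or higher",
    },
]

# Each objective is tied to concrete, measurable indicators and a target.
for item in objectives:
    print(f"{item['objective']}: {', '.join(item['indicators'])} "
          f"-> {item['target']}")
```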

  13. The Impact of Evaluation • Help programs outline realistic indicators that are closely tied to the actual program or intervention, rather than overly lofty or unrealistic expectations of broad program impact. • The "world peace" issue: no single program can plausibly claim outcomes that broad.

  14. The Impact of Evaluation • Address reliability, validity, and cultural sensitivity of the outcome measures in the context of the specific target population (a reliability sketch follows this slide). • Help programs understand the use of multiple data sources and indicators, ideally more than just state test scores.
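As one hedged example of checking reliability, the sketch below computes Cronbach's alpha (an internal-consistency coefficient) for a hypothetical survey scale. The response matrix is made up, and alpha is only one of several reliability checks an evaluator might run.

```python
import numpy as np

# Hypothetical survey responses: rows = respondents, columns = scale items.
scores = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
])

k = scores.shape[1]                         # number of items
item_vars = scores.var(axis=0, ddof=1)      # per-item sample variance
total_var = scores.sum(axis=1).var(ddof=1)  # variance of respondents' totals

# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total variance)
alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")    # closer to 1 = more consistent
```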
