
It all depends…
Educational evaluation methods differ depending on:
• The mission of the program
• The stakeholders
• The money available to perform the evaluation
• The purpose of the evaluation
• The target audience for the report
Effective Evaluations Consider:
• Who is served: the target population
• What services are provided: the treatment, program, or intervention
• Who has an interest in the success of the program: the stakeholders
Effective Evaluations Consider:
• The purpose of the program: mission, goals, and measurable objectives
• How services are typically delivered: treatment implementation fidelity and the service delivery cycle
Effective Evaluations Consider:
• Why the evaluation is being conducted: the purpose of the evaluation
• The target audience for the evaluation and for the final report
The Essence of Evaluation
• Determining the worth or value of a program.
• Interpreting, in context, what is happening with a program in a real-world setting.
• Making causal attributions about program effects.
Thinking and Doing…
• There are many evaluation models.
• Regardless of the model, evaluation is an intervention in itself.
• Evaluators have an impact on the program they evaluate.
The Impact of Evaluation
• Help define the purpose of the evaluation and the stakeholder audience for the results.
• Conduct process evaluations that document program delivery and implementation fidelity.
The Impact of Evaluation
• Help programs recognize evaluation as a source of feedback and guidance for program improvement and development, including awareness of national quality standards.
• Help identify a program's stage of development and organizational maturity.
The Impact of Evaluation
• Help programs recognize the role of evaluation in measuring program impact and in making the case for that impact.
• Propose reasonable methods that fit the purpose and target audience, such as surveys, observational measures, analysis of test scores, and focus groups.
The Impact of Evaluation
• Propose reasonable comparative strategies where appropriate, such as control groups, comparison conditions, multiple measures over time, and randomization (a minimal assignment sketch follows this slide).
• Help programs recognize the importance of setting specific objectives against which the program can be evaluated.
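To make the randomization point concrete, here is a minimal sketch in Python of simple random assignment to treatment and control groups. The participant IDs, the fixed seed, and the even two-group split are illustrative assumptions; a real evaluation might instead require stratified or blocked assignment (for example, by school or grade).

```python
import random

def assign_conditions(participant_ids, seed=42):
    """Randomly assign participants to treatment or control groups.

    A minimal sketch: simple randomization only. Stratification or
    blocking would be needed if groups must be balanced on key factors.
    """
    rng = random.Random(seed)      # fixed seed so the assignment is reproducible
    ids = list(participant_ids)
    rng.shuffle(ids)               # shuffle, then split the list in half
    midpoint = len(ids) // 2
    return {"treatment": ids[:midpoint], "control": ids[midpoint:]}

# Hypothetical example with six participants
groups = assign_conditions(["P01", "P02", "P03", "P04", "P05", "P06"])
print(groups)
```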
The Impact of Evaluation
• Move from mission to goals to measurable objectives.
• Help programs outline how specific indicators can be tied to each objective (a hypothetical mapping is sketched after this slide).
• Help programs understand how to fully specify desired outcomes in terms of how they would be measured.
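As a hypothetical illustration of tying indicators to objectives, the sketch below uses a plain Python dictionary to link each measurable objective to concrete indicators and the data sources that would capture them. The objectives, indicators, and data sources shown are invented examples, not recommendations.

```python
# Hypothetical mapping: each measurable objective is tied to one or more
# concrete indicators and the data source that would capture each indicator.
objectives = {
    "Increase student reading proficiency": [
        {"indicator": "Mean scale score on district reading assessment",
         "data_source": "fall/spring benchmark tests"},
        {"indicator": "Percent of students reading at grade level",
         "data_source": "teacher ratings"},
    ],
    "Improve program attendance": [
        {"indicator": "Average sessions attended per participant",
         "data_source": "attendance logs"},
    ],
}

# Print the objective-to-indicator crosswalk
for objective, indicators in objectives.items():
    print(objective)
    for item in indicators:
        print(f"  - {item['indicator']} ({item['data_source']})")
```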
The Impact of Evaluation
• Help programs outline realistic indicators that are closely tied to the actual program or intervention, rather than overly lofty or unrealistic expectations of broad program impact.
• Avoid the "world peace" problem: objectives so broad that no single program could plausibly achieve them.
The Impact of Evaluation
• Address the reliability, validity, and cultural sensitivity of outcome measures in the context of the specific target population (a reliability sketch follows this slide).
• Help programs understand the use of multiple data sources and indicators, ideally more than just state test scores.
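One common way to check the reliability of a multi-item outcome measure is Cronbach's alpha. The sketch below is a minimal, hypothetical example using only the Python standard library; the item scores are invented, and a real analysis would also examine validity evidence and cultural appropriateness for the target population.

```python
import statistics

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a list of items, each a list of respondent scores.

    A back-of-the-envelope internal-consistency check; it does not address
    validity or cultural sensitivity, which need separate evidence.
    """
    k = len(item_scores)
    item_variances = [statistics.variance(item) for item in item_scores]
    totals = [sum(scores) for scores in zip(*item_scores)]  # total score per respondent
    total_variance = statistics.variance(totals)
    # alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)
    return (k / (k - 1)) * (1 - sum(item_variances) / total_variance)

# Hypothetical 3-item survey answered by 5 respondents
items = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 4, 2, 4, 3],
]
print(round(cronbach_alpha(items), 2))
```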