Ten Steps to Designing an Evaluation for Your Educational Program Linda Perkowski, Ph.D. University of Minnesota Medical School
What is Program Evaluation? • Systematic collection of information about a broad range of topics for use by specific people for a variety of purposes (Patton, 1986)
Definitions • Evaluation: program • Assessment: individual • Formative Evaluation: to improve • Summative Evaluation: to prove • Outcomes Research: patient care
Purposes of Program Evaluation • To improve program • To determine next steps/make decisions • Help decide to replace, develop further, eliminate, accredit • To determine effectiveness • To document success • To measure outcomes
For curricular purposes, evaluation helps • Ensure teaching is meeting learners’ needs • Identify where teaching can be improved • Inform the allocation of resources • Provide support to faculty and learners • Diagnose and document program strengths and weaknesses • Articulate what is valued by the institution • Determine that educational objectives are met Adapted from Morrison (2003)
Influences on the evaluation • External • Accrediting agencies • Public • Funding priorities • Internal • Who needs what answers? • Who gets to pose the questions? • How will the answers be made known?
Barriers to Program Evaluation • Tension between implementing and evaluating • Lack of skills in conducting applied social science research • Paucity of funding, time, and publication outlets • Failure to recognize evaluation as scholarship and its place in the literature (Wilkerson, 2000)
What is the biggest barrier for you or your institution to collect and analyze program evaluation data? • Tension between getting a program implemented and evaluating it • Lack of skills • Paucity of funding or time • Limited outlets to present or publish findings
Many Models • Goal Oriented/Objective-Based (Tyler) • Goals-free Evaluation (Scriven) • Judicial/Adversary Evaluation • CIPP (Stufflebeam) • Kirkpatrick’s 4-level model • Situated Evaluation • Connoisseurship Evaluation (Eisner) • Utilization-Oriented Evaluation (Patton) • Logic Model
Kirkpatrick’s Four Levels of Outcomes • Reaction: satisfaction • Learning: advances in knowledge, skills, and attitudes • Behavior: skills used in the everyday environment of the learner • Results (bottom line): effect on participants’ learners, effect on participants’ careers, institutional improvements
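To make the mapping concrete, here is a minimal Python sketch pairing each Kirkpatrick level with the kinds of measures named above. The level names are Kirkpatrick's; the example instruments are illustrative assumptions, not prescriptions.

```python
# A minimal sketch: Kirkpatrick's four levels mapped to example measures.
# The example instruments are illustrative, not prescribed by the model.
KIRKPATRICK_LEVELS = {
    1: ("Reaction", ["end-of-session satisfaction survey"]),
    2: ("Learning", ["pre/post knowledge test", "attitude questionnaire"]),
    3: ("Behavior", ["direct observation in the learner's everyday environment"]),
    4: ("Results", ["effects on participants' learners and careers",
                    "institutional improvements"]),
}

for level, (name, measures) in KIRKPATRICK_LEVELS.items():
    print(f"Level {level} ({name}): {'; '.join(measures)}")
```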
Overview of 10 Program Evaluation Steps (Workplan) • Step 1: Identify Users • Step 2: Identify Uses • Step 3: Identify Resources • Step 4: Identify Evaluation Questions/Objectives • Step 5: Choose Evaluation Design • Step 6: Choose Measurement Methods and Construct Instruments • Step 7: Address Ethical Concerns • Step 8: Collect Data • Step 9: Analyze Data • Step 10: Report Results
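One way to keep the workplan actionable is to treat the ten steps as a checklist. The sketch below encodes that idea; the `Workplan` class and its methods are hypothetical conveniences, not part of any standard evaluation toolkit.

```python
# A minimal sketch of the ten-step workplan as a trackable checklist.
# Step names come from the slide above; the tracking logic is illustrative.
from dataclasses import dataclass, field

STEPS = [
    "Identify users", "Identify uses", "Identify resources",
    "Identify evaluation questions/objectives", "Choose evaluation design",
    "Choose measurement methods and construct instruments",
    "Address ethical concerns", "Collect data", "Analyze data",
    "Report results",
]

@dataclass
class Workplan:
    done: set[int] = field(default_factory=set)  # completed step numbers (1-10)

    def complete(self, step: int) -> None:
        self.done.add(step)

    def next_step(self) -> str:
        for i, name in enumerate(STEPS, start=1):
            if i not in self.done:
                return f"Step {i}: {name}"
        return "All steps complete"

plan = Workplan()
plan.complete(1)
plan.complete(2)
print(plan.next_step())  # Step 3: Identify resources
```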
Step 1: Identify Users • Who will use the evaluation? • Learners • Faculty • Workshop developers • Administrators • Agencies • Other stakeholders • What do they want from the evaluation?
Step 2: Identify Uses • Generally both formative and summative • Individual and program decisions • Qualitative and/or quantitative information • Consider specific needs of each user • Judgments about individuals • Judgments about project management and processes
What uses do you have for program evaluation? • Improving existing or new programs • Proving that a program works
Step 3: Identify Resources • What time is needed from everyone? • What personnel are needed? • What equipment? • What facilities? • What funds?
Step 4: Identify Evaluation Questions/Objectives • These follow from the model chosen, but generally relate to specific, measurable objectives for • Learners • Process • Outcomes • It is wise to include some questions that capture what was not anticipated, both strengths and weaknesses
Step 4: Identify Evaluation Questions/Objectives – cont. • Evaluation questions should: • Be clear and specific • Be congruent with the literature • Focus on outcomes rather than process • Outcomes imply change: “The workshop will improve educators’ skills” RATHER THAN how the workshop was given (process) • Align with goals and objectives
What are the questions? Adapted from Elissavet & Economides (2003):
• Evaluation of content: authority, accuracy, appropriateness, breadth, depth
• Evaluation of process
  • Interface: interaction, feedback, clarity, quality, organization
  • Presentation & organization: ease of use, efficiency, relevance, language
  • Pedagogy: instructional method, structure, active learning, learner differences, objectives ~ methods
• Evaluation of learning outcomes: knowledge, attitudes, behaviors
• Evaluation of cost
  • Development: needs assessment, objectives, materials, staffing, design
  • Implementation: staff time, materials, recruitment, facilities, hardware
  • Maintenance: portability, coordination, durability, tech support
Step 5: Choose Evaluation Designs • Which designs are appropriate to the questions? (X = intervention, O = observation) • Posttest only: X--O (satisfaction/reactions) • Retrospective pretest: X--O (attitudes) • Pretest-posttest: O--X--O (changes in knowledge/attitudes) • Quasi-experimental: O--X--O-----O • Cross-over: O-----O--X--O
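For the pretest-posttest design (O--X--O), a paired comparison is the natural analysis because the same learners contribute both observations. A minimal sketch, assuming hypothetical knowledge scores and SciPy:

```python
# A minimal sketch of analyzing a pretest-posttest design (O--X--O):
# the same learners are measured before and after the intervention,
# so a paired test is appropriate. Scores here are hypothetical.
from scipy import stats

pretest  = [62, 70, 55, 68, 74, 60, 65, 71]
posttest = [70, 78, 60, 75, 80, 66, 72, 77]

t_stat, p_value = stats.ttest_rel(posttest, pretest)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```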
Step 6: Choose Measurement Methods and Construct/Adapt Instruments • Common methods • Rating forms • Self-assessments • Essays • Exams • Questionnaires • Interviews/focus groups • Direct observations • Performance audits • Existing data (AAMC questionnaires, Course evals, JAMA) • Collect appropriate demographics
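Many of these instruments yield Likert-type ratings. The sketch below, with made-up data, shows one common way to summarize a rating form: per-item means plus Cronbach's alpha as a rough internal-consistency check. The item names and scores are hypothetical.

```python
# A minimal sketch, assuming a rating form with five Likert items (1-5)
# stored one respondent per row. Computes item means and Cronbach's alpha.
import pandas as pd

ratings = pd.DataFrame({
    "item1": [4, 5, 3, 4, 5],
    "item2": [4, 4, 3, 5, 5],
    "item3": [3, 5, 2, 4, 4],
    "item4": [4, 4, 3, 4, 5],
    "item5": [5, 5, 3, 4, 4],
})

print(ratings.mean())  # mean rating per item

k = ratings.shape[1]
item_vars = ratings.var(axis=0, ddof=1)      # variance of each item
total_var = ratings.sum(axis=1).var(ddof=1)  # variance of the summed scale
alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)
print(f"Cronbach's alpha: {alpha:.2f}")
```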
Sources of Data • What do we have? • What do we need? • What, realistically, can we do?
Group Assignment 1 • During this workshop, you will begin to: • Evaluate the effectiveness of the CORD program (see handout) • Use your experiences and the information in the handout to address the first four steps
What would be the best model to use as we begin to develop our plan? • Goal oriented/objective based • Kirkpatrick’s 4-level model • Logic model
Assignment 2 • Take your own project/program and begin filling in one of the blank matrices • Be prepared to discuss with the group
Step 7: Address Ethical Concerns • Confidentiality • Access to data • Consent • Resource allocation • Seek IRB approval
Step 8: Collect Data • Timing and response rate • Use of already-existing data collection • Impact on instrument design (e.g., mail vs. web survey) • Assignment of responsibility
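Response rate is worth computing the same way every time data come in. A trivial sketch with hypothetical numbers:

```python
# A minimal sketch: compute a survey response rate. Numbers are hypothetical.
def response_rate(completed: int, invited: int) -> float:
    """Completed responses as a percentage of those invited."""
    return 100.0 * completed / invited

# e.g., 62 completed surveys out of 90 invited learners
print(f"Response rate: {response_rate(62, 90):.1f}%")  # 68.9%
```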
Step 9: Analyze Data • Plan the analysis at the same time as the rest of the evaluation • Aim for congruence between the question asked and the analysis that is feasible
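One lightweight way to enforce that congruence is to write the planned analysis next to each evaluation question before any data arrive. A sketch with hypothetical questions and commonly paired analyses; the pairings are conventions, not requirements:

```python
# A minimal sketch: pair each evaluation question with a feasible analysis
# up front. Questions and pairings here are illustrative conventions.
ANALYSIS_PLAN = [
    ("Did knowledge improve from pretest to posttest?",
     "paired t-test (or Wilcoxon signed-rank)"),
    ("Were participants satisfied?",
     "descriptive statistics on rating items"),
    ("Did behavior change in the everyday work environment?",
     "direct observation counts; chi-square"),
]

for question, analysis in ANALYSIS_PLAN:
    print(f"{question} -> {analysis}")
```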
Step 10: Report Results • Timely • Format fits the needs of users • Display results in a succinct and clear manner
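A short sketch of a succinct results display, reusing the hypothetical pre/post scores from the Step 5 example; pandas is used only for the tabular layout.

```python
# A minimal sketch: a compact results table for a report. Data are the
# hypothetical pre/post scores from the earlier design example.
import pandas as pd

summary = pd.DataFrame(
    {
        "n": [8, 8],
        "mean score": [65.6, 72.2],
    },
    index=["pretest", "posttest"],
)
print(summary.to_string())
```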