
Presentation Transcript


  1. Ten Steps to Designing an Evaluation for Your Educational Program Linda Perkowski, Ph.D. University of Minnesota Medical School

  2. Readiness Assurance Test

  3. What is Program Evaluation? • Systematic collection of information about a broad range of topics for use by specific people for a variety of purposes (Patton, 1986)

  4. Definitions • Evaluation: program • Assessment: individual • Formative Evaluation: to improve • Summative Evaluation: to prove • Outcomes Research: patient care

  5. Purposes of Program Evaluation • To improve program • To determine next steps/make decisions • Help decide to replace, develop further, eliminate, accredit • To determine effectiveness • To document success • To measure outcomes

  6. For curricular purposes, evaluation helps • Ensure teaching is meeting learners’ needs • Identify where teaching can be improved • Inform the allocation of resources • Provide support to faculty and learners • Diagnose and document program strengths and weaknesses • Articulate what is valued by the institution • Determine whether educational objectives are met Adapted from Morrison (2003)

  7. Influences on the evaluation • External • Accrediting agencies • Public • Funding priorities • Internal • Who needs what answers? • Who gets to pose the questions? • How will the answers be made known?

  8. Barriers to Program Evaluation • Tension between implementing and evaluating • Lack of skills in conducting applied social science research • Paucity of funding, time, and publication outlets • Failure to recognize evaluation as scholarship and its place in the literature (Wilkerson, 2000)

  9. What is the biggest barrier for you or your institution in collecting and analyzing program evaluation data? • Tension between getting a program implemented and evaluating it • Lack of skills • Paucity of funding or time • Limited outlets to present or publish findings

  10. Many Models • Goal Oriented/Objective-Based (Tyler) • Goals-free Evaluation (Scriven) • Judicial/Adversary Evaluation • CIPP (Stufflebeam) • Kirkpatrick’s 4-level model • Situated Evaluation • Connoisseurship Evaluation (Eisner) • Utilization-Oriented Evaluation (Patton) • Logic Model

  11. Program Logic Model - MERC

  12. Tyler Model - MERC

  13. Kirkpatrick’s Four Levels of Outcomes • Satisfaction • Advances in knowledge, skills, and attitudes • Skills used in the everyday environment of the learner • Bottom line • Effect on participants’ learners • Effect on participants’ careers • Institutional improvements
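
For planning purposes, one way to keep the four levels in view is a simple lookup structure. The sketch below (Python, not part of the original slides) maps each level to illustrative measures; the specific measures are assumptions, not the presenter's recommendations.

```python
# Illustrative sketch: Kirkpatrick's four levels mapped to example measures.
# The measures listed are hypothetical examples, not from the slides.
KIRKPATRICK_LEVELS = {
    1: {"label": "Reaction (satisfaction)",
        "example_measures": ["end-of-session rating form"]},
    2: {"label": "Learning (knowledge, skills, attitudes)",
        "example_measures": ["pre/post knowledge test", "attitude survey"]},
    3: {"label": "Behavior (skills used in the everyday environment)",
        "example_measures": ["direct observation", "performance audit"]},
    4: {"label": "Results (bottom line)",
        "example_measures": ["effect on participants' learners",
                             "career outcomes", "institutional improvements"]},
}

for level, info in KIRKPATRICK_LEVELS.items():
    print(f"Level {level}: {info['label']}")
    for measure in info["example_measures"]:
        print(f"  - {measure}")
```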

  14. Overview of 10 Program Evaluation Steps (Workplan) • Step 1: Identify Users • Step 2: Identify Uses • Step 3: Identify Resources • Step 4: Identify Evaluation Questions/Objectives • Step 5: Choose Evaluation Design • Step 6: Choose Measurement Methods and Construct Instruments • Step 7: Address Ethical Concerns • Step 8: Collect Data • Step 9: Analyze Data • Step 10: Report Results

  15. Step 1: Identify Users • Who will use the evaluation? • Learners • Faculty • Workshop developers • Administrators • Agencies • Other stakeholders • What do they want from the evaluation?

  16. Step 2: Identify Uses • Generally both formative and summative • Individual and program decisions • Qualitative and/or quantitative information • Consider specific needs of each user • Judgments about individuals • Judgments about project management and processes

  17. What uses do you have for program evaluation? • Improving existing or new programs • Proving that a program works

  18. Step 3: Identify Resources • What time is needed from everyone? • What personnel are needed? • What equipment? • What facilities? • What funds?

  19. Step 4: Identify Evaluation Questions/Objectives • These go back to the model chosen, but • Generally • Relate to specific measurable objectives for • Learner • Process • Outcomes • It is wise to include some questions that capture unanticipated strengths and weaknesses

  20. Step 4: Identify Evaluation Questions/Objectives – cont. • Evaluation questions should: • Be clear and specific • Be congruent with the literature • Focus on outcomes rather than process • Outcomes imply change, e.g., “The workshop will improve educators’ skills” rather than “How was the workshop delivered?” (process) • Align with goals and objectives

  21. What are the questions? Adapted from Elissavet & Economides (2003)
  Process:
  • Interface: interaction, feedback, clarity, quality, organization
  • Presentation & Organization: ease of use, efficiency, relevance, language
  • Pedagogy (evaluation of learning): instructional method, structure, active learning, learner differences, objectives ~ methods?
  • Evaluation of Content: authority, accuracy, appropriateness, breadth, depth
  • Evaluation of Cost:
    • Development: needs assessment, objectives, materials, staffing, design
    • Implementation: staff time, materials, recruitment, facilities, hardware
    • Maintenance: portability, coordination, durability, tech support
  Outcomes:
  • Knowledge, attitudes, behaviors

  22. Step 5: Choose Evaluation Designs • Which designs are appropriate to the questions? (X = intervention, O = observation) • Posttest only: X - O (satisfaction/reactions) • Retrospective pretest: X - O (attitudes) • Pretest-posttest: O - X - O (changes in knowledge/attitudes) • Quasi-experimental: O - X - O - - - O • Cross-over: O - - - O - X - O
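
To make the notation concrete, here is a minimal sketch (Python, not from the slides) of how a pretest-posttest (O - X - O) design turns into paired observations and change scores; all scores are made up for illustration.

```python
# Sketch of a pretest-posttest (O - X - O) design: each learner is observed
# before (O) and after (O) the intervention (X); the simplest summary is the
# mean change score. Scores below are hypothetical.
pre_scores = [62, 70, 55, 80, 68]    # pretest observations (O)
post_scores = [75, 78, 66, 85, 74]   # posttest observations (O)

changes = [post - pre for pre, post in zip(pre_scores, post_scores)]
mean_change = sum(changes) / len(changes)
print(f"Individual change scores: {changes}")
print(f"Mean change after the intervention: {mean_change:.1f} points")
```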

  23. Step 6: Choose Measurement Methods and Construct/Adapt Instruments • Common methods • Rating forms • Self-assessments • Essays • Exams • Questionnaires • Interviews/focus groups • Direct observations • Performance audits • Existing data (AAMC questionnaires, Course evals, JAMA) • Collect appropriate demographics
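
As one concrete example of the rating-form method listed above, the sketch below (Python, with hypothetical items and responses) tallies a Likert-style form into item means.

```python
# Sketch: summarizing a Likert-style rating form (1 = strongly disagree,
# 5 = strongly agree). Items and responses are hypothetical.
responses = [
    {"objectives_clear": 5, "useful_to_my_teaching": 4},
    {"objectives_clear": 4, "useful_to_my_teaching": 5},
    {"objectives_clear": 3, "useful_to_my_teaching": 4},
]

for item in responses[0].keys():
    ratings = [r[item] for r in responses]
    print(f"{item}: mean {sum(ratings) / len(ratings):.2f} (n={len(ratings)})")
```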

  24. SOURCES OF DATA • What do we have? • What do we need? • What, realistically, can we do?

  25. Group Assignment 1 During this workshop, you will begin to: • Evaluate the effectiveness of the CORD program (see handout) • Use your experiences and the information in the handout to address the first four steps

  26. What would be the best model to use as we begin to develop our plan? • Goal oriented/objective based • Kirkpatrick’s 4-level model • Logic model

  27. Assignment 2 • Take your own project/program and begin filling in one of the blank matrices • Be prepared to discuss with the group

  28. Step 7: Address Ethical Concerns • Confidentiality • Access to data • Consent • Resource allocation • Seek IRB approval

  29. Step 8: Collect Data • Timing and response rate • Already existing data collection • Impact on instrument design (e.g. mail vs. web survey) • Assignment of responsibility
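
A minimal sketch of monitoring response rate during data collection; the roster size, return count, and the 70% reminder threshold are all assumptions made for illustration.

```python
# Sketch: tracking response rate while data collection is open.
# The counts and the reminder threshold are hypothetical.
invited = 120    # learners sent the survey
returned = 78    # completed surveys received so far

response_rate = returned / invited
print(f"Response rate: {response_rate:.0%} ({returned}/{invited})")

# Assumed rule of thumb, not from the slides: plan a reminder wave if low.
if response_rate < 0.70:
    print("Consider a reminder wave before closing data collection.")
```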

  30. Step 9: Analyze Data • Plan the analysis at the same time as the rest of the evaluation • Aim for congruence between the question asked and an analysis that is feasible
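
Continuing the pretest-posttest example, one feasible analysis congruent with the question "Did scores change after the intervention?" is a paired t-test. The sketch below assumes SciPy is available and uses hypothetical scores.

```python
# Sketch: a paired t-test matching a pretest-posttest question
# ("Did knowledge scores change?"). Requires SciPy; scores are hypothetical.
from scipy import stats

pre_scores = [62, 70, 55, 80, 68]
post_scores = [75, 78, 66, 85, 74]

t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
```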

  31. Step 10: Report Results • Timely • Format fits needs of users • Display results in succinct and clear manner

  32. QUESTIONS???

  33. ©
