
Evaluation and Assessment Strategies for Web-Based Education


Presentation Transcript


  1. Evaluation and Assessment Strategies for Web-Based Education Paula Frew, MA, MPH Associate Director of Programs Behavioral Sciences and Health Education Emory University Atlanta, Georgia, USA

  2. Overview • Why Evaluate? • Types of Assessment • Theoretical Framework • Evaluation Research Questions • Methods & Data Collection Strategies • Frew (2001) Formative Evaluation Study of Web-Based Course Initiatives • Additional Resources

  3. Introductions • About us • About the workshop leader • Ice-breaking exercise - getting to know each other

  4. Evaluation Design Framework: A Process Exercise • Evaluation Purposes → Theory → Evaluation Questions → Sampling & Data Collection Strategies → Methods • Adapted from Robson, 2000 (Fig. 5.1, p. 80)

  5. Why evaluate? • Evaluation is the systematic acquisition and assessment of information to provide useful feedback about some object (program, technology, activity, etc.) - William Trochim, Cornell University (http://trochim.human.cornell.edu/kb/intreval.htm) • Learner evaluation: assess the worth or quality of learning and instruction in a web-based environment (Donald E. Hanna, et al., 2000)

  6. Why Evaluate? • To improve a program/course • To assess outcomes and efficiency - provide “useful feedback” to administrators, faculty, sponsors, and other relevant constituencies • To find out how a program operates and to understand why it works (or does not) - may aid decision-making or policy formulation by providing empirically driven feedback

  7. Why evaluate? • Other reasons: • To generate knowledge - assess the academic/applied value which includes topics such as: • Testing theory • Distinguishing types of interventions • Learning about measurements • Developing policy

  8. Why evaluate learners? • To identify content areas that are unclear and confusing • To recognize content areas that need revision • To gather evidence to support revisions • To assess the effectiveness of your course • (Hanna, et al., 2000)

  9. Reflective Exercise: Your reasons to evaluate • Workshop exercise - what are your reasons for conducting an evaluation? • Detail specifics related to your own experience • Who are your stakeholders? • What is the political dynamic at work? • What is your relationship to the stakeholders and others involved in the process? • Conclude exercise: share with others at the workshop

  10. Types of Assessment • Improvement-oriented evaluations: • Formative evaluation - strengthen or improve the object being evaluated* • Scriven (1967) - form or develop programs • Patton (1994) - “developmental evaluation” • Focus on processes - what is going on • Quality-enhancement and continuous improvements • Local adaptation (for different cultures)

  11. Types of Assessment • Knowledge-oriented evaluations: • Effectiveness • Theory-building • Policy making

  12. Types of Assessment • Judgement-oriented evaluations: • Summative evaluation - examine the effects or outcomes of some object* • Robson (2000) - an “end of term report” - what goals have been achieved? • Were needs met, target audiences reached, was program implemented as planned? • Audit: accountability and quality control • Cost-benefit decisions

  13. Program/Technology/Courses: Formative Evaluation Distinctions • Needs assessment - who needs the program, how great is the need, what might work to meet the need? • Structured conceptualization - helps define the program/technology, the target audience, and possible outcomes • Implementation evaluation - monitors the fidelity of the program or technology delivery • Process evaluation - investigates the process of delivering the program or technology, including alternative delivery procedures (Trochim, 1999)

  14. Formative Evaluation: Learning & Instructional Focus • Helps to identify the knowledge and skills learners have gained in the course to date • Allows you to determine whether or not to introduce new content • Gives you feedback on the learners’ learning processes • Signals whether learners need additional practice or work in certain areas • Refocuses the learners’ attention

  15. Program/Technology/Courses: Summative Evaluation Distinctions • Outcomes evaluation - investigates whether the program or technology caused demonstrable effects on specifically defined target outcomes • Impact evaluation - assesses the overall or net effects of a program or technology as a whole • Cost-effectiveness/Cost-benefit analysis - addresses questions of efficiency by standardizing outcomes in terms of their dollar costs and values • Meta-analysis - integrates the outcome estimates from multiple studies to arrive at an overall summary judgement on an evaluation question (Trochim, 1999)
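To make the last two distinctions on this slide concrete, here is a minimal Python sketch; the dollar figures, effect sizes, and function names are hypothetical illustrations, not values from Trochim (1999) or any study cited in this presentation.

    # Minimal illustration of two summative analyses (all numbers hypothetical).

    def cost_effectiveness_ratio(total_cost, outcome_gain):
        """Dollars spent per unit of outcome gained (e.g., per point of test-score improvement)."""
        return total_cost / outcome_gain

    def fixed_effect_pooled_estimate(effects, variances):
        """Inverse-variance weighted average of effect estimates from several studies."""
        weights = [1.0 / v for v in variances]
        return sum(w * e for w, e in zip(weights, effects)) / sum(weights)

    # Hypothetical: a web-based course cost $12,000 and raised mean scores by 8 points.
    print(cost_effectiveness_ratio(12000, 8))                  # dollars per point gained
    # Hypothetical effect sizes and variances from three small studies.
    print(fixed_effect_pooled_estimate([0.40, 0.25, 0.55], [0.04, 0.09, 0.05]))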

  16. Summative Evaluation: Learning & Instruction Focus • Measures what learners have learned • Finalizes decisions about grades • Reviews new knowledge and skills the learners have gained in taking the course.

  17. Reflective Exercise: What type of evaluation do you propose? • Given the perceived need of the stakeholders, what type of evaluation would you pursue? Why? • What challenges do you perceive in conducting this type of evaluation? • What would be gained in doing this type of evaluation? • Exercise conclusion: share with others

  18. Theoretical Framework • 3 Approaches (Patton, 1994) • Deductive Approach - scholarly theory guides the inquiry • Inductive Approach - theory is generated from fieldwork • User-focused Approach - working with others in the evaluation context to extract and specify their implicit theory of action

  19. Theoretical Framework • Deductive Approach Examples: • Diffusion of Innovations Theory (Rogers, 1995) - e.g., examine the adoption of educational innovations in academic settings • Adult Learning Theory (Knowles, 1980) - e.g., how learners acquire their skills, knowledge, understandings, attitudes, values and interests

  20. Reflective Exercise: What Theoretical Approach will you take? • What is your theoretical orientation, if any, in conducting the evaluation? • Reasons for this decision • Potential blindspots in theoretical orientation • Alternative approaches • Exercise conclusion: share with others

  21. Evaluation Research Questions • Formative Research Examples: • How should the program or technology be delivered to address the problem? • How well is the program or technology delivered? • What is the definition and scope of the problem/issue, or what is the question? • Where is the problem and how big or serious is it? (Trochim, 1999)

  22. Evaluation Research Questions • Learning & Instruction: Formative Evaluation Examples: • How well is the instruction likely to work? • What obstacles are learners encountering, and how can they be overcome? • Are the selected instructional methods, media, and materials effective in helping learners learn? • What improvements could be made in the instruction for future use? • (Hanna, et al., 2000)

  23. Evaluation Research Questions • Summative Evaluation Research: • What was the effectiveness of the program or technology? • What is the net impact of the program? (Trochim, 1999)

  24. Evaluation Research Questions • Learning & Instruction: Summative Evaluation Examples: • What did the learners learn? • Did the learners find the instruction interesting, valuable, and meaningful? • What instructional methods contributed to learner motivation? • (Hanna, et al., 2000)

  25. Reflective Exercise: Your research questions • Write up to 5 questions that will guide your evaluation work - what do you hope to learn from these questions? • Once answered, what impact do you see this information having on your intended audience (the stakeholders, institution, etc.)? • Conclusion: share with others

  26. Program/Course/Technology Evaluation: Methods & Data Collection Strategies • Most used: • Informational interviews • Formal, open-ended interviews • Formal, questionnaire-based interviews • Focus groups • Participant observations • Document review & analysis • Statistical modeling/analysis

  27. Learner Evaluation: Data Collection Strategies • Most used: • Pre-test/post-test quizzes of knowledge and skills • Essays • Portfolios • Performance evaluations/learner self-assessment • Interviews • Journals • Reflective papers • Website development • Learner participation figures • Peer assessment
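As a simple illustration of the first strategy on this slide (pre-test/post-test quizzes), knowledge gains might be summarized as in the following minimal Python sketch; the scores are hypothetical and are not drawn from any course discussed in this presentation.

    # Summarizing pre-test/post-test quiz scores (hypothetical data).
    from statistics import mean

    pre_scores = [55, 60, 48, 72, 66]    # hypothetical pre-test percentages
    post_scores = [70, 74, 65, 80, 79]   # hypothetical post-test percentages

    gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
    share_improved = sum(g > 0 for g in gains) / len(gains)

    print(f"Mean gain: {mean(gains):.1f} points")
    print(f"Learners who improved: {share_improved:.0%}")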

  28. Reflective Exercise: Your methods/data collection approach • What methods would you employ in your evaluation study? Why? • How much time and resources do you think are necessary to conduct your evaluation? • How would you propose to address bias in your data collection approach? • Conclusion: share ideas about your methods with others

  29. Quick Tips on Questionnaire/Survey Design • A popular method in gathering data for evaluation studies • Tips on writing survey questions (handout distributed to participants) • Statistical implications - how questionnaire data is turned into statistics/how to avoid bias in writing questions • Other issues
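One way to see how questionnaire data become statistics is the minimal Python sketch below, which summarizes responses to a single 5-point Likert item; the responses are hypothetical, and the agreement cut-off (4 or above) is an assumption made for illustration only.

    # Summarizing one 5-point Likert item (1 = strongly disagree ... 5 = strongly agree).
    from collections import Counter
    from statistics import mean, stdev

    responses = [4, 5, 3, 4, 2, 5, 4, 4, 3, 5]   # hypothetical responses

    counts = Counter(responses)
    pct_agree = sum(r >= 4 for r in responses) / len(responses)

    print("Distribution:", dict(sorted(counts.items())))
    print(f"Mean = {mean(responses):.2f}, SD = {stdev(responses):.2f}")
    print(f"Agree or strongly agree: {pct_agree:.0%}")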

  30. Reflective exercise: Write a brief questionnaire for your evaluation study • Write 5 good questionnaire items for your evaluation survey based upon your research questions • Who is your intended audience for this instrument (students? faculty? administrators?) - and why does this matter? • Conclusion: share questions with others - why is it difficult to write good questions?

  31. If time permits: exploring other methods • Role play exercise: conducting a focus group • Role play exercise: how to interview participants • Basic descriptive statistics: how to analyze user statistics for web-based courses and programs (see the sketch below)
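For the descriptive-statistics item above, analyzing user statistics for a course website might start with simple counts like those in this minimal Python sketch; the log structure, student IDs, and page names are assumptions made for illustration, not the format of any particular system.

    # Simple descriptive statistics on course-website visits (hypothetical log).
    from collections import Counter
    from statistics import mean, median

    visits = [                       # (student_id, page) per recorded visit
        ("s01", "syllabus"), ("s01", "lecture1"), ("s02", "lecture1"),
        ("s03", "syllabus"), ("s02", "quiz1"), ("s01", "quiz1"), ("s03", "lecture1"),
    ]

    visits_per_student = Counter(sid for sid, _ in visits)
    pages = Counter(page for _, page in visits)

    print("Most-visited pages:", pages.most_common(3))
    print(f"Visits per student: mean = {mean(visits_per_student.values()):.1f}, "
          f"median = {median(visits_per_student.values())}")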

  32. Frew (2001): Formative Evaluation Study of Web-Based Course Initiatives • Why Evaluate: Are we progressing toward meeting Teaching Strategic Plan (TSP) objectives? • What Type of Evaluation: Formative - focus on improving the course website to meet specific TSP goals • Research Questions: example - to what extent do participants agree that the website enhanced the course curriculum? • Theoretical Orientation: Diffusion of Innovations (DOI) and Adult Learning Theory (ALT)

  33. Frew (2001): Formative Evaluation Study of Web-Based Course Initiatives • Study Population: faculty and medical students (MEDI 605 course, Fall 2000) • Methods/Data Collection - Faculty: user statistics, questionnaire - Students: user statistics, questionnaires, focus group

  34. Frew (2001): Formative Evaluation Study of Web-Based Course Initiatives • Results (an example) - Curriculum Enhancement Question: 72.2% of faculty reported increased enthusiasm for the instructional resource; 94% of students said the website improved the quality of the course • Recommendations for Future Educational Development (examples): greater faculty involvement in the diffusion process; offer pre-course training; improve access to computing resources

  35. Additional Resources: • Hanna, D.E., Glowacki-Dudka, M., & Conceição-Runlee, S. (2000). 147 Practical Tips for Teaching Online Groups: Essentials of Web-Based Education. Madison, WI: Atwood Publishing. • Knowles, M.S. (1980). The Modern Practice of Adult Education: From Pedagogy to Andragogy. Englewood Cliffs, NJ: Cambridge/Prentice Hall. • Patton, M. (1994). Utilization-Focused Evaluation (3rd ed.). London: SAGE Publications. • Robson, C. (2000). Small-Scale Evaluation. London: SAGE Publications. • Rogers, E.M. (1995). Diffusion of Innovations (4th ed.). New York: The Free Press. • Trochim, W. (1999). The Research Methods Knowledge Base (1st ed.). Cincinnati, OH: Atomic Dog Publishing.

  36. Discussion: Your Evaluation Experiences - Lessons Learned, Questions & Answers
