
Evaluation of Learning Systems


Presentation Transcript


  1. Evaluation of Learning Systems Bertram C. Bruce Graduate School of Library and Information Science University of Illinois, Urbana-Champaign

  2. Evaluation • “the systematic collection and interpretation of evidence, leading, as part of the process, to a judgment of value with a view to action” (Beeby, 1977)

  3. Basic questions for assessment • What do you want to assess? • What evidence can you gather? • How, when, where? • How will you interpret the results? • To whom will you report? • What will you do about the results?

  4. Assessing the measurement • Reliability: Is the measurement representative? Is it repeatable? • Validity: Does it measure what matters?

  5. Assessments should ... • Draw on multiple sources of evidence • Occur on a regular basis • Closely align with activity • Measure work on authentic tasks • Provide information for multiple stakeholders

  6. Purpose and method • Ongoing assessments: frequent analyses of performance; informal, brief, based on existing practice; formative information to guide action • Focused assessments: periodic analyses of performance designed to measure progress in specific areas; formal, summative information to evaluate mastery of specific skills or concepts

  7. Evaluation approaches • interpretivist/functionalist • internal/external • responsive/objective • qualitative/quantitative • longitudinal/snapshot • formative/summative • process/product • goal-free/goal-directed • descriptive/normative • emergent/pre-set • case study/systemic analysis • unique uses/uniform impact • open inquiry/audit review

  8. Challenges (esp. for ICT) • ...complex characteristics • ...changes with scale • ...reshapes geography • ...limited trail of use • ...radically changeable • ...appropriated within social practices • ...access issues

  9. Technology as social practice • How do we relate learning to ICT use? • How can that inform development? • How do participants interpret ICTs? • How do they construct their activities? • What aspects persist across realizations? • What materials, activities, and collaborative environments facilitate success?

  10. • Developmental • Multidimensional • Constructive • Situated in existing practices • Concerns-based adoption model • Component checklist • Scenario-based design/evaluation • Situated evaluation • Organizational change

  11. Concerns-based adoption model (stage of concern: expression of concern) • 6. Refocusing: How can I improve it? • 5. Collaboration: How do I relate it to others? • 4. Consequence: How does it affect learners? • 3. Management: How do I organize it? • 2. Personal: How does it affect me? • 1. Informational: What is it? • 0. Awareness: ---

  12. Component checklist • Low-cost assessment • Can be done by participants (cf. responsive) • Can incorporate existing practices • Assumes developmental process • Identifies components needing attention • Applies at site and project levels

  13. Quill components • Planning • Integration of reading and writing • Publishing • Meaningful communication • Collaboration • Revision • Classroom management • Use of PLANNER • Integration with content areas • Sharing writing • Writing for different audiences • Use of LIBRARY and MAILBAG • Writing in different genres • Working in pairs • Teacher's comments • Teaching revision • Conferencing • Frequency of revision • Nature of revision • Frequency of use • Scheduling of QUILL • Composing at the computer • Students using QUILL • Classroom structure

  14. Component checklist example • Active use • Never • Rare, few participants, forced • Occasional • Frequent • Fully integrated into daily practice
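A minimal sketch (not part of the original presentation) of how such checklist ratings might be recorded and used to flag components needing attention; the active-use levels and component names come from the slides above, while the data structure and threshold are illustrative assumptions:

```python
# Illustrative sketch only: per-component "active use" ratings (levels from
# slide 14) used to flag components needing attention (cf. slide 12).
from enum import IntEnum

class ActiveUse(IntEnum):
    NEVER = 0
    RARE = 1          # rare, few participants, forced
    OCCASIONAL = 2
    FREQUENT = 3
    INTEGRATED = 4    # fully integrated into daily practice

# Hypothetical ratings for one site, using components from the Quill list.
site_checklist = {
    "Use of PLANNER": ActiveUse.OCCASIONAL,
    "Sharing writing": ActiveUse.FREQUENT,
    "Teaching revision": ActiveUse.RARE,
    "Working in pairs": ActiveUse.INTEGRATED,
}

# Components needing attention: those at or below the "rare" level.
needs_attention = [name for name, level in site_checklist.items()
                   if level <= ActiveUse.RARE]
print(needs_attention)  # ['Teaching revision']
```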

  15. Scenario-based design and evaluation • Example: Border learning • Describe a scenario of successful implementation • Identify the key components • Specify levels of adoption

  16. Situated evaluation • How well does it work? (summative) • How can it be improved? (formative) • What practices emerge as the innovation is incorporated into different settings? (situated) => • How well do they work? • How can they be improved?

  17. Electronic Networks for Interaction • New social dimensions in the classroom • Immersion in a writing community • Collaboration in writing • Writing for authentic purposes • Writing across the curriculum • Writing process made visible

  18. Realizations of ENFI • Text sharing • Drama • Socratic tutoring • Scenarios • Small group discussions • Brainstorming • Collaborative writing • Devil's Advocate • Distance networking • Twenty questions • Cross-age tutoring • Discussion of reading • Discussion of issues • Open discussion

  19. Alternate realizations

  20. Aspects of situated evaluation • Analyze the innovation • Analyze existing practices • Observe changes over time as the innovation is incorporated into practice • Examine the functional relevance of differences in realizations • Reanalyze innovation => emergent properties

  21. Data sources • Classroom observations; videotapes • Interviews with students, teachers,... • Student writing • Survey data • Email: students, teachers, experts • Curriculum guides • Teachers' writing about their classrooms • Participant feedback

  22. Changes mediated by the contexts of use • Students’ beliefs and values • Parents’ beliefs and values • Teacher’s pedagogical approach • Teacher’s view of the educational potential • Classroom management issues • Support • Institutional realities

  23. Implications of situated evaluation • Evaluation: Understand use in diverse contexts • Teacher education: Innovation begins with the teacher • Curriculum development: Useful tools for the re-creation process • Re-creation: Vital part of the process of change

  24. References • Bruce, B. C., & Rubin, A. D. (1993). Electronic Quills: A situated evaluation of using computers for writing in classrooms. Hillsdale, NJ: Lawrence Erlbaum. • Bruce, B. C., Peyton, J. K., & Batson, T. W. (Eds.). (1993). Network-based classrooms: Promises and realities. New York: Cambridge University Press. • Loucks-Horsley, S., & Bybee, R. W. (1998). Implementing the National Science Education Standards. Science Teacher, 65(6), 22-26.
