
Evidence-Centered Design and Cisco’s Packet Tracer Simulation-Based Assessment


Presentation Transcript


  1. Evidence-Centered Design and Cisco’s Packet Tracer Simulation-Based Assessment Robert J. Mislevy Professor, Measurement & Statistics University of Maryland with John T. Behrens & Dennis Frezzo Cisco Systems, Inc. December 15, 2009 ADL, Alexandria, VA

  2. Simulation-based & Game-based Assessment • Motivation: Cog psych & technology • Complex combinations of knowledge & skills • Complex situations • Interactive, evolving in time, constructive • Challenge of technology-based environments

  3. Outline • ECD • Packet Tracer • Packet Tracer & ECD

  4. ECD

  5. Evidence-Centered Assessment Design Messick’s (1994) guiding questions: • What complex of knowledge, skills, or other attributes should be assessed? • What behaviors or performances should reveal those constructs? • What tasks or situations should elicit those behaviors?

  6. Evidence-Centered Assessment Design • Principled framework for designing, producing, and delivering assessments • Process model, object model, design tools • Explicates the connections among assessment designs, inferences regarding students, and the processes needed to create and deliver these assessments. • Particularly useful for new / complex assessments.

  7. Layers in the assessment enterprise (from Mislevy & Riconscente, in press):
  • Domain Analysis: What is important about this domain? What work and situations are central in this domain? What knowledge representations (KRs) are central to this domain?
  • Domain Modeling: How do we represent key aspects of the domain in terms of an assessment argument? Conceptualization.
  • Conceptual Assessment Framework: Design structures (student, evidence, and task models). Generativity.
  • Assessment Implementation: Manufacturing the “nuts & bolts” (authoring tasks, automated scoring details, statistical models). Reusability.
  • Assessment Delivery: Students interact with tasks, performances are evaluated, feedback is created. Four-process delivery architecture.
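
As a rough illustration of the Conceptual Assessment Framework’s design structures, here is a minimal sketch in Python; the class and field names (StudentModel, EvidenceModel, TaskModel, proficiencies, observables) are assumptions for illustration, not PADI’s or Packet Tracer’s actual schema.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Hypothetical sketch of CAF design structures (student, evidence, task models).
# Names and fields are illustrative only.

@dataclass
class StudentModel:
    # Competency variables the assessment is meant to inform.
    proficiencies: Dict[str, float] = field(default_factory=dict)

@dataclass
class EvidenceModel:
    # Evaluation rules map work-product features to observables;
    # a measurement model says how observables update proficiencies.
    observables: List[str] = field(default_factory=list)
    measurement_model: str = "IRT"  # placeholder label

@dataclass
class TaskModel:
    # Features of the task situation presented to the student,
    # plus the kinds of work products it will capture.
    stimulus_features: Dict[str, str] = field(default_factory=dict)
    work_products: List[str] = field(default_factory=list)

# A student/evidence/task-model triple is the reusable design unit
# from which many concrete tasks can be generated.
design = (
    StudentModel(proficiencies={"configure_routing": 0.0}),
    EvidenceModel(observables=["hosts_can_ping_gateway"]),
    TaskModel(stimulus_features={"topology": "two routers, one switch"},
              work_products=["final network configuration"]),
)
```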

  8. Packet Tracer

  9. Cisco’s Packet Tracer • Online tool used in Cisco Networking Academies • Create, edit, configure, run, troubleshoot networks • Multiple representations in the logical layer • Inspection tool links to a deeper physical world • Simulation mode • Detailed visualization and data presentation • Standard support for world authoring • Library of elements • Simulation of relevant deep structure • Copy, paste, save, edit, annotate, lock

  10. Instructors and students can author their own activities

  11. Instructors and students can author their own activities

  12. Instructors and students can author their own activities

  13. Explanation

  14. Experimentation

  15. Packet Tracer & ECD

  16. [ECD layers diagram, as in slide 7: Domain Analysis, Domain Modeling, Conceptual Assessment Framework, Assessment Implementation, Assessment Delivery.]

  17. [ECD layers diagram, as in slide 7.]

  18. [ECD layers diagram, as in slide 7.] Annotations: assessment argument structures; design patterns.

  19. Application to familiar assessments. [ECD layers diagram, as in slide 7, annotated:] Fixed competency variables; up-front design of features of response classes; up-front design of features of the task situation (static, implicit in the task, not usually tagged).

  20. Application to complex assessments. [ECD layers diagram, as in slide 7, annotated:] Multiple, perhaps configured-on-the-fly competency variables; more complex evaluations of features of the task situation.
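
To make “more complex evaluations of features of the task situation” concrete, here is a hedged sketch in Python in which evidence rules extract several observables from one performance and feed different competency variables. The feature names, event types, and update rules are invented for illustration and are not Packet Tracer’s actual scoring logic.

```python
# Hypothetical sketch: evidence identification for a complex, interactive task.

def extract_observables(event_log, final_network):
    """Derive multiple observables from one performance (illustrative features)."""
    return {
        "connectivity_ok": final_network.get("ping_success", False),
        "used_systematic_troubleshooting": any(
            e["type"] == "inspect_device" for e in event_log
        ),
        "steps_taken": len(event_log),
    }

def update_competencies(competencies, observables):
    """Crude accumulation step; a real system would use a psychometric model."""
    if observables["connectivity_ok"]:
        competencies["configure_routing"] += 1.0
    if observables["used_systematic_troubleshooting"]:
        competencies["troubleshooting"] += 1.0
    return competencies

competencies = {"configure_routing": 0.0, "troubleshooting": 0.0}
log = [{"type": "inspect_device"}, {"type": "edit_config"}]
obs = extract_observables(log, {"ping_success": True})
print(update_competencies(competencies, obs))
```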

  21. Unfolding situated performance over time, for familiar and complex assessments. [ECD layers diagram, as in slide 7, overlaid with a timeline showing macro and micro features of performance and of the evolving, interactive situation.] Some up-front design of features of performance or effects, others recognized (e.g., agents); some up-front design of features of the task situation, others recognized (e.g., agents).

  22. [ECD layers diagram, as in slide 7.] Object models for representing: psychometric models (including competencies/proficiencies), simulation environments, task templates, automated scoring.

  23. PADI object model for task/evidence models. [ECD layers diagram, as in slide 7.]

  24. [ECD layers diagram, as in slide 7.] The graphical representation of a network and its configuration is expressible as a text representation in XML format, for presentation and as a work product, to support automated scoring.
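
The XML work-product idea might look roughly like the sketch below, which serializes a toy network description to XML text with Python’s standard library; the element and attribute names (network, device, interface, ip, mask) are assumptions for illustration, not Packet Tracer’s real file format.

```python
import xml.etree.ElementTree as ET

# Hypothetical sketch: express a graphical network configuration as XML text.
network = ET.Element("network", name="branch-office")
router = ET.SubElement(network, "device", type="router", name="R1")
ET.SubElement(router, "interface", name="Fa0/0", ip="192.168.1.1", mask="255.255.255.0")
pc = ET.SubElement(network, "device", type="pc", name="PC1")
ET.SubElement(pc, "interface", name="Fa0", ip="192.168.1.10", mask="255.255.255.0")

# The same text can serve as a presentation format and as a work product
# that automated scoring can parse and compare against an answer network.
xml_text = ET.tostring(network, encoding="unicode")
print(xml_text)

parsed = ET.fromstring(xml_text)
print([d.get("name") for d in parsed.findall("device")])  # ['R1', 'PC1']
```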

  25. [ECD layers diagram, as in slide 7.] Authoring interfaces; simulation environments; re-usable platforms and elements; standard data structures (IMS/QTI, SCORM).

  26. In Packet Tracer, the Answer Network serves as the base pattern for work-product evaluation.
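
A minimal sketch of that idea in Python: the answer network acts as a reference structure, and scoring walks the student’s submitted configuration checking which answer-network values it matches. The function, device, and setting names here are hypothetical, not Packet Tracer’s internal scoring engine.

```python
# Hypothetical sketch: score a student's network against an answer network.
# Both are nested dicts keyed by device -> setting -> expected value.

ANSWER_NETWORK = {
    "R1": {"hostname": "R1", "Fa0/0.ip": "192.168.1.1"},
    "PC1": {"Fa0.ip": "192.168.1.10", "gateway": "192.168.1.1"},
}

def score_work_product(student_network, answer_network=ANSWER_NETWORK):
    """Count the answer-network settings that the student's configuration matches."""
    earned, possible = 0, 0
    for device, settings in answer_network.items():
        for setting, expected in settings.items():
            possible += 1
            actual = student_network.get(device, {}).get(setting)
            if actual == expected:
                earned += 1
    return earned, possible

student = {
    "R1": {"hostname": "R1", "Fa0/0.ip": "192.168.1.1"},
    "PC1": {"Fa0.ip": "192.168.1.10", "gateway": "192.168.1.254"},  # wrong gateway
}
print(score_work_product(student))  # (3, 4)
```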

  27. Dynamic task models, variable assignment: the Initial Network is similar to the Answer Network tree. When the activity starts, instead of using the initial network as the starting values, the activity configures the network with the contents of the variables.
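
A minimal sketch of the variable-assignment idea, assuming hypothetical variable and template names: the initial network is built by substituting variable values into a template when the activity starts, rather than being stored as fixed starting values.

```python
import random

# Hypothetical sketch of a dynamic task model: the initial network is a
# template whose values are filled from variables when the activity starts.

TEMPLATE = {
    "R1": {"Fa0/0.ip": "{router_ip}", "Fa0/0.mask": "{mask}"},
    "PC1": {"gateway": "{router_ip}"},
}

def assign_variables():
    """Draw variable values for one instance of the activity (illustrative)."""
    subnet = random.choice(["192.168.10", "10.0.5", "172.16.3"])
    return {"router_ip": f"{subnet}.1", "mask": "255.255.255.0"}

def build_initial_network(template, variables):
    """Instantiate the initial network by substituting variable values."""
    return {
        device: {k: v.format(**variables) for k, v in settings.items()}
        for device, settings in template.items()
    }

variables = assign_variables()
print(build_initial_network(TEMPLATE, variables))
```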

  28. [ECD layers diagram, as in slide 7.]

  29. Interoperable elements: IMS/QTI, SCORM; feedback / instruction / reporting. [ECD layers diagram, as in slide 7.]

  30. Conclusion • Behavior in learning environments builds connections for performance environments. • Assessment tasks & features are strongly related to instruction/learning objects & features. • Re-use concepts and code in assessment, via arguments, schemas, and data structures that are consonant with instructional objects. • Use data structures that are share-able, extensible, and consistent with delivery processes and design models.

  31. Further information • Bob Mislevy home page • http://www.education.umd.edu/EDMS/mislevy/ • Links to papers on ECD • Cisco NetPASS • Cisco Packet Tracer • PADI: Principled Assessment Design for Inquiry • NSF project, collaboration with SRI et al. • http://padi.sri.com
