
Design Patterns: Supporting Task Design by Scaffolding the Assessment Argument

Robert J. Mislevy, University of Maryland; Geneva Haertel & Britte Haugan Cheng, SRI International. DR K-12 grant #0733172, "Application of Evidence-Centered Design to State Large-Scale Science Assessment."




Presentation Transcript


  1. Design Patterns: Supporting Task Design by Scaffolding the Assessment Argument. Robert J. Mislevy, University of Maryland; Geneva Haertel & Britte Haugan Cheng, SRI International. DR K-12 grant #0733172, "Application of Evidence-Centered Design to State Large-Scale Science Assessment." NSF Discovery Research K-12 PI meeting, November 10, Washington, D.C. This material is based upon work supported by the National Science Foundation under Grant No. DRL-0733172. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

  2. Overview
  • Design patterns: background
  • Evidence-Centered Design: main idea; layers; assessment arguments
  • Attributes of Design Patterns
  • How they inform task design

  3. Design Patterns
  • Design patterns in architecture
  • Design patterns in software engineering
  • Polti's Thirty-Six Dramatic Situations

  4. Messick's Guiding Questions
  • What complex of knowledge, skills, or other attributes should be assessed?
  • What behaviors or performances should reveal those constructs?
  • What tasks or situations should elicit those behaviors?
  Messick, S. (1994). The interplay of evidence and consequences in the validation of performance assessments. Educational Researcher, 23(2), 13-23.

  5. Evidence-Centered Assessment Design
  • Organized formally around the Messick questions
  • Principled framework for designing, producing, and delivering assessments
  • Conceptual model, object model, design tools
  • Connections among design, inference, and the processes that create and deliver assessments
  • Particularly useful for new / complex assessments
  • Useful to think in terms of layers

  6. Layers in the assessment enterprise (from Mislevy & Riconscente, in press)
  • Domain Analysis: What is important about this domain? What work and situations are central in this domain? What knowledge representations (KRs) are central to this domain?
  • Domain Modeling: How do we represent key aspects of the domain in terms of an assessment argument? Conceptualization.
  • Conceptual Assessment Framework: Design structures: student, evidence, and task models. Generativity.
  • Assessment Implementation: Manufacturing "nuts & bolts": authoring tasks, automated scoring details, statistical models. Reusability.
  • Assessment Delivery: Students interact with tasks, performances are evaluated, feedback is created. Four-process delivery architecture.

  7. From Mislevy & Riconscente, in press. (The same layer diagram as slide 6, repeated as the backdrop for the builds on slides 8-11.)

  8. Layer diagram (cont.). This build adds the examples: assessment argument structures; Design Patterns.

  9. Layer diagram (cont.). This build adds the examples: psychometric models; automated scoring; task templates; object models; simulation environments.

  10. Layer diagram (cont.). This build adds the examples: authoring interfaces; simulation environments; re-usable platforms & elements.

  11. Layer diagram (cont.). This build adds the examples: interoperable elements; IMS/QTI, SCORM; feedback / instruction / reporting.
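
  To summarize slides 6-11, here is a minimal Python sketch (not part of the original slides) of the five ECD layers and the example artifacts from the build slides. Assigning each artifact to a particular layer is an assumption made here for illustration; this is not PADI's object model.

    # Minimal sketch of the ECD layers; the layer-to-artifact assignment is an
    # assumption for illustration, not taken from the slides themselves.
    from enum import Enum

    class ECDLayer(Enum):
        DOMAIN_ANALYSIS = "Domain Analysis"
        DOMAIN_MODELING = "Domain Modeling"
        CONCEPTUAL_ASSESSMENT_FRAMEWORK = "Conceptual Assessment Framework"
        ASSESSMENT_IMPLEMENTATION = "Assessment Implementation"
        ASSESSMENT_DELIVERY = "Assessment Delivery"

    # Example artifacts gathered from the build slides above.
    EXAMPLE_ARTIFACTS = {
        ECDLayer.DOMAIN_MODELING: ["assessment argument structures", "design patterns"],
        ECDLayer.CONCEPTUAL_ASSESSMENT_FRAMEWORK: [
            "psychometric models", "automated scoring", "task templates",
            "object models", "simulation environments",
        ],
        ECDLayer.ASSESSMENT_IMPLEMENTATION: [
            "authoring interfaces", "simulation environments",
            "re-usable platforms & elements",
        ],
        ECDLayer.ASSESSMENT_DELIVERY: [
            "interoperable elements (IMS/QTI, SCORM)",
            "feedback / instruction / reporting",
        ],
    }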

  12. Toulmin's Argument Structure (diagram): Data, so Claim, since Warrant, which rests on Backing, unless Alternative explanation.
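
  To make the flattened diagram concrete, here is a minimal data-structure sketch in Python (not part of the original slides). The class and field names are illustrative assumptions; the example instance is Toulmin's classic one.

    # Illustrative sketch of Toulmin's argument structure; names are assumptions.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class ToulminArgument:
        claim: str                     # the assertion being argued for
        data: List[str]                # observations offered in support ("so")
        warrant: str                   # why the data support the claim ("since")
        backing: str                   # grounds for trusting the warrant
        alternative_explanations: List[str] = field(default_factory=list)  # "unless"

    example = ToulminArgument(
        claim="Harry is a British subject",
        data=["Harry was born in Bermuda"],
        warrant="People born in Bermuda are generally British subjects",
        backing="British nationality statutes",
        alternative_explanations=["Both of Harry's parents were aliens"],
    )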

  13. Assessment Argument Structure (diagram): Data concerning performance, so Claim about student, since Warrant for assessment argument, unless Alternative explanations.

  14. Assessment Argument Structure (cont.): adds Data concerning situation alongside Data concerning performance as grounds for the Claim about the student.

  15. Assessment Argument Structure (cont.): adds the Student acting in the assessment situation as the source of both kinds of data, with a Warrant for scoring (since) behind the data concerning performance and a Warrant for task design (since) behind the data concerning situation.

  16. Assessment Argument Structure (cont.): adds Other information concerning the student vis-a-vis the assessment situation, e.g., near or far transfer, familiarity with tools, assessment format, representational forms, evaluation standards, task content & context. Not in measurement models, but crucial to inference.
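
  Slides 13-16 extend Toulmin's structure with assessment-specific elements. Here is a minimal sketch of how those elements fit together, again assuming illustrative field names; this is not PADI's actual object model.

    # Illustrative sketch of the assessment argument of slides 13-16;
    # field names are assumptions, not PADI's object model.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class AssessmentArgument:
        claim_about_student: str
        data_concerning_performance: List[str]   # evaluated features of the work
        data_concerning_situation: List[str]     # features of the task situation
        other_information: List[str]             # student vis-a-vis situation; not in
                                                 # measurement models, but crucial to inference
        warrant_for_argument: str                # why performance in this situation evidences the claim
        warrant_for_scoring: str                 # why the evaluation captures the performance
        warrant_for_task_design: str             # why the situation can evoke the evidence
        alternative_explanations: List[str] = field(default_factory=list)  # e.g., unfamiliar tools or formats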

  17. PADI Design Patterns
  • Structured around assessment arguments
  • Substance based on recurring principles, ways of thinking, inquiry, etc. (e.g., NSES inquiry standards, unifying themes)
  • Grounded in science education & cognitive psychology research

  18. Some PADI Design Patterns
  • Model-Based Reasoning: Model Formation; Evaluation; Revision; Use
  • Model-Based Inquiry
  • Design under Constraints
  • Generate Scientific Explanations
  • Troubleshooting (with Cisco)
  • Assessing Epistemic Frames (in progress; with David Williamson Shaffer)

  19. The Structure of Assessment Design Patterns

  20. How Design Patterns Support Thinking about the Assessment Argument. (The full assessment argument diagram from slide 16 serves as the backdrop for slides 21-31; each of the following builds annotates one design pattern attribute.)

  21. How Design Patterns Support Thinking about the Assessment Argument: The design pattern is organized around Focal KSAs. They will be involved in the Claim, although there may be other KSAs included in the target of inference (e.g., Model Formation: but what models, what context?). Focal KSAs are associated with Characteristic Features of Tasks.

  22. (cont.) The Rationale provides background on the nature of the Focal KSAs and the kinds of things people do, in what kinds of situations, that evidence them. It contributes to the Warrant in the assessment argument.

  23. (cont.) Additional KSAs play multiple roles. You need to think about which ones you really DO want to include as targets of inference (validity) and which ones you really DON'T (invalidity).

  24. (cont.) The Additional KSAs you DO want to include as targets of inference are part of the claim, e.g., knowing Mendel's laws as well as being able to formulate a model in an investigation. These are connected with Variable Features of Tasks.

  25. (cont.) The Additional KSAs you DON'T want to include as targets of inference introduce alternative explanations for poor performance. (Especially important for assessing special populations: UDL & accommodations.) These are connected with Variable Features of Tasks & Work Products.

  26. (cont.) The Characteristic Features of Tasks help you think about critical data concerning the situation: what you need in order to get evidence about the Focal KSAs.

  27. (cont.) Variable Features of Tasks also help you think about data concerning the situation, but now to influence difficulty, or to bring in or reduce demand for Additional KSAs so as to avoid alternative explanations.

  28. (cont.) Some Variable Features of Tasks help you match features of tasks to students' backgrounds, knowledge, and characteristics: interests, familiarity, previous instruction.

  29. (cont.) Potential Work Products help you think about what you want to capture from a performance: product, process, constructed model, written explanation, etc. They can also call attention to demand for Additional KSAs and help avoid alternative explanations (e.g., Stella).

  30. (cont.) Potential Observations are possibilities for the qualities of Work Products, i.e., the data concerning the performance.

  31. (cont.) Potential Rubrics are algorithms/rubrics/rules for evaluating Work Products to get the data concerning the performance.
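
  Slides 21-31 walk through the design pattern attributes one at a time. Here is a minimal sketch pulling them together, with a toy instance loosely based on the Model Formation pattern mentioned earlier. The class, field names, and example content are illustrative assumptions, not the PADI schema.

    # Illustrative sketch of design pattern attributes from slides 21-31;
    # names and the example content are assumptions, not the PADI schema.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class DesignPattern:
        title: str
        rationale: str                      # contributes to the warrant
        focal_ksas: List[str]               # the claim centers on these
        additional_ksas: List[str]          # targets of inference OR sources of alternative explanations
        characteristic_features: List[str]  # every task needs these (data concerning the situation)
        variable_features: List[str]        # tune difficulty / manage Additional KSAs
        potential_work_products: List[str]  # what to capture from the performance
        potential_observations: List[str]   # qualities of work products (data concerning the performance)
        potential_rubrics: List[str] = field(default_factory=list)  # how to evaluate work products

    model_formation = DesignPattern(
        title="Model Formation",
        rationale="Forming a model of a phenomenon evidences model-based reasoning.",
        focal_ksas=["Formulate a scientific model for a given phenomenon"],
        additional_ksas=["Content knowledge (e.g., Mendel's laws)", "Familiarity with modeling tools"],
        characteristic_features=["A phenomenon or data set that calls for a model"],
        variable_features=["Task content & context", "Representational forms", "Scaffolding"],
        potential_work_products=["Constructed model", "Written explanation"],
        potential_observations=["Accuracy and completeness of the constructed model"],
        potential_rubrics=["Rubric for the quality of the constructed model"],
    )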

  32. For more information…
  • PADI: Principled Assessment Design for Inquiry, http://padi.sri.com (NSF project, collaboration with SRI et al.; links to follow-on projects)
  • Bob Mislevy home page, http://www.education.umd.edu/EDMS/mislevy/ (links to papers on ECD, Cisco applications)

  33. Now for the Good Stuff …
  • Examples of design patterns with content
  • Different projects
  • Different grain sizes
  • Different users
  • How they evolved to suit needs of users
  • Same essential structure
  • Representations, language, emphases, and affordances tuned to users and needs
  • How they are being used

  34. Use of Design Patterns in STEM Research and Development Projects. Britte Haugan Cheng and Geneva Haertel. DRK-12 PI Meeting, November 2009.

  35. Current Catalog of Design Patterns
  • ECD/PADI related projects have produced over 100 Design Patterns
  • Domains include: science inquiry, science content, mathematics, economics, model-based reasoning
  • Design Patterns span grades 3-16+
  • Organized around themes, models, and processes, not surface features or formats of tasks
  • Support the design of scenario-based, multiple choice, and performance tasks
  • The following examples show how projects have used and customized Design Patterns in ways that suit their needs and users
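
  Before turning to those examples, a small aside: one way such a catalog could be represented and filtered by domain and grade. This is a hypothetical sketch; the actual PADI catalog is a web application, and the entries and grade ranges below are made up for illustration.

    # Hypothetical sketch of filtering a design-pattern catalog by domain and grade;
    # the real catalog is a web application, and these entries are invented.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class CatalogEntry:
        title: str
        domain: str        # e.g., "science inquiry", "mathematics"
        grade_min: int
        grade_max: int

    def find_patterns(catalog: List[CatalogEntry], domain: str, grade: int) -> List[CatalogEntry]:
        """Return entries in a domain that cover a given grade."""
        return [e for e in catalog if e.domain == domain and e.grade_min <= grade <= e.grade_max]

    catalog = [
        CatalogEntry("Observational Investigation", "science inquiry", 3, 8),      # grade range invented
        CatalogEntry("Reasoning about Complex Systems", "science inquiry", 6, 12), # grade range invented
    ]
    print([e.title for e in find_patterns(catalog, "science inquiry", 7)])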

  36. Example 1: DRK-12 Project, An Application of ECD to a State, Large-scale Science Assessment
  • Challenge in the Minnesota Comprehensive Assessment of science: how to design scenario-based tasks and technology-enhanced interactions, grounded in standards, both EFFICIENTLY and VALIDLY
  • Design Patterns support storyboard writing and task authoring
  • Designers are a committee of MN teachers, supported by Pearson
  • Project focuses on a small number of Design Patterns for "hard-to-assess" science content/inquiry
  • Based on Minnesota state science standards and benchmarks and the NSES inquiry standards
  • Design Patterns are Web-based and interactive

  37. Design Pattern: Observational Investigation
  • Relates science content/processes to components of the assessment argument
  • Higher-level, cross-cutting themes, ways of thinking, ways of using science, rather than many finer-grained standards
  • Related to relevant standards and benchmarks
  • Interactive Features:
    • Examples and details
    • Activate pedagogical content knowledge
    • Presents exemplar assessment tasks
    • Provides selected knowledge representations
    • Links among associated assessment argument components

  38. Design Pattern Observational Investigation

  39. Design Pattern Observational Investigation (cont.)

  40. Design Pattern Observational Investigation (cont.)

  41. Interactive Feature: Details

  42. Interactive Feature: Linking assessment argument components

  43. Design Pattern Highlights: Observational Investigation
  • Relates science content/processes to components of the assessment argument
  • Higher-level, cross-cutting themes, ways of thinking, ways of using science, rather than many fine-grained standards
  • Interactive Features:
    • Examples and details
    • Activates pedagogical content knowledge
    • Presents exemplar assessment tasks
    • Provides selected knowledge representations
    • Relates relevant standards and benchmarks
    • Links among associated assessment argument components

  44. Design Pattern: Reasoning about Complex Systems
  • Relates science content/processes to components of the assessment argument
  • Across scientific domains and standards
  • Convergence among the design of instruction, assessment, and technology
  • Interactive Features:
    • Explicit support for designing tasks around a multi-year learning progression

  45. Design Pattern Reasoning about Complex Systems

  46. Interactive Feature: Details

  47. Interactive Feature: Linking assessment argument components

  48. Design Pattern Highlights: Reasoning about Complex Systems
  • Relates science content/processes to components of the assessment argument
  • Across scientific domains and standards
  • Convergence among the design of instruction, assessment, and technology
  • Interactive Feature:
    • Explicit support for designing tasks around a multi-year learning progression

  49. Example 2: Principled Assessment Designs in Inquiry, Model-Based Reasoning Suite
  • Relates science content/processes to components of the assessment argument
  • A suite of seven related Design Patterns supports curriculum-based assessment design
  • Theoretically and empirically motivated by Stewart and Hafner (1994), Research on problem solving: Genetics. In D. L. Gabel (Ed.), Handbook of research on science teaching and learning. New York: Macmillan.
  • Aspects of model-based reasoning, including model formation, model use, model revision, and coordination among aspects of model-based reasoning
  • Multivariate student model: scientific reasoning and science content
  • Interactive Feature:
    • Supports the design of both:
      • Independent tasks associated with an aspect of model-based reasoning
      • Steps in a larger investigation comprising several aspects, including model conceptualization, model use, and model evaluation
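
  A minimal sketch of what a multivariate student model with separate scientific-reasoning and science-content variables might look like, with each observable contributing evidence to one or both. The names, weights, and update rule below are hypothetical and are not the project's actual psychometric model.

    # Hypothetical sketch of a multivariate student model (slide 49); not the
    # project's psychometric model, and the weights are invented for illustration.
    from dataclasses import dataclass
    from typing import Dict, List

    @dataclass
    class StudentModel:
        proficiencies: Dict[str, float]   # e.g., scientific reasoning and science content

    @dataclass
    class EvidenceRule:
        observable: str                   # a scored quality of a Work Product
        weights: Dict[str, float]         # how strongly it bears on each proficiency

    def accumulate(model: StudentModel, rules: List[EvidenceRule], scores: Dict[str, float]) -> None:
        """Naive weighted accumulation, just to illustrate the multivariate idea."""
        for rule in rules:
            score = scores.get(rule.observable, 0.0)
            for variable, weight in rule.weights.items():
                model.proficiencies[variable] += weight * score

    student = StudentModel({"scientific_reasoning": 0.0, "science_content": 0.0})
    rules = [
        EvidenceRule("quality_of_constructed_model", {"scientific_reasoning": 0.7, "science_content": 0.3}),
        EvidenceRule("accuracy_of_genetics_content", {"scientific_reasoning": 0.1, "science_content": 0.9}),
    ]
    accumulate(student, rules, {"quality_of_constructed_model": 1.0, "accuracy_of_genetics_content": 0.5})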

  50. Design Pattern: Model Formation
