


  1. Michigan Assessment Consortium Common Assessment Development Series: Putting Together The Test Blueprint

  2. Developed and Narrated by Bruce R. Fay, PhD, Assessment Consultant, Wayne RESA

  3. Support The Michigan Assessment Consortium professional development series in common assessment development is funded in part by the Michigan Association of Intermediate School Administrators in cooperation with MDE, MSU, Ingham & Ionia ISDs, Oakland Schools, and Wayne RESA.

  4. What You Will Learn • Test blueprints: what they are and why you need them • The components of a test blueprint • Criteria for a good test blueprint • Test blueprint examples

  5. “If you don't know where you're going, any road will take you there.” George Harrison (1943 - 2001) "Any Road", Brainwashed, 2002

  6. Assessment with a Purpose Educational assessment is not something incidental to teaching and learning. It is an equal partner with curriculum and instruction. It is the critical “3rd leg” through which both students and teachers receive feedback about the effectiveness of the teaching and learning process in achieving desired learning outcomes. Assessment closes the loop.

  7. Closed-Loop (Feedback) Systems • Desired Temperature (Learning Target) → Home Heating System (Teaching & Learning) → Actual Temperature (Test Results) → fed back against the Desired Temperature
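
The thermostat analogy can be made concrete in a few lines of code. Below is a minimal Python sketch of a closed feedback loop; all names and numbers are illustrative and not from the presentation. The "teaching" step responds to the measured gap between desired and actual state, just as a furnace responds to its thermostat.

```python
# Closed-loop feedback: measure the gap, respond, measure again.
# All values are illustrative only.

target_temp = 20.0    # desired temperature (the learning target)
actual_temp = 14.0    # actual temperature (current achievement)

for cycle in range(5):
    gap = target_temp - actual_temp   # the "test result": how far off we are
    actual_temp += 0.5 * gap          # heating (teaching) responds to the feedback
    print(f"cycle {cycle}: actual = {actual_temp:.1f}, gap was {gap:.1f}")
```

Without the measurement step there is no gap to respond to, which is the sense in which assessment "closes the loop."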

  8. C–I–A (Curriculum–Instruction–Assessment) Alignment • Requires thoughtful alignment, ensuring that the items on a test fairly represent the… • Intended learning targets (intended curriculum) • Actual learning targets (taught curriculum) • Test what you teach; teach what you test

  9. Target-Level Alignment • The relative importance of the learning targets • The level of cognitive complexity associated with those targets

  10. Useful feedback requires tests that are… • Reliable (consistent; actually measure something) • Fair (free from bias or distortion) • Valid (contextually meaningful and interpretable; can reasonably support the decisions we make based on them)

  11. Test Blueprints: The Big Idea • A simple but essential tool, used to: • Design tests that can meet the preceding requirements • Define the acceptable evidence to infer mastery of the targets • Build in evidence for validity

  12. The Test Blueprint, or Table of Test Specifications • Explicitly “maps” test items to: • Learning Targets • Levels of Complexity • Relative Importance • Provides a common definition of the test

  13. Learning Targets & Standards Frameworks • Standards are organized as structured, hierarchical frameworks. Michigan’s is: • Strands • Standards • Domains • Content Expectations • Detailed curriculum is usually left to local districts or classroom teachers.

  14. A Simple Taxonomy of Cognitive Complexity • Norm Webb’s Depth of Knowledge (1997), highest to lowest: • Extended Thinking • Strategic Thinking • Skill/concept use/application • Recall
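
For use in the sketches that follow, the four levels can be written down as a simple mapping. The 1–4 numbering reflects how Webb's levels are commonly cited; the slide lists them highest to lowest.

```python
# Webb's Depth of Knowledge (DOK) levels, numbered 1 (lowest) to 4 (highest).
DOK = {
    1: "Recall",
    2: "Skill/Concept Use/Application",
    3: "Strategic Thinking",
    4: "Extended Thinking",
}
```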

  15. Putting It All Together… A Basic Test Blueprint • Table (matrix) format (e.g., a spreadsheet) • Rows = learning targets (one row per target) • Columns = Depth of Knowledge levels • Cells = number of items and points possible

  16. Summary Information • Number of items and points possible: • Row Margins = for that target • Column Margins = for that level of complexity • Lower Right Corner = for the test

  17. Example 1 – Basic Blueprint for a test with 5 learning targets
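
The slide's actual table is not reproduced in this transcript, so here is a minimal Python sketch of what such a blueprint can look like as a data structure. All five targets and every item/point count are hypothetical; the shape follows slides 15–16: rows are learning targets, columns are DOK levels, cells hold (items, points), and the margin functions compute the row, column, and whole-test summaries.

```python
# A basic test blueprint: blueprint[target][dok] = (items, points).
# All targets and counts below are hypothetical.

DOK_LEVELS = ["Recall", "Skill/Concept", "Strategic Thinking", "Extended Thinking"]

blueprint = {
    "Target 1": {"Recall": (2, 2), "Skill/Concept": (2, 2)},
    "Target 2": {"Recall": (1, 1), "Skill/Concept": (2, 2), "Strategic Thinking": (1, 2)},
    "Target 3": {"Skill/Concept": (2, 2), "Strategic Thinking": (1, 2)},
    "Target 4": {"Recall": (2, 2), "Strategic Thinking": (1, 2)},
    "Target 5": {"Skill/Concept": (1, 1), "Strategic Thinking": (1, 2), "Extended Thinking": (1, 4)},
}

def row_margin(target):
    """Items and points for one learning target (row margin)."""
    cells = blueprint[target].values()
    return (sum(i for i, _ in cells), sum(p for _, p in cells))

def column_margin(dok):
    """Items and points for one DOK level (column margin)."""
    cells = [row[dok] for row in blueprint.values() if dok in row]
    return (sum(i for i, _ in cells), sum(p for _, p in cells))

def test_total():
    """Items and points for the whole test (lower-right corner)."""
    margins = [row_margin(t) for t in blueprint]
    return (sum(i for i, _ in margins), sum(p for _, p in margins))

for t in blueprint:
    print(t, row_margin(t))
for d in DOK_LEVELS:
    print(d, column_margin(d))
print("Test total:", test_total())
```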

  18. Is this reasonable? Rule-of-Thumb Criteria… • At least 3 items per target for reliability • Appropriate: • Distribution of items over targets • Levels of complexity for targets/instruction • Distribution of items over levels of complexity (all items are NOT at the lowest or highest level)
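
These rules of thumb are easy to check mechanically. The sketch below assumes the same {target: {DOK level: (items, points)}} shape as the previous example; the 3-item threshold comes straight from the slide, while the single-level check is a simplified stand-in for the fuller "appropriate distribution" judgments, which remain matters of professional judgment.

```python
# Rule-of-thumb checks on a blueprint of shape {target: {dok: (items, points)}}.
MIN_ITEMS_PER_TARGET = 3   # from the slide: at least 3 items per target

def check_blueprint(blueprint):
    """Return a list of rule-of-thumb violations (empty list = no flags)."""
    problems = []
    for target, row in blueprint.items():
        n_items = sum(i for i, _ in row.values())
        if n_items < MIN_ITEMS_PER_TARGET:
            problems.append(f"{target}: {n_items} item(s), want >= {MIN_ITEMS_PER_TARGET}")
    levels = {dok for row in blueprint.values() for dok in row}
    if len(levels) == 1:
        problems.append(f"all items sit at one DOK level: {levels.pop()}")
    return problems

# Hypothetical blueprint with a too-thin target:
demo = {
    "Target A": {"Recall": (2, 2), "Skill/Concept": (2, 2)},
    "Target B": {"Recall": (2, 2)},
}
print(check_blueprint(demo))   # flags Target B
```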

  19. Professional Judgment • Like all things in education, the development of assessments and the use of the results depend on professional judgment, which can be improved through… • Experience • Collaboration • Reflection on methods → results

  20. Limitations… • Shows total points for each target/level combination, but not how those points apply to each item • Doesn’t show item types • Doesn’t indicate whether partial-credit scoring can/will be used (though it may be implied) • But… it was easy to construct, is still a useful blueprint, and is much better than not making one!

  21. Add details on item type and format to ensure… • Appropriate match to learning targets and associated levels of complexity • Balanced use within tests and across tests over time • Specification of test resources, e.g., calculators, dictionaries, measuring tools… • Track on the same or a separate spreadsheet

  22. Common item types include… • Selected-response • Multiple-choice • Matching • Constructed-response • Brief (fill-in-the-blank, short answer, sort a list) • Extended (outline, essay, etc.) • Performance • Project • Portfolio

  23. Complexity vs. Utility • Your test blueprint could get complicated if you try to account for too much in one spreadsheet. • Make sure your test blueprint covers the basics, is not a burden to create, and is useful to you. • The following example is slightly more complicated, but still workable.

  24. Example 2 – Blueprint with Explicit Items and Item Types
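
As with Example 1, the slide's table is not included in the transcript. A plausible item-level layout, with every item, type, and point value hypothetical, is sketched below; rolling the item rows up reproduces the target-by-DOK summary of the basic blueprint, which is what keeps the two views consistent.

```python
# An item-level blueprint: one row per item, with its target, DOK level,
# item type, and points. Every value below is hypothetical.
from collections import defaultdict

items = [
    # (item number, target, DOK level, item type, points)
    (1, "Target 1", "Recall",             "multiple-choice", 1),
    (2, "Target 1", "Skill/Concept",      "multiple-choice", 1),
    (3, "Target 2", "Skill/Concept",      "matching",        2),
    (4, "Target 2", "Strategic Thinking", "short answer",    2),
    (5, "Target 3", "Extended Thinking",  "essay",           4),
]

# Roll the item rows up into the same target-by-DOK summary as the
# basic blueprint, so both views stay in sync.
summary = defaultdict(lambda: [0, 0])   # (target, dok) -> [items, points]
for _, target, dok, _, points in items:
    summary[(target, dok)][0] += 1
    summary[(target, dok)][1] += points

for (target, dok), (n, pts) in sorted(summary.items()):
    print(f"{target} / {dok}: {n} item(s), {pts} point(s)")
```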

  25. Beyond the Test Blueprint • Answer key (selected-response items) • Links to scoring guides & rubrics • Specs for external test resources • Item numbering for alternate test forms

  26. Conclusions • Destination & Road Map • Alignment/balance of items/types for… • learning targets (curriculum/content) • size (complexity) of targets • cognitive level of targets • relative importance of targets • Specify or document other aspects of the test
