Math Item Design

Presentation Transcript


  1. Math Item Design • Design framework • Item development • Cognitive complexity • Sample items

  2. 4 building blocks • Learning Progression • Item Design • Assessment Quality • Outcomes/Scoring

  3. Item Design Framework • BEAR Assessment System, Step 2 • A match between what is taught and what is assessed • Constructive alignment: aligning teaching and assessment to the learning outcomes/standards (Biggs, 2003) • Proposed items are located along the LP map • [Diagram: one framework (Wilson & Sloane, 2000) linking Learning Outcomes/Standards, Assessment Task, and Teaching & Instruction]

  4. Alignment framework • Item design framework used by the Smarter Balanced assessment under the evidence-centered design approach (Mislevy, Steinberg, & Almond, 2003) • Defined as the degree to which expectations and assessments work together to improve and measure students' learning

  5. 4 criteria to determine the degree of alignment • Categorical concurrence: commonality between the content categories of the standards and those of the assessment items • Range of knowledge correspondence: number of objectives within each standard covered by item(s) • Balance of representation: relative coverage of content categories by items in a test • Depth of Knowledge consistency: match between the cognitive demand of items and the level of cognitive demand communicated by the wording of the objectives
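As a rough illustration (not part of the original slides), the sketch below shows how two of these criteria might be tallied from a coded item pool. The item records, field names, and example data are hypothetical; real alignment studies apply specific thresholds that are not reproduced here.

```python
# Illustrative tally of two alignment criteria from a coded item pool.
from collections import defaultdict

# Hypothetical items, each coded with its target standard and Depth of Knowledge level.
items = [
    {"standard": "8.EE.8", "dok": 2},
    {"standard": "8.EE.8", "dok": 3},
    {"standard": "8.EE.7", "dok": 1},
]
# Hypothetical DOK level implied by the wording of each objective/standard.
standard_dok = {"8.EE.8": 3, "8.EE.7": 2}

# Categorical concurrence: how many items address each standard.
items_per_standard = defaultdict(int)
for item in items:
    items_per_standard[item["standard"]] += 1

# DOK consistency: share of items at or above the DOK of their target standard.
at_or_above = sum(1 for it in items if it["dok"] >= standard_dok[it["standard"]])
dok_consistency = at_or_above / len(items)

print(dict(items_per_standard))              # {'8.EE.8': 2, '8.EE.7': 1}
print(f"DOK consistency: {dok_consistency:.0%}")  # 33%
```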

  6. Item development • Universal design • Design items that accurately assess the targeted competency for all students • Ensure item fairness – make sure that items are equally difficult for groups of equal ability (e.g., males and females; urban and rural students) • Vocabulary & language • Use content-specific language appropriate to the assessed grade • For non-content-specific material, use vocabulary/language from previous grade levels

  7. Item development • Grade appropriateness • Design items that assess a primary content domain/standard of the appropriate grade • “For non-reading items, the reading level is approximately one grade level below the grade level of the test, except for specifically assessed content terms or concepts” (SBAC, 2012) • Using items to link tests • For pre-post test designs, include some items that appeared on previous test(s) to measure student progress • If the time between tests is relatively long (e.g., 2-3 months), the same test can be used • If the time is short (e.g., 2-3 weeks), reuse a few items and include new ones

  8. Cognitive complexity • Use of the modified Bloom’s Taxonomy • Definition • An example of the Cognitive Rigor Matrix (Hess et al., 2009) • Demonstration of how to align standards and proposed item(s) on the LP map

  9. Modified Bloom’s Taxonomy • Modified by Anderson & Krathwohl (2001) • [Diagram: the original Bloom’s Taxonomy (Bloom, 1956), from lowest to highest level: Knowledge, Comprehension, Application, Analysis, Synthesis, Evaluation]

  10. Cognitive rigor matrix

  11. Sample items focus on the Application process

  12. Sample item* • Intended level: Grade 8 • Domain: Expressions and Equations • Content standard: Analyze and solve linear equations and pairs of simultaneous linear equations. • CCSS: Analyze and solve pairs of simultaneous linear equations (8.EE.8). • Intended claims: 1, 2, 3 and 4 Max is organizing a trip to the airport for a party of 75 people. He can use two types of taxi. A small taxi costs $40 for the trip and holds up to 4 people. A large taxi costs $63 for the trip and holds up to 7 people. * Adapted from SBAC (2013, p. 134)

  13. Sample item Grade 8 How many taxis of each type should Max order to keep the total cost as low as possible? Explain.
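A quick check, not from the slides: the short Python sketch below brute-forces the Grade 8 item, enumerating mixes of large and small taxis that seat all 75 people and keeping the cheapest.

```python
# Brute-force check of the Grade 8 taxi item: find the cheapest mix of
# small ($40, 4 seats) and large ($63, 7 seats) taxis for 75 people.
import math

PEOPLE, SMALL_COST, SMALL_SEATS, LARGE_COST, LARGE_SEATS = 75, 40, 4, 63, 7

best = None
for large in range(math.ceil(PEOPLE / LARGE_SEATS) + 1):  # 0 .. 11 large taxis covers all cases
    remaining = max(0, PEOPLE - large * LARGE_SEATS)       # people not yet seated
    small = math.ceil(remaining / SMALL_SEATS)              # small taxis needed for the rest
    cost = large * LARGE_COST + small * SMALL_COST
    if best is None or cost < best[0]:
        best = (cost, large, small)

cost, large, small = best
print(f"Cheapest option: {large} large and {small} small taxis for ${cost}")
```

Running the enumeration shows that nine large taxis plus three small taxis seat exactly 75 people for $687, which beats both the all-large option (11 × $63 = $693) and topping up ten large taxis with two small ones ($710).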

  14. Sample item Grade 7 Let L be the number of large taxis needed and S be the number of small taxis needed. a. Write an expression to show the number of taxis needed. b. If Max orders 6 large taxis, how many small taxis will he need? c. How much will the cost be?
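Again as an illustration rather than slide content: one reasonable reading of part a is the expression L + S (with seating constrained by 7L + 4S ≥ 75), and the sketch below works through parts b and c.

```python
# Worked check (illustrative) for the Grade 7 item, parts b and c:
# with L = 6 large taxis, how many small taxis S are needed, and at what cost?
import math

people = 75
large = 6
seated_in_large = large * 7                        # 6 large taxis seat 42 people
small = math.ceil((people - seated_in_large) / 4)  # 33 people left -> 9 small taxis
cost = large * 63 + small * 40                     # 6 * $63 + 9 * $40 = $738
print(small, cost)                                 # 9 738
```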

  15. Sample item Grade 6 a. Let x be the number of large taxis needed. If Max wants to order large taxis only, find the value of x. b. If each person must equally share the cost of the large taxis from part (a), write an expression for the cost per person. Let y be the amount of money each person must pay. Calculate y.
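For completeness (again not from the slides), a worked check of the Grade 6 version:

```python
# Worked check (illustrative) for the Grade 6 item: large taxis only.
import math

people = 75
x = math.ceil(people / 7)            # 75 / 7 = 10.71..., so x = 11 large taxis
total_cost = x * 63                  # 11 * $63 = $693
y = total_cost / people              # y = 693 / 75 = $9.24 per person
print(x, total_cost, round(y, 2))    # 11 693 9.24
```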

  16. Bibliography
  • Bloom, B. S. (1956). Taxonomy of educational objectives. Handbook I: The cognitive domain. New York, NY: David McKay Co.
  • Anderson, L. W., & Krathwohl, D. (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom's taxonomy of educational objectives. New York, NY: Longman.
  • Hess, K., Carlock, D., Jones, B., & Walkup, J. (2009). What exactly do “fewer, clearer, and higher standards” really look like in the classroom? Using a cognitive rigor matrix to analyze curriculum, plan lessons, and implement assessments. Paper presented at CCSSO, Detroit, MI.
  • Nitko, A. J., & Brookhart, S. (2007). Educational assessment of students. Upper Saddle River, NJ: Pearson Education.
  • McMillan, J. H. (2007). Classroom assessment: Principles and practice for effective standards-based instruction (4th ed.). Boston, MA: Pearson - Allyn & Bacon.
  • Oregon Department of Education. (2014, June). Assessment guidance.
  • Webb, N. (2007). Aligning assessments and standards. Retrieved from http://www.wcer.wisc.edu/news/coverStories/aligning_assessments_and_standards.php
  • Wilson, M. (2005). Constructing measures: An item response modeling approach. New York, NY: Psychology Press, Taylor & Francis Group.
  • Wilson, M., & Sloane, K. (2000). From principles to practice: An embedded assessment system. Applied Measurement in Education, 13(2), 181-208.
  • Smarter Balanced Assessment Consortium. (2012, April). General item specifications.
  • Smarter Balanced Assessment Consortium. (2013, June). Content specifications for the summative assessment of the Common Core State Standards for Mathematics (revised draft).

  17. Creative Commons License Item Design (Math) PPT by the Oregon Department of Education and the Berkeley Evaluation and Assessment Research Center is licensed under a CC BY 4.0 license. You are free to: • Share — copy and redistribute the material in any medium or format • Adapt — remix, transform, and build upon the material Under the following terms: • Attribution — You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use. • NonCommercial — You may not use the material for commercial purposes. • ShareAlike — If you remix, transform, or build upon the material, you must distribute your contributions under the same license as the original. The Oregon Department of Education welcomes editing of these resources and would greatly appreciate being able to learn from the changes made. To share an edited version of this resource, please contact Cristen McLean, cristen.mclean@state.or.us.
