
New England Common Assessment Program




Presentation Transcript


  1. New England Common Assessment Program Item Review Committee Meeting March 26-28, 2007 Lake Morey Inn ~ Fairlee, VT

  2. Welcome and Introductions
     Tim Kurtz, Director of Assessment, NH Department of Education
     Harold Stephens, NECAP Program Director, Measured Progress

  3. NH DOE Introductions

  4. RI DOE Introductions

  5. VT DOE Introductions

  6. Measured Progress Introductions

  7. Measured Progress Introductions

  8. Overview
     • 2007-08 Schedule
     • Bias/Sensitivity Review & Role of Committees
     • Depth of Knowledge
     • Test Item Review & Role of Committees
     • Universal Design for Assessment

  9. NECAP 2007-08 Schedule
     • Item Review Committee meeting – March 26-28 (96 teachers, 4 from each state for reading and mathematics in each grade cluster committee: 3/4, 5/6, 7/8, 11)
     • Bias Committee meeting – March 26-27 (18 teachers, 6 from each state)
     • Face-to-Face meetings – May 22 to June 2 (state content specialists and Measured Progress developers)
     • Test Forms Production & DOE Reviews – July and August
     • Printing – late August and early September

  10. NECAP 2007-08 Schedule
     • Test Administration workshops – late August and early September
     • Shipments to schools – September 10-14
     • Test Administration window – October 1-23
     • Pick up of test materials – October 24
     • Scoring – November
     • Reports shipped to schools – late January 2008

  11. Bias/Sensitivity Review
      How do we ensure that this test works well for students from diverse backgrounds?

  12. What Is Item Bias?
      • Bias is the presence of some characteristic of an assessment item that results in the differential performance of two individuals of the same ability but from different student subgroups.
      • Bias is not the same thing as stereotyping, although we don’t want either in NECAP.
      • We need to ensure that ALL students have an equal opportunity to demonstrate their knowledge and skills.
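
      The definition on slide 12 is what psychometricians usually operationalize as differential item functioning (DIF): holding overall ability constant, does the item behave differently across subgroups? The sketch below is a minimal, hypothetical illustration in Python. The record fields and the simple averaged-gap statistic are assumptions for illustration only, not NECAP's actual procedure; operational programs typically use established DIF statistics such as Mantel-Haenszel.

```python
from collections import defaultdict

def dif_screen(records, item, group_a, group_b):
    """Crude DIF-style screen (illustrative only): within each total-score
    stratum (students of comparable overall ability), compare the item's
    correct-response rate between two subgroups, then average the gaps."""
    strata = defaultdict(lambda: {group_a: [0, 0], group_b: [0, 0]})
    for r in records:
        g = r["group"]
        if g in (group_a, group_b):
            cell = strata[r["total_score"]][g]
            cell[0] += int(r["answers"][item])  # correct responses
            cell[1] += 1                        # students observed

    gaps = []
    for counts in strata.values():
        (ca, na), (cb, nb) = counts[group_a], counts[group_b]
        if na and nb:  # compare only strata where both groups appear
            gaps.append(ca / na - cb / nb)
    # Near 0: the item behaves similarly for equally able students.
    # A large positive or negative value flags the item for human review.
    return sum(gaps) / len(gaps) if gaps else 0.0

# Tiny illustration with made-up records (all fields are hypothetical):
students = [
    {"group": "A", "total_score": 30, "answers": {"item_17": True}},
    {"group": "B", "total_score": 30, "answers": {"item_17": False}},
    {"group": "A", "total_score": 30, "answers": {"item_17": True}},
    {"group": "B", "total_score": 30, "answers": {"item_17": True}},
]
print(dif_screen(students, "item_17", "A", "B"))  # 0.5 for this toy data
```

      A gap near zero suggests the item works similarly for equally able students; a large gap would not prove bias, but it would flag the item for exactly the kind of qualitative committee review described on the next slides.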

  13. Role of the Bias-Sensitivity Review Committee
      The Bias-Sensitivity Review Committee DOES need to make recommendations concerning…
      • Sensitivity to different cultures, religions, ethnic and socio-economic groups, and disabilities
      • Balance of gender roles
      • Use of positive language, situations, and images
      • In general, items and text that may elicit strong emotions in specific groups of students and, as a result, may prevent those groups of students from accurately demonstrating their skills and knowledge

  14. Role of the Bias-Sensitivity Review Committee
      The Bias-Sensitivity Review Committee DOES NOT need to make recommendations concerning…
      • Reading Level
      • Grade Level Appropriateness
      • GE Alignment
      • Instructional Relevance
      • Language Structure and Complexity
      • Accessibility

  15. Depth of Knowledge
      How do we ensure that the test contains a range of complexity?

  16. Depth of Knowledge
      • Level 1: Recall – requires recall of facts, information, or procedure
      • Level 2: Skill/Concept – requires use of information or conceptual knowledge, two or more steps, etc.
      • Level 3: Strategic Thinking – requires reasoning, developing a plan or a sequence of steps, some complexity, and more than one possible answer
      • Level 4: Extended Thinking – requires an investigation, time to think and process multiple conditions of the problem
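
      As a concrete (and purely hypothetical) illustration of how the four levels might be carried as item metadata for the DOK check on slide 29, here is a minimal Python sketch; the structures and names are assumptions, not NECAP's actual system:

```python
from enum import IntEnum

class DOK(IntEnum):
    """The four Depth of Knowledge levels summarized on slide 16."""
    RECALL = 1              # facts, information, or a procedure
    SKILL_CONCEPT = 2       # apply information/concepts, two or more steps
    STRATEGIC_THINKING = 3  # reasoning, planning, more than one possible answer
    EXTENDED_THINKING = 4   # investigation, multiple conditions, extended time

# Hypothetical item metadata: each item carries a DOK code that the
# committee checks against its own judgment.
item = {"id": "item_17", "dok": DOK.STRATEGIC_THINKING}
committee_judgment = DOK.SKILL_CONCEPT
if item["dok"] != committee_judgment:
    print(f"Flag {item['id']}: coded {item['dok'].name}, "
          f"committee says {committee_judgment.name}")
```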

  17.–19. Depth of Knowledge (example slides; graphics not captured in the transcript)

  20. Test Item Review Committees
      This assessment has been designed to support a quality program in mathematics and English language arts. It is grounded in the input of hundreds of NH, RI, and VT educators. Because we intend to release assessment items each year, the development process continues to depend on the experience, professional judgment, and wisdom of classroom teachers from our three states.

  21. Role of the Test Item Review Committees
      Over the next few days you will be looking at passages and items in reading and writing, and at items and graphics in mathematics. People representing the three states and Measured Progress will also be in the breakout rooms.

  22. Role of the DOE Staff The DOE staff are here to listen to the perspectives of the committee members on items. They will provide policy context around item development and answer questions about the rationale for decisions regarding item development and the NECAP program in general.

  23. Role of Measured Progress Developers The staff from Measured Progress are here to facilitate discussion and record feedback and input from the committee members on items and rubrics.

  24. Role of the Committee Members As an item review committee member, you are here to review items and rubrics, provide input, and ensure that the classroom perspectives and expertise of educators inform the development of the NECAP tests.

  25. Role of the Test Item Review Committees
      You will be asked to review all items against the following criteria:
      1) Content Correctness
      2) Universal Design
      3) Grade Expectation Alignment
      4) Depth of Knowledge

  26. Item Review Criteria: Content Correctness
      • Are the items developmentally appropriate?
      • For each multiple-choice item, is there a single correct answer?
      • Are the scoring guides and rubrics consistent with GEs?
      • For each constructed-response item, is there any additional input regarding correct or partially correct responses? What would be acceptable or necessary to earn partial credit?

  27. Item Review Criteria: Universal Design
      • Is there an appropriate use of simplified language (that does not interfere with the construct being assessed)?
      • Are charts, tables, and diagrams easy to read and understandable?
      • Are charts, tables, and diagrams necessary to assist students in either answering or explaining the item?
      • Are instructions easy to follow?
      • Is the item amenable to accommodations – for example, read aloud, signed, or Braille?
      • Is the language clear and accurate (syntax, grammar, conventions)?

  28. Item Review Criteria: Grade Expectation Alignment
      • Is the test item aligned to the appropriate GE?
      • If the item is aligned to more than one GE, is the GE identified the most predominant one assessed?
      • If not, which GE or grade level is more relevant?

  29. Item Review Criteria: Depth of Knowledge
      • Is the item coded to the appropriate Depth of Knowledge level?

  30. Item Review Sheet
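
      The review sheet on slide 30 is a graphic that is not reproduced in the transcript. As a hypothetical sketch only, a record capturing the four criteria from slides 25-29 might look like the following; all field names are assumptions, and the actual NECAP form may differ:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ItemReviewSheet:
    """Hypothetical record of one reviewer's feedback on one item,
    mirroring the four review criteria from slides 25-29."""
    item_id: str
    reviewer: str
    content_correct: bool       # single right answer, sound scoring guide
    universal_design_ok: bool   # clear language/graphics, accommodation-friendly
    ge_aligned: bool            # aligned to the identified Grade Expectation
    dok_appropriate: bool       # coded to the right Depth of Knowledge level
    suggested_ge: Optional[str] = None  # filled in when ge_aligned is False
    comments: str = ""

sheet = ItemReviewSheet(
    item_id="item_17",
    reviewer="grade 7/8 mathematics committee member",
    content_correct=True,
    universal_design_ok=True,
    ge_aligned=False,
    dok_appropriate=True,
    suggested_ge="(a neighboring GE code)",
    comments="Item seems better aligned to an adjacent GE.",
)
```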

  31. Role of the Test Item Review Committees
      You are here today to represent your diverse contexts. We hope that you…
      • share your thoughts vigorously, and listen just as intensely – we have different expertise and we can learn from each other,
      • use the pronouns “we” and “us” rather than “they” and “them” – we are all working together to make this the best assessment possible, and
      • grow from this experience – I know we will.
      And we hope that today will be the beginning of some new interstate friendships.
