
New Mexico Alternate Performance Assessment


Presentation Transcript


  1. New Mexico Alternate Performance Assessment (NMAPA) Test Administrator & Rater Training New Mexico Public Education Department American Institutes for Research February 2008

  2. NMPED / American Institutes for Research NMAPA Team • Mona Martin, Assessment Administrator, NMPED • Lynnett Wright, Project Director, AIR • Dan Hauser, Operations, AIR • Werner Wothke, Technical Advisor, AIR

  3. Administration Window: March 10 to April 25. Testing materials will arrive in the districts by February 29.

  4. Old Alt – New Alt • (OLD) Original NM Alternate Assessment • (NEW) New Mexico Alternate Performance Assessment • See handout for key comparisons

  5. Alternate Assessments Alternate assessments are designed for the small number of students who are unable to participate in regular grade-level state assessments even with appropriate accommodations. (IDEA 1997 & NMAC)

  6. Alternate Assessments • Must be aligned to the state’s content standards • Must yield results in both English language arts and mathematics and must be designed and implemented in a manner that supports use of results as an indicator for AYP • Can measure progress based on alternate achievement standards (NCLB)

  7. Alternate Achievement Standards • Set an expectation of performance that differs in complexity from a grade-level achievement standard • Must be linked to state’s academic content standards • Must promote access to the general education curriculum • Must reflect professional judgment of the highest achievement standards possible (NCLB 34 C.F.R.§200.1(d))

  8. NM Participation CriteriaNMAC 6.31.2.11 (a) The student’s past and present levels of performance in multiple settings (i.e., home, school, community) indicate that a significant cognitive disability is present; (b) The student needs intensive, pervasive, or extensive levels of support in school, home, and community settings; and (c) The student’s current cognitive and adaptive skills and performance levels require direct instruction to accomplish the acquisition, maintenance, and generalization of skills in multiple settings (home, school, community).

  9. NMAPA: Purpose • To promote maximum access to the general education curriculum • To ensure that all students are included in the statewide assessment program • To drive instruction and appropriate pedagogical practices with high expectations

  10. Test Administrator Requirements • A certified employee of the district • An employee of the district who is a special education teacher who has an interim certificate • Test Administrators may not administer the assessment to close relatives.

  11. Training Requirements • Every Test Administrator (generally the student’s teacher) who will administer the NMAPA must attend one of the statewide training sessions.

  12. Administration and Scoring Fidelity • A random sample of the test administrations will be double-scored. • The random sample is based on the students and is determined by PED.

  13. Review and Refinement Process • Rigorous review and refinement by NMPED and AIR, as with any content area achievement test (e.g., fairness, psychometric) • Bias and content review panels of expert New Mexico teachers

  14. Test Administrator • A random sample of NMAPA administrations must be double-scored: The Test Administrator’s record is scored for AYP; the Rater’s record is used for our inter-rater reliability study. • Test Administrator: Teacher, related service provider, diagnostician • Rater: Any of the above and paraprofessionals (EAs)

  15. New Mexico Expanded Grade Band Expectations (EGBEs) Assessment Frameworks

  16. EGBE Development • Began task force work in early spring 2006, finalized on December 15, 2006: focus on academic expectations • Involved a variety of diverse stakeholders from throughout the state, including our Alternate Assessment Advisory Council

  17. EGBE Development • Special, ELL, and general education teachers and administrators wrote and reviewed the EGBEs. • They are available on the Assessment and Evaluation Bureau’s homepage, along with an approval memo from Dr. Garcia and a user’s guide.

  18. EGBE Assessment Frameworks • Linked to grade-level content standards • Basis for IEP goals and objectives • Basis for classroom instruction for students with significant cognitive disabilities

  19. Importance of Font in the EGBEs • Bold = This EGBE is assessed on this year’s NMAPA. • Italic = This is an instructional objective that may be on future tests but is currently to be used for classroom assessment only.

  20. Levels of Complexity Complexity in terms of demands on students • Content (e.g., from simple to complex standards) • Cognitive (e.g., from concrete to more abstract understanding) • Communication levels • Extended Symbolic (Clusters 7 and 8) • Symbolic (Clusters 5 and 6) • Pre-symbolic (Clusters 3 and 4) • Engagement (Clusters 1 and 2)

  21. Link to All of the EGBEs http://www.ped.state.nm.us/div/acc.assess/assess/Expanded_Grade_Band_Expectations/egbe.html

  22. Lessons Learned • Teachers may use the EGBEs as a basis for IEP goals. • Teachers should teach from the EGBEs. • The NMAPA is directly drawn from the EGBEs.

  23. Determining Starting and Concluding Tasks

  24. Pre-assessment to determine the most appropriate starting point • Student Placement Questionnaire (SPQ) • 12–15 “can do” statements addressing student skills and knowledge based on teacher’s prior knowledge • Allows maximum opportunity for student to demonstrate his or her skills without prolonging the assessment

  25. Student Placement Questionnaire • SPQ is designed to identify the most appropriate starting task for each student. • Teachers complete a series of questions regarding student knowledge of the content. • Based on responses, teachers compute a score to determine most appropriate starting task in the test form.

  26. Student Placement Questionnaire (cont.) Each content area SPQ is located in the student score form. The SPQ contains directions for • Identifying the starting task using the SPQ; • Administering and concluding the NMAPA. All students will complete at least these tasks: • 1–5 or • 3–9 or • 6–12
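The relationship between the SPQ-determined starting task and the minimum set of tasks every student completes can be summarized in a small lookup. The sketch below is illustrative only; the function name and data structure are hypothetical, and the actual starting task comes from the SPQ score computed according to the directions in the student score form.

```python
# Minimal sketch (hypothetical helper, not part of the NMAPA materials):
# maps an SPQ-determined starting task to the minimum tasks every student
# must complete, per the SPQ directions (Tasks 1-5, 3-9, or 6-12).
MINIMUM_TASKS = {
    1: range(1, 6),    # start at Task 1 -> Tasks 1-5
    3: range(3, 10),   # start at Task 3 -> Tasks 3-9
    6: range(6, 13),   # start at Task 6 -> Tasks 6-12
}

def minimum_task_set(starting_task: int) -> list:
    """Return the minimum tasks for a given SPQ starting task."""
    if starting_task not in MINIMUM_TASKS:
        raise ValueError("Starting task must be 1, 3, or 6 per the SPQ.")
    return list(MINIMUM_TASKS[starting_task])

print(minimum_task_set(3))  # [3, 4, 5, 6, 7, 8, 9]
```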

  27. Student Placement Questionnaire (cont.) Student “responds successfully” to the beginning task • When the student gets at least three points on all of the items in the beginning task. If the student is not successful, you should drop back to the beginning of the previous group of tasks.

  28. Student Placement Questionnaire (cont.) Student who “responds successfully” to the concluding task: • Continue to the next task, then the next, and so forth until the student no longer responds successfully. Responding successfully • When the student gets at least three points on the items in the concluding task

  29. Student Placement Questionnaire (cont.) For example: • Starting at Task 1: Tasks 1–5, 6, 7, … • Starting at Task 3: Tasks 3–9, 10, 11, … • Starting at Task 6: Tasks 6–12

  30. Student Placement Questionnaire (cont.) • Possibilities • Starting at Task 1 1 2 3 4 5 6 7 8 9 10 11 12

  31. Student Placement Questionnaire (cont.) • Possibilities • Starting at Task 3 3 4 5 6 7 8 9 10 11 12

  32. Student Placement Questionnaire (cont.) • Possibilities • Starting at Task 6 6 7 8 9 10 11 12

  33. Student Placement Questionnaire (cont.) • Possibilities • Starting at Task 1: 1 2 3 4 5 6 7 8 9 10 11 12 • Starting at Task 3: 3 4 5 6 7 8 9 10 11 12 • Starting at Task 6: 6 7 8 9 10 11 12

  34. Move Forward or Stop Example
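The move-forward and drop-back rules on slides 27 through 29 can be read as a simple routine: confirm success on the beginning task (at least three points on all of its items), drop back to the previous group if the student is not successful, administer the group, and then continue past the concluding task while the student keeps responding successfully. The sketch below is a minimal, hypothetical illustration of that logic; the function names, the item-score callback, and the cap at Task 12 (taken from the "possibilities" slides) are assumptions, not part of the official administration directions.

```python
# Minimal sketch of the SPQ move-forward / drop-back / stop routine
# described on slides 27-29. Hypothetical helper, not the NMAPA's own logic.

STARTING_GROUPS = {1: (1, 5), 3: (3, 9), 6: (6, 12)}  # start -> (beginning, concluding)
PREVIOUS_START = {6: 3, 3: 1}                          # drop-back order

def responds_successfully(item_scores):
    """Successful = at least three points on all items in the task."""
    return all(score >= 3 for score in item_scores)

def administer_nmapa(starting_task, administer_task):
    """
    administer_task(task_number) -> list of item scores; a hypothetical
    callback standing in for actually giving and scoring the task.
    Returns the list of task numbers administered.
    """
    start = starting_task
    scores = administer_task(start)
    # Drop back to the previous group if the student is not successful
    # on the beginning task.
    while not responds_successfully(scores) and start != 1:
        start = PREVIOUS_START[start]
        scores = administer_task(start)

    begin, conclude = STARTING_GROUPS[start]
    administered = [begin]
    # Administer the rest of the minimum group of tasks.
    for task in range(begin + 1, conclude + 1):
        scores = administer_task(task)
        administered.append(task)

    # If successful on the concluding task, keep moving forward until the
    # student no longer responds successfully (or Task 12 is reached).
    task = conclude
    while responds_successfully(scores) and task < 12:
        task += 1
        scores = administer_task(task)
        administered.append(task)
    return administered
```

Under this reading, a student who starts at Task 3 and responds successfully through Task 10 but not Task 11 would be administered Tasks 3 through 11, consistent with the example on slide 29.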

  35. Lessons Learned • It is important to follow the directions for completing the SPQ accurately. • The student’s teacher or Test Administrator must complete the SPQ. • If you start too high, you must drop back.

  36. Video Clip Let's watch as Edward is administered the task Being Healthy and Safe.

  37. NMAPA Task and Item Information

  38. NMAPA Tasks A task is a collection of items and materials organized around a theme (e.g., a story, a writing activity, a math activity). • Tasks contain 4 to 6 items. Each item is • Scripted • Scaffolded to reduce complexity • Designed so students can respond verbally or nonverbally • Accompanied by directions for scoring student responses

  39. NMAPA Tasks • Materials include printed and physical manipulatives and storybooks. • Almost all materials are provided in a manipulatives kit.

  40. Task Information Cover page includes • Materials needed to administer each item • Some are provided by the teacher. • Special adaptive instructions • Access limitations and instructions • Introductory and closure statements

  41. Item Information For each item: • Materials • Directions for setup • Placement of manipulatives • Response cards • Displaying a text

  42. Item Scripting • Scaffolded scripting • Opening statement in a say/do format • Tell me or show me

  43. Item Scripting Opening statement or question • "Here is a _______________." • "We are going to talk about __________." • Followed by: • The student showing or telling which response option is correct

  44. Graphic Setup • Tells where each of the materials is to be placed. • Script is always on the left. • Scoring and scaffolding directions are always on the right.

  45. Scaffolding • If the student does not answer correctly or fails to respond, there are specific instructions for the Test Administrator. • These instructions are in boxes within each test item.

  46. Scaffolding (cont.) For example: • If a student indicates _________, record a 3. If not, continue with the scaffolded script below.
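The scaffolding flow on slides 45 and 46 can be read as a short decision sequence: an unprompted correct response is recorded as a 3, and otherwise the Test Administrator works through the boxed scaffolded script step by step. The sketch below is only an illustration; the function names are hypothetical, and the score values recorded after scaffolding are placeholders, since the slides state only the unprompted "3" and leave the remaining values to each item's scoring directions.

```python
# Minimal sketch of how an item's scaffolded scoring flows, per slides 45-46.
# Hypothetical helper; actual scores come from each item's scoring directions.

def score_item(present_prompt, scaffold_steps):
    """
    present_prompt() -> True if the student indicates the correct response.
    scaffold_steps: ordered (prompt_fn, score_if_correct) pairs taken from the
    boxed scaffolding instructions in the item (score values are placeholders).
    """
    # Unprompted correct response: record a 3 and stop.
    if present_prompt():
        return 3
    # Otherwise, continue with the scaffolded script, one step at a time.
    for prompt, score_if_correct in scaffold_steps:
        if prompt():
            return score_if_correct
    return 0  # placeholder score when no scaffolded step elicits a response
```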

  47. Item Format Example of an Item

  48. Video Clips Let's watch as Devin is administered the task Money Identification and Tim is administered Ride of My Life.

  49. Adapting and Accommodating to Meet Students' Needs

  50. Allowable Accommodations • Mayer-Johnson pic-syms have been used throughout the tasks and items. • If your student uses a different symbol for the same word, you may substitute that symbol for the one provided. For example:
