Leadership in Using NeSA Data Data Conference April 18-19, 2011


Presentation Transcript


  1. NEBRASKA STATE ACCOUNTABILITY Leadership in Using NeSA Data Data Conference April 18-19, 2011 Pat Roschewski pat.roschewski@nebraska.gov Jan Hoegh jan.hoegh@nebraska.gov John Moon john.moon@nebraska.gov

  2. Nebraska schools have many sources of data: Each assessment tool has a purpose and a role in the big picture of Continuous Improvement! Commit to data analysis as a continuous process, not an event. (Reeves, 2009)

  3. Commit to data analysis as a continuous process, not an event. (Reeves, 2009)

  4. AND NOW . . . Nebraska State Accountability

  5. Nebraska schools should use NeSA data to . . . • Provide feedback to students, parents and the community • Inform instructional decisions. • Inform curriculum development and revision. • Measure program success and effectiveness. • Promote accountability to meet state and federal requirements.

  6. Leadership in Using NeSA Data • Session I • What is NeSA? • How do we access and interpret NeSA data? • Session II • How do we use NeSA data?

  7. What is NeSA? Commit to data analysis as a continuous process, not an event. (Reeves, 2009)

  8. NeSA . . . • Criterion-referenced summative tests. • Measurement of the revised Nebraska Academic Standards for reading, mathematics, and science. • Tools that include multiple-choice items. • Tests administered to students online OR paper/pencil.

  9. Commit to data analysis as a continuous process, not an event. (Reeves, 2009)

  10. NeSA . . . • Administered during the spring of the year. • Based on Tables of Specification and Performance Level Descriptors. • Built upon the best thinking of Nebraska educators, national experts, and a worthy partner – Data Recognition Corporation.

  11. What is the author’s purpose for writing the story? • to inform the reader about chores for children • to persuade the reader to increase chore rates • to entertain about two children visiting a farm • to describe the benefits of living on a farm

  12. Goals for Instruction

  13. Where do we find the content components of NeSA? • www.education.ne.gov/Assessment • Tables of Specification • Performance Level Descriptors • Accommodations Guides • Webb’s DOK documents • Other important NeSA documents

  14. NeSA . . . • Produces a raw score that converts to a scale score of 0-200. • Allows for students to be classified into one of three categories: Below the Standards, Meets the Standards, Exceeds the Standards. • Provides comparability across Nebraska school buildings and districts.

  15. How do we access and interpret NeSA data? Commit to data analysis as a continuous process, not an event. (Reeves, 2009)

  16. What is the Data Reporting System (DRS)? • Secure Site – through portal • Public Site – NDE website http://drs.education.ne.gov Commit to data analysis as a continuous process, not an event. (Reeves, 2009)

  17. Interpreting NeSA-R Data Reports • District/building level information • Individual student level information • Subgroup information • Indicator information

  18. Use the Reports Interpretive Guide! http://www.education.ne.gov/Assessment/documents/NESA.Read.InterpretiveGuide.pdf

  19. PERFORMANCE LEVELS - three possible categories of student performance on NeSA ~NeSA Terminology~

  20. How are performance levels determined? • Cut score processes: • Contrasting Group Method – 400+ teachers • Bookmark Method – 100+ teachers • State Board of Education Reviewed • Examined results of both processes • Examined NAEP and ACT results for Nebraska • Made decisions within recommended range at public meeting
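The result of the cut-score process can be sketched in a few lines of code. The cut values below (85 and 135) are hypothetical placeholders, not Nebraska's actual cut scores, which were set by the State Board within the recommended range:

```python
# Classify a 0-200 NeSA scale score into one of the three performance
# levels. The cut scores used here are HYPOTHETICAL examples only.
def performance_level(scale_score, meets_cut=85, exceeds_cut=135):
    if scale_score < meets_cut:
        return "Below the Standards"
    elif scale_score < exceeds_cut:
        return "Meets the Standards"
    else:
        return "Exceeds the Standards"

print(performance_level(70))   # Below the Standards
print(performance_level(110))  # Meets the Standards
print(performance_level(160))  # Exceeds the Standards
```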

  21. RAW SCORE – the number of items a student answers correctly on NeSA; reported on NeSA reports and on the Conversion Chart ~NeSA Terminology~

  22. SCALE SCORE – a transformed version of the raw score a student earned on NeSA ~NeSA Terminology~

  23. What is the difference between a raw score and a scale score? What is a raw score? A raw score is the number of correct items. Raw scores have typically been used in classrooms as percentages: 18/20 = 90% correct. ~NeSA Terminology~

  24. What is a scale score? A scale score is a “transformation” of the number of items answered correctly to a score that can be more easily interpreted between tests and over time. The scale score maintains the rank order of students (i.e., a student who answers more items correctly gets a higher scale score). For NeSA, we selected 0-200 and will use it for all NeSA tests, including writing. ~NeSA Terminology~

  25. Why convert raw scores to scale scores? Raw scores are converted to scale scores in order to compare scores from year to year. Raw scores should not be compared over time because items vary in difficulty level. Additionally, raw scores should not be compared across different content area tests. Scale scores add stability to data collected over time that raw scores do not provide. ~NeSA Terminology~
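A toy illustration of the idea behind scaling: NeSA's actual conversion comes from psychometric equating tables, but a simple linear transform shows why scale scores remain comparable when a form's difficulty shifts. All numbers below are invented:

```python
# Toy raw-to-scale conversion (NOT the real NeSA procedure). Each test
# form gets its own conversion line so the SAME scale score represents
# the same level of performance, even when the raw-score cut moves
# because one year's items are harder or easier.
def raw_to_scale(raw, max_raw, cut_raw, cut_scale=100, max_scale=200):
    """Map raw scores so this form's cut lands exactly at cut_scale."""
    if raw <= cut_raw:
        return round(cut_scale * raw / cut_raw)
    return round(cut_scale + (max_scale - cut_scale)
                 * (raw - cut_raw) / (max_raw - cut_raw))

# An easier form (cut at 30 of 50 items) and a harder form (cut at 25)
# both place their raw-score cut at scale score 100.
print(raw_to_scale(30, max_raw=50, cut_raw=30))  # 100
print(raw_to_scale(25, max_raw=50, cut_raw=25))  # 100
```

The point of the sketch: the raw scores 30 and 25 are not comparable across the two forms, but the scale score 100 is.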

  26. On score reports why is the . . . SCALE SCORE CONVERTED TO PERCENTILE RANK? The percentile rank was placed on the score reports because our Technical Advisory Committee felt that parents would want to know their child’s position in relation to other test takers. A percentile rank of 84 means the child scored better than 84% of the students who took the test that year. ~NeSA Terminology~
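The percentile rank described above is simple to compute; a minimal sketch, with an invented list of scores:

```python
# Percentile rank as defined on the score report: the percentage of
# test takers the student scored better than. Scores below are made up.
def percentile_rank(score, all_scores):
    below = sum(1 for s in all_scores if s < score)
    return round(100 * below / len(all_scores))

scores = [80, 90, 95, 100, 105, 110, 115, 120, 130, 140]
print(percentile_rank(120, scores))  # better than 7 of 10 -> 70
```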

  27. NeSA (CRT) vs. NRT? • --Differences-- • Purposes: • NeSA is intended to match and measure identified standards and instruction. • NRT is not intended to measure any state’s standards; the intention is to compare students to each other. • Item Development: • NeSA items exactly match the standards – NDE had to prove the match with an independent alignment study. • NRT – no standards to match; items reflect general prior knowledge, home background, and pre-skills.

  28. NeSA (CRT) vs. NRT? • --Similarities-- • All the psychometric steps – standard setting (Bookmark, Angoff, Contrasting Group) • Reliabilities – KR-20/21, inter-rater reliabilities • Descriptive statistics (item p-values, DIF analysis) • Administration: • Both standardized – generally administered the same way.
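Two of the statistics named on the slide, KR-20 reliability and item p-values, can be sketched from a matrix of 0/1 item responses. The response data below are invented for illustration:

```python
# KR-20 reliability from dichotomous (right/wrong) item responses.
# Formula: (k / (k-1)) * (1 - sum(p*q) / variance of total scores),
# where p is each item's proportion correct (its "p-value") and q = 1-p.
def kr20(responses):
    """responses: list of per-student lists of 0/1 item scores."""
    n = len(responses)
    k = len(responses[0])
    # Item p-values: proportion of students answering each item correctly.
    p = [sum(stu[j] for stu in responses) / n for j in range(k)]
    totals = [sum(stu) for stu in responses]
    mean = sum(totals) / n
    var = sum((t - mean) ** 2 for t in totals) / n  # population variance
    return (k / (k - 1)) * (1 - sum(pj * (1 - pj) for pj in p) / var)

data = [[1, 1, 1, 1],
        [1, 1, 1, 0],
        [1, 1, 0, 0],
        [1, 0, 0, 0],
        [0, 0, 0, 0]]
print(round(kr20(data), 2))  # 0.8
```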

  29. Leadership in Using NeSA Data • Session I • What is NeSA? • How do we access and interpret NeSA data? • Session II • How do we use NeSA data?

  30. NeSA results ARE an important data source! When combined with other information, these data can support curricular, instructional, and learning support decision making. --It’s all about the Continuous Improvement Process!
