
So Much Data – Where Do I Start?


Presentation Transcript


  1. So Much Data – Where Do I Start? Assessment & Accountability Conference 2008 Session #18

  2. Purpose of the Presentation
  • Review what MI-Access reports are provided
  • Review what information can be found on the MI-Access reports
  • Talk about how the results might be used

  3. Three MI-Access Assessments
  • Participation: For students who have, or function as if they have, severe cognitive impairment
  • Supported Independence: For students who have, or function as if they have, moderate cognitive impairment
  • Functional Independence: For students who have, or function as if they have, mild cognitive impairment

  4. Selecting the Right Assessment
  • The IEP Team determines which assessment is appropriate
  • MI-Access Web page: www.mi.gov/mi-access (look under the “IEP Team Information” section)
  • The data will only be helpful IF the appropriate assessment is selected

  5. Content Areas Assessed
  • All three populations are assessed in three content areas in the fall: English language arts, mathematics, and science
  • ELA and mathematics are assessed in grades 3-8
  • Science is assessed in grades 5 and 8
  • Grade 11 MI-Access students are assessed in the spring (ELA, mathematics, and science)

  6. MI-Access Reports

  7. Individual Student Reports: Design
  • Student demographic information
  • Student performance summary
  • Individual item analysis for released items
  • Earned points/points possible by component or strand

  8. Individual Student Report: Participation ELA

  9. How Participation Scores are Derived
  • Two types of items: activity-based observation and selected-response
  • PAA and SAA observe and score the student
  • PAA and SAA scores are added together to determine total item points

  10. Individual Student Report: Supported Independence Mathematics

  11. How SI Scores are Derived
  • Two types of items: activity-based observation (except science) and selected-response
  • PAA and SAA observe and score the student
  • PAA and SAA scores are added together to determine total item points

  12. Individual Student Report: Participation Science

  13. Individual Student Report: Functional Independence Science

  14. How FI Scores are Derived
  • No scoring rubric for Functional Independence
  • Only selected-response items
  • Student receives 1 point for each correct response
  • The ONLY exception is ELA Expressing Ideas
  • Open-ended response to a prompt
  • Scored using a 4-point rubric
  • Student may receive up to 4 points per prompt
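The FI scoring rules above can be sketched as a small calculation. This is a hypothetical illustration only: the function name, its inputs, and the example numbers are invented for demonstration and are not actual MI-Access items or data.

```python
# Hypothetical sketch of Functional Independence scoring as described above:
# each selected-response item earns 1 point when answered correctly, and the
# ELA Expressing Ideas prompt is scored on a 4-point rubric (up to 4 points).

def fi_earned_points(selected_correct_flags, expressing_ideas_rubric_scores=()):
    """selected_correct_flags: one boolean per selected-response item.
    expressing_ideas_rubric_scores: rubric scores (0-4), one per prompt (ELA only)."""
    points = sum(1 for correct in selected_correct_flags if correct)
    for score in expressing_ideas_rubric_scores:
        if not 0 <= score <= 4:
            raise ValueError("rubric scores range from 0 to 4")
        points += score
    return points

# Example: 18 of 25 selected-response items correct, plus one Expressing Ideas
# prompt scored 3 on the rubric -> 18 + 3 = 21 earned points
print(fi_earned_points([True] * 18 + [False] * 7, [3]))  # 21
```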

  15. Other Differences: Condition Codes
  P/SI Condition Codes: All Content Areas
  • A = responds incorrectly
  • B = resists/refuses
  • C = responds only after the assessment administrator provides hand-over-hand assistance or step-by-step directions
  FI Condition Codes: ELA Expressing Ideas Only
  • A = off topic
  • B = illegible
  • C = written in a language other than English
  • D = blank/refused to respond
  All condition codes = zero points
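A minimal sketch of how condition codes interact with scoring, following the rule above that any condition code earns zero points. The function and data structures are hypothetical; the code lists are transcribed from this slide.

```python
# Code meanings differ by assessment, but the point value does not:
# any condition code earns zero points.

P_SI_CONDITION_CODES = {
    "A": "responds incorrectly",
    "B": "resists/refuses",
    "C": "responds only after hand-over-hand assistance or step-by-step directions",
}
FI_EXPRESSING_IDEAS_CODES = {
    "A": "off topic",
    "B": "illegible",
    "C": "written in a language other than English",
    "D": "blank/refused to respond",
}

def item_points(score, condition_code=None):
    """Return earned points for one item; any condition code yields zero."""
    if condition_code is not None:
        return 0
    return score

print(item_points(2))        # 2
print(item_points(2, "B"))   # 0 (condition code overrides the score)
```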

  16. Other Differences: Performance Level Change (ONLY for Functional Independence ELA and Mathematics)

  17. Performance Level Change SI = Significant Improvement; I = Improvement; N = No Change; D = Decline; SD = Significant Decline

  18. Using ISRs for Instruction and Curriculum
  • Compare the ISR to other data you have for the student. Is this what you would expect?
  • If prior-year and performance level change data are provided, is the student making progress over time?
  • Using released item booklets (www.mi.gov/mi-access), match items to curriculum and instruction. Has the student been taught this? If yes, are teaching methods effective for this student?
  • Identify strengths to reinforce
  • Identify areas where additional instruction is needed
  • Confirm this is the right assessment for the student to take

  19. Rosters
  • List scores by individual student
  • One report for each grade and content area
  • Provided at class, school, and district levels (not state)

  20. Rosters
  • Show the number assessed and the mean scale score or mean earned points at top left
  • List results by student

  21. Rosters
  • For FI ELA and mathematics, show the current-year scale score, two years of performance levels (high, mid, and low), and the performance level change

  22. Rosters
  • Show earned points by component or strand, by EGLCE or EB, and overall

  23. Using Rosters for Instruction and Curriculum
  • The best place to gather information about assessment items, because results are provided by EGLCE or EB
  • www.mi.gov/mi-access

  24. Summary Reports
  • Executive summaries of student scores
  • Provided at school, district, and state levels
  • One report for each grade and each content area assessed (ELA, mathematics, and/or science)
  • Report generated ONLY when there are 10 or more students at the same grade level taking the same assessment

  25. Supported Independence District Summary Report: ELA
  • Summary results for the current year are shown at top right

  26. Supported Independence District Summary Report: ELA
  • Shows the # and % of students who earned each earned-points total

  27. Functional Independence Summary Reports: ELA and Mathematics (Grades 4, 5, 6, 7, and 8 ONLY)
  PERFORMANCE LEVEL CHANGE — YEAR-TO-YEAR TRANSITIONS

  28. Functional Independence Summary Reports: Performance Level Change
  • “Gaining” means there was an improvement from 2006 to 2007
  • “Maintaining” means scores that were “proficient” stayed the same
  • “Not gaining” means scores that were “not proficient” stayed the same
  • “Declining” means there was a decline from 2006 to 2007
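The four transition categories can be sketched as a small classifier. This is a hypothetical illustration only: it assumes the three roster performance levels (low, mid, high) and treats “high” as standing in for “proficient”, which is an assumption for demonstration rather than anything stated in the reports.

```python
# Hypothetical sketch of the year-to-year transition labels described above.
# ASSUMPTION: performance levels are ordered low < mid < high, and "high"
# stands in for "proficient" (the real reports define proficiency separately).

def transition(level_prior, level_current, order=("low", "mid", "high")):
    """Classify a student's year-to-year performance level change."""
    prev, curr = order.index(level_prior), order.index(level_current)
    if curr > prev:
        return "Gaining"
    if curr < prev:
        return "Declining"
    # No change: the label depends on whether the unchanged level is proficient
    return "Maintaining" if level_current == "high" else "Not gaining"

print(transition("mid", "high"))   # Gaining
print(transition("high", "high"))  # Maintaining
print(transition("low", "low"))    # Not gaining
```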

  29. Functional Independence Summary Reports: ELA and Mathematics (Grades 4, 5, 6, 7, and 8 ONLY)
  PERFORMANCE LEVEL CHANGE — SUMMARY

  30. Using Summary Reports for Instruction and Curriculum
  • Look at progress collectively. How did our fifth graders do this year compared to last year?
  • Look at progress over time. Are there trends to be aware of?
  • Compare state assessment data to other data. Is this what we would expect?
  • Within the performance levels, where are clusters of students?

  31. Functional Independence Summary Report: Science

  32. P/SI Parent Reports
  • Page 1 shows the student’s earned points and performance level for the current year

  33. P/SI Parent Reports
  • Page 2 shows the appropriate scoring rubric, and ELA and mathematics scores by component or strand

  34. P/SI Parent Reports
  • Page 3 shows science scores by strand and the ELA individual student item analysis

  35. P/SI Parent Reports
  • Page 4 shows the mathematics and science individual student item analysis

  36. Functional Independence Parent Reports (Grades 4, 5, 6, 7, & 8)
  • Page 1 shows the student’s scale score and performance level for 2007 and 2006. Also shows the “Performance Level Change” from the past year to the current year

  37. Functional Independence Parent Reports
  • Page 2 shows the student’s earned points for ELA and mathematics, and his/her scale scores in a range

  38. Functional Independence Parent Reports
  • Page 3 shows the student’s earned points and scale score range for science. Also shows the student’s item analysis for ELA

  39. Functional Independence Parent Reports
  • Page 4 shows the student’s individual item analysis for mathematics and science

  40. Item Analyses
  • Provide detailed, aggregated information on released items
  • Can be used to identify areas of collective strength and areas that need improvement
  • Provided at school, district, and state levels
  • Provided only when 10 or more students in the same grade take the same assessment

  41. District Item Analysis: ELA Shows # and % of students who selected each answer choice for each released Word Recognition and Text Comprehension item

  42. District Item Analysis: ELA Shows the # and % of students at (1) each score based on a 4-point rubric, and (2) each condition code

  43. District Item Analysis: ELA Also shows the # and % of students who received specific comment codes

  44. Using Item Analyses Use the Item Analysis Report along with the Released Item Booklet (available at www.mi.gov/mi-access) to identify collective strengths and weaknesses

  45. Using Item Analyses: ELA (State, Fall 2006, Functional Independence, Grade 3)
  • Informational passage about a chameleon
  • EGLCE being measured: Make inferences, predictions, and conclusions
  • Only 62.1% answered the item correctly (A)
  • C was the incorrect answer chosen most often (23%)

  46. Using Item Analyses: ELA (State, Fall 2006)
  • Same narrative passage
  • EGLCE being measured: Identify main ideas and details
  • 74.7% of students answered the item correctly (B)
  • A was the incorrect answer chosen most often (13%)

  47. Using Item Analyses: ELA (State, Fall 2006)
  • Expressing Ideas
  • EGLCE being measured: Write/draw personal narrative
  • Only 7% of students received a “4,” and 23% received a “3”
  • Comment code given most often = “Showed limited development with insufficient details and/or examples”

  48. Using Item Analyses: Mathematics (State, Fall 2006, Functional Independence, Grade 7)
  • EGLCE being measured: Solve problems using data
  • Only 33.6% answered correctly (A)
  • C was the incorrect answer chosen most often (54.6%)

  49. Using Item Analyses: Mathematics (State, 2006)
  • EGLCE being measured: Recognize representations for whole numbers to 10,000
  • 96.5% answered correctly (C)
  • B was the incorrect answer chosen most often (2.2%)
