
Presentation Transcript


  1. 2014 CAASPP Interpreting and Using Results September 2014 Webcast

  2. Objectives Workshop participants will be able to: • Describe the purposes of CAASPP reports • Interpret CAASPP results • Explain key statistics • Compare and contrast types of reports • Identify proper uses of reports 2014 Post-Test Workshop 2

  3. Agenda • What’s New? • Results and Statistical Analysis • Using Results • Summary and Internet Reports • Data CDs • Individual Student Reports • Early Assessment Program 2014 Post-Test Workshop 3

  4. What’s New in 2014 • Results are available for only those California Assessment of Student Performance and Progress (CAASPP) content areas administered. • Results for the Early Assessment Program (EAP) for students in grade eleven who took these tests are not reported by the CAASPP Program. • Percent tested data are not reported for the Standards-based Tests in Spanish (STS). • Enrollment data—number and percentage—are not reported for the STS. Post-Test Guide (PTG) 2–3 2014 Post-Test Workshop 4

  5. What’s New in 2014 • Since there were no CAASPP end-of-course (EOC) tests administered, there is no Student Master List Summary EOC report. • Electronic reporting, formerly Quick-turnaround Reporting, is now offered for all remaining paper-pencil tests. Files have been available since August 8. • EAP results are part of electronic reporting. • Handscoring of tests is not offered in 2014. PTG 2–3

  6. Results: Purposes of CAASPP Reports • Report progress toward proficiency on the state’s academic content standards • Identify where improvement is needed • To help improve students’ achievement • To improve educational programs PTG 4 2014 Post-Test Workshop 6

  7. Results: Performance Levels • State goal: All students score at proficient or above • Proficient: scale score of 350 or higher • CST for Science • CMA for Science • STS for RLA • CAPA proficient: scale score of 35 or higher PTG 7–13 2014 Post-Test Workshop 7

  8. Results: Other Performance Levels • Advanced • Basic cut score • CST for Science: 300 • CMA for Science: 300 • STS for RLA: 300 • CAPA: 30 • Below basic • Far below basic • For each testing program, cut points vary for advanced and below basic by • Subject • Grade PTG 7–13; Appendix B 2014 Post-Test Workshop 8
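The cut scores above pin down only the basic and proficient thresholds; the advanced and far-below-basic cuts vary by subject and grade (PTG Appendix B). A minimal sketch of the classification logic in Python, with those two varying cuts as hypothetical placeholders:

    # Sketch of performance-level classification for a CST/CMA/STS scale
    # score. The proficient (350) and basic (300) cuts come from the slides;
    # ADVANCED_CUT and BELOW_BASIC_CUT are hypothetical placeholders, since
    # the real values vary by subject and grade (PTG Appendix B).
    ADVANCED_CUT = 400      # hypothetical
    PROFICIENT_CUT = 350
    BASIC_CUT = 300
    BELOW_BASIC_CUT = 250   # hypothetical

    def performance_level(scale_score: int) -> str:
        if scale_score >= ADVANCED_CUT:
            return "advanced"
        if scale_score >= PROFICIENT_CUT:
            return "proficient"
        if scale_score >= BASIC_CUT:
            return "basic"
        if scale_score >= BELOW_BASIC_CUT:
            return "below basic"
        return "far below basic"

The same logic applies to the CAPA on its 15–60 scale, with proficient at 35 and basic at 30.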

  9. Results: Scale Scores • Scale scores allow the same score to mean the same thing across test versions within grade and content area. • Scale scores account for differences in difficulty. • Scale score ranges by program are: • CST for Science, CMA for Science, STS for RLA: 150–600 for each grade and subject • CAPA: 15–60 for each level and subject PTG 7–13 2014 Post-Test Workshop 9

  10. Results: Equating • Psychometric procedure • Adjusts for test difficulty from year to year (form to form) • Additional information in the technical reports on the CDE Technical Reports and Studies Web page at http://www.cde.ca.gov/ta/tg/sr/technicalrpts.asp PTG 7
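The slide leaves the procedure abstract; purely as an illustration (the operational CAASPP methods are documented in the technical reports linked above and are not necessarily linear), the simplest equating transformation, mean-sigma linear equating, rescales a new-form score to the reference form’s mean and standard deviation:

    # Illustration only: mean-sigma linear equating. The operational CAASPP
    # procedures are described in the CDE technical reports and may differ;
    # all statistics below are hypothetical.
    def linear_equate(x, mean_new, sd_new, mean_ref, sd_ref):
        """Rescale a new-form score onto the reference form's scale."""
        return mean_ref + (sd_ref / sd_new) * (x - mean_new)

    # Example: the new form was slightly harder (lower mean), so scores adjust up.
    print(linear_equate(52.0, mean_new=50.0, sd_new=10.0,
                        mean_ref=53.0, sd_ref=10.0))  # 55.0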

  11. Results: Reporting Clusters (Content Area) • Three to six clusters for each subject. • May be useful as indicators of individual or group strengths and weaknesses. • But. . . reporting clusters should be interpreted with caution. PTG 8–13; Appendix A 2014 Post-Test Workshop 11

  12. Results: Reporting Clusters Cautions • Cluster percent correct is available for the CSTs for Science, CMA for Science, and STS for RLA. • Clusters are based on small numbers of items and therefore may not be reliable or generalizable. • Clusters are NOT equated from year to year. • Do not compare reporting cluster percent correct from year to year. PTG 8–13; Appendix A 2014 Post-Test Workshop 12

  13. Interpreting Reporting Clusters or Content Areas in the Same Year • Compare to percent-correct range of proficient students statewide PTG 8–13; Appendix A 2014 Post-Test Workshop 13
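Slides 11–13 describe a same-year comparison: a group’s cluster percent correct set against the statewide percent-correct range of students who scored proficient on the whole test. A minimal sketch, using a hypothetical range (real ranges are in PTG Appendix A):

    def interpret_cluster(pct_correct: float, proficient_range: tuple) -> str:
        # proficient_range is the statewide percent-correct range of students
        # who scored proficient on the total test (PTG Appendix A); the
        # values used in the example call are hypothetical.
        low, high = proficient_range
        if pct_correct < low:
            return "below the statewide proficient range"
        if pct_correct > high:
            return "above the statewide proficient range"
        return "within the statewide proficient range"

    print(interpret_cluster(68.0, (62.0, 75.0)))  # within the statewide proficient range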

  14. 2014 CST Reporting Clusters: Number of Questions and Average Percent Correct PTG Appendix A 2014 Post-Test Workshop 14

  15. Examples—Interpreting Reporting Clusters for the CST for Science PTG 9 2014 Post-Test Workshop 15

  16. Using Results • For instructional decisions in conjunction with other data • CAPA ELA and mathematics in grade ten used in adequate yearly progress (AYP) calculations PTG 2, 4 2014 Post-Test Workshop 16

  17. Year-to-Year Comparisons Do Compare CSTs: Same Grade and Same Content Area • Mean scale score • Same content and grade, varying years • Percent in each performance level • Same content by grade across years PTG 10–13 2014 Post-Test Workshop 17

  18. Year-to-Year Comparisons Do Compare CSTs: Percent Proficient and Advanced • Percentage of students scoring at PROFICIENT and above • For a given grade and subject, e.g., percent proficient and above for grade 5 science in 2013 and 2014 • For a given subject and aggregated grades, e.g., percent proficient and above for grades 5, 8, and 10 science in 2013 and 2014 • Across grades and a subject, e.g., percent proficient and above in all courses and all grades PTG 10–13 2014 Post-Test Workshop 18
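A minimal sketch of the proficient-and-above comparison this slide describes, using hypothetical counts for two of the allowed aggregations (per grade, and grades combined):

    # Hypothetical counts: (proficient or advanced, valid scores) per grade.
    counts = {5: (420, 700), 8: (390, 650), 10: (310, 600)}

    for grade, (prof, valid) in counts.items():
        print(f"grade {grade}: {100 * prof / valid:.1f}% proficient and above")

    total_prof = sum(p for p, _ in counts.values())
    total_valid = sum(v for _, v in counts.values())
    print(f"grades 5, 8, and 10 combined: {100 * total_prof / total_valid:.1f}%")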

  19. Year-to-Year Comparisons Don’t Compare • Individual scale scores or statistics based on scale scores for different grades or content areas • Subjects by grade are independently scaled • Different content standards are measured in different grades • Cohorts across grades • Across tests • Scale scores to percent correct scores • CAPA to years prior to 2009, because of new standard setting that year PTG 10–13 2014 Post-Test Workshop 19

  20. Example—Using CST Results to Compare Grade Results from Year to Year PTG 10 2014 Post-Test Workshop 20

  21. Aggregate (Summary) Reports • What are they? • Student Master List Summary • Subgroup Summary • Report emphasis: CSTs for Science • Criterion-referenced tests • Progress is measured in percent of students scoring proficient and advanced • Back of reports provides guide to abbreviations and score codes PTG 16–39 2014 Post-Test Workshop 21

  22. Student Master List Summary • Tests are as follows: • CSTs for Science, CMA for Science: Grades 5, 8, and 10 • CAPA for ELA and Mathematics: Grades 2–11; Levels I–V • CAPA for Science: Grades 5, 8, and 10; Levels I and III–V • STS for RLA: Grades 2–11 • # and % at each performance level • Mean scale score • Reporting cluster: Mean percent correct (except CAPA) PTG 16–18; PTG 22–25 2014 Post-Test Workshop 22

  23. Student Master List Summary Grade 5 Example PTG 25 2014 Post-Test Workshop 23

  24. Student Master List Summary Basic Statistics PTG 23–25 2014 Post-Test Workshop 24

  25. Who Counts? Number Enrolled • For the content area: • Number of multiple-choice answer documents submitted minus • Documents marked as “Student enrolled after the first day of testing and was given this test” • Does not apply to the STS PTG 22–25 2014 Post-Test Workshop 25

  26. Who Counts? Number Tested • For the content area, number of students who responded to any questions on the test or whose answer documents were marked to indicate that the student tested but marked no answers (special condition Z). • Not included: • A = Students absent • E = Not tested due to significant medical emergency • P = Parent/guardian exemptions • T = Enrolled first day, not tested, tested at previous school • Students with inconsistent grades • Non–English learners who took the STS PTG 22–25 2014 Post-Test Workshop 26

  27. Who Counts? Number and Percent Valid Scores • Number Valid Scores • For the subject, number of students tested at grade level who received a score for the test. • Not included: • Incomplete tests • Modified tests • Non–English learners who took the STS • Inconsistent grades • Percent Valid Scores • For the subject, number of valid scores divided by the number of students tested. PTG 24–25 2014 Post-Test Workshop 27

  28. Who Counts? Number Tested with Scores • All tests taken, including those taken with modifications, that result in a score • Not included: • Incomplete tests • Non–English learners who took the STS • Inconsistent grades PTG 24–25 2014 Post-Test Workshop 28
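Slides 25–28 are, in effect, filtering rules over answer documents. A minimal sketch of the tallies, with hypothetical field names (the actual fields and codes are defined in the data CD layout):

    # Hypothetical record fields; not the actual data CD layout.
    EXCLUDED = {"A", "E", "P", "T"}  # absent, medical emergency, parent exemption, tested elsewhere

    def tally(records):
        tested = [r for r in records
                  if r["condition"] not in EXCLUDED and not r["inconsistent_grade"]]
        with_scores = [r for r in tested if r["score"] is not None]  # drops incomplete tests
        valid = [r for r in with_scores if not r["modified"]]
        return {
            "number_tested": len(tested),
            "number_tested_with_scores": len(with_scores),
            "number_valid_scores": len(valid),
            "percent_valid_scores": round(100 * len(valid) / len(tested), 1) if tested else 0.0,
        }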

  29. Student Master List Summary Performance Levels PTG 23–25 2014 Post-Test Workshop 29

  30. Who Counts? Performance Levels • All CSTs for Science; CMA for Science; CAPA for ELA, Mathematics, and Science; STS for RLA • Advanced, proficient, basic, below basic • All valid scores falling in the performance level • Far below basic • All valid scores falling in the performance level, plus CSTs for Science and STS for RLA taken with modifications (counted as far below basic in aggregate reporting) PTG 22–39 2014 Post-Test Workshop 30

  31. Who Counts? Mean Scale Scores • Average of valid scale scores • Can be used to compare results for the same content/grade across years PTG 24–25 2014 Post-Test Workshop 31

  32. Student Master List Summary: Reporting Clusters Compare to: Average percent correct range for students statewide who scored proficient on the total test (See the Post-Test Guide, Appendix A.) PTG 25 2014 Post-Test Workshop 32

  33. Subgroup Summary: CSTs, CMA, CAPA, and STS • Disability status • Based on disability status for CST, CMA, STS • CAPA: each disability type • Economic status • Based on NSLP eligibility or parent education level • Gender • English proficiency • Ethnicity • Ethnicity for Economic Status (only for CSTs, CMA, and CAPA) PTG 26–39 2014 Post-Test Workshop 33

  34. Subgroup Summary: Ethnicity for Economic Status PTG 39 2014 Post-Test Workshop 34

  35. Subgroup Summary: Ethnicity for Economic Status Example: Economically disadvantaged for each ethnicity PTG 29 2014 Post-Test Workshop 35

  36. Subgroup Summary: Ethnicity for Economic Status PTG 29 2014 Post-Test Workshop 36

  37. Internet Reports • Summaries based on same data as paper reports: CSTs for Science; CMA for Science; CAPA for ELA, Mathematics, and Science; STS for RLA • Available to the public online for school, district, county, and state • “Students with Scores” = number tested with scores • More subgroups than paper reports • Parent education • Special program participation • Access from http://caaspp.cde.ca.gov/ PTG 65–73 2014 Post-Test Workshop 37

  38. Internet Reports • In 2014, neither total enrollment nor percent tested data are included in Internet reports for any test. • Aggregate scores will be reported on the Internet in mid-September 2014 after the CALPADS data corrections window. PTG 65–73

  39. Internet Demonstration 2014 Post-Test Workshop 39

  40. Internet Reports: CST Sample PTG 65–73 2014 Post-Test Workshop 40

  41. Other Internet Reports • CST (PTG 69) • CMA (PTG 69–70) • Same as CST • CAPA (PTG 70−73) • State level: same as CST; separate Level I • County, district, school • Mean scale score • Percent proficient or above • STS (PTG 73) • Same as CST 2014 Post-Test Workshop 41

  42. Data CDs • What are they? • Lists of information from answer documents and scores of every student in the LEA • In .txt format • What are they used for? • Searching for specific data • Creating unique reports • Verifying paper reports • What else is needed? • Text editor • or Desktop application • or Student Information System 2014 Post-Test Workshop 42

  43. View of Data • As .txt, word wrap on • With text editor, word wrap off 2014 Post-Test Workshop 43

  44. Organization of Data • Two files: • Demographics, special conditions, and test scores • Accommodations, modifications, English learners, and irregularities • Data Layout = guide to location of data on files • Position • Number of characters • Whether numeric or alpha 2014 Post-Test Workshop 44
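Because each record on the data CD is a fixed-width line, the Data Layout’s position and character-count columns are all a script needs to pull fields out. A minimal sketch with hypothetical positions (the real ones come from the Data Layout):

    # Hypothetical (start position, number of characters) pairs; the actual
    # values come from the Data Layout that accompanies the data CD.
    LAYOUT = {"student_id": (1, 10), "grade": (11, 2), "scale_score": (13, 3)}

    def parse_record(line: str) -> dict:
        # Positions in the layout are 1-based, so shift to 0-based slices.
        return {name: line[start - 1 : start - 1 + length].strip()
                for name, (start, length) in LAYOUT.items()}

    sample = "0000012345" + "05" + "382"  # hypothetical record
    print(parse_record(sample))  # {'student_id': '0000012345', 'grade': '05', 'scale_score': '382'}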

  45. Data Layout Sample 2014 Post-Test Workshop 45

  46. Individual Reports • CAASPP Student Record Label • Adhesive label to affix to student’s permanent school record • CAASPP Student Master List • Alphabetical list of students and their scores • Tests listed in order within grade • CSTs • CMA • CAPA • STS • CAASPP Student Report: individual’s scores • Two 2-sided color copies for each test • For parents/guardians, school • Per California Code of Regulations, Title 5, Section 863, LEA must forward one copy to parent/guardian within 20 business days PTG 40–64 2014 Post-Test Workshop 46

  47. Student Record Label Grade 5 Sample: Student Name and Identification PTG 40–41 2014 Post-Test Workshop 47

  48. Student Record Label CST Grade 5 Example PTG 40–41 2014 Post-Test Workshop 48

  49. Student Master List CST/CMA Grade 10 Example PTG 44 2014 Post-Test Workshop 49

  50. Student Report CST Grade 5 Example PTG 46–50 2014 Post-Test Workshop 50
