
Exploring Data Use & School Performance in an Urban School District


Presentation Transcript


  1. Exploring Data Use & School Performance in an Urban School District Kyo Yamashiro, Joan L. Herman, & Kilchan Choi UCLA Graduate School of Education & Information Studies, National Center for Research on Evaluation, Standards, and Student Testing (CRESST) CRESST Conference, UCLA, September 8, 2005

  2. Context & Background • Large urban school district in the Pacific Northwest • Value-added Assessment System implemented in District • Need for more info on schools’ use of data (VA and other)

  3. Data Use & Evidence-based Practice • Data use at the heart of test-based reforms (NCLB) & continuous improvement efforts • Little evidence of effects of data use on performance • Some evidence shows limited access and capacity of schools to use data

  4. Study Components CRESST conducts multi-year, multi-faceted study of data use: • Transformation Plan Review - content analysis of school improvement plans • Interviews, surveys, and observations from site visits of case study schools • Analysis of district achievement and survey data • Observations of school presentations about progress

  5. Sampling • Latent variable, multilevel analyses used to estimate gains (student-level, longitudinal ITBS data in reading & math) • Gains based on growth from 3rd to 5th grade for 2 cohorts in each school: • 3rd graders in 1998 • 3rd graders in 2001 • Within each cohort, 3 performance subgroups (average, low, high)
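
The slide's gain estimates come from a latent variable, multilevel model fit to student-level longitudinal ITBS scores; the exact specification is not shown here. As a rough, simplified sketch only (not the authors' model), a random-slope growth model over grades 3-5 could be fit with Python's statsmodels; the data file and column names (school_id, student_id, grade, score) are hypothetical.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical long-format data: one row per student per tested grade (3-5).
    # Columns (illustrative only): school_id, student_id, grade, score
    df = pd.read_csv("itbs_long.csv")
    df["years"] = df["grade"] - 3  # center time at grade 3, so the intercept is initial status

    # Two-level growth model with a random intercept and slope for each school.
    # The fixed "years" coefficient is the district-average annual gain; the
    # school-level random slopes give each school's deviation from that gain.
    model = smf.mixedlm("score ~ years", df, groups=df["school_id"], re_formula="~years")
    result = model.fit()

    print(result.summary())
    school_effects = result.random_effects  # per-school intercept and slope deviations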

  6. Sampling (cont’d) • 13 Schools met the following criteria: • Greater than district average % of low-SES students • Starting point below district average • “Beat the Odds” Sample (7): • Higher than average gains • Relatively more consistent across: • 2 cohorts (98 & 01) • reading and math • performance subgroups (hi, avg, lo)
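
The screening criteria on slides 5-6 amount to filtering a school-level summary table on poverty, starting point, and gains. A minimal sketch of that filtering step, assuming a hypothetical table with per-school columns pct_low_ses, initial_status, and gain (the consistency checks across cohorts, subjects, and performance subgroups are omitted here):

    import pandas as pd

    # Hypothetical school-level summary, one row per school; column names are illustrative.
    schools = pd.read_csv("school_summary.csv")

    # Eligibility screen: above-average share of low-SES students and a
    # below-average starting point.
    eligible = schools[
        (schools["pct_low_ses"] > schools["pct_low_ses"].mean())
        & (schools["initial_status"] < schools["initial_status"].mean())
    ]

    # "Beat the Odds" subset: eligible schools with higher-than-average gains.
    beat_the_odds = eligible[eligible["gain"] > schools["gain"].mean()]
    print(beat_the_odds[["school_id", "gain"]])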

  7. Sample • Extremely diverse set of 13 small, elementary schools • African American student populations between 11% and 81% • Asian American student populations between 2% and 59% • White student populations between 5% and 59% • Enrollment range: 134 to 533

  8. Transformation Plan Review • TP Review Rubric (Rating of 1 to 3) • Types of evidence or indicators used • Breadth; depth; VA data; technical sophistication • Identification of goals/objectives or needs analysis • Identification of solution strategies • Specificity; based on theory/ research/data • Analysis of progress • Inclusion of stakeholders
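
As a small illustration of how ratings on a rubric like this might be recorded and summarized for one school (the dimension names follow the slide; the scoring code itself is hypothetical, not CRESST's instrument):

    # Hypothetical record of one school's Transformation Plan review (1-3 scale per dimension).
    tp_ratings = {
        "types_of_evidence": 2,
        "goals_needs_analysis": 3,
        "solution_strategies": 2,
        "analysis_of_progress": 1,
        "stakeholder_inclusion": 3,
    }

    overall = sum(tp_ratings.values()) / len(tp_ratings)
    print(f"Mean TP rating: {overall:.2f}")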

  9. Case Study Site Visits • 2-day visits to 4 case study sites: • Interviews/focus groups: • Principal • Building Leadership Team (BLT) • Teachers (primary, upper) • Teacher Survey

  10. Additional Achievement Analyses • Latent Variable Multiple Cohort (LMC) Design (with SEMs) • Estimating gains on ITBS based on data across 5 cohorts (1998 to 2002) • Gains for performance subgroups: • Average (students starting at school mean initial status) • High (students starting at 15 points above school’s average) • Low (students starting at 15 points below school’s average) • Patterns of growth differ from 2-cohort analysis
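
The three subgroups can be read as evaluating each school's fitted growth trajectory at three initial-status values: the school mean, 15 points above it, and 15 points below it. A minimal numeric illustration of that idea follows; the parameter values are made up, and the actual LMC/SEM estimates are not reproduced on the slide.

    # Hypothetical fitted values for one school (illustrative numbers only).
    school_mean_initial = 180.0     # school's average grade-3 ITBS scale score
    avg_annual_gain = 9.0           # estimated gain per year for the average subgroup
    gain_per_initial_point = -0.05  # assumed change in annual gain per point of initial status

    for label, offset in [("low", -15), ("average", 0), ("high", 15)]:
        initial = school_mean_initial + offset
        annual_gain = avg_annual_gain + gain_per_initial_point * offset
        grade5 = initial + 2 * annual_gain  # two years of growth, grade 3 to grade 5
        print(f"{label:>7}: start {initial:.0f}, grade-5 score {grade5:.1f}, gain {2 * annual_gain:.1f}")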

  11. Results: Achievement • Differences between Pre- and Post-Transformation Plan Reform • High/Avg: 4 schools - consistent growth across reading, math, & subgroups • Low: 6 schools - left some subgroups behind in math and/or reading • Very Low: 3 schools - no growth or negative gains

  12. Results: Data Use • Data Use Is Improving but Still Varied • Over 3 years, schools increased use of assessment results and other evidence • Schools increased mention of VA data • Data Review Process Is Inclusive When Capacity Exists • Principal often acts as a conduit (filtering, interpreting data) • However, many schools developed collaborative processes for data review • Transformation Planning Process May Become More Centralized (Less Inclusive) in Later Years

  13. Results: Data Use (cont’d) • Accessible and Excessive Data • Teachers use data for schoolwide reform and (to a lesser degree) instructional planning • Teachers are overwhelmed with the amount of data • More Capacity Needed • Whether schools integrate data into instructional decisions tends to be person- or climate-driven • Principals need help, too • More Diagnostic, Instructionally Sensitive Data Needed • State testing data not seen as useful, valid, timely, or interpretable • lack of continuity in tests (from grade to grade) • lack of diagnostic info (item analyses) • lack of individual growth info (pre-post) • District assessments seen as more helpful to instruction

  14. Results: Data Use & Achievement • Pre-Post Gains & Data Use Practices

  15. Results: Data Use & Achievement (cont’d) • Ratings overlap for 7 of 13 schools • For the most discrepant case (Polk): • showing high gains but low data use • school in chaos, with new leadership • For remaining 5 moderate discrepancies, no case study data

  16. Conclusions • Less use of data for instructional planning probably a function of: • type of data provided • leadership & climate • capacity • Principals and teacher leaders need more help in interpreting and using data • Data use and gains appear to have a moderate link for struggling schools; more case study info needed • Need for more research on how to use value-added (gains) in an accountability setting
