
Cecil J. Picard Center for Child Development, University of Louisiana at Lafayette: Sessions 22A & 22B






Presentation Transcript


  1. Cecil J. Picard Center for Child Development, University of Louisiana at Lafayette, Sessions 22A & 22B. Holly Howat, Oliver Winston, Greg Crandall

  2. PBS in Louisiana: 2006-2007 Evaluation Findings. Understanding the power of data-based decisions.

  3. Cecil J. Picard Center for Child Development The Cecil J. Picard Center for Child Development was established in 2005 at the University of Louisiana at Lafayette. Our mission is to improve Louisiana by focusing on its children. The Center is dedicated to providing high-quality, rigorous evaluation of programs that address learning from birth to adulthood. The Center is proud to partner with many state agencies, including the Department of Education. Our Center's work with DOE includes the evaluation of the implementation of Positive Behavior Support.

  4. Evaluation Focus
  • School-wide Evaluation Tool
  • Correlation Analysis
  • Behavioral Characteristics
  • Academic Characteristics
  • Risk and Protective Factors Characteristics
  • Qualitative Results for District-Wide Implementation

  5. Positive Behavioral Support Schools Trained, 2006-2007 School Year

  6. Positive Behavioral Support Schools Trained, 2006-2007 School Year

  7. School Wide Evaluation Tool
  [Chart: SET total and subcategory mean scores for all sampled PBS schools (subcategories: expectations defined, expectations taught, reward system, violation system, monitoring, management, district support)]
  [Chart: Comparison of 2006-07 SET total scores across cohorts by years of experience (Cohort 1, N=6 schools; Cohort 2, N=15; Cohort 3, N=11; Cohort 4, N=7)]
  Most sampled schools had strengths in monitoring and district support and had difficulty with expectations taught. The more experience a sampled school has with universal-level PBS, the better it implements it.

  8. Correlation Analysis
  [Scatter plot: SET scores vs. Benchmarks of Quality scores, with linear trend line]
  There is a statistically significant correlation between School-wide Evaluation Tool scores and Benchmarks of Quality scores.
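A relationship like this can be checked with a standard Pearson correlation test. A minimal sketch in Python, assuming paired school-level totals are available; the score lists below are illustrative placeholders, not the evaluation's actual data:

```python
# Minimal sketch: Pearson correlation between SET and Benchmarks of
# Quality totals. The scores below are illustrative placeholders.
from scipy.stats import pearsonr

set_scores = [72, 85, 90, 64, 78, 95, 81, 70]        # SET totals (0-100)
benchmark_scores = [68, 80, 88, 60, 75, 92, 79, 66]  # BoQ totals (0-100)

r, p_value = pearsonr(set_scores, benchmark_scores)
print(f"r = {r:.2f}, p = {p_value:.4f}")
# A large positive r with a small p-value (e.g. < .05) would support the
# statistically significant correlation the slide reports.
```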

  9. Behavioral Characteristics: Suspension Rates
  [Chart: Change in ISS rates from 2003-04 to 2005-06, by cohort]
  [Chart: Change in OSS rates from 2003-04 to 2005-06, by cohort]
  Sampled schools with over two years of PBS implementation had much lower increases in in-school suspension rates. A similar pattern existed for out-of-school suspension rates.

  10. Academic Characteristics: Test Scores and Retention Rates
  [Chart: Retention rates over time, 2003-04 through 2005-06, by cohort (Cohort 1, N=9 schools; Cohort 2, N=24; Cohort 3, N=17; Cohort 4, N=8; state average, N=1,475 schools)]
  A general pattern of decline in retention rates can be observed in this sample. From the data collected for 2006-2007, there was no discernible correlation between PBS implementation and academic outcomes on test scores.

  11. Risk and Protective Factors
  [Chart: PBS sample school results on CCYS protective factor, rewards for pro-social behaviors, 2004 vs. 2006 (N=34 schools), Grades 6, 8, and 10]
  [Chart: PBS sample school results on CCYS risk factor, low commitment to school, 2004 vs. 2006, Grades 6, 8, and 10]
  Protective factors increased in Grades 6 and 8, particularly rewards for pro-social behavior. Risk factors decreased in Grades 6 and 8, particularly low commitment to school.

  12. Qualitative Results for District-Wide Implementation

  13. Qualitative Results for District-Wide Implementation

  14. Data Driven Decision Making At the Picard Center for Child Development, we collect and analyze data to inform policy makers so they can make informed decisions. Schools and districts can also collect and analyze data so they can make informed decisions.

  15. Data Driven Decision Making PURPOSE: To review critical features & essential practices of data collection and data analysis for interventions

  16. School-wide Positive Behavior Support Systems
  • Classroom Setting Systems
  • Non-classroom Setting Systems
  • Individual Student Systems
  • School-wide Systems

  17. Data Collection Examples An elementary school principal found that over 45% of behavioral incident reports were coming from the playground. A high school assistant principal reported that over two-thirds of behavior incident reports came from the cafeteria.

  18. Data Collection Examples A middle school secretary reported that she was getting at least one neighborhood complaint daily about student behavior during arrival and dismissal times. At another school, over 50% of referrals occurred on buses during daily transitions.

  19. Data Collection Examples At least twice per month, police are called to settle arguments between parents and their children in parking lots. A high school nurse lamented that "too many students were asking to use her restroom" during class transitions.
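Findings like these fall out of a simple tally of referrals by location. A minimal sketch, assuming referrals are stored as records with a location field; the records and field names are illustrative, not from any real school:

```python
# Minimal sketch: share of office discipline referrals by location.
# The sample records below are illustrative placeholders.
from collections import Counter

referrals = [
    {"student": "A", "location": "playground"},
    {"student": "B", "location": "playground"},
    {"student": "C", "location": "cafeteria"},
    {"student": "D", "location": "playground"},
    {"student": "E", "location": "bus"},
]

counts = Counter(r["location"] for r in referrals)
total = sum(counts.values())
for location, n in counts.most_common():
    print(f"{location}: {n / total:.0%} of referrals")
# The output flags the playground (60% here) as the setting to target first.
```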

  20. Data Collection Questions
  • What system does the parish utilize for data collection?
  • How is the data system being used in each school setting?
  • How frequently are data collection system reports generated (bi-weekly, monthly, grading period, and/or semester reports)?

  21. Minimal School-Level Data Collection Needs
  • Minor referrals
  • Major referrals
  • Referrals by staff member
  • Referrals by infraction
  • Referrals by location
  • Referrals by time
  • Referrals by student
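Each bullet above maps to one grouping of the same referral log. A minimal sketch with pandas, assuming one row per referral; the column names and rows are an illustrative schema, not a prescribed format:

```python
# Minimal sketch: the school-level breakdowns above are each a count
# over one referral log. The schema and rows are illustrative.
import pandas as pd

log = pd.DataFrame({
    "severity":   ["minor", "major", "major", "minor", "major"],
    "staff":      ["Smith", "Jones", "Smith", "Lee", "Jones"],
    "infraction": ["disruption", "fighting", "defiance", "disruption", "fighting"],
    "location":   ["classroom", "playground", "bus", "classroom", "playground"],
    "time":       ["8:05", "12:10", "15:30", "9:40", "12:20"],
    "student":    ["S1", "S2", "S3", "S1", "S4"],
})

# One tally per dimension the slide lists.
for column in ["severity", "staff", "infraction", "location", "time", "student"]:
    print(log[column].value_counts(), end="\n\n")
```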

  22. Minimal District-Level Data Collection Needs
  • Major referrals (ODRs)
  • Referrals by incident
  • Referrals by infraction
  • Times of incidents
  • Locations of incidents (which school and where in the school)

  23. Data Analysis Questions
  • How are the data displayed (graphs, tables, etc.), and is the display effective?
  • What are the outcomes of data review?
  • Are data-based decisions reached?
  • How are data-based decisions monitored for effectiveness?
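For the display question, even a basic bar chart is often enough for a team review. A minimal sketch with matplotlib; the locations and counts are illustrative placeholders:

```python
# Minimal sketch: one simple way to display referral data for review.
# The counts below are illustrative placeholders.
import matplotlib.pyplot as plt

locations = ["playground", "cafeteria", "bus", "classroom"]
referral_counts = [45, 22, 18, 15]

plt.bar(locations, referral_counts)
plt.ylabel("Office discipline referrals")
plt.title("Referrals by location, current grading period")
plt.tight_layout()
plt.show()
```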

  24. Minimal School-Level Data Analysis Needs
  • The PBS team should be part of the analysis process
  • Data should be reviewed to determine patterns of problem behaviors
  • Decisions should be based upon the data presented
  • Decisions should include an intervention that can be successfully implemented and monitored

  25. Using Data to Make Decisions
  • What interventions are needed to respond to problem behaviors?
  • How do we implement the intervention throughout the school?
  • What is the timetable for the intervention to show a decrease in undesirable behavior?
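The timetable question reduces to a before/after comparison of referral rates. A minimal sketch, assuming weekly referral counts are tracked around the intervention start date; the numbers are illustrative:

```python
# Minimal sketch: monitoring whether an intervention reduced referrals.
# The weekly counts below are illustrative placeholders.
baseline_weeks = [24, 27, 22, 25]      # weekly referrals before the intervention
intervention_weeks = [20, 17, 15, 12]  # weekly referrals after it began

baseline_avg = sum(baseline_weeks) / len(baseline_weeks)
current_avg = sum(intervention_weeks) / len(intervention_weeks)
change = (current_avg - baseline_avg) / baseline_avg

print(f"Baseline: {baseline_avg:.1f}/week, now: {current_avg:.1f}/week "
      f"({change:.0%} change)")
# A sustained drop across review periods suggests the intervention is
# working; no change signals the team should revisit the plan.
```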

  26. Contact Information
  Dr. Holly Howat, 337-482-1552, holly.howat@louisiana.edu
  Mr. Oliver Winston, 337-365-2343, olwinston@iberia.k12.la.us
  http://ccd-web.louisiana.edu/
