Data-based Decision-making: Evaluating the Impact of School-wide Positive Behavior Support


Presentation Transcript


  1. Data-based Decision-making: Evaluating the Impact of School-wide Positive Behavior Support. George Sugai, Rob Horner, Anne Todd, and Teri Lewis-Palmer, University of Oregon. OSEP Funded Technical Assistance Center, www.pbis.org

  2. Purpose • Examine the extent to which the logic of School-wide Positive Behavior Support (PBS) fits your real experience in schools • Define the outcomes for School-wide PBS • Is School-wide PBS related to reduction in problem behavior? • Is School-wide PBS related to improved school safety? • Is School-wide PBS related to improved academic performance? • Define tools for measuring School-wide PBS outcomes • Examine a problem-solving approach for using office discipline referral (ODR) data for decision-making • Provide strategies for using data for decision-making and action planning

  3. To Improve Schools for Children • Use evidence-based practices • Always look for data on effectiveness • Never stop doing what is working • Implement the smallest change that will result in the largest improvement • Follow the cycle: Measure → Compare → Improve

  4. Designing School-Wide Systems for Student Success: a three-tier continuum applied to both academic and behavioral systems • Intensive, Individual Interventions (1-5% of students): individual students, assessment-based, high intensity (academic) and intense, durable procedures (behavioral) • Secondary Group Interventions (5-10% of students): some students (at-risk), high efficiency, rapid response • Primary Interventions (80-90% of students): all students, all settings, preventive and proactive

  5. School-wide Positive Behavior Support: four interacting elements • OUTCOMES: social competence, academic achievement, and safety • DATA: supporting decision-making • SYSTEMS: supporting staff behavior • PRACTICES: supporting student behavior

  6. Improving Decision-Making • From: Problem → Solution • To: Problem → Information → Problem-solving → Solution

  7. Problem-solving Steps • Define the problem(s) • Analyze the data • Define the outcomes and data sources for measuring the outcomes • Consider 2-3 options that might work • Evaluate each option: • Is it safe? • Is it doable? • Will it work? • Choose an option to try • Determine the timeframe to evaluate effectiveness • Evaluate effectiveness by using the data • Is it worth continuing? • Try a different option? • Re-define the problem?

  8. Key Features of Effective Data Systems • Data are accurate • Data are very easy to collect • Data are used for decision-making • Data are available when decisions need to be made • Data collectors must see the information used for decision-making

  9. Guiding Considerations • Use accessible data • Handle data as few times as possible • Build data collection into daily routines • Establish and use data collection as a conditioned positive reinforcer • Share data summaries with those who collect the data

  10. Types of Questions • Initial Assessment Questions • What type or which program do we need? • Where should we focus our efforts? • Ongoing Evaluation Questions • Is the program working? • If no, • Can it be changed? • Should we end the program? • If yes, • Do we need this program anymore? • What do we need to do to sustain success?

  11. What Data Should be Collected? • Always start with the questions you want to answer • Make data that will answer your question: • Easy, available, reliable • Balance between reliability and accessibility • Systems approach • Consider logistics • Who? When? Where? How? • Two levels • What is readily accessible? • What requires extra resources?

  12. When Should Data Decisions Be Made? • Natural cycles, meeting times • Weekly, monthly, quarterly, annually • Level of system addressed • Individual: daily, weekly • School-wide: monthly, quarterly • District/ Region • State-level

  13. Basic Evaluation Questions by School or Program • What does “it” look like now? • How would we know if we are successful? • Are we satisfied with how “it” looks? • YES: celebrate • NO: What do we want “it” to look like? What do we need to do to make “it” look like that? What can we do to keep “it” like that?

  14. Basic School-wide PBS Evaluation Questions by School/District/Region: Are our efforts making a difference? • How many schools have adopted School-wide PBS? • Are schools adopting School-wide PBS to criterion? • Are schools that are implementing School-wide PBS perceived as safe? • Are teachers delivering instructional lessons with fidelity, as planned? • Is School-wide PBS improving student outcomes?

  15. Is School-wide PBS Having a Positive Influence on School Culture? Using Office Discipline Referral Data

  16. Office Discipline Referrals and The BIG 5! • Examine office discipline referral rates and patterns • Major Problem events • Minor Problem events • Ask the BIG 5 questions: • How often are problem behavior events occurring? • Where are they happening? • What types of problem behaviors? • When are the problems occurring? • Who is contributing?
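
As an illustration of turning referral records into the BIG 5 summaries, here is a minimal Python sketch. It assumes a hypothetical CSV export with columns named date, time, location, behavior, and student_id; a real SWIS export will use a different schema.

```python
# Hedged sketch: BIG 5 summaries from a hypothetical ODR export.
# Column names (date, time, location, behavior, student_id) are
# illustrative, not an actual SWIS schema.
import pandas as pd

odrs = pd.read_csv("odrs.csv", parse_dates=["date"])
school_days = odrs["date"].nunique()  # rough proxy for days in session

print("How often:", round(len(odrs) / school_days, 2), "referrals/day")
print("What:\n", odrs["behavior"].value_counts())
print("Where:\n", odrs["location"].value_counts())
print("When:\n", odrs["time"].str[:2].value_counts().sort_index())  # by hour, assumes "HH:MM" strings
print("Who:\n", odrs["student_id"].value_counts().head(10))
```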

  17. The BIG 5: How often? What? Where? When? Who?

  18. Office Discipline Referral Cautions • Data reflect 3 factors: students, staff members, and office personnel • Data reflect overt rule violators • Data are useful when implementation is consistent • Do staff and administration agree on office-managed versus classroom-managed problem behavior?

  19. General Procedure for Dealing with Problem Behaviors • Observe the problem behavior and find a place to talk with the student(s) • Is the behavior major? • NO (minor): problem solve, determine the consequence, and follow the documented procedure; does the student have 3 minor referrals? If no, follow through with consequences and file the necessary documentation; if yes, send a referral to the office • YES (major): ensure safety, write a referral and escort the student to the office, problem solve, determine the consequence, follow the documented procedure, and file the necessary documentation • Follow up with the student within a week
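
To make the branch structure of this flowchart explicit, here is a small Python sketch; the function and argument names are hypothetical, since the actual procedure is carried out by staff, not software.

```python
# Hypothetical encoding of the flowchart's branches; step wording
# follows the slide, names are illustrative.
def route_problem_behavior(is_major: bool, prior_minor_referrals: int) -> list[str]:
    steps = ["observe problem behavior", "find a place to talk with student(s)"]
    if is_major:
        steps += [
            "ensure safety",
            "write referral and escort student to office",
            "problem solve", "determine consequence",
            "follow documented procedure", "file necessary documentation",
        ]
    else:
        steps += ["problem solve", "determine consequence", "follow documented procedure"]
        if prior_minor_referrals >= 3:
            steps.append("send referral to office")
        steps += ["follow through with consequences", "file necessary documentation"]
    steps.append("follow up with student within a week")
    return steps

print(route_problem_behavior(is_major=False, prior_minor_referrals=3))
```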

  20. SWIS™ Compatibility Checklist: Procedure for Documenting Office Discipline Referrals • School: ___________ • Date: ___________ • Next review date: ___________ • Redesign your form until the answers to all questions are “Yes.” Readiness requirements 4 and 5 are complete when you have all “Yes” responses.

  21. Tables versus Graphs

  22.
  Year   Month   Number of Days   Number of Referrals   Average Referrals Per Day
  2001   Aug     0                0                     0.00
  2001   Sep     19               5                     0.26
  2001   Oct     21               18                    0.86
  2001   Nov     18               17                    0.94
  2001   Dec     14               21                    1.50
  2002   Jan     22               18                    0.82
  2002   Feb     17               15                    0.88
  2002   Mar     19               26                    1.37
  2002   Apr     21               14                    0.67
  2002   May     18               13                    0.72
  2002   Jun     11               2                     0.18
  2002   Jul     0                0                     0.00
  Totals:        180              149                   0.83
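
The rate column divides each month's referral count by its school days. A short Python sketch reproducing that column (and the 0.83 overall rate) follows.

```python
# Reproduces the table's "Average Referrals Per Day" column:
# rate = referrals / school days (months with 0 days report 0.00).
months = [
    ("2001 Aug", 0, 0), ("2001 Sep", 19, 5), ("2001 Oct", 21, 18),
    ("2001 Nov", 18, 17), ("2001 Dec", 14, 21), ("2002 Jan", 22, 18),
    ("2002 Feb", 17, 15), ("2002 Mar", 19, 26), ("2002 Apr", 21, 14),
    ("2002 May", 18, 13), ("2002 Jun", 11, 2), ("2002 Jul", 0, 0),
]

for label, days, referrals in months:
    rate = referrals / days if days else 0.0
    print(f"{label}: {rate:.2f}")

total_days = sum(d for _, d, _ in months)       # 180
total_referrals = sum(r for _, _, r in months)  # 149
print(f"Overall: {total_referrals / total_days:.2f}")  # 0.83
```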

  23. Number of ODRs per Day and Month

  24. Total versus Rate

  25. Total Number of ODRs per Month

  26. Number of ODRs per Day and Month

  27. Priorities and Rationale • Graphs (over tables) • Rate (over total counts)

  28. SWIS summary 2008-2009 (Majors Only): 3,410 schools; 1,737,432 students; 1,500,770 ODRs. Source: Newton, J. S., Todd, A. W., Algozzine, K., Horner, R. H., & Algozzine, B. (2009). The Team Initiated Problem Solving (TIPS) training manual. Unpublished training manual, Educational and Community Supports, University of Oregon.

  29. SWIS summary 2009-2010 (Majors Only): 4,019 schools; 2,063,408 students; 1,622,229 ODRs

  30. Interpreting Office Referral Data: Is There a Problem? • Absolute level (relative to school size): middle and high schools, more than 1 ODR per day per 100 students; elementary schools, more than 1 ODR per day per 300 students • Trends: peaks before breaks? a gradually increasing trend across the year? • Compare levels to last year: improvement?
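
A minimal sketch of the absolute-level check, assuming the rules of thumb above (more than 1 ODR per day per 100 students for middle/high schools, per 300 for elementary); the function names are illustrative.

```python
# Hedged sketch of the slide's absolute-level rule of thumb.
PER_STUDENTS = {"elementary": 300, "middle": 100, "high": 100}

def odr_problem_threshold(enrollment: int, level: str) -> float:
    """ODRs per day above which the data suggest a school-wide problem."""
    return enrollment / PER_STUDENTS[level]

def flags_problem(odrs_per_day: float, enrollment: int, level: str) -> bool:
    return odrs_per_day > odr_problem_threshold(enrollment, level)

print(flags_problem(8.0, 625, "middle"))      # 8.0 > 6.25 -> True
print(flags_problem(0.4, 150, "elementary"))  # 0.4 < 0.50 -> False
```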

  31. Application Activity: Absolute Level. Is there a problem in a middle school of 625 students? Compare with the national average: 625 / 100 = 6.25; 6.25 × 0.92 (national middle school average per 100 students per day) = 5.75 office discipline referrals per school day.

  32. Compare with National Average: elementary school with 150 students. 150 / 100 = 1.50; 1.50 × 0.35 (national elementary average per 100 students per day) = 0.53 office discipline referrals per school day.

  33. Compare with National Average: high school of 1,800 students. 1800 / 100 = 18; 18 × 1.06 (national high school average per 100 students per day) = 19.08 office discipline referrals per school day.
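
The three worked examples share one formula: expected ODRs per day = (enrollment / 100) × the national average rate per 100 students per day. A Python sketch using the per-100 rates implied by these slides (0.35 elementary, 0.92 middle, 1.06 high):

```python
# National-average comparison from slides 31-33; the per-100 rates
# are taken from the worked examples on those slides.
NATIONAL_RATE_PER_100 = {"elementary": 0.35, "middle": 0.92, "high": 1.06}

def expected_odrs_per_day(enrollment: int, level: str) -> float:
    return enrollment / 100 * NATIONAL_RATE_PER_100[level]

print(expected_odrs_per_day(625, "middle"))      # 5.75
print(expected_odrs_per_day(150, "elementary"))  # ~0.53
print(expected_odrs_per_day(1800, "high"))       # ~19.08
```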

  34. Middle School of 700 students

  35. Middle School, N = 495

  36. Is There a Problem? #2: Absolute - Trend - Compare (Middle School, N = 495)

  37. Middle School, N = 495

  38. Middle School, N = 495

  39. Trevor Test Middle School: 565 students, Grades 6, 7, and 8

  40. [BIG 5 graphs for Trevor Test Middle School: referrals by problem behavior (language, defiance, disrespect, harassment, skipping), by location (cafeteria, classroom, commons, hallway), and by time of day (peak near 12:00)]

  41. Langley Elementary School: 478 students, Kindergarten through Grade 5

  42. What Does a Reduction of 850 Office Discipline Referrals and 25 Suspensions Mean? (Kennedy Middle School) • Savings in administrative time (ODR = 15 minutes/event; suspension = 45 minutes/event): 850 × 15 + 25 × 45 = 13,875 minutes = 231 hours = 29 eight-hour days • Savings in student instructional time (ODR = 45 minutes/event; suspension = 216 minutes/event): 850 × 45 + 25 × 216 = 43,650 minutes = 728 hours = 121 six-hour school days
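
The arithmetic behind these totals, as a short Python sketch (per-event times taken from the slide):

```python
# Time saved at Kennedy Middle School, per the slide's per-event figures.
ODR_ADMIN_MIN, SUSPENSION_ADMIN_MIN = 15, 45        # administrator minutes/event
ODR_STUDENT_MIN, SUSPENSION_STUDENT_MIN = 45, 216   # student minutes/event
odrs_saved, suspensions_saved = 850, 25

admin_minutes = odrs_saved * ODR_ADMIN_MIN + suspensions_saved * SUSPENSION_ADMIN_MIN
student_minutes = odrs_saved * ODR_STUDENT_MIN + suspensions_saved * SUSPENSION_STUDENT_MIN

print(admin_minutes, admin_minutes / 60, admin_minutes / 60 / 8)
# 13875 minutes, 231.25 hours, ~29 eight-hour administrator days
print(student_minutes, student_minutes / 60, student_minutes / 60 / 6)
# 43650 minutes, 727.5 hours, ~121 six-hour school days
```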

  43. Is Implementation Related to Reduction in Problem Behavior? (767 students)
