
Presentation Transcript


  1. Implementing a Three-Tiered State Evaluation Structure • Bob Putnam, The May Institute • Karen Childs, University of South Florida • 2009 National PBIS Leadership Forum

  2. Goals of Session • Present the purpose of a state-wide evaluation • Provide exemplars of the core components of three-tier state-wide evaluations • Describe an ongoing state-wide evaluation system

  3. Purpose of Statewide SWPBS Evaluation • Evaluation is the process of gathering information for decision-making. • SWPBS decisions focus on levels of adoption, adaptation, and replication. • Efficiency • Effectiveness • Sustainability • Widespread adoptability

  4. Core Indicators • These indicators include considerations about the context in which implementation of SWPBS is to occur, the inputs available to guide and assist with implementation, the fidelity with which core elements of SWPBS are put in place, and the impact of those core elements on the social and academic behavior of students. • They also assess the breadth of adoption and the sustainability of implementation. (Algozzine, Horner, Sugai, Barrett, Eber, Kincaid & Lewis, 2009)
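
The four indicator areas above (context, input, fidelity, impact) organize the evaluation questions that the following slides walk through. As an illustration only, a state evaluation plan could group its questions in a simple structure like the Python sketch below; the keys and question wording are assumptions, not a schema defined by the presenters.

    # Illustrative grouping of the core SWPBS evaluation indicators.
    # Keys and question wording are assumptions for this sketch, not a published schema.
    core_indicators = {
        "context": [
            "What are/were the goals and objectives for SWPBS implementation?",
            "Who provided and who received support during implementation?",
        ],
        "input": [
            "What professional development was part of implementation support?",
            "Who participated, and what was its perceived value?",
        ],
        "fidelity": [
            "To what extent was SWPBS implemented as designed and with fidelity?",
        ],
        "impact": [
            "To what extent is SWPBS associated with changes in student outcomes?",
        ],
    }

    for area, questions in core_indicators.items():
        print(area.title())
        for question in questions:
            print(f"  - {question}")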

  5. Context • What are/were the goals and objectives for SWPBS implementation? • Who provided support for SWPBS implementation? • Who received support during SWPBS implementation?

  6. Context Who received support during SWPBS implementation?

  7. Input • What professional development was part of SWPBS implementation support? • Who participated in the professional development? • What was the perceived value of the professional development?

  8. Participating Schools Who Is Receiving Training and Support? • 2000 Model Demonstration Schools (5) • 2004 Schools (21) • 2005 Schools (31) • 2006 Schools (50) • 2007 Schools (165) • 2008 Schools (95)

  9. Training/Technical Assistance Who Is Receiving Training and Support? More than three-fourths of the counties in the state have at least one school participating in the North Carolina Positive Behavior Support Initiative.

  10. Tier 2 & Tier 3 Who Is Receiving Training and Support?

  11. Tier 2 & Tier 3 Who Is Receiving Training and Support?

  12. Tier 3 Who Is Receiving Training and Support?

  13. Fidelity • To what extent was SWPBS implemented as designed? • To what extent was SWPBS implemented with fidelity?

  14. School-Wide Evaluation Tool To What Extent Have Practices Changed? • 35 coaches trained as SET assessors • 15 contractual SET assessors • 97 SETs completed in 2004 • 154 SETs completed in 2005 • 157 SETs completed in 2006 • 104 schools have at least two SET scores • An 80% total score is considered Maintenance Phase (IPI) • All regions met the 80% criterion across schools • 69% increase after one year of implementation
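
The 80% total-score rule above gives evaluators a simple decision criterion when reviewing SET results across schools and regions. The Python sketch below shows that check under stated assumptions: the school names, scores, and data layout are illustrative placeholders, not data from the presentation.

    # Minimal sketch: flag schools whose SET total score meets the 80% criterion.
    # School names and scores are illustrative placeholders, not data from the slides.
    SET_CRITERION = 80.0  # total SET score (%) treated as Maintenance Phase

    set_totals = {
        "School A": {"2005": 72.0, "2006": 88.5},
        "School B": {"2005": 81.0, "2006": 93.0},
    }

    def meets_criterion(totals_by_year, year):
        """True if the school's SET total for the given year is at or above 80%."""
        score = totals_by_year.get(year)
        return score is not None and score >= SET_CRITERION

    for school, totals in set_totals.items():
        status = "meets" if meets_criterion(totals, "2006") else "is below"
        print(f"{school}: {totals['2006']}% total SET score ({status} the 80% criterion)")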

  15. SET Scores by Region To What Extent Have Practices Changed?

  16. PBIS Maryland To What Extent Have Practices Changed?

  17. To What Extent Have Practices Changed? www.pbisillinois.org

  18. To What Extent Have Practices Changed? Tier 3 www.pbisillinois.org

  19. To What Extent Have Practices Changed? Tier 3 www.pbisillinois.org

  20. Impact • To what extent is SWPBS associated with changes in student outcomes? • To what extent is SWPBS associated with changes in other areas of schooling?

  21. To what extent is SWPBS associated with changes in student behavior? www.pbisillinois.org

  22. To what extent is SWPBS associated with changes in student behavior? www.pbisillinois.org

  23. To what extent is SWPBS associated with changes in student outcomes? Tier 2 & 3 www.pbisillinois.org

  24. Percent of Students at DIBELS Benchmark (Spring) and Major Discipline Referrals per 100 Students To What Extent Have Behavior and Academic Performance Changed?

  25. MiBLSi Schools and Reading MEAP: Average Total Office Discipline Referrals per 100 Students per Day, 2004-2005 To What Extent has Academic Performance Changed?
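
The referral data behind these charts is normalized as office discipline referrals (ODRs) per 100 students per school day, as the slide title states, so schools of different sizes and reporting periods can be compared. A minimal sketch of that arithmetic in Python follows; all numbers are made-up examples, not figures from the presentation.

    # Sketch of the normalization named in the slide title: major ODRs per 100
    # students per school day. All numbers are illustrative placeholders.
    def odr_per_100_per_day(total_referrals, enrollment, school_days):
        """Major office discipline referrals per 100 students per school day."""
        return total_referrals / (enrollment / 100) / school_days

    # Example: 450 major referrals in a 600-student school over a 180-day year
    rate = odr_per_100_per_day(total_referrals=450, enrollment=600, school_days=180)
    print(f"{rate:.2f} major ODRs per 100 students per day")  # -> 0.42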

  26. To What Extent has Academic Performance Changed?

  27. North Carolina Positive Behavior Support Initiative To What Extent has Academic Performance Changed? [A]chievement causes [B]ehavior? [B]ehavior causes [A]chievement? [C]ontext causes [A]chievement and [B]ehavior?

  28. Replication, Sustainability, and Improvement • To what extent did SWPBS implementation improve capacity for the state/region/district to replicate, sustain, and improve behavior and other outcomes? • To what extent did SWPBS implementation change educational/behavioral policy? • To what extent did SWPBS implementation affect systemic educational practice?

  29. What is the breadth of adoption, and sustainability of implementation? www.pbisillinois.org

  30. What is the breadth of adoption, and sustainability of implementation? www.marylandpbis.org

  31. Conclusions • An increasing number of states are reporting on the implementation of SWPBS • Information on the implementation and outcomes of Tier 1 interventions is more available than for Tier 2 and Tier 3 interventions • Research is needed on the types and presentations of data in state evaluations that will have the best outcomes for funding, visibility, political support, and policy.

  32. For More Information • Bob Putnam, Senior Vice President of School Consultation, May Institute • Phone: (781) 437-1207 • Email: bputnam@mayinstitute.org • Website: http://mayinstitute.org • Karen Elfner Childs, Research & Evaluation Coordinator, Florida’s PBS Project • Phone: (813) 974-7358 • Email: childs@fmhi.usf.edu • Website: http://flpbs.fmhi.usf.edu
