
Developing and Implementing State-Level Evaluation Systems
Bob Algozzine, Heather Reynolds, and Steve Goodman

Presentation Transcript


  1. Developing and Implementing State-Level Evaluation Systems
     Bob Algozzine, Heather Reynolds, and Steve Goodman
     National PBIS Leadership Forum, Hyatt Regency O’Hare, Rosemont, Illinois, October 27, 2011

  2. Objectives
     • Describe core features of an effective evaluation system
       • Evidence to document a program, initiative, or intervention
       • Evidence to improve and support continuation
       • Evidence to direct policies and practices
     • Share ongoing and exemplary state-level evaluations
     • Provide an opportunity for question-and-answer collaboration

  3. Program Evaluation Simplified
     A continuous cycle: Design/Plan [Redesign/Re-Plan] → Implement Intentionally and Document Fidelity → Assess Continuously and Document Intended and Unintended Outcomes.

  4. Core Features of an Effective Evaluation System
     An effective evaluation has a clearly defined purpose that tells a story that helps to…
     • document the program, initiative, or intervention
       • context, input, fidelity, and impact evidence
     • improve and support continuation
       • stages of innovation and continuous improvement evidence
     • direct policies and practices
       • efficient and effective reporting and dissemination of evidence

  5. Document the Program, Initiative, or Intervention
     A simple plan? Organize evidence around what you need to know and the questions you can answer.
     • Why (i.e., under what circumstances, conditions, or events) was the program implemented? [Statement of the problem and data on which to build the evaluation…]
     • What program was implemented? [Program description including key features…]
       • What other programs were considered?
       • Why was the program selected over other programs?
     • How was the program implemented? [Pilot sites, administrative dictum, widespread panic, quiet riot, volunteers…]
     • Was the program implemented with fidelity sufficient to produce change?
     • What short-, intermediate-, and long-term changes resulted from implementing the program?
       • Improvements in school and classroom ecology?
       • Improvements in academic and social behavior?
     • Did implementation improve the capacity of the state/district to continue the program?
     An important reminder: What you need to know and the questions you can answer will depend on where you are in the implementation process.
     Evidence by implementation stage: Context (Exploration), Input (Installation), Fidelity (Implementation), Impact (Innovation and Continuation).
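     Editor's aside, not part of the original slides: the evidence-to-stage mapping above can be kept in a small, machine-readable structure so a report pulls only the questions that fit the current implementation stage. A minimal Python sketch follows; the structure, names, and question wording are illustrative assumptions, not a prescribed format.

         # Illustrative only: organize the slide's evaluation questions by
         # evidence type and implementation stage, so a report can pull just
         # the questions that match where implementation currently is.
         EVALUATION_PLAN = {
             "context":  {"stages": ["exploration"],
                          "questions": ["Why was the program implemented?",
                                        "What other programs were considered?"]},
             "input":    {"stages": ["installation"],
                          "questions": ["What program was implemented?",
                                        "How was the program implemented?"]},
             "fidelity": {"stages": ["implementation"],
                          "questions": ["Was the program implemented with fidelity "
                                        "sufficient to produce change?"]},
             "impact":   {"stages": ["innovation", "continuation"],
                          "questions": ["What short-, intermediate-, and long-term "
                                        "changes resulted from the program?",
                                        "Did implementation improve capacity to "
                                        "continue the program?"]},
         }

         def questions_for_stage(stage):
             """Return the evaluation questions that fit the given implementation stage."""
             return [q for entry in EVALUATION_PLAN.values()
                     if stage in entry["stages"]
                     for q in entry["questions"]]

         print(questions_for_stage("installation"))

     Run as-is, the example prints the two input-evidence questions tied to the installation stage.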

  6. Documenting Program Context and Input
     What to collect and report?
     • Information about the need and the intervention
     • Information about national, state, and local education agency leadership personnel and program providers
     • Information about program participants
     • Information about the program
       • Focus, critical features, and content
       • Type and amount of support
       • Perceptions and other indicators of appropriateness
       • Expectations for change

  7. Documenting Program Fidelity
     What to collect and report?
     • Fidelity forms available on www.pbisassessment.org
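     Editor's aside, not from the slides: many SWPBIS fidelity checklists score items on a 0/1/2 scale and summarize implementation as the percent of possible points in place. The sketch below assumes that kind of scale; the actual scoring rules of the forms on www.pbisassessment.org may differ, so treat it as illustrative only.

         # Assumption: checklist items scored 0 (not in place), 1 (partially),
         # 2 (fully in place); the summary is the percent of possible points earned.
         def fidelity_percent(item_scores, max_per_item=2):
             """Percent of possible fidelity points earned across all checklist items."""
             if not item_scores:
                 raise ValueError("no checklist items scored")
             return 100.0 * sum(item_scores) / (max_per_item * len(item_scores))

         # Example: one school's scores on a hypothetical 10-item checklist.
         print(f"{fidelity_percent([2, 2, 1, 2, 0, 2, 1, 2, 2, 1]):.1f}% implemented")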

  8. Documenting Program Impact
     What to collect and report?
     • Social Behavior Benefits
       • Fidelity indicators
       • School and classroom climate
       • Attitudes
       • Attendance
       • Office Discipline Referrals (ODRs)
       • Individual student points/behavior records
       • Proportion of time in typical educational contexts
       • Referrals to special education
     • Academic Behavior Benefits
       • Fidelity indicators
       • Instructional climate
       • Attitudes
       • Universal screening and progress monitoring (e.g., vocabulary, oral reading fluency)
       • Standardized test scores
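     Editor's aside, not from the slides: one common way to make ODR counts comparable across schools of different sizes and reporting periods is to express them as referrals per 100 students per school day. A minimal sketch under that assumption; the numbers are made up for illustration.

         def odr_rate(total_referrals, enrollment, school_days):
             """Office discipline referrals per 100 students per school day."""
             return (total_referrals / school_days) / enrollment * 100

         # Example: 240 referrals over 60 school days in a school of 450 students.
         print(f"{odr_rate(240, 450, 60):.2f} ODRs per 100 students per day")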

  9. Core Features of an Effective Evaluation System
     An effective evaluation has a clearly defined purpose that tells a story.
     • Evidence to Document the Program, Initiative, or Intervention
       • context, input, fidelity, and impact
     • Evidence to Improve and Support Continuation
       • stages of innovation/continuous improvement cycles
     • Evidence to Direct Policies and Practices
       • efficient and effective annual reports

  10. Evidence to Improve and Support Continuation
     What to collect and report?
     • Stages of Implementation (2 – 4 years)
       • Exploration
       • Installation
       • Initial Implementation
       • Full Implementation
       • Innovation
       • Sustainability
     • Continuous Improvement Process: Design/Plan [Redesign/Re-Plan] → Implement Intentionally and Document Fidelity → Assess Continuously and Document Intended and Unintended Outcomes
     (Fixsen, Naoom, Blase, Friedman, & Wallace, 2005)

  11. Core Features of an Effective Evaluation System
     An effective evaluation has a clearly defined purpose that tells a story.
     • Evidence to Document the Program, Initiative, or Intervention
       • context, input, fidelity, and impact
     • Evidence to Improve and Support Continuation
       • stages of innovation/continuous improvement cycles
     • Evidence to Direct Policies and Practices
       • efficient and effective annual reports
       • external support
         • www.pbisassessment.org
         • www.pbseval.org

  12. Evidence to Direct, Support, and Revise Policy Decisions
     Evaluation Blueprint: The OSEP Technical Assistance Center on Positive Behavioral Interventions and Supports has developed a document for individuals who are implementing School-wide Positive Behavior Intervention and Support (SWPBIS) in districts, regions, or states. The purpose of the “blueprint” is to provide a formal structure for evaluating whether implementation efforts are (a) occurring as planned, (b) resulting in change in schools, and (c) producing improvement in student outcomes (Algozzine et al., 2011).

  13. Evidence to Direct, Support, and Revise Policy Decisions
     • North Carolina Annual Performance Report: Annual reports highlight the development and continued growth of PBIS in North Carolina as well as indicators of fidelity of implementation and the impact PBIS is having on participating schools across the state. In addition, the reports include information about plans for sustainability through training, coaching, and partnerships with other initiatives, in particular Responsiveness to Instruction (RtI).
     • Michigan’s Integrated Behavior and Learning Support Initiative (MiBLSi) News: http://miblsi.cenmi.org/News.aspx
     • Illinois Evaluation Reports: http://pbisillinois.org/
     • Florida’s Positive Behavior Support Project: Childs, K. E., Kincaid, D., & George, H. P. (2010). A model for statewide evaluation of a universal positive behavior support initiative. Journal of Positive Behavior Interventions, 12, 198–210.

  14. Evidence from Exemplary State-Level Evaluations
     • North Carolina: North Carolina has been implementing a statewide Positive Behavior Intervention and Support (PBIS) initiative for 10 years. Heather Reynolds is the State PBIS Consultant.
     • Michigan: Michigan’s Integrated Behavior and Learning Support Initiative (MiBLSi) works with schools to develop a multi-tiered system of support for both reading and behavior; PBIS is a key part of the Initiative’s process for creating and sustaining safe and effective schools. Steve Goodman is Director of the Michigan Integrated Behavior and Learning Support Initiative and PBIS Coordinator.

  15. Remaining Agenda
     • Presentation Questions and Answers
     • Bibliography and Selected Resources
     • Evaluation Action Plan

  16. Bibliography and Selected Resources
     Abma, T. A., & Stake, R. E. (2001). Stake’s responsive evaluation: Core ideas and evolution. In J. C. Greene & T. A. Abma (Eds.), New directions for evaluation: No. 92. Responsive evaluation (pp. 7-21). San Francisco: Jossey-Bass.
     Algozzine, B., Horner, R. H., Sugai, G., Barrett, S., Dickey, S. R., Eber, L., Kincaid, D., et al. (2011). Evaluation blueprint for school-wide positive behavior support. Eugene, OR: National Technical Assistance Center on Positive Behavior Interventions and Support. Retrieved from www.pbis.org
     Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature (FMHI Publication No. 231). Tampa, FL: Florida Mental Health Institute, The National Implementation Research Network.
     Ruhe, V., & Zumbo, B. D. (2009). Evaluation in distance education and e-learning. New York: Guilford.
     Scriven, M., & Coryn, C. L. S. (2008). The logic of research evaluation. In C. L. S. Coryn & M. Scriven (Eds.), Reforming the evaluation of research (New directions for evaluation: No. 118, pp. 89-106). San Francisco, CA: Jossey-Bass.
     Stufflebeam, D. L. (2001). Evaluation models. New directions for evaluation: No. 89 (pp. 7-98). San Francisco: Jossey-Bass.
     Stufflebeam, D. L., & Shinkfield, A. J. (2007). Evaluation theory, models, and applications. San Francisco: Jossey-Bass/Pfeiffer.
     The Evaluation Center. (2011). Evaluation checklists. Kalamazoo, MI: Western Michigan University. Retrieved from http://www.wmich.edu/evalctr/checklists/
     The Joint Committee on Standards for Educational Evaluation. (1994). The program evaluation standards. Thousand Oaks, CA: Sage Publications.

  17. Evaluation Action Plan
