
Developing, Measuring, and Improving Program Fidelity: Achieving positive outcomes through high-fidelity implementation. SPDG National Conference, Washington, DC, March 5, 2013. Allison Metz, PhD, Associate Director, NIRN, Frank Porter Graham Child Development Institute, University of North Carolina.


Presentation Transcript


  1. Developing, Measuring, and Improving Program Fidelity: Achieving positive outcomes through high-fidelity implementation SPDG National Conference, Washington, DC, March 5, 2013 • Allison Metz, PhD, Associate Director, NIRN • Frank Porter Graham Child Development Institute • University of North Carolina

  2. Goals for Today • Define fidelity and its link to outcomes • Identify strategies for developing fidelity measures • Discuss fidelity within a stage-based context • Describe the use of Implementation Drivers to promote high fidelity • Provide case example

  3. “Program Fidelity” “The degree to which the program or practice is implemented ‘as intended’ by the program developers and researchers.” “Fidelity measures detect the presence and strength of an intervention in practice.”

  4. Definition of Fidelity Context, Compliance, and Competence • Three components • Context: Structural aspects that encompass the framework for service delivery • Compliance: The extent to which the practitioner uses the core program components • Competence: Process aspects that encompass the level of skill shown by the practitioner and the “way in which the service is delivered”

  5. Fidelity Purpose and Importance • Interpret outcomes – is this an implementation challenge or intervention challenge? • Detect variations in implementation • Replicate consistently • Ensure compliance and competence • Develop and refine interventions in the context of practice • Identify “active ingredients” of program

  6. Formula for Success Effective Interventions (a well-operationalized "What") × Effective Implementation Methods × Enabling Contexts = Socially Significant Outcomes

  7. Usable Intervention Criteria • Clear description of the program • Clear essential functions that define the program • Operational definitions of essential functions (practice profiles specifying what practitioners do and say) • Practical performance assessment

  8. Developing Fidelity Measures Practice Profiles • Practice Profiles Operationalize the Work • Describe the essential functions that allow a model to be teachable, learnable, and doable in typical human service settings • Promote consistency across practitioners at the level of actual service delivery • Consist of measurable and/or observable, behaviorally based indicators for each essential function Gene Hall and Shirley Hord (2011), Implementing Change: Patterns, Principles, and Potholes (3rd ed.)

  9. Practice Profiles Measuring Competency • For each Essential Function: • Identifies "expected" activities • Identifies "developmental" variation(s) in practice • Identifies "unacceptable," incompatible, or undesirable practices
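
One way to make the three-level rubric concrete is to hold each essential function and its indicators in a small data structure. A minimal sketch in Python; the "Engagement" function and all indicator wording below are hypothetical illustrations, not drawn from any particular program:

```python
from dataclasses import dataclass, field
from enum import Enum

class Level(Enum):
    """The practice profile's three-level rubric."""
    EXPECTED = "expected"            # proficient practice
    DEVELOPMENTAL = "developmental"  # acceptable variation while learning
    UNACCEPTABLE = "unacceptable"    # incompatible or undesirable practice

@dataclass
class EssentialFunction:
    """One essential function with behaviorally based indicators per level."""
    name: str
    indicators: dict[Level, list[str]] = field(default_factory=dict)

# Hypothetical "Engagement" function (wording is illustrative only).
engagement = EssentialFunction(
    name="Engagement",
    indicators={
        Level.EXPECTED: ["Elicits and records family goals in the first visit"],
        Level.DEVELOPMENTAL: ["Asks about family goals but does not record them"],
        Level.UNACCEPTABLE: ["Sets goals without family input"],
    },
)
```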

  10. Practice Profiles Sample Template

  11. Implementation Science Case Example: Differential Response Functions • Engagement • Assessment • Partnership • Goal Planning • Implementation • Communication • Evaluation • Advocacy • Culturally Competent Service Delivery

  12. Practice Profiles Case Example Differential Response

  13. Practice Profiles Multiple Purposes for Implementation If you know what "it" is, then: • You know the practice to be implemented • You can improve "it" • Increased ability to effectively develop the Drivers • Increased ability to replicate "it" • More likely to deliver high-quality services • Outcomes can be accurately interpreted • Common language and deeper understanding

  14. Stages of Implementation When are we ready to assess fidelity? Practice profiles are a part of stage-based work. When we are engaged in program development work, practice profiles operationalize the intervention so that installation activities can be effective and fidelity can be measured during initial implementation.

  15. Practice Profiles When are they developed? In order to create the necessary conditions for creating practitioner competence and confidence, and for changing organizations and systems, we need to define our program and practice adequately so that we can install the Drivers necessary to promote consistent implementation of the specific activities associated with the essential functions of the new service(s).

  16. Practice Profiles Fidelity Measures • Start with the Expected/Proficient column • Develop an indicator for each Expected/Proficient Activity • Identify “evidence” that this activity has taken place • Identify “evidence” that this activity has taken place with high quality • Identify potential data source(s)
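
Once each Expected/Proficient activity has an indicator and an identified data source, a basic fidelity score can be computed as the proportion of expected indicators for which evidence was found. A minimal sketch, assuming binary observed/not-observed ratings and an illustrative, locally chosen 80% proficiency cut-off:

```python
def fidelity_score(ratings: dict[str, bool]) -> float:
    """Proportion of expected indicators with evidence in the data source(s)."""
    if not ratings:
        raise ValueError("no indicators rated")
    return sum(ratings.values()) / len(ratings)

# Illustrative ratings for one assessment; indicator names are hypothetical.
ratings = {
    "goals_elicited_first_visit": True,
    "goals_documented_in_plan": True,
    "plan_reviewed_with_family": False,
}
score = fidelity_score(ratings)
print(f"Fidelity: {score:.0%}")         # Fidelity: 67%
print("Meets cut-off:", score >= 0.80)  # 0.80 is an assumed threshold
```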

  17. Practice Profiles Performance Assessment

  18. Practice Profiles Establishing Fidelity

  19. Practice Profiles Establishing Fidelity

  20. Fidelity Data Collection: 5 Steps (new or established criteria) • Assure fidelity assessors are available, understand the program or innovation, and are well versed in the education setting • Develop a schedule for conducting fidelity assessments • Assure adequate preparation for teachers/practitioners being assessed • Report results of the fidelity assessment promptly • Enter results into the decision-support data system
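
Step 5 implies a structured record for each assessment so results can be reported promptly and stored. A minimal sketch in which a CSV file stands in for the decision-support data system; the field names, identifiers, and file name are illustrative assumptions:

```python
import csv
from dataclasses import asdict, dataclass
from datetime import date

@dataclass
class FidelityAssessment:
    """One completed fidelity assessment, ready to report and store."""
    practitioner_id: str
    assessor_id: str
    assessment_date: date
    score: float  # e.g., proportion of expected indicators observed

def enter_into_data_system(record: FidelityAssessment, path: str) -> None:
    """Append the result to a CSV standing in for the data system."""
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(asdict(record)))
        if f.tell() == 0:  # new file: write the header row first
            writer.writeheader()
        writer.writerow(asdict(record))

enter_into_data_system(
    FidelityAssessment("P-014", "A-03", date(2013, 3, 5), 0.67),
    "fidelity_assessments.csv",
)
```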

  21. Promote High Fidelity Implementation Supports • Fidelity is an implementation outcome. How can we create an implementation infrastructure that supports high-fidelity implementation?

  22. [Implementation Drivers diagram] Goal: improved OUTCOMES for children and families. Performance Assessment (fidelity) sits at the apex; Competency Drivers (Selection, Training, Coaching) and Organization Drivers (Decision Support Data System, Facilitative Administration, Systems Intervention) form the sides, each integrated & compensatory; Leadership Drivers (Technical, Adaptive) form the base.

  23. Practice Profiles Building the Infrastructure

  24. Practice Profiles Building the Infrastructure

  25. Practice Profiles Function X Driver Example

  26. Practice Profiles Improvement Cycles Plan (decide what to do) → Do (do it) → Study (look at the results) → Act (make adjustments), cycling over and over again until the intended benefits are realized. Shewhart (1924); Deming & Juran (1948); Six Sigma (1990)
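
The cycle's logic can be written as a loop: plan, do, study, and act until the intended benefits are realized. A minimal sketch; all callables are placeholders an implementation team would supply, and max_cycles is an assumed safeguard:

```python
def pdsa(plan, do, study, act, benefits_realized, max_cycles=10):
    """Plan-Do-Study-Act, repeated until intended benefits are realized."""
    results = None
    for _ in range(max_cycles):
        decision = plan()           # Plan: decide what to do
        do(decision)                # Do: do it
        results = study()           # Study: look at the results
        if benefits_realized(results):
            break
        act(results)                # Act: make adjustments, then cycle again
    return results

# Tiny illustrative run (everything below is a stand-in):
history = []
final = pdsa(
    plan=lambda: "coach weekly",
    do=lambda decision: history.append(decision),
    study=lambda: len(history),               # stand-in "results"
    act=lambda results: None,
    benefits_realized=lambda results: results >= 3,
)
print(final)  # 3, after three cycles
```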

  27. Practice Profiles Improvement Cycles • Apply PDSA cycles to: the Competency Drivers, the Organization Drivers, Leadership, the essential functions of the profile, and data collection activities • Practice Profiles and accompanying implementation supports will change multiple times during initial implementation

  28. Fidelity Data for Program Improvement Program Review Process • Process and Outcome Data • Detection Systems for Barriers • Communication protocols Questions to Ask • What formal and informal data have we reviewed? • What are the data telling us? • What barriers have we encountered? • Would improving the functioning of any Implementation Driver help address the barrier?

  29. Case Example Results from Child Wellbeing Project The model, a clinical case management and home visiting model for families post-care, involved intensive program development of core intervention components and accompanying implementation drivers.

  30. Using Data to Improve Fidelity Case Example • How did Implementation Teams improve fidelity? • Intentional action planning based on implementation drivers assessment data and program data • Improved coaching, administrative support, and use of data to drive decision-making; adapted the model • Diagnosed adaptive challenges, engaged stakeholders, inspired change

  31. Case Example Results from Child Wellbeing Project The Success Coach model involved intensive program development of core intervention components and accompanying implementation drivers

  32. High Fidelity Positive Outcomes Did high-fidelity implementation lead to improved outcomes? Early outcomes include… • Stabilized families • Prevented re-entry of children into out-of-home placements

  33. Fidelity Data Collection Methods, Resources and Feasibility If fidelity criteria are already developed: • Understand the reliability and validity of the instruments • Are we measuring what we thought we were? • Is fidelity predictive of outcomes? (see the sketch after this list) • Does fidelity assessment discriminate between programs? • Work with program developers or purveyors to understand the detailed protocols for data collection • Who collects the data (expert raters, teachers)? • How often are data collected? • How are data scored and analyzed? • Understand issues (reliability, feasibility, cost) in collecting different kinds of fidelity data • Process data vs. structural data
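
The question of whether fidelity is predictive of outcomes becomes a simple association check once both measures exist for the same programs. A minimal sketch using the standard-library statistics.correlation function (Python 3.10+); the numbers are made-up placeholders, not project data:

```python
from statistics import correlation  # Python 3.10+

# Placeholder values only: one fidelity score and one outcome
# measure per program (illustrative, not real data).
fidelity = [0.55, 0.68, 0.74, 0.81, 0.90]
outcomes = [2.1, 2.6, 2.4, 3.2, 3.5]

# Pearson r as a first look at predictive validity.
r = correlation(fidelity, outcomes)
print(f"Pearson r = {r:.2f}")
```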

  34. Summary Program Fidelity • Fidelity has multiple facets and is critical to achieving outcomes • Fully operationalized programs are pre-requisites for developing fidelity criteria • Valid and reliable fidelity criteria need to be collected carefully with guidance from program developers or purveyors • Fidelity is an implementation outcome; effective use of Implementation Drivers can increase our chances of high-fidelity implementation • Fidelity data can and should be used for program improvement

  35. Resources Program Fidelity Examples of fidelity instruments • Teaching Pyramid Observation Tool for Preschool Classrooms (TPOT), Research Edition, Mary Louise Hemmeter and Lise Fox • The PBIS fidelity measure (the SET), described at http://www.pbis.org/pbis_resource_detail_page.aspx?Type=4&PBIS_ResourceID=222 Articles • Sanetti, L., & Kratochwill, T. (2009). Toward developing a science of treatment integrity: Introduction to the special series. School Psychology Review, 38(4), 445–459. • Mowbray, C. T., Holter, M. C., Teague, G. B., & Bybee, D. (2003). Fidelity criteria: Development, measurement and validation. American Journal of Evaluation, 24(3), 315–340. • Hall, G. E., & Hord, S. M. (2011). Implementing Change: Patterns, principles and potholes (3rd ed.). Boston: Allyn and Bacon.

  36. Stay Connected! Allison.metz@unc.edu nirn@unc.edu nirn.fpg.unc.edu www.scalingup.org www.implementationconference.org
