
Implementing EBHV Programs with Fidelity - translating science into practice
2010 Strengthening Families Training Institute, March 16, 2010
Presented by: Joan Sharp, MA, Executive Director & Nancy Gagliano, LICSW, Programs & Evaluation Director, Council for Children & Families


Presentation Transcript


  1. Implementing EBHV Programs with Fidelity - translating science into practice
2010 Strengthening Families Training Institute, March 16, 2010
Presented by: Joan Sharp, MA, Executive Director & Nancy Gagliano, LICSW, Programs & Evaluation Director, Council for Children & Families

  2. Topics we will cover
• CCF Historical Funding Approach
• New EBHV Dollars - New Game
• Initial Logic Model for Funding EBHV
• Assumptions, Expectations & Early Realizations
• Design of an Evaluation Plan
• What Does It Really Take to Implement EBHV Programs in Diverse Communities?

  3. Council for Children & Families
• Created by the legislature in 1982
• Supported by the state general fund, CBCAP, the Children's Trust Fund, and private donations
• Three activities: funding community-based programs, public awareness/education, and partnerships

  4. CCF Historical Funding Approach – organizational capacity building in the development and use of information to guide services
• 12-15 programs each year; 3-year funding cycle
• Parent Education/Training, Parent Support and Mentoring, and Home Visiting
• Local communities choose among program types and focuses based on local needs, capacity, and interest
• Capacity-Building Framework:
  • Community needs assessment
  • Research
  • Support programs in developing evaluation processes for quality assurance, program development, and sustainability!

  5. New EBHV Dollars… New Game!
• In 2007 the Washington State legislature dramatically increased its investment in child abuse and neglect prevention and early school readiness by providing new dollars for implementation of EBHV programs.
• $3.2 million over a 2-year period for CCF to fund evidence-based home visitation programs across the state.
• Earmarked:
  • $185K of the EBP funding for underserved rural and/or tribal applicants
  • $400K based on input from the NFP consortium
  • $150K per Thrive by Five demonstration site

  6. New Dollars, New Game cont… Which EBHV Models to Fund?
• CCF Research Advisory Committee approved EBHV models
• A matrix of models was recommended for those submitting a proposal
• Three levels of evidence approved:
  • Best Support
  • Good Support
  • Promising Practice

  7. A Portfolio is Created… fast turn-around… legislative approval; funding went out to 13 different organizations implementing five different EBHV models:
• NFP – Nurse Family Partnership
• PAT – Parents as Teachers
• STEEP – Steps Towards Effective Enjoyable Parenting
• PCHP – Parent-Child Home Program
• Project SafeCare

  8. October 2007: Logic model for funding EBHV programs

  9. Everything in Perfect Order! We had great intentions, perfect assumptions, and then…

  10. Home Visiting Works, but… Assumptions and Expectations (don't always meet reality!)
• Home visiting is among the best-tested prevention and early intervention models
• The relative lack of effective prevention strategies
• The potential for feasible, large-scale community-based services where people live
• Yet the promise of the research is not consistently supported in routine practice
• We can't launch, forget, and get the same results

  11. Assumptions & Expectations
• Implementing with fidelity yields effective practice; programs are willing to engage around fidelity implementation:
  • Good understanding of the model
  • Clear definition of model elements
  • Model element guidelines actually exist
  • Able to implement with some degree of fidelity
• Model developers are offering adequate technical assistance, monitoring, and support for implementation and development
• Programs have internal capacity for outcomes and process evaluation and are using data to inform practice
• Programs have organizational capacity – internal infrastructure and support

  12. Early Realizations
• Capacity challenges – some organizations had limited to no understanding of:
  • logic models
  • collecting basic demographic data
  • contract compliance
  • reporting on outputs (a source of confusion)
• Process measures related to fidelity:
  • measures demonstrating that implementation was faithful to the model were lacking
  • yet programs claimed: "We are implementing the program with 100% fidelity."

  13. Early Realization… programs don't know what they don't know
Long-term outcome: Implement EBHV programs with fidelity = outcomes.
• Asking the question was not enough!
  • Programs said they were implementing with fidelity
  • But did they really understand fidelity?
  • How did the different program models actually measure fidelity?
  • How consistently?
• Were programs going to achieve the outcomes that the models promised?

  14. Early Realization… funders don't know what they don't know either!
Long-term outcomes for the funder:
• Demonstrate child/parent benefits of a significant degree to justify the investment of state dollars
• Document the benefit of implementing multiple home visiting models under this state program

  15. Design of an Evaluation Plan – Call in WSU
• Policy goals – system-level outcomes to be addressed in the evaluation:
  • Create state standards for program delivery and improvement of quality in Washington State home visiting
  • Develop a learning community around home visiting and early intervention that can support progressive improvements in quality
  • Test the practicality and relative benefit of this multi-method approach to delivering home visiting as a sustainable part of the state continuum of care
• Not enough money to do an RCT – and would it work with multiple models?

  16. Where does Washington fit in the national landscape?
• Most states have some level of state home visiting initiative, falling into two strategies:
  • Single-strategy efforts, dominated by Healthy Families America practice
  • Portfolio strategies involving local choice and control
• Single-strategy approaches have the strongest evaluations and as a result dominate the current policy discussion
• Portfolio-based initiatives have weak evaluations or are not collecting and reporting data
• Addressing portfolio model evaluation is a significant area of needed work

  17. The original evaluation questions
• Does the routine use of home visiting programs using various evidence-informed protocols collectively result in better child and caregiver outcomes?
• Can we demonstrate significant benefit to justify investment of state dollars?
• Can we document benefit across a balanced portfolio approach to support the continuation of this approach?

  18. Design of an Evaluation Plan – WSU
• Look at the research – a comprehensive literature review on home visiting
• Start with the programs before looking at the multi-method approach and child/parent outcomes

  19. WSU Reviewed the Research: Evaluating EBHV & Implementing with Fidelity
• Translation of evidence-based home visiting models from randomized controlled trials into local program practice is very challenging.
• Improving program quality and implementing the model with fidelity is a major issue for the field.
• Organizational conditions and capacity are the key to successful implementation of an EBHV model.

  20. Organizational conditions for adopting evidence-based programs (Fixsen et al., 2006)
• Support for adoption across leadership & treatment staff
• Organizational leadership skills to support adoption of new practices
• Staff skill level – training in specific home visiting model skills
• Information management system and use of data for quality improvement
• Capacity to implement:
  • Staff retention
  • Supervisory capacity and skills
  • Family engagement capacity and skills
• Capacity to develop & sustain information-driven problem solving:
  • Quality improvement practices, staff development, continuing family engagement
  • Use of information and outcomes in program development

  21. The site visit and the "Discussion Tool"
• How does a program's organizational capacity affect implementing with fidelity?
• Not only do we need to ask core component/fidelity questions, but we also have to find a way to assess organizational capacity.
• We got a little help from our "FRIENDS" at the National Resource Center for Community-Based Child Abuse Prevention

  22. FRIENDS and the Tailored Discussion Tool
• Integrating Evidence-Based Practices into CBCAP Programs: A Tool for Critical Discussions
• Utilized Appendix C – the Capacity Checklist for Implementing with Fidelity
• CQI Self-Assessment Document
• WSU incorporated questions on data management capacity & programs' ability to use data to inform program practice

  23. Framework for site visits
• Model Components/Fidelity
• Staff Experience
• Staff Training and Monitoring
• Outcome Measurement/Quality Assurance
• Community Capacity
• Support Available from the Program Developer or Other Technical Assistance Provider
• Funding Availability
• Overall Assessment

  24. The Reality Sets In – findings support the research
• Programs vary in their organizational capacity to deliver their programs
• Data collection and information use is a common area that needs further development and support
• Existing outcome assessment of the model is either limited or involves measurement strategies that do not meet reliability and validity standards
• Bottom line: programs need significant support in outcomes assessment and in using the information for program improvement and clinical decision making

  25. CCF EBHV evaluation goals update
Policy goals:
• Create state standards for program delivery and assurance of quality EBHV dissemination in Washington State
• Support a learning community regarding home visiting and early intervention to inform practice
• Test the practicality and relative benefit of this multi-method approach
Program goals:
• Provide meaningful information and staff development that informs clinical decision making and program development through a continuous quality improvement process

  26. Analysis and proposed actions to guide the evaluation plan
Analysis:
• Implementing with fidelity is critical to producing intended model results
• Emphasis on fidelity varies across models
• How fidelity is measured varies widely
• Support from model developers in addressing fidelity is inconsistent
CCF evaluation plan actions:
• Work with developers on fidelity measurement
• Define common minimum standards across models
• Develop a data collection strategy with programs
• Develop an evaluation TA plan with program sites when needed

  27. Beginning to create a common fidelity framework for CCF programs
• Recruitment aligns with the model's intended service population
• Recruitment process standards are met; when there are exceptions (e.g., extension of a model to a new population), the outcomes are confirmed as consistent with the original model. What occurs when the local population is different?
• Program caseload structure for the model is maintained as required by the model developers
• Minimum standards for the model's service location, focus, and frequency criteria are met
• Supervision meets minimum standards
A sketch of checking a site's reports against such standards follows this list.
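
To make these checks concrete, here is a minimal Python sketch of comparing a site's reported practice against per-model minimum standards. The standard fields, thresholds, and example values are hypothetical illustrations, not CCF's or any model developer's actual requirements.

```python
# Hypothetical sketch: per-model minimum standards vs. a site's report.
from dataclasses import dataclass

@dataclass
class FidelityStandards:
    """Minimum standards a model developer sets for local sites (assumed fields)."""
    max_caseload: int             # families per home visitor
    min_visits_per_month: float   # required contact frequency
    min_supervision_hours: float  # supervision hours per month

@dataclass
class ProgramReport:
    """What a local site reports for a review period (assumed fields)."""
    caseload: int
    visits_per_month: float
    supervision_hours: float

def fidelity_findings(std: FidelityStandards, rpt: ProgramReport) -> list[str]:
    """Return the standards the program did not meet during the period."""
    findings = []
    if rpt.caseload > std.max_caseload:
        findings.append(f"caseload {rpt.caseload} exceeds model maximum {std.max_caseload}")
    if rpt.visits_per_month < std.min_visits_per_month:
        findings.append("visit frequency below model minimum")
    if rpt.supervision_hours < std.min_supervision_hours:
        findings.append("supervision hours below model minimum")
    return findings

# Illustrative values only: a site slightly over caseload and under supervision.
standards = FidelityStandards(max_caseload=25, min_visits_per_month=2, min_supervision_hours=4)
report = ProgramReport(caseload=28, visits_per_month=2.5, supervision_hours=3)
print(fidelity_findings(standards, report))
```

An empty findings list means the reported period met the sketched minimums; anything returned becomes a discussion point for the site visit rather than an automatic judgment of non-fidelity.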

  28. Analysis and proposed actions to guide the evaluation plan
Do programs have adequate internal capacity to describe their services, measure outcomes, and use information to improve practice?
• Agency support
• Information systems
• Programs using data to inform practice
CCF evaluation plan actions:
• Adopt an agreed-to baseline-to-outcome assessment
• Develop data sharing and a data warehouse
• Provide TA to address training in data collection and its use in CQI
• Individualized agency plans

  29. Creating a common data system
• Align our expectations to resources and capacity
• Data elements to share, based on the model:
  • Participant demographics
  • Participant needs
  • Participant service summary (e.g., staff assigned, frequency and duration of contacts, start and end dates)
  • Baseline-to-outcome measures that are valid and aligned to the model's principal evidence-based claims
  • Baseline and a minimum six-month assessment
  • Protective Factors Survey
• The essential role of CQI
• Training and technical assistance as actions integrated with evaluation
A sketch of these shared elements as a record structure follows this list.
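
For illustration, the shared data elements above could be expressed as a record structure like the Python sketch below. Every field name and type here is an assumption made for the sketch, not the actual CCF/WSU data dictionary.

```python
# Hypothetical sketch of the shared data elements for one participant.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class ServiceSummary:
    staff_assigned: str
    start_date: date
    end_date: Optional[date]       # None while services are ongoing
    contacts_to_date: int          # number of home visits so far
    avg_contact_minutes: float     # typical visit duration

@dataclass
class ParticipantRecord:
    participant_id: str            # site-local ID; de-identified before transfer
    model: str                     # e.g., "NFP", "PAT", "STEEP", "PCHP", "SafeCare"
    demographics: dict[str, str]   # e.g., {"zip": "98101", "primary_language": "es"}
    identified_needs: list[str]
    service: ServiceSummary
    pfs_baseline: Optional[float] = None    # Protective Factors Survey, at intake
    pfs_six_month: Optional[float] = None   # minimum six-month follow-up

def pfs_change(rec: ParticipantRecord) -> Optional[float]:
    """Baseline-to-outcome change when both assessments exist, else None."""
    if rec.pfs_baseline is None or rec.pfs_six_month is None:
        return None
    return rec.pfs_six_month - rec.pfs_baseline
```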

  30. A state home visiting management information system
• Work with existing management information systems or help develop them:
  • With a functioning MIS, develop data sharing agreements and transfer de-identified data
  • Without an MIS, develop and support a data entry system aligned with the model's reporting and extract de-identified data
• Create a state HV data warehouse that can produce on-demand reports and support CQI in service delivery
• CCF reports:
  • Program-level analysis of services and outcomes
  • Minimum quarterly data reporting
A de-identification sketch follows this list.
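
The slide does not specify how de-identification was carried out; one common approach, sketched below, drops direct identifiers and replaces the site-local ID with a keyed one-way hash, so the warehouse can link a participant's records across quarterly extracts without ever receiving an identity. The field names and salt handling are assumptions.

```python
# Hypothetical sketch: prepare a record for transfer to the state warehouse.
import hashlib

SITE_SALT = "per-site-secret"  # assumed: a secret held by the site, never sent

DIRECT_IDENTIFIERS = {"name", "address", "phone", "date_of_birth"}

def deidentify(record: dict) -> dict:
    """Return a copy safe to transfer: identifiers dropped, ID hashed."""
    out = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    # The keyed hash lets quarterly extracts link the same participant
    # while the warehouse cannot reverse the ID.
    out["participant_key"] = hashlib.sha256(
        (SITE_SALT + record["participant_id"]).encode("utf-8")
    ).hexdigest()
    del out["participant_id"]
    return out

# Illustrative record; all values are invented.
print(deidentify({
    "participant_id": "A-102", "name": "Jane Doe", "model": "PAT",
    "pfs_baseline": 3.4, "pfs_six_month": 4.1,
}))
```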

  31. Conclusion
• Opportunity to examine what a portfolio EBHV approach requires
• Recognize that programs need to be active partners, with resources and support
• Phased development, TA, and training
• Central role of open-ended CQI and information-driven decision making
• Pace and scope of effort are rate-limited by available resources

  32. Questions?
Joan Sharp: 206-464-5493, joan@ccf.wa.gov
Nancy Gagliano, LICSW: 206-389-3297, nancy@ccf.wa.gov
