
Strategies for Constructing & Scaling Up Evidence-Based Practices


Presentation Transcript


  1. Strategies for Constructing & Scaling Up Evidence-Based Practices MARCH 2010 PATRICIA CHAMBERLAIN, PHD

  2. The Focus
  • How are evidence-based practices constructed (what goes into them, and why)?
  • How can child outcomes, and the factors that predict (or drive) those outcomes, be measured in "real world" settings?
  • How can evidence-based practice models fit into existing public service systems like juvenile justice and child welfare?
  • How can evidence-based models be scaled up?

  3. Create the Blueprint: Carefully Visualize and Define the Outcome
  • Specificity: "arrests" and "days incarcerated" versus "delinquency"
  • Measurability: observable, from multiple sources, not only self-reports
  • Parsimony

  4. Constructing an EBP: Develop the Plan
  • What do we want to make happen, and for whom?
  • Define specific desired outcomes and how they can be measured.
  • Resist the temptation to focus on too many outcomes. Keep the plan clean and focused.

  5. What Goes into the Plan?
  • Look for high-quality studies that identify risk and protective factors that predict, or are strongly associated with, the outcomes of interest.
  • Randomized controlled trials are the strongest basis for inferring causality.
  • Longitudinal studies that examine development over the lifespan are helpful because they indicate when to intervene (developmental sensitivity).
  • Multiple studies constitute a strong evidence base.
  • Ask which of the risk and protective factors found in the studies are potentially malleable (by us/you).

  6. Structural Plan (diagram)
  Malleable risk factors 1-2 and malleable protective factors 1-2 feed into the sample outcomes: decreased criminal offending, drug use, and pregnancy; increased positive peer relations and school attendance.

  7. Engineering the Intervention (diagram)
  Intervention components are aimed at decreasing risk factors #1 and #2 and at increasing protective factors #1 and #2, which in turn drive the outcomes: decreased criminal offending, drug use, and pregnancy; increased positive peer relations and school attendance.

  8. Testing the Impact of the Intervention (mediation diagram)
  The targeted risk and protective factors serve as mediators: components aimed at decreasing risk factors #1 and #2 and increasing protective factors #1 and #2 change the mediators, which change the outcomes (decreased criminal offending, drug use, and pregnancy; increased positive peer relations and school attendance). Absence of specific intervention components produces significantly less change in outcomes.
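
The mediation logic on this slide can be made concrete with a small simulation: if the intervention works through a targeted risk factor, the treatment effect on the outcome should shrink once the mediator is adjusted for. A minimal sketch with simulated data (variable names and effect sizes are invented; the slide names no specific statistical procedure):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
treated = rng.integers(0, 2, n).astype(float)   # intervention vs. control
risk = 5 - 2.0 * treated + rng.normal(0, 1, n)  # mediator: a malleable risk factor
offending = 1.5 * risk + rng.normal(0, 1, n)    # outcome driven by the risk factor

# Total effect of treatment on the outcome:
total = sm.OLS(offending, sm.add_constant(treated)).fit()
# Direct effect after adjusting for the mediator; strong attenuation
# (here, toward zero) is consistent with mediation:
direct = sm.OLS(offending, sm.add_constant(np.column_stack([treated, risk]))).fit()
print(total.params[1], direct.params[1])
```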

  9. Logic Model for Intervention Effects (diagram)
  Intervention components: supervision; foster parent training and support to implement tracking of specific behaviors; reinforcement of being where you are supposed to be; a daily point and level system; a school card; positive adult mentoring; reinforcement of positive school and home behaviors; and limiting contact with delinquent peer groups.
  Existing factors expected to moderate outcomes: number of CWS placements, age at first placement, number of changes in caregivers, number of previous arrests, age, and gender.
  Outcomes: decreased criminal offending, drug use, and pregnancy; increased positive peer relations and school attendance.

  10. MTFC or Group Care [chart: outcome comparison; group difference significant at p < .05]

  11. MTFC or Group Care [chart: outcome comparison]

  12. Fitting Research into "Real World" Settings
  Ask whether the outcomes being addressed, and the measures of those outcomes, are:
  • Feasible (do not increase burden)
  • Meaningful (fit the system's agenda)
  • Able to capitalize on existing system data

  13. Example in Child Welfare
  • Placement disruptions: between one-third and one-half of children disrupt within the first 12 months of care.
  • Feasible: already tracked in CFSRs
  • Meaningful: rates are high and desirable to decrease
  • Capitalizes on existing system data and is easy to count
  • Costs increase exponentially as the number of disruptions increases
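
To make the last bullet concrete, a purely illustrative sketch; the slide reports no cost figures, so the base cost and growth factor below are invented assumptions:

```python
# Hypothetical numbers: the slide gives none. This only illustrates the shape
# of "costs increase exponentially" with a constant multiplier per disruption.
BASE_COST = 25_000           # assumed annual cost of a first, stable placement ($)
GROWTH_PER_DISRUPTION = 1.5  # assumed cost multiplier per additional disruption

for disruptions in range(5):
    cost = BASE_COST * GROWTH_PER_DISRUPTION ** disruptions
    print(f"{disruptions} disruptions -> ${cost:,.0f}")
```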

  14. Using System Data to Predict Risk Level

  15. Research-Based Risk & Protective Factors for Disruption
  • Risk factors: child behavioral problems; foster parent stress
  • Protective factors: foster parent support; behaviorally based parenting skills

  16. Example of a measure of a risk factor: the Parent Daily Report (PDR), a daily snapshot of risk and protective factors. [Chart: total PDR score and number of behaviors reported per call, 7/20/00 through 1/18/01, each with a linear trend line]
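
The chart's two trend lines are simple least-squares fits over the call series. A minimal sketch of how such a trend could be computed from PDR calls (dates follow the chart's axis; the behavior counts are invented):

```python
from datetime import date
from statistics import mean

# Each tuple is one PDR phone call: (call date, # of problem behaviors reported).
calls = [
    (date(2000, 7, 20), 14),
    (date(2000, 7, 27), 11),
    (date(2000, 8, 3), 9),
]

def linear_trend(series):
    """Least-squares slope of behavior counts per day, like the chart's trend line."""
    x = [(d - series[0][0]).days for d, _ in series]
    y = [count for _, count in series]
    xbar, ybar = mean(x), mean(y)
    num = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    den = sum((xi - xbar) ** 2 for xi in x)
    return num / den

print(linear_trend(calls))  # negative slope: reported behaviors trending down
```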

  17. PDR Scores at Baseline Predict Placement Disruption [Chart: fitted log hazard of placement disruption (0-3) against baseline PDR score (0-20)]
  • A threshold effect: after 6 behaviors, every additional behavior on the PDR increases the probability of disruption by 17%.
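
A worked version of the threshold effect. The slide does not say whether the 17% increase compounds; the sketch below assumes it does (a multiplicative hazard per behavior above the threshold):

```python
def disruption_hazard_multiplier(pdr_behaviors, threshold=6, increase=0.17):
    """Hazard of disruption relative to a child at the threshold, assuming a
    flat hazard below the threshold and a 17% multiplicative increase per
    additional behavior above it."""
    excess = max(0, pdr_behaviors - threshold)
    return (1 + increase) ** excess

# A child reporting 10 behaviors vs. one at the 6-behavior threshold:
print(disruption_hazard_multiplier(10))  # 1.17**4 = ~1.87, i.e. ~87% higher hazard
```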

  18. Research on Uptake and Scaling Up Evidence-Based Practices
  • In the US, 90% of child-serving agencies do not use EBPs
  • The agencies that tend to innovate do so repeatedly
  • The rich get richer, and most fall behind
  • The needs/innovations paradox: the systems most in need are the least likely to innovate

  19. Scaling Up MTFC in California & Ohio
  Who | Where | Discipline
  Patti Chamberlain | CR2P, Oregon | Psychology
  Hendricks Brown | U of Miami | Biostatistics
  Lynne Marsenich | CA Institute for M.H. | Social Work
  Todd Sosna | CA Institute for M.H. | Psychology
  Larry Palinkas | U of Southern CA | Anthropology
  Lisa Saldana | CR2P, Oregon | Psychology
  Peter Sprengelmeyer | CR2P, Oregon | Psychology
  Gerry Bouwman | TFCC Inc, Oregon | Business
  Wei Wang | U of South Florida | Biostatistics
  Patrick Kanary | CIP, Ohio | Social Work
  Courtenay Padgett | CR2P, Oregon | Coordinator

  20. Study Design
  40 non-early-adopting counties are randomized to:
  • 2 implementation conditions (CDT or IND)
  • 1 of 3 time frames (a research resource issue: Cohorts #1, #2, #3)
  Quantitative and qualitative measures:
  - Assess stable, non-malleable factors (population density, # of placements, % minority)
  - Assess "dynamic" malleable factors expected to mediate implementation success (organizational factors, attitudes toward EBPs)
  - Clinical team factors (fidelity, competence, willingness)
  - Child and family factors (behavior change, placement outcomes)
  Implementation success/failure is measured with the Stages of Implementation Completion (SIC). A randomization sketch follows this list.
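
As noted above, a sketch of what the county randomization could look like. The county IDs, seed, and balancing scheme are assumptions for illustration; the trial's actual procedure may have differed:

```python
import random

counties = [f"county_{i:02d}" for i in range(1, 41)]  # 40 hypothetical county IDs

rng = random.Random(42)  # fixed seed so the illustration is reproducible
rng.shuffle(counties)

assignments = {
    county: {
        "condition": ("CDT", "IND")[i % 2],  # 2 implementation conditions
        "cohort": 1 + min(2, i // 14),       # 3 start cohorts: 2007, 2008, 2009
    }
    for i, county in enumerate(counties)
}
print(assignments["county_01"])
```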

  21. Design: Cohort 1 starts in 2007, Cohort 2 in 2008, Cohort 3 in 2009

  22. The Stages of Implementation Completion (SIC): Theoretical Premise
  • Includes steps identified as essential to the successful adoption, implementation, and sustainability of MTFC
  • The protocol measures progress toward a model-adherent program aimed at obtaining outcomes similar to those in the RCTs
  • SIC stages are operationalized and sequential:
  • Engagement: the fit between community needs and the goals of MTFC
  • Procuring fiscal resources
  • Developing a feasible timeline
  • Analyzing the impact of staff recruitment on the organization (readiness)
  • Assessment of long-term sustainability

  23. Stages of Implementation Completion (SIC)
  Measures implementation at multiple levels: system, practitioner, child/family.
  8 stages, and who is involved:
  1. Engagement | System
  2. Considering feasibility | System
  3. Planning/readiness | System, Practitioner
  4. Staff hired and trained | Practitioner
  5. Fidelity monitoring process in place | Practitioner, Child/Family
  6. Services and consultation begin | Practitioner, Child/Family
  7. Fidelity, competence, & adherence | Practitioner, Child/Family
  8. Sustainability (certification) | System, Practitioner

  24. Activities Within the 8 SIC Stages
  Stage 1: Engagement
  1.1 Date site is informed services/program available
  1.2 Date interest is indicated
  1.3 Date site agreed to consider implementation
  1.4 Date site declined to consider implementation; Stage 1 discontinued
  Stage 3: Readiness planning
  3.1 Date of cost/funding plan review
  3.2 Date of staff sequence, timeline, and hiring plan review
  3.3 Date of foster parent (FP) recruitment plan review
  3.4 Date of referral criteria plan review
  3.5 Date written implementation plan completed
  3.6 Date Stage 3 discontinued

  25. Stage 4: Staff hired & trained
  4.1 Date service provider selected
  4.2 Date 1st staff hired
  4.3 Date clinical training scheduled
  4.4 Date clinical training held (count of # of staff trained)
  4.5 Date FP training scheduled/held
  4.6 Date Stage 4 discontinued
  Stage 6: Services and consultation to services begin
  6.1 Date of first placement
  6.2 Date of first consult call
  6.3 Date of first clinical meeting video review (count of videos)
  6.4 Date of first foster parent meeting video review (count of videos)
  6.5 Date Stage 6 discontinued
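
Because each SIC activity is a dated event, a site's progress reduces to a small data structure. A minimal sketch of one way to record it (field names are hypothetical, not the study's actual database schema):

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

@dataclass
class SICActivity:
    """One dated activity within a SIC stage, e.g. 4.2 'Date 1st staff hired'."""
    code: str                         # e.g. "4.2"
    label: str
    completed: Optional[date] = None  # None until the activity happens

@dataclass
class SICStage:
    number: int                       # 1-8
    name: str
    activities: List[SICActivity] = field(default_factory=list)
    discontinued: Optional[date] = None  # set if the site drops out here

    def duration_days(self) -> Optional[int]:
        """Days from first to last completed activity (a quantity-scale input)."""
        dates = [a.completed for a in self.activities if a.completed]
        if len(dates) < 2:
            return None
        return (max(dates) - min(dates)).days

stage4 = SICStage(4, "Staff hired & trained", [
    SICActivity("4.1", "Service provider selected", date(2008, 2, 1)),
    SICActivity("4.4", "Clinical training held", date(2008, 4, 15)),
])
print(stage4.duration_days())  # 74
```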

  26. Two Scales on the SIC
  • Quantity: performance-date driven; tracks completion of activities
  • Quality: performance-ratings driven; relies on ratings by sites & trainers

  27. Example of Measuring Quantity (days), Stage 1
  Time Variable | Mean | Range
  Time to Decline | 100.47 | 3-1020
  Time to Consent | 70.75 | 0-533
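
The quantity scale is simple date arithmetic: elapsed days between milestone dates. A sketch with invented dates (the 70-day example happens to sit near the reported mean time to consent):

```python
from datetime import date
from statistics import mean

informed = date(2007, 1, 15)   # hypothetical: site told the program is available
consented = date(2007, 3, 26)  # hypothetical: site agrees to participate

time_to_consent = (consented - informed).days
print(time_to_consent)  # 70 days, close to the reported mean of 70.75

# Aggregating invented per-site durations into the slide's mean and range:
durations = [70, 0, 533, 80]
print(mean(durations), f"{min(durations)}-{max(durations)}")
```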

  28. SIC Progress by County [chart. Legend: black = Cohort 1, blue = Cohort 2, yellow = Cohort 3, red = discontinued, beige shading = discontinued activity]

  29. Examples of Quality Measures
  Measure | Rated by
  Stage 2: Consideration of feasibility
  • Ratings of system leaders' interest | MTFC trainers
  • Stakeholder feedback | System leaders
  Stage 3: Planning and readiness
  • Planning meeting impressions | MTFC trainers
  • Ratings of helpfulness of planning activities | Site participants
  Stage 4: Staff hired and trained
  • Pre-training ratings of MTFC | Clinical team
  • Trainer impressions | MTFC trainers
  • Trainee impressions | Clinical team
  • PS, FP, team, & organization ratings | MTFC trainers

  30. Examples of Quality [chart]

  31. Next Steps on the SIC
  • See sites through Stage 8
  • Finalize the most appropriate scale scores
  • Assess whether implementation condition (CDT vs. IND) affects the quantity and/or quality scales
  • Assess how quantity and quality are related
  • Use other study measures to validate the SIC and assess its ability to predict successful implementation
  • Validate with non-study MTFC sites
  • Validate with other EBPs

  32. What Does It Take to Scale Up Evidence-Based Practices?
  • Top-down and bottom-up buy-in
  • Mapping the "fit" between the intervention and the mission of the agency/system
  • Assessing how the activities/structures of the intervention disrupt daily duties & requirements (paperwork, court appearances, home visits, on-call)
  • Planning for change and instability (leadership turnover, funding ends)

  33. Early Results on Predictors of Implementation
  • Densely populated counties that placed the largest numbers of youth were the fastest to consent
  • System leaders with the largest social networks were the "fence sitters"
  • Systems with a positive organizational climate and high motivational readiness to change were the most likely to implement

  34. References
  - Chamberlain, P., Brown, C. H., Saldana, L., Reid, J., Wang, W., Marsenich, L., Sosna, T., Padgett, C., & Bouwman, G. (2008). Engaging and recruiting counties in an experiment on implementing evidence-based practice in California. Administration and Policy in Mental Health and Mental Health Services Research, 35(4), 250-260.
  - Chamberlain, P., Saldana, L., Brown, H., & Leve, L. D. (in press). Implementation of multidimensional treatment foster care in California: A randomized control trial of an evidence-based practice. In M. Roberts-DeGennaro & S. J. Fogel (Eds.), Empirically supported interventions for community and organizational change. Chicago: Lyceum.
  - Hoagwood, K., & Olin, S. (2002). The NIMH blueprint for change report: Research priorities in child and adolescent mental health. Journal of the American Academy of Child and Adolescent Psychiatry, 41, 760-767.
  - NIMH (2004). Treatment research in mental illness: Improving the nation's public mental health care through NIMH funded interventions research. Washington, DC: Author.
