

  1. Improving Prevention Effectiveness. The Maryland Alcohol and Drug Abuse Administration Annual Management Conference, October 5, 2006. William B. Hansen, Ph.D., Tanglewood Research, Inc., Greensboro, NC

  2. SAMHSA’s Strategic Prevention Framework • Assessment: Profile population needs, resources, and readiness to address needs and gaps • Capacity: Mobilize and/or build capacity to address needs • Planning: Develop a comprehensive strategic plan • Implementation: Implement evidence-based prevention programs and activities • Evaluation: Monitor, evaluate, sustain, and improve or replace those that fail

  3. SAMHSA’s Strategic Prevention Framework (as a cycle): Assess → Develop Capacity → Select a Strategy → Implement → Evaluate

  4. What Do Programs Want To Do? (the same SPF cycle diagram)

  5. What Do Programs Want To Do When They Are Required to Evaluate? (the same SPF cycle diagram) 1. The quality of delivery. 2. The effects achieved.

  6. What Do We Mean: Quality of Delivery? • Dosage: How much? How often? • Adherence: Was the program delivered as intended? Was new content added? Was important content deleted or modified? • Relevance: Was the program engaging to participants? Did the program meet participants’ needs?
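
In practice, these three dimensions can be captured as a simple per-session record. A minimal Python sketch (the schema and field names are hypothetical, not part of any All Stars or LST instrument):

```python
from dataclasses import dataclass

@dataclass
class SessionRecord:
    """One observed delivery of a program session (hypothetical schema)."""
    session_id: int
    minutes_delivered: int    # dosage: how much
    sessions_to_date: int     # dosage: how often
    objectives_met: int       # adherence numerator
    objectives_total: int     # adherence denominator
    content_added: bool       # adherence: was new content added?
    content_deleted: bool     # adherence: was important content removed?
    engagement_rating: float  # relevance: observer rating, e.g. 1-5

    @property
    def adherence(self) -> float:
        """Percent of lesson objectives met."""
        return 100 * self.objectives_met / self.objectives_total
```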

  7. Dosage 1 • Meta-analysis of 25 SAMHSA model programs. • Programs that were delivered more frequently generally had larger effects.

  8. Dosage 2 • The same 25 SAMHSA model programs. • Programs that had more opportunities for contact were generally more effective.
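
The dose-effect pattern described in these two slides is, at bottom, a correlation between delivery counts and effect sizes across programs. A sketch of that computation with illustrative numbers (not the actual values from the 25-program meta-analysis):

```python
import numpy as np

# Illustrative data only: one row per program, not the real
# values from the 25 SAMHSA model programs.
sessions = np.array([5, 8, 10, 12, 15, 20, 24, 30])             # sessions delivered
effect = np.array([.05, .08, .10, .12, .15, .14, .18, .22])     # program effect sizes

r = np.corrcoef(sessions, effect)[0, 1]
print(f"dose-effect correlation r = {r:.2f}")  # more sessions, larger effects
```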

  9–11. All Stars Dosage Tracking (a series of dosage-tracking charts)

  12–14. Adherence • An evaluation of Life Skills Training. • Observers rated the percent of objectives met and lesson points covered. • High-fidelity classes (>60% adherence) did best.

  15. Local Adherence • Drug Strategies assessed the adherence of Life Skills Training as implemented in Baltimore. • Teachers re-taught lessons they had previously delivered. • Observers rated adherence. • Teachers implemented 65% of objectives (range = 45–100%). • Teachers implemented 58% of main points (range = 38–93%).
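
Adherence figures like these are simple ratios: items covered divided by items on the observer's checklist. A minimal sketch with hypothetical checklist data:

```python
# Hypothetical observer checklists: True = objective was covered.
observed = {
    "Lesson 1": [True, True, False, True],
    "Lesson 2": [True, False, False, True, True],
}

for lesson, checks in observed.items():
    pct = 100 * sum(checks) / len(checks)
    print(f"{lesson}: {pct:.0f}% of objectives implemented")
```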

  16. Local Adaptation • All teachers made adaptations • 3.5 definable adaptations, on average, per observed session (range 1 to 7) • Overall, 63% of adaptations were judged to be negative

  17. Helpful Adaptations • The addition of reading material, videos, and testimonials • Changes in methods to make them more interactive • Inclusion of examples for cultural relevance or interest

  18. Important Correlates of Adherence • Teacher’s Understanding of LST (r = .784, p < .01) • Quality of Process (r = .663, p = .03) • Level of Experience (r = .756, p < .01)
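
Correlates like these are ordinary Pearson correlations computed over per-teacher scores. A sketch using scipy with made-up scores (the slide's r and p values come from the actual classroom observations):

```python
from scipy.stats import pearsonr

# Made-up per-teacher scores, for illustration only.
understanding = [3.2, 4.5, 2.8, 4.9, 3.7, 4.1, 2.5, 4.8, 3.9, 4.3]
adherence_pct = [55, 80, 48, 95, 62, 70, 45, 90, 68, 77]

r, p = pearsonr(understanding, adherence_pct)
print(f"r = {r:.3f}, p = {p:.3f}")
```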

  19. All Stars Approach to Assessing Adherence

  20. Relevance • As part of the SAMHSA program meta-analysis, we coded relevance. • Three aspects of relevance were significantly correlated with outcomes.

  21. All Stars Approach to Assessing Student Engagement

  22. Quality of Implementation: Summary. Data are needed to assess: • Whether a sufficient dose has been delivered • How closely delivery adhered to design • The relevance of implementation for participants. Gathering and reporting data will improve the quality of implementation.

  23. Outcome Evaluation • Everybody is afraid of outcome evaluation. • Why? • No one likes to fail. • It is perceived to be mysterious, complex, and expensive. • Outcomes are not controllable. • Being afraid shows you are normal.

  24. What Are Prevention Goals? The goal of prevention is not behavior change but one of the following: • Maintaining non-use • Delaying onset • Reducing the intensity of use
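
Delay in onset, in particular, is typically evaluated as time-to-first-use: the share of each group still abstinent at each follow-up wave. A plain-Python sketch with invented follow-up data:

```python
# Invented data: wave (or age) of first use per participant; None = never used.
program = [None, 14, None, 15, None, None, 16, None]
control = [13, 14, None, 13, 15, None, 14, 15]

def still_nonusers(group, wave):
    """Proportion of the group with no onset by the given wave."""
    return sum(1 for onset in group if onset is None or onset > wave) / len(group)

for wave in (13, 14, 15, 16):
    print(wave,
          f"program {still_nonusers(program, wave):.2f}",
          f"control {still_nonusers(control, wave):.2f}")
```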

  25. How Do Programs Work? • All programs are based on a logic model. • The program changes a mediator: characteristics of the participant (skills, motivation), characteristics of the social environment, or characteristics of the physical environment. • The characteristics targeted for change, in turn, affect behavior.
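
This logic implies a testable mediation model: the program should move the mediator, and the mediator should in turn predict behavior. A regression-based sketch with simulated data (statsmodels; a generic illustration, not the All Stars analysis itself):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 400
program = rng.integers(0, 2, n)                 # 0 = control, 1 = program
mediator = 0.5 * program + rng.normal(size=n)   # program shifts the mediator
behavior = -0.4 * mediator + rng.normal(size=n) # mediator reduces use

# Step 1: does the program change the mediator?
m_on_x = sm.OLS(mediator, sm.add_constant(program)).fit()
# Step 2: does the mediator predict behavior, controlling for the program?
y_on_mx = sm.OLS(behavior,
                 sm.add_constant(np.column_stack([program, mediator]))).fit()

print(m_on_x.params)   # program -> mediator path
print(y_on_mx.params)  # mediator -> behavior path, given program
```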

  26. All Stars Logic Model Example All Stars Core targets: • Lifestyle incongruence (idealism) • Normative beliefs • Commitment • Bonding to school • Positive parental attentiveness

  27. Mediators in Prevention • Motivation: attitudes, bonding, beliefs about consequences, commitment, normative beliefs, lifestyle incongruence • Personal Competencies: academic skills, decision-making skills, emotional self-regulation, goal-setting skills, self-esteem • Social Competencies: resistance skills, media literacy, communication skills, social problem-solving skills, social skills • Environment: availability, access, and enforcement; alternatives; classroom management; family management; monitoring and supervision; positive peer affiliations; support and involvement

  28. The Easy Part of Outcome Evaluation • Collecting survey data is easy. • After more than 30 years of development, there are: • Many measures for assessing alcohol, tobacco, and drug use and the consequences of use. • Many measures for assessing mediators (the risk and protective factors targeted for change).
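
Scoring such measures is usually just a matter of averaging Likert items after flipping any reverse-coded ones. A sketch with a hypothetical five-point scale:

```python
def score_scale(responses, reverse_items=(), points=5):
    """Mean of Likert items after flipping reverse-coded ones (hypothetical scale)."""
    items = [(points + 1 - v) if i in reverse_items else v
             for i, v in enumerate(responses)]
    return sum(items) / len(items)

# One respondent's answers on a 5-point scale; item 2 is reverse-coded.
print(score_scale([4, 5, 2, 4], reverse_items={2}))  # -> 4.25
```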

  29–31. Sample Outcome Results: All Stars in a Community Setting (a series of outcome charts)

  32. What Do You Do When Programs Succeed? (the SPF cycle diagram)

  33. Results Looking at Mediators

  34. Results of the All Stars Community Trial

  35. Sample Local Evaluation Results from a Community Program in MN

  36. What Do You Do When You Have Not Yet Succeeded? (the SPF cycle diagram)

  37. Working Backwards: Evaluate → Implement → Develop Capacity

  38. How To Develop Capacity? • Two predictors of quality implementation: • Experience • Training

  39. Experience Counts! • Teachers with more experience were: • More adherent when they taught LST (r = .630) • More likely to meet objectives (r = .590) • More likely to cover major points (r = .756) • More likely to make positive adaptations (r = .577)

  40. Developing Skill

  41. Training Counts, Too! • Teachers’ understanding of LST was strongly correlated with adherence (r = .784).

  42. Improving Understanding • Program-specific training • Coaching and feedback • Independent study

  43. Program-Specific Training • Introductory training • Basics of theory and methods • Technical assistance • Help with specific issues and adaptations • Booster training • When important questions are asked • Certification of Mastery • A process of demonstration and certification

  44. Does Training Matter? • Video Training Project • Two conditions • 3-hour course without video • 3-hour course with video • Topic: Norm Setting • Knowledge pretest-posttest survey http://www.PreventionABCs.com
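
A two-condition pretest-posttest design like this is commonly analyzed by comparing knowledge gains across conditions. A sketch with invented scores (an independent-samples t-test on gain scores; the actual project's analysis may have differed):

```python
import numpy as np
from scipy.stats import ttest_ind

# Invented pretest/posttest knowledge scores, for illustration only.
no_video_gain = np.array([72, 75, 70, 78, 74]) - np.array([60, 62, 61, 65, 63])
video_gain = np.array([80, 84, 79, 86, 82]) - np.array([61, 63, 60, 64, 62])

t, p = ttest_ind(video_gain, no_video_gain)
print(f"t = {t:.2f}, p = {p:.3f}")  # did the video condition learn more?
```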

  45. All Stars Certification of Mastery • Basic All Stars facilitator training • Implement one cycle of All Stars • Videotape implementation with feedback about mechanisms of delivery • Videotape implementation with feedback about interactivity • Implement Strategies for Success • Implement Parent Intervention • Complete online course (Prevention ABCs) • Videotape to demonstrate expert delivery • Improved student outcomes

  46. Just-In-Time Support • New project • Life Skills Training • Emailed helpful hints just before you teach • Links to a streaming video demonstration • Recruiting test schools lindadusenbury@tanglewood.net 1-888-692-8412

  47. Conclusion • I was asked to answer two questions: • How can prevention programs use data to improve program effectiveness? • How can programs become data-driven?

  48. Improving Effectiveness • How can prevention programs use data to improve program effectiveness? • Quality of implementation data • Behavioral outcome data • Targeted mediating variable data • All improve the potential of a program to be implemented with greater rigor

  49. Improving Effectiveness • How can programs become data-driven? • Collect data • Look at the data you have collected • Start with modest expectations • Find meaning in the data • Find alternatives in the data
