
MAKING A DIFFERENCE

Promoting Science-based Approaches: Bridging Research and Practice by Integrating Research to Practice Models and Community-Centered Models (ISF). Abraham Wandersman, wandersman@sc.edu, U. of Connecticut, April 2010.




Presentation Transcript


  1. Promoting Science-based Approaches: Bridging Research and Practice by Integrating Research to Practice Models and Community-Centered Models (ISF). Abraham Wandersman, wandersman@sc.edu, U. of Connecticut, April 2010

  2. MAKING A DIFFERENCE

  3. MAKING A DIFFERENCE • HOW DO WE GET THERE?

  4. THE 2015 TARGET DATE FOR ELIMINATING SUFFERING AND DEATH DUE TO CANCER:

  5. AMBITIOUS GOALS

  6. Dr. von Eschenbach: I believe we are at what I call a strategic inflection in biology, which means we're at a point of unprecedented growth in three key areas related to cancer research: knowledge, technology, and resources. The integration of growth in these three sectors provides an opportunity for exponential progress. To achieve this progress, we must set a clear direction and focus our efforts into a cohesive strategy.

  7. The goal of eliminating suffering and death due to cancer provides this focus. It does not mean "curing" cancer but, rather, it means that we will eliminate many cancers and control the others, so that people can live with -- not die from -- cancer. We can do this by 2015, but we must reach for it. We owe it to cancer patients around the world -- and their families -- to meet this challenge. May 16, 2003 BenchMarks

  8. HEALTHY PEOPLE 2010

  9. Healthy People 2010 Objectives • Target: 1.0 new case per 100,000 persons. • Baseline: 19.5 cases of AIDS per 100,000 persons aged 13 years and older in 1998. Data are estimated; adjusted for delays in reporting. • Target setting method: Better than the best. • Data source: HIV/AIDS Surveillance System, CDC, NCHSTP.
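To make the scale of that objective concrete, the baseline and target quoted on the slide imply a very large percent reduction. The arithmetic below is an illustration added here, not part of the original presentation; it uses only the two figures stated above:

```python
# Illustrative arithmetic only, using the figures quoted on the slide.
baseline = 19.5  # new AIDS cases per 100,000 persons aged 13+ (1998 baseline)
target = 1.0     # Healthy People 2010 target, per 100,000

# Percent reduction implied by the "better than the best" target
reduction = (baseline - target) / baseline * 100
print(f"Implied reduction: {reduction:.1f}%")  # prints "Implied reduction: 94.9%"
```

In other words, meeting the target would require roughly a 95% drop from the 1998 baseline rate, which is why the talk treats these as ambitious goals.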

  10. In 2007, there were 42,495 new cases of HIV/AIDS in adults and adolescents (2500)

  11. DATA - EVIDENCE

  12. WHY IS EVIDENCE/SCIENCE NOT USED MORE?

  13. Expanding Research and Evaluation Designs…for QII Carolyn M. Clancy, MD Director, AHRQ September 13, 2005

  14. [Figure: the pipeline from original research to implementation, with stages (submission, acceptance, publication, bibliographic databases, reviews/guidelines/textbooks, implementation), losses along the way (negative results: 18%, Dickersin, 1987; 46%, Kumar, 1992, Koren, 1989; 35%, Poyer, 1982, Balas, 1995; lack of numbers; inconsistent indexing; expert opinion, 50%, Antman, 1992, Poynard, 1985), and delays between stages (0.5, 0.6, and 0.3 years; 6.0-13.0 years; 9.3 years). Caption: It takes 17 years to turn 14 percent of original research to the benefit of patient care.]

  15. Treatments Thought to Work but Shown Ineffective • Sulphuric acid for scurvy • Leeches for almost anything • Insulin for schizophrenia • Vitamin K for myocardial infarction • HRT to prevent cardiovascular disease • Flecainide for ventricular tachycardia • Routine blood tests prior to surgery • ABMT for late stage Breast CA BMJ February 28 2004; 324:474-5.

  16. THE GAP BETWEEN SCIENCE AND PRACTICE • IN THE DOCTOR’S OFFICE

  17. OVERALL, 54.9% RECEIVED RECOMMENDED CARE (ASCH ET AL., NEJM, 2006)

  18. POSSIBLE SOLUTION • THE VA MEDICAL SYSTEM DELIVERS 67% OF RECOMMENDED CARE. THE SYSTEM HAS ELECTRONIC MEDICAL RECORDS, DECISION SUPPORT TOOLS, AUTOMATED ORDER ENTRY, ROUTINE MEASUREMENT AND REPORTING ON QUALITY, AND INCENTIVES FOR PERFORMANCE

  19. As Yogi Berra supposedly said, "In theory there is no difference between theory and practice, but in practice there is."

  20. * Why is there a gap between science and practice?

  21. * What is the dominant scientific paradigm for developing research evidence and disseminating it?

  22. * Why is this science model necessary but not sufficient?

  23. * What is the responsibility of the practitioner to deliver evidence-based interventions and what is their capacity to do so?

  24. * What is the responsibility of funders to promote the science of evidence-based interventions and to promote the practice of effective interventions in our communities?

  25. How can evaluation help providers, local CBOS and coalitions, health districts, and state agencies reach results-based accountability?

  26. Two Routes to Getting To Outcomes (GTO): A) Bridging Science and Practice; B) Empowerment Evaluation

  27. Research to Practice / Practice to Research: CLOSING THE GREAT DIVIDE

  28. Feedback Loop: 1. Identify the problem or disorder(s) and review information to determine its extent. 2. With an emphasis on risk and protective factors, review relevant information, both from fields outside prevention and from existing preventive intervention research programs. 3. Design, conduct, and analyze pilot studies and confirmatory and replication trials of the preventive intervention program. 4. Design, conduct, and analyze large-scale trials of the preventive intervention program. 5. Facilitate large-scale implementation and ongoing evaluation of the preventive intervention program in the community. FIGURE 1.1 The preventive intervention research cycle. Preventive intervention research is represented in boxes three and four. Note that although information from many different fields in health research, represented in the first and second boxes, is necessary to the cycle depicted here, it is the review of this information, rather than the original studies, that is considered to be part of the preventive intervention research cycle. Likewise, for the fifth box, it is the facilitation by the investigator of the shift from research project to community service program with ongoing evaluation, rather than the service program itself, that is part of the preventive intervention research cycle. Although only one feedback loop is represented here, the exchange of knowledge among researchers and between researchers and community practitioners occurs throughout the cycle.

  29. [Diagram: Gates Foundation example. Preventive intervention: vaccine/drug. Mechanism: syringes, physician, health system. Support system: medical schools, government funding.]

  30. Community

  31. From Research to “Best Practices” in Other Settings and Populations, Larry Green, American Journal of Health Behavior, 2001 • Process • Control • Self-Evaluation • Tailoring Process and New Technology • Synthesizing Research

  32. Getting to Outcomes: 1) Needs/Resources 2) Goals 3) Best Practice 4) Fit 5) Capacities 6) Plan 7) Process Evaluation 8) Outcome Evaluation 9) CQI 10) Sustain
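The ten GTO steps work as an ordered sequence of accountability questions that a program answers in turn. The sketch below is my own illustration of that ordering (the step names come from the slide; the helper function and its name are hypothetical, not part of GTO's published materials):

```python
# Hypothetical sketch: the ten GTO steps as an ordered checklist.
# Step names are taken from the slide above; the code is illustrative only.
GTO_STEPS = [
    "Needs/Resources", "Goals", "Best Practice", "Fit", "Capacities",
    "Plan", "Process Evaluation", "Outcome Evaluation", "CQI", "Sustain",
]

def next_step(completed):
    """Return the first GTO step not yet marked complete, or None if all done."""
    for step in GTO_STEPS:
        if step not in completed:
            return step
    return None

print(next_step({"Needs/Resources", "Goals"}))  # prints "Best Practice"
```

The point of the ordering is that later questions (e.g., outcome evaluation) presuppose answers to earlier ones (e.g., a needs assessment and a plan), which is why the framework is presented as a sequence rather than a menu.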

  33. [Diagram: “Prevention Science.” Intervention: basic research, efficacy, effectiveness, services research. Prevention Support System (funders): training, technical assistance, funding. Practice: community organizational systems, 1) schools, 2) health agencies, 3) community coalitions. Green characteristics: 1) Process, 2) Control, 3) Self-Evaluation, 4) Tailoring Process and New Technology, 5) Synthesizing Research.]

  34. [Diagram: the Interactive Systems Framework (ISF). Distilling the Information: Prevention Synthesis & Translation System (synthesis, translation), drawing on existing research and theory. Supporting the Work: Prevention Support System (general capacity building, innovation-specific capacity building). Putting It Into Practice: Prevention Delivery System (general capacity use, innovation-specific capacity use). Cross-cutting influences: funding, macro policy climate.]

  35. ROUTE B: EMPOWERMENT EVALUATION

  36. What Can Steve Spurrier Teach Us about Loving Evaluation?

  37. [Figure 2. Overview of the development of a community coalition. FORMATION: a lead agency forms an ad hoc committee of community leaders; committees form a COALITION (criminal justice, grassroots/neighborhood, business, education, religion, media, parents, youth, health); the coalition conducts a needs assessment, and chairpersons consolidate the work of individual committees, resulting in a comprehensive community plan. IMPLEMENTATION/MAINTENANCE: plan implementation, resulting in OUTCOMES, impact on community health indicators.]

  38. Table 1. Evaluation of MPA by Developmental Phases, Ecological Levels, and Stages of Readiness

  39. Outcome Evaluation A B

  40. Shoot / Aim / Ready (Implement / Plan): No Results

  41. Ready / Aim / Shoot / Close (Plan / Implement / CQI)

  42. Ready / Aim / Shoot / Hit (Plan / Implement): Results

  43. Empowerment Evaluation: An evaluation approach that aims to increase the probability of achieving program success by:

  44. Providing program stakeholders with tools for assessing the planning, implementation, and self-evaluation of their program, and

  45. Mainstreaming evaluation as part of the planning and management of the program/organization.
