
Presentation Transcript


  1. Practical Tools for Measuring and Improving the Quality of Public Health Preparedness • Christopher Nelson, Ph.D. • Michael Seid, Ph.D. • Julia E. Aledort, Ph.D. • David J. Dausey, Ph.D.

  2. Acknowledgements • William Raub, Project Officer, HHS • Lara Lamprecht, HHS • Jeffrey Wasserman, Co-PI, RAND • Nicole Lurie, Co-PI, RAND

  3. Project Overview • RAND has worked on a variety of related projects with HHS for the past 3 years • The overarching theme of these projects has been to develop resources and to prepare analyses to describe and enhance key aspects of state and local public health preparedness

  4. Measuring Public Health Preparedness • Accountability • Quality Improvement • Evaluation • Standardization

  5. Federal Policy and Local Quality Improvement in Public Health An Assessment of CDC’s Cooperative Agreement Guidance Christopher Nelson, Ph.D.

  6. Motivation • CDC’s Cooperative Agreement on Public Health Preparedness and Response for Bioterrorism is the main federal funding vehicle for PHEP • The agreement includes guidance and performance measures • The guidance has been developed without the benefit of a broad conceptual framework

  7. Objectives • Assess strengths and weaknesses in current and recent guidance as a tool for supporting state and local quality improvement • Develop a framework and recommendations to guide improvements in guidance

  8. Conceptual Framework Guided Analysis • Evaluation of current system requires characterization of the system • Three main policy instruments

  9. Conceptual Framework Guided Analysis • Evaluation of current system requires characterization of the system • Three main policy instruments • Designed to influence grantee activities and PHEP

  10. Research Methods • Methods • 10 site visits to state health departments and a small number of local health departments (interviews, collection of materials) • Interviews with federal officials, national groups, and other stakeholders • Review of literature on federal guidance and on measuring preparedness in other disciplines • Considerable changes from the 2004 to the 2005 guidance • Greater focus on capabilities vs. infrastructure • Performance standards target more “downstream” outputs • Fewer performance goals (34 vs. 186) • Report focuses on 2005 materials, but also draws upon earlier materials where doing so provides important lessons learned

  11. Overview • Overall evaluation • Standards • Assessments • Feedback/consequences • Grantee use of data

  12. Overall Assessment of New Guidance • Most respondents applauded the changes represented in the 2005 guidance • More outcomes-oriented • Provides more flexibility about how to reach standards • Addresses concerns about one-size-fits-all approaches “The new guidance is better. It is more outcomes oriented and allows us to tailor an approach for [my state].” “Before they [CDC] were setting the standard and you had to hit it, whether you needed it or not.” • Also seems more congruent with current state of knowledge about PHEP

  13. Standards • Frequent changes in standards create transaction costs and limit their utility as planning tools • Some changes respond to a changing knowledge base • But considerable change in format and categorization • Timelines are unrealistic • Encourages haphazard investments

  14. Assessments • Grantees generally pleased with move toward exercises and drills in 2005 guidance, but concerns remain • On the front end, exercises and drills are often customized to local improvement needs. Can they yield comparative judgments about preparedness? • On the back end, methods for evaluating exercises and drills are not well standardized (e.g., the problem of “when to stop the stopwatch”) • Lack of clarity about how to aggregate local preparedness data • Potential source of inconsistency
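
To make the “when to stop the stopwatch” problem above concrete, here is a minimal Python sketch added for illustration (it is not from the RAND study): the same simulated call-down drill logs yield different cross-jurisdiction judgments depending on which stop criterion and which aggregation rule are chosen. All jurisdiction names, milestone names, and times are hypothetical assumptions.

```python
# Hypothetical illustration of the "when to stop the stopwatch" problem.
# Jurisdictions, milestones, and times are invented; the stop criteria and
# aggregation rules below are assumptions, not CDC or RAND definitions.

from statistics import mean

# Simulated staff call-down drill logs: minutes until each milestone.
drill_logs = {
    "County A": {"first_contact": 12, "all_staff_confirmed": 45},
    "County B": {"first_contact": 20, "all_staff_confirmed": 35},
    "County C": {"first_contact": 8,  "all_staff_confirmed": 90},
}

for stop_event in ("first_contact", "all_staff_confirmed"):
    times = [log[stop_event] for log in drill_logs.values()]
    # Two plausible aggregation rules give different state-level pictures:
    # the mean rewards average performance, the max exposes the weakest link.
    print(f"{stop_event}: mean={mean(times):.1f} min, worst={max(times)} min")
```

Note that in this invented data County B looks worst under one stop criterion and best under the other, which is the slide’s point: without standardized stop criteria and aggregation rules, identical local data support inconsistent comparative judgments.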

  15. Feedback and Consequences • Inconsistent experience with CDC feedback on performance reports • “I think someone reads it in Atlanta, but I get very little feedback.” • Unclear consequences associated with performance create uncertainty for grantees • Generally, states don’t think that funding will be cut as a consequence of poor performance • But some believe that with decreasing funding there will be more competition for scarce resources • Concerns about incentives for maintaining capabilities • Will the funding stop once a standard is met?

  16. Grantee Use of Guidance • Release of guidance often comes too late to influence early planning processes • Generally, little evidence that grantees use information gathered for performance reports for improvement purposes

  17. Examples of Strategies for Improving CDC Guidance • Develop norms about changes in guidance • Well-publicized, periodic top-to-bottom review • Between reviews, require demonstration of compelling need to make changes • Publicize changes ahead of time for review and comment (cf. the federal Notice of Proposed Rulemaking (NPRM) model) • Develop a small number of standardized drills to facilitate comparisons across jurisdictions and over time • Develop a wider range of low-cost exercise and drill formats • Small-scale timed drills (allow element of surprise?) • “Embedded” assessments • Clarify incentives associated with assessments

  18. Quality Improvement: Implications for Public Health Emergency Preparedness • Michael Seid, Ph.D.

  19. Motivation • ‘Preparedness’ difficult to measure, but improvement is necessary • Quality Improvement (QI) has been useful in other sectors • QI may be a useful tool for improving preparedness

  20. Objectives • Develop framework for QI in PHEP • Identify examples of QI in PHEP • Identify barriers and facilitators of QI for PHEP • Make recommendations for QI for PHEP

  21. What is QI? • Core concepts • Emphasis on systems • Product or outcome focus • Data driven • QI focuses on efforts to reduce unwarranted variability • QI efforts must be ongoing, rather than one-offs • Four elements – all necessary • Performance Goals • Measures • QI practices • Feedback/reporting

  22. Implementing QI practices is iterative • [Figure: “ramps of change” diagram] From the Institute for Healthcare Improvement, http://www.ihi.org/IHI/Topics/Improvement/ImprovementMethods/HowToImprove/rampsofchange.htm
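
As a concrete companion to the iterative logic sketched on this slide, the short Python example below walks one measure through repeated Plan-Do-Study-Act (PDSA) cycles. The drill scenario, the tested change, and all numbers are hypothetical assumptions used only to show the goal-measure-change-feedback loop; only the four-step cycle structure reflects the QI model discussed in these slides, and none of it is IHI material.

```python
# Hypothetical PDSA (Plan-Do-Study-Act) loop. The drill, the tested change,
# and all numbers are invented for illustration.

GOAL_MINUTES = 60  # Plan: target time to complete a staff call-down drill

def run_drill(minutes_saved: int) -> int:
    """Do: small-scale test of a change (stubbed with an assumed baseline)."""
    baseline = 75  # assumed pre-improvement drill time, in minutes
    return baseline - minutes_saved

minutes_saved = 10  # initial change tested, e.g. pre-printed contact cards
for cycle in range(1, 4):
    observed = run_drill(minutes_saved)          # Do
    met_goal = observed <= GOAL_MINUTES          # Study: compare to the goal
    print(f"Cycle {cycle}: {observed} min (goal met: {met_goal})")
    if met_goal:
        break                                    # Act: adopt the change
    minutes_saved += 5                           # Act: adapt and test again
```

The design point is that each cycle is small and ends in a decision (adopt, adapt, or abandon), so measurement feeds directly into the next round of improvement rather than sitting in a report.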

  23. QI Barriers and Facilitators • Skills • Technical, managerial, strategic • Organizational Culture • Openness, collaboration, learning from mistakes • Leadership • Incentives • Organization and individual • Financial and nonfinancial

  24. Applying QI to PHEP: Conceptual Framework • A preparedness “production” system [diagram] • Capability-Building Processes: develop policies and plans; assure a competent workforce • Surveillance and Detection Processes: monitor health of the community; diagnose and investigate • Response Processes: inform, educate, empower people; mobilize community partnerships; link to providers and assure care; enforce public health laws • Outcomes/Recovery: morbidity, mortality, psychological, social, economic

  25. Methods, Sample • Methods • Sample identification, screening • Site visits • Data analysis • Extract themes from detailed interview notes • Sample • 90 HDs nominated, 9 chosen for site visits • 5 local HDs: 1 large urban area, 3 medium, 1 small • 4 state HDs: 1 centrally organized, 2 decentralized, 1 mixed • 3 of 9 HDs on the West Coast, 2 in the Midwest, 2 in the South, and 2 in the Northeast

  26. Results (1) • Performance goals • Performance goals are being used • Issues with definition, relevance, consistency • Performance goals sometimes structured correctly • Measures • Concerns about relevance of measures • Measurement neither pervasive nor documented • Examples of measures for routine processes • Examples of measures for rare/response processes

  27. Results (2) • QI strategies • Area with most difficulty • Examples • Cyclical improvement (PDSA) • Collaboratives • Incident Command System • Role of state vs. local HDs • Feedback and reporting • Few examples of process change as a result of measurement • What happens after the “after action report”? • Some promising starts

  28. Results (3) • Barriers and Facilitators • Organizational culture and leadership are key • Incentives often a problem • Resources: lack of time, money, and staff

  29. Summary • While no site was fully engaged in QI, components of QI existed at every site • Implementing QI strategies was the most difficult QI element • Promising practices exist and suggest that QI for preparedness is possible • Leadership and culture are key facilitators • Lack of resources is a problem, but the bigger problem is lack of incentives

  30. Recommendations • QI can be applied to all parts of PH, not just PHEP • QI must be incorporated into daily work to avoid preparedness burnout • QI training is necessary • States should facilitate local QI via measurement, training, and collaborative-building • Expand cyclical QI approaches to more processes • Inject ICS into ongoing work • Systematize what happens after the after-action report • The right incentives are key

  31. Facilitated Look-Backs: A New Quality Improvement Tool for Management of Routine Annual & Pandemic Influenza • Julia E. Aledort, Ph.D.

  32. Motivation • Pandemic influenza public health preparedness challenged by the relative infrequency of pandemics • Routine annual influenza provides important opportunities for state public health agencies (SPHAs) to learn from direct experience • Lessons from 2004-2005 influenza vaccine shortage may be relevant for pandemic influenza preparedness

  33. Objective • Develop a tool for state health departments that enables them to: • Regularly revisit & evaluate routine annual influenza management with key community stakeholders • Systematically institutionalize knowledge from one influenza season to the next • Continually improve public health response to routine annual influenza • Identify & incorporate lessons into preparedness activities for pandemic influenza & other public health emergencies

  34. What is a “Look-Back”? • Convenes SPHA leaders, key staff & community stakeholders after each influenza season • Facilitates candid, “no-fault” systems-level discussion of annual influenza management • Reviews past real-world events & critically examines how participants responded • Key events that unfolded during the past influenza season • Key decisions that were made by stakeholders • How decisions were perceived & acted upon by others • Draws on practical experience & broad range of perspectives to inform future responses

  35. Look-Back Operating Assumptions • Annual influenza activities offer recurrent lessons for some aspects of pandemic influenza & other public health preparedness activities • No one person can represent all perspectives about the past • Systems-level analyses can identify opportunities to learn from experience • Organizational learning is not complete without improvement plans that specify responsibilities for change

  36. Look-Back Pilot Tests • Designed & piloted with three SPHAs in different US regions between June & August 2005 • Focused on topics identified in collaboration with each SPHA • Involved 10-25 participants, including SPHA departments, healthcare partners, and other community stakeholders • Lasted 3 to 5 hours • Resulted in After Action Reports (AARs) and initial Improvement Plans

  37. Look-Back Discussion Topics 1. Organizational Structure of Decisionmaking 2. Influenza Surveillance 3. Vaccine Procurement and Distribution 4. Routine Annual Influenza Vaccination Campaigns 5. Vaccine Administration 6. Priority Groups & Implications of Changing Priorities 7. Non-Pharmacological & Public Health Strategies 8. Communication 9. Unanticipated Events

  38. Core Discussion Questions • What are activities, roles & responsibilities during annual influenza season? • What are specific issues that emerged last year? • What went well & are past successes sufficiently institutionalized? • What specific problems emerged? • What might have been done differently? • What should be done differently in the future? • What are lessons for an influenza pandemic?

  39. Look-Back Participants • SPHA Officials & Staff: state health director; emergency management coordinator; immunization program director; pandemic influenza coordinator; communicable disease control/disease investigation director; quality improvement coordinator; state epidemiologist; public health nurse; communications specialist/public information officer • Other Community Partners: district or LPHA staff; hospital representatives; nursing home & LTC representatives; professional medical organizations; managed care organizations; insurers; commercial enterprises offering influenza vaccine to the public; pharmacies; minority community leadership representatives

  40. Design Issues & Implementation Challenges • Advance planning & investigation allow for customized Look-Backs • Facilitator objectivity and independence are critical • It is a challenge to produce effective & broadly agreed-upon AARs • AARs can generate valuable dialogue if they are broadly disseminated & reviewed by individuals not typically involved with annual influenza activities

  41. Examples of Lessons Learned • Leveraging state emergency management resources & infrastructure may facilitate emergency response by “traditional” state public health agencies • Communication is of paramount importance • Broad-based coalitions & public-private partnerships may mitigate vaccine distribution & administration challenges

  42. Conclusions • A Look-Back is a relatively simple, effective quality improvement tool • Look-Backs can be used to assess recent past events & identify issues relevant to future annual & pandemic influenza • Adoption & implementation of Look-Backs capitalizes on annual influenza to better prepare for pandemic • Document & formalize learning from successes & problems • Encourage follow-through on lessons learned • Reinforce the role of public health during annual & pandemic influenza

  43. Common Themes and Lessons Learned from Designing and Conducting Tabletop Exercises to Assess Public Health Preparedness • David J. Dausey, Ph.D.

  44. Motivation • The US government has made substantial investments to enhance the nation’s ability to respond to bioterrorism and other public health emergencies. • Funding and mandates for federal preparedness programs have led health departments throughout the country to implement exercise programs. • The development and conduct of exercises to test and assess preparedness is now considered the responsibility of all health departments.

  45. Objective • To summarize insights that the RAND Corporation has gained about public health preparedness and the process of developing, conducting, and evaluating tabletop exercises in collaboration with state and local health departments in every region of the United States.

  46. What Are Tabletop Exercises? • [HSEEP exercise continuum] • Discussion-based (planning/training): seminars, workshops, tabletops • Operations-based (capabilities): drills, functional exercises, full-scale exercises • Source: www.hseep.dhs.gov

  47. Exercise Development Process • Goal: Enhance Public Health Preparedness • Understand existing plans, actors, and system → Meet with key actors and leaders; clarify exercise goals → Develop a draft tabletop exercise → Review exercise with stakeholders → Revise the exercise → Conduct exercise → Create after action report → Develop action plan → Make improvements

  48. Exercise Development Requires Multiple Levels of Expertise and Experience • Public health infrastructure • Public health preparedness • Public health methods • Exercise performance measurement • Subject matter expertise • Exercise development • Exercise facilitation • Exercise customization

  49. Tabletop Pilot Tests • Convenience sample of 30 local public health agencies in 13 different states across the continental US • The majority were located in urban areas serving populations of less than 1 million residents • Conducted from 2003 to 2006 • 1 exercise per site

  50. Design Concerns • Competing desires for realistic scenarios and logistical feasibility • What are the objectives of an exercise? • What is the nature and scope of the exercise scenario? • Who should attend? • How should the exercise be facilitated?
