
Public Health Performance Measurement Moving from Version 1.0 to 2.0



Presentation Transcript


  1. Public Health Performance Measurement: Moving from Version 1.0 to 2.0. AcademyHealth Annual Meeting, Boston, MA, June 26, 2005

  2. University of Kentucky Team Members • F. Douglas Scutchfield, MD, P.I. • Evelyn A. Knight, PhD • Ann V. Kelly, MHA • Ilie Puiu Vasilescu, PhD (UVa-Wise) • Michelyn Bhandari, MPH

  3. Goals of the National Public Health Performance Standards Program (NPHPSP) • Provide performance standards for public health systems • Improve quality and accountability of public health practice • Conduct systematic collection and analysis of performance data • Develop a science base for public health practice improvement

  4. Aims of the Assessment Instruments • Provide a NPHPSP performance report to the public health agency and the state or local system. • Provide baseline, feedback and guidance to the public health system for continuous performance improvement.

  5. Four Concepts Applied in NPHPS 1. Based on the 10 Essential Public Health Services (EPHS) 2. Focus on the overall public health system 3. Describe an optimal level of performance 4. Support a process of quality improvement

  6. Goals of UKY’s Instrument Research (Local and State PHPA) • Decrease the burden on users due to instrument characteristics. • Assure content supports the goals of the performance assessment process. • Increase the usability of the results for performance improvement.

  7. Research Phases • Preliminary interviews--determine areas of need for instrument improvement—state and local instruments. • Psychometric analysis--identify items that might be changed or eliminated in future versions—local instrument only. • Discussion groups--identify specific changes in question inclusion and/or wording which would improve instrument usability and usefulness—state and local instruments.

  8. Preliminary Interviews—Results • Directions clear; some questions clear, some not • Understandable by public health--but not partners • Too long, instrument fatigue--but details important • Rating scale clear—but doesn’t correspond with report • Rating methodology varies—consensus, vote, etc. • Subjective scores—depend on who is present • Costly to gather information needed to answer the questions

  9. Instrument Format • Essential Service—Level 1 • Indicator—Level 2 (Model Standard) • Measure—Level 3 (First-tier stem) • Measure—Level 4

  10. Instrument Format (continued) • Measure—Level 4 (Second-tier stem) • Measure—Level 5 • Summary Questions--not analyzed
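
Slides 9 and 10 describe a five-tier question hierarchy. As a reading aid, the sketch below models that hierarchy as nested records; the class and field names are illustrative assumptions, not part of the official NPHPSP instrument.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Measure:
    """A measure question: level 3 = first-tier stem, level 4 = sub-question
    (or second-tier stem), level 5 = sub-sub-question."""
    code: str                           # e.g. the 3-, 4-, or 5-digit question code
    level: int                          # 3, 4, or 5
    score: Optional[float] = None       # recorded response score
    children: List["Measure"] = field(default_factory=list)

@dataclass
class Indicator:
    """Level 2: an indicator with its model standard and first-tier stems."""
    name: str
    model_standard: str
    stems: List[Measure] = field(default_factory=list)

@dataclass
class EssentialService:
    """Level 1: one of the 10 Essential Public Health Services (EPHS)."""
    number: int
    name: str
    indicators: List[Indicator] = field(default_factory=list)
```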

  11. Numbers of Indicators & Questions (Local Instrument)

  12. Performance Scoring—Levels 3 & 2 • Level 3 stem scores are weighted scores based on the stem and its level 4 & 5 sub-questions: Stem question (3-digit) → Sub-questions (4-digit) → Sub-sub-questions (5-digit) • Level 2 scores (model standards) are the average of the Level 3 scores, yielding 31 Indicator Scores.

  13. Performance Scoring—Level 1 & Overall • Indicator Scores are averaged to obtain 10 EPHS scores—Level 1. • 10 EPHS scores are averaged to obtain one overall Public Health Performance Score.
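
Slides 12 and 13 describe a bottom-up roll-up from sub-questions to a single overall score. The sketch below illustrates that aggregation; the slides do not state the CDC weighting used for level 3 stem scores, so a plain mean stands in as a placeholder, and every function name here is hypothetical.

```python
from statistics import mean
from typing import List

def stem_score(stem: float, subs: List[float]) -> float:
    """Level 3 score: placeholder combination of a stem answer with its
    level 4/5 sub-question scores (the actual NPHPSP weighting is not shown)."""
    return mean([stem, *subs]) if subs else stem

def indicator_score(stem_scores: List[float]) -> float:
    """Level 2 (model standard) score = average of its level 3 stem scores."""
    return mean(stem_scores)

def ephs_score(indicator_scores: List[float]) -> float:
    """Level 1 score = average of the indicator scores under one EPHS."""
    return mean(indicator_scores)

def overall_score(ephs_scores: List[float]) -> float:
    """Overall Public Health Performance Score = average of the 10 EPHS scores."""
    return mean(ephs_scores)

# Roll-up direction: stem/sub-question scores -> 31 indicator scores
# -> 10 EPHS scores -> one overall performance score.
```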

  14. Psychometric Analysis of Local Instrument • Determine Cronbach’s α (internal reliability) • Determine Item-Total Correlations (ITC) • Test the effect on reliability of removing specific questions or tiers of questions • Use results to frame discussion group interviews
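
A minimal sketch of the two statistics named on slide 14, assuming the data form a respondents-by-items matrix (e.g. 228 systems by 10 EPHS scores). The NumPy implementation and function names are assumptions; the study's actual analysis software is not identified in the slides.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the total score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def item_total_correlations(items: np.ndarray) -> np.ndarray:
    """Corrected item-total correlation: each item vs the sum of the others."""
    items = np.asarray(items, dtype=float)
    totals = items.sum(axis=1)
    return np.array([
        np.corrcoef(items[:, j], totals - items[:, j])[0, 1]
        for j in range(items.shape[1])
    ])
```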

  15. Table 1. Internal Reliability of Overall Performance Score Based on 10 EPHS Scores (n=228) For the Overall Score, α = 0.915 (very reliable). *Reliability increases slightly if EPHS #2 is removed as an item: α is slightly higher with only 9 EPHS (0.918) than with all 10 EPHS (0.915).
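
The footnote on slide 15 is an "alpha if item deleted" check. Under the same respondents-by-items assumption as above, a compact sketch of that comparison:

```python
import numpy as np

def alpha_if_item_deleted(items: np.ndarray) -> np.ndarray:
    """Cronbach's alpha recomputed with each item (column) left out in turn."""
    items = np.asarray(items, dtype=float)

    def _alpha(m: np.ndarray) -> float:
        k = m.shape[1]
        return (k / (k - 1)) * (1 - m.var(axis=0, ddof=1).sum()
                                / m.sum(axis=1).var(ddof=1))

    return np.array([_alpha(np.delete(items, j, axis=1))
                     for j in range(items.shape[1])])

# If dropping one EPHS column raises alpha above the full-scale value
# (e.g. 0.918 vs 0.915 on the slide), that item adds little to reliability.
```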

  16. Table 2: Internal Reliability of 10 EPHS Scores based on Indicator Scores (n=228)

  17. Summary of Internal Reliability Overall & EPHS • Instrument has very high reliability overall—measuring the PH construct • EPHS-level reliability varies • EPHS 3,9 Very reliable • EPHS 2,4,5,10 Acceptable reliability • EPHS 8 Marginal reliability • EPHS 1,6,7 Not reliable

  18. Table 3. Internal Reliability—Indicators based on Level 3 First-Tier Stem Question Scores

  19. Summary of Internal Reliability of 31 Indicators • 14 indicators very reliable (α > 0.8) • 10 indicators of acceptable reliability (α 0.7 to 0.8) • 3 become very reliable if one stem question is removed • 6 indicators not reliable (α < 0.7)

  20. Psychometrics of Current and Simulated Instrument Versions (V.1-V.4) • V.1—The current Local Assessment Instrument, all questions as scored by CDC • Simulated versions eliminate scores of drill-down details: • V.2—No level 5 details—use only levels 3 & 4 • V.3—No level 4 & 5 details—use only the level 3 stems • V.4—No level 4 & 5 questions with high or non-significant (ns) correlation to their stems
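
Each simulated version on slide 20 amounts to filtering the question set by tier before rescoring. One way such filters could be expressed is sketched below; the field names (level, stem_r, stem_p) and the cutoffs for a "high" or non-significant stem correlation are illustrative assumptions, since the slides do not specify them.

```python
from typing import Dict, List

def simulate_version(questions: List[Dict], version: str,
                     high_r: float = 0.9, alpha_level: float = 0.05) -> List[Dict]:
    """Return the subset of questions retained under one simulated version."""
    if version == "V1":                 # all questions, as scored by CDC
        return list(questions)
    if version == "V2":                 # drop level-5 drill-down detail
        return [q for q in questions if q["level"] <= 4]
    if version == "V3":                 # keep only the level-3 stems
        return [q for q in questions if q["level"] == 3]
    if version == "V4":                 # drop level-4/5 items whose stem
        return [q for q in questions    # correlation is very high (redundant)
                if q["level"] == 3      # or not statistically significant
                or not (q["stem_r"] >= high_r or q["stem_p"] >= alpha_level)]
    raise ValueError(f"unknown version: {version}")
```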

  21. Table 4. Comparison of Reliability across the Four Versions (v.1-4) for 10 EPHS

  22. Table 5. Internal Reliability—10 EPHS Scores based on Indicator Scores, V.1 and simulated V.2-V.4

  23. Moving from V1.0 to V2.0 • Instrument construction • Instrument content • Instrument context • Instrument consensus

  24. Instrument Construction • Clarity • Time • Complexity • Subjectivity

  25. Instrument Content • Examine unreliable EPHS and subscales • Determine key content • Fit content into a logic model with inputs—outputs--outcomes

  26. Instrument Context • Large vs small systems • Centralized vs decentralized • Urban vs rural

  27. Instrument Consensus • Agreed-upon definitions of PH practice • Purpose • Process • Strengths • Weaknesses

  28. NPHPSP Instrument V2.0 should: • Build on the current instrument and lessons learned. • Specify how users will validate their results. • Recognize and respond to agency--system issues. • Agree on purpose. • Be used for the purpose for which it was designed.
