
E. Samoff MPH PhD, A. T. Fleischauer MSPH PhD, L. DiBiase MS, M. Davis MPH, A. Waller ScD,

North Carolina Preparedness & Emergency Response Research Center (NCPERRC). Local health department electronic reportable disease surveillance practice and costs, North Carolina, 2009 OR: Proving it out.

Presentation Transcript


  1. North Carolina Preparedness & Emergency Response Research Center (NCPERRC)
  Local health department electronic reportable disease surveillance practice and costs, North Carolina, 2009 OR: Proving it out
  E. Samoff MPH PhD, A. T. Fleischauer MSPH PhD, L. DiBiase MS, M. Davis MPH, A. Waller ScD, P. D. M. MacDonald MPH PhD
  This research was carried out by the North Carolina Preparedness and Emergency Response Research Center (NCPERRC), which is part of the UNC Center for Public Health Preparedness at the University of North Carolina at Chapel Hill’s Gillings School of Global Public Health, and was supported by the Centers for Disease Control and Prevention (CDC) Grant 1PO1 TP 000296. The contents are solely the responsibility of the authors and do not necessarily represent the official views of CDC. Additional information can be found at http://cphp.sph.unc.edu/ncperrc/

  2. Background
  • All states now use an electronic disease surveillance system
  • What we know
    • Increases speed of initial notification to public health and the number of cases reported
    • Facilitates data capture and review
  • What we don’t know
    • Does electronic disease surveillance improve public health surveillance practice?
    • Does it support public health interventions?
    • Is it more efficient or cost-effective?
    • Does it improve population health?

  3. Background
  • Project: To evaluate North Carolina’s electronic disease surveillance system
  • Project objectives
    • Describe workforce resources used for the electronic disease surveillance system
    • Describe impact on case reporting and surveillance practice
    • Identify best practices for electronic disease surveillance

  4. Background
  • North Carolina Electronic Disease Surveillance System (NC EDSS)
    • Highly customized off-the-shelf Maven system
    • Implemented in 2008
    • All reportable diseases except syphilis and HIV
  • Case data entered by
    • LHD staff
    • Laboratories via ELR (≈ 33% of cases)
    • State staff
  • System offers additional surveillance capacities

  5. Methods
  • Random sample: 30 of 100 counties
  • Interviews
    • NC Electronic Disease Surveillance System (NC EDSS) lead
      • Staff numbers and hours
      • Use of the NC EDSS system
      • Use of surveillance data
    • CD nurse
      • Case management
      • Use of the NC EDSS system
  • Cost
    • $29/hour ($60,552/yr) salary (arithmetic sketched below)
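The $60,552/yr figure above implies 2,088 paid hours per year at $29/hour. Below is a minimal sketch of that arithmetic and of how salary cost would be attributed to NC EDSS staff time; the helper name and the 1.2-FTE example (the per-county average reported later in the results) are illustrative, not the authors' code.

```python
# Sketch of the staffing-cost arithmetic implied by the methods slide.
# $60,552/yr at $29/hour implies 2,088 paid hours per year (29 * 2088 = 60552).
# The helper and example below are illustrative only, not study code.

HOURLY_RATE = 29.00                            # salary rate assumed in the methods ($/hour)
HOURS_PER_YEAR = 2_088                         # implied by $60,552 / $29
ANNUAL_SALARY = HOURLY_RATE * HOURS_PER_YEAR   # -> 60552.0

def annual_staff_cost(fte: float) -> float:
    """Salary cost attributed to NC EDSS work for a given number of FTEs."""
    return fte * ANNUAL_SALARY

# Example: a county devoting 1.2 FTEs (the per-county average reported later) to NC EDSS.
print(round(annual_staff_cost(1.2), 2))  # 72662.4
```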

  6. Methods
  • NC EDSS system data: all VPD, STD, and other CD cases
  • Number of cases
  • Timeliness: % of cases reported to the state within 30 days
  • Accuracy: % of cases returned by the state to the LHD
  • Currently ignored cases: % of cases more than 45 days old that were never handled

  7. Methods
  Composite score for indicators of good reporting practice (a sketch of the scoring rule follows this slide):
  • 1 point assigned for each of:
    • Timeliness (>79% of completed cases submitted to the state in <30 days)
    • Accuracy (<17% of cases returned to the LHD for corrections)
    • Ignored cases (<1% of total cases ignored after 45 days)
  • High/low comparison: High (2–3 points) vs. Low (0–1 points)
  • County size (population): Small <55,654; Medium 55,655–107,427; Large >107,427
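A minimal sketch of the scoring rule above, using the thresholds exactly as printed on the slide; the function names and the example county values are invented for illustration and are not the authors' code.

```python
# Sketch of the composite reporting-practice score described on the slide.
# Thresholds come from the slide; the example inputs are invented.

def composite_score(pct_timely: float, pct_returned: float, pct_ignored: float) -> int:
    """One point each for timeliness, accuracy, and few ignored cases."""
    score = 0
    if pct_timely > 79:    # >79% of completed cases submitted to the state in <30 days
        score += 1
    if pct_returned < 17:  # <17% of cases returned to the LHD for corrections
        score += 1
    if pct_ignored < 1:    # <1% of total cases ignored after 45 days
        score += 1
    return score

def performance_group(score: int) -> str:
    """High = 2-3 points, Low = 0-1 points."""
    return "High" if score >= 2 else "Low"

def county_size(population: int) -> str:
    """Population cut-points from the slide."""
    if population < 55_654:
        return "Small"
    if population <= 107_427:
        return "Medium"
    return "Large"

# Invented example county:
s = composite_score(pct_timely=85.0, pct_returned=12.0, pct_ignored=2.5)
print(s, performance_group(s), county_size(72_000))  # 2 High Medium
```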

  8. Results: Respondent Profile
  • May–August 2010
  • 28 counties
    • Broad geographical distribution
    • Broad population distribution: 8,888–923,944 population
    • 10 small, 8 medium, 10 large

  9. Results: Cases
  • Total number of cases reported: 10,809
  (Chart: cases reported by county, counties ordered from smaller to larger)

  10. Results: Staff using electronic disease surveillance
  • Total staff using NC EDSS
    • 136 employees, 34.5 FTEs
    • Average 4.8 employees, 1.2 FTEs per county
  • Type of staff
    • CD nurses/supervisors
    • Administrative staff
    • DIS
    • Laboratory personnel

  11. Results: Staff time
  • 69% of employees using NC EDSS spent <12 hours per week of their work time on the system
  (Pie chart: share of employees by weekly hours spent in NC EDSS: 69%, 21%, 10%)

  12. Results: Staff expenditure (FTEs)
  (Chart: FTEs by county, counties ordered from smaller to larger)

  13. Results: Cases reported per FTE
  • Average of 68 cases reported per FTE per month (calculation sketched below)
  (Chart: cases reported per FTE by county, counties ordered from smaller to larger)
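A minimal sketch of how the per-FTE and salary-cost-per-case figures on this and the next two slides would be derived; only the $60,552 annual salary comes from the methods, and the example county below is invented.

```python
# Sketch of the efficiency metrics shown on slides 13-15.
# ANNUAL_SALARY is the $29/hour ($60,552/yr) assumption from the methods;
# the example inputs below are invented, not study data.

ANNUAL_SALARY = 60_552.0

def cases_per_fte_per_month(cases_per_year: int, fte: float) -> float:
    """Annual case count spread over FTEs and 12 months."""
    return cases_per_year / fte / 12

def salary_cost_per_case(cases_per_year: int, fte: float) -> float:
    """Staff salary attributed to NC EDSS divided by cases reported."""
    return (fte * ANNUAL_SALARY) / cases_per_year

# Invented county: 400 cases per year handled with 0.8 FTE of staff time.
print(round(cases_per_fte_per_month(400, 0.8), 1))  # 41.7
print(round(salary_cost_per_case(400, 0.8), 2))     # 121.1
```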

  14. Results: Salary cost per case reported
  (Chart: salary cost per case by county, counties ordered from smaller to larger)

  15. Results: Salary cost per case reported
  (Chart: salary cost per case by county, counties ordered from smaller to larger)

  16. Results: Impact on case reporting and surveillance practice
  • NC EDSS leads:
    • 68% (19/28) reported changes in case management
    • Of those, 89% (17/19) reported improvement
  • CD nurses:
    • 57% (12/21) reported changes in case management
    • Of those, 75% (9/12) reported improvement
  • Reasons given:
    • Increased timeliness
    • Easier to know what to do/ask
    • Easier to access case-patient data
    • More thorough documentation

  17. Results: Impact on case reporting and surveillance practice
  • Counties using >5 NC EDSS capacities are more likely to:
    • Report using surveillance data for decisions about public health program management
    • Report providing surveillance data to policy-makers
    • Report including surveillance data in annual reports
    • Report using data from the extended surveillance form for disease intervention

  18. Results: Reporting performance rank
  • Rank based on:
    • Timeliness (>82% submitted to NC DPH within 30 days)
    • Accuracy (<17% of cases returned to LHD)
    • Incomplete cases (<1% of cases incomplete longer than 45 days)
  (Chart: reporting performance rank by county size: small, medium, large)

  19. Mean cost per case by reporting practice rank

  20. Mean cost per case by rank and county size
  (Chart: mean cost per case by reporting practice rank, for small, medium, and large counties)

  21. Good surveillance costs less. How do we get there?
  Practices associated with high reporting performance (one way such P-values could be computed is sketched below):
  • Can look at incoming cases daily (P=.11)
  • 6 staff or fewer using NC EDSS (P=.02)
  • Use surveillance data for program evaluation (P=.13)
  • >5% of cases assigned "Not a case" (P=.22)
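The slide does not state which test produced these P-values. One plausible approach for counts of this size is a Fisher's exact test on a 2x2 table of counties (practice present vs. absent, by high vs. low reporting performance); the counts below are invented for illustration.

```python
# Hypothetical sketch: testing the association between one practice and
# high/low reporting performance. The slide does not name the test used;
# Fisher's exact test on a 2x2 county table is one plausible choice.
from scipy.stats import fisher_exact

#        practice: yes, no
table = [[10, 4],   # high-performance counties (invented counts)
         [5, 9]]    # low-performance counties (invented counts)

odds_ratio, p_value = fisher_exact(table)
print(round(odds_ratio, 2), round(p_value, 3))
```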

  22. Limitations
  • FTE data are self-reported by interviewees
    • Not verified by the electronic system
    • Based on current user lists
  • Does not represent multi-county LHDs as well
  • Interviewer bias

  23. Conclusions
  • Resources used per case reported differ across the state
  • Good surveillance costs less
  • Perceived improvement in case management and disease surveillance
  • The electronic surveillance system is supporting key surveillance activities
  • Daily use of the electronic surveillance system by a focused user group supports good reporting practice

  24. Acknowledgements
  • Carolina Center for Health Informatics / UNC Dept of Emergency Medicine
    • Anna Waller ScD
    • Amy Ising MSIS
  • CDC/NC Division of Public Health
    • Aaron Fleischauer PhD
  • UNC Gillings School of Global Public Health
    • Pia MacDonald MPH PhD
    • Carol Gunther-Mohr MA
    • Meredith Davis MPH
    • Lauren DiBiase MPH
    • Heidi Soeters MPH
    • Erika Samoff MPH PhD
  • UNC School of Information and Library Science
    • Stephanie W. Haas PhD

  25. Contact
  • Erika Samoff: erika.samoff@unc.edu
  • NC PERRC: cphp.sph.unc.edu/ncperrc
