
Technical assistance for data collection and data management


Presentation Transcript


  1. The New York State Experience Technical assistance for data collection and data management

  2. Brief History • The NYSDOH AIDS Institute has collected client-level data for Ryan White contracts since 1995 • The model is a distributed, custom-built relational database product installed at each agency • The concept is agency ownership of data within a structured data collection and data entry environment

  3. Brief History, continued: URS and AIRSNY • The original database design of the NYS URS was based on data needed for the Ryan White AAR • Expanded to include CDC grant requirements in 1996 • Updated to the AIRSNY system in 2007 • De-identified and de-duplicated client-level data are exported from provider agencies to NYS via XML on NY State's secure Health Commerce System • The RSR file is generated from provider systems, not the State-level database
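To make the export step concrete, here is a minimal sketch of assembling a de-identified client record for an XML submission. The field names and hashing scheme are illustrative assumptions, not the actual AIRSNY export schema.

```python
# Sketch (hypothetical field names, not the actual AIRSNY export schema):
# build a de-identified client record for an XML export, replacing direct
# identifiers with a one-way hashed code before the file leaves the agency.
import hashlib
import xml.etree.ElementTree as ET

def deidentified_record(client):
    """Return an XML element with identifiers replaced by a hashed code."""
    # The same client always maps to the same code, which supports
    # de-duplication at the State level without exposing identifiers.
    key = f"{client['last_name']}|{client['first_name']}|{client['dob']}|{client['gender']}"
    code = hashlib.sha256(key.encode("utf-8")).hexdigest()[:16]

    rec = ET.Element("Client", attrib={"code": code})
    ET.SubElement(rec, "BirthYear").text = client["dob"][:4]  # keep year only
    ET.SubElement(rec, "Gender").text = client["gender"]
    for svc in client["services"]:
        s = ET.SubElement(rec, "Service")
        ET.SubElement(s, "Date").text = svc["date"]
        ET.SubElement(s, "Category").text = svc["category"]
    return rec

# Example: one client becomes one <Client> element in the export file.
client = {"last_name": "Doe", "first_name": "Jane", "dob": "1980-02-01",
          "gender": "Female",
          "services": [{"date": "2012-03-15", "category": "Medical Case Management"}]}
print(ET.tostring(deidentified_record(client), encoding="unicode"))
```

Because the hash is deterministic, repeat submissions for the same client carry the same code, while names and full dates of birth never leave the provider system.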

  4. ELEMENTS OF TECHNICAL ASSISTANCE • I. Structure and validations in software to enhance accuracy and completeness and reflect grant monitoring standards • II. Technical Assistance and Training via phone, web-based systems and webinars, and in-person classroom training and user groups • III. Feedback via local reports and analysis of data submitted to the State

  5. Software Design, Technical Support and Training, and Feedback on Data Submissions must be balanced to ensure Accurate and Complete Data

  6. Supporting Accuracy and Completeness I. ELEMENTS OF SOFTWARE DESIGN

  7. Elements of Software Design VALIDATIONS ON DATA ENTRY • De-duplication: trapping duplicate client registrations on entry vs. analysis and review after the fact based on identifiers; prevention is better than treatment • Initial client registrations should, at minimum, check against existing records for name, gender, and date of birth matches • Checks after the fact can look for name/gender/DOB, eUCI, or URN matches; the issue then becomes consolidating service records and other histories when duplicates are found
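A minimal sketch of an entry-time duplicate check follows; the client fields and matching rules are illustrative assumptions, not the AIRSNY implementation.

```python
# Sketch of an intake-time duplicate check (hypothetical schema): before
# saving a new registration, look for existing records that match on name,
# gender, and date of birth, and ask the user to confirm.
from dataclasses import dataclass

@dataclass
class Client:
    last_name: str
    first_name: str
    gender: str
    dob: str  # YYYY-MM-DD

def possible_duplicates(new, existing):
    """Return existing clients that look like the same person as `new`."""
    def norm(s):
        return s.strip().lower()
    matches = []
    for c in existing:
        same_name = (norm(c.last_name) == norm(new.last_name)
                     and norm(c.first_name) == norm(new.first_name))
        same_dob = c.dob == new.dob
        same_gender = norm(c.gender) == norm(new.gender)
        # Flag full name matches, or DOB + gender + last-name matches, so
        # near-duplicates are reviewed before a second record is created.
        if same_name or (same_dob and same_gender
                         and norm(c.last_name) == norm(new.last_name)):
            matches.append(c)
    return matches

existing = [Client("Doe", "Jane", "Female", "1980-02-01")]
new = Client("DOE", "Jane", "Female", "1980-02-01")
print(possible_duplicates(new, existing))  # -> the existing Jane Doe record
```

In practice the matching would need to be fuzzier (nicknames, transposed date parts), and any hit would be presented to the user for review rather than merged automatically.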

  8. Example: Adding an Intake pulls up a search utility to identify matches

  9. Elements of Software Design VALIDATIONS ON DATA ENTRY • Inconsistent dates and information: issues such as a DOB later than the intake date, a first service prior to intake, biological males with pregnancies, and other likely entry errors • Software should check dates and content for consistency on save and alert the user • Information out of the realm of the possible should be prohibited • Probable errors should result in an alert message
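Below is a sketch of how save-time consistency checks might separate impossible entries (which block the save) from probable errors (which only alert the user). The field names, and which specific check falls into which bucket, are illustrative assumptions.

```python
# Sketch of save-time consistency checks (hypothetical field names):
# "errors" block the save; "warnings" are shown as alert messages.
from datetime import date

def check_record(dob, intake_date, first_service_date, gender, pregnant):
    errors, warnings = [], []

    if dob > intake_date:
        errors.append("Date of birth is later than the intake date.")
    if dob > date.today():
        errors.append("Date of birth is in the future.")
    if first_service_date is not None and first_service_date < intake_date:
        warnings.append("First service is dated before the intake date.")
    if gender == "Male" and pregnant:
        warnings.append("Pregnancy recorded for a male client; please verify.")

    return errors, warnings

errors, warnings = check_record(
    dob=date(2015, 6, 1),          # entry error: the year was mistyped
    intake_date=date(2012, 3, 1),
    first_service_date=date(2012, 2, 20),
    gender="Male",
    pregnant=True,
)
print(errors)    # would block the save
print(warnings)  # would appear as alert messages
```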

  10. Each invalid entry returns a message and requests correction – detail on the errors also appears in the information box on the left

  11. Elements of Software Design MISSING DATA • Missing demographic data is easier to trap by making fields required in order to save • Missing data that is part of the sequence of services or status changes cannot be trapped at data entry and requires reports and other analysis of the data • Software can help to identify gaps, but good data collection and data entry are critical to success
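As a sketch of the first point, a required-field check run on save, with a hypothetical list of demographic fields:

```python
# Sketch of a required-field check on save (hypothetical field list):
# demographic gaps can be trapped at entry; gaps in the service history
# still need report-based review after the fact.
REQUIRED_DEMOGRAPHICS = ["last_name", "first_name", "dob", "gender",
                         "race", "ethnicity", "hiv_status"]

def missing_required(record):
    """Return the demographic fields that are blank or absent."""
    return [f for f in REQUIRED_DEMOGRAPHICS if not record.get(f)]

record = {"last_name": "Doe", "first_name": "Jane", "dob": "1980-02-01",
          "gender": "Female", "race": "", "ethnicity": None}
gaps = missing_required(record)
if gaps:
    # The data-entry screen would highlight these fields and refuse to save.
    print("Cannot save; please complete:", ", ".join(gaps))
```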

  12. Highlighting skipped fields and preventing save of incomplete data

  13. Identify gaps in data using a report-based assessment

  14. Completeness Assessment covers all clinical questions in RSR order

  15. A tickler or reminder system on major report components and monitoring standards is a tool to assist staff in managing client data
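One way such a tickler could work is sketched below: flag clients whose most recent viral load result is older than a chosen window so staff can follow up before the reporting deadline. The six-month window and field names are illustrative assumptions.

```python
# Sketch of a reminder ("tickler") query (hypothetical fields): list clients
# with no viral load result recorded in roughly the last six months.
from datetime import date, timedelta

def due_for_followup(clients, as_of, max_gap_days=183):
    cutoff = as_of - timedelta(days=max_gap_days)
    return [c["client_id"] for c in clients
            if not c["viral_load_dates"] or max(c["viral_load_dates"]) < cutoff]

clients = [
    {"client_id": 3001, "viral_load_dates": [date(2012, 1, 10)]},
    {"client_id": 3002, "viral_load_dates": [date(2012, 9, 5)]},
]
print(due_for_followup(clients, as_of=date(2012, 10, 1)))  # -> [3001]
```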

  16. Elements of Software Design USING REPORTS TO MANAGE DATA QUALITY • Reports that filter clients according to grant requirements allow efficient focus • Mimic report logic of the desired output as much as possible • Provide ability to drill down to client data that has identified issues or gaps • Information from reports can be used to work collaboratively with program staff to improve data collection and entry
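A minimal sketch of such a report, assuming a hypothetical schema: it mimics the grant report's selection logic (only clients with a funded service in the year), measures completeness of a single field, and returns the system-level client IDs needed to drill down.

```python
# Sketch of a data-quality report (hypothetical schema): filter to the clients
# the grant report will include, then list the IDs of clients missing a value.
from datetime import date

def completeness_report(clients, report_year, field):
    start, end = date(report_year, 1, 1), date(report_year, 12, 31)
    # Mimic the report logic: only clients with a funded service in the year count.
    in_scope = [c for c in clients
                if any(start <= s["date"] <= end and s["rw_funded"]
                       for s in c["services"])]
    missing = [c["client_id"] for c in in_scope if not c.get(field)]
    pct = 100 * (len(in_scope) - len(missing)) / len(in_scope) if in_scope else 0
    return {"in_scope": len(in_scope), "missing_ids": missing, "percent_complete": pct}

clients = [
    {"client_id": 1001, "hiv_status": "Positive",
     "services": [{"date": date(2012, 5, 2), "rw_funded": True}]},
    {"client_id": 1002, "hiv_status": None,
     "services": [{"date": date(2012, 7, 9), "rw_funded": True}]},
]
print(completeness_report(clients, 2012, "hiv_status"))
# -> {'in_scope': 2, 'missing_ids': [1002], 'percent_complete': 50.0}
```

The list of missing IDs is what program and data staff would review together to correct the underlying records.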

  17. Reports can highlight issues or provide detail on content

  18. Elements of Software Design MEETING RSR REQUIREMENTS FOR CONTRACT/FUNDING SPECIFIC REPORTING • A. Structuring systems to identify RW eligible clients and RW funded services • B. Organizing data within systems that contain non-RW clients and services • C. Ensuring that reports are selecting only the clients and services that meet criteria for the RSR

  19. A: Identifying Reportable Clients: Histories on HIV Status and Service dates, Financial data, Insurance and other critical data

  20. B: Organizing Data: Establishing Programs and Services for RW Funded Contracts and Managing Client Enrollments

  21. C: Using items A and B to ensure proper client selection and providing detailed review for accuracy
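A sketch of how items A and B might combine to drive client selection, with hypothetical program names and fields: only clients with a Ryan White funded service in the reporting period are included, and only their RW-funded services are counted.

```python
# Sketch of RSR client selection (hypothetical programs and fields): exclude
# non-RW clients and strip non-RW services from the clients who do qualify.
from datetime import date

RW_FUNDED_PROGRAMS = {"RW Part B Case Management", "RW Part B Medical Transportation"}

def rsr_clients(clients, period_start, period_end):
    selected = []
    for c in clients:
        rw_services = [s for s in c["services"]
                       if s["program"] in RW_FUNDED_PROGRAMS
                       and period_start <= s["date"] <= period_end]
        # Only clients with an RW-funded service in the period are reportable;
        # their other services are left out of the RSR file.
        if rw_services:
            selected.append({"client_id": c["client_id"], "services": rw_services})
    return selected

clients = [
    {"client_id": 2001, "services": [
        {"program": "RW Part B Case Management", "date": date(2012, 4, 3)},
        {"program": "State-funded Housing", "date": date(2012, 4, 10)}]},
    {"client_id": 2002, "services": [
        {"program": "State-funded Housing", "date": date(2012, 6, 1)}]},
]
print(rsr_clients(clients, date(2012, 1, 1), date(2012, 12, 31)))
# -> only client 2001, with only the RW-funded service
```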

  22. Questions regarding Software Design as a tool for TA?

  23. From Niagara to Montauk: Supporting Accuracy and Completeness II. TRAINING AND TECHNICAL SUPPORT

  24. Training • Significant challenges to offering training across a large geographic area and diverse provider types • Level of experience and expertise with data entry and data management varies • Turnover at the agency level requires an ongoing training presence • Training specific to the RSR is offered to our Part B providers

  25. In-Person vs. Web Based • In-person ("the personal touch"): opportunity for interaction, observation and feedback; ease of communication between participants; enhanced relationship with training staff; opportunity for DOH staff to observe and interact • Web-based ("ease of implementation"): less resource intensive at both DOH and provider level; ease of access (no travel); flexible on-demand schedule; broad geographic coverage

  26. NYSDOH does both • In-person trainings are offered in NYC on an ongoing basis; Upstate trainings are offered twice per year • An in-person User Group meets monthly in NYC • A web-based User Group meets monthly • On-demand training videos are available on the support website AIRSNY.org • Special-purpose web trainings are offered as needed (e.g., during RSR season)

  27. In-Person Training volume in NYS from 2010 - 2012

  28. Web-based Training Videos are Available On-Demand at AIRSNY.org

  29. Technical Support • Technical support is critical for all steps in the data collection, data entry and data reporting process • The level of support can range from basic (providing forms, instructions, documentation) to advanced (one-on-one phone support, site visits, web-based support on specific issues) • Technical Support is resource-intensive

  30. Traffic on the AIRSNY website shows providers are using the resource to obtain forms, sign up for training, and view video tutorials

  31. On-line technical support is available for all registered providers and staffed by the programming contractor under NYSDOH/AI funding

  32. Technical Support RESOURCES • Technical support combines all available resources: AIDS Institute staff responsible for data monitoring and reports; subcontracted TA/training staff; web-based; phone-based; in person • As resources decline, the ability to provide one-on-one and on-site support declines and emphasis shifts to web resources and centralized TA • Web-based or group TA alone is the least resource intensive but has limited effectiveness due to limits on feedback, communication and assessment • One-on-one support (phone or in person) is limited by availability and the lack of an immediate response

  33. Questions regarding Training and Technical Support?

  34. Supporting Accuracy and Completeness III. FEEDBACK VIA DATA SUBMISSION AND REVIEW

  35. Methods for Feedback • Ongoing monitoring of reports and/or data submissions on key variables • Periodic review of complete YTD reports • Year end review prior to submission

  36. Program-specific reports on the client-level data (CLD) submitted to the NYSDOH AIDS Institute are generated quarterly

  37. Feedback loop on Provider RSR Interim (and year end) Submission

  38. RSR Completeness Reports are generated by the AIDS Institute (mimicking HRSA reports)

  39. Detailed lists of the system-level client IDs (non-identifiable) of clients with missing data can be sent back to the provider for troubleshooting, if the data comes from AIRSNY.

  40. Grantee level RSR data handling and review – Feedback Loops

  41. Challenges to Feedback • Data from multiple systems outside of the influence of NYSDOH • Validation errors on data outside of NYS systems • Multiple points of contact within provider agencies • “Detective Work” in locating sources of issues in multiple systems at provider level with limited time

  42. In Summary • Accurate and complete data depends on good systems for data collection, data entry and data management (software characteristics). • Using those systems effectively depends on training and technical support. • Review of submissions is the last line of defense for data completeness and cannot stand alone.

  43. Final Q&A

  44. FOR FURTHER INFORMATION, PLEASE CONTACT: VIDA BEHN CHERNOFF, NYSDOH AIDS INSTITUTE vab01@health.state.ny.us OR VISIT AIRSNY.org
