
Feedback on Belgium Experiments



Presentation Transcript


  1. Feedback on Belgium Experiments Brant E. Fries, Ph.D. University of Michigan, USA Brussels, Belgium February 19, 2008

  2. New Technology of Assessment • Link assessment directly with clinical care • Turn data into information • Multiple uses of data • Careful design and scientific testing • Cross-sector compatibility • Cross-national comparisons

  3. It’s not enough just to measure…

  4. Link assessment directly with clinical care • Primary goal: care of the individual • Use Clinical Assessment Protocols (CAPs) • triggers to target opportunities for better care • content to design interventions • New triggers better identify individuals who can benefit • NOT automated care planning!

  5. Turn Data into Information • Assessment data can be used for multiple purposes: • Clinical care • Case-mix measurement and payment • Quality measurement and program evaluation • Eligibility determination or program recommendation • Program monitoring and policy research

  6. Multiple Uses of Data • Efficient, as only collected once • Focuses attention on proper assessment • Offsetting incentives encourage accuracy
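  The “collect once, use many times” principle behind slides 5 and 6 can be illustrated with a short sketch. The example below is a minimal, hypothetical Python illustration: the record fields, thresholds, and derived measures are invented for this sketch and do not correspond to actual interRAI items, CAP triggers, or case-mix algorithms.

```python
from dataclasses import dataclass

# Hypothetical assessment record: field names and coding are invented
# for illustration only, not actual interRAI items.
@dataclass
class AssessmentRecord:
    person_id: str
    adl_score: int        # e.g., 0 (independent) .. 6 (total dependence)
    cognition_score: int  # e.g., 0 (intact) .. 6 (severe impairment)
    recent_falls: int     # number of falls in the last 90 days

def clinical_triggers(rec: AssessmentRecord) -> list[str]:
    """Flag hypothetical care issues for follow-up (clinical-care use)."""
    flags = []
    if rec.recent_falls >= 2:
        flags.append("falls")
    if rec.cognition_score >= 3:
        flags.append("cognition")
    return flags

def case_mix_group(rec: AssessmentRecord) -> str:
    """Assign a coarse, invented resource-use group (payment use)."""
    return "heavy" if rec.adl_score + rec.cognition_score >= 8 else "light"

def quality_indicator(records: list[AssessmentRecord]) -> float:
    """Share of assessed persons with 2+ recent falls (monitoring use)."""
    return sum(r.recent_falls >= 2 for r in records) / len(records)

# One assessment feeds clinical, payment, and monitoring uses
# without collecting the data more than once.
rec = AssessmentRecord("A001", adl_score=5, cognition_score=4, recent_falls=3)
print(clinical_triggers(rec), case_mix_group(rec), quality_indicator([rec]))
```

  The point of the sketch is only that a single record drives several outputs at once; because the same data feed both care planning and payment, the incentives to over- or under-report offset each other, which is what encourages accuracy.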

  7. Careful Design and Scientific Testing • Design by clinical experts • Cross-fertilization from other long-term care sectors • Extensive testing: • Reliability • Validity • International testing and experience • Development of multiple applications • Careful translation to other languages • Cross-sector integration

  8. Cross-Sector Compatibility • Goal of new interRAI “Suite” • Assessment items consistent across instruments • Common structure for all databases • Allows comparisons between sectors • Effective items used across sectors
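  A minimal sketch of what “assessment items consistent across instruments” and “common structure for all databases” could look like in practice. The schema below is hypothetical, not the actual interRAI Suite data model; the instrument names, item code, and fields are assumptions made for illustration.

```python
from dataclasses import dataclass

# Hypothetical shared item response: the same item code carries the same
# meaning whether it was collected in home care or a residential facility,
# so records from different sectors can be pooled and compared.
@dataclass
class ItemResponse:
    person_id: str
    instrument: str   # e.g., "HomeCare" or "LongTermCareFacility" (illustrative)
    item_code: str    # item identifier shared across instruments
    value: int

def compare_sectors(responses: list[ItemResponse], item_code: str) -> dict[str, float]:
    """Mean value of one shared item, grouped by instrument/sector."""
    by_sector: dict[str, list[int]] = {}
    for r in responses:
        if r.item_code == item_code:
            by_sector.setdefault(r.instrument, []).append(r.value)
    return {sector: sum(vals) / len(vals) for sector, vals in by_sector.items()}

data = [
    ItemResponse("A001", "HomeCare", "ADL_SELF", 2),
    ItemResponse("A002", "HomeCare", "ADL_SELF", 3),
    ItemResponse("B001", "LongTermCareFacility", "ADL_SELF", 5),
]
print(compare_sectors(data, "ADL_SELF"))  # {'HomeCare': 2.5, 'LongTermCareFacility': 5.0}
```

  Because every instrument stores responses in the same structure and uses the same item definitions, comparisons between sectors (such as the Michigan example on the next slide) need no recoding or mapping step.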

  9. Comparing Persons Served in Two Michigan Settings

  10. Cross-National Comparisons • Practical experience from use in multiple nations • Cross-national comparisons provide more accurate standards • Only possible with standardized assessment

  11. Lessons Learned • Staff training • Plan for staff turnover • Develop cadre of “trained trainers” • Invest in computer-directed learning technology • Establish web-based clinical “help desk” • Create “Frequently Asked Questions” (FAQ) list; post and update frequently • Set training requirements in regulation or contracts

  12. Lessons Learned - 2 • Develop “business intelligence” plan • Analyze desired uses of data • How will stakeholders get timely information? • Design reports for key audiences: • Administrators • Clinicians • Policymakers • Consumers? • Fund adequate, multi-year budget • Future data analysis: make or buy? • All of the above: at both government and provider levels

  13. Lessons Learned - 3 • Ensure timely data outputs • Include both clinicians and administrators in software design/purchasing decisions • Negotiate performance contracts with software vendors • Set aside funds to create additional reports as need arises

  14. Lessons Learned - 4 • Don’t let the assessments become “paperwork” • The field needs a strong, persistent message from top leadership about the importance of data quality • Use the data in many ways: • “Continuous Quality Improvement” (CQI) efforts • Encourage users to identify effective “best practices” • Invest in an audit process to ensure assessment and data accuracy over time
