
Hippocrates, Winslow & Babbage HWB Foundation



Presentation Transcript


  1. Hippocrates, Winslow & Babbage HWB Foundation Richard Ghillani, M.D. Bradford Henley, M.D. David Karges, D.O. Henry Mankin, M.D. Jeffrey Mast, M.D. Augusto Sarmiento, M.D. Marc Swiontkowski, M.D. Edward Yang, M.D. Not-for-profit – http://www.hwbf.org 12th Annual Meeting, Renaissance Hotel, Washington, DC, February 24, 2005 AGENDA • Annual Report • Data Collection • Educational Archives • Meetings • By-Laws • Certification

  2. HWB Mission Statement The mission of the HWB Foundation is to collect uniform, well-specified clinical data in the form of text and graphics from reliable, university-affiliated sources and make that data, in quantities of statistical significance, available in the public domain. There, in an electronic bulletin-board database format, the collected data may be evaluated and re-evaluated by any party – particularly university-affiliated research groups. The foundation endeavors to establish a new pattern of research: instead of the basic data being available only in small samples within the purview of a select few determined to make a specific case, an open database drawing input from multiple sources permits a larger sample size with equal access from all points of view. Thus, enhanced validation of the reporting is possible.

  3. Data Collection - Why? "Maintenance of an active registry must be viewed as important as the medical care rendered, if the right person is going to receive the right treatment in a timely fashion without undue cost to society." • Gillot et al, Development of a Statewide Trauma Registry, J Trauma 29:1667, 1989

  4. Data Collection - Why? "The future belongs to whoever best measures quality of care and then markets it best. Whoever does will absolutely control the market, and everyone who doesn't will disappear." • Richard L. Scott, Esq, former CEO Columbia/HCA HealthCare Corp.

  5. Data Collection - Why? "Tens of millions of dollars have been spent in the establishment of trauma databases over the past 15 years. Much data is gleaned from ICD entries into administrative databases. The emphasis here is for maximum reimbursement. Because of the lack of clinical detail in the data within these registries, the data is inadequate for sophisticated research and outcome analysis." Cushing, Champion - J Trauma 1995

  6. Open Tibia Fracture = ICD 823.3

  7. Data Collection – by Default Practice Profiling Pennsylvania Health Care Cost Containment Council Maine Quality Forum Medicare Hospital Quality Initiative

  8. Data Collection – How?

  9. Data Collection – How? HWB Shortfall – Not Unique Electronic Medical Record - a cresting wave for 30 years - has not broken. 10% adoption - Berner et al, JAMIA Feb 2005 Katsushika Hokusai woodblock print "The Great Wave off Kanagawa" - circa 1830

  10. Data Collection – How? President Bush – State of the Union Proposes that all Americans have an EMR within 10 years to prevent medical errors and needless costs. • Spend $125 million to get the job done. http://www.jibjab.com

  11. Data Collection – Failure - Causes JAMIA Feb 2005 EMR Problems: • Malaligned Incentives • Product Viability • Data Quality • Standards - Interoperability

  12. Data Collection – Failure - Causes JAMIA Feb 2005 Malaligned Incentives Minimal return on investment for information technology vs. imaging technology: a CT study is billable - EMR data entry is not.

  13. Data Collection – Failure - Causes JAMIA Feb 2005 Malaligned Incentives - Rx Indirect Reduction through Practice Management Integration • Billing • Reporting • Scheduling • Transcription • Archiving

  14. Data Collection – Failure - Causes JAMIA Feb 2005 Malaligned Incentives – Rx Indirect Reduction through facilitation of: • Risk Adjusted Practice Profiling • Error Checking - Liability Reduction • RRC Certification • Academic Promotion

  15. Data Collection – Failure - Causes Leape, Massachusetts Plan, NPR 02/04/05 Malaligned Incentives – Rx 3rd-party payors to purchase MD EMRs. EMR cost @ $35,000/MD ($800 million) 3rd-party savings ($5 billion) through: • fewer errors and complications • fewer inappropriate tests • fewer inappropriate meds • fewer billing errors
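The plan's totals can be sanity-checked with back-of-the-envelope arithmetic. A minimal sketch; the physician count is implied by the slide's figures ($800M ÷ $35,000), not stated in the source.

```python
# Sanity check of the Massachusetts-plan figures from the slide above.
cost_per_md = 35_000               # EMR cost per physician ($)
total_cost = 800_000_000           # total purchase cost ($)
projected_savings = 5_000_000_000  # projected third-party savings ($)

# Implied number of physicians covered (not stated directly on the slide).
implied_mds = total_cost / cost_per_md

# Projected savings per dollar spent.
roi = projected_savings / total_cost

print(f"Implied physicians covered: {implied_mds:,.0f}")  # ≈ 22,857
print(f"Savings per dollar spent:   {roi:.2f}")           # 6.25
```

Even with generous rounding, the plan projects roughly a sixfold return on the purchase cost, which is the argument for having payors rather than physicians fund the EMRs.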

  16. Data Collection – Failure - Causes JAMIA Feb 2005 EMR Problems: • Malaligned Incentives • Product Viability • Data Quality • Standards - Interoperability

  17. Database Production and Maintenance Engh et al. CORR 421 April 2004 Product Viability 1977 Starts a porous-coated THA stem; seeks documentation of outcomes. • Data stored on paper forms in a physical chart.

  18. Database Production and Maintenance Engh et al. CORR 421 April 2004 Product Viability 1983 – 1992 Starts electronic archiving - 1 data-entry person • dBASE II (Ashton-Tate) • dBASE III • dBASE IV

  19. Database Production and Maintenance Engh et al. CORR 421 April 2004 Product Viability Early 1990's Need: automated data entry; standardized classifications and outcomes • dBASE IV to International Documentation and Evaluation System (IDES) migration

  20. Database Production and Maintenance Engh et al. CORR 421 April 2004 Product Viability Mid 1990's IDES does not allow local modification. • IDES to FoxPro 2.6 migration with • Customized scannable forms • Codes for a comprehensive Orthopedic Research Database (ORDB) • Third-party applications to facilitate data entry, maintenance, and report generation.

  21. Database Production and Maintenance Engh et al. CORR 421 April 2004 Product Viability 2004 • 3rd-party apps not compatible with new FoxPro • FoxPro 2.6 no longer supported by Microsoft • FoxPro 2.6 not compatible with Windows XP • Database not reflecting contemporary clinical concerns • Currently migrating to another database application.

  22. Database Production and Maintenance Engh et al. CORR 421 April 2004 Product Viability Summary: • 7 major database changes in 21 years • associated data corruption/loss • $50,000 - estimated annual data entry & maintenance cost (does not include the surgeons' time) • New upgrades will substantially increase annual cost. • Consider saving the paper.

  23. Database Production and Maintenance HWB Experience – Software Changes in 12 Years Product Viability • 5 major database changes in 12 years • HyperCard 1.0 • FoxPro 2.5 • Visual FoxPro 3.0 • FileMaker Pro • MySQL & PHP • 4 major front-end changes in 12 years • HyperCard 1.0 • HyperCard 2.0 • MetaCard 2.5 • HTML

  24. Database Costs

  25. Data Collection – Failure - Causes JAMIA Feb 2005 EMR Problems: • Malaligned Incentives • Product Viability • Data Quality • Standards - Interoperability

  26. Data Quality "It has become increasingly clear that much of the clinical research that has long been published and on which we base much of our education and practice activity is, in fact, severely flawed." Keller, SPINE 20:384, 1995 OTA Specialty Day Debate 1999, Swiontkowski

  27. Data Quality Tang, Shortliffe et al, JAMIA Proc 1994 An observational study of physicians using 168 medical records in an internal medicine outpatient clinic • In 81% of cases the chart failed to provide all information deemed necessary by the physician for appropriate management of the patient. • 38% of a clinic encounter is taken up by chart review.

  28. Data Quality GIGO What is written without pain is in general read without pleasure – Samuel Johnson 1709-1784 The quality of the data begins with the diligence of those who generate it. – Charles Engh - CORR 421

  29. Data Quality Barrie, Marsh, BMJ 1992 Manchester physician-entered orthopaedic database Data Quality = Completeness × Accuracy = 62% × 96%
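Because the slide's metric multiplies the two components, even high accuracy cannot rescue poor completeness. A minimal sketch of that arithmetic, using the Manchester figures:

```python
# Overall data quality as the product of completeness and accuracy,
# per the Barrie & Marsh figures on the slide above.
completeness = 0.62  # fraction of required fields actually filled in
accuracy = 0.96      # fraction of filled-in fields that are correct

quality = completeness * accuracy
print(f"Data quality ≈ {quality:.0%}")  # ≈ 60%
```

The multiplicative form explains the slide's implicit point: a 96%-accurate database that captures only 62% of its fields still yields usable data for barely 60% of cases.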

  30. Data Quality Political Factors Health Info Tech Success • 80% dependent on political factors • 20% dependent on info technology Reed Gardner 1998, Past President of AMIA

  31. Data Quality Political Factors • 2% of surgeons over 55 have a major interest in orthopaedic trauma. • As it is these surgeons who tend to be politically influential, it is clear that orthopaedic trauma is considerably disadvantaged. JBJS 1997 79B:1, Court-Brown, McQueen Editorial: Trauma management in the UK

  32. Data Quality Prospective vs. Retrospective Do clinical databases give rise to retrospective paper mills?

  33. Data Quality Random vs. Non-Random Prospective randomized trials are well-proven for evaluating pharmaceuticals. Different hands administering a pill, however, have less influence on the result than different hands performing a surgical procedure. Matta JOT August 2001

  34. Prospective vs. Retrospective Swiontkowski, JOT 2001 Aug Because of the difficulty of performing high-quality controlled trials, observational studies are often the best evidence we have. However, when orthopaedists rely on these weaker forms of evidence, they must acknowledge the risk of utilizing a suboptimal or potentially even detrimental intervention for patients.

  35. Prospective vs. Retrospective Benson et al, NEJM - June 22, 2000 Observational study advantages over PRCT: • lower cost • greater timeliness • a broader range of patients However, bias is a problem. Some say: • observational studies are not reliable. • observational studies should not be funded. • observational studies should not be published.

  36. Prospective vs. Retrospective Bhandari et al, Arch Orthop Trauma Surg 2004 Jan Femoral Neck Fx THA vs ORIF 14 randomized vs 13 non-randomized Non-randomized studies: • Overestimated THA risk - mortality 40% • Underestimated THA benefit - revision reduction 20%

  37. Prospective vs. Retrospective Benson et al, NEJM - June 22, 2000 • 136 reports about 19 diverse treatments • We found little evidence that estimates of treatment effects in observational studies reported after 1984 are either consistently larger than or qualitatively different from those obtained in randomized, controlled trials.

  38. Prospective vs. Retrospective Concato et al, NEJM - June 22, 2000 • 99 reports about 5 clinical topics • The results of well-designed observational studies (with either a cohort or a case-control design) do not systematically overestimate the magnitude of the effects of treatment as compared with those in randomized, controlled trials on the same topic.

  39. Prospective vs. Retrospective Tornetta et al, OTA 2000 • Randomized trials are subject to Beta (Type II) error - insufficient sample size. • Beta (Type II) error is the probability of concluding that no difference between treatment groups exists when, in fact, there is a difference. • The Beta (Type II) error rate for randomized trials in orthopaedic trauma is exceedingly high, averaging 90%.
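A rough illustration of why small trials carry such high Beta error: a standard normal-approximation power calculation for comparing two event rates. The rates and arm size below are hypothetical examples, not figures from the slide.

```python
import math

def normal_cdf(x: float) -> float:
    """Standard normal CDF, computed via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def type_ii_error(p1: float, p2: float, n_per_arm: int) -> float:
    """Approximate Beta (Type II) error for a two-arm trial comparing
    proportions p1 vs p2, using the normal approximation with a
    two-sided alpha of 0.05."""
    se = math.sqrt(p1 * (1 - p1) / n_per_arm + p2 * (1 - p2) / n_per_arm)
    z_alpha = 1.959963984540054  # two-sided 5% critical value
    power = normal_cdf(abs(p1 - p2) / se - z_alpha)
    return 1.0 - power

# Hypothetical example: detecting a 10% vs 20% complication rate with
# 40 patients per arm - a plausible size for a single-center trial -
# leaves Beta well above 0.5, i.e. the trial will usually miss a real effect.
beta = type_ii_error(0.10, 0.20, 40)
print(f"Beta with 40 per arm: {beta:.2f}")
```

Rerunning the same calculation with 400 patients per arm drops Beta below 0.10, which is the case for the multicenter pooling that open databases are meant to enable.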

  40. Prospective vs. Retrospective Sarmiento, JOT 2001 Aug • There is not a system today, and there will not be a system tomorrow, that will guarantee the elimination of intended or unintended bias or prejudice from the medical literature. • That being the case, let us accept outcome studies as well as traditional retrospective ones.

  41. Data Quality End User Impact

  42. Data Quality EMR Design - Embi, JAMIA April 2004 End User Impact Portland VAH • Documentation Availability • Work Processes and Communication • Alterations in Document Structure and Content • Mistakes, Concerns, and Decreased Confidence

  43. Data Quality Patel, JAMIA November 2000 End User Impact • EMR use changes physician information-gathering and reasoning strategies. • Technology has a profound influence in shaping cognitive behavior. • Effects on cognition by technology design need to be explored.

  44. Data Quality AMNews Feb. 17, 2003 End User Impact EMR Rejection - Cedars-Sinai Revolt "They poorly designed the system, poorly sold it and then jammed it down our throats and had the audacity to say everybody loves it and that it's a great system." Cedars-Sinai Medical Center in Los Angeles has indefinitely suspended use of its computerized physician order entry (CPOE) system after hundreds of doctors complained it was difficult to use and compromised patient safety. Los Angeles Times

  45. Data Collection – Failure - Causes JAMIA Feb 2005 EMR Problems: • Malaligned Incentives • Product Viability • Data Quality • Standards - Interoperability

  46. Standards & Interoperability ?? Open Source Solution ?? - Open Source Software – e.g. Linux PROS • Low cost - free • Stable • Could provide essential infrastructure • US Interstate Highway System • Internet Transmission Control Protocol / Internet Protocol • Public library of program components CONS • Difficult to program and obtain local support • Microsoft buys IT lunch – significant kickback barriers

  47. Summary EMR Potential

  48. Data Collection – How?

  49. Conclusion Hill, NEJM 1953 "One must go seek more facts, paying less attention to technique of handling the data and far more to the development and perfection of the method for obtaining them."

  50. Revised HWB Mission Statement The mission of the HWB foundation is to find methods to routinely collect well-specified, structured and privacy-protected clinical data from reliable sources and make that data, in quantities of statistical significance, available in the public domain where it may be interpreted from all points of view.
