
EHR-Q TN Final Review



Presentation Transcript


  1. EHR-Q TN Final Review, Brussels, March 30th 2012

  2. Introduction of consortium and experts Brussels, March 30, 2012

  3. Agenda • The partners • EHR-Q TN: why was it needed? • Overview of the Network Activities • Investments made during the lifetime of the project • Validation of the EuroRec Statements • EHR Market Overview: main conclusions • Roadmap towards a sustainable pan-European certification of EHR systems • Recommendations from the Roadmap • Quality Labelling and Certification Procedures: the quality of the process • Conclusions and what does the future bring? • Questions Brussels, March 30, 2012

  4. Brussels, March 30, 2012

  5. EHR-Q TN: Why certification? Why the project? Brussels, March 30, 2012

  6. Reasons for “Certification” • eHealth, and more specifically Electronic Health Record systems, has an enormous potential to improve the quality, accessibility and efficiency of care, provided the systems are: • reliable, trustworthy and of sufficient quality; • sharable and interoperable; • used appropriately. • Quality labelling and certification through professional third-party assessment offers the best chance of comparable and reliable quality documentation of those systems. Brussels, March 30, 2012

  7. Some EHR quality issues • Patients are too important to simply assume that EHR systems are trustworthy. • Patient data should not be locked into one system or application. • Patients’ essential data should be made available anywhere, anytime to the health professionals authorised to access them. • Patients have the right to request that some data be handled confidentially, while taking full responsibility for that choice. • All accesses to patient data should be audit-trailed. Brussels, March 30, 2012

  8. The quality challenge Myers et al.* show that adverse events mainly result from: • missing or incorrect data; • data displayed for the wrong patient; • chaos during system downtime; • the system being unavailable for use. Examples of reported incidents in healthcare where a medical information system was the cause or a significant factor: http://iig.umit.at/efmi/badinformatics.htm *Myers DB, Jones SL, Sittig DF. Review of Reported Clinical Information System Adverse Events in US Food and Drug Administration Databases. Applied Clinical Informatics 2011; 2: 63–74.

  9. Why was such a project needed? • Too many authorities are not aware of the quality needs and related problems. • Even more authorities are not effectively involved in quality labelling and certification. • Lack of awareness about what already exists. • Existing initiatives are not comparable with each other. • Cross-border quality assessment is almost non-existent. Brussels, March 30, 2012

  10. Overview of the Network Activities • Dissemination & Awareness • Workshops • Annual EuroRec Conferences • Project Data • Consortium Meetings • Deliverables Brussels, March 30, 2012

  11. Workshops • 76 workshops in total • At least one per country • Agenda of the workshops: • Year 1: Communications from the Commission, Seal Level 1, Repository • Year 2: Validation of the EuroRec Statements • Year 3: Validation of procedures & roadmap for the countries with the best chance of making progress Brussels, March 30, 2012

  12. Overview Workshops • Listed in the three Annual Reports and the Final Report • Reported in deliverables D1.4, D2.2, D3.2, D4-I, D4-II, D6.1a, D6.1b and D6.4 Brussels, March 30, 2012

  13. Workshops: questions and suggestions Resulted in valuable comments and questions regarding: • Quality labelling and certification as such: need, quality & professionalism of the process; • The content of quality assessment; • The role of the (national) health authorities; • The use and appropriateness of the EuroRec Descriptive Statements; • The importance of functional testing. Brussels, March 30, 2012

  14. A Norwegian statement… A recent Norwegian statement is an important one, based on extensive experience in certifying “messages” (certifying all kinds of standards-based data exchange). The Norwegian Ministry of Health and Care Services stated that “EHR quality will be difficult to reach unless certification of the EHR systems is made mandatory”. Brussels, March 30, 2012

  15. EuroRec Conferences Brussels, March 30, 2012

  16. Consortium Meetings Brussels, March 30, 2012

  17. Deliverables Brussels, March 30, 2012

  18. Investments during the lifetime of the project Brussels, March 30, 2012

  19. Investments • Improving the functionality of the Repository. • Definition of EuroRec Seal Levels 1 and 2. • Translations into 21 languages. • Validation workshops. • Setting up effective quality labelling. Brussels, March 30, 2012

  20. Functional Tools Investments • Extension of the repository: secondary use of EHR data and lab-reporting-related issues. • Adding content-related functionality: • definition section • comments and interpretation • national variants • Adding maintenance functions to manage, e.g., modifications in a multilingual environment Brussels, March 30, 2012

  21. EuroRec Seals • EuroRec Seal Level 2 was defined during the lifetime of the project • Basic sets of quality criteria • Addressing reliability, trustworthiness, authentication, access management and basic functionality • Comparable across borders and domains • An example of a “market driven” approach • Standardisation of the procedure (see further) • Products from 7 different countries certified Brussels, March 30, 2012

  22. Industrial testimonies • “EuroRec Seal greatly helped to improve SW products.” • “Developers got clear guidelines about key features that are often neglected by end users.” • “Customers got additional assurance of software quality.” • “The Seal offers increased odds at foreign markets.” • “The EuroRec approach is very useful for new software (modules), new application design… giving ‘new ideas’.” Brussels, March 30, 2012

  23. Translations: 12,379 in total Brussels, March 30, 2012

  24. Validation of the EuroRec Repository and of the Descriptive Statements Brussels, March 30, 2012

  25. Validation of the statements • Validation done • When translating the original statements • During the workshops • By the “clients”, software suppliers • Validation of the following aspects: • Formulation • Content • Technical correctness • Importance (for application quality) • Feasibility • Reported in deliverable D4.4 Brussels, March 30, 2012

  26. Setting up effective quality labelling Brussels, March 30, 2012

  27. Effective Quality Labelling • Documentation freely available. • Seal request forms on the web. • Procedure validated against the applicable standards. • Partners involved (not only ProRec centres). Brussels, March 30, 2012

  28. EHR Market Overview: main conclusions Brussels, March 30, 2012

  29. Brussels, March 30, 2012

  30. EHR Market Overview - Summary • 24 national market overviews using a common template • Two deliverables • Del. D3.1 Part I: Suppliers and Supplier Organisations: 1,005 supplier/product IDs • Del. D3.2 Part II: Authorities and important stakeholders: 663 addresses • Detailed presentation: previous review Brussels, March 30, 2012

  31. EHR Market: some considerations • Very fragmented, as expected • This may endanger the quality of applications, though it has never been proven. • Fragmentation is not limited to the suppliers: there is also a large number of “important stakeholders”. • There is no single, homogeneous provision of healthcare in Europe, nor even within one country • Each profession needs a “different” application. • Using the same application in several countries does not work. • There is some market “concentration” • Concentration of ownership • No concentration of applications, even when the same name is used in different countries Brussels, March 30, 2012

  32. One of the project conclusions The only approach that seems likely to work is to increase harmonisation within diversity, offering more and more “similar” (not identical) functions based on the same basic functional and quality specifications. Brussels, March 30, 2012

  33. Brussels, March 30, 2012

  34. Roadmap towards a sustainable pan-European certification of EHR systems Brussels, March 30, 2012

  35. Deliverable D5.2: Table of Contents • Potential of the EHR: Why? Possible benefits? Barriers to adoption? Requirements? • Importance of Quality Labelling & Certification • Stakeholders and their role • State of Practice in Europe • Prerequisites for (sustainable) Quality Labelling & Certification • Main strategies regarding quality labelling and certification • Roadmap for certification at National Level • Cross-border Certification Roadmap • Cost of Certification & Business Definition • Risks and Remedies • EHR-Q TN Recommendations Brussels, March 30, 2012

  36. Introductory statement “Realising the potential added health and economic value linked to using an EHR system is not obvious. It requires huge investments and a professional development environment. It requires a permanent focus on “quality” at the functional level and thus also regarding “interoperability”. Quality should be documented in an objective, comparable and trustworthy way.” Brussels, March 30, 2012

  37. Verification versus Validation • Verification = the technical correctness of the software application or of a component of an application. Verification attempts to answer the question “is the software built right?” => medical device directive? • Validation = compliance of the application with the consumer’s / user’s functional expectations: does the application offer what it is expected to do? Validation attempts to answer the question “is the right software built?” => procurement and functional validation! Brussels, March 30, 2012

  38. Five areas for quality labelling and certification • Data exchange facilities (incl. IOP) • Functional (incl. some aspects of IOP) • Administrative and billing facilities • Use-related measurements and validation • Software development quality (out of scope, not specific to EHR systems) => Different expertise, different organisations Dublin, November 17-18

  39. Scope of EHR QL & Certification • Different expertise => Different organisations. • Our focus for Deliverable D6.2 is on • Functional testing, including some aspects of interoperability • Data exchange (message production and integration) • We will address how to “cooperate” later on. Dublin, November 17-18

  40. The use of EHR systems • The consortium listed the top 5 good reasons to generalise the use of EHR systems. • A small literature survey demonstrates quantifiable benefits. • There are nevertheless still barriers to EHR adoption: • by Healthcare Professionals • by IT providers • related to political and organisational factors Brussels, March 30, 2012

  41. Not all EHR systems are good enough • Selecting the most appropriate application from the right vendor is a real challenge => importance of assessing the systems’ quality. • Comprehensive and correct use is another important factor => • Importance of training the users • Importance of assessing the users • Motivation through incentives for the users. Brussels, March 30, 2012

  42. Impact of certification The consortium listed the top 5 good reasons to adopt country-wide EHR certification: • Assure compliance with national rules and standards. • Increase the quality of the products through coherent and pre-tested functionality. • Leverage the exchange of health(care)-related data and the interoperability of systems. • Improve patient safety in care. • Have a reliable data source for secondary use. Brussels, March 30, 2012

  43. Prerequisite “If quality labelling and certification of EHR systems is to become generalised, then it needs endorsement at the highest competent levels, e.g. by the EU Commission, the responsible Member States’ Ministries, the Healthcare Providers’ Organisations and the specialised industry.” Brussels, March 30, 2012

  44. Stakeholders & Functional Diagram (diagram referencing ISO/IEC 17011, ISO/IEC 17020 and ISO/IEC 17025) Brussels, March 30, 2012

  45. Kinds of Quality Assessment • “Authority driven” versus “Market Driven” • Independent organisation / Industrial organisation • Public initiative / Supplier initiative • Third-party assessment versus self-assessment • Comprehensive versus Modular • National / Regional versus Cross-Border • System functionality versus “Interoperability” • Generic versus Domain/Target Specific (LIS, …) Brussels, March 30, 2012

  46. Procedure and kind of attestation: most suitable procedure

  47. Actual “National” Certification Brussels, March 30, 2012

  48. Actual “cross border” quality labelling • Not “authority driven” • There is no such authority • No formal recognition of certificates across borders • Three “private” initiatives • EuroRec: independent, focus on EHR systems (functional and exchange as a function) • I.H.E.: industry driven, focus on testing the exchange and the technical interoperability • Continua Health Alliance: industry driven, focus on device content portability Brussels, March 30, 2012

  49. One Destination, Two Itineraries • Different and complementary approaches, but always phased in a similar way: • Setting the framework & deciding to go for it • Pre-assessment: organisational context • Assessment: test-related activities • Granting the label or certificate Brussels, March 30, 2012

  50. Roadmap for Certification: national level
