Building Privacy into Health Care Technology Mike Gurski Senior Policy and Technology Adviser Information and Privacy Commissioner/Ontario Health Care Privacy Conference September 23, 2002 Toronto
Health & IT issues • Electronic health records • Online prescription renewals & scheduling • Telemedicine • Integration of health networks • Increasing use of data by third parties • Public Private Partnerships / outsourcing • Increasingly literate patients
The PDT (Privacy Diagnostic Tool) • A simple, plain-language tool • Free of charge, self-administered • Available on paper, online, or as a CD-ROM
The Privacy Security Relationship • Privacy as personal control • Security as organizational control • Overlapping but distinct concepts • Examples of the differences
Electronic Record Keeping in the Healthcare Sector • Potential to increase quality of care for patients • Canadians are amenable to the idea of computer storage of health records
Electronic Record Keeping in the Health Care Sector • Very sensitive personal information (PI) • Privacy risks • Must respect Privacy Design Principles
HEALTH PETs Privacy-enhancing technologies (PETs) have already been implemented in some health care systems. A number of case studies highlight progress in this arena: 1. Hippocratic Database 2. Dutch HIS 3. IBM's Zurich Research Laboratory
HIPPOCRATIC DATABASE • Developed by IBM Research The database negotiates the privacy of information between a user and an organization: the database owner sets an information storage and retrieval policy, which data donors can accept or reject.
Traditional Database Characteristics • Standard Database functions: • Manage persistent data • Efficient access of large volumes of data • Managed access, manipulation • Resiliency
Hippocratic Database Characteristics • ‘And about whatever I may see or hear…I will remain silent … unutterable’ • Excerpt from the Hippocratic Oath • Incorporation of privacy values found in legislation, FIPs, and policies. • The straw-man design focuses on the purpose for which data is collected.
Strawman Test • Ensure that a customer’s privacy is supported by the Hippocratic Database, including against attacks by authorized staff
How it works • Before data is collected, the types of information to be obtained and basic rules about how the data will be used are decided. • These rules include who should have access to the data and how long it will be retained.
How it works (2) • When a user enters information, an application at the user end interacts with the database to check that its data privacy policies are acceptable to the user, who has already programmed his or her preferences into the application. • Once verified, data is transferred from the user to the database.
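The negotiation described above can be sketched in a few lines of Python. This is a hypothetical illustration, not IBM's implementation: the field names, the policy tuple of (purpose, retention, allowed roles), and the preference structure are all assumptions.

```python
DB_POLICY = {
    # field: (purpose, retention in days, roles allowed to read it)
    "email":     ("appointment-reminders", 365, {"scheduling"}),
    "diagnosis": ("treatment",            3650, {"physician"}),
}

def policy_acceptable(field, user_prefs, db_policy=DB_POLICY):
    """Return True if the database's stated policy for `field` is no more
    permissive than what the data donor's preferences allow."""
    if field not in db_policy or field not in user_prefs:
        return False
    purpose, retention, roles = db_policy[field]
    pref = user_prefs[field]
    return (purpose in pref["allowed_purposes"]
            and retention <= pref["max_retention_days"]
            and roles <= pref["allowed_roles"])

def submit(record, user_prefs):
    """Transfer only those fields whose policy the donor accepts."""
    return {f: v for f, v in record.items() if policy_acceptable(f, user_prefs)}
```

A donor who permits email for reminders but has stated no preference for diagnoses would see only the email field accepted for transfer.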
HIPPOCRATIC DATABASE Architecture for the Hippocratic Database is to be based on 10 guiding principles: purpose specification, consent, limited collection, limited use, limited disclosure, limited retention, accuracy, safety, openness, and compliance.
The Architecture Continued • Other functions of the Hippocratic Database: • A tool (Privacy Constraint Validator) tests the business’ privacy policy against the customer’s preferences. • The data is stored together with that preference. • Data access is limited to “authorized use” and “authorized user”. A ‘Record Access Control’ tool would automatically check a query and the user before allowing the data to be used. • Tests for anomalous patterns of requests would act as a control for an authorized user querying data for an unauthorized purpose. • E.g., customer service rep attempting to steal e-mail addresses.
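The "authorized use" and "authorized user" check might look like the following sketch. The record layout, role names, and purposes are illustrative; IBM's Record Access Control operates inside the database engine rather than in application code.

```python
RECORDS = [
    {"id": 1, "email": "pat@example.org",
     "policy": {"purposes": {"billing"}, "users": {"billing-clerk"}}},
    {"id": 2, "email": "sam@example.org",
     "policy": {"purposes": {"billing", "marketing"},
                "users": {"billing-clerk", "marketer"}}},
]

def query(user, purpose, records=RECORDS):
    """Return only records whose attached policy authorizes BOTH this
    user (authorized user) and this stated purpose (authorized use)."""
    return [r["id"] for r in records
            if user in r["policy"]["users"]
            and purpose in r["policy"]["purposes"]]
```

A customer service representative querying for "marketing" would see only the records whose donors consented to that purpose, regardless of what the underlying table contains.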
Potential Benefits • Provide privacy controls within database, not just at policy level; • Can dovetail into P3P, mirroring privacy preferences in the database structure itself; • Provide level of trust to patients and customers regarding protection of their PI.
DUTCH HIS (Hospital Information System) “Privacy engineering in the health informatics field has resulted in system designs that can guarantee highly conditional linkabilities of identity in terms of access control.” Gilles van Blarkom Office of the Dutch Data Protection Registrar “Guaranteeing requirements of data‑protection legislation in a hospital information system with PET” The British Journal of Healthcare Computing and Information Management
BACKGROUND This case study looks at how patient identification numbers can be encrypted to create a pseudonym in a Hospital Information System (HIS): the true identities of patients are hidden from unauthorized users of the system, and the system ensures compliance with European Union data protection legislation.
TYPICAL PROCESS An example of the encryption used in the Dutch HIS: 1. After a successful log-in by the user at a PC client, a process is started to select the patient whose medical record is to be accessed; 2. Once the correct patient is identified, the unique identifying number is passed from the server to the client;
TYPICAL PROCESS cont. 3. In the client, this number is encrypted to a value that serves as the pseudo-identity of this patient; 4. Using this pseudo-identity, the table containing the medical record can be accessed.
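A minimal sketch of this pseudonymization step, assuming a keyed one-way function. The Dutch HIS paper uses its own encryption scheme; the HMAC below is a stand-in chosen for illustration, and the table layout is an assumption.

```python
import hmac
import hashlib

def pseudonym(patient_id: str, key: bytes) -> str:
    """Derive a stable pseudo-identity from the identifying number.
    HMAC-SHA-256 is a stand-in for the HIS's actual encryption."""
    return hmac.new(key, patient_id.encode(), hashlib.sha256).hexdigest()

MEDICAL_TABLE = {}  # keyed by pseudo-identity, never by the real number

def store_record(patient_id, record, key):
    MEDICAL_TABLE[pseudonym(patient_id, key)] = record

def fetch_record(patient_id, key):
    return MEDICAL_TABLE.get(pseudonym(patient_id, key))
```

Because the key never leaves the authorized client, someone browsing the medical table sees only pseudonyms and cannot link records back to real patient numbers.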
Creating a Secure Environment [Diagram: client-server architecture showing a PC client, a server, and a privacy-incorporated database, with the numbered steps of the pseudonymization process]
OUTCOME - A HIS using this technique conforms to all mandatory data-protection requirements of the EU legislation. - The encryption process carries no measurable cost and has no adverse effect on the performance of the application.
IBM ZURICH RESEARCH CENTRE Michael Waidner IBM Privacy Research Institute Switzerland
BACKGROUND IBM's Zurich Research Laboratory is the European branch of IBM Research and is located in Switzerland. The lab comprises three scientific departments: - Communication Systems; - IT Solutions; and - Science & Technology. Michael Waidner is manager of the Network Security and Cryptography Research Group.
IBM Privacy Research Institute Within the research lab is the Privacy Research Institute, established in 2001 to develop technologies that enable enterprises to move from today's privacy-unaware, or even privacy-intrusive, ways of doing e-business to privacy-enabling ones.
IBM Privacy Research Institute The Privacy Research Institute has created a global research program to develop privacy-enhancing services and technologies. It addresses issues such as: • Enterprise Privacy Architecture • Enterprise Privacy Policies • Privacy-enhanced management of personal information • Privacy-enhanced XML document access control • Privacy-enhanced PKI • Privacy-enhanced data mining
Enterprise Privacy Architecture • Enables enterprises to provide well-defined levels of privacy to their customers. • This architecture identifies privacy-enabling security technologies as well as all components necessary for enterprise privacy management.
Enterprise Privacy Policies • A policy language that enables enterprises to describe their privacy practices; • Policies are associated with all data collected; • This "sticky" policy paradigm mandates that 1. policy sticks to the data; 2. travels with it; 3. can be used to decide how the data can be used.
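The sticky-policy paradigm can be illustrated with a small Python sketch in which the policy is bundled with the value and consulted on every use. The class and field names are assumptions, not IBM's API.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class StickyData:
    """A datum that carries its own privacy policy wherever it goes."""
    value: str
    allowed_purposes: frozenset  # the policy "sticks" to the data

    def use(self, purpose: str) -> str:
        # the attached policy, not the caller, decides whether this use is allowed
        if purpose not in self.allowed_purposes:
            raise PermissionError(f"purpose {purpose!r} not permitted by attached policy")
        return self.value
```

Because the object is immutable and the policy travels inside it, every component that later receives the datum is forced through the same purpose check.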
Privacy-enhanced management of personal information • Software components that enable individuals and enterprises to manage, use, and distribute personal data according to privacy policies; • Also supports contacting the data subject for consent, and allowing data subjects to access/update their data and policies.
Privacy-enhanced XML document access control • Enables individuals to specify privacy preferences as access control policies for XML-encoded personally identifiable information; • Ensures that each element in the document is securely updated.
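A hedged sketch of element-level access control over XML-encoded PII, using only the Python standard library. The tag names, roles, and ACL structure are illustrative assumptions, not IBM's design.

```python
import xml.etree.ElementTree as ET

# which roles may read each element type (illustrative)
ACL = {"name": {"physician", "nurse"}, "diagnosis": {"physician"}}

def redacted_view(xml_text: str, role: str) -> str:
    """Return a copy of the document with elements this role may not
    read removed, leaving the rest of the structure intact."""
    root = ET.fromstring(xml_text)
    for parent in root.iter():
        for child in list(parent):
            allowed = ACL.get(child.tag)
            if allowed is not None and role not in allowed:
                parent.remove(child)
    return ET.tostring(root, encoding="unicode")
```

A nurse's view of a patient record keeps the name element but drops the diagnosis element entirely, rather than merely blanking its text.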
Privacy-enhanced pseudonym-based public key infrastructure (PKI) • Digital signatures and public-key infrastructures (PKIs) help ensure personal data is only given to people deemed trustworthy. • The idemix (identity mix) system enables an individual to prove a specific fact about themselves while releasing only the information that is necessary. • Anonymity can be revoked if necessary.
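The "release only what is necessary" idea can be approximated in a toy sketch: an issuer certifies each attribute separately, and the holder shows only the attributes a verifier needs. To be clear, this is not idemix: real idemix uses zero-knowledge proofs, requires no key shared with the verifier, and makes separate showings unlinkable. Every name below is an illustrative assumption.

```python
import hmac
import hashlib

ISSUER_KEY = b"demo-issuer-key"  # toy shared secret; real idemix needs none

def _sign(k, v):
    return hmac.new(ISSUER_KEY, f"{k}={v}".encode(), hashlib.sha256).hexdigest()

def issue(attrs):
    """Issuer certifies each attribute separately so each can be shown alone."""
    return {k: (v, _sign(k, v)) for k, v in attrs.items()}

def show(credential, needed):
    """Holder releases only the attributes the verifier actually needs."""
    return {k: credential[k] for k in needed}

def verify(shown):
    return all(sig == _sign(k, v) for k, (v, sig) in shown.items())
```

Here a holder can establish the single fact "over 18" without revealing a birth date or a name, which is the selective-disclosure property the slide describes.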
Privacy-enhanced data mining • Allows users to randomize information in their records; • Preserves privacy at the individual level while still building accurate data mining models.
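The value-distortion approach can be sketched as follows: each individual adds zero-mean noise to their value before sharing it, yet the aggregate statistic stays accurate. The spread parameter and data are illustrative assumptions.

```python
import random

def randomize(values, spread):
    """Each individual perturbs their true value with uniform noise in
    [-spread, +spread] before sharing it (value distortion)."""
    return [v + random.uniform(-spread, spread) for v in values]

def estimate_mean(noisy):
    # the added noise has zero mean, so the sample mean of the
    # randomized values is still an unbiased estimate of the true mean
    return sum(noisy) / len(noisy)
```

No single shared value can be trusted to within the spread, yet with enough participants the population mean is recovered almost exactly, which is what lets accurate mining models be built from randomized records.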
Step 1: • Define privacy expectations of the public and identify legislated requirements: • Survey polls and reports that discuss public discontent; • Compare and understand privacy legislation within your jurisdiction; • Standard privacy principles.
Step 2: • Develop privacy policies and principles: • Free diagnostic tools: • Privacy Diagnostic Tool www.ipc.on.ca/english/whatsnew/newsrel/08601nr • Privacy principles and standards: • Fair Information Practices (www.cdt.org/privacy/guide/basic/generic)
Step 3: • Undertake an assessment of human and informational resources, with a focus on personally identifiable data (collection, processing, management, flows and storage): • What does your organization possess? • Human resources, budget size, a plan; • What is required? • More people, more tools, more strategic modeling.
Step 4: • Undertake a threat risk assessment by completing a Privacy Impact Assessment (PIA): • Management Board Secretariat’s PIA – www.gov.on.ca:80/MBS/englis/fip/pia/pia.pdf • “The Value of Privacy Engineering” by Steve Kenny and John Borking – elj.warwick.ac.uk/jilt/02-1/kenny.html
Step 5: • Deploy methodology for privacy risk management at the systems level: • Use results from the PIA analysis to communicate and transfer policy/legal vulnerabilities into technological solutions • Design rules and controls around personal information, linkability, access, use, accountability and delineation of business processes into the architecture of the network; • Rules should depend on data type (level of sensitivity etc.).
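One way to make "rules depend on data type" concrete is a sensitivity tier that maps each data element to technical controls. The tiers, fields, and controls below are illustrative assumptions, not a prescribed scheme.

```python
# controls attached to each sensitivity tier (illustrative)
CONTROLS = {
    "low":    {"encrypt_at_rest": False, "audit_reads": False, "pseudonymize": False},
    "medium": {"encrypt_at_rest": True,  "audit_reads": False, "pseudonymize": False},
    "high":   {"encrypt_at_rest": True,  "audit_reads": True,  "pseudonymize": True},
}

# classification of each personal-data field (illustrative)
SENSITIVITY = {"postal_code": "low", "email": "medium", "diagnosis": "high"}

def controls_for(field):
    """Unclassified fields default to the strictest tier, so new data
    elements are never under-protected by omission."""
    return CONTROLS[SENSITIVITY.get(field, "high")]
```

Expressing the PIA's findings as a table like this gives the development team in Step 6 something directly implementable at the source-code level.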
Step 6: • Introduce the rules and controls developed in the previous step at the source-code level: • Apply the amalgamated design to the already existing system, or use it as the foundation for a new system.
Step 7: • Deploy and audit, through a model of continuous improvement. Review expectations and requirements: • Always ensure from a policy, legal and technological perspective your organization and all its systems/tools are privacy compliant; • Train all staff (IT, marketing, engineers, senior executives) to be privacy sensitive and preventative solution providers.
How to Contact Us Mike Gurski Information & Privacy Commission/Ontario 80 Bloor Street West, Suite 1700 Toronto, Ontario M5S 2V1 Phone: (416) 325-9164 E-mail: mgurski@ipc.on.ca Web: www.ipc.on.ca