Machine Learning Regulated by the FDA

Presentation Transcript

  1. Machine Learning Regulated by the FDA. Pilar N. Ossorio, Ph.D., JD. April 5, 2019

  2. FDA regulates medical device software • Software in a medical device runs on, or operates, a hardware medical device system. Such software, and any advanced analytics it embodies, is a component of or accessory to the hardware medical device. • FDA has regulated this type of software for years, e.g., deep learning neural networks in radiological devices and locked machine-learned models in DNA sequencing systems. • Software as a Medical Device (SaMD) = software intended for medical use that can run on different operating systems or in virtual environments and is not intended for use with a particular hardware device. This type of software often incorporates advanced analytics. • FDA has largely exercised enforcement discretion over SaMD!

  3. IDx-DR = a deep learning neural network running on a cloud server.

  4. ML Challenges FDA’s Traditional Oversight • Capacity • Possible flood of SaMD and ML-driven hardware devices • Need personnel with appropriate expertise • What should the regulatory agency do about the “learning” aspect of machine learning? • Quality metrics • For datasets on which the ML trains, tests, and validates • For reference datasets the algorithm would use in practice • For the algorithm’s performance
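The quality-metric bullets above can be made concrete with a short sketch. The following is a minimal, illustrative Python example (not from the talk; the labels and predictions are hypothetical) of basic performance metrics one might report for a locked binary classifier on a held-out validation set:

```python
# Illustrative sketch: basic performance metrics for a locked binary
# classifier evaluated on a held-out validation set.

def performance_metrics(y_true, y_pred):
    """Return sensitivity, specificity, and accuracy for binary labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    sensitivity = tp / (tp + fn) if (tp + fn) else float("nan")
    specificity = tn / (tn + fp) if (tn + fp) else float("nan")
    accuracy = (tp + tn) / len(y_true)
    return {"sensitivity": sensitivity,
            "specificity": specificity,
            "accuracy": accuracy}

# Hypothetical validation labels and model outputs.
y_true = [1, 1, 1, 0, 0, 0, 0, 1]
y_pred = [1, 1, 0, 0, 0, 1, 0, 1]
print(performance_metrics(y_true, y_pred))
```

Agreed metrics of this kind, computed on well-characterized datasets, are one of the things a regulator could standardize.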

  5. [Slide showing FDA guidance documents dated April 2019; V2, June 2018; and July 2017]

  6. FDA, ML, and Healthcare Fairness • Existing data reflect or incorporate unfairness based on socio-economic class, race, gender, language spoken, etc. • Bias in the health care system • Social inequality that manifests in health disparities • Exacerbated by data cleaning? • Unrepresentative data • E.g., in many cases, existing reference databases used by genomic algorithms are not representative of patient populations!
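The unrepresentative-data point above can be checked mechanically. A minimal, illustrative Python sketch (group names and proportions are hypothetical, not from any real database) that compares the composition of a training dataset against a target patient population:

```python
# Illustrative sketch: compare the group composition of a training
# dataset to the composition of the intended patient population.
from collections import Counter

def representation_gap(training_groups, population_proportions):
    """Return, per group, (share in training data) - (share in population).
    Large negative values flag underrepresented groups."""
    counts = Counter(training_groups)
    n = len(training_groups)
    return {g: counts.get(g, 0) / n - p
            for g, p in population_proportions.items()}

# Hypothetical training sample and population proportions.
training_groups = ["A"] * 80 + ["B"] * 15 + ["C"] * 5
population = {"A": 0.60, "B": 0.25, "C": 0.15}
print(representation_gap(training_groups, population))
```

A check like this is cheap to run before and after data cleaning, which is exactly when representation can silently shift.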

  7. Other problems • Overfit algorithm • Other surprises • A health care algorithm can be properly trained, tested, and validated and still produce results that are dangerous to some group of patients. See, e.g., Rich Caruana, Friends Don’t Let Friends Deploy Black Box Models. • Confounding • Site-specific effects • Other problems in causal inference
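One common red flag for the overfitting problem mentioned above is a large gap between training-set and held-out performance. A minimal, illustrative Python sketch (the accuracies and the 0.05 threshold are hypothetical, not from any real device):

```python
# Illustrative sketch: flag possible overfitting when training accuracy
# exceeds held-out (test) accuracy by more than a chosen threshold.

def overfit_gap(train_accuracy, test_accuracy, threshold=0.05):
    """Return (gap, flagged): gap between train and test accuracy, and
    whether it exceeds the threshold."""
    gap = train_accuracy - test_accuracy
    return gap, gap > threshold

# Hypothetical numbers: near-perfect on training data, much worse held out.
gap, flagged = overfit_gap(train_accuracy=0.99, test_accuracy=0.85)
print(gap, flagged)
```

A gap check catches only one failure mode; confounding and site-specific effects require comparing performance across sites and settings, not just across data splits.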

  8. Impacts of ML on Healthcare Fairness and Health Disparities… • ML in healthcare could exacerbate health disparities or alleviate them. Depends on whether we pay attention! • Need to characterize the training and test datasets to assess for biases (need metrics) before and after cleaning • Experiment with the algorithm to identify biases and quirks • FDA can require these things!
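The "experiment with the algorithm to identify biases" step above often starts with a subgroup audit. A minimal, illustrative Python sketch (groups, labels, and predictions are hypothetical) that computes the error rate separately for each patient subgroup:

```python
# Illustrative sketch: audit a model's misclassification rate within
# each patient subgroup to surface group-specific failures.

def error_rate_by_group(groups, y_true, y_pred):
    """Return the misclassification rate within each subgroup."""
    totals, errors = {}, {}
    for g, t, p in zip(groups, y_true, y_pred):
        totals[g] = totals.get(g, 0) + 1
        errors[g] = errors.get(g, 0) + (t != p)
    return {g: errors[g] / totals[g] for g in totals}

# Hypothetical audit data: the model is perfect on group A but wrong
# half the time on group B.
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]
y_true = [1, 0, 1, 0, 1, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]
print(error_rate_by_group(groups, y_true, y_pred))
```

An aggregate accuracy number would hide exactly this kind of disparity, which is why stratified reporting is one of the things FDA could require.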

  9. Ethics and the FDA • Safety and efficacy determinations • Biases in the data on which ML is trained/tested/validated might affect the risks or effectiveness of the ML device, at least for some groups of people. • Indication (generalizability/external validity) • If the ML device is not extensively studied, we will not know enough about the people for whom it is indicated and the circumstances under which it is indicated. • LABELING! • Not standardized or centralized for medical devices • Empirical results show that health care providers do not understand a significant percentage of the labeling, although empirical data are limited • My own pilot interviews with users of diagnostic devices found big problems with labeling.

  10. Thank you!

  11. Which software is FDA regulated? • 21st Century Cures Act §3060(a)(1)(E), amending FD&C Act §520(o) (Food, Drug, and Cosmetic Act): Software that meets the statutory definition of a medical device but is nonetheless excluded from regulation: • Not intended to acquire, process, or analyze a medical image or a signal from an in vitro diagnostic device or a pattern or signal from a signal acquisition system • Intended for displaying, analyzing, or printing medical information... • Intended for supporting or providing recommendations to a health care professional about prevention, diagnosis, or treatment... • Intended to enable such health care professional to independently review the basis for such recommendation... so that it is not the intent that such health care professional rely primarily on any such recommendation... See Evans and Ossorio, The Challenge of Regulating Clinical Decision Support Software After 21st Century Cures, 44 Journal of Law, Medicine and Ethics 237 (2018). On the “black boxiness” of ML, see the work of W. Nicholson Price of U Michigan, and see video of talks by Rich Caruana, Friends Don’t Let Friends Deploy Black Box Models.