
Computational Aspects of Medical Imaging



Presentation Transcript


  1. Computational Aspects of Medical Imaging Christopher Stubbs Michael Brenner Al Despain Stan Flatte Robert Henderson Darrel Long John Tonry Peter Weinberger JASON Summer 2003

  2. JASON will undertake a study for the DOE and the NIH National Institute of Biomedical Imaging and Bioengineering (NIBIB) on the role of computation (broadly defined to include raw computational capabilities, mass storage needs, and connectivity) in medical imaging. This study will address the computational requirements in three general areas: the fusion of image data of varying modalities, over differing spatial and temporal scales and resolutions; the extraction and display of quantitative information, with associated uncertainties; and data archiving: raw vs. extracted parameters, metadata standards. JASON will assess the present status of computational, storage and connectivity needs for existing tools and techniques, and will project likely computational demands for the future. The imaging systems under consideration include both diagnostic and real-time clinical tools.

  3. Context • Bio-computing is a vast enterprise: protein folding, genetic databases, medical records... • Other committees have reviewed bio-computing: • Biomedical Information Science and Technology Initiative (BISTI) • Coalition for Advanced Scientific Computing (CASC) • President’s Information Technology Advisory Committee (PITAC) • Our focus was narrower, dealt only with biomedical imaging

  4. Findings • We were impressed with existing efforts! • Combination of applied mathematics, computational techniques, biological sciences • Computational demand (processing, storage) of generating, analyzing and displaying biomedical image data is within capabilities of current high-end systems (e.g. Linux clusters) • A caveat: this is a huge field, and there are counterexamples to our generalizations. However, there are also clear trends

  5. Relevant communities • Mathematicians • Physicists and Engineers • Biological Scientists • Computer Scientists and Computer Engineers • Clinical Physicians • You and me!

  6. So what is computationally hard? • Evolving from qualitative to quantitative analysis • Using common metrics for algorithm appraisal • Integrating across modalities and length scales • Evolving towards an accepted metadata standard • Database architectures that accommodate image data • Connectivity across federations of distributed data sets • Cultural issues – data access, open source...

  7. Briefers • Richard Leahy, Neuroimaging Research Group, USC • Christopher Johnson, Director, Scientific Computing and Imaging Institute, Univ. of Utah • Michael Miller, Director, Center for Imaging Science, Johns Hopkins Univ. • Mark Ellisman, Director, National Center for Microscopy and Imaging Research, UCSD • Larry Frank, Center for Functional MRI, UCSD • Michael Vannier, Chair, Dept. of Radiology, Univ. of Iowa • Richard Martino, Director, Division of Computational Bioscience, Center for Information Technology, NIH • Judith Niland, Director, Division of Information Sciences, City of Hope Hospital, LA

  8. Domains of Medical Imaging (figure: the domains of medical imaging, from molecular and intra-cellular through cellular, vascular, and organs to the whole body, arranged along a physical scale axis from 10⁻⁹ m to 1 m, with basic and clinical regimes indicated)

  9. Important Distinctions between Clinical Practice and Basic Science • Clinical practice: • $ driven -- throughput! • Commercial medical imaging systems • Basic research: • Can tolerate longer latencies • More interaction with acquisition hardware

  10. Diverse Biomedical Imaging Modalities • CT: Tomographic X-ray imaging • MRI, fMRI: Magnetic Resonance Imaging • PET: Positron Emission Tomography • Ultrasound • Microscopy • EEG, MEG: electro-encephalography, magneto-encephalography • ...

  11. Biomedical Image Analysis • From Bits to Pictures: from raw image data to 2-d or 3-d images • From Pictures to Numbers: identification and extraction of features • From Numbers to Knowledge: comparison with comparable cases, comparison with prior history • Archive both images and extracted parameters (the progression from pictures to numbers marks the shift from qualitative to quantitative analysis)

  12. From Bits to Images • The different imaging techniques require different image construction algorithms • CT scans use tomographic inversion • Positron imaging uses back-to-back photon detection geometry to reconstruct source location • EEG and MEG images typically use forward modeling • Raw data, and calibration parameters, are seldom stored. • This is not a solved problem – algorithmic work still needed!

  13. Converting from Raw Measurements to Images • CT scans require conversion from integral transmission measurements, taken as a function of projection angle, to voxels • Inverse problems such as these are often ill-posed (Swiss Federal Institute of Technology animation)
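
A rough illustration of this inverse problem, as a Python sketch using scikit-image's radon/iradon; the phantom and angle count are invented for illustration, and clinical reconstruction pipelines are far more elaborate:

    import numpy as np
    from skimage.transform import radon, iradon

    # Toy phantom: a 256x256 slice with two circular "structures".
    N = 256
    y, x = np.mgrid[0:N, 0:N]
    phantom = ((x - 100)**2 + (y - 120)**2 < 40**2).astype(float)
    phantom += 0.5 * ((x - 170)**2 + (y - 140)**2 < 25**2)

    # Forward model: integral transmission measurements at each projection angle.
    theta = np.linspace(0.0, 180.0, 180, endpoint=False)
    sinogram = radon(phantom, theta=theta)

    # Inverse step: filtered back-projection recovers the voxel values.
    # With few angles or noisy data the problem becomes ill-posed.
    reconstruction = iradon(sinogram, theta=theta)
    print("mean abs error:", np.abs(reconstruction - phantom).mean())

With 180 projection angles over a 256-pixel slice the reconstruction is good; cutting theta to a handful of angles shows the ill-posedness the slide refers to.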

  14. Forward Modeling Example: MEG • Goal is to measure electrical activity in the brain from exterior B-field measurements • Simple models assume a spherical head with a scalar conductivity field • More sophisticated models use MRI data to determine both brain geometry and the conductivity tensor • Exterior magnetic field measurements are matched to current sources within this conductive volume (image: www.uke.uni-hamburg.de/institute/physiologie/neuro_physiologie/images/meg_bem_icon.gif)
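
A bare-bones sketch of the matching step. It assumes the head model has already been reduced to a linear lead-field matrix mapping source currents to sensor fields; the matrix below is a random stand-in, not a physiological model, and real MEG inversion adds regularization and anatomical constraints:

    import numpy as np

    rng = np.random.default_rng(0)

    # Stand-in lead field: in practice it comes from a spherical or
    # MRI-derived head model and maps interior current sources to
    # exterior B-field sensors.
    n_sensors, n_sources = 64, 3
    L = rng.normal(size=(n_sensors, n_sources))

    # Simulated source currents and noisy exterior field measurements.
    q_true = np.array([2.0, -1.0, 0.5])
    b_measured = L @ q_true + 0.05 * rng.normal(size=n_sensors)

    # Match exterior measurements to interior sources by least squares.
    q_fit, *_ = np.linalg.lstsq(L, b_measured, rcond=None)
    print("recovered source amplitudes:", np.round(q_fit, 2))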

  15. How “hard” are the computational challenges in generating images? CPU cycles for image generation • Inverse problem: CAT scan of N integral transmission measurements into i × j × k voxels • Near-real-time turnaround from commercial scanners • Typical research image-processing challenges are in the domain of few-CPU clusters, for typical image acquisition rates. Typical research MRI imaging takes ~1 hr • Necessary CPU resources depend on tolerable latency and data rate, but for existing techniques this is not presently a limitation: images are analyzed essentially in real time. • Moore’s Law rules.

  16. How “hard” is data storage and transfer? • Typical images are at most 256³ voxels for 3-d • At 16 bits this corresponds to ~33 MBytes • Typical fMRI image archive is a few TBytes • Terabyte data sets are common today. 1 TB of RAID disk costs ~$4K. Cheap! • Bottleneck is in image transfer: at 100 KBytes/sec, a 33-MByte image takes > 5 minutes to transfer.
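
The arithmetic behind these figures, as a quick Python check:

    voxels = 256 ** 3            # 3-d image, 256 voxels per axis
    image_bytes = voxels * 2     # 16 bits = 2 bytes per voxel
    print(f"image size: {image_bytes / 1e6:.1f} MB")             # ~33.6 MB

    link_rate = 100e3            # 100 KBytes/sec
    print(f"transfer: {image_bytes / link_rate / 60:.1f} min")   # ~5.6 min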

  17. From Images to Numbers: Quantitative Analysis • Present clinical practice uses expert judgment to obtain a qualitative assessment, often as a narrative • The move to quantitative analysis has been slow • Present research focus assumes pathology is manifested as morphological change – identification of surfaces and boundaries, determination of volumes • Eventually, we’ll have clinical applications of “functional molecular imaging”

  18. Why Quantitative Image Analysis is Hard • Uncalibrated data • Geometrical registration • Things move over time • Automated feature recognition • Parameter fitting/extraction • Uncertainties • Quantitative comparison with relevant group or past history S. Koonin (really)
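

To give a flavor of the geometrical-registration item above, here is a toy phase-correlation estimator for the integer translation between two images; real medical registration must also handle rotation, scale, and non-rigid deformation:

    import numpy as np

    def phase_correlation_shift(a, b):
        """Estimate the circular translation taking image b to image a."""
        F = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
        F /= np.abs(F) + 1e-12                  # cross-power spectrum
        corr = np.fft.ifft2(F).real
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        # Peaks in the upper half wrap around to negative shifts.
        return tuple(p if p < s // 2 else p - s
                     for p, s in zip(peak, corr.shape))

    # Toy test: a Gaussian blob, then the same blob shifted by (5, -3).
    y, x = np.mgrid[0:128, 0:128]
    img = np.exp(-((x - 64)**2 + (y - 64)**2) / 200.0)
    moved = np.roll(np.roll(img, 5, axis=0), -3, axis=1)
    print(phase_correlation_shift(moved, img))  # expect (5, -3)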

  19. Mapping Macaque Brain Differences with Geometric Flow Faisal Beg, Johns Hopkins, from M. Miller talk

  20. Geometrical “Flows” to Map Changes (From M. Miller talk) Distortion computed by defining a continuous flow field
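

A much-simplified sketch of the core operation behind such flows: warping an image through a smooth displacement field with scipy. The swirl field here is prescribed for illustration, whereas diffeomorphic mapping solves for the flow that best matches two anatomies:

    import numpy as np
    from scipy.ndimage import map_coordinates

    # Toy image: a bright disk standing in for an anatomical structure.
    N = 128
    y, x = np.mgrid[0:N, 0:N].astype(float)
    img = ((x - 64)**2 + (y - 64)**2 < 30**2).astype(float)

    # Smooth, invented displacement field (a gentle swirl).
    cy, cx = y - 64, x - 64
    amp = 0.08 * np.exp(-(cx**2 + cy**2) / (2 * 40.0**2))
    dy, dx = amp * cx, -amp * cy

    # Pull-back interpolation: sample the image along the warped grid.
    warped = map_coordinates(img, [y + dy, x + dx], order=1)
    print("mass before/after warp:", img.sum(), round(warped.sum(), 1))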

  21. Image Subtraction (High-z Supernova Team)

  22. Ideally.... A full pipeline runs from raw data, with calibration info, through an image generation code (producing a reconstructed image), an image analysis code (producing a sectioned image), and a feature interpretation code (producing extracted parameters), all drawn from a code bank with version control.

Raw data example (detector values):

    43 55 234
    56 22 45
    564 5 54
    67 300 56
    89 340 89
    546 44 66
    345 506 12
    45 234 78
    34 349 45
    ....

Extracted parameters example:

    \tarsal 4 \length 44 \width 20
    \tibia \width 34.2 \density 12
    \fibia \width 22 \density 5
    \fracture \x 12 \y 22 \signif 30
    ...

Data-binding example:

    <xml id="mricat" src="mri_catalog.xml"></xml>
    <table border="1" datasrc="#mricat">
      <tr>
        <td><span datafld="PATIENT"></span></td>
        <td><span datafld="IMAGE"></span></td>
      </tr>
    </table>

(Image: http://radiology.bidmc.harvard.edu/prev/Headings/MedProfs/Research/paget_paper/cases/cases.html)

  23. From Numbers to Knowledge - Databases • Current clinical approach assumes pathology is manifested in morphology • Extraction of relevant image parameters is an ill-posed problem – morphological analysis of variable blobs • Feature parameters are only meaningful in the context of comparison populations and/or history • Need (image features + clinical information + history) • Drives a need for merged data structures • Now starting in the context of patient records • No ability to query across populations • Need to include calibration data and uncertainties also

  24. Data Management Issues • Database issues – schema, indexing, transaction model • Longevity • Confidentiality legalities • Resiliency/Redundancy • Interoperability and federation across databases • Metadata standards • Proprietary vs. Open standards • Data Access!

  25. Database Issues • We see database challenges as a major oncoming freight train • Current databases do a poor job of dealing with images as database objects • Present practice in other fields is to store a pointer to the image data within the database structure • Effectively exploiting image archives will require a queryable database that returns an image stack

  26. The Database Opportunity • Merge image archives with databases that contain full data pedigree: • Image header information • Analysis pipeline version data and parameters • Calibration information • Extracted image feature parameters • Clinical presentation and annotations • Effective user interface

  27. Some Goals • SQL-compatible database of extracted feature parameters, containing pointers to images: “Find me the set of images that contain identified features like this one..., with the following additional criteria” • Pipelined on-the-fly execution of a new image analysis algorithm on a query-selected subset of images in the archive: “I have a new way to identify tumors, and want to run it on all archived lung images” • Image-based query tools: “Return all images that have things that look like this....” A minimal sketch of the first goal follows below.
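
A minimal sketch of the first goal, using Python's built-in sqlite3. Extracted feature parameters live in SQL tables, and each row carries a pointer (a file path) to the image stack rather than the pixels themselves; the schema, values, and paths are invented for illustration:

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("""CREATE TABLE features (
        patient_id  TEXT,
        image_path  TEXT,    -- pointer to the image stack, not the pixels
        feature     TEXT,
        volume_mm3  REAL,
        uncertainty REAL)""")

    # Illustrative rows: extracted parameters go in the table.
    con.executemany("INSERT INTO features VALUES (?, ?, ?, ?, ?)", [
        ("p001", "/archive/p001/lung_ct.nii", "nodule", 310.0, 25.0),
        ("p002", "/archive/p002/lung_ct.nii", "nodule", 145.0, 18.0),
        ("p003", "/archive/p003/lung_ct.nii", "cyst",   520.0, 40.0),
    ])

    # "Find me the set of images that contain identified features like
    # this one": the query returns pointers to the matching stacks.
    rows = con.execute("""SELECT patient_id, image_path FROM features
                          WHERE feature = 'nodule' AND volume_mm3 > 200""")
    for patient, path in rows:
        print(patient, "->", path)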

  28. An Example of a Federated Distributed System: Biomedical Informatics Research Network (BIRN) • Ambitious program attempting to link distributed data sets across disparate length scales • Appears to us that the PIs are well versed in contemporary computer technology • Links across agencies and disciplines

  29. (M. Ellisman slide) Data-flow diagram: data acquisition, 3D or 4D data refinement, data reduction, and database deposition, spanning scales from molecules through macromolecular complexes, organelles and cells, to the whole organism, with links to genome DB’s.

  30. BIRN Project Objectives (from M. Ellisman talk) • Establish a stable, high-performance network linking key Biotechnology Centers and Clinical Research Centers • Establish distributed and linked data collections with partnering groups • Facilitate the use of computational GRID infrastructure and integrate BIRN with other middleware projects • Enable data mining from multiple data collections or databases on neuroimaging and bioinformatics • Build a stable software and hardware infrastructure that will allow centers to coordinate efforts to accumulate larger studies than can be carried out at one site. BIRN “Test-Beds” have very clear technical and scientific goals!

  31. Data Access Issues • Biomedical imaging culture does not have a heritage of sharing code or images • This is in contrast to other communities, even within the life sciences (genetic data, for example) • A thoughtful examination of NASA data management, with lessons learned, is available at http://adc.gsfc.nasa.gov/~gass/linsky/linsky.html

  32. NIH Data Access Policy (Feb 2003) “Starting with the October 1, 2003 receipt date, investigators submitting an NIH application seeking $500,000 or more in direct costs in any single year are expected to include a plan for data sharing or state why data sharing is not possible.” “Reviewers will not factor the proposed data-sharing plan into the determination of scientific merit or priority score. Program staff will be responsible for overseeing the data sharing policy and for assessing the appropriateness and adequacy of the proposed data-sharing plan.” (http://grants1.nih.gov/grants/guide/notice-files/NOT-OD-03-032.html)

  33. “Sharing Images”, Vannier and Summers, Radiology, July 2003

  34. A Typical Biomedical Imaging Paper (figure: “Random image, before”, “My clever algorithm”, “Random image, after”) • No common set of test images • Source code usually unavailable • No repository of before and after images • Few quantitative comparison metrics

  35. Examples of Image Databases From Vannier et al, Proceedings of the IEEE, in press

  36. A Middle Ground: Computer-Assisted Qualitative Analysis • Interactive visualization tools • Feature identification and highlighting • Identification of relevant comparison images

  37. Interactive Visualization Example Courtesy of Chris Johnson, University of Utah

  38. How “hard” is uncertainty visualization? • Heritage of X-ray films is a strong influence • Interactive displays are useful; anecdotal evidence of value • There is a research community working on multi-dimensional visualization, including uncertainties • We think the impediments to incorporating uncertainties are twofold • Intrinsic: the imaging community seldom uses a statistical/likelihood formulation • Cultural: physicians are not accustomed to this
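
One simple way to put a reconstruction and its per-voxel uncertainty in front of a viewer, side by side with matplotlib; both maps below are synthetic stand-ins for outputs of a likelihood-based reconstruction:

    import numpy as np
    import matplotlib.pyplot as plt

    # Synthetic reconstruction and per-voxel uncertainty (e.g. a
    # posterior standard deviation from a statistical formulation).
    y, x = np.mgrid[0:128, 0:128]
    recon = np.exp(-((x - 64)**2 + (y - 64)**2) / 800.0)
    sigma = 0.05 + 0.2 * np.exp(-((x - 90)**2 + (y - 40)**2) / 400.0)

    fig, axes = plt.subplots(1, 2, figsize=(8, 4))
    axes[0].imshow(recon, cmap="gray")
    axes[0].set_title("reconstruction")
    im = axes[1].imshow(sigma, cmap="magma")
    axes[1].set_title("per-voxel uncertainty")
    fig.colorbar(im, ax=axes[1], shrink=0.8)
    plt.show()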

  39. A summary: present day and 5-year outlook

  40. An Alternative Approach....? • Our perspective thus far has been to extend the current state of the art. • An alternative approach would be to use modeling & computational power to do what radiologists do, namely impose an informed filter on the imaging data – using “laws” of physiology and prior experience, they collapse from many degrees of freedom to only a few. • Can this be accomplished using physical laws (mechanics, elastic properties, anatomic constraints...)? • Full functional models at imaging resolution? • This is a major computational task, for each image.

  41. Recommendation #1: Implement the BISTI recommendations • The Biomedical Information Science and Technology Initiative (BISTI) report recommended investment to provide access to a hierarchy of computer resources. • Availability of state-of-the-art computing platforms is essential to the success of the biomedical imaging community. • The only way to benefit from Moore’s Law is to periodically buy new hardware...

  42. Recommendation #2: Calibrate! • The tradition of qualitative analysis of images has not required procedures that compensate for distortion and other instrumental systematics • Calibration of each apparatus, e.g. each MRI system, would allow for real registration of images (figure: a regular lattice phantom with known and stable spacing inside an MRI scanner)
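
A sketch of what a lattice phantom buys you: given known true grid positions and their measured (distorted) counterparts, a smooth correction map can be fit by least squares. The quadratic distortion model and noise level below are invented for illustration:

    import numpy as np

    rng = np.random.default_rng(2)

    # Known lattice positions (the phantom) and their measured,
    # distorted counterparts as reported by the scanner.
    gy, gx = np.mgrid[0:9, 0:9] * 20.0
    true_pts = np.column_stack([gx.ravel(), gy.ravel()])
    meas_pts = true_pts + np.column_stack([
        1e-3 * true_pts[:, 0] * true_pts[:, 1],   # invented distortion
        5e-4 * true_pts[:, 1]**2,
    ]) + rng.normal(scale=0.05, size=true_pts.shape)

    # Fit a quadratic polynomial map measured -> true by least squares.
    def design(p):
        x, y = p[:, 0], p[:, 1]
        return np.column_stack([np.ones_like(x), x, y, x*y, x**2, y**2])

    coeffs, *_ = np.linalg.lstsq(design(meas_pts), true_pts, rcond=None)
    corrected = design(meas_pts) @ coeffs

    rms = lambda d: float(np.sqrt((d**2).mean()))
    print("rms error before:", rms(meas_pts - true_pts))
    print("rms error after :", rms(corrected - true_pts))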

  43. Calibration Data not Typically Bundled with Images Typical geometrical distortion from clinical MRI scanner Tradition of qualitative analysis has hindered quantitative calibration

  44. Calibrations Can Aid in Fusing Imaging Data Registration of images can use feature recognition, or actual geometrical coordinates Incorporating calibration data into image structure allows post-processing www.egi.com

  45. Recommendation #3: Develop and distribute an open archive of standard test images. • The typical medical image analysis paper uses a proprietary data set and a qualitative before/after comparison • NIBIB could promote apples-to-apples comparison with a “Bio-Lena” data set, both raw and image data • New algorithms would use these test data, along with other images... • Ideally, results would be uploaded to an archive for community access

  46. Standardized Test Image Sets • Use this as a testbed for metadata standards • National Library of Medicine Visible Man is a good starting point • Include images from all modalities and model systems • Eventually, link to genetic data (Mouse Atlas, Richard Leahy)

  47. Recommendation #4: Cultivate an “open source” approach to data sets and code • Considerable spectrum across different scientific disciplines in their approach to proprietary vs. open access data sets • NASA limits proprietary access to 12 months • Gene sequences placed in archive upon publication • Some federally funded projects have immediate release of all data • Evolve to curating a CVS repository for both images and codes?

  48. An NIH Example of Open Source Development • Analysis of Functional NeuroImages (AFNI) • Robert Cox, NIMH • Modular • Open Source • Extensible

  49. Interactive Analysis with AFNI (from L. Frank) Control Panel Displaying images from time series Graphing voxel time series data
