
Core 1b – Engineering Computational Platform


Presentation Transcript


  1. Core 1b – Engineering Computational Platform. Jim Miller, GE Research

  2. Core 1b – Engineering: 5 Aims / 5 Platforms
  • Architecture – tools, operating paradigms, reporting mechanisms, integration points
  • End-user platform – interactive methods and information visualization for longitudinal analysis, exploratory data analysis, and translational research
  • Computational platform – stream processing, cloud computing, statistical analysis, informatics, machine learning
  • Data management – non-imaging and derived data, DICOM and cloud services
  • Software engineering and software quality – navigable timeline for revision control, build, test, documentation and release

  3. Computational platform
  The objective of the Engineering component of Core 1 is to provide software tools and software development processes that deploy innovative technology to clinical researchers, support the scientific algorithm innovation of the algorithm scientists, and foster a community that produces high-quality software.
  • Computing architecture: desktop, grid computing, stream computing (GPGPU), cloud computing
  • DBP needs: pathology segmentation, longitudinal analysis, patient-specific analysis, population analysis
  • Algorithm needs: exploratory methods, longitudinal frameworks, interactive frameworks, statistics, machine learning
  • Analysis platform: feature libraries, multivariate clustering, machine learning, statistical inference, regression analysis, information visualization, informatics

  4. ITKv4
  ARRA-funded effort from the National Library of Medicine
  • Kickoff 6/2010
  • v4.0 released 12/2011
  • v4.0.1 (SPIE)
  • v4.2 June 2012

  5. ITKv4 Team
  ITKv4 Team
  • GE Research
  • Kitware Inc.
  • University of Pennsylvania
  • Harvard University
  • University of Iowa
  • CoSMo Software
  • Mayo Clinic
  • University of Utah
  ITKv4 A2D2 Team
  • Georgetown University
  • University of Utah
  • University of North Carolina Chapel Hill
  • The Ohio State University
  • Carnegie Mellon University
  • Harvard University
  • William and Mary
  • Old Dominion University
  • Kitware Inc.

  6. ITKv4 Overview
  • Apache 2.0 License
  • “Patented” directory removed
  • New software process: Git, Gerrit, JIRA, CDash@home, testing data
  • Deprecated compilers: Visual Studio 6, Visual Studio 7, Borland 5.5, SUN CC < 5.9, SGI CC, MWERKS, Cygwin, GCC < 3.4
  • 64-bit improvements

  7. ITKv4 - Modularization
  • ITKv3 – monolithic organization
  • ITKv4 – modular organization

  8. ITKv4 – Level sets
  • ITKv3 provides several flavors of level sets
  • Implemented in a general finite difference framework
  • Each level set equation encapsulated in a separate filter
  • ITKv4 provides a more general framework
  • Level set equation can be constructed term by term: advection terms, propagation terms, region terms, regularization terms
  • Multi-material
  • Level set stopping criteria
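
  For contrast with the new term-by-term construction, the v3-style approach (the whole equation baked into one filter) is what SimpleITK exposes today; a minimal Python sketch, assuming SimpleITK is installed, with placeholder file names and seed coordinates:

    # ITKv3-style level set: advection, propagation, and curvature terms
    # all live inside a single filter (here a geodesic active contour).
    import SimpleITK as sitk

    image = sitk.ReadImage("input.nii.gz", sitk.sitkFloat32)   # hypothetical input

    # Feature (speed) image: small where edges are strong
    feature = sitk.GradientMagnitudeRecursiveGaussian(image, sigma=1.0)
    feature = sitk.Sigmoid(feature, alpha=-1.0, beta=10.0,
                           outputMinimum=0.0, outputMaximum=1.0)

    # Initial level set: signed distance map around a hypothetical seed voxel
    seed = sitk.Image(image.GetSize(), sitk.sitkUInt8)
    seed.CopyInformation(image)
    seed[64, 64, 32] = 1
    init = sitk.SignedMaurerDistanceMap(seed, insideIsPositive=False,
                                        useImageSpacing=True)

    # One filter holds the whole equation; ITKv4 instead lets terms be added separately
    gac = sitk.GeodesicActiveContourLevelSetImageFilter()
    gac.SetPropagationScaling(1.0)
    gac.SetCurvatureScaling(0.5)
    gac.SetAdvectionScaling(1.0)
    gac.SetNumberOfIterations(500)
    contour = gac.Execute(init, feature)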

  9. ITKv4 - Registration
  • Unbiased registration support
  • Composite transforms
  • Multithreaded metrics
  • New and updated metrics: neighborhood correlation, mutual information, point set metrics, tensor metrics, vector metrics
  • New and updated transforms: displacement field, BSpline, poly-affine, diffeomorphic
  • New optimizers, efficient for high-dimensional transforms
  • Automated parameter initialization
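
  The v4 registration framework is what SimpleITK's ImageRegistrationMethod is built on; a minimal rigid-registration sketch in Python, assuming SimpleITK and two placeholder single-modality volumes, using one of the metrics listed above (Mattes mutual information):

    # Minimal rigid registration using the v4 framework via SimpleITK.
    import SimpleITK as sitk

    fixed = sitk.ReadImage("fixed.nii.gz", sitk.sitkFloat32)    # hypothetical inputs
    moving = sitk.ReadImage("moving.nii.gz", sitk.sitkFloat32)

    reg = sitk.ImageRegistrationMethod()
    reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
    reg.SetMetricSamplingStrategy(reg.RANDOM)
    reg.SetMetricSamplingPercentage(0.1)
    reg.SetInterpolator(sitk.sitkLinear)
    reg.SetOptimizerAsGradientDescent(learningRate=1.0, numberOfIterations=200)
    reg.SetOptimizerScalesFromPhysicalShift()        # automated parameter scaling

    # Automated initialization: align centers, then optimize the rigid transform
    initial = sitk.CenteredTransformInitializer(
        fixed, moving, sitk.Euler3DTransform(),
        sitk.CenteredTransformInitializerFilter.GEOMETRY)
    reg.SetInitialTransform(initial, inPlace=False)

    final_transform = reg.Execute(fixed, moving)
    resampled = sitk.Resample(moving, fixed, final_transform,
                              sitk.sitkLinear, 0.0, moving.GetPixelID())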

  10. ITKv4 - FEM
  • Broader FEM support (not just for registration)
  • ITK conformant
  • Memory and reference management improvements
  • SpatialObjects for IO
  • Removed other FEM IO mechanisms
  • FEMObject constructs the problem instead of the solver
  More details at http://www.itk.org/Wiki/Refactoring_itk::FEM_framework_-_V4

  11. ITKv4 – GPU (SPIE release or v4.1)
  • Architecture for ITK filters to utilize GPUs
  • Lazy synchronization of data between CPU and GPU
  • Factory mechanism
  • Samples of various “classes” of algorithms: BinaryThreshold, Mean, DiscreteGaussian, AnisotropicDiffusion, DemonsRegistration
  [Diagram: a filter pipeline alternating between CPU and GPU filters, with CPU and GPU buffers synchronized at each transition]

  12. ITKv4 - SimpleITK
  • On-ramp to ITK
  • Templateless layer for C++
  • Function and Object paradigms, but no Pipeline paradigm
  • Wrapping: Python, Java, C#, Tcl, Lua, Ruby, R
  • Access to most ITK algorithms
  • Simple image viewing
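
  A minimal Python sketch of the Function and Object paradigms mentioned above (file names are placeholders; sitk.Show needs an external viewer such as ImageJ/Fiji):

    # SimpleITK's two calling styles: procedural functions vs. filter objects.
    import SimpleITK as sitk

    image = sitk.ReadImage("input.nii.gz")          # hypothetical input file

    # Function paradigm: one call, no pipeline to wire up
    smoothed = sitk.DiscreteGaussian(image, variance=2.0)

    # Object paradigm: configure a filter object, then execute it
    gauss = sitk.DiscreteGaussianImageFilter()
    gauss.SetVariance(2.0)
    smoothed2 = gauss.Execute(image)

    sitk.WriteImage(smoothed, "smoothed.nii.gz")
    sitk.Show(smoothed)                              # simple image viewing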

  13. ITKv4 - Video
  • Cameras, ultrasound, fluoro, OCT, …
  • Video bridge – OpenCV, VXL
  • Video grabber
  • Ring buffer
  • Video pipelines
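
  The video bridge itself is a C++ facility; as an illustrative stand-in (not the ITK video classes), the sketch below pulls frames from an OpenCV capture and converts them to SimpleITK images so an ordinary image pipeline can consume them. The file name is a placeholder.

    # Illustrative OpenCV-to-ITK frame bridge in Python.
    import cv2
    import SimpleITK as sitk

    cap = cv2.VideoCapture("ultrasound_clip.avi")    # hypothetical clip or camera index
    frames = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        frames.append(sitk.GetImageFromArray(gray))  # per-frame ITK-style image
    cap.release()

    # Feed each frame into an ordinary image pipeline, e.g. smoothing
    smoothed = [sitk.DiscreteGaussian(f, variance=1.0) for f in frames]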

  14. ITKv4 - Microscopy
  • Point spread function models
  • Deconvolution
  • Denoising
  • Color correction
  • Colocalization
  • IO
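
  A hedged Python sketch of the deconvolution piece, assuming SimpleITK exposes the ITK Richardson-Lucy deconvolution filter under the name used below, with a simple Gaussian point-spread-function model; file names and kernel parameters are placeholders.

    # Illustrative deconvolution with a Gaussian PSF model via SimpleITK.
    import SimpleITK as sitk

    blurred = sitk.ReadImage("widefield_stack.tif", sitk.sitkFloat32)  # hypothetical input

    # PSF model: a small 3D Gaussian kernel image, normalized to unit sum
    psf = sitk.GaussianSource(sitk.sitkFloat32, size=[15, 15, 15],
                              sigma=[2.0, 2.0, 4.0], mean=[7.0, 7.0, 7.0])
    psf = psf / float(sitk.GetArrayFromImage(psf).sum())

    rl = sitk.RichardsonLucyDeconvolutionImageFilter()   # assumed wrapper name
    rl.SetNumberOfIterations(30)
    restored = rl.Execute(blurred, psf)

    sitk.WriteImage(restored, "deconvolved.tif")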

  15. Nipype
  • Python scriptable engine
  • Local and distributed execution
  • Semantically uniform access to analysis tools: SPM, FSL, FreeSurfer, AFNI, BRAINS, Slicer, Camino
  • Example: the Huntington’s Disease DBP ties all their tools together through Nipype, with Nipype serving as the documentation vehicle for the processing chain
  • Slicer Execution Model (Command Line Modules) accessible through Nipype
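
  A minimal Nipype sketch, assuming Nipype with the FSL interfaces installed; the nodes and file names are illustrative, not the Huntington’s Disease DBP chain:

    # Two tools wrapped behind Nipype's uniform Node/Workflow interface.
    # (Actually running this requires FSL on the path.)
    import nipype.pipeline.engine as pe
    from nipype.interfaces import fsl

    # Each Node wraps one tool; inputs and outputs are named traits regardless of the tool
    skullstrip = pe.Node(fsl.BET(frac=0.5, mask=True), name="skullstrip")
    smooth = pe.Node(fsl.IsotropicSmooth(fwhm=4.0), name="smooth")

    wf = pe.Workflow(name="preproc", base_dir="work")
    wf.connect(skullstrip, "out_file", smooth, "in_file")

    skullstrip.inputs.in_file = "sub01_T1w.nii.gz"   # hypothetical input
    # run() executes locally; plugins (e.g. "MultiProc", "SGE") give distributed execution
    wf.run()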

  16. Feature libraries
  • Enable exploratory methods and learning-based methods
  • ITK filter library
  • Gabor features
  • Haralick features
  • Entropy features
  • Polynomial fits to time-varying data
  • Histogram features
  [Figure: example Haralick, entropy, and histogram feature images]
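
  An illustrative Python sketch of two of these feature types (local entropy over a sliding neighborhood, and whole-image histogram statistics) using NumPy/SciPy; the ITK filter library referenced on the slide provides its own implementations.

    # Illustrative entropy and histogram features (not the ITK filter library itself).
    import numpy as np
    from scipy.ndimage import generic_filter

    def local_entropy(window, bins=16, value_range=(0.0, 1.0)):
        """Shannon entropy of the intensity histogram in one neighborhood."""
        hist, _ = np.histogram(window, bins=bins, range=value_range)
        p = hist / hist.sum()
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    image = np.random.rand(128, 128).astype(np.float32)   # stand-in for a real image

    # Entropy feature image: entropy of each 7x7 neighborhood
    entropy_map = generic_filter(image, local_entropy, size=7)

    # Histogram features over the whole image (could also be computed per region)
    hist, edges = np.histogram(image, bins=32, range=(0.0, 1.0))
    hist_features = {
        "mean": image.mean(),
        "variance": image.var(),
        "skewness": ((image - image.mean()) ** 3).mean() / image.std() ** 3,
    }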

  17. Core 1b – Engineering Computational Platform
  • Architecture – tools, operating paradigms, reporting mechanisms, integration points
  • End-user platform – interactive methods and information visualization for longitudinal analysis, exploratory data analysis, and translational research
  • Computational platform – stream processing, cloud computing, statistical analysis, informatics, machine learning
  • Data management – non-imaging and derived data, DICOM and cloud services
  • Software engineering and software quality – navigable timeline for revision control, build, test, documentation and release
