OIA Key Attributes DRAFT


  1. OIA Key Attributes DRAFT, March 16, 2011

  2. Contributors
  • Michael Ackerman, NLM
  • Rick Avila, Kitware
  • Andy Buckler, Buckler Biomedical
  • Terry Yoo, NLM
  • David Clunie, Core Lab Partners
  • James Luo, NIBIB
  • Tony Reeves, Cornell University
  • Daniel Rubin, Stanford University

  3. Key Attributes
  • Contribution Support
   • Quality of the data curation process
   • Speed to post datasets
   • Support for imaging data types and metadata
  • User Support
   • Robust querying and ease of performing a download
   • Advanced computing services
  • General
   • Long-term integrity and support

  4. Contribution Support: Quality of the data curation process
  • Anonymization support
   • Validate/verify that de-identification was successful (see the sketch after this slide)
   • Example: BIRN DUP application that de-identifies
  • Metadata preparation tools
   • Tools for efficient capture and organization of metadata
   • Utilization of common nomenclature
   • Example: OSA ISP metadata tool; ontologies: BRIDG, imaging biomarker ontology, AIM, …
   • NLM: numerous ontologies are being developed; this must be considered carefully
  • Revision control
   • Apply revision-control concepts to data elements
   • Examples: Commercial institutions do this routinely; EHRs; NBIA may have this capability (Eliot Siegel)
  • Capturing provenance
   • Capturing important information on the acquisition process is needed
   • Example: Perhaps "data papers" will help
  • There are also goal-specific requirements
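A minimal sketch of the validate/verify step, assuming Python with pydicom. The tag list and file name are illustrative only; a production check would follow a full de-identification profile such as DICOM PS3.15 Annex E, or use a dedicated tool like the BIRN DUP application named above.

```python
# Sketch: flag residual PHI in a supposedly de-identified DICOM file.
# The keyword list below is a small illustrative subset, not a profile.
import pydicom

PHI_KEYWORDS = [
    "PatientName", "PatientID", "PatientBirthDate",
    "ReferringPhysicianName", "InstitutionName",
]

def residual_phi(path):
    """Return (keyword, value) pairs still populated after de-identification."""
    ds = pydicom.dcmread(path, stop_before_pixels=True)
    return [(kw, str(ds.get(kw))) for kw in PHI_KEYWORDS if ds.get(kw)]

leaks = residual_phi("deidentified_study.dcm")  # hypothetical file name
print("PASS" if not leaks else f"FAIL, residual PHI: {leaks}")
```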

  5. Contribution Support: Speed to post datasets
  • Avoid limits on data upload size and speed
  • Protocols to load the data
   • FTP, SCP
  • Re-run de-identification using automated methods
   • Retain certain fields for potential future purposes
  • Automated methods to check that the data complies with expectations (see the sketch after this slide)
   • Example: PET SUV calculation needs patient weight and height
   • The goal is to obtain high-quality data, but we would not throw away non-conforming data
   • Some expectations we may know in advance, others not
  • Organization
   • David: Try to be agnostic on data organization
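A minimal sketch of such an expectation check, assuming Python with pydicom. Per the slide, a PET upload (DICOM modality "PT") that lacks the fields needed for SUV computation is warned about rather than rejected; the file name is hypothetical.

```python
# Sketch: warn (do not reject) when a PET file lacks the fields SUV needs.
import pydicom

SUV_FIELDS = ["PatientWeight", "PatientSize"]  # DICOM stores kg and metres

def missing_suv_fields(path):
    ds = pydicom.dcmread(path, stop_before_pixels=True)
    if ds.get("Modality") != "PT":
        return []  # this expectation applies to PET only
    return [kw for kw in SUV_FIELDS if not ds.get(kw)]

missing = missing_suv_fields("pet_0001.dcm")  # hypothetical file name
if missing:
    print("Accepted with warnings; SUV fields missing:", missing)
```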

  6. Data Upload Attributes (continued)
  • DICOM conformance checks
   • Automated methods are preferred (see the sketch after this slide)
   • ADNI is performing automated QA
  • Metadata expectations
   • Utilize a standard information model
   • Example: Use AVT to …
  • Definitions: ontology vs. information model
   • Ontology: a standard terminology
   • Information model: the syntax for making statements (DICOM Structured Reporting; NBIA has a proprietary XML format)
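A minimal sketch of an automated conformance check, assuming Python with pydicom. Real validators (for example dciodvfy from dicom3tools) check the full IOD definition for a SOP class; the required-attribute list here is a tiny illustrative subset, and the file name is hypothetical.

```python
# Sketch: reject uploads whose required (Type 1) identifiers are absent or empty.
import pydicom

REQUIRED = ["SOPClassUID", "SOPInstanceUID", "StudyInstanceUID",
            "SeriesInstanceUID", "Modality"]

def conformance_errors(path):
    ds = pydicom.dcmread(path, stop_before_pixels=True)
    return [kw for kw in REQUIRED if not ds.get(kw)]

errors = conformance_errors("upload.dcm")  # hypothetical file name
print("conformant" if not errors else f"missing required attributes: {errors}")
```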

  7. User Support
  • Query capabilities (see the sketch after this list)
   • More generic than web-page queries
   • More sophisticated query methods will drive database design
   • Outside applications can access the archive, perform queries, and get a response using a service model
   • Flexibility to support a range of use cases
   • Support both plain-text search and structured queries
   • One day, support content-based retrieval
   • If we were to support data papers, there would be additional content and terms we could use
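A minimal sketch of the service-model idea, using only the Python standard library: one JSON endpoint that accepts either a plain-text search (?q=...) or a structured field/value query (?modality=CT). The endpoint, field names, and in-memory index are hypothetical stand-ins for the archive's real database.

```python
# Sketch: a query service that outside applications can call programmatically.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

DATASETS = [  # hypothetical stand-in for the archive's real index
    {"id": "ds-001", "modality": "CT", "body_part": "LUNG",
     "text": "lung screening cohort"},
    {"id": "ds-002", "modality": "PT", "body_part": "BRAIN",
     "text": "FDG PET brain atlas"},
]

class QueryHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        qs = parse_qs(urlparse(self.path).query)
        if "q" in qs:  # plain-text search
            term = qs["q"][0].lower()
            hits = [d for d in DATASETS if term in d["text"].lower()]
        else:          # structured query, e.g. ?modality=CT&body_part=LUNG
            hits = [d for d in DATASETS
                    if all(d.get(k) == v[0] for k, v in qs.items())]
        body = json.dumps(hits).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), QueryHandler).serve_forever()
```

An outside application would then issue, for example, GET /datasets?modality=PT and parse the JSON response, rather than scraping a web page.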
