
Product Quality and Documentation – Recent Developments


Presentation Transcript


  1. Product Quality and Documentation – Recent Developments Summer ESIP Meeting July 8, 2014 H. K. Ramapriyan Assistant Project Manager ESDIS Project, Code 423, NASA GSFC Rama.Ramapriyan@nasa.gov

  2. Motivation
  • Scientists (providers/dataset producers) are motivated to provide high-quality products and have a stake in ensuring that their data are not misused
  • Users need to know the quality of the data they use
  • There are many ways to express quality, which makes it difficult for both providers and users
  • Data Centers are intermediaries: they need to simplify providers’ job of supplying quality information and to express that information conveniently for users to access and understand

  3. Background
  • QA4EO Guidelines (2010)
  • NASA “Making Earth System Data Records (ESDRs) for Use in Research Environments (MEaSUREs)” Product Quality Checklist (2012)
  • NOAA Climate Data Records (CDR) Maturity Matrix (Bates and Privette, 2012)
  • Improving Data Quality Information for NASA Earth Observation Data (Lynnes et al., 2012)
  • Obs4MIPs – Coupled Model Intercomparison Project (CMIP5) (2012)
  • Committee on Earth Observing Satellites (CEOS) Essential Climate Variables (ECV) inventory questions (2012)
  • National Center for Atmospheric Research (NCAR) Community Contributions Pages (2013)
  • CEOS Working Group on Information Systems and Services (WGISS) Metadata Quality Exploration Questionnaire (2013)
  • Global Earth Observation System of Systems (GEOSS) Data Quality Guidelines (2013)
  • EU FP7 project CORE-CLIMAX assessment of European capacity to produce ECV climate data records from satellite, in situ and reanalysis data – NOAA maturity matrix extended/revised (2013)
  • ISO 19157 – Metadata Standard for Geographic Information Data Quality (2013)

  4. Product Quality Checklist
  • Result of about two years of discussion in the Metrics Planning and Reporting Working Group – MEaSUREs PIs and DAACs represented
  • Distinction between “Scientific Data Quality” and “Product Quality”
  • Two separate checklists created – one for PIs and another for DAACs to fill out
  • Recommendation made to HQ and approved
  • Adopted and used for MEaSUREs 2006 projects
  • Included in Cooperative Agreements for MEaSUREs 2012 projects

  5. Product Quality Checklist – PIs

  6. Product Quality Checklist – DAACs

  7. NCAR Climate Data Guide – Community Contributions Pages
  • What are the key strengths of this data set?
  • What are the key limitations of this data set?
  • What are the typical research applications of these data? What are examples from your work?
  • What are some common mistakes that users encounter when processing or interpreting these data?
  • What are the likely spurious (non-climatic) features, if any, of time series derived from these data?
  • What corrections were applied to account for changes in observing systems, sampling methods or density, and satellite drift or degradation?
  • Describe any conversion steps that are necessary, or general strategies to compare these data with model output.
  • What are some comparable data sets, if any? Why use this data set instead of another?
  • How is uncertainty characterized in these data?
  • Provide a summary statement about these data and their utility for climate research and model evaluation.

  8. CEOS WGISS Metadata Quality Exploration Questionnaire
  • Why did you choose this dataset for the survey?
  • How does your organization define “fitness for purpose” for this dataset?
  • What quality measures do you use to assess scientific quality?
  • How are quality measures created initially?
  • How do you store quality measures in your metadata?
  • Are uncertainties estimated and documented, including in their spatial or temporal dimension?
  • Have the data been validated, i.e. “assessed for uncertainties”, to the extent possible by comparison with alternative measurements?
  • Have the algorithm or analysis method, product description and product evaluation results been published in peer-reviewed literature?
  • Are the data evaluated by external users? If so, how are the comments from external users captured?
  • Any other relevant comments regarding quality metadata.
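A recurring question in the checklists and questionnaires above is how quality measures are actually stored with a product's metadata. Below is a minimal Python sketch of one hypothetical way a data producer might bundle quality and uncertainty information into a small, machine-readable record that a data center could harvest; every field name and value here (dataset_id, fitness_for_purpose, uncertainty, and so on) is an illustrative assumption, not a schema taken from the presentation or from ISO 19157.

# Minimal sketch: a hypothetical quality/uncertainty record shipped alongside a
# product. Field names and values are illustrative only, not an official schema.
import json

quality_record = {
    # Hypothetical product identifier and intended-use statement
    "dataset_id": "EXAMPLE_PRODUCT_V1",
    "fitness_for_purpose": "example statement of intended applications",
    # How the product was validated (assessed for uncertainties)
    "validation": {
        "method": "comparison with independent in situ measurements (illustrative)",
        "peer_reviewed_reference": "placeholder: cite the evaluation paper here",
    },
    # Quantitative uncertainty estimate, including how it varies in space/time
    "uncertainty": {
        "measure": "one-sigma standard error (illustrative)",
        "value": 0.4,
        "units": "example units",
        "varies_with": ["latitude", "season"],
    },
    # Known limitations or likely spurious features users should be aware of
    "known_limitations": [
        "example limitation 1",
        "example limitation 2",
    ],
}

# Serialize the record so it can travel with the product files or be
# harvested into a data center's metadata catalog.
with open("quality_metadata.json", "w") as f:
    json.dump(quality_record, f, indent=2)

A record along these lines could be attached to product files or ingested into a catalog; the point is only that the questionnaire items map naturally onto a small structured document that both providers and users can read.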
