
Metrics Planning Group (MPG) Report to Plenary






Presentation Transcript


  1. Metrics Planning Group (MPG) Report to Plenary. Clyde Brown, ESDSWG, Nov 3, 2011

  2. MPG Focus on Product Data Quality and Citation Metrics

  3. Product Quality Metrics
  • Overall Objective: Given that the objective of the MEaSUREs program is to produce and deliver readily usable Earth Science Data Records of high quality:
    • Define program-level metric(s) that permit assessment of the steps taken and progress made to ensure that high-quality products are provided by MEaSUREs projects and by the MEaSUREs program as a whole.
    • Develop a draft recommendation for Product Quality Metric(s) that would then go through the regular MPARWG review process.
  • Recommendation from the October 2010 ESDSWG meeting:
    • Develop a checklist of a small number of questions that represent progress in the product quality area (a sketch of one possible machine-readable form of such a checklist follows below).
  • We considered product quality to be a combination of scientific quality, the completeness of associated documentation and ancillary information, and the effectiveness of supporting services.
  • The responsibility for product quality is shared between the projects generating the ESDRs and the DAACs that eventually archive and distribute them.
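
The checklist referenced above (and shown in draft form on slides 5 and 6) is circulated as a commentable text document such as Word. Purely as an illustration, here is a minimal sketch of how one project's responses might also be captured in machine-readable form ahead of the roll-up step; the question text, the Yes/Partial/No response scale, and all field names are assumptions, not the MPG's agreed checklist.

```python
# Hypothetical, minimal representation of one project's product-quality
# checklist. Question text and the Yes/Partial/No response scale are
# illustrative assumptions, not the MPG's agreed checklist content.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ChecklistItem:
    question: str       # checklist question posed to the project or DAAC
    response: str       # e.g., "Yes", "Partial", or "No"
    comment: str = ""   # free-text comment, per the reviewer recommendation

@dataclass
class ProjectChecklist:
    project: str                                      # MEaSUREs project name
    items: List[ChecklistItem] = field(default_factory=list)

# A single hypothetical checklist used in the roll-up sketch later on.
example = ProjectChecklist(
    project="Example ESDR Project",
    items=[
        ChecklistItem("Is product documentation complete?", "Partial",
                      "ATBD drafted; user guide in review."),
        ChecklistItem("Has the product been validated against independent data?", "Yes"),
    ],
)
```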

  4. Product Quality Metrics
  • Completed work on the questions / checklists and reached agreement on a first version to work with.
  • Next steps:
    • Projects and DAACs compile an initial set of checklists; P.I.s send them to Rama.
    • Rama creates a strawman set of project-level summary roll-ups and an aggregated program-level roll-up, and sends them back to the P.I.s (a sketch of such a roll-up follows below).
    • Telecon to discuss and modify the summary roll-ups.
    • Draft MPARWG recommendation for product quality metrics (i.e., the agreed summary roll-ups).
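
As a sketch of the roll-up step listed above, the following assumes the hypothetical ChecklistItem / ProjectChecklist structure from the previous sketch and simply tallies responses per project, then sums the project tallies into a program-level roll-up. The counting approach is an assumption for illustration; the actual summary roll-ups are whatever the strawman and subsequent telecon settle on.

```python
from collections import Counter
from typing import Iterable

def project_rollup(checklist: ProjectChecklist) -> Counter:
    """Tally responses (e.g., Yes/Partial/No) for one project's checklist."""
    return Counter(item.response for item in checklist.items)

def program_rollup(checklists: Iterable[ProjectChecklist]) -> Counter:
    """Sum the project-level tallies into a single program-level roll-up."""
    total = Counter()
    for checklist in checklists:
        total.update(project_rollup(checklist))
    return total

# Example using the single hypothetical checklist defined above.
print(project_rollup(example))    # one 'Partial' and one 'Yes'
print(program_rollup([example]))  # same totals, aggregated at the program level
```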

  5. Draft Project Checklist. Per a reviewer recommendation, responses to all questions could include comments and would use a text format, such as Word, that facilitates commenting.

  6. Draft DAAC Checklist. Per a reviewer recommendation, responses to all questions could include comments and would use a text format, such as Word, that facilitates commenting.

  7. Citations Metrics
  • A change to the baseline adding new Citations Metrics was recommended by the MPARWG in 2009 and approved by NASA HQ in October 2010 for the FY2011 reporting year.
  • NASA HQ requested a report on the first year of citations metrics.
  • The expectation was that MEaSUREs projects that are REASoN continuations would be in the best position to begin reporting Citations Metrics in 2011.
  • By September 30, 14 projects reported on citations:
    • 6 of the 7 MEaSUREs projects that are REASoN continuations reported.
    • 8 new MEaSUREs projects reported.
    • 1,708 citations in peer-reviewed publications were reported (excluding ISCCP), along with 235 citations in non-peer-reviewed publications (a sketch of this kind of program-level tally follows below).
  • The goal of this session was to examine the experience and lessons learned from the first-year effort and chart a course for citations metrics reporting in 2012.
  • The report to NASA HQ will reflect the results of the first year of citations metrics reporting and the way forward agreed to here.
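
As an illustration of how program-level figures like those above could be assembled from per-project submissions, here is a minimal tallying sketch. The project names and counts are placeholders, not the actual FY2011 reports, and the ISCCP exclusion is included only to mirror the reporting convention on the slide.

```python
# Hypothetical per-project citation submissions:
# (project name, peer-reviewed citations, non-peer-reviewed citations).
# Names and numbers are placeholders, not the actual FY2011 reports.
submissions = [
    ("Project A", 120, 15),
    ("Project B", 310, 40),
    ("ISCCP",     5000, 0),   # excluded from the peer-reviewed total, per the slide
]

EXCLUDE_FROM_PEER_TOTAL = {"ISCCP"}

peer_total = sum(peer for name, peer, _ in submissions
                 if name not in EXCLUDE_FROM_PEER_TOTAL)
non_peer_total = sum(non_peer for _, _, non_peer in submissions)

print(f"Peer-reviewed citations (excluding ISCCP): {peer_total}")
print(f"Non-peer-reviewed citations: {non_peer_total}")
```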

  8. Citations Metrics
  • Reviewed citation metrics for FY2011 and the methods used by the projects, to identify best practices and assess the level of effort.
  • Next Steps:
    • Develop guidance for projects based on this year's experience and the results of our discussion.
    • Citation Metrics for FY2012 will be collected by September 30 to allow for the annual report to NASA HQ.

  9. Future Work
  • MPG will continue to function on an ad hoc basis to consider metrics issues as they arise, e.g.:
    • Metrics for Distributed Services
    • Ensuring that data accesses by online services are accounted for (e.g., which data granules were accessed to produce a plot); a sketch of this accounting follows below.
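
A minimal sketch of the distributed-services accounting idea: when an online service (for example, a plotting service) reads granules on a user's behalf, it records each granule ID so those accesses can be counted in the distribution metrics alongside direct downloads. The service interface, log format, and function names here are assumptions for illustration, not an agreed design.

```python
import csv
import datetime
from typing import Iterable

ACCESS_LOG = "service_granule_access.csv"   # hypothetical metrics log file

def record_granule_accesses(service: str, granule_ids: Iterable[str]) -> None:
    """Append one row per granule touched by a service request, so
    service-mediated accesses are counted like direct downloads."""
    timestamp = datetime.datetime.utcnow().isoformat()
    with open(ACCESS_LOG, "a", newline="") as f:
        writer = csv.writer(f)
        for granule_id in granule_ids:
            writer.writerow([timestamp, service, granule_id])

def make_plot(variable: str, granule_ids: Iterable[str]) -> None:
    """Hypothetical plotting service: account for the granules it reads,
    then produce the plot (the plotting itself is omitted here)."""
    granule_ids = list(granule_ids)
    record_granule_accesses("plot-service", granule_ids)
    # ... read the granules and render the plot for `variable` ...

make_plot("sea_surface_temperature", ["GRANULE_0001", "GRANULE_0002"])
```

Logging at the granule level keeps service-mediated use consistent with how direct granule downloads are already counted in the distribution metrics.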
