Citation Metrics – Round One, 2011
H. K. (Rama) Ramapriyan, NASA/GSFC; Greg Hunolt, Columbus Technologies
Metrics Planning and Reporting (MPAR) WG
10th Earth Science Data Systems Working Group, Newport News, VA, November 2011

Presentation Transcript


  1. Citation Metrics – Round One, 2011
  H. K. (Rama) Ramapriyan, NASA/GSFC
  Greg Hunolt, Columbus Technologies
  Metrics Planning and Reporting (MPAR) WG
  10th Earth Science Data Systems Working Group, Newport News, VA, November 2011

  2. Citations Metrics
  • A change to the baseline adding new Citations Metrics was recommended by the MPARWG in 2009 and approved by NASA HQ in October 2010 for the FY2011 reporting year.
  • NASA HQ requested a report on the first year of citations metrics.
  • The expectation was that the MEaSUREs projects that are REASoN continuations would be in the best position to begin reporting Citations Metrics in 2011.
  • By September 30, 14 projects had reported on citations:
    • 6 of the 7 MEaSUREs projects that are REASoN continuations reported.
    • 8 new MEaSUREs projects reported.
  • 1,708 citations in peer-reviewed publications were reported (excluding ISCCP), along with 235 citations in non-peer-reviewed publications.
  • The goal of this session is to examine the experience and lessons learned from the first-year effort and to chart a course for citations metrics reporting in 2012.
  • The report to NASA HQ will reflect the results of the first year of citations metrics reporting and the way forward agreed to here.

  3. Citations Metrics Reporting
  From the current Metrics Baseline description:
  • Projects will report (using the MCT) two counts:
    1) Metric 13: a count of citations in peer-reviewed publications, and
    2) Metric 14: a count of citations in any other publications: conference or workshop proceedings, posters, online publications, abstracts, etc.
  • Note: a 'count' is recorded for each publication that contains one or more references to a project's work (e.g., a project's publications, products, or services); a sketch of this counting rule follows this slide.
  • Reporting of the counts is voluntary, on a 'best effort' basis, at an interval chosen by the project, with semi-annual or annual reporting recommended.
  • In addition to the counts, the project will provide the actual citations themselves.
  • Each project will use, and provide a description of, its own methodology for collecting the citation counts, and will strive for consistency from one reporting period to the next.
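
To make the counting rule concrete, below is a minimal Python sketch; the records, identifiers, and field layout are hypothetical (the MCT itself only accepts the two totals). A citing publication counts once toward Metric 13 or Metric 14 regardless of how many of the project's products it references.

    # Hypothetical records: (identifier, peer_reviewed?, number of
    # references to the project's work within that publication).
    citing_publications = [
        ("doi:10.5555/example.1", True, 3),    # counts once toward Metric 13
        ("doi:10.5555/example.2", True, 1),    # counts once toward Metric 13
        ("workshop-poster-2011-07", False, 2), # counts once toward Metric 14
    ]

    # Each publication with at least one reference counts exactly once.
    metric_13 = sum(1 for _, peer, refs in citing_publications if peer and refs >= 1)
    metric_14 = sum(1 for _, peer, refs in citing_publications if not peer and refs >= 1)

    print(f"Metric 13 (peer-reviewed): {metric_13}")      # -> 2
    print(f"Metric 14 (other publications): {metric_14}") # -> 1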

  4. Overview of 2011 Citations Metrics Reporting

  5. Citation Metrics – Session Plan
  • Do a walk-through of each project's experience:
    • Look at the project's methodology;
    • Look at the issues raised by the project;
    • Give project representatives an opportunity to discuss their work, the problems they encountered, and lessons learned.
  • Based on the 2011 experience, determine how we should approach 2012:
    • Do we need to make changes to the metrics definitions, especially for Metric 14?
    • Do we need projects to send in the actual citations?
    • Do we need a common approach, or a 'base' to which each project could add as it wishes?
    • When / how frequently should we ask projects to report citations metrics?
  • We will come back to these questions after the walk-throughs.

  6. Robert Atlas (David Moroni)

  7. Gao Chen (Clyde Brown)

  8. Christina Hsu (Corey Bettenhausen)

  9. Ian Joughin (Jennifer Bohlander)

  10. John Kimball (Diane Whited)

  11. Mike Kobrick

  12. Ron Kwok

  13. Stephane Maritorena

  14. Bill Rossow

  15. David Roy

  16. Chung Lin Shie

  17. John Townshend (Saurabh Channan)

  18. Frank Wentz (Deborah Smith)

  19. Victor Zlotnicki (Rob Raskin)

  20. Citations Metrics – Some Observations
  • A frequently used approach was to use one or more search tools to find citations of a peer-reviewed paper published by the project (an illustrative sketch follows this slide).
  • Tools used:
    • Google Scholar – 8 projects
    • Web of Knowledge – 2 projects (1 other looked at it)
    • Web of Science – 2 projects
    • Google search – 2 projects
    • Citation Index – 1 project
    • Science Direct – 1 project
    • AGU Digital Library – 1 project
    • Email notifications – from journals, 1 project; from Google Scholar, 1 project
  • One project that used three tools reported that they did not produce consistent results.
  • Metric 14, citations in non-peer-reviewed publications, was a problem for many projects.
  • There was wide variance in the level of effort.
  • Other observations?
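
As one illustration of the tool-based approach, the sketch below looks up the citation count for a single project paper by DOI using the public Crossref REST API. Crossref is not among the tools the projects listed above; it is used here only because it has a documented, key-free HTTP endpoint, and the DOI shown is a placeholder.

    import requests

    def citation_count(doi: str) -> int:
        # Crossref's 'is-referenced-by-count' field is the number of
        # indexed works that cite the given DOI.
        resp = requests.get(f"https://api.crossref.org/works/{doi}", timeout=30)
        resp.raise_for_status()
        return resp.json()["message"]["is-referenced-by-count"]

    if __name__ == "__main__":
        print(citation_count("10.5555/12345678"))  # placeholder DOI

Counts from different tools will differ because each indexes a different body of literature, which is consistent with the inconsistency one project observed when comparing three tools.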

  21. Citations Metrics – Questions
  • What level of effort is appropriate for collecting and reporting citations metrics?
  • Do we need to make changes to the metrics definitions, especially for Metric 14? Note that one project combed through its Google Scholar citations and manually sorted them between peer-reviewed and non-peer-reviewed publications.
  • Do we need to examine the citing publications to see which discussed actual use of a project's ESDRs rather than just citing the work of the project?
  • Do we need projects to send in the actual citations? If so, we should agree on a standard way of reporting them, e.g., into e-books; one possible record layout is sketched after this slide.
  • Do we need to suggest guidelines for a common approach to collecting the metrics, e.g., suggest Google Scholar or email notifications, and let projects that wish to do more, do more?
  • When / how frequently should we ask projects to report citations metrics? Is annually sufficient?
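
If projects are asked to submit the citations themselves, a shared record layout would make the submissions easier to aggregate. The WG has not settled on a format; the sketch below shows one possible minimal layout as a CSV file written from Python, with made-up example rows for illustration only.

    import csv

    # Each row is one citing publication: which metric it counts toward,
    # the free-text citation, and the reporting period. The rows below
    # are fabricated examples, not real citations.
    rows = [
        (13, "Doe, J. (2011), Example paper, J. Example Sci., 1, 1-10.", "FY2011"),
        (14, "Roe, R. (2011), Example poster, Example Workshop.", "FY2011"),
    ]

    with open("citations_report.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["metric", "citation", "period"])
        writer.writerows(rows)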

  22. Citations Metrics – Next Year
  • Suggest, at minimum, collecting metrics by September 30 again, to allow time for the report and discussion by the MPG and for the annual report to NASA HQ.
  • Develop guidance for projects based on this year's experience, the results of our discussion here, and the resolution of any open questions for NASA management.

  23. Background

  24. Outcome Metric #13, Citations Count – Peer Reviewed
  • Purpose: The objective of the Citations metric is to obtain a better measure of user satisfaction with a project's products and services, and to enable a better assessment of their contribution to the NASA science and applications programs and to Earth science research in general.
  • MCT Question: Please enter the number of citations of your project's data, products, services, or publications in peer-reviewed publications.

  25. Outcome Metric #14, Citations Count – Other than Peer Reviewed
  • Purpose: The objective of the Citations metric is to obtain a better measure of user satisfaction with a project's products and services, and to enable a better assessment of their contribution to the NASA science and applications programs and to Earth science research in general.
  • MCT Question: Please enter the number of citations of your project's data, products, services, or publications in other than peer-reviewed publications.
