Academic Analytics: Obtaining Benchmarks for Faculty Productivity

Presentation Transcript


  1. Academic Analytics: Obtaining Benchmarks for Faculty Productivity • Carol Livingstone • Division of Management Information • www.dmi.illinois.edu • livngstn@illinois.edu • 333-3551

  2. The Problem: How to Evaluate Scholarly Productivity • Very few good measures exist • Grants & contracts (G&C) – dollar totals and counts • College/department collections of publication and awards data • Anecdotal evidence from faculty searches and outside-offer processes

  3. The Problem: How to Evaluate Scholarly Productivity • No easy benchmarks or peer comparisons • No campus-wide database for vita information (publications and awards)

  4. Overview of Academic Analytics • AA has compiled a massive database of journal articles, books, chapters, grants, honors, awards, etc. • This scholarly output is matched by name to faculty and researchers at schools nationwide • The database can be used to quantify scholarly output at any university – by discipline – and compare it to peers
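
Matching publications, grants, and awards to individual faculty by name is a record-linkage problem. AA's actual matching method is not described in these slides, so the following is only a minimal sketch of the general idea, using Python's standard-library difflib and invented names:

```python
# Hypothetical sketch of name-based matching; AA's real algorithm is not
# public. All names below are invented for illustration.
from difflib import SequenceMatcher

faculty_roster = ["Carol Livingstone", "John A. Smith"]   # campus-supplied list
publication_authors = ["C. Livingstone", "Jon Smith"]     # names on scholarly works

def similarity(a: str, b: str) -> float:
    """Crude string similarity on lower-cased names."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Match each author name to the most similar roster entry above a threshold.
THRESHOLD = 0.6
for author in publication_authors:
    best = max(faculty_roster, key=lambda f: similarity(author, f))
    if similarity(author, best) >= THRESHOLD:
        print(f"{author!r} -> matched to {best!r}")
    else:
        print(f"{author!r} -> no confident match")
```

A production matcher would also need affiliation, discipline, and co-author signals to disambiguate common names.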

  5. AA Database Contents – Measures • Books • Honorific awards • Journal articles • Citations • Grants

  6. Database Scope • 9,874 PhD programs • 385 institutions, 170,000+ faculty • 172 disciplines in 11 broad fields • 22 AAU institutions actively participate, 30 others considering

  7. Book Variables • 56,000 authored & edited books (Baker & Taylor, Library of Congress, British Library) • Books per faculty member • # Faculty who published a book • % of faculty with a book • Total number of books
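
The count-based measures on this slide and the next three (articles, citations, awards) reduce to the same arithmetic over matched records: a total, a count of faculty with at least one item, and a percentage. A minimal sketch with invented book counts:

```python
# Sketch of the per-faculty measures pattern; the book counts below are
# invented, not AA data.
books_by_faculty = {"Faculty A": 2, "Faculty B": 0, "Faculty C": 1}

n_faculty = len(books_by_faculty)
total_books = sum(books_by_faculty.values())
n_with_book = sum(1 for count in books_by_faculty.values() if count > 0)

print("Total number of books:", total_books)                       # 3
print("Books per faculty member:", total_books / n_faculty)        # 1.0
print("# Faculty who published a book:", n_with_book)              # 2
print("% of faculty with a book:", 100 * n_with_book / n_faculty)  # 66.7
```

Substituting articles, citations, or awards for books yields the corresponding variables on slides 8-10.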

  8. Journal Article Variables • 1 million+ articles (journals and conference proceedings) • Articles per author • Articles per faculty member • # Faculty with an article • % of Faculty with an article • Total number of articles

  9. Citations • 10 million+ citations (Scopus & ProQuest) • Citations per author • Citations per article • Citations per faculty member • # Faculty with a citation • % of Faculty with a citation • Total number of citations

  10. Honorific Awards • 395 awarding bodies, 2700+ award types • Awards per faculty member • # Faculty with an award • % of Faculty with an award • Total number of awards

  11. Grant Variables • 95,000+ federal research grants • Dollars per grant • Grant dollars per faculty member • # of Faculty with a grant • % of Faculty with a grant • Total grant dollars • Total number of grants
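
The grant measures add dollar-weighted variants of the same pattern. Another sketch, again with invented records rather than AA data:

```python
# Invented grant records to illustrate the dollar-weighted grant measures.
grants = [  # (faculty member, award dollars)
    ("Faculty A", 250_000),
    ("Faculty A", 100_000),
    ("Faculty C", 400_000),
]
n_faculty = 3  # department size in this toy example

total_dollars = sum(d for _, d in grants)   # Total grant dollars
n_grants = len(grants)                      # Total number of grants
holders = {name for name, _ in grants}      # faculty holding at least one grant

print("Dollars per grant:", total_dollars / n_grants)                  # 250000.0
print("Grant dollars per faculty member:", total_dollars / n_faculty)  # 250000.0
print("# of Faculty with a grant:", len(holders))                      # 2
print("% of Faculty with a grant:", 100 * len(holders) / n_faculty)    # 66.7
```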

  12. Academic Analytics AAU Clients • Cornell • Duke • Emory • Iowa State • Johns Hopkins • MIT • Ohio State • SUNY - Stony Brook • Texas A&M • Chicago • Colorado • Kansas • Missouri - Columbia • Missouri - Kansas City • North Carolina • Pittsburgh • USC • Wisconsin - Madison • Vanderbilt • Wash U - St. Louis

  13. Where we are • Summer 2011 – 3-year contract signed • Fall 2011 – Steering team commissioned • 30 users have been authorized • Fall 2009 faculty data available through web dashboard • Fall 2010 faculty lists are being processed

  14. Annual Process – Faculty Lists • [flowchart of the annual faculty-list exchange between campus and AA; the diagram itself is not captured in this transcript]

  15. Academic Analytics Output • Summary data by department or broad disciplinary area (not by faculty member) • Comparisons to selected peers • Access to entire scrubbed database • Analytical tools through a dashboard interface
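
The dashboard itself is not shown in this transcript. As a toy illustration of the "comparisons to selected peers" output, with invented department-level values:

```python
# Toy peer comparison on one measure (articles per faculty member).
# All values are invented; AA's dashboard derives these from its database.
dept_value = 2.4
peers = {"Peer A": 1.9, "Peer B": 2.8, "Peer C": 2.1}  # hypothetical peer set

peer_avg = sum(peers.values()) / len(peers)
print(f"Department: {dept_value:.2f}, peer average: {peer_avg:.2f}")
print("Above peer average" if dept_value > peer_avg else "At or below peer average")
```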

  16. Sample Output – Productivity Radar
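
The radar chart image is not captured in the transcript. For illustration only: a productivity radar plots one axis per measure, with a department's profile drawn against a peer baseline. A hypothetical matplotlib mock-up with invented values:

```python
# Hypothetical mock-up of a "productivity radar"; all numbers invented.
import numpy as np
import matplotlib.pyplot as plt

measures = ["Books", "Articles", "Citations", "Awards", "Grant $"]
dept = [0.8, 0.6, 0.7, 0.4, 0.9]   # invented, scaled 0-1
peers = [0.5, 0.5, 0.5, 0.5, 0.5]  # peer-average baseline

angles = np.linspace(0, 2 * np.pi, len(measures), endpoint=False).tolist()
angles += angles[:1]  # repeat the first angle to close the polygon

ax = plt.subplot(polar=True)
for values, label in [(dept, "Department"), (peers, "Peer average")]:
    ax.plot(angles, values + values[:1], label=label)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(measures)
ax.legend()
plt.show()
```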

  17. Sample Output – Program Gauge

  18. Sample Output – Quintile Analysis
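
The chart is not included here, but a quintile analysis presumably ranks programs nationally on a measure and reports which fifth each falls into. A hypothetical sketch with invented values:

```python
# Invented example of a quintile analysis: rank programs on a measure,
# then report which fifth each falls in (1 = top quintile).
import math

programs = [("P1", 3.2), ("P2", 1.1), ("P3", 2.5), ("P4", 0.4),
            ("P5", 1.9), ("P6", 2.9), ("P7", 0.9), ("P8", 2.1),
            ("P9", 1.4), ("P10", 3.5)]  # (program, articles per faculty member)

ranked = sorted(programs, key=lambda p: p[1], reverse=True)
n = len(ranked)
for rank, (name, value) in enumerate(ranked):
    quintile = math.floor(5 * rank / n) + 1  # 1..5, best programs first
    print(f"{name}: {value:.1f} articles/faculty -> quintile {quintile}")
```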

  19. Steering Committee Goals • Refine faculty selection process/criteria • Oversee rollout to college/dept staff • Evaluate tool

  20. Demo • Questions?
