
Linking policy initiatives to available data: Assessment of scholarly activity in SSH and Law in a new perspective

Thed van Leeuwen, Centre for Science and Technology Studies (CWTS), Leiden University

Post-Conference Seminar, Vilnius, Lithuania, September 25th, 2013


Presentation Transcript


  1. Linking policy initiatives to available data: Assessment of scholarly activity in SSH and Law in a new perspective. Thed van Leeuwen, Centre for Science and Technology Studies (CWTS), Leiden University. Post-Conference Seminar, Vilnius, Lithuania, September 25th, 2013

  2. Outline • Policy context • Proposed solutions • Case study in a Dutch university • Linking it together! • Conclusions, discussion, and future steps

  3. Policy context

  4. Overview of the organization of Dutch research evaluation • Standard Evaluation Protocol (SEP – 2003, 2009), issued jointly by: the Association of Dutch Universities (VSNU), the National Research Council (NWO), and the Royal Dutch Academy of Sciences (KNAW) • Judging Research on its Merits (2005) • Report “Quality indicators for research in the Humanities” (Committee on quality indicators for the humanities, November 2011) • Report “Towards a framework for the quality assessment of social science research” (Committee on quality indicators for the social sciences, March 2013). Key issues addressed in both reports: • How to deal with heterogeneity? [without ‘standardizing’ it away] • Publication cultures • Societal relevance

  5. Proposed solutions

  6. Quality indicators for research in the Humanities

  7. Quality aspects, assessment criteria, and indicators (scholarly quality): • Scholarly output: Scholarly publications (articles, monographs, chapters in books, dissertations); Other output (reviews) • Scholarly use of output: Citations; Other evidence of use • Evidence of scholarly recognition: Scholarly prizes; Personal grants; Other evidence of recognition

  8. Quality aspects, assessment criteria, and indicators (societal quality): • Societal output: Societal publications (articles in specialist publications, monographs for a wider public, chapters in books for a wider public); Other societal output • Societal use of output: Projects in collaboration with civil-society actors; Contract research; Demonstrable civil-society effects; Other evidence of use • Evidence of societal recognition: Societal prizes; Other evidence of societal recognition

  9. A case study in a Dutch University

  10. Bibliometric analysis of output in a Dutch university: a case study on research output ’04-’09 • Scientific disciplines covered: medicine, social sciences, law, philosophy, history, and economics & business • Publication data: internal output registration system (METIS), covering 2004-2009 • Various types of scientific output were included • Purpose of the study: to analyze the ‘impact’ of the university

  11. Difference between the internal registration system and representation in WoS • The dominance of the university hospital in the WoS realm is extremely visible • Law and the Humanities ‘disappear’ in the WoS realm

  12. Composition of the output of the university in METIS: the external coverage of a university • The category ‘General’ is in some cases voluminous • All units do have journal publications!
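To make the notion of external coverage concrete, the sketch below shows one way such a per-unit coverage figure can be computed; the record layout and the toy data are assumptions for illustration, not the actual METIS schema or CWTS matching procedure.

```python
# Hypothetical sketch: per-unit WoS coverage of an internal registration
# system. The record layout and data are assumed, not the METIS schema.
from collections import defaultdict

def wos_coverage(records):
    """records: dicts with keys 'unit' and 'in_wos' (bool).
    Returns, per unit, the share of registered publications found in WoS."""
    totals, matched = defaultdict(int), defaultdict(int)
    for rec in records:
        totals[rec["unit"]] += 1
        matched[rec["unit"]] += rec["in_wos"]
    return {unit: matched[unit] / totals[unit] for unit in totals}

# Toy data mirroring the slide's pattern: the hospital is well covered,
# while Law largely 'disappears' in the WoS realm.
records = [
    {"unit": "University Hospital", "in_wos": True},
    {"unit": "University Hospital", "in_wos": True},
    {"unit": "Law", "in_wos": False},
    {"unit": "Law", "in_wos": False},
    {"unit": "Law", "in_wos": True},
]
print(wos_coverage(records))  # {'University Hospital': 1.0, 'Law': 0.33...}
```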

  13. Does it have impact? • Taking all publications into consideration does not make sense! • For two units, international visibility increases!

  14. Linking it together!

  15. Scholarly output: linking criteria and indicators to METIS categories • Scholarly publications: Articles → Journals; Monographs → Books; Chapters in books → Chapters; Dissertations → Theses • Other output: Reviews → (WoS) Book reviews • Scholarly use of output: Citations → WoS/Scopus/GS citations; Other evidence of use → Influencing other scholars • Evidence of scholarly recognition: Scholarly prizes → Other; Personal grants → Other; Other evidence of recognition → Review committees, editorial boards, etc.
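One way to make this linkage operational is to encode it as a simple lookup table; the structure below is an illustrative assumption, not the actual METIS schema, and the societal-quality mapping on the next slide could be encoded the same way.

```python
# Illustrative encoding (assumed, not the actual METIS schema) of the
# slide's mapping from assessment indicators to METIS output categories.
SCHOLARLY_MAPPING = {
    "Scholarly publications": {
        "Articles": "Journals",
        "Monographs": "Books",
        "Chapters in books": "Chapters",
        "Dissertations": "Theses",
    },
    "Other output": {
        "Reviews": "(WoS) Book reviews",
    },
    "Scholarly use of output": {
        "Citations": "WoS/Scopus/GS citations",
        "Other evidence of use": "Influencing other scholars",
    },
    "Evidence of scholarly recognition": {
        "Scholarly prizes": "Other",
        "Personal grants": "Other",
        "Other evidence of recognition": "Review committees, editorial boards, etc.",
    },
}

print(SCHOLARLY_MAPPING["Scholarly publications"]["Articles"])  # Journals
```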

  16. Societal quality: linking criteria and indicators to METIS categories • Societal publications: Articles in specialist publications → Non-scholarly journals; Monographs for a wider public → Monographs for a wider public; Chapters in books for a wider public → Chapters in books for a wider public; Other societal output → Media appearances, Other • Societal use of output: Projects in collaboration with civil-society actors → Reports; Contract research → Participation in advisory councils, or the public debate; Demonstrable civil-society effects; Other evidence of use → Media appearances, Other • Evidence of societal recognition: Societal prizes → Participation in advisory councils, or the public debate; Other evidence of societal recognition → Other

  17. Conclusions, discussion, and future steps

  18. Some conclusions of the study • The METIS data clearly showed that it is possible to link the scientific outlets registered in METIS to the proposed assessment schemes… • …which also makes it possible to focus on societal quality! • Building an environment that supports research assessment in the SSH should be done in close collaboration with the scholarly community involved. • Citation analysis of non-WoS source material seemed a fruitful approach.

  19. Some recommendations… • The next challenge is to add the possible audiences of the various outlets now linked to the indicators. • In addition to this search for audiences, the request to ‘value’ the various indicators will inevitably pop up! • A challenge in designing indicators based on such a system is to avoid thinking of it as a numbers game. • A national, discipline-wide initiative to register research output and societal impact seems called for…

  20. Future steps… • We have inquired into the possibility of conducting a follow-up study within Leiden University, to further improve the methodology and to discuss the outcomes with researchers and research managers. • We have planned a workshop to discuss the possibilities of arriving at a national system of data collection that could support assessment procedures such as those shown in this presentation.

  21. Dessert!

  22. Development of authorship across all domains of scholarly activity

  23. Definitions of JIF and h-index • Definition of the JIF: the mean citation score of a journal, determined by dividing the citations received in year T by the documents published in years T-1 and T-2, by the number of citable documents published in years T-1 and T-2. • Definition of the h-index: the ‘impact’ of a researcher, determined by ranking the publications of an oeuvre by descending number of citations and taking the highest rank position at which the number of received citations is still at least equal to that rank position. A worked sketch of both definitions follows below.
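To make both definitions concrete, here is a minimal sketch in Python; the numbers in the usage lines are toy data, not figures from the presentation.

```python
# Minimal sketch of the two definitions above; all input numbers are toy data.

def jif(citations_received, citable_documents):
    """JIF for year T: citations received in year T by documents published
    in years T-1 and T-2, divided by the citable documents of T-1 and T-2."""
    return citations_received / citable_documents

def h_index(citation_counts):
    """Largest h such that h publications each received at least h citations."""
    ranked = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

print(jif(300, 120))              # 2.5
print(h_index([10, 8, 5, 4, 3]))  # 4: four papers have at least 4 citations
```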

  24. Problems with JIF • Some methodological problems of the JIF: was/is calculated erroneously; not field-normalized; not document-type normalized; the underlying citation distributions are highly skewed. • Some conceptual problems of the JIF: inflates the impact of all researchers publishing in the journal; promotes journal publishing, as the JIF is easily measured; stimulates one-indicator thinking; is based on expected values only and does not relate to reality; ignores other scholarly virtues.
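The skewness problem can be illustrated with toy numbers: one highly cited paper pulls the mean (the JIF-style statistic) far above what a typical article in the journal achieves.

```python
# Toy illustration (assumed numbers): skewed citation distributions make a
# mean-based score like the JIF unrepresentative of the typical article.
import statistics

citations = [120, 15, 4, 3, 2, 1, 1, 0, 0, 0]   # one 'blockbuster' paper
print(statistics.mean(citations))    # 14.6, pulled up by a single paper
print(statistics.median(citations))  # 1.5, closer to the typical article
```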

  25. Problems with the h-index • Some bibliometric-mathematical problems: mathematically inconsistent in its behavior; tends to rise only, as no decrease is possible, and is thus conservative by nature; not field-normalized. • Some bibliometric-methodological problems: how to define an author? In which bibliographic/bibliometric environment? • Some conceptual problems: biased against youth, favoring age and experience; biased against selective researchers, favoring highly productive scientists; no relationship between the h-index and research quality; ignores other elements of scholarly activity; promotes one-indicator thinking.

  26. Thank you for your attention! Any questions? Ask me, or mail me: leeuwen@cwts.nl

  27. Appendix on H-index

  28. The H-Index and its limitations

  29. The h-index, defined as… • The h-index is the score indicating the rank position in a publication set, sorted by descending citation counts, at which the number of received citations equals the rank position of that publication. • The idea of the American physicist J. Hirsch, who published this index in the Proceedings of the National Academy of Sciences of the USA.

  30. Examples of h-index values • An environmental biologist with an output of 188 papers, cited 4,788 times in the period ’80-’04: h-index value of 31. • A clinical psychologist with an output of 72 papers, cited 760 times in the period ’80-’04: h-index value of 14.

  31. Actual versus field-normalized impact (CPP/FCSm) displayed against output: a large output can be combined with a relatively low impact.

  32. The h-index displayed against output: a larger output is strongly correlated with a higher h-index value.

  33. Consistency: definition • Definition: a scientific performance measure is said to be consistent if and only if, for any two actors A and B and for any number n ≥ 0, the ranking of A and B given by the performance measure does not change when A and B both have a new publication with n citations.

  34. Consistency: motivation • Consistency ensures that if the publishing behavior of two actors does not change over time, their ranking relative to each other also does not change. • Consistency ensures that if the individual researchers in one research group X outperform the individual researchers in another research group Y, then research group X as a whole outperforms research group Y.

  35. Inconsistency of the h-index: Actor A (h = 4) is initially outranked by Actor B (h = 6); after both actors add the same new publications, Actor A (h = 8) outranks Actor B (h = 6).
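The citation counts behind the slide's figure are not preserved in the transcript; the sketch below uses one set of assumed counts that reproduces the h-values shown (A: 4 → 8, B: 6 → 6) and thereby the ranking flip.

```python
# Reconstruction with ASSUMED citation counts reproducing the slide's
# h-values; the slide's actual underlying numbers are not in the transcript.

def h_index(citation_counts):
    ranked = sorted(citation_counts, reverse=True)
    return max((r for r, c in enumerate(ranked, 1) if c >= r), default=0)

actor_a = [100, 100, 100, 100]   # h = 4
actor_b = [6, 6, 6, 6, 6, 6]     # h = 6: B initially outranks A
new_pubs = [8, 8, 8, 8]          # both actors add identical new papers

print(h_index(actor_a), h_index(actor_b))                        # 4 6
print(h_index(actor_a + new_pubs), h_index(actor_b + new_pubs))  # 8 6
# The ranking flips even though both records changed identically:
# the h-index violates the consistency property defined on slide 33.
```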

  36. Problems with the h-index • For serious evaluation of scientific performance, the h-index is not suitable as an indicator, as the index: is insensitive to field-specific characteristics (e.g., differences in citation cultures between medicine and other disciplines); does not take into account the age and career length of scientists, as a small oeuvre necessarily leads to a low h-index value; is inconsistent in its ‘behaviour’.

  37. Appendix on JFIS

  38. Other journal impact measures… • JFIS (CWTS): Journal-to-Field Impact Score, a field- and document-type normalized journal impact score, based on more publication data and longer citation windows than the JIF.
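As a rough sketch of the normalization idea (not the exact CWTS computation), each paper's citations can be compared with the world average for its field and document type; the averages and record layout below are assumed for illustration.

```python
# Rough sketch in the spirit of JFIS: field- and document-type normalized
# journal impact. NOT the exact CWTS method; reference averages are assumed.

def normalized_journal_score(papers, field_doc_average):
    """papers: list of (field, doc_type, citations) for one journal.
    field_doc_average: {(field, doc_type): world mean citations}.
    A score above 1 means the journal is cited above its field's average."""
    actual = sum(cites for _, _, cites in papers)
    expected = sum(field_doc_average[(f, d)] for f, d, _ in papers)
    return actual / expected

papers = [("Mathematics", "article", 12), ("Mathematics", "review", 30)]
averages = {("Mathematics", "article"): 4.0, ("Mathematics", "review"): 10.0}
print(normalized_journal_score(papers, averages))  # 42 / 14 = 3.0
```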

  39. Journals with their JFIS values (journal, JFIS, rank, field):
CELL 7.06 (1, Biochem & Mol Biol)
REV MOD PHYSICS 5.15 (2, Physics)
ANN REV CELL DEV BIOL 5.04 (3, Biochem & Mol Biol)
CHEMICAL REVIEWS 4.90 (4, Chemistry)
NATURE MEDICINE 4.73 (5, Medicine)
ANN REV OF BIOCHEM 4.64 (6, Biochem & Mol Biol)
ANNALS OF MATHEMATICS 4.46 (7, Mathematics)
NATURE BIOTECHNOLOGY 4.07 (8, Biotech & Appl Microb)
ACTA MATHEMATICA 4.01 (9, Mathematics)
BULL AM MATH SOC 4.00 (10, Mathematics)
ANN REV CELL BIOL 3.78 (11, Biochem & Mol Biol)
J AM MATH SOC 3.71 (12, Mathematics)
J ROYAL STAT SOC B 3.49 (13, Statistics & Prob)
PROG CHEM ORG NAT PROD 3.35 (14, Organic Chem)
ACTA METALL MATER 3.19 (15, Metall & Met Eng)
ANGEW CHEM-INT EDIT 3.15 (16, Chemistry)
PHYS REV LETT 3.13 (17, Physics)
J MICROELECTROMECH SYST 3.04 (18, Elec & Electr Eng)
J RHEOLOGY 3.02 (19, Mechanics)
INVENT MATH 3.01 (20, Mathematics)
