
Citation Analysis in Research Evaluation (Published by Springer, July 2005)






Presentation Transcript


  1. Citation Analysis in Research Evaluation (Published by Springer, July 2005) Henk F. Moed CWTS, Leiden University

  2. Contents • Introduction: Potentialities of bibliometrics • Adequacy of WoS coverage per main discipline • Effects of the use of bibliometric indicators upon publication and citation practices • Citation analysis and peer review

  3. Book is written for a broad audience • Scientists and scholars subjected to citation analysis • Research policy makers and managers • Members of peer review committees and other evaluators • Practitioners and students in science studies, library & information science

  4. Book Structure

  5. Main objective • A firm political or societal basis for ‘basic’ science can be maintained only by further developing a system of internal quality control and performance enhancement • This book aims at showing that citation analysis is a useful tool in such a system

  6. General approach: uses and limits • The book shows that the application of citation analysis in research evaluation has reached a high level of sophistication • It discusses numerous issues raised by scientists, journal editors and policy makers • It shows how such issues can in principle be accounted for or solved technically

  7. The use of citation analysis in research evaluation is more appropriate the more it is: • Formal • Open • Scholarly founded • Supplemented with expert knowledge • Carried out in a clear policy context with clear objectives • Stimulating users to explicitly state basic notions of scholarly quality • Enlightening rather than formulaic

  8. Macro Trends in the Global Science System

  9. Fractional counting scheme for a paper with 4 addresses
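The fractional scheme named above can be sketched in a few lines of Python; this is a hypothetical illustration (the address data is invented), assuming the common convention that each address line on a paper receives an equal share of one publication:

```python
# Fractional counting: each of a paper's addresses receives an equal
# fraction of one publication. Hypothetical sketch; the slide only
# shows the 4-address case.
from collections import defaultdict

def fractional_counts(papers):
    """papers: list of address lists, one list per paper."""
    credit = defaultdict(float)
    for addresses in papers:
        share = 1.0 / len(addresses)  # equal share per address line
        for addr in addresses:
            credit[addr] += share
    return dict(credit)

# A paper with 4 addresses: each address line is credited 0.25,
# so a country listed twice accumulates 0.5.
print(fractional_counts([["NL", "NL", "US", "UK"]]))  # {'NL': 0.5, 'US': 0.25, 'UK': 0.25}
```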

  10. Normalised citation impact = (average citation rate of a unit’s papers) ÷ (world citation average in the subfields in which the unit is active); a value >1.0 means above world average
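The ratio can be illustrated with a small Python sketch; all paper names, subfields, and citation numbers below are invented for illustration:

```python
# Normalised citation impact as defined on the slide: the mean citation
# rate of a unit's papers divided by the world citation average in the
# subfields in which the unit is active.

def normalised_impact(citations, subfield_of, world_avg):
    """citations: {paper: cites}; subfield_of: {paper: subfield};
    world_avg: {subfield: world mean cites per paper}."""
    n = len(citations)
    observed = sum(citations.values()) / n
    expected = sum(world_avg[subfield_of[p]] for p in citations) / n
    return observed / expected

impact = normalised_impact(
    {"p1": 12, "p2": 4},
    {"p1": "chemistry", "p2": "physics"},
    {"chemistry": 8.0, "physics": 4.0},
)
print(round(impact, 2))  # 1.33 -> above the world average benchmark of 1.0
```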

  11. % US papers declined

  12. JP, UK, DE, FR, CA decline; China No. 2 in 2006; Spain increases

  13. Convergent trend in citation impact because of globalisation; Spain’s impact >1.0 in 2004-2006

  14. Conclusion • Assessment of a country’s performance should be carried out from a comparative, global perspective

  15. Bibliometric Rankings of World Universities Available at: www.cwts.nl/hm Henk F. Moed CWTS, Leiden University The Netherlands CWTS Graduate Course 2006

  16. Research universities • Conduct research at a certain scale This study: • Universities with > 5,000 published articles during 1997-2004 (> 625 per year) • With more than about 600 active researchers

  17. Distribution of universities among regions

  18. Distribution of universities among regions

  19. 16 broad disciplines

  20. General European Univ: among top 25% in both publication output and citation impact [quadrant chart: Publications vs Impact, each axis split at top 25% / bottom 25%]

  21. ‘Top’ US/UK research university: holds a top position in each discipline

  22. Conclusion • The best European universities are among the top 25% in the world • But their top is less broad than that of the best US research universities • In European national academic systems top research is more evenly distributed among universities

  23. Policy issue • Concentration model (US, UK) versus distributed model • Is a concentration model a proper policy incentive in the European Research Area?

  24. BIGGEST 100 WORLD UNIVERSITIES (Calero et al., 2006) [chart: regions USA, Japan, Europe; universities named include Univ Harvard, Univ Toronto, Univ Tokyo, Univ Melbourne, Univ Sydney, Univ Queensland]

  25. BIGGEST 100 EUROPEAN UNIVERSITIES (Calero et al., 2006) [chart: countries The Netherlands, United Kingdom, Germany, Belgium, France, Italy, Scandinavia, Switzerland, Spain; universities named include Oxford, Univ Munich, Katholieke Univ Leuven, Univ Antwerpen, Univ Milano, Karolinska Inst., ETH Zurich]

  26. Bottom-up Approach: 159 NL Academic Chemistry Groups

  27. Did global scientific publication productivity increase during the 1980s and 1990s? A Meta Analysis of the ISI Database

  28. Research questions • Did authors publish more papers in recent years than they did in the past? • How did scholarly collaboration develop? • Are there traces of “publication pressures” in the ISI database?

  29. Absolute numbers

  30. Three indicators

  31. Patterns in total ISI database • Collaboration: Authorships/Paper (+52%) • Papers published by average author: Authorships/Author (+29%) • Author productivity: Papers/Unique Publishing Author (-15%)
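The three indicators can be computed from a simple paper-author table. The toy data below is hypothetical and does not reproduce the database-wide trends reported on the slide:

```python
# The slide's three indicators computed from a paper-author table.
# Toy data is invented for illustration only.

def indicators(papers):
    """papers: list of author lists, one list per paper."""
    authorships = sum(len(authors) for authors in papers)
    unique_authors = {a for authors in papers for a in authors}
    return {
        "authorships_per_paper": authorships / len(papers),        # collaboration
        "authorships_per_author": authorships / len(unique_authors),
        "papers_per_author": len(papers) / len(unique_authors),    # productivity
    }

print(indicators([["A", "B"], ["A", "C"], ["B"]]))
```

Note that authorships/author is the product of authorships/paper and papers/author by construction, which is consistent with the slide's figures: collaboration rising (+52%) and per-author productivity falling (-15%) combine into a +29% rise in authorships per author.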

  32. Did authors publish more papers in recent years than in the past? • Answer: Yes and No • Yes: the number of ISI articles in an average author’s annual publication list increased • No: the number of ISI articles per unique publishing author, across the total collection of unique publishing authors, (slightly) declined

  33. Conjecture 1: If publication pressure is the dominant factor: • Scholars increased their publication counts by collaborating more intensively (or through authorship inflation), rather than by collectively publishing more papers

  34. Adequacy of ISI/WoS coverage

  35. WoS Coverage (%) = ? [Venn diagram: WoS journals vs ‘important’ literature vs all literature]

  36. Selection of WoS source journals The real problem is to “make the coverage as complete as possible by expanding it beyond the core of journals whose importance to a given field is obvious” (Garfield 1979)

  37. How is the core expanded? • Garfield’s criterion: The frequency at which journals are cited in the source journals that are already included in the index • Assumption: The number of times a journal’s items are cited is an expression of its utility as communication medium
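Garfield's criterion can be sketched as a simple ranking of candidate journals by how often they are cited from the journals already in the index; the journal names and citation data below are hypothetical:

```python
# Sketch of Garfield's expansion criterion: rank journals not yet in the
# index by citation frequency from papers in current source journals.
from collections import Counter

def rank_candidates(cited_journals, source_journals):
    """cited_journals: one journal name per cited reference found in
    source-journal papers; source_journals: set of indexed journals."""
    counts = Counter(j for j in cited_journals if j not in source_journals)
    return [journal for journal, _ in counts.most_common()]

refs = ["New J A", "New J B", "New J A", "Core J", "New J A", "New J B"]
print(rank_candidates(refs, {"Core J"}))  # ['New J A', 'New J B']
```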

  38. Adequacy of ISI coverage: Two approaches

  39. Measurement of internal WoS coverage [2×2 scheme: citing/source side (WoS vs non-WoS) against cited/target side (WoS vs non-WoS items: journals, books, conference proceedings, reports, etc.); the percentages in each cell are to be measured]

  40. Example: 5 of a paper’s 7 cited references are in WoS, 2 are not → WoS coverage = 5/7 = 71%
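The same calculation in a short Python sketch; the reference identifiers are invented, mirroring a case where 5 of 7 cited references are WoS-covered:

```python
# Internal WoS coverage from a reference list: the share of cited
# references that point to WoS-covered items.

def internal_coverage(references, wos_items):
    """references: cited item ids; wos_items: set of WoS-covered ids."""
    covered = sum(1 for ref in references if ref in wos_items)
    return covered / len(references)

refs = ["j1", "j2", "j3", "j4", "j5", "book1", "proc1"]
wos = {"j1", "j2", "j3", "j4", "j5"}
print(f"{internal_coverage(refs, wos):.0%}")  # 71%
```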

  41. Total ISI/WoS database: 75% of cited references point to WoS items, 25% to non-WoS items

  42. Molecular Biol & Biochem: 92% of cited references point to WoS items, 8% to non-WoS items

  43. Social Sciences: 29% of cited references point to WoS items, 71% to non-WoS items

  44. Overall ISI coverage by main field

  45. Sub-disciplines (non-exhaustive list)

  46. Important factors

  47. Three types of studies [scheme over citing/source and cited/target sides, WoS vs non-WoS]: 1. Pure WoS; 2. Target expanded; 3. Source expanded

  48. 4 Types of bibliometric studies

  49. Social Sciences and Humanities • It cannot be taken for granted that the ISI Citation Indexes provide valid performance indicators in all subfields of these domains of scholarship • A challenge would be to systematically explore alternative data sources and methodologies
