Presentation Transcript


  1. Using Incites to evaluate Research Performance Advanced rachel.mangan@thomsonreuters.com http://incites.isiknowledge.com

  2. Objectives: • During this session we will explore scenarios and questions related to evaluating research performance on three different levels: • Individual researcher • Academic department • Institution • We will discuss each scenario/question and provide evidence-based responses using metrics and reports taken from the various modules of InCites. • The aim of this workshop is to show, with practical examples, how InCites data can be applied to provide citation-based evidence to support decisions for the purpose of research evaluation. • This is an interactive workshop: participants are encouraged to raise questions they have regarding research evaluation at their institutions, and the group will discuss how InCites can be used to provide a solution. • Thomson Reuters will also present upcoming developments to the InCites platform, including improvements to optimise the integration of Research Analytics data. (Slides 55-63) 2

  3. Guidelines for citation analysis • Compare like with like – the golden rule • Use relative measures, not just absolute counts • More applicable to hard sciences than arts/humanities • Know your data parameters: • journal categories • author names • author addresses • time periods • document types • Obtain multiple measures • Recognize the skewed nature of citation data • Ask whether the results are reasonable • Understand your data source. For further guidance on citation metrics, download the white papers at http://science.thomsonreuters.com/info/bibliometrics/ and http://isiwebofknowledge.com/media/pdf/UsingBibliometricsinEval_WP.pdf 3
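The "relative measures, not just absolute counts" guideline is easiest to see with a worked example. Below is a minimal Python sketch using invented baseline figures (they are not InCites data): dividing each paper's citation count by an expected value for its category and year shows how a paper with fewer raw citations can still be the stronger result in its own field.

```python
# Illustrative only: the baseline values are invented, not InCites data.
# Computes a simple normalised impact (actual cites / expected cites for the
# paper's category and year) to show why relative measures beat raw counts.

papers = [
    {"title": "Paper A", "category": "Oncology",    "year": 2010, "cites": 40},
    {"title": "Paper B", "category": "Mathematics", "year": 2010, "cites": 12},
]

# Hypothetical category/year baselines: average cites per paper in that field/year
baselines = {("Oncology", 2010): 35.0, ("Mathematics", 2010): 6.0}

for p in papers:
    expected = baselines[(p["category"], p["year"])]
    ratio = p["cites"] / expected
    print(f'{p["title"]}: {p["cites"]} cites, expected {expected}, '
          f'normalised impact {ratio:.2f}')

# Paper B has fewer raw citations but the higher normalised impact,
# which is the "compare like with like" point of this slide.
```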

  4. Know your dataset (1) Options for building a Research Performance Profile • Address Based • Extraction of Web of Science records based on author affiliations. Creates a snapshot in time of the work produced at your institution. • Extraction of WoS records that contain at least one occurrence of the affiliation in the address field • All potential address variants are searched, with input from you. • This is a straightforward, relatively fast way to create an RPP dataset, and by far the most common method. • Author Based • Reflects your internal groups (specific departments, schools, etc.) • Data will reflect papers by current staff produced at prior institutions, if the publication lists are provided. • This does require effort on your part to provide information for each author. 4

  5. Know your dataset (2) Address-based vs author-based RPP • Address-based dataset: + Fast to build + Easy to maintain + Easy to keep the data up to date + Can receive dataset data (including metrics/percentiles) through FTP or any other type of file exchange system - Authors are not uniquely identified - No differentiation between departments - Cannot use an API to pull data (the API can only pull the dataset's WoS data, such as abstracts and affiliations, without any InCites metrics). • Author-based dataset: + Authors can be uniquely identified + There is differentiation between departments + Can provide bibliometric information at department level + Can receive dataset data (including metrics/percentiles) through FTP or any other type of file exchange system, plus the dataset (with InCites metrics) can also be pulled through an API - More complex to build if repository data are not “clean” 5
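As a rough illustration of the API option mentioned for author-based datasets, the sketch below pulls dataset records page by page. The endpoint URL, parameter names and response layout are assumptions made for illustration only; the real InCites API differs, and its documented calls should be used instead.

```python
# Hypothetical sketch only: the endpoint, parameter names and JSON layout
# below are illustrative assumptions, not the documented InCites API.
import requests

API_URL = "https://api.example.com/incites/dataset"   # placeholder endpoint
API_KEY = "YOUR_KEY"                                   # placeholder credential

def fetch_author_dataset(dataset_id, page=1, per_page=100):
    """Pull one page of an author-based dataset, including metrics/percentiles."""
    resp = requests.get(
        API_URL,
        params={"datasetId": dataset_id, "page": page, "perPage": per_page},
        headers={"X-ApiKey": API_KEY},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    data = fetch_author_dataset("my-department-dataset")
    for rec in data.get("records", []):
        print(rec.get("author"), rec.get("percentile"))
```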

  6. Individual Researcher Evaluation-what do you want to measure/analyse/identify? 6

  7. Individual Researcher • How many papers have I published? (Slide 8) • What types of papers do I publish? (Slide 8) • Which is my most cited paper? (Slide 9) • In which journals do I publish? (Slide 10) • In which journals should I look to publish my research? (IF) (Slide 11) • Who do I collaborate with (other institutions) and which are the best-performing collaborations? (Slide 12, 13 & 14) • Who do I collaborate with (within my organisation) and with which co-authors does the research perform the best? (Slide 12, 13 & 14) • Who is funding my research? (Slide 15) • How can I be more successful at procuring funding? (Slide 16) • Do I have papers which have an impact above the journal average? (Slide 16) • Do I have papers which have an impact above the field average? (Slide 16) • How many papers do I have in the top 1%, 5% or top 10% of their field? (Slide 17) • Can you think of any other author-related questions/topics? 7

  8. Author Profile - Author Publication Report 1. How many papers have I published? 4. In which journals do I publish? 2. What types of documents do I publish? 8

  9. Author- Source Articles Listing 3. Which is my highest cited paper? 9

  10. Journal Ranking 4. In which journals do I publish? How does the performance compare to similar research? 10

  11. 5. In which journals should I publish? The papers in this journal have a below-expected impact. The journal ranks in the 2nd quartile of its category in the JCR. The author might want to publish in journals that are in the 1st quartile. Journals in the 1st quartile for Environmental Studies, 2012 JCR. 11
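For reference, a journal's JCR quartile follows directly from its Impact Factor rank within its category. A small sketch, using invented rank figures:

```python
# Invented figures: derive a JCR quartile from a journal's Impact Factor
# rank within its category (rank 1 = highest Impact Factor).
import math

def jcr_quartile(rank, journals_in_category):
    """Return the quartile (1-4) for a given IF rank within a category."""
    return math.ceil(4 * rank / journals_in_category)

# e.g. a journal ranked 28th of 96 Environmental Studies titles
print(jcr_quartile(28, 96))   # -> 2, a 2nd-quartile journal
print(jcr_quartile(10, 96))   # -> 1, a 1st-quartile target
```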

  12. Collaborating Authors-list report 6. Who do I collaborate with (internal and external) and which are the best performing collaborations? 7. Who do I collaborate with (within organisation) and which are the best performing collaborations? 12

  13. Collaborating Authors-ego network report 7. Who do I collaborate with (within organisation) and which is the best performing collaboration? 13

  14. Collaborating Authors-ego network report 6. Who do I collaborate with (internal and external) and which are the best performing collaborations? 14

  15. Funding Agency Listing • 8. Which funding bodies are funding my research? • Which funding agency occurs most frequently? • With which agency does the research provide the greatest return on investment? (Order by average impact, or view the Source Articles Listing for paper-level metrics.) 15

  16. 9. How can I be more successful at procuring funding? Provide evidence that your research has an above-expected impact in the journals and categories where you publish. 10. Do I have papers which have an impact above the journal expected citations? 11. Do I have papers which have an impact above the category expected citations? 16
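A minimal sketch of questions 10 and 11: compare each paper's citation count with the journal expected citations and the category expected citations. In practice the expected values come from InCites; the numbers below are invented.

```python
# Invented numbers: flag papers whose citation count exceeds the journal
# expected citations and the category expected citations.

papers = [
    {"title": "Paper A", "cites": 22, "journal_expected": 9.4, "category_expected": 11.7},
    {"title": "Paper B", "cites": 5,  "journal_expected": 9.4, "category_expected": 11.7},
]

for p in papers:
    above_journal = p["cites"] > p["journal_expected"]
    above_category = p["cites"] > p["category_expected"]
    print(p["title"], "above journal expected:", above_journal,
          "| above category expected:", above_category)
```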

  17. Summary Metrics 12. How many papers do I have in the top 1%, 5% and top 10% of the categories where I publish? 17
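A short sketch of the top-percentile count, assuming the convention that a lower percentile means a more highly cited paper (so "top 1%" means a percentile of 1 or less). The percentile values below are invented.

```python
# Invented percentiles: count papers in the top 1%, 5% and 10% of their
# categories, assuming lower percentile = more highly cited.

percentiles = [0.8, 3.2, 4.9, 7.5, 12.0, 36.4, 58.1]

for band in (1, 5, 10):
    count = sum(1 for p in percentiles if p <= band)
    print(f"Papers in top {band}%: {count}")
```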

  18. Academic Department- what do you want to measure/analyse/evaluate? 18

  19. Academic Department (Author dataset, focus on Biological Sciences at Simon Fraser University) • What is the total output of this department? (Slide 20 & 21) • What type of documents do we produce? (Slide 22) • In which journals do we publish? (Slide 23) • Which are our highest cited papers? (Slide 24) • Which papers have exceeded the journal average impact? (Slide 24) • Which papers have exceeded the category average impact? (Slide 24) • What percentage of our research is uncited? (Slide 25) • Who are our top-producing authors? (Slide 26) • Which of our authors have a better-than-expected performance in the journals and categories where they publish? (Slide 27) • Which authors could be mentors to our faculty members? (Slide 28) • Which institutions do we collaborate with? (Slide 29) • Which are the best-performing collaborations (impact in field, using percentiles)? (Slide 30) • Are there collaborations which are not providing a return on investment? (Slide 31) • Who are potential new collaborators? (Citing Articles) (Slide 32 & 33) • Which funding agencies are funding our research? (Slide 34) • Has our research impact changed over time? Is it better in recent years? (Slide 35) • Has our department grown in size? (Slide 36) • Has our output increased over time and how does this compare to the total output for the university? (Slide 37) • How does our performance compare to other departments? (Slide 38) • Can you think of any other department-related topics/questions? 19

  20. 1. What is the total output of the department? 20

  21. 1. What is the total output of the department? Document type breakdown 21

  22. 2. What type of documents do we produce? Measure the performance of the document types 22

  23. Journal Ranking How does the impact compare to similar research? 3. In which journals do we publish our research? 23

  24. Department Source Article Listing 6. Which papers have exceeded the category expected impact? 4. Highest cited papers for department? 5. Which papers have exceeded the journal expected impact? 24

  25. Department Summary Metrics 7. What percentage of our research is un-cited? 25

  26. Department Author Ranking 8. Who are our top producing authors? 26

  27. Department Author Ranking 9. Which authors have a better than expected performance in the journals and categories where they publish? 27

  28. 10. Which authors could be mentors to our faculty members? List authors in the department who have published a minimum number of papers and identify those with a performance above the journal/category expected impact. 28
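The mentor shortlist described on this slide can be expressed as a simple filter. A sketch, with invented author records and an assumed threshold of 20 papers:

```python
# Invented records: shortlist mentors as authors with at least MIN_PAPERS
# papers whose average impact relative to journal and to category is above 1.

authors = [
    {"name": "Author A", "papers": 42, "rel_journal": 1.6, "rel_category": 1.4},
    {"name": "Author B", "papers": 7,  "rel_journal": 2.1, "rel_category": 1.9},
    {"name": "Author C", "papers": 31, "rel_journal": 0.8, "rel_category": 0.9},
]

MIN_PAPERS = 20   # assumed threshold; set to whatever the department considers enough

mentors = [a for a in authors
           if a["papers"] >= MIN_PAPERS
           and a["rel_journal"] > 1.0
           and a["rel_category"] > 1.0]

for m in mentors:
    print(m["name"])   # only Author A meets all three criteria
```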

  29. Department- Institutional Collaborations 11. Which institutions do our authors collaborate with? 29

  30. Department Institutional Collaborations 12. Which are the best performing collaborations? 30

  31. Department- Institutional Collaborations 13. Are there collaborations which are not providing return on investment? Note: Time period must be taken into consideration. Further investigate these papers with Source Articles Listing 31

  32. Citing Article Set - Institution Ranking 14. Who are potential new collaborators? Authors from Simon Fraser Biological Sciences have collaborated with Univ Sussex on one paper. Authors from Univ Sussex have cited 18 papers from the Biological Sciences Department - this could be a potential new collaboration. 32

  33. Source Article Listing-Citing Article Dataset Which papers were influential to authors at Univ Sussex? How influential are the 2nd generation citation papers? These are the 18 papers that cited publications from the Biological Sciences Department at Simon Fraser University 33

  34. Department Funding Agency Listing 15. Which funding bodies are funding our research? 34

  35. Department- Trended Graph • Publications from 2002 to 2007 are slightly under or slightly above the expected impact at the journal level. • Publications from 2008 onwards have an improved impact relative to the journal. 16. Has our research impact changed over time? Is it better in more recent years? 35
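A sketch of the calculation behind a trended graph like this one: the average "impact relative to journal" (citations divided by journal expected citations) per publication year. All values below are invented.

```python
# Invented values: average "impact relative to journal" (cites / journal
# expected cites) per publication year, i.e. the series behind the trended graph.
from collections import defaultdict

papers = [
    {"year": 2006, "cites": 8,  "journal_expected": 9.0},
    {"year": 2006, "cites": 10, "journal_expected": 9.0},
    {"year": 2009, "cites": 15, "journal_expected": 8.0},
    {"year": 2009, "cites": 18, "journal_expected": 8.0},
]

by_year = defaultdict(list)
for p in papers:
    by_year[p["year"]].append(p["cites"] / p["journal_expected"])

for year in sorted(by_year):
    ratios = by_year[year]
    print(year, f"average impact relative to journal: {sum(ratios) / len(ratios):.2f}")
```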

  36. Department size 17. Has our department grown in size (defined by number of authors)? Run this report periodically. 36

  37. Department- Trended Output 18. Has our output increased over time and how does this compare to the total output for the university? Department Biological Sciences Simon Fraser University 37

  38. Department Comparison 19. How does our performance compare to other departments? A comparison of the Summary Metrics report for Biological Sciences and Computer Science 38

  39. University - What do we want to measure/analyse/evaluate? 39

  40. University • How does our productivity compare to institution x? (Slide 41) • Has our citation impact changed over time and how does that compare to university x? (Slide 42) • What effect do international collaborations have on our impact? (Slide 43) • Which are our field strengths? How does that compare to university x? (Slide 44) • Has our research focus changed over time? How does that compare to university x? (Slide 45) • In comparison to other institutions within “country”, how are we performing in field x? (Slide 46) • How can we identify top researchers for recruitment? (Slide 47 & 48) • How does our research reputation compare to university x? (Slide 49) • How does our institutional income compare to university x? (Slide 50) • How does our teaching performance compare to university x? (Slide 51) • Which research areas need more support? (Slide 52) • Which metrics can support promotion/tenure decisions? (Slide 53) • Which metrics/reports are best to provide evidence that a research strategy has been successful over time? (Slide 54) • For example, has a recruitment drive in year x provided a return on investment? 40

  41. Institutional Comparisons 1. How does our productivity compare to other institutions? 41

  42. Institutional Comparisons 2. Has our citation impact (average cites) changed over time and how does that compare to other universities? 42

  43. Institutional Comparisons-ESI Disciplines 3. What effect do international collaborations have on our impact? 43

  44. 4. Which are our field strengths? How does this compare to university x? University of Glasgow University of Edinburgh 44

  45. Institutional Comparisons- % of papers in institution for ESI Disciplines 5. Has our research focus changed over time? How does that compare to university x? 45

  46. Institutional Comparisons- Comparison of UK institutions in UoA 2014 Clinical Medicine 6. In comparison to other UK institutions, how are we performing in ‘Clinical Medicine’? -All UK, in Clinical Medicine, 2002-2012, ordered by Impact Relative to Subject Area 46

  47. 7. How can we identify top researchers for recruitment? Institutions may have various strategies for identifying new researchers to recruit. The following example looks at using data in InCites. The process: • Identify international or domestic collaborations (where do you want to recruit from?) • Identify the top collaborating institutions within the region • Drill down to the author providing the most impactful research (that contributes to your impact) 47

  48. a) Look at which countries your researchers most frequently collaborate with and select one. b) Here I have selected ‘England’ and examined the institutions by average impact. c) I then selected Univ London Imperial Coll, since this collaboration has the highest average impact, and viewed the authors just from this institution. • The report provides a list of authors who may be potential new recruits. • They have collaborated with authors at Simon Fraser, and by using the Average Percentile or other normalised metrics you can identify the most impactful research by the collaborating authors. • These authors could be potential recruits, since the collaboration has proven successful for Simon Fraser's research impact. 48
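A sketch of the drill-down in slides 47 and 48: group collaborative papers by partner institution and author, then rank by average percentile (lower = better). The records below are invented; in InCites this information comes from the collaboration and author reports.

```python
# Invented records: rank collaborating authors by the average percentile of
# the joint papers (lower = better) to surface the most impactful partners.
from collections import defaultdict

collab_papers = [
    {"institution": "Univ London Imperial Coll", "author": "Author X", "percentile": 2.1},
    {"institution": "Univ London Imperial Coll", "author": "Author X", "percentile": 8.4},
    {"institution": "Univ London Imperial Coll", "author": "Author Y", "percentile": 35.0},
    {"institution": "Univ Sussex",               "author": "Author Z", "percentile": 12.6},
]

percentiles = defaultdict(list)
for p in collab_papers:
    percentiles[(p["institution"], p["author"])].append(p["percentile"])

ranking = sorted((sum(v) / len(v), inst, auth)
                 for (inst, auth), v in percentiles.items())

for avg, inst, auth in ranking:
    print(f"{auth} ({inst}): average percentile {avg:.1f}")
```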

  49. 8. How does our research reputation compare to university x? 49

  50. Institutional Profiles - Finance Indicators 9. How does our institutional income compare to university x? 50
