Using Incites to evaluate Research Performance Advanced

Presentation transcript:

1 Using Incites to evaluate Research Performance Advanced
rachel.mangan@thomsonreuters.com
http://incites.isiknowledge.com

2 Objectives: During this session we will explore scenarios and questions related to evaluating research performance on three levels:
– Individual researcher
– Academic department
– Institution
We will discuss each scenario/question and provide evidence-based responses using metrics and reports taken from the various modules of Incites. The aim of this workshop is to show, with practical examples, how Incites data can be applied to provide citation-based evidence to support decisions for the purpose of Research Evaluation. This is an interactive workshop: participants are encouraged to raise questions about Research Evaluation at their institutions, and the group will discuss how Incites can be used to provide a solution. Thomson Reuters will also present upcoming developments to the Incites platform, including improvements to optimise the integration of Research Analytics data. (Slides 55-63)

3 GUIDELINES FOR CITATION ANALYSIS
– Compare like with like – The Golden Rule
– Use relative measures, not just absolute counts
– More applicable to hard sciences than arts/humanities
– Know your data parameters: journal categories, author names, author addresses, time periods, document types
– Obtain multiple measures
– Recognize the skewed nature of citation data
– Ask whether the results are reasonable
– Understand your data source
For further guidance on citation metrics, download the white papers at:
http://science.thomsonreuters.com/info/bibliometrics/
http://isiwebofknowledge.com/media/pdf/UsingBibliometricsinEval_WP.pdf
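To make the "relative measures" guideline concrete, here is a minimal Python sketch (not Incites code; the papers and baseline figures are invented for illustration) of normalising each paper's citation count by the expected count for its subject category, publication year, and document type:

```python
# Relative measure: actual cites / expected cites for the paper's
# category, year, and document type. Ratios above 1 beat the baseline.

papers = [
    # (citations, category, year, doc_type) -- invented sample records
    (12, "Environmental Studies", 2010, "Article"),
    (0,  "Environmental Studies", 2012, "Review"),
    (45, "Ecology",               2008, "Article"),
]

# Expected citations per (category, year, doc_type). In practice these
# baselines come from the database provider, not from your own dataset.
expected = {
    ("Environmental Studies", 2010, "Article"): 8.0,
    ("Environmental Studies", 2012, "Review"): 10.0,
    ("Ecology",               2008, "Article"): 15.0,
}

for cites, cat, year, dtype in papers:
    ratio = cites / expected[(cat, year, dtype)]
    status = "above" if ratio > 1 else "at or below"
    print(f"{cat} {year} {dtype}: {cites} cites, "
          f"ratio {ratio:.2f} ({status} category expectation)")
```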

4 Know your dataset (1): Options for building a Research Performance Profile (RPP)
Address Based
– Extraction of Web of Science records based on author affiliations; creates a snapshot in time of work produced at your institution.
– Extracts WoS records that contain at least one occurrence of your affiliation in the address field.
– All potential address variants are searched, with input from you.
– A straightforward, relatively fast way to create an RPP dataset, and by far the most common method.
Author Based
– Reflects your internal groups (specific departments, schools, etc.).
– Data will reflect papers by current staff produced at prior institutions, if the publication lists are provided.
– Requires effort on your part to provide information for each author.

5 Know your dataset (2): Address-based vs author-based RPP
Address-based dataset
+ Fast to build
+ Easy to maintain and keep up to date
+ Can receive dataset data (including metrics/percentiles) through FTP or another file-exchange system
– Authors are not uniquely identified
– No differentiation between departments
– Cannot use an API to pull data (the API can only pull the dataset's WoS data, e.g. abstracts and affiliations, without any InCites metrics)
Author-based dataset
+ Authors can be uniquely identified
+ Differentiation between departments
+ Can provide bibliometric information at department level
+ Can receive dataset data (including metrics/percentiles) through FTP or another file-exchange system, and can also pull the dataset (with InCites metrics) through an API
– More complex to build if repository data are not “clean”
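As an illustration of why "unclean" repository data makes an author-based dataset harder to build, the following hypothetical Python sketch collapses trivial author-name variants before mapping papers to people. Real disambiguation relies on identifiers (e.g. ResearcherID), not string matching alone; the records below are invented.

```python
import re
from collections import defaultdict

def name_key(raw: str) -> str:
    """Crude normalisation: lowercase, replace runs of dots/whitespace."""
    return re.sub(r"[.\s]+", " ", raw.lower()).strip()

records = [
    ("Smith, J.A.",    "Paper 1"),
    ("SMITH, J A",     "Paper 2"),
    ("Smith, John A.", "Paper 3"),  # still a separate key: needs an identifier
]

# Group papers under the normalised name key.
by_author = defaultdict(list)
for author, paper in records:
    by_author[name_key(author)].append(paper)

for key, papers in by_author.items():
    print(key, "->", papers)
```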

6 Individual Researcher Evaluation – what do you want to measure/analyse/identify?

7 Individual Researcher
1. How many papers have I published? (Slide 8)
2. What types of papers do I publish? (Slide 8)
3. Which is my most cited paper? (Slide 9)
4. In which journals do I publish? (Slide 10)
5. In which journals should I look to publish my research? (IF) (Slide 11)
6. Who do I collaborate with (other institutions) and which are the best performing collaborations? (Slides 12, 13 & 14)
7. Who do I collaborate with (within my organisation) and with which co-authors does the research perform best? (Slides 12, 13 & 14)
8. Who is funding my research? (Slide 15)
9. How can I be more successful at procuring funding? (Slide 16)
10. Do I have papers which have an impact above the journal average? (Slide 16)
11. Do I have papers which have an impact above the field average? (Slide 16)
12. How many papers do I have in the top 1%, 5% or top 10% of their field? (Slide 17)
13. Can you think of any other author-related questions/topics?

8 Author Profile – Author Publication Report
1. How many papers have I published?
2. What types of documents do I publish?
4. In which journals do I publish?

9 Author – Source Articles Listing
3. Which is my highest cited paper?

10 Journal Ranking
4. In which journals do I publish? How does the performance compare to similar research?

11 5. In which journals should I publish?
The papers in this journal have a below-expected impact. The journal ranks in the 2nd quartile of its category in the JCR, so the author might want to publish in journals that are in the 1st quartile. (Journals in the 1st quartile for Environmental Studies, 2012 JCR.)

12 Collaborating Authors – list report
6. Who do I collaborate with (internal and external) and which are the best performing collaborations?
7. Who do I collaborate with (within organisation) and which are the best performing collaborations?

13 Collaborating Authors – ego network report
7. Who do I collaborate with (within organisation) and which is the best performing collaboration?

14 Collaborating Authors – ego network report
6. Who do I collaborate with (internal and external) and which are the best performing collaborations?

15 Funding Agency Listing
8. Which funding bodies are funding my research? Which funding agency occurs most frequently? With which agency does the research provide the greatest return on investment? (Order by average impact, or view the Source Articles Listing for paper-level metrics.)

16 9. How can I be more successful at procuring funding?
Provide evidence that your research has an impact above what is expected in the journals and categories where you publish.
10. Do I have papers which have an impact above the journal expected citations?
11. Do I have papers which have an impact above the category expected citations?
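A minimal sketch, with invented numbers, of how questions 10 and 11 can be checked once per-paper actual and expected citation counts are exported: flag papers whose Journal Actual/Expected and Category Actual/Expected ratios both exceed 1.

```python
# Flag papers that beat both the journal-expected and the
# category-expected citation counts. All figures are invented.

papers = [
    {"title": "Paper A", "cites": 30, "journal_expected": 12.0, "category_expected": 9.0},
    {"title": "Paper B", "cites": 5,  "journal_expected": 8.0,  "category_expected": 6.0},
]

for p in papers:
    j_ratio = p["cites"] / p["journal_expected"]   # Journal Actual/Expected
    c_ratio = p["cites"] / p["category_expected"]  # Category Actual/Expected
    if j_ratio > 1 and c_ratio > 1:
        print(f"{p['title']}: journal ratio {j_ratio:.2f}, "
              f"category ratio {c_ratio:.2f} -- above both expectations")
```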

17 Summary Metrics
12. How many papers do I have in the top 1%, 5% and top 10% of the categories where I publish?
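Assuming each paper's percentile in its field has been exported (using the common citation-percentile convention that a lower value means more highly cited, so "top 1%" means percentile <= 1), counting papers in the top slices is a one-pass filter. The percentile values below are invented.

```python
# Count papers falling in the top 1%, 5%, and 10% of their categories.
percentiles = [0.5, 3.0, 7.5, 22.0, 60.0, 95.0]  # one invented value per paper

for threshold in (1, 5, 10):
    count = sum(1 for p in percentiles if p <= threshold)
    print(f"Top {threshold}%: {count} paper(s)")
```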

18 Academic Department – what do you want to measure/analyse/evaluate?

19 Academic Department (Author dataset, focus on Biological Sciences at Simon Fraser University)
1. What is the total output of this department? (Slides 20 & 21)
2. What type of documents do we produce? (Slide 22)
3. In which journals do we publish? (Slide 23)
4. Which are our highest cited papers? (Slide 24)
5. Which papers have exceeded the journal average impact? (Slide 24)
6. Which papers have exceeded the category average impact? (Slide 24)
7. What percentage of our research is uncited? (Slide 25)
8. Who are our top producing authors? (Slide 26)
9. Which of our authors have a better than expected performance in the journals and categories where they publish? (Slide 27)
10. Which authors could be mentors to our faculty members? (Slide 28)
11. Which institutions do we collaborate with? (Slide 29)
12. Which are the best performing (impact in field; use percentile) collaborations? (Slide 30)
13. Are there collaborations which are not providing return on investment? (Slide 31)
14. Who are potential new collaborators? (Citing Articles) (Slides 32 & 33)
15. Which funding agencies are funding our research? (Slide 34)
16. Has our research impact changed over time? Is it better in recent years? (Slide 35)
17. Has our department grown in size? (Slide 36)
18. Has our output increased over time and how does this compare to the total output for the university? (Slide 37)
19. How does our performance compare to other departments? (Slide 38)
20. Can you think of any other department-related topics/questions?

20 1. What is the total output of the department?

21 1. What is the total output of the department? (Document type breakdown)

22 2. What type of documents do we produce? Measure the performance of the document types.

23 Journal Ranking
3. In which journals do we publish our research? How does the impact compare to similar research?

24 Department Source Article Listing
4. Which are our highest cited papers?
5. Which papers have exceeded the journal expected impact?
6. Which papers have exceeded the category expected impact?

25 Department Summary Metrics
7. What percentage of our research is uncited?
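The uncited share itself is a simple ratio; a toy sketch with invented per-paper citation counts:

```python
# Share of the department's papers that have never been cited.
citation_counts = [0, 3, 0, 12, 1, 0, 7]  # invented counts, one per paper
uncited_share = sum(1 for c in citation_counts if c == 0) / len(citation_counts)
print(f"{uncited_share:.1%} of papers are uncited")
```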

26 Department Author Ranking
8. Who are our top producing authors?

27 Department Author Ranking
9. Which authors have a better than expected performance in the journals and categories where they publish?

28 10. Which authors could be mentors to our faculty members?
List authors in the department who have published a minimum number of papers, and identify those whose performance is above the journal/category expected impact, as in the sketch below.
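A hypothetical Python sketch of that mentor filter (the author records, threshold, and ratios are all invented for illustration):

```python
# Mentor filter: at least MIN_PAPERS papers, and average impact above
# both journal and category expectations (normalised ratios > 1).

authors = [
    {"name": "Author A", "papers": 42, "journal_ratio": 1.6, "category_ratio": 1.4},
    {"name": "Author B", "papers": 5,  "journal_ratio": 2.1, "category_ratio": 1.9},
    {"name": "Author C", "papers": 30, "journal_ratio": 0.8, "category_ratio": 0.9},
]

MIN_PAPERS = 20
mentors = [a for a in authors
           if a["papers"] >= MIN_PAPERS
           and a["journal_ratio"] > 1
           and a["category_ratio"] > 1]

for a in mentors:
    print(a["name"], "is a potential mentor")
```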

29 Department – Institutional Collaborations
11. Which institutions do our authors collaborate with?

30 Department – Institutional Collaborations
12. Which are the best performing collaborations?

31 Department – Institutional Collaborations
13. Are there collaborations which are not providing return on investment? Note: the time period must be taken into consideration. Investigate these papers further with the Source Articles Listing.

32 Citing Article Set – Institution Ranking
14. Who are potential new collaborators? Authors from Simon Fraser Biological Sciences have collaborated with Univ Sussex on one paper, yet authors from Univ Sussex have cited 18 papers from the Biological Sciences Department; this could be a potential new collaboration.
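One way to operationalise this heuristic, as a hypothetical sketch with invented counts: rank external institutions by how often they cite the department relative to how often they already co-author with it.

```python
# Institutions that cite us heavily but rarely co-author with us are
# candidate new collaborators. Counts below are invented.

institutions = [
    {"name": "Univ Sussex",  "citing_papers": 18, "joint_papers": 1},
    {"name": "Univ Example", "citing_papers": 2,  "joint_papers": 5},
]

# +1 in the denominator avoids division by zero for non-collaborators.
for inst in sorted(institutions,
                   key=lambda i: i["citing_papers"] / (i["joint_papers"] + 1),
                   reverse=True):
    print(inst["name"], "-", inst["citing_papers"], "citing papers vs",
          inst["joint_papers"], "joint papers")
```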

33 Source Article Listing – Citing Article Dataset
These are the 18 papers that cited publications from the Biological Sciences Department at Simon Fraser University. Which papers were influential to authors at Univ Sussex? How influential are the 2nd-generation citing papers?

34 Department Funding Agency Listing
15. Which funding bodies are funding our research?

35 Department – Trended Graph
16. Has our research impact changed over time? Is it better in more recent years? Publications from 2002 to 2007 are slightly under or slightly above the expected impact at the journal level; publications from 2008 onwards have an improved impact relative to the journal.
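A sketch of the underlying trend computation, using invented per-paper data: average the journal-relative ratio (cites divided by journal-expected cites) within each publication year.

```python
from collections import defaultdict

# (year, cites, journal_expected) per paper -- invented sample data.
papers = [
    (2004, 4, 5.0), (2006, 6, 6.0), (2008, 14, 9.0), (2010, 12, 7.0),
]

# Collect journal-relative ratios by publication year.
by_year = defaultdict(list)
for year, cites, expected in papers:
    by_year[year].append(cites / expected)

for year in sorted(by_year):
    ratios = by_year[year]
    print(year, f"mean journal-relative impact: {sum(ratios)/len(ratios):.2f}")
```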

36 Department Size
17. Has our department grown in size (defined by number of authors)? Run this report periodically.

37 Department – Trended Output
18. Has our output increased over time and how does this compare to the total output for the university? (Trend lines: Biological Sciences department vs Simon Fraser University.)

38 Department Comparison
19. How does our performance compare to other departments? A comparison of the Summary Metrics reports for Biological Sciences and Computer Science.

39 University – what do you want to measure/analyse/evaluate?

40 University
1. How does our productivity compare to institution x? (Slide 41)
2. Has our citation impact changed over time and how does that compare to university x? (Slide 42)
3. What effect do international collaborations have on our impact? (Slide 43)
4. Which are our field strengths? How does that compare to university x? (Slide 44)
5. Has our research focus changed over time? How does that compare to university x? (Slide 45)
6. In comparison to other institutions within “country”, how are we performing in field x? (Slide 46)
7. How can we identify top researchers for recruitment? (Slides 47 & 48)
8. How does our research reputation compare to university x? (Slide 49)
9. How does our institutional income compare to university x? (Slide 50)
10. How does our teaching performance compare to university x? (Slide 51)
11. Which research areas need more support? (Slide 52)
12. Which metrics can support promotion/tenure decisions? (Slide 53)
13. Which metrics/reports are best to provide evidence that a research strategy has been successful over time? (Slide 54) Example: has a recruitment drive in year x provided return on investment?

41 Institutional Comparisons
1. How does our productivity compare to other institutions?

42 Institutional Comparisons
2. Has our citation impact (average cites) changed over time and how does that compare to other universities?

43 Institutional Comparisons – ESI Disciplines
3. What effect do international collaborations have on our impact?

44 4. Which are our field strengths? How does this compare to university x? (University of Glasgow vs University of Edinburgh)

45 Institutional Comparisons – % of papers in institution for ESI Disciplines
5. Has our research focus changed over time? How does that compare to university x?

46 Institutional Comparisons – UK institutions in the 2014 UoA ‘Clinical Medicine’
6. In comparison to other UK institutions, how are we performing in ‘Clinical Medicine’? (All UK institutions, Clinical Medicine, 2002-2012, ordered by Impact Relative to Subject Area.)

47 7. How can we identify top researchers for recruitment?
Institutions may have various strategies for identifying new researchers to recruit. The following example looks at using data in Incites.
The process:
a) Identify international or domestic collaborations (where do you want to recruit from?)
b) Identify the top collaborating institutions within that region
c) Drill down to the authors providing the most impactful research (that contributes to your impact)

48
a) Look at which countries your researchers most frequently collaborate with and select one.
b) Here I have selected ‘England’ and examined the institutions by average impact.
c) I then selected Univ London Imperial Coll, since this collaboration has the highest average impact, and viewed the authors just from this institution.
The report provides a list of authors who may be potential recruits. They have collaborated with authors at Simon Fraser, and by using the Average Percentile or other normalised metrics you can identify the most impactful research by the collaborating authors. These authors could be potential recruits, since the collaboration has proven successful for Simon Fraser's research impact.

49 8. How does our research reputation compare to university x?

50 Institutional Profiles – Finance Indicators
9. How does our institutional income compare to university x?

51 10. How does our teaching performance compare to university x?

52 11. Which research areas need more support?
Web of Science categories that have an impact below what is expected (Impact Relative to Subject Area value below 1), as sketched below.
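A minimal sketch of that filter, with invented category values: flag every category whose Impact Relative to Subject Area falls below 1.

```python
# Categories with Impact Relative to Subject Area below 1 sit under the
# world baseline for their field and may need more support. Invented data.

category_impact = {
    "Biochemistry & Molecular Biology": 1.3,
    "Computer Science, Theory & Methods": 0.7,
    "Environmental Studies": 0.9,
}

needs_support = {cat: v for cat, v in category_impact.items() if v < 1}
for cat, v in sorted(needs_support.items(), key=lambda kv: kv[1]):
    print(f"{cat}: {v:.2f} -- below subject-area expectation")
```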

53 12. Which reports/metrics can support promotion/tenure decisions?
Reports:
– Source Articles Listing (individual author)
– Summary Metrics (individual author)
– Collaborating Institutions (individual author)
– Funding Ranking (individual author)
– Trended Report (individual author)
Can you think of any others?
Metrics:
– Journal Actual/Journal Expected
– Category Actual/Category Expected
– Percentile in Field
– % Documents Uncited

54 13. Which metrics/reports are best to provide evidence that a research strategy has been successful? E.g. has a recruitment drive (in a specific area of research) in year x provided return on investment?
– Impact Relative to Subject Area (overall or trended)
– % documents in institution (overall or trended)
– % documents cited (overall or trended)
– % documents cited relative to subject area (overall or trended)
– Aggregate Performance Indicator
Can you think of any other reports/metrics?

55 THE NEXT GENERATION OF INCITES
INCITES: a single destination for all research assessment & evaluation needs.
Components shown on the slide (grouped under Services, Tools and Content): Global Comparisons; Research In View; Institutional Profiles; Research Profiles; Journal Citation Reports; Essential Science Indicators; Semi-Custom Analytics; Custom Analytics; Syndicated Analytics; Custom Data Sets; Web Based; Cloud & API; Mobile Access.

56 THE NEXT GENERATION INCITES
– Simplified interface for at-a-glance information from your desktop or mobile device
– Compelling, customizable visualizations with links to underlying data
– Compare institutions and individuals anytime, anywhere
– Access both summary-level and detailed faculty profiles and benchmarks to make informed staffing decisions

57-63 (Screenshots of the next-generation InCites interface; no text content on these slides.)

64 Thank You!
rachel.mangan@thomsonreuters.com
http://researchanalytics.thomsonreuters.com/incites/

