
1 Assessing faculty productivity and institutional research performance: Using publication and citation key performance indicators Ann Kushmerick, Manager, Research Evaluation and Bibliometric Data, AEA Conference, November 2010

2 Agenda
–Introduce you to the background of bibliometrics
–InCites demonstration: show examples of bibliometric analyses
–Explain a variety of bibliometric indicators and what they are used to measure
–Introduce our upcoming enterprise knowledge management solution
–Allow you to ask questions

3 Trends in scholarly research
–Competition for government research funding is increasing
–Available funding is decreasing
–Competition for top research faculty is on the rise
–Accountability: research spending; demonstrating return on investment (ROI)
–Proving the institution's quality of research to prospective students, prospective faculty members/research staff, and investors/donors
Result: institutions seek objective data on research performance, for data-based decision making.

4 There are a variety of quantitative measures of research productivity and impact: citation metrics, funding data, peer review, and awards/honors.
 Today we will focus on citation metrics, derived from data on peer-reviewed journals and the citations received by their articles.
 Citation metrics complement other types of performance data and should be used in conjunction with other measures.
 Citation metrics have become the primary quantitative measure for research output. They are transparent, repeatable, and easily understood.
 Citations are an indicator of an article's impact and usefulness to the worldwide research community; they are the mode by which peers acknowledge each other's research.

5 Validation Studies: Citation Frequency and Its Correlation with Other Measures of Peer Esteem
Typical findings: r = .7 to .9.
Smith and Eysenck, comparing 1996 and 2001 RAE scores given to psychologists at 38 UK universities (peer review) with their citation counts, concluded: "The two approaches measure broadly the same thing."
Name | Field | Year
Kenneth E. Clark | Psychology | 1957
Jonathan R. Cole & Stephen Cole | Physics | 1967, 1973
Henry G. Small | Collagen research | 1977
Julie A. Virgo | Cancer research | 1977
Michael E.D. Koenig | Pharmaceutical research | 1983
Eugene Garfield | Nobel Prize winners | 1992
Charles Oppenheim | University rankings (RAE) | Mid 1990s-
Andy T. Smith & Michael Eysenck | Psychology | 2002
Giovanni Abramo & C. D'Angelo | Hard sciences | 2009

6 Guidelines for citation analysis: compare like with like (the Golden Rule)
–Use relative measures, not just absolute counts
–More applicable to hard sciences than arts/humanities
–Know your data parameters: journal categories, author names, author addresses, time periods, document types
–Obtain multiple measures
–Recognize the skewed nature of citation data
–Ask whether the results are reasonable
–Understand your data source
Download the white papers at:
http://science.thomsonreuters.com/info/bibliometrics/
http://isiwebofknowledge.com/media/pdf/UsingBibliometricsinEval_WP.pdf

7 Web of Science: the first and largest citation index
–Selectivity and control of content: high, consistent standards
–11,500+ journals and 716 million+ cited references
–Multidisciplinary: science, social science, arts/humanities
–Depth: 100+ years, including cited references
–Consistency and reliability: ideal for research evaluation, e.g. field averages
–Unmatched expertise: 40+ years of citation analysis and research evaluation
–Conference proceedings: 12,000 conferences annually
–Funding acknowledgments
The gold standard: used by over 3,200 institutions in more than 90 countries.

8 Thomson Reuters (formerly ISI) has been the authority on citation data for over 50 years.
[Timeline graphic, 1960-2010: custom citation projects and national indicators (mainframe); Science Citation Index; Social Sciences Citation Index; Arts & Humanities Citation Index; SciSearch; PC-based citation sets; PC-based indicators for journals, nations, institutions; Essential Science Indicators; ISI Web of Knowledge; Century of Science; Century of Social Science; Web of Knowledge 4.0; InCites.]

9 Examples of bibliometrics used in evaluative activities
Web of Science data have been used in major research evaluation initiatives around the globe for decades:
–US National Science Foundation Science & Engineering Indicators
–US National Research Council Assessment of Research-Doctorate Programs
–European Commission European Union Science & Technology Indicators
–US National Institutes of Health Electronic Scientific Portfolio Assistant
–Times Higher Education world university ranking
–Academic Ranking of World Universities (Shanghai Jiao Tong University)
–Government bodies in all major countries

10 How do universities use citation metrics?
–Annual reports, board of trustees reports
–Strategic plans/performance dashboards
–Accreditation
–PR, research magazines
–Annual faculty reviews/tenure review
–Collaboration analysis
–Departmental research strategy
–Building an institutional repository
–Building a faculty expertise database

11 Thomson Reuters: Research Analytics tools and services

12 Types of citation metrics and what they measure
Productivity
–Metric: # papers; share of papers in field (# papers in field / total papers in field)
–Evaluator question: What is the research output of X (a country, institution, researcher, etc.)?
Total influence
–Metric: # citations
–Evaluator question: What is the overall impact of a body of work?
H-index
–Calculation: the number of papers (N) with at least N citations each
–Evaluator question: What are the impact and productivity of a body of work?
Efficiency
–Average citation rate: total citations / total papers. What is the rate at which a body of work is cited?
–Percent of papers cited: # papers with at least one citation / total # papers in population. How many papers get cited? How many never get cited?
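The h-index, average citation rate, and percent-cited metrics above can be sketched in a few lines of Python; the citation counts below are invented purely for illustration.

```python
def h_index(citations):
    """Largest N such that N papers have at least N citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(counts, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

def average_citation_rate(citations):
    """Total citations / total papers."""
    return sum(citations) / len(citations)

def percent_cited(citations):
    """Share of papers with at least one citation."""
    return 100 * sum(1 for c in citations if c > 0) / len(citations)

cites = [25, 8, 5, 4, 3, 0, 0]                 # hypothetical body of work
print(h_index(cites))                          # -> 4
print(round(average_citation_rate(cites), 2))  # -> 6.43
print(round(percent_cited(cites), 1))          # -> 71.4
```

Note how the two efficiency measures diverge from raw productivity: the same seven papers score well per paper even though two of them were never cited.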

13 Types of citation metrics and what they measure: relative impact/benchmarking
Journal performance ratio
–Calculation: sum of citations / sum of journal expected citation rates. The expected citation rate is calculated for a journal, for each year and document type combination (e.g. JAMA, 2001, review).
–Evaluator question: Has this body of work performed better than average vis-à-vis the journals represented?
Category performance ratio (Aggregate Performance Indicator)
–Calculation: sum of citations / sum of category expected citation rates. The expected citation rate is calculated for a journal category, for each year and document type combination (e.g. Physics, 1995, article).
–Evaluator question: Has this body of work performed better than average vis-à-vis the specific disciplines represented?
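A minimal sketch of the performance-ratio calculation, assuming a lookup table of expected citation rates keyed by (journal or category, year, document type); the rates and counts here are invented, not real InCites baselines.

```python
def performance_ratio(papers, expected_rates):
    """papers: list of (key, actual_citations) pairs.
    expected_rates: dict mapping each key to its expected citation rate.
    Returns sum of actual citations / sum of expected citations."""
    actual = sum(c for _, c in papers)
    expected = sum(expected_rates[key] for key, _ in papers)
    return actual / expected

# Hypothetical baselines and paper set (values are illustrative only):
rates = {("JAMA", 2001, "review"): 120.0, ("JAMA", 2002, "article"): 40.0}
papers = [(("JAMA", 2001, "review"), 180), (("JAMA", 2002, "article"), 20)]
print(round(performance_ratio(papers, rates), 2))  # -> 1.25
```

A ratio above 1.0 means the set outperformed its journals' (or categories') averages; below 1.0, it underperformed.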

14 Two fields, two different citation patterns
[Chart: number of citations vs. number of articles for chemistry and physics in 2003, marking the mean and the citation counts reached by the top 50%, 10%, and 1% of each distribution. Chemistry in 2003 shows more citation activity than physics.]

15 Types of citation metrics and what they measure (continued)
Relative impact/benchmarking
–Percentile in category and mean percentile: the percentile placement of an article within a journal category (e.g. oncology, 2002). How has this body of work performed compared to the disciplines represented?
–% papers in top x% of their field: e.g. 10% of Dr. Lopez's papers are in the top 1% of their fields. What proportion of a body of work achieves a specific level of performance?
Emerging areas of research
–Research Fronts: clusters of highly cited papers identified via co-citation analysis. What are the emerging areas of research in chemistry?
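Percentile placement can be sketched as follows, assuming it is computed by ranking a paper's citation count against all papers in the same category and year (a lower percentile means a better-cited paper); the counts are invented.

```python
def percentile_in_category(paper_cites, category_cites):
    """Top-X percentile: the share of category papers cited at least
    as often as this paper."""
    at_or_above = sum(1 for c in category_cites if c >= paper_cites)
    return 100 * at_or_above / len(category_cites)

# Hypothetical citation counts for one category and year:
category = [0, 1, 2, 3, 5, 8, 13, 40, 90, 250]
print(percentile_in_category(90, category))   # -> 20.0 (top 20%)
print(percentile_in_category(250, category))  # -> 10.0 (top 10%)
```

The mean percentile reported later for a researcher is then just the average of this value over all of their papers.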

16 Research front map in materials science

17 Types of citation metrics and what they measure (continued)
Specialization
–Disciplinarity index: computed from s, the share of papers in category i, over the n categories. How multidisciplinary is a body of work?
–Interdisciplinarity index: computed from p, the share of papers in category i, over the n categories. How dispersed is a body of work across disciplines?
Indirect impact
–Second-generation citations: citations received by a paper's citing papers. Did the papers citing a body of work go on to have impact?
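The slide's actual formulas did not survive extraction, so the sketch below uses a common share-based choice (an assumption here, not necessarily the InCites definition): a Herfindahl-style sum of squared category shares for concentration, with dispersion as its complement.

```python
def disciplinarity(shares):
    """Sum of squared category shares s_i: 1.0 means all papers fall
    in one category; values near 0 mean papers spread over many."""
    return sum(s * s for s in shares)

def interdisciplinarity(shares):
    """Complement of the concentration index: higher = more dispersed."""
    return 1.0 - disciplinarity(shares)

# Hypothetical shares of a body of work across three categories:
shares = [0.5, 0.3, 0.2]
print(round(disciplinarity(shares), 2))       # -> 0.38
print(round(interdisciplinarity(shares), 2))  # -> 0.62
```

Whatever the exact formula, the two indices answer opposite questions: concentration within fields versus spread across them.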

18 Journal-level metrics
Journal impact factor
–Calculation: A = total cites in 2008; B = 2008 cites to articles published in 2006-2007; C = # of articles published in 2006-2007; D = B/C = 2008 impact factor.
–Researcher questions: What are the most highly cited journals in my discipline? Where should I submit my new paper?
–Librarian question: What journals are most important for my library's collection?
–Evaluator question: Does this researcher publish papers in journals with impact factors in the top quartile of their fields?
Immediacy Index
–Calculation: # of citations to articles published in 2008 / # of articles published in 2008.
–Questions: Do articles in this journal receive citations quickly after publication? Should I expect a paper in Journal X to receive citations quickly?
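The B/C impact factor calculation spelled out above, and the one-year immediacy index, as code; the counts are invented for an unnamed journal.

```python
def impact_factor(cites_to_prev_two_years, items_prev_two_years):
    """2008 impact factor: B / C, i.e. 2008 citations to 2006-2007
    articles divided by the number of 2006-2007 articles."""
    return cites_to_prev_two_years / items_prev_two_years

def immediacy_index(cites_to_current_year, items_current_year):
    """Citations in 2008 to 2008 articles / number of 2008 articles."""
    return cites_to_current_year / items_current_year

# Hypothetical counts: 1,500 cites to 400 articles; 120 cites to 200
# same-year articles.
print(round(impact_factor(1500, 400), 2))   # -> 3.75
print(round(immediacy_index(120, 200), 2))  # -> 0.6
```

Note that A (total 2008 cites to the journal, all years) does not enter the impact factor itself; only the two-year window B/C does.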

19 Journal-level metrics (continued)
Eigenfactor™ score
–Calculation: measures journal impact in terms of embedment within the entire citation network; scaled so that the sum of the scores of all JCR journals is 100.
–Questions: How influential is a journal, weighted by the influence of the journals citing it? Which journals are most important to my stakeholders? How important is this journal within the research network?
Eigenfactor Article Influence™ score
–Calculation: a per-article measure using the Eigenfactor™ score; normalized so the mean JCR journal's Article Influence is 1.0.
–Question: Compared to other journals, how influential is this journal?
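A heavily simplified sketch of the idea behind the Eigenfactor score: a journal's influence is the stationary weight of a random walk over the citation network, so citations from influential journals count for more. The real algorithm also excludes journal self-citations, normalizes by article counts, and adds a teleportation term; none of that is modeled here, and the 3-journal network is invented.

```python
def influence_scores(cite_matrix, iters=200):
    """cite_matrix[i][j] = citations from journal i to journal j.
    Power iteration toward the stationary distribution of a reader who
    repeatedly follows a random citation; scaled to sum to 100."""
    n = len(cite_matrix)
    out = [sum(row) for row in cite_matrix]  # outgoing cites per journal
    scores = [1.0 / n] * n
    for _ in range(iters):
        new = [0.0] * n
        for i in range(n):
            for j in range(n):
                if out[i]:
                    new[j] += scores[i] * cite_matrix[i][j] / out[i]
        scores = new
    total = sum(scores)
    return [100 * s / total for s in scores]

# Tiny invented 3-journal citation network:
C = [[0, 8, 2],
     [4, 0, 6],
     [1, 1, 0]]
print([round(s, 1) for s in influence_scores(C)])  # -> [30.7, 39.5, 29.8]
```

Journal 1 ends up most influential even though journal 0 sends it only part of its citations, because journal 1's own citers are themselves well cited; that network weighting is exactly what distinguishes the Eigenfactor from a raw citation count.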

20 Demonstration of these metrics in InCites (the following slides are included as a reference for you after the conference)

21 InCites is a web-based research evaluation tool designed for detailed bibliometric analysis and reporting on your journal output.
–How does my institution compare to peer institutions?
–How does the research impact of different disciplines in my institution compare?
–How does Dr. Smith's research performance compare to Dr. Jones'?
–What has the organization published? What is the impact of that output?
–What disciplines are represented? What journals?
–Whom are we collaborating with most effectively?

22 Benchmark research performance against peers
[Chart: relative impact for six institutions: +33%, +143%, +54%, +17%, +145%, +94%.]
Relative impact reflects citation performance against the field average.

23 Track changes in research output
Scripps increased article output in pharmacology/toxicology by 90% over the last 9 years.
Do the increases or decreases correlate with staffing or facility changes you made?

24 Support accreditation. Example: Scripps Research Institute

25 Aggregate Performance Indicator: a normalized metric accounting for performance across many fields
[Chart: API values for six universities: 1.69, 1.59, 1.40, 1.16, 0.92, 1.22.]
These universities may focus on very different fields of research. This metric enables us to compare them fairly.

26 Discover potential areas of excellence for growth: trend in biophysics research output
Cornell, the largest institution of the group, produces the most papers.

27 Discover potential areas of excellence for growth: trend in biophysics research citation impact (cites/paper)
However, MSKCC's papers receive the highest rate of citation, and the trend is upward.

28 San Antonio (April 3, 2007) — The University of Texas Health Science Center at San Antonio ranked sixth in the nation in clinical medicine research impact for the period 2001 to 2005… Survey results were derived from the Thomson Scientific University Science Indicators database… Research productivity must be accompanied by use of the research by others in the field, said Brian Herman, Ph.D., vice president for research and professor of cellular and structural biology at the Health Science Center. “If a finding at the Health Science Center is truly novel and able to move a field forward, it will be replicated by researchers in other institutions and will be cited in their works,” Dr. Herman said. The Health Science Center produced 2,576 papers in the clinical medicine category over the five-year period and was assigned a relative impact percentage of 90. According to survey parameters, this meant the Health Science Center’s clinical medicine papers were cited 90 percent more often than the world average for papers in clinical medicine. “To have enough publications and enough citations per paper to be considered in a field as competitive as clinical medicine is a credit to our outstanding faculty,” said Francisco G. Cigarroa, M.D., president of the Health Science Center. “Furthermore, to be that much better than the world average in citations shows the growing recognition of the excellence of this institution.” What data can help me promote our research accomplishments? http://www.uthscsa.edu/hscnews/singleformat2.asp?newID=2353
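The press release's "relative impact percentage" is the distance of a unit's cites-per-paper above (or below) the world average for its field. A one-line sketch, with invented inputs chosen to reproduce a +90% result like the Health Science Center's:

```python
def relative_impact_pct(cites_per_paper, world_avg):
    """Percent above (+) or below (-) the world average citation rate."""
    return 100 * (cites_per_paper / world_avg - 1)

# Hypothetical: 19.0 cites/paper against a world average of 10.0.
print(round(relative_impact_pct(19.0, 10.0)))  # -> 90
```

A value of 0 means exactly world average; negative values mean below it.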

29 How can we identify and promote outstanding or up-and-coming researchers?
–Identify researchers with strong citation performance who are less successful at obtaining funding. Help them to demonstrate their research success to funders using citation metrics.
–Identify researchers with strong funding success but low citation metrics. Help them to identify and submit to high-impact journals.
–Identify high-impact researchers in each department and utilize them as mentors for other faculty.

30 What research is our staff producing? Is it high-impact research?
InCites enables detailed analysis of the publication activity of your staff. Example: Mt Sinai School of Medicine

31 Assess faculty impact
http://www.mssm.edu/profiles/valentin-fuster#bot

32 What has Dr. Fuster published?
Each article has descriptive information and performance metrics.

33 Dr. Fuster's most cited paper: indirect, long-term impact
Baseline for comparison at the journal level: the average citation count for reviews from 1995 in Circulation is 301 citations. Fuster's ratio is 4.87 (1,466/301).

34 Dr. Fuster's most cited paper
Baseline for comparison at the category level: the average citation count for reviews from 1995 in all cardiac and cardiovascular systems journals is 65.04 citations. Fuster's ratio is 22.54 (1,466/65.04).
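The journal- and category-level ratios on these two slides are the same division, with different baselines. Reproducing the slides' own numbers:

```python
# Citations to Fuster's 1995 Circulation review (from the slides):
actual = 1466
journal_expected = 301     # avg cites, 1995 reviews in Circulation
category_expected = 65.04  # avg cites, 1995 reviews in the category

print(round(actual / journal_expected, 2))   # -> 4.87  (journal ratio)
print(round(actual / category_expected, 2))  # -> 22.54 (category ratio)
```

The category ratio is much larger than the journal ratio because Circulation itself is far above the category average, so beating the journal baseline is the harder test.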

35 Dr. Fuster's most cited paper
Percentile ranking of the paper in its category: Fuster's paper ranks in the top 0.009 percent of all cardiac and cardiovascular systems journals from 1995.

36 Dr. Fuster's overall performance
–H-index: 77 (77 papers have been cited at least 77 times each)
–Mean percentile: on average, Fuster's papers rank in the top 31% of their respective fields
–Self-citation analysis: 9.34% of Fuster's citations are self-citations; the h-index without self-citations is 72
–Journal actual/expected ratio: Fuster is cited 48% above average for the journals he publishes in
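The self-citation adjustment reported on this slide can be sketched by dropping self-citations from each paper's count before recomputing the h-index. The author matching below is a naive name check on invented data (real tools use author identifiers, not name strings).

```python
def h_index(citations):
    """Largest N such that N papers have at least N citations each."""
    counts = sorted(citations, reverse=True)
    return max((i for i, c in enumerate(counts, 1) if c >= i), default=0)

def h_index_without_self(papers, author):
    """papers: for each paper, the list of its citing papers' author lists.
    Recomputes the h-index after discarding self-citations."""
    adjusted = [sum(1 for authors in citing if author not in authors)
                for citing in papers]
    return h_index(adjusted)

# Hypothetical citing-author data for three papers:
citing = [
    [["Fuster"], ["Smith"], ["Jones"]],  # 3 cites, 1 self-citation
    [["Smith"], ["Fuster", "Lee"]],      # 2 cites, 1 self-citation
    [["Lee"]],                           # 1 cite, no self-citation
]
print(h_index([len(c) for c in citing]))       # -> 2
print(h_index_without_self(citing, "Fuster"))  # -> 1
```

A small gap between the two values (77 vs. 72 on the slide) suggests the ranking is robust to self-citation; a large gap would be a red flag.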

37 What fields is Dr. Fuster publishing in?
When we rank these categories by their category actual/expected citation ratio, we see that he has the highest impact in hematology (3.85) and peripheral vascular disease (3.47).

38 Compare faculty performance
RJ Desnick specializes in genetics. Normalized metrics help to compare fairly across disciplines, as well as across lengths of career. View summary metrics for Desnick.

39 All metrics and analyses can be performed on any subset of the data:
–Your entire institution
–One researcher
–A group of researchers (e.g. a department)
–A field of research within your institution
–A topic of research within your institution
–A collaboration partnership within your institution

40 How can we maximize our research collaboration?
Which institutions do we collaborate with frequently in cell biology? Which collaborations have the most impact in this field (category performance ratio)?
Example: SUNY Stony Brook, cell biology. Identify high-impact collaborations you can expand.

41 Case Studies: see how other institutions have used our data to make strategic decisions
–Rockefeller University: http://wokinfo.com/wok/media/pdf/tr_rockefeller_case_study.pdf
–University of Toronto: http://scientific.thomsonreuters.com/media/newsletterpdfs/8462431/usi.pdf

42 The future: enterprise knowledge management solution
Institutions, particularly universities, need to track their faculty's activities in a uniform, up-to-date fashion to facilitate performance reviews and institutional reporting (e.g. accreditation reports, board of trustees reports, etc.). To support this, we are building a knowledge management system that goes beyond the traditional bibliometrics that we are known for. This system will enable the profiling of staff across all activities: research, teaching, awards, funding, service, etc.

43 This solution will integrate data from multiple sources
University data:  faculty;  courses and evaluations;  contracts and grants;  students;  document repository;  inventions
Thomson Reuters-provided data:  publications (Web of Science);  conference proceedings;  Library of Congress books;  federal grants;  patents
Faculty data: creative and published works, teaching, service, civic engagement and partnerships

44 Example of faculty profile
Publication records are linked to more information, e.g. Web of Science. We will alert the user to new items they should consider adding to their profile.

45 Finding expertise
The system will have an internal view and a public view.
–Search by name or subject area: find appropriate individuals and view their accomplishments and performance metrics.
–The institution and faculty can flag "private" data so it is not viewed publicly.
–Role-dependent search: only view data you are entitled to see.

46 Dashboards will synthesize data for evaluation decisions
Instant overviews of departments or individuals answer crucial questions: What activities is this department involved in? How do faculty activities relate to strategic goals?

47 Detailed views of all activities, e.g. funded grants:
 How has funding for this department changed over time?
 Where is the funding coming from?
 Are collaborators involved?
 How does funding for this department compare with other departments?

48 Thank you
Websites for further information:
–Research Analytics: http://researchanalytics.thomsonreuters.com/
–InCites: http://researchanalytics.thomsonreuters.com/incites/
–Global Research Reports: http://researchanalytics.thomsonreuters.com/grr/
Contact: ann.kushmerick@thomsonreuters.com

