Research evaluation at CWTS Meaningful metrics, evaluation in context

Presentation transcript:

Research evaluation at CWTS: Meaningful metrics, evaluation in context. Ed Noyons, Centre for Science and Technology Studies (CWTS), Leiden University. RAS, Moscow, 10 October 2013

Outline: Centre for Science and Technology Studies (CWTS, Leiden University) history in short; CWTS research programme; recent advances.

25 years CWTS: history in short

25 years CWTS history in short (1985-2010): Started around 1985 by Anthony van Raan and Henk Moed; one and a half positions funded by the university; context is science policy and research management; mainly contract research and services (research evaluation); staff stable at around 15 people (10 researchers); main focus on publication and citation data (in particular the Web of Science).

25 years CWTS history in short (2010 - …): Block funding since 2008; since 2010 moving from mainly services with some research to a research institute with services; new director Paul Wouters; new recruitments: now ~35 people.

CWTS Research programme: research and services

Bibliometrics (in the context of science policy) is ...

Opportunities: research accountability => evaluation; need for standardization and objectivity; more data available.

Vision: quantitative analyses beyond the 'lamppost': other data; other outputs; research 360º; input; societal impact/quality; researchers themselves.

Background of the CWTS research programme. Already existing questions: how are actors doing, what does a field look like, what are the main developments, which university performs best, … New questions: How do scientific and scholarly practices interact with the "social technology" of research evaluation and monitoring knowledge systems? What are the characteristics, possibilities and limitations of advanced metrics and indicators of science, technology and innovation?

Current CWTS research organization. Chairs: Scientometrics; Science policy; Science, technology & innovation. Working groups: Advanced bibliometrics; Evaluation Practices in Context (EPIC); Social sciences & humanities; Society Using Research Evaluation (SURE); Career studies.

Back to bibliometrics: a look under the lamp post

Recent advances at CWTS. Platform: Leiden Ranking. Indicators: new normalization to address multidisciplinary journals and (journal-based) classification. Structuring and mapping: advanced network analyses; publication-based classification. Visualization: VOSviewer.

The Leiden Ranking - http://www.leidenranking.com

Platform: Leiden Ranking (http://www.leidenranking.com). Based on Web of Science (2008-2011); only universities (~500); the only dimension is scientific research; indicators (state of the art): production, impact (normalized and 'absolute'), collaboration; size-dependent and size-independent variants. Research institutes will be added.

Leiden Ranking - world top 3 by PP(top 10%), a normalized impact indicator: the proportion of a university's publications that belong to the 10% most frequently cited in their field; the expected value is 10%. Stability intervals are shown to enhance certainty and prevent misinterpretation.
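To make the indicator concrete, here is a minimal sketch in Python of how PP(top 10%) and a bootstrap-style stability interval could be computed; the list `publications`, the field `is_top10`, and the resampling scheme are illustrative assumptions, not the Leiden Ranking implementation.

```python
# Hedged sketch: PP(top 10%) with a bootstrap stability interval.
# `is_top10` is assumed precomputed per paper (True if the paper is among the
# 10% most cited in its field and publication year).
import random

def pp_top10(publications):
    """Proportion of publications in the top 10% most cited of their field/year."""
    return sum(p["is_top10"] for p in publications) / len(publications)

def stability_interval(publications, n_samples=1000, level=0.95):
    """Resample the publication set and report the central `level` share
    of the resulting PP(top 10%) values."""
    scores = []
    for _ in range(n_samples):
        sample = [random.choice(publications) for _ in publications]
        scores.append(pp_top10(sample))
    scores.sort()
    lo = scores[int((1 - level) / 2 * n_samples)]
    hi = scores[int((1 + level) / 2 * n_samples) - 1]
    return lo, hi

pubs = [{"is_top10": random.random() < 0.1} for _ in range(500)]
print(pp_top10(pubs), stability_interval(pubs))
```

The interval narrows as the publication set grows, which is why small universities get wide stability intervals in the ranking.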

Russian universities (impact). RAS is not a university; it will be added in the next edition.

Russian universities (collaboration)

Impact normalization (MNCS): dealing with field differences. We will look at the Mean Normalized Citation Score (MNCS). This is similar to PP(top 10%), but the former is an average and therefore more sensitive to outliers.

Background and approach: Impact is measured by the number of citations received, excluding self-citations; fields differ in citing behaviour, so a citation in one field is worth more than a citation in another; normalization can be done by journal category or by citing context.
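A minimal formulation of the MNCS described above (a sketch using my own symbols, not necessarily the exact CWTS notation): with $c_i$ the number of citations of publication $i$ (excluding self-citations) and $e_i$ the expected (average) number of citations of publications of the same field, publication year and document type,

$$\mathrm{MNCS} = \frac{1}{n}\sum_{i=1}^{n}\frac{c_i}{e_i},$$

so a value of 1 indicates impact at the world average of the corresponding fields.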

Issues related to the journal category-based approach: scope of category; scope of journal.

Journal classification 'challenge' (scope of category), e.g. cardio research.

Approach: source-normalized MNCS. Source normalization (a.k.a. citing-side normalization): no field classification system; citations are weighted differently depending on the number of references in the citing publication; hence, each publication has its own environment to be normalized by.

Source-normalized MNCS (cont'd): normalization based on citing context; normalization at the level of individual papers (e.g., a paper X is normalized by the average number of references in the papers citing X); only active references are considered: references published in the period between publication and being cited, and references covered by WoS.
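As a rough illustration of the citing-side weighting described on these slides, the sketch below gives each citation to a paper a weight of 1 divided by the number of active references in the citing paper. The data layout (`references`, `year`, `id`) is hypothetical, and the actual source-normalized MNCS computation at CWTS differs in detail.

```python
# Hedged sketch of citing-side (source) normalization: a citation counts for
# less when the citing paper has many active references.

def active_refs(citing_paper, cited_year, database_covered):
    """Count references of the citing paper that are 'active': published in the
    window between `cited_year` and the citing year, and covered by the database."""
    return sum(
        1
        for ref in citing_paper["references"]
        if cited_year <= ref["year"] <= citing_paper["year"] and ref["id"] in database_covered
    )

def source_normalized_score(cited_paper, citing_papers, database_covered):
    """Sum of 1 / (active references) over all papers citing `cited_paper`."""
    score = 0.0
    for citing in citing_papers:
        r = active_refs(citing, cited_paper["year"], database_covered)
        if r > 0:
            score += 1.0 / r
    return score
```

Because the weighting depends only on the citing papers themselves, no journal-based field classification is needed.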

Networks and visualization: collaboration, connectedness, similarity, ...

VOSviewer: collaboration. Lomonosov Moscow State University (MSU), WoS (1993-2012); top 50 most collaborative partners; co-published papers.

Other networks: structure of science output (maps of science); oeuvres of actors; similarity of actors (benchmarks based on profile); …

Publication-based classification: structure of science independent from journal classification

Publication-based classification (WoS 1993-2012): publication-based clustering (each publication in exactly one cluster); independent from journals; clusters based on citation relations between publications; three levels: top (21 clusters), intermediate (~800), bottom (~22,000); challenges: labeling and dynamics.
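The core idea, clustering publications on their direct citation relations rather than by journal, can be illustrated with generic community detection; the sketch below uses networkx's modularity-based clustering as a stand-in for the CWTS algorithm and its three-level hierarchy, with made-up citation pairs.

```python
# Hedged illustration of publication-based classification: cluster publications
# directly on citation relations instead of via journal categories.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Hypothetical direct-citation pairs (citing_id, cited_id).
citations = [("p1", "p2"), ("p1", "p3"), ("p2", "p3"), ("p4", "p5"), ("p5", "p6")]

graph = nx.Graph()              # citation direction is ignored for clustering
graph.add_edges_from(citations)

clusters = greedy_modularity_communities(graph)
for i, cluster in enumerate(clusters):
    print(f"cluster {i}: {sorted(cluster)}")
```

Applying such a clustering at several resolutions gives the kind of top/intermediate/bottom hierarchy mentioned on the slide.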

Map of all sciences (784 fields, WoS 1993-2012). Each circle represents a cluster of publications; colours indicate clusters of fields/disciplines (social and health sciences; cognitive sciences; maths and computer sciences; biomedical sciences; physical sciences; earth, environmental and agricultural sciences); distance represents relatedness (citation traffic); surface represents volume.

Positioning of an actor in the map: overall activity (world and, e.g., Lomonosov Moscow State University, MSU) gives the proportion of MSU output relative to the world; activity per 'field' (world and MSU) gives the proportion of MSU in each field; the relative activity of MSU per 'field' is then scored between 0 (blue) and 2 (red), with '1' (green) if the proportion in the field equals the overall proportion.
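A minimal sketch of the relative-activity score described here, assuming simple field-by-field publication counts for the actor and the world; capping at 2 matches the colour scale on the slide, while the function and variable names are my own.

```python
# Hedged sketch: relative activity of an actor per field, scored 0..2.

def relative_activity(actor_counts, world_counts):
    """actor_counts / world_counts: dicts mapping field -> publication count."""
    actor_total = sum(actor_counts.values())
    world_total = sum(world_counts.values())
    scores = {}
    for field, world_n in world_counts.items():
        world_share = world_n / world_total if world_total else 0.0
        actor_share = actor_counts.get(field, 0) / actor_total if actor_total else 0.0
        # 1.0 means the actor is as active in this field as in its output overall.
        scores[field] = min(actor_share / world_share, 2.0) if world_share else 0.0
    return scores
```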

Positioning Lomonosov MSU

Positioning Lomonosov MSU

Positioning Russian Academy of Sciences (RAS)

Alternative view Lomonosov (density)

Using the map: benchmarks. Benchmarking on the basis of the research profile, i.e. the distribution of output over the 784 fields; take the profile of each university in the Leiden Ranking; compare it to the MSU profile; identify the most similar.
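A hedged sketch of this benchmarking step: represent each university by its distribution of output over the 784 fields and rank by similarity to the MSU profile. The slide does not state which similarity measure is used; cosine similarity is assumed here purely for illustration.

```python
# Hedged sketch: rank universities by similarity of their field profiles.
import math

def cosine_similarity(profile_a, profile_b):
    """Profiles are dicts mapping field -> publication count (or share)."""
    fields = set(profile_a) | set(profile_b)
    dot = sum(profile_a.get(f, 0) * profile_b.get(f, 0) for f in fields)
    norm_a = math.sqrt(sum(v * v for v in profile_a.values()))
    norm_b = math.sqrt(sum(v * v for v in profile_b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def most_similar(target_profile, all_profiles, top_n=6):
    """Rank universities by similarity of their field profile to the target."""
    ranked = sorted(
        ((name, cosine_similarity(target_profile, prof)) for name, prof in all_profiles.items()),
        key=lambda item: item[1],
        reverse=True,
    )
    return ranked[:top_n]
```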

Most similar to MSU among Leiden Ranking universities: FR - University of Paris-Sud 11; RU - Saint Petersburg State University; JP - Nagoya University; FR - Joseph Fourier University; CN - Peking University; JP - University of Tokyo. For these we have cleaned data.

Density view MSU

Density view St. Petersburg State University

VOSviewer (Visualization of Similarities), http://www.vosviewer.com: open-source application; software to create maps; input: publication data; output: similarities among publication elements: co-authors, co-occurring terms, co-cited articles, …
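To illustrate what "similarities among publication elements" can look like, the sketch below counts co-authorship co-occurrences from a few made-up publication records; VOSviewer itself reads bibliographic files directly, so this is only a conceptual example of the underlying co-occurrence data.

```python
# Hedged sketch: build a co-authorship co-occurrence count from publication records.
from collections import Counter
from itertools import combinations

publications = [
    {"authors": ["Ivanov", "Petrov", "Smith"]},
    {"authors": ["Ivanov", "Smith"]},
    {"authors": ["Petrov", "Lee"]},
]

co_occurrence = Counter()
for pub in publications:
    for a, b in combinations(sorted(set(pub["authors"])), 2):
        co_occurrence[(a, b)] += 1      # number of co-published papers per author pair

for pair, count in co_occurrence.most_common():
    print(pair, count)
```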

More information on CWTS and methods: www.cwts.nl; www.journalindicators.com; www.vosviewer.com; noyons@cwts.leidenuniv.nl

THANK YOU

Research in context: the basic model in which we operate (research evaluation)

Example (49 research communities of a Finnish university): 'positive' effect vs. 'negative' effect

RC with a 'positive' effect: in the most prominent field the impact increases.

RC with a 'negative' effect: in the most prominent field the impact stays the same; in a less prominent field the impact decreases.

Wrap-up on normalization: normalization based on journal classification has its flaws; we have recently developed an alternative; test sets in recent projects show small (but relevant) differences.