The Multi-Dimensional Research Assessment Matrix


The Multi-Dimensional Research Assessment Matrix Henk F. Moed Elsevier, Amsterdam, the Netherlands Seminar Research Evaluation in Practice, National Geographic Society, Washington DC, 17 October 2012

Short CV: Henk F. Moed
1981-2009: Staff member, Centre for Science and Technology Studies (CWTS), Leiden University
2009: Professor of Research Assessment Methodologies, Leiden University
2010 - Sept 2012: Senior Scientific Advisor, SciVal Dept., Elsevier
From Sept 2012: Head of Informetric Research Group, AGRM Dept., Elsevier

Contents
1. Boundaries of the playing field; rules of the game
2. The multi-dimensional research assessment matrix
3. Towards more sophisticated indicators: combining big data sets

(o) Bibliometrics makes sense

Which country has these main collaborators? USA. Main collaborators: UK, China, Germany, Canada.

Which country has these main collaborators? Brazil. Main collaborators: Argentina, USA, Portugal, France, Chile.

Which country has these main collaborators? Malaysia. Main collaborators: Thailand, India, Singapore, Iran, UK.

Which country has these main collaborators? Romania. Main collaborators: France, Hungary, Germany, Italy, Bulgaria.

Which country has these main collaborators? South Africa. Main collaborators: UK, China, USA, Nigeria, Australia, Netherlands.

Section 1: Boundaries of the playing field; rules of the game

(i) Bibliometric research assessment is potentially a proper tool for consolidating academic freedom

(ii) Bibliometric tools help establish a longer-term perspective in research funding

(iii) One must be cautious using “societal benefit” as an assessment criterion for basic research: it cannot be measured in a politically neutral way

(iv) Citations measure scientific-scholarly impact rather than quality or validity

(v) Citation counts in social sciences and humanities may be influenced by political ideologies

[Chart] Citation impact and ideology: the fall of the Berlin Wall in Nov. 1989

(vi) The future of research assessment lies in the intelligent combination of metrics and peer review

(vii) Case study on funding policies of a National Research Council reveals biases in peer review

Affinity applicants – evaluation committee
0: Applicants are/were not members of any committee
1: Co-applicant is/was a member of a committee, but not of the one evaluating
2: First applicant is/was a member of a committee, but not of the one evaluating
3: Co-applicant is a member of the committee(s) evaluating the proposal
4: First applicant is a member of the committee(s) evaluating the proposal

For 15% of submitted applications, an applicant is a member of the evaluating committee (affinity = 3 or 4). [Chart: % of submitted applications by applicant-committee affinity]

The probability of being granted increases with increasing applicant-committee affinity. [Chart: % of granted applications by applicant-committee affinity]

Logistic regression analysis: applicant-committee affinity has a significant effect on the probability of being granted.

MAXIMUM-LIKELIHOOD ANALYSIS-OF-VARIANCE TABLE (N = 2,499)

Source                              DF   Chi-Square   Prob
INTERCEPT                            1        18.47   0.0000
CITATION IMPACT APPLICANT            3        26.97   0.0000 **
Rel. transdisc. impact applicant     1         0.29   0.5926
AFFINITY APPLICANT-COMMITTEE         2       112.50   0.0000 **
Sum requested                        1        45.47   0.0000 **
Institution applicant                4        25.94   0.0000 **
LIKELIHOOD RATIO                   199       230.23   0.0638
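To illustrate the kind of model behind this table, here is a toy logistic regression fitted by gradient ascent. All numbers are synthetic: the affinity effect size (0.5), the intercept, and the sample are invented for illustration and are not the study's data.

```python
import math
import random

def fit_logistic(xs, ys, lr=0.5, epochs=3000):
    """Fit P(granted) = sigmoid(a + b*x) by gradient ascent on the log-likelihood."""
    a, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        ga = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(a + b * x)))
            ga += y - p          # gradient w.r.t. intercept
            gb += (y - p) * x    # gradient w.r.t. affinity coefficient
        a += lr * ga / n
        b += lr * gb / n
    return a, b

random.seed(0)
# Synthetic applications: affinity score 0-4; higher affinity raises grant odds.
xs = [random.randint(0, 4) for _ in range(2000)]
ys = [1 if random.random() < 1 / (1 + math.exp(-(-1.5 + 0.5 * x))) else 0
      for x in xs]

a, b = fit_logistic(xs, ys)
print(f"intercept {a:.2f}, affinity coefficient {b:.2f}")
```

With enough data the fitted affinity coefficient recovers the positive effect built into the simulation, mirroring the significant AFFINITY term in the table above.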

(viii) Data must be accurate and verifiable

(ix) Assessed researchers must have the opportunity to check data and comment on outcomes

(x) A framework is needed to characterize and position bibliometric indicators and products

Section 2: The multi-dimensional research assessment matrix

The Multi-Dimensional Research Assessment Matrix Expert Group on the Assessment of University-Based Research (AUBR, 2010)

Multi-dimensional Research Assessment Matrix (part)

Unit of assessment | Purpose                      | Output dimensions             | Bibliometric indicators      | Other indicators
Individual         | Allocate resources           | Research productivity         | Publications                 | Peer review
Research group     | Improve performance          | Quality, scholarly impact     | Journal citation impact      | Patents, licences, spin-offs
Department         | Monitor research programs    | Innovation and social benefit | Actual citation impact       | Invitations for conferences
Institution        | Increase regional engagement | Sustainability & scale        | International co-authorship  | External research income
Research field     | Promotion, hiring            | Research infrastructure       | Citation ‘prestige’          | PhD completion rates

Note: the matrix is read column-wise. Each column lists independent options, so any unit of assessment can be combined with any purpose, output dimension, and set of indicators.



MD-RAM: Example 1
- Unit of assessment: Individual
- Purpose: Hiring/promotion
- Output dimensions: Productivity & impact
- Bibliometric indicators: Publications in international journals; actual citation impact
- Other indicators: PhD date, place, supervisor; invitations for conferences


MD-RAM: Example 2
- Unit of assessment: Research group
- Purpose: Monitoring a research program
- Output dimensions: Scientific impact
- Bibliometric indicators: Publications in international journals; (trend in) actual citation impact
- Other indicators: Collaborations; topicality
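Read column-wise, the matrix is a menu of independent options, and an assessment profile picks one entry per column. A minimal sketch of that idea follows; the function and field names are mine, while the entries come from the matrix slide.

```python
# Each column of the matrix is an independent list of options.
MATRIX = {
    "unit of assessment": ["Individual", "Research group", "Department",
                           "Institution", "Research field"],
    "purpose": ["Allocate resources", "Improve performance",
                "Monitor research programs", "Increase regional engagement",
                "Promotion, hiring"],
    "output dimensions": ["Research productivity", "Quality, scholarly impact",
                          "Innovation and social benefit",
                          "Sustainability & scale", "Research infrastructure"],
    "bibliometric indicators": ["Publications", "Journal citation impact",
                                "Actual citation impact",
                                "International co-authorship",
                                "Citation 'prestige'"],
    "other indicators": ["Peer review", "Patents, licences, spin-offs",
                         "Invitations for conferences",
                         "External research income", "PhD completion rates"],
}

def profile(**choices):
    """Validate a column-wise selection against the matrix and return it."""
    for column, value in choices.items():
        options = MATRIX[column.replace("_", " ")]
        if value not in options:
            raise ValueError(f"{value!r} is not an option in column {column!r}")
    return choices

# Roughly Example 1 above: assessing an individual for promotion/hiring.
p = profile(unit_of_assessment="Individual",
            purpose="Promotion, hiring",
            bibliometric_indicators="Actual citation impact",
            other_indicators="Invitations for conferences")
print(p)
```

The design point is simply that rows carry no meaning: valid profiles cut across rows, which is why the slide warns against reading the table row-wise.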

Research assessment methodologies: important considerations
- The methodology must be fit-for-purpose: what is the primary “problem” to be solved?
- Be aware of unintended effects.
- Change a methodology every 5-10 years.
- What is an acceptable “error rate”? A method that is wrong in individual cases may still be beneficial for the system as a whole.

Section 3: Towards more sophisticated indicators: combining big data sets

[Diagram] A unit of assessment surrounded by its data sources: journal articles + citations, journal full-text data, journal usage data, books, patents, conference proceedings, trade journals, newspapers, social media.

(a) Downloads vs. citations: what do full-article downloads measure?

[Diagram] Authors vs. readers: how far do the author and reader populations of a journal overlap?

Hypothesis on the degree of correlation between downloads and citations: where the author and reader populations largely coincide, the correlation is strong; where readers greatly outnumber authors, it is weak.

[Chart] Usage vs. citations per main field, ranging from scientific to societal orientation; the position of psychology (PSYCHOL) is marked as unclear.
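The hypothesised strong/weak contrast can be checked per field by correlating per-article downloads with citations. A standard-library-only sketch on invented counts for two hypothetical fields (all figures are illustrative, not real usage data):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Synthetic per-article counts for two hypothetical fields.
# "specialist": readers are mostly authors, so downloads track citations.
# "broad": a large non-authoring readership, so the two can diverge.
specialist = {"downloads": [120, 340, 80, 500, 260],
              "citations": [10, 30, 6, 45, 22]}
broad = {"downloads": [900, 150, 700, 200, 820],
         "citations": [4, 12, 2, 15, 3]}

r1 = pearson(specialist["downloads"], specialist["citations"])
r2 = pearson(broad["downloads"], broad["citations"])
print(round(r1, 2), round(r2, 2))
```

On these invented counts the specialist field shows a strong positive correlation and the broad-readership field does not, which is the pattern the hypothesis predicts.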

(b) Patent citations to journal articles: The technological impact of research

The Technological Impact of Library Science Research: A Patent Analysis (Halevi et al., 2012). Citations by patent examiners and inventors, from patents (TotalPatent) to 42 library science journals (Scopus).

Cited articles (keywords in titles): the articles feature information retrieval and indexing, and information and document management systems pertaining to the development of electronic and digital libraries.
Citing patents (keywords in titles): the patents focus on electronic information administration, navigation, and product and service management in commercial systems.

(c) International scientific migration

International migration vs. co-authorship
- International co-authorship: authors from institutions located in different countries jointly publish a paper. (“Country” refers to where an author works, NOT to the author's nationality.)
- International migration: a scientific author moves from one country to another.

[Map] Countries with a migration/collaboration ratio > 1.2

Ratio % migration / % co-authorship for selected country pairs (“?” marks values not recoverable from the transcript):

TO    FROM   Co-authored papers   Migrating authors   Ratio
PAK   IND    276                  118                 3.6
PRT   BRA    1,971                423                 3.4
?     ?      ?                    96                  3.3
NLD   IRN    492                  80                  3.1
USA   ?      12,013               3,307               2.8
?     ?      145                  21                  ?
CHN   TWN    3,979                1,048               2.6
?     ?      3,039                780 (352?)          2.4
MYS   NGA    122                  31                  2.1

Language similarity drives migration more strongly than it drives co-authorship. Political tensions affect migration less than they affect co-authorship.
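The ratio in the table normalises each pair's flow by world totals: the pair's share of all migrating authors divided by its share of all co-authored papers. A sketch of that computation follows; the world totals here are invented placeholders, so the resulting ratio is illustrative only and does not reproduce the table's figures.

```python
def flow_ratio(migrating, coauthored, total_migrating, total_coauthored):
    """Ratio of a country pair's share of migration to its share of co-authorship.

    Equivalent to (migrating / coauthored) * (total_coauthored / total_migrating).
    """
    pct_migration = 100.0 * migrating / total_migrating
    pct_coauthorship = 100.0 * coauthored / total_coauthored
    return pct_migration / pct_coauthorship

# Hypothetical world totals (placeholders, not the study's figures).
TOTAL_MIGRATING, TOTAL_COAUTHORED = 100_000, 3_000_000

# IND -> PAK pair counts from the slide: 118 migrating authors, 276 papers.
r = flow_ratio(118, 276, TOTAL_MIGRATING, TOTAL_COAUTHORED)
print(round(r, 1))
```

A ratio above 1 means the pair exchanges people disproportionately more than it co-publishes, which is what the table highlights for pairs like IND/PAK and BRA/PRT.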

(d) Citation context analysis Combining citation data from Scopus with full text article data from ScienceDirect

The use of contextual citation analysis to disclose the thematic and conceptual flow of cross-disciplinary research: the case of the Journal of Informetrics, 2007 (Gali Halevi et al., to be published, 2012)

[Word clouds] Emerging themes per section (Introduction, Findings & Discussion, Conclusions), including themes from outside the discipline. The sequence of themes in the word clouds might suggest that individual output evaluation by structured peer review leads to an acknowledgment of the importance and evolution of networks rather than individuals.

(e) Book Citation Index: citation flows between books and journals

A Scholarly Book Citation Index: approaches
- Add selected book series
- Add books from selected publishers
- Add selected individual book titles

Thank you for your attention!

Elsevier Bibliometric Research Program: www.ebrp.elsevier.com