
RESEARCH EVALUATION: chasing indicators of excellence and impact
Evidence, Thomson Reuters
UHMLG Preparing for Research Assessment, Royal Society of Medicine
Jonathan Adams, Director, Research Evaluation
7 March 2011

WE HAVE TO RESPOND TO GLOBAL CHALLENGES

RESEARCH ASSESSMENT PROVIDES US WITH INFORMATION TO DO THAT
Global challenges and dynamism
Economic turbulence and threats to public resourcing in all sectors
Scarce resources
– must be distributed selectively
– in a manner that is equitable
– and maintains academic confidence
But what are our criteria?
– What is research quality?
– What is excellence?
– What is impact?

WE CANNOT DIRECTLY ASSESS WHAT WE WANT TO KNOW
Conventionally, this problem is addressed by expert and experienced peer review.
Peer review is not without its problems: peer review of academic research tends to focus on academic impact, so other forms of impact require merit review, and expert review may be opaque to other stakeholders.
Concerns about objectivity are addressed by introducing quantitative indicators.

INDICATORS, NOT METRICS
It's like taking bearings from your yacht:
A single indicator is not enough.
Good combinations of indicators take distinctive bearings, or differing perspectives, across the research landscape.
They are unlikely to agree completely, which gives us an estimate of our uncertainty.
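To make the "bearings" idea concrete, here is a minimal Python sketch, using made-up indicator values, of how several relative indicators for one research unit might be combined, with their spread read as a rough estimate of uncertainty. The indicator names and numbers are illustrative only, not drawn from any real dataset.

```python
# A minimal sketch with hypothetical values: several indicators for one
# research unit, each expressed relative to a benchmark of 1.0, combined
# and summarised so that their disagreement gives a sense of uncertainty.
from statistics import mean, stdev

indicators = {
    "normalised_citation_impact": 1.35,    # citations vs. field/year baseline
    "share_of_highly_cited_papers": 1.20,  # top-10% papers vs. expected share
    "external_income_per_fte": 0.95,       # research income vs. sector average
    "peer_esteem_score": 1.10,             # survey-based, rescaled so 1.0 = average
}

values = list(indicators.values())
print(f"Mean relative performance: {mean(values):.2f}")
print(f"Spread across indicators (standard deviation): {stdev(values):.2f}")

# The bearings rarely agree exactly; a wide spread warns that no single
# indicator should be read as a precise measure of performance.
```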

PRINCIPLES OF QUANTITATIVE RESEARCH EVALUATION
First, note that there are no absolutes; it's all relative.
Impact may be local, national or international; we need benchmarks to make any sense of a number.
Are the proposed data relevant to the question?
Can the available data address the question?
What data do we have that we can use?

RESEARCH PERFORMANCE INDICATORS COME FROM THE RESEARCH PROCESS

WE CAN EXTEND THIS OVER THE WHOLE CYCLE (activities are then not synchronous)

WE HAVE A WIDE RANGE OF DATA AND POTENTIAL INDICATORS
Note that all these data points are characterised by:
– Location: where the activity took place
– Time: when the activity took place
– Discipline: the subject matter of the activity
All of these should be taken into account in evaluation.
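As a sketch of what taking location, time and discipline into account might look like in practice, the following Python fragment records a single hypothetical data point from the research process. The class and field names are illustrative assumptions, not drawn from any particular system.

```python
# A minimal sketch of one data point from the research process, keeping the
# three attributes every evaluation should control for: location, time and
# discipline. All names and values are hypothetical.
from dataclasses import dataclass
from datetime import date

@dataclass
class ResearchDataPoint:
    activity: str      # e.g. "journal article", "grant award", "PhD completion"
    location: str      # where the activity took place
    when: date         # when the activity took place
    discipline: str    # the subject matter of the activity
    value: float       # the quantity being counted (citations, income, etc.)

point = ResearchDataPoint(
    activity="journal article",
    location="University of Example, UK",
    when=date(2009, 6, 1),
    discipline="Physiology",
    value=12.0,        # e.g. citations received to date
)
print(point)
```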

PRINCIPLES OF QUANTITATIVE RESEARCH EVALUATION
Are the proposed data relevant to the question?
Can the available data address the question?
Are we comparing 'like-with-like'?
Can we test outcomes by using multiple indicators?
Have we 'normalised' our data?
– Consider relative values, not absolute values
Do we understand the characteristics of the data?
Are there artefacts in the data that require editing?
Do the results appear reasonable?

HOW CAN WE JUDGE POSSIBLE INDICATORS?
Relevant and appropriate
– Are indicators correlated with other performance estimates?
– Do indicators really distinguish 'excellence' as we see it?
– Are these the indicators the researchers would use?
Cost effective
– Data accessibility, coverage, cost and validation
Transparent, equitable and stable
– Are the characteristics and dynamics of the indicators clear?
– Are all institutions, staff and subjects treated equitably?
– How do people respond? Can they manipulate indicator outcomes?
"Once an indicator is made a target for policy, it starts to lose the information content that initially qualified it to play such a role" (Goodhart's Law)

COMMUNITY BEHAVIOUR HAS RESPONDED TO EVALUATION
[Table: counts and percentages of submitted outputs by type (books and chapters, conference proceedings, journal articles, other) for Science, Engineering, Social sciences, and Humanities and arts in RAE1996, RAE2001 and RAE2008; the individual figures are not recoverable from this transcript.]

WHY BIBLIOMETRICS ARE A POPULAR SOURCE OF RESEARCH INDICATORS
Publication is a universal characteristic of academic research and provides a standard 'currency'.
Citations are a natural part of academic behaviour.
Citation counts are associated with academic 'impact'
– Impact is arguably a proxy for quality
Data are accessible, affordable and increasingly international, though there is subject imbalance.
Data characteristics are well understood and widely explored:
– Citation counts grow over time
– Citation behaviour is a cultural characteristic, which varies between fields
– Citation behaviour may vary between countries

CITATION COUNTS GROW OVER TIME AND RATES VARY BETWEEN FIELDS

PAPERS ARE MORE LIKELY TO BE CITED OVER TIME

RAW CITATION COUNTS MUST BE ADJUSTED USING A BENCHMARK
First, we need to separate articles and reviews.
Then 'normalise' the raw count by using a global reference benchmark:
– Take year of publication into account
– Take field into account
But how do we define field?
– Projects funded by a Research Council
– Departments which host a group of researchers
– Journal sets linked by citations
– Granularity: Physiology – Life science – Natural sciences
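A minimal sketch of the normalisation step follows, assuming a hypothetical table of world baselines (average citations per paper for the same field, publication year and document type). The baseline figures are invented for illustration and do not come from any Thomson Reuters product.

```python
# Field- and year-normalised citation impact: the raw citation count divided
# by the world average for comparable papers. Baseline values are invented.
world_baseline = {
    # (field, publication year, document type): average citations per paper
    ("Physiology", 2006, "article"): 9.4,
    ("Physiology", 2006, "review"): 22.1,
    ("Engineering", 2006, "article"): 3.8,
}

def normalised_impact(citations, field, year, doc_type):
    """Raw citation count divided by the expected count for comparable papers."""
    expected = world_baseline[(field, year, doc_type)]
    return citations / expected

# The same raw count of 14 citations means different things in different
# fields once the benchmark is applied (values above 1.0 are above world average).
print(f"Physiology article:  {normalised_impact(14, 'Physiology', 2006, 'article'):.2f}")
print(f"Engineering article: {normalised_impact(14, 'Engineering', 2006, 'article'):.2f}")
```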

NORMALISED CITATION IMPACT CORRELATES WITH PEER REVIEW (Chemistry data)
Methodology affects the detail but not the sense of the outcome.
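The correlation itself can be checked with a rank statistic. Below is a small sketch with made-up department-level numbers (the slide's chemistry data are not reproduced here), using Spearman's rho so that only the ordering of departments matters.

```python
# A minimal sketch, with invented numbers, of testing how well normalised
# citation impact tracks peer-review grades across a set of departments.
from scipy.stats import spearmanr

peer_review_grade = [5.0, 4.5, 4.0, 3.5, 3.0, 2.5]               # e.g. RAE-style grades
normalised_citation_impact = [1.60, 1.35, 1.20, 0.95, 0.90, 0.70]

rho, p_value = spearmanr(peer_review_grade, normalised_citation_impact)
print(f"Spearman rank correlation: {rho:.2f} (p = {p_value:.3f})")
```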

THIS IS MOSTLY ABOUT EXCELLENCE: WHAT IS IMPACT?
Research excellence might be termed 'academic impact'.
Other forms of impact for which we may legitimately seek evaluation are:
– Economic impact
– Social impact
Quantitative research evaluation traces its origins back to the 1980s.
The DTI spent much money in the 1990s failing to index economic impact.
It is difficult to track many research innovations through to a new product or process, or vice versa:
– Links are many-to-many and time-delayed
Social impact is difficult to define or capture.

CHASING IMPACT
Eugene Garfield originally talked about citation counts as an index of 'impact' fifty years ago.
The current focus on economic and social impact should be seen as a serious engagement with other modes of recognising and demonstrating the value of original and applied research.
Of course:
– The objectives are undefined, which undermines any evaluation
– It is easier to do this in some subjects than others
– Much of the current material is anecdotal
– It is difficult to validate without indicators
But a start has been made:
– The principles should follow those of research evaluation
– There must be ownership by the disciplinary communities
