Oct 2006 Research Metrics
What was proposed … what might work
Jonathan Adams

Oct 2006 Overview
RAE was seen as burdensome and distorting
Treasury proposed a metrics-based QR allocation system
The outline metric model is inadequate, unbalanced and provides no quality assurance
A basket of metrics might nonetheless provide a workable way of reducing the peer review load
Research is a complex process, so no assessment system sufficient to purpose is going to be completely light touch

Oct 2006 The background
RAE introduced in 1986
– ABRC and UGC consensus to increase selectivity
Format settled by 1992
Progressive improvement in UK impact
Dynamic change and improvement at all levels

Oct 2006 The RAE period is linked to an increase in UK share of world citations

Oct 2006 UK performance gain is seen across all RAE grades (Data are core sciences, grade at RAE96)

Oct 2006 Treasury proposals
RAE peer review produced a grade
– Weighting factor in QR allocation model
– Quality assurance
But there were doubters
– Community said the RAE was onerous
– Peer review was opaque
– Funding appeared [too] widely distributed
Treasury wanted transparent simplification of the allocation side

Oct 2006 The next steps model
Noted correlation between QR and earned income (RC or total)
– Evidence drew attention to the statistical link in work on dual support for HEFCE and UUK in 2001 & 2002
Treasury hard-wired the model as an allocation system
– So RC income determines QR
But …
– Statistical correlation is not a sufficient argument
– Income is not a measure of quality and should not be used as a driver for evaluation and reward

Oct 2006 QR and RC income scale together, but the residual variance would have an impact
HEPI produced additional analyses in its report
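The point can be made with a small synthetic illustration (the figures below are invented, not the HEPI or HEFCE data): even when QR and RC income correlate strongly, the residual scatter is large enough that hard-wiring income as the allocation driver moves real money at individual institutions.

```python
# Illustrative only: synthetic data, not the HEPI figures.
# Shows that a strong QR/RC-income correlation still leaves residual
# variance large enough to shift individual allocations.
import numpy as np

rng = np.random.default_rng(0)

rc_income = rng.lognormal(mean=3.0, sigma=1.0, size=50)      # £m, hypothetical HEIs
true_qr = 0.8 * rc_income + rng.normal(0, 0.3 * rc_income)   # current QR, with scatter

# Hard-wired allocation: predict QR purely from RC income
slope, intercept = np.polyfit(rc_income, true_qr, 1)
metric_qr = slope * rc_income + intercept

correlation = np.corrcoef(rc_income, true_qr)[0, 1]
shift = metric_qr - true_qr                                   # gain or loss per institution

print(f"correlation: {correlation:.2f}")
print(f"largest single-institution shift: £{np.abs(shift).max():.1f}m")
```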

Oct 2006 Unmodified outcomes of outline metrics model perturb current system unduly
[Table, £ millions: current HEFCE research funding and change per institution; the figures are not reproduced in the transcript]
WINNERS: Univ Southampton, Univ Cambridge, Univ Leicester, Univ Manchester
LOSERS: Univ Oxford; Royal Holloway, Univ London; Univ Arts London; Imperial Coll London; Univ Coll London; King's Coll London
A new model might produce reasonable change, but few would accept that the current QR allocations are as erroneous as these outcomes suggest

Oct 2006 The problem
The Treasury model over-simplifies
Outcomes are unpredictable
– There are confounding factors such as subject mix
– Even within subjects there are complex cost patterns
The outcome does not inspire confidence and would affect morale
There are no checks and balances
– Risk of perverse outcomes, drift from original model
– Drivers might affect innovation, emerging fields, new staff
There is no quality assurance

Oct 2006 What are we trying to achieve?
We want to lighten the peer review burden, so we need indicators to evaluate research performance, but not simplistic mono-metrics
[Diagram: inputs (funding, numbers, time) enter a "research black box" and emerge as outputs (publications); research quality is what we want to know, while the inputs and outputs are what we have to use]

Oct 2006 Informed assessment comes from an integrated picture of research, not single metrics

Oct 2006 Data options for metrics and indicators
Primary data from a research phase
– Input, activity, output, impact
Secondary data from combinations of these
– e.g. money or papers per FTE
Three attributes for every datum
– Time, place, discipline
– This limits possible sources of valid data
Build up a picture
– Weighted use of multiple indicators
– Balance adjusted for subject
– Balance adjusted for policy purpose
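A minimal sketch of how such a weighted, multi-indicator picture could be assembled, assuming each indicator is first normalised to a subject baseline; the indicator names, baseline values and weights below are illustrative assumptions, not anything proposed in the talk.

```python
# Illustrative sketch: combine several indicators into one subject-adjusted score.
# Indicator names, subject baselines and weights are assumptions for illustration.

def basket_score(indicators: dict, subject_baseline: dict, weights: dict) -> float:
    """Weighted composite of indicators, each normalised to its subject baseline."""
    score = 0.0
    for name, weight in weights.items():
        normalised = indicators[name] / subject_baseline[name]  # 1.0 = subject average
        score += weight * normalised
    return score

# Hypothetical chemistry department
indicators = {"income_per_fte": 95.0, "papers_per_fte": 3.2, "citation_impact": 1.4}
baseline   = {"income_per_fte": 80.0, "papers_per_fte": 2.5, "citation_impact": 1.0}
weights    = {"income_per_fte": 0.4,  "papers_per_fte": 0.2, "citation_impact": 0.4}

print(f"composite score: {basket_score(indicators, baseline, weights):.2f}")
```

Adjusting the weights for subject or for policy purpose would simply mean supplying a different weights dictionary for each discipline or funding objective.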

Oct 2006 We need assured data sourcing
Where the data come from
– Indicator data must emerge naturally from the process being evaluated
– Artificial PIs are just that, artificial
Who collects and collates the data
– This affects accessibility, quality and timeliness
HESA
– Data quality and validation
– Discipline structure
Game playing

Oct 2006 We need to agree discipline mapping What is Chemistry?

Oct 2006 We have to agree how to account for the distribution of data values, e.g. income
[Chart: distribution of income values across units, from maximum to minimum]

Oct 2006 Distribution of data values - impact The variables for which we have metrics are skewed and therefore difficult to picture in a simple way

Oct 2006 Agree purpose for data usage
Data are only indicators
– So we need some acceptable reference system
Skewed profiles are difficult to interpret
We need simple, transparent descriptions
– Benchmarks
– Make comparisons
– Track changes
Use metrics to monitor performance
– Set baseline against RAE2008 outcomes
– Check thresholds to trigger fuller reassessment

Oct 2006 Example – categorising impact data
This grouping is the equivalent of a log2 transformation. There is no place for zero values on a log scale.
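A sketch of that kind of grouping, assuming impact is expressed as rebased, field-normalised citation impact (RBI, world average = 1.0): cited papers fall into log2-width bands and uncited papers get a category of their own, since zero has no place on a log scale. The band edges (powers of two) are an assumption; the slide prescribes only the log2 idea.

```python
# Illustrative grouping of normalised citation impact (RBI) into log2 bins.
# Uncited papers (impact == 0) get their own category, since log2(0) is undefined.
import math

def impact_category(rbi: float) -> str:
    """Assign a paper's rebased impact to a log2-style band."""
    if rbi == 0:
        return "uncited"
    band = math.floor(math.log2(rbi))        # -1 => [0.5, 1), 0 => [1, 2), 1 => [2, 4), ...
    lower, upper = 2.0 ** band, 2.0 ** (band + 1)
    return f"{lower:g} <= RBI < {upper:g}"

for rbi in (0, 0.3, 0.9, 1.24, 2.5, 9.0):
    print(rbi, "->", impact_category(rbi))
```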

Oct 2006 UK ten-year profile
[Chart: impact profile of 680,000 UK papers over ten years; average RBI = 1.24; markers indicate the mode, the mode of cited papers, the median, and a possible threshold of excellence]

Oct 2006 Subject profiles and UK reference

Oct 2006 HEIs – 10 year totals – 4.1 Smoothing the lines would reveal the shape of the profile

Oct 2006 HEIs – 10 year totals – 4.2 Absolute volume would add a further element for comparisons

Oct 2006 Conclusions
We can reduce the peer review burden by increased use of metrics
– But the transition won't be simple
Research is a complex, expert system
Assessment needs to produce
– Confidence among the assessed
– Quality assurance among users
– Transparent outcome for funding bodies
Light touch is possible, but not featherweight
– Initiate a metrics basket linked to RAE2008 peer review
– Set benchmarks & thresholds, then track the basket
– Invoke panel reviews to evaluate change, but only where variance exceeds band markers across multiple metrics
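A minimal sketch of the trigger idea in the last bullet, assuming units are tracked against an RAE2008-linked baseline: a unit is flagged for fuller panel review only when its deviation exceeds the benchmark band on more than one metric at once. The ±20% band and the two-metric trigger are illustrative assumptions.

```python
# Illustrative trigger rule: flag a unit for fuller panel review only when
# variance from the benchmark exceeds the band on several metrics at once.
# Band width (20%) and the two-metric trigger are assumptions for illustration.

def needs_panel_review(current: dict, baseline: dict,
                       band: float = 0.20, min_breaches: int = 2) -> bool:
    breaches = 0
    for metric, base_value in baseline.items():
        deviation = abs(current[metric] - base_value) / base_value
        if deviation > band:
            breaches += 1
    return breaches >= min_breaches

# Hypothetical unit tracked against its RAE2008-linked baseline
baseline = {"income_per_fte": 80.0, "papers_per_fte": 2.5, "citation_impact": 1.0}
current  = {"income_per_fte": 55.0, "papers_per_fte": 2.4, "citation_impact": 0.7}

print(needs_panel_review(current, baseline))  # True: two metrics sit outside the band
```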

Oct 2006 Overview (reprise)
RAE was seen as burdensome and distorting
Treasury proposed a metrics-based QR allocation system
The outline model is inadequate, unbalanced and provides no quality assurance
A basket of metrics might nonetheless provide a workable way of reducing the peer review load
But research is a complex process, so no assessment system sufficient to purpose is going to be completely light touch