The Research Excellence Framework: principles and practicalities Stephen Pinfield Thanks to Paul Hubbard and Graeme Rosenberg of HEFCE for providing much of the information in this presentation

REF: key HEFCE principles
A unified framework for research assessment and funding which accommodates differences between disciplines
Robust research quality profiles for all subjects across the UK, benchmarked against international standards
Emphasis on identifying and encouraging excellent research of all kinds
Greater use of metrics than at present, including bibliometric indicators for all disciplines where these are meaningful
Reduced burden on HEIs

Broad approach to assessment
Assessment will be through:
o Bibliometric indicators of quality or expert review of outputs (possibly a combination of these)
o Other quantitative indicators
o Supplementary qualitative information
Which of these elements are employed, and the balance between them, will vary as appropriate to each subject
For all subjects, expert panels will advise on the selection, interpretation and combination of the assessment components to produce overall quality profiles

Timetable
Up to spring 2009: Bibliometrics pilot and other development work
Mid 2009: Consult on all main features of the REF, including operational details of the bibliometrics process, and take decisions
Calendar year 2010: Undertake full nationwide bibliometrics exercise in appropriate subjects. Establish expert panels for all subjects. Consult on choice and use of assessment components for each subject group
Metrics begin to inform an element of HEFCE funding in some subjects
Undertake full assessment process for all subjects, including light-touch peer review

Issues for further work
How to produce robust bibliometric indicators for different fields
How to combine metrics and expert input to form quality profiles
How to group disciplines and configure expert panels
The role and constitution of expert panels
Reducing burden on the sector
Promoting equality and diversity
Preparing credible and deliverable detailed plans for phasing in and implementing the REF

Progress so far
A pilot of the bibliometrics process is underway
Exploring how best to assess the quality of user-valued research
Identifying available sources of data for potential metrics
HEFCE is establishing a series of expert groups to advise on the key issues for the design of the REF, drawing in particular on the experience of the RAE
Work has been commissioned to gather evidence about the accountability burden and the equality and diversity implications of the move from the RAE to the REF
Informal discussions with a range of stakeholders on the key questions have begun

Light-touch peer review
Expected to be the dominant mode of quality assessment for most social science, arts and humanities disciplines
Need to consider workload on institutions and on panels
Selection of staff and outputs: are there any realistic alternatives?
HEFCE will consider options for greater use of metrics to inform peer review judgements

Metrics
HEFCE aims to define a common family of metrics for the REF that subject groups can draw on:
o Bibliometrics
o Research income
o Research students
o Esteem indicators?
o Other qualitative information?
HEFCE will take advice from expert groups on the choice and use of metrics, and will seek to draw on existing sources of data as far as possible

Bibliometrics: principles
Building on expert advice and consultation, HEFCE has identified the following key features:
o Bibliometrics have the potential to provide robust proxy indicators of quality across a number of subjects (including most science-based disciplines)
o They need to be used alongside other data and information, with advice on interpretation from expert panels
o Indicators to be based on citation rates per paper, benchmarked against worldwide norms for the field (and year and type of publication)
o Results to be aggregated for substantial bodies of work and presented as a citation profile
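The normalisation and aggregation described above can be sketched in code. This is an illustrative calculation only, not HEFCE's actual method: the field baselines, the threshold values and the band structure are all invented for the example.

```python
# Illustrative sketch of field-normalised citation indicators:
# each paper's citation count is divided by the worldwide average
# for its field and publication year, and the normalised scores
# for a body of work are then binned into a citation profile.
# All baseline figures and thresholds below are assumed values.

# (field, year) -> worldwide mean citations per paper (invented)
BASELINES = {
    ("Chemistry", 2004): 12.0,
    ("Chemistry", 2006): 8.0,
}

def normalised_impact(citations, field, year):
    """Citations relative to the worldwide norm for the field and year."""
    return citations / BASELINES[(field, year)]

def citation_profile(papers, thresholds=(0.5, 1.0, 2.0, 4.0)):
    """Aggregate normalised scores into percentage bands.

    Returns the share of papers below the first threshold, between
    successive thresholds, and above the top one.
    """
    bands = [0] * (len(thresholds) + 1)
    for citations, field, year in papers:
        score = normalised_impact(citations, field, year)
        for i, t in enumerate(thresholds):
            if score < t:
                bands[i] += 1
                break
        else:
            bands[-1] += 1
    return [100.0 * b / len(papers) for b in bands]

papers = [
    (6, "Chemistry", 2004),   # score 0.5: at the world average band edge
    (24, "Chemistry", 2004),  # score 2.0: twice the world average
    (40, "Chemistry", 2006),  # score 5.0: well above the top threshold
    (2, "Chemistry", 2006),   # score 0.25: bottom band
]
print(citation_profile(papers))  # [25.0, 25.0, 0.0, 25.0, 25.0]
```

Presenting the result as a profile across bands, rather than a single average, matches the quality-profile approach used elsewhere in the framework.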

The bibliometrics pilot
The pilot aims to develop and test a number of issues:
o Which disciplines? (All disciplines with at least moderate citation coverage are included in the pilot)
o Which staff and papers should be included? Universal or selective coverage? Are papers credited to the researcher or to the institution?
o How to collect data, and the implications for institutions
o Which citation database(s)?
o Refining the methods of analysis, including field normalisation and the handling of self-citation
o Thresholds for the citation profile
o Interpretation and use by expert panels
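One of the refinements listed above is the handling of self-citation. A minimal sketch of one common convention, in which a citation is discounted when any author appears on both the citing and the cited paper, might look as follows; the pilot would need to fix its own definition, and the function names and data here are purely illustrative.

```python
def is_self_citation(citing_authors, cited_authors):
    """True if any author appears on both the citing and the cited
    paper (one common convention; other definitions exist)."""
    return bool(set(citing_authors) & set(cited_authors))

def external_citation_count(cited_authors, citing_papers):
    """Count citations to a paper after removing self-citations."""
    return sum(
        1 for citing_authors in citing_papers
        if not is_self_citation(citing_authors, cited_authors)
    )

# Invented example: three papers cite a paper by Jones and Brown;
# two of them share an author with it, so only one citation counts.
cites = [["Smith", "Jones"], ["Patel"], ["Jones", "Lee"]]
print(external_citation_count(["Jones", "Brown"], cites))  # 1
```

Author-name matching is itself a hard problem (common names, initials, name changes), which is one reason the choice of citation database matters.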

Bibliometrics pilot institutions
Bangor University
University of Bath
University of Birmingham
Bournemouth University
University of Cambridge
University of Durham
University of East Anglia
University of Glasgow
Imperial College London
Institute of Cancer Research
University of Leeds
London Sch of Hygiene and Trop Med
University of Nottingham
University of Plymouth
University of Portsmouth
Queens University, Belfast
Robert Gordon University
Royal Veterinary College
University of Southampton
University of Stirling
University of Sussex
University College London

The bibliometrics pilot: timetable
May 08-Jun 08: Select HEIs/contractors
Aug 08-Nov 08: Data collection
Nov 08-March 09: Data analysis
Spring 09: Pilot results

The pilot so far
Evidence Ltd has been commissioned to run the pilot
Mainly using Web of Science data, but HEFCE will also explore Scopus
The institutions have provided initial data about all known research staff and outputs for the period Jan 2001 to Dec 2007 (in relevant disciplines)
This will be supplemented by records identified in the Web of Science
A JISC project will produce case studies of the pilot institutions' data collection systems
HEFCE is currently commissioning a project to identify lessons learned by the pilot HEIs and disseminate these to the wider sector

Working with the pilot data
Once HEFCE has citation data for the pilot HEIs:
o HEFCE will undertake a number of different analyses in order to understand the pros and cons of different approaches and which of these might produce the best quality indicators for its purposes
o Working with "real" data, HEFCE will investigate the options for combining these with other metrics and information to produce robust and fully rounded quality profiles
o HEFCE will then consult on the way forward

Further information
REF-NEWS mailing list
Queries to