The Evaluation of Publicly Funded Research
Berlin, 26/27 September 2005
Evaluation for a changing research base
Paul Hubbard, Head of Research Policy, HEFCE, UK


Evaluation for a changing research base: outline
– Background: some assumptions
– Some models of funding and evaluation
– Change drivers: research methods
– Change drivers: policy related
– Challenges to the evaluator
– A remark about tools and indicators

Background: some assumptions
– Governments fund academic research for its benefits to economic strength and social cohesion
– Demand for funding will outpace budgets, forcing choices about what to support and increasing pressure to show what the funding buys
– Research will be increasingly competitive internationally

Three models of funding and evaluation (1)
– We commission R&D projects to find out things that we want to know; the findings will be a basis for further work or applications.
– Funding is at project and programme level, with research approaches defined at the outset, so evaluation can be built in.
– Assumes the next programme can learn from the last one.

Three models (2)
– We fund excellent research (originality, significance and rigour) wherever we find it, based on sector-wide peer review.
– High autonomy, freedom to fail, room for innovation and dynamism.
– Long evaluation-feedback cycles; hard to prove that specific benefits have been gained.

Three models (3)
– We know what we want from the whole R&D base and will judge it on how well it delivers.
– Tends to mean we identify measurable desired outcomes and impacts.
– May prefer relevance over excellence and privilege user requirements.
– New figures annually, so quick feedback is possible, but it may not capture the whole picture.

Change drivers: the research base
– Increasing diversity of subject focus: new disciplines and interdisciplinary studies
– Organisation: research collaboration across institutional boundaries and structured collaborative units

Change drivers: the research base (2)
New forms of dissemination:
– IT-enabled publication and citation analyses
– Benefits of shared datasets

Change drivers: the policy environment
Increased emphasis on showing:
– What public funding buys, including economic and social impact
– How funded research meets specific policy aims (strategically important knowledge)
– How funded research meets the needs of other major stakeholders (industry, health services)
– The international standing of the national research effort

Challenges to the evaluator
Carrying the community with us:
– From college to consultant
– Lessons from 20 years of the RAE (UK)

Challenges to the evaluator
Looking to the paymasters:
– Targets and indicators: maintaining a balanced view
– Making the broader case for public investment: new approaches to demonstrating impact

Challenges to the evaluator
– What approaches will we require to evaluate innovations in research organisation?
– How can we identify and measure innovative capacity?

Challenges to the evaluator
– When budgets are tight, how shall we make the case for speculative investment in "blue skies" research?

A remark about tools and indicators
We have at our disposal:
– Project and programme evaluation approaches
– Peer review
– Expert-informed assessment
– Bibliometric and citation indices
– Quantitative output measures
– Numerical and qualitative esteem indicators
– The "balanced scorecard"
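[Editor's illustration, not part of the original slides.] Several of the quantitative tools listed above reduce to simple computations over citation counts. As one concrete example of a bibliometric indicator, here is a minimal Python sketch of the h-index (the largest h such that a researcher has h papers each cited at least h times):

```python
def h_index(citations):
    """Return the h-index for a list of per-paper citation counts.

    The h-index is the largest h such that at least h papers
    have at least h citations each.
    """
    # Sort citation counts from highest to lowest.
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        # The paper at position `rank` needs at least `rank` citations
        # for the h-index to reach `rank`.
        if cites >= rank:
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # -> 4
```

That a whole research portfolio can be compressed to a single number like this is precisely why such indicators need to be balanced against peer review and qualitative evidence.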

A remark about tools and indicators (2)
– International comparisons: how much do they really tell us?
– Do we need more tools and indicators, and if so, where shall we find them?

Summary

No firm conclusions but some urgent and important questions