Policy Perspective: The Politics of Evaluating Science Impact
Dr Claire Donovan
Research Evaluation & Policy Project, Research School of Social Sciences


Forthcoming publications on the RQF/RAE
 Special edition of Science and Public Policy: ‘Future Pathways for Science Policy and Research Assessment: Metrics vs. Peer Review, Quality vs. Impact’ (October 2007)
 ‘The Australian Research Quality Framework: A Live Experiment in Capturing the Social, Economic, Environmental and Cultural Returns of Publicly Funded Research’, in C. L. S. Coryn and M. Scriven (Eds.) New Directions for Evaluation: Reforming the Evaluation of Research (Los Angeles: Jossey-Bass, forthcoming 2008)

Policy dilemmas (1): economic rationalism
 Science policy driven by wealth creation, international competition & technological advance
 Simple metrics based on natural science activity
 Simple ‘impact’ metrics do not measure research impact:
Technometrics → economic returns → low-level impact → private over public interest
Sociometrics → macro social statistics → no credible causal link to particular research efforts
 Simple impact metrics ignore wider public value
 Holistic measures rely on peer (and ‘end-user’) judgements, and are necessarily complex

Role of impact evaluation: public value
 Meaningful impact assessments must strive to measure the wider public value of research
 Triple bottom line accounting: beyond neo-liberal NPM
centre-left, centre-right
economic, social, environmental gains
 Redefines purpose of STI policy
 Public value eludes quantitative approaches → simple metrics increasingly detached from STI policy imperatives

Policy dilemmas (2): the ‘Pushmi-pullyu’
The politics of constructing research evaluation exercises is like a ‘Pushmi-pullyu’:
External audit ↔ internal peer-based appraisal
Broader relevance ↔ scientific autonomy
Interests of industry & commerce ↔ broader public benefits

Policy dilemmas (2): the ‘Pushmi-pullyu’
Key points of tension:
‘Quality’ vs. ‘impact’ (the ‘relevance gap’)
Metrics vs. peer review (simplicity vs. complexity)
Isolation from developments in scientometrics
Lack of policy learning from other exercises & nations

UK Research Assessment Exercise
 Two decades of ‘quality’ assessment, peer- and discipline-based
 Post-2008 ‘quality’ evaluation:
metrics only for STEM
light-touch peer review & discipline-specific metrics for HASS, maths and statistics
 ‘Impact’ defined as maximising the economic impact of research
 New system to reward ‘user-focused’ research, i.e. ‘the relative amount of research universities undertake with business’
 HEFCE to distribute £60m (€85m)
 Simple impact metric likely (revenue generated?)

UK Research Assessment Exercise: dilemmas
 Simple income metrics/technometrics do not measure ‘impact’
 Public subsidy for private value
 Economic rationalism: the research society most needs?
 Against the grain of international developments in ‘impact’ measurement
 UK Research Councils to assess ‘economic impact’ of funding applications (includes social and cultural)
 Public consultation ended 14 February 2008
 Late ministerial interest in ‘impact’ assessment

Australian Research Quality Framework
 ‘Quality’ and ‘impact’ assessment was to begin in 2008: research group focus with discipline ‘clusters’ of peer and ‘end-user’ panels
 Qualitative approach for ‘impact’, with minimal role for metrics
 ‘Impact’ defined as social, economic, environmental and cultural benefits from research
 DEST to distribute AU$600m (€385m) per year, but proportion for ‘impact’ unknown
 Evaluation of impact statements and case studies (plus ‘end-user’ testimony if required)
 Complex contextual approach, with groups rated against a controversial ‘impact scale’

Australian Research Quality Framework: dilemmas
 Collapse of ‘impact’ metrics when applied to the impact scale and to the broader public value of research
 Resource-intensive process
 Does high ‘impact’ mean low ‘quality’ research?
 Follows some developments in scientometrics, but complexity seized on by political opposition:
RQF scrapped
New ERA – quality metrics only
But Australian Research Council keen to pursue ‘impact’

Policy dilemmas (3): implementation
 Dominance of science policy network:
desire for simple metrics
‘engineering’-based impact scale
 Compromises in design:
partial adoption of scientometric advice
complexity a political weakness
 Bureaucrats:
high turnover & loss of policy learning
cycle of conversion to qualitative processes
hostility to non-metrics approaches
confused guidelines
 ‘Science’ default setting (economic rationalism, excluding HASS)