Making and demonstrating research and evaluation impact (in an era of austerity) Sandra Nutley.

My knowledge base
- Using Evidence: How research can inform public services (Nutley, Walter and Davies, Policy Press, 2007)
- Research Unit for Research Utilisation: developing cross-sector knowledge on research use in education, healthcare, social care and criminal justice

An era of austerity: a UK-centric view?
- Yes, but not limited to the UK
- Brings the issue of making and demonstrating impact into sharp relief, especially after the boom years
- Australia may not be immune: 'Some believe that the current boom could end as soon as 2014' (The Economist, 28/5/11)

Impact on research & evaluation: threat or opportunity?
- UK: 'Arguably the role of social research becomes more important to guide practice in an era of austerity than one of affluence' (SRA 2010)
- USA: 'There seems to be broad [bipartisan] agreement: We need an evidence-based system to guide future budget decisions that assesses the relative performance and impact of all government programs' (Center for American Progress, July 2011)
- Underpinning rationale: evidence-based policies and practices are 'more likely to be better informed, more effective and less expensive' (Campbell et al 2007)

Threat more of a reality in the UK
- Job cuts for researchers in government
- Research and evaluation budgets slashed
- Researchers & evaluators having to do more with less: 'One person's riot is another's research grant'
- But: research impact demands have raised the status of applied/policy-related research in universities, and politicians still reach for research as a tactic

My questions
- Why have social research and evaluation been viewed as dispensable when the going gets tough?
- What challenges need to be tackled in order to increase and demonstrate the impact of research and evaluation?
Some answers follow, in the form of eight emerging lessons.

Some reflections
- Policy makers and practitioners tend not to recognise the influence of research and evaluation
- Unrealistic ambitions and expectations
- Some persistent problems in supply and demand, and insufficient focus on what happens in between

Recognising research use & impact is hard because 'policy making' is complex
Sometimes (the rational high ground):
- a clearly defined event
- explicit decisions involving known actors
- conscious deliberation
- defined policies
- policy fixed at implementation
Oftentimes (the swampy lowlands):
- an ongoing process
- piecemeal: no single decision
- many actors, muddling through
- policies emerge and accrete
- shaped through implementation

Research is used in many ways: a continuum from more conceptual to more instrumental uses
- Conceptual uses: awareness; knowledge & understanding; changing attitudes, perceptions and ideas; problem reframing – the 'enlightenment' role of research (Weiss)
- Instrumental uses: persuasion; decision; implementation; practice & policy changes; evaluation & confirmation

Enlightenment use: promoting new ways of thinking…
- The importance of informal carers…
- Decarceration policies…
- Patient safety…
- Harm reduction in substance misuse…
- Service user engagement…
- Enhancing self-care…
- The happiness and well-being agenda…

Lesson 1: Pay more attention to tracing research impact
We have not been good at revealing and relating persuasive research impact stories – a challenging task.
We need to refine our methods and tools:
- Construct a convincing impact narrative: dealing with complexity, attribution and additionality
- Consider conceptual and instrumental impacts (and symbolic use)
- Account for the difference between actual and potential impacts: receptivity of context

Need to be aware of possible unintended consequences
- Research and evaluation funds may be increasingly targeted on short-term and low-risk projects
- A tendency to over-emphasise positive and intended impacts, and to underplay unintended and dysfunctional consequences
- How do we safeguard serendipity, critique and paradigm-challenging research and evaluation?

Lesson 2: Set realistic ambitions and expectations about research use
- Evidence-informed, not evidence-determined, policy: value judgements are important
- Research and evaluation studies can rarely provide the definitive word
- A cautious, 'experimental' approach to policy making

Addressing supply, demand, and that in between
- Supply: stocks or reservoirs of research and evaluation-based knowledge
- Demand: evidence demand in political and professional worlds, and wider society

Supply deficits
- Lack of timely and accessible research that addresses policy/practice-relevant questions
- Better at understanding and illuminating problems than at identifying possible solutions
- Too much unwitting replication of studies
- Paucity of good-quality studies of intervention effectiveness (prevention and 'treatment' interventions)
- Insufficient attention paid to cost-effectiveness
- Insufficient mining of secondary data sources
- Equivocal attitude to 'engaged' research in the university research community

Lesson 3: Improve the supply of relevant, accessible & credible evidence… but don't stop there
- Better R&D strategies
- Address methodological competency and capacity, internally and externally (and incentives)
- Revisit research & evaluation commissioning processes
- Support synthesis of existing studies
- Better dissemination and archiving

Demand deficits
- Research evidence low in politicians' hierarchy?

Policy makers' hierarchy of evidence (Source: Phil Davies, 2007)
1. 'Experts' evidence (incl. consultants and think tanks)
2. Opinion-based evidence (incl. pressure groups)
3. Ideological 'evidence' (incl. party think tanks)
4. Media evidence
5. Internet evidence
6. Lay evidence (constituents' and citizens' experiences)
7. 'Street' evidence (urban myths, accepted wisdom)
8. Cabbie's evidence
9. Research evidence

Demand deficits
- Research evidence low in politicians' hierarchy?
- Certainly ministerial differences in emphasis
- Politicised decision making more likely at times of crisis (Peters 2011)
- Practitioners have varying incentives to pay attention to research

Lesson 4: Shape – as well as respond to – the demand for evidence in policy and practice settings
- Formal government commitment to an evidence-informed approach
- Improve the analytical skills of policy makers and practitioners
- Address incentives
- Work with advocacy organisations to shape the context for specific findings

Connecting supply and demand

What image best represents how you think about the main challenges? [Slide shows eight images, labelled A–H]

Challenge of linking two worlds: research and policy
Divergent:
- interests, priorities, incentives, language, dynamics
- conceptions of knowledge and time-scales
- status and power
Leading to:
- communication difficulties
- mismatch between supply and demand
- rejection and implementation failure

But there are many players in the research use process: politicians, civil servants, political advisors, professional bodies, government analysts, university and college researchers, research institutes and independent evaluators, think tanks and knowledge brokers, the wider community, local government officers, service providers, service users, research funders, audit, inspection and scrutiny regimes, the media, and lobbyists and advocacy groups.
Multiple interests, many connections & pathways to impact

Lesson 5: Develop multifaceted strategies to address the interplay between supply & demand
Moving away from ideas of 'packaging' knowledge and enabling knowledge transfer – recognising instead:
- Players and processes more important than products
- Importance of context
- Interaction with other types of knowledge (tacit; experiential)
- Multi-voiced dialogue
- 'Use' as an interactive, non-linear, social & political process

Three generations of knowledge-to-action thinking (Source: Best et al 2008)
- Knowledge transfer: knowledge is a product – degree of use is a function of effective packaging
- Knowledge exchange: knowledge is the result of social & political processes – degree of use is a function of effective relationships and interaction
- Knowledge integration: knowledge is embedded in systems and cultures – degree of use is a function of effective integration with organisations and systems

Generic features of effective practices to increase research impact
- Translation: research must be translated – adaptation of findings to specific policy and practice contexts
- Enthusiasm: of key individuals – personal contact is most effective
- Contextual analysis: understanding and targeting specific barriers to, and enablers of, change
- Credibility: strong evidence from a trusted source, incl. endorsement from opinion leaders
- Leadership: within research impact settings
- Support: ongoing financial, technical & emotional support
- Integration: of new activities with existing systems and activities

Lesson 6: Recognise the role of dedicated knowledge broker organisations/networks
Three brokerage frameworks (based on Oldham and McLean 1997):
- Knowledge management: facilitating the creation, diffusion and use of knowledge
- Linkage and exchange: brokering the relationship between 'creators' and 'users'
- Capacity building: improving capacity to interpret and use evidence, and to produce more accessible analytical reports

Lesson 7: Target multiple voices to increase opportunities for evidence to become part of policy discourse
- Feeding evidence into wider political & public debate
- Deliberative inquiry, citizen juries, etc.
- A more challenging approach for governments – 'letting go'
- A more challenging role for researchers and research advocates – contestation and debate

Lesson 8: Evaluate knowledge exchange (KE) strategies to improve research use, and learn from this
- Rarely done in any systematic way
- KE is an immature discipline: under-theorised, with limited empirical evidence
- Underdeveloped evaluation frameworks and tools

Conclusions (or delusions/illusions?)
- No room for complacency
- Making an impact on public policy and practice is challenging at all times
- Crises tend to unsettle existing patterns of policy making and create opportunities for innovation and learning
- Researchers & evaluators need to provide compelling ideas and persuasive evidence in innovative and efficient ways

Thank You

Any questions?