Using Evidence for Policy and Practice
Philip Davies, International Initiative for Impact Evaluation [3ie]
www.3ieimpact.org


Using Evidence for Policy and Practice
Philip Davies, International Initiative for Impact Evaluation [3ie]
BCURE Evidence-Informed Decision-Making Capacity Building Workshop
1st and 2nd June 2015, Pretoria, South Africa

How Evidence is Used in Policy Making

Instrumental use: acting on research results in specific, direct ways.
Conceptual use: using research results for general enlightenment; results influence actions, but in less specific, more indirect ways than in instrumental use.
Symbolic use: using research results to legitimate and sustain pre-determined positions.

How Evidence is Used in Policy Making

Instrumental, conceptual and symbolic uses of evidence are not mutually exclusive; they can operate in different ways at different stages of the policy cycle and under different political contexts.

Instrumental Use of Evidence: The Education Maintenance Allowance

- Tested four variants of a means-tested conditional cash transfer paid to 16–18-year-olds for staying in full-time education.
- Two levels of payment (£30 and £40), paid to either the young person or a primary carer (usually the mother), combined with different levels of a retention bonus (£50 and £80) and an achievement bonus (£50 and £140).
- Covered both male and female young people, in both urban and rural areas.
- Comparison groups were constructed using propensity score matching.
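Propensity score matching pairs each programme participant with the non-participant whose estimated probability of receiving the programme (the propensity score) is closest. The sketch below is a minimal, hypothetical illustration of the nearest-neighbour variant; the scores are invented, and in a real evaluation they would be estimated (for example by logistic regression) from covariates such as prior attainment and family income.

```python
# Minimal illustration of nearest-neighbour propensity score matching.
# Scores are invented for illustration only.

def nearest_neighbour_match(treated, comparison):
    """Match each treated unit to the comparison unit whose
    propensity score is closest (matching with replacement)."""
    matches = {}
    for unit_id, score in treated.items():
        best = min(comparison, key=lambda c: abs(comparison[c] - score))
        matches[unit_id] = best
    return matches

treated = {"T1": 0.62, "T2": 0.35}                  # programme recipients
comparison = {"C1": 0.60, "C2": 0.30, "C3": 0.80}   # non-recipients
matches = nearest_neighbour_match(treated, comparison)
print(matches)  # {'T1': 'C1', 'T2': 'C2'}
```

Outcomes of matched pairs are then compared to estimate the programme's effect, on the assumption that matching on the score balances the observed covariates.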

Case Study: The Employment Retention and Advancement (ERA) Demonstration

The policy issue: how to retain, and advance, low-pay/no-pay workers in the labour market.

Evidence gathering:
- Systematic review of the global evidence on labour market and social welfare participation
- Secondary analysis of existing primary data (surveys and administrative records)
- Stakeholder analysis and involvement
- Empirical testing in six different labour market areas using an experimental design (a randomised controlled trial)

Case Study: The ERA Demonstration (continued)

Tested the likely impact and cost-effectiveness of:
- a post-employment adviser service,
- cash rewards for staying in work and for completing training, and
- in-work training support.

Over 16,000 people from six regions of Britain were randomly assigned to the ERA programme or to a business-as-usual control group. The evaluation took seven years to complete (a strategic development evaluation), but delivered milestone data in real time from Year 1.
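The random assignment step can be sketched as follows. This is a hypothetical illustration of the principle, not the ERA study's actual allocation procedure: because allocation is by chance alone, the programme and control groups are comparable in expectation, so later differences in outcomes can be attributed to the programme.

```python
import random

# Hypothetical sketch of random assignment in an RCT: participants
# are split by chance into a programme group and a business-as-usual
# control group. The seed is fixed only to make the example reproducible.

def random_assign(participant_ids, seed=2015):
    """Randomly split participants into programme and control groups."""
    rng = random.Random(seed)
    shuffled = list(participant_ids)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

programme, control = random_assign(range(16000))
print(len(programme), len(control))  # 8000 8000
```

Every participant ends up in exactly one group, and group membership carries no information about their characteristics.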

[Extreme] Summary of Outcomes: The ERA Demonstration

- ERA had a differential effect on different types of low-pay/no-pay workers: anticipate heterogeneity of impacts.
- For some groups, initial impacts in Year 1 were not sustained; for others, small initial impacts in Year 1 grew into larger impacts in Years 3 and beyond.
- The timing of impact measurement and monitoring is therefore important, and requires regular, built-in M&E.

Barriers to the Use of Evidence

- Policymakers' lack of familiarity with the research process
- Researchers' lack of familiarity with the policy process
- Policymakers' lack of trust in researchers, and vice versa
- Timeliness and availability of evidence
- Physical access to evidence
- Cognitive access to evidence (i.e. lack of understanding)
- Lack of clarity in the presentation of evidence

Overcoming Barriers to the Use of Evidence

- Interaction between researchers and policymakers increases the prospects of research use by policymakers (Lavis et al., 2005).
- Early and ongoing involvement of relevant decision makers increases research utilisation (Lomas, 2000).
- Identify and use interpersonal networks and interactions.
- Identify willing and able knowledge brokers.
- Separate strategic from operational demands for evidence.
- Get policymakers to own the evidence, not just the policy.

Improving Communication of Evidence

- Establish what the research says and does not say.
- Establish the policy messages and policy implications.
- Use a 1:3:25 format.
- Be clear: provide a plain-English summary.
- Be persistent and opportunistic.

The 'One' in the 1:3:25 Format

- One page of main messages
- The lessons decision makers can take from the research
- The implications of the findings, not a summary of the findings
- No details of methodology

The 'Three' in the 1:3:25 Format

- The key findings of the study: the classic executive summary
- Condensed to serve the needs of the busy decision maker
- Focused on how the study will be useful for policy
- Some brief mention of methodology
- Some implications for policy and practice

The '25' in the 1:3:25 Format

This should include:
- Context/background
- Approach (methodology in appendices, not the main text)
- Results
- Implications
- Knowledge gaps
- References
- Additional resources
- Appendices

Group Exercise

- Identify networks of policymakers and evidence providers in your work environment who might help to overcome barriers to using evidence.
- Consider what actions might need to be taken in your work environment to get evidence into policy and practice.
- Consider how the capacity to find, appraise and use evidence might be developed in your work environment.

Thank you

Philip Davies
www.3ieimpact.org