Making Evidence Accessible and Relevant for Policy and Practice
Philip Davies, International Initiative for Impact Evaluation (3ie)
www.3ieimpact.org

Presentation transcript:

Making Evidence Accessible and Relevant for Policy and Practice
Philip Davies, International Initiative for Impact Evaluation (3ie)
Africa Evidence Network Regional Meeting, Johannesburg, South Africa, 3rd June 2015

What Is Evidence-Based Policy?
Helping people make better decisions and achieve better outcomes by using the best available evidence from research and other sources:
- Knowing which interventions are effective ("what works?")
- In achieving which outcomes?
- For which groups of people?
- Under what conditions?
- Over what time span?
- At what cost?
Plus: integrating research with decision makers' knowledge, skills, experience, expertise and judgement.

Factors Other Than Evidence
Evidence enters decision making alongside:
- Experience and expertise
- Judgement
- Resources
- Values, beliefs and ideology
- Habits and bureaucratic culture
- Lobbyists and pressure groups
- Pragmatics and contingencies

The 'Classic' Policy Cycle
- Understanding the problem (conceptualisation)
- Developing solutions (policy development)
- Putting solutions into effect (implementation)
- Monitoring and evaluation (M&E)
The UK Treasury's 'ROAMEF' policy cycle (Rationale, Objectives, Appraisal, Monitoring, Evaluation, Feedback) covers the same ground. Evidence is required across the entire policy cycle.

The Kind of Evidence Decision-Makers Look For
- Identifying the nature, size and dynamics of the problem
- Specifying the desired objectives
- Identifying viable policy options
- Identifying how the policy is supposed to work
- Identifying the likely and achieved outcomes/impacts
- Identifying the social distribution of outcomes/impacts
- Understanding people's attitudes, experiences and behaviour
- Valuing the impacts (cost-benefit/cost-effectiveness; a worked sketch follows below)
- Identifying effective implementation and delivery
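
Of the items above, valuing the impacts is the one that reduces to arithmetic, so a small example may help. The following is a minimal Python sketch of a benefit-cost comparison under a constant discount rate; the rate and the cash flows are hypothetical assumptions, not figures from the presentation.

```python
# Hypothetical valuation step: discount illustrative benefit and cost
# streams to present value, then compare. All numbers are invented.
DISCOUNT_RATE = 0.05
benefits = [0, 40, 60, 80]    # per year, in $000s, year 0 first
costs = [100, 10, 10, 10]

def present_value(stream, rate):
    """Discount a yearly cash-flow stream back to year 0."""
    return sum(x / (1 + rate) ** t for t, x in enumerate(stream))

pv_benefits = present_value(benefits, DISCOUNT_RATE)
pv_costs = present_value(costs, DISCOUNT_RATE)
print(f"benefit-cost ratio = {pv_benefits / pv_costs:.2f}")
print(f"net present value  = {pv_benefits - pv_costs:.1f} ($000s)")
```

A ratio above 1 (equivalently, a positive net present value) indicates that discounted benefits exceed discounted costs.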

Types of Evidence for Policy Making
[Slide content not captured in the transcript.]

Theory of Change / Logic Model / Programme Theory
- How is a policy or programme supposed to work?
- What activities, mechanisms, people and outputs have to be in place?
- And in what sequence: what is the causal chain? (A minimal sketch follows below.)
- What resources are required, and are they available?
- What data are required, and are they available?
- Is the policy or programme feasible and achievable?
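
To make the causal-chain idea concrete, here is a minimal Python sketch. The steps, resources and data names are hypothetical illustrations; the point is only that a theory of change is an ordered sequence of steps whose resource and data requirements can be listed and checked.

```python
# A theory-of-change causal chain as an ordered list of steps, each with
# the resources and data it assumes. All names here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Step:
    name: str                               # activity, mechanism or output
    resources: list = field(default_factory=list)
    data_needed: list = field(default_factory=list)

chain = [
    Step("Train health workers", ["trainers", "budget"], ["baseline survey"]),
    Step("Workers apply the new protocol", [], ["performance data"]),
    Step("Reduced infant mortality", [], ["counterfactual data"]),
]

# Walk the chain in order and flag unmet data requirements -- a crude
# feasibility check of the kind the questions above imply.
available_data = {"baseline survey", "performance data"}
for step in chain:
    missing = [d for d in step.data_needed if d not in available_data]
    print(f"{step.name}: {'OK' if not missing else 'missing ' + str(missing)}")
```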

Constituent Features of a Theory of Change
[Diagram slide; only the label "Assumptions?" survives in the transcript.]

Constituent Features of a Theory of Change: Data Required
Each stage of the causal chain has its own data requirements:
- Surveys, statistics and demographic data; qualitative data; costs/benefits data; systematic review data; documentary analysis
- Performance data; historical data; diversity data; qualitative data; effectiveness data
- Stakeholder data; qualitative data; public opinion data; effectiveness data
- Performance data; effectiveness data; stakeholder data; qualitative data; costs/benefits data
- Administrative data; performance data; costs/benefits data
- Administrative data; performance data; qualitative data
- Counterfactual data; administrative data; survey data and statistics; cost/benefit data

Types of Systematic Review
- Statistical meta-analyses
- Narrative systematic reviews
- Qualitative systematic reviews
- Rapid evidence assessments
- Evidence maps and gap maps

Why Do We Need Systematic Reviews?
Single studies can:
- Misrepresent the balance of research evidence
- Illuminate only one part of a policy issue
- Be sample-specific, time-specific and context-specific
- Often be of poor quality
Consequently, a single study can give a biased view of the overall evidence.

What Makes a Review Systematic?
- Systematic searching for studies
- Systematic critical appraisal of identified studies: separating the wheat from the chaff
- Systematic and transparent inclusion/exclusion of studies for the final review (sketched below)
- Systematic and transparent extraction of data
- Systematic statistical testing and analysis
- Systematic reporting of findings
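
Transparency in the inclusion/exclusion step means every screening decision can be traced to an explicit criterion. The sketch below is a hypothetical illustration in Python; the criteria and study records are invented, and no real review protocol is implied.

```python
# Record each screening decision against explicit inclusion criteria so
# the review can report exactly why every study was kept or dropped.
candidates = [
    {"id": "S1", "design": "RCT", "reports_outcome": True},
    {"id": "S2", "design": "case study", "reports_outcome": True},
    {"id": "S3", "design": "quasi-experimental", "reports_outcome": False},
]

CRITERIA = {
    "eligible design": lambda s: s["design"] in {"RCT", "quasi-experimental"},
    "reports target outcome": lambda s: s["reports_outcome"],
}

for study in candidates:
    failed = [name for name, test in CRITERIA.items() if not test(study)]
    verdict = "include" if not failed else "exclude: " + "; ".join(failed)
    print(study["id"], "->", verdict)   # audit trail for the final review
```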

Statistical Meta-Analytical Reviews
[Forest plot of study-level and pooled effect sizes.]
Source: David B. Wilson, 2006, A systematic review of drug court effects on recidivism.
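
Behind a forest plot like Wilson's sits a pooling calculation. The following is a minimal Python sketch of fixed-effect, inverse-variance pooling; the (effect size, standard error) pairs are purely illustrative and are not data from the drug-court review.

```python
import math

# Fixed-effect meta-analysis: weight each study by 1/SE^2, pool, and
# report a 95% confidence interval. Effect sizes here are invented.
studies = [(-0.35, 0.12), (-0.10, 0.20), (-0.42, 0.15), (0.05, 0.25)]

weights = [1.0 / se ** 2 for _, se in studies]
pooled = sum(w * es for (es, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))

low, high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled effect = {pooled:.3f} (95% CI {low:.3f} to {high:.3f})")
```

A random-effects model would add a between-study variance term to each weight, but the pooling logic is the same.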

Qualitative Systematic Reviews
- Synthesise qualitative and ethnographic evidence: in-depth interviews, focus groups, observational studies, documentary analysis, case studies
- Seek common themes, concepts and principles across different studies
- Pay detailed attention to context and contextual specificity, and to stakeholders' views
- Do not seek generalisations

Rapid Evidence Assessments: What Are They?
- Scaled-down systematic reviews of existing evidence
- Timed to meet the needs of policy makers and practitioners (1-3 months)
- Strategically use the 'three arms' of systematic searching, but less exhaustively
- Include critical appraisal of identified studies
- Summarise findings, with caveats and qualifications

Rapid Evidence Assessments: How Scaled Down?
[Slide content not captured in the transcript.]

3ie Evidence Gap Maps
- Maps of the existing evidence base on a policy issue, topic or sector, such as maternal health, HIV/AIDS, agriculture or extreme poverty
- Structured around a framework of interventions and outcomes, intermediate and final (a minimal grid sketch follows below)
- A way of identifying where there is evidence and where there is not
- An indication of the quality of this evidence
- Links to user-friendly summaries in the 3ie database of systematic reviews
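
At its core, a gap map is an interventions-by-outcomes grid. The sketch below is a hypothetical Python illustration; the interventions, outcomes and study counts are invented, and real 3ie gap maps additionally grade study quality and link through to summaries.

```python
# An evidence gap map as an interventions-by-outcomes grid: each cell
# counts the studies found for that pairing, and zero cells are the gaps.
interventions = ["Cash transfers", "Deworming", "School feeding"]
outcomes = ["Enrolment", "Attendance", "Learning"]

studies_found = {   # (intervention, outcome) -> number of studies (invented)
    ("Cash transfers", "Enrolment"): 7,
    ("Cash transfers", "Attendance"): 4,
    ("Deworming", "Attendance"): 2,
}

print(f"{'':16}" + "".join(f"{o:>12}" for o in outcomes))
for i in interventions:
    row = "".join(f"{studies_found.get((i, o), 0):>12}" for o in outcomes)
    print(f"{i:16}" + row)
```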

Rapid Evidence Assessments: Limitations
[Slide content not captured in the transcript.]

Evidence Gap Maps
[Example gap-map images not captured in the transcript.]

Some Key Sources of Synthesised Evidence
- 3ie Impact Evaluations Database
- 3ie Systematic Reviews Database
- Best Evidence Encyclopedia
- Cochrane Collaboration
- Campbell Collaboration
- Collaboration for Environmental Evidence
- National Institute for Health and Clinical Excellence (NICE)
- NHS Evidence
- National Guidelines Clearinghouse (USA)
- PROSPERO: International Prospective Register of Systematic Reviews
- Social Care Institute for Excellence
- Social Programs That Work
[Web addresses from the original slide not captured in the transcript.]

UK Policymakers' Views of Evidence
- Focus on the 'end product' rather than on how the information was collected or analysed
- Use of 'anecdotal' evidence ("tells a story")
- Drawing on 'real life stories', 'fingers in the wind', and 'local' and 'bottom-up' evidence
- But: "If we try and move anywhere without having the scientific basis to do so, we get fleeced in the House"
- And: DfID's Evidence into Action team; the BCURE programme; DPME (South Africa)

Where Do UK Civil Servants Go For Evidence?
[Diagram; labels captured: 'Sharks', 'Plankton', 'Academic/Evaluation Research?']

UK Policymakers' Views of Research Evidence
- Too long
- Verbose
- Too detailed
- Too dense
- Impenetrable
- Too much jargon
- Too methodological
- Untimely
- Irrelevant for policy
Source: Campbell, S., et al., 2007, Analysis for Policy.

Improving Communication of Evidence
- Establish what the research says and does not say
- Establish the policy messages and policy implications
- Use a 1:3:25 format: a one-page summary of main messages, a three-page executive summary, and a full report of no more than 25 pages
- Keep mention of methodology to a minimum
- Be clear: provide a plain-English summary
- Be persistent and opportunistic

Thank you
Philip Davies
International Initiative for Impact Evaluation (3ie)
[Telephone, email and web address not captured in the transcript.]