Impact Evaluations and Development: Draft NONIE Guidance on Impact Evaluation. Cairo Conference: Perspectives on Impact Evaluation, Tuesday, March 31, 2009.

Impact Evaluations and Development: Draft NONIE Guidance on Impact Evaluation
Cairo Conference: Perspectives on Impact Evaluation, Tuesday, March 31, 2009
Frans Leeuw, Maastricht University & WODC
Jos Vaessen, Maastricht University & University of Antwerp

Outline
1. Introduction
2. Methodological and conceptual issues for impact evaluation
3. Managing impact evaluations

Introduction
Drafting the NONIE Guidance
NONIE uses the OECD-DAC definition of impacts: "The positive and negative, primary and secondary long-term effects produced by a development intervention, directly or indirectly, intended or unintended. These effects can be economic, sociocultural, institutional, environmental, technological or of other types" (OECD-DAC, 2002: 24).
Three basic premises:
– No single method is best for addressing the variety of questions and aspects that might be part of an IE
– However, there is a 'logic of comparative advantages'
– Particular methods or perspectives complement each other in providing a more complete 'picture' of impact

Six methodological and conceptual issues
1. Identify the type and scope of the intervention
2. Agree on the objectives of the intervention that are valued
3. Articulate the theories linking interventions to results
4. Address the attribution problem
5. Build on existing knowledge relevant to the impact of interventions
6. Use a mixed methods approach: the logic of comparative advantages

1. Identify the type and scope of the intervention
Impact of what vs. impact on what
Impact of what:
– Continuum of interventions
– 'Holistic' vs. deconstruction
Impact on what:
– Complexity of processes of change
– Levels of impact: institutional vs. beneficiary level

2. Agree on the objectives of the intervention that are valued
– What to evaluate should be a balance between what stakeholders find important and the empirical reality of (processes of) change
– Intended vs. unintended effects
– Short-term vs. long-term effects
– Sustainability of effects
– Translate objectives into measurable indicators, but at the same time do not lose track of aspects that are difficult to measure

3. Articulate the theories linking interventions to results
– Interventions are theories: opening up the 'black box'
– Theories are partly 'hidden' and require reconstruction
– Theory-based IE offers a continuum of options, ranging from telling the causal story to providing a benchmark for the formal testing of causal assumptions

4. Address the attribution problem
– Attribution problem: to what extent can the results of interest be attributed to the intervention?
– Importance of counterfactual analysis
[Figure: value of the target variable over time, 'before' and 'after' the intervention, with levels a, b and c]

4. Address the attribution problem
Experimental, quasi-experimental and regression-based techniques have a comparative advantage in addressing the issue of attribution:
– Counterfactual analysis
– Systematic treatment of threats to the validity of claims is possible (and should be done!)
– Limitations in applicability
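To make the counterfactual logic concrete, here is a minimal sketch, not from the slides, of a difference-in-differences estimate, one common quasi-experimental technique for attribution. All numbers are invented for illustration.

```python
# Illustrative sketch (not part of the NONIE guidance): a minimal
# difference-in-differences estimate of impact. The change in a
# comparison group stands in for the counterfactual trend the
# treatment group would have followed without the intervention.
# All numbers are hypothetical.

def diff_in_diff(treat_before, treat_after, comp_before, comp_after):
    """Impact estimate: change in the treatment group minus the
    change in the comparison group (the counterfactual trend)."""
    return (treat_after - treat_before) - (comp_after - comp_before)

# Mean outcome (e.g. household income) before/after the intervention
naive_change = 130.0 - 100.0            # before-vs-after only: 30.0
impact = diff_in_diff(100.0, 130.0,     # treatment group
                      100.0, 118.0)     # comparison group

print(naive_change)  # 30.0: overstates impact if incomes rose anyway
print(impact)        # 12.0: nets out the counterfactual trend
```

The naive before-vs-after comparison attributes the whole change to the intervention; the counterfactual-adjusted estimate is smaller because part of the change would have happened regardless.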

5. Build on existing knowledge relevant to the impact of interventions
– Most interventions are not 'new': they rely on mechanisms of change similar to those of earlier interventions
– Examples of types of mechanisms: situational mechanisms, action-formation mechanisms, transformational mechanisms
– Systematic review and synthesis approaches are useful tools for learning about the existing evidence on interventions

6. Use a mixed methods approach: the logic of comparative advantages
Particular methods have comparative advantages in addressing specific aspects of impact.
Conceptual framework by Campbell, Cook and Shadish:
– Internal validity: is there a causal relationship between the intervention and the effects?
– External validity: can we generalize the findings to other settings?
– Construct validity: do the variables that we are measuring adequately represent the phenomena we are interested in?

6. Use a mixed methods approach: the logic of comparative advantages
Example of how the logic works: the impact of incentives on land-use (LU) change and farmer livelihoods
– A randomized experiment can test the effectiveness of different incentives on LU change and/or the socio-economic effects of these changes (internal validity)
– Survey data and case studies can tell us how incentives have different effects on particular types of farm households (strengthens internal validity and increases the external validity of findings)
– Semi-structured interviews and focus group conversations can tell us more about the nature of effects in terms of production, consumption, poverty, etc. (construct validity)

Managing impact evaluations
– Determine whether an IE is feasible and worth the cost
– Start early: getting the data
– Front-end planning is important

THANK YOU