Presentation transcript:

Designing Impact Evaluations: What Are the Appropriate Research Questions and Methods?
Mywish K. Maredia, Michigan State University
Workshop for Managers of Impact Evaluation, May 13, 2013
InterAction, 1400 16th St. NW, Suite 210, Washington, DC

IMPACT

Your Role in Explaining the 'Miracle'
As impact evaluation managers, your role is to ensure that a plan is in place to bridge the knowledge gap between project outputs and the impacts they produce, and to do so based on evidence generated using a credible methodology.

Focus of This Presentation
- Discuss the development of appropriate impact evaluation questions
- Discuss which types of impact evaluation designs are most appropriate in different contexts, given the evaluation questions
Purpose: to share some preliminary thoughts, present a framework, and facilitate discussion and exchange of ideas and experience.

Clarifying the Term: Impact Evaluation
What it is: impact evaluation is concerned with establishing a causal link between realized impacts (the effect) and an intervention (the 'cause'), which could be a program, activity, policy change, etc.
The goal of the analysis is to rule out other possible explanations for the observed effects.
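In the standard potential-outcomes notation (a textbook addition for illustration, not part of the original slides), 'ruling out other explanations' can be made precise:

```latex
% Impact for unit i: outcome with the intervention minus outcome without it.
% Only one of the two is ever observed.
\tau_i = Y_i(1) - Y_i(0)

% A naive comparison of participants (T=1) and non-participants (T=0)
% mixes the true effect with selection bias:
\mathbb{E}[Y \mid T=1] - \mathbb{E}[Y \mid T=0]
  = \underbrace{\mathbb{E}[Y(1) - Y(0) \mid T=1]}_{\text{effect on the treated}}
  + \underbrace{\mathbb{E}[Y(0) \mid T=1] - \mathbb{E}[Y(0) \mid T=0]}_{\text{selection bias}}
```

A credible design is one that drives the selection-bias term to zero (randomization does so in expectation), so that the observed difference can be attributed to the intervention.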

What Do We Mean by Evaluation Design?
- Every evaluation is essentially a research or discovery activity.
- If your results are to be reliable, you have to give the evaluation a structure that will tell you what you want to know.
- That structure, the arrangement of discovery, is the evaluation's design.
- The design depends on what kinds of questions your evaluation is meant to answer.

Development of Impact Evaluation Questions
Characteristics of 'appropriate' IE questions:
- They should be narrow and specific.
- Focus on a small number of questions (~5).
- Focus on the 'summative' evaluation of a project/intervention.
- They should reflect the input of program staff and sponsors.

Examples of Common Impact Evaluation (Research) Questions
Overall impact (effectiveness):
- Did it work? Did the intervention produce the intended impacts in the short, medium, and long term?
- For whom, in what ways, and in what circumstances did the intervention work?
- What unintended impacts (positive and negative) did the intervention produce?
Source: Rogers (2012), Introduction to Impact Evaluation, Impact Evaluation Notes No. 1.

Examples of Common Impact Evaluation (Research) Questions (cont'd)
Nature of impacts and their distribution:
- Are impacts likely to be sustainable?
- Did these impacts reach all intended beneficiaries?
Influence of other factors on the impacts:
- How did the intervention work in conjunction with other interventions, programs, or services to achieve outcomes?
- What helped or hindered the intervention in achieving these impacts?

Examples of Common Impact Evaluation (Research) Questions (cont'd)
How it works:
- How did the intervention contribute to intended impacts?
- What were the particular features of the intervention that made a difference?
- To what extent are differences in impact explained by variations in implementation?
Matching intended impacts to needs:
- To what extent did the impacts match the needs of the intended beneficiaries?

Common Impact Evaluation Designs (Focused on Causal Analysis)
Methods for examining the factual, e.g.:
- Comparative case studies
- Beneficiary/expert attribution
Methods for creating a counterfactual:
- Experimental designs, or RCTs (based on the principle of random assignment)
- Pipeline comparisons
Other methods/approaches (using statistical techniques to form credible comparison groups), e.g.:
- Propensity score matching (PSM)
- Instrumental variables (IV)
- Regression discontinuity (RD)
- Difference-in-differences (DD); see the sketch below
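As a concrete illustration of the last family of methods, here is a minimal difference-in-differences sketch in Python. It is not from the slides; the variable names, data, and effect sizes are all hypothetical, simulated so the true effect is known:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1000  # hypothetical observations: units at baseline and endline

df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),   # 1 = received the intervention
    "post": np.tile([0, 1], n // 2),    # 0 = baseline, 1 = endline
})
# Simulated outcome with a known true program effect of 2.0
df["outcome"] = (
    10.0
    + 0.5 * df["treated"]               # pre-existing group difference
    + 1.5 * df["post"]                  # common time trend
    + 2.0 * df["treated"] * df["post"]  # the causal effect to recover
    + rng.normal(0, 1, n)
)

# The DD estimate is the coefficient on the treated:post interaction;
# it should come out close to the true effect of 2.0.
model = smf.ols("outcome ~ treated * post", data=df).fit()
print(model.params["treated:post"])
```

The interaction term nets out both the pre-existing group difference and the common time trend, which is exactly the 'ruling out of other explanations' described earlier.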

IE Methods
- In theory, there is a multiplicity of methods and approaches that can be used to assess impacts.
- Each has problems and limitations.
- There is no 'one size fits all' method or approach.

Choosing an Appropriate Impact Evaluation Design
Depends on…
1. The nature of the research questions:
- What is the effectiveness of the program? -> Observational and correlational methods
- Can observed effects reasonably be attributed to the intervention and not to other sources? -> Experimental and quasi-experimental methods
- What is the net impact of the program? -> Cost-effectiveness and cost-benefit analysis, with qualitative methods to summarize the full range of impacts
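For the last row, a toy cost-effectiveness calculation may help fix ideas. All figures here are hypothetical, added purely for illustration:

```python
# Hypothetical figures for a cost-effectiveness summary (illustration only)
program_cost = 250_000     # total program cost in USD
beneficiaries = 5_000      # children reached
avg_outcome_gain = 0.15    # average test-score gain, in standard deviations

cost_per_beneficiary = program_cost / beneficiaries             # 50.0 USD
cost_per_sd_of_gain = cost_per_beneficiary / avg_outcome_gain   # ~333 USD per SD
print(cost_per_beneficiary, cost_per_sd_of_gain)
```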

Choosing an Appropriate Impact Evaluation Design (cont'd)
Depends on…
2. The nature of your program:
- Will you roll out your program over time? -> Pipeline design
- What is the unit of intervention: individuals, groups, communities? -> Methods that give enough statistical power given the number of 'units of observation'
- Is the program assigned to participants, or do they self-select? -> RCT or quasi-experimental methods (see the sketch below)
- Is there no credible alternative explanation for the change (e.g., a water pump)? -> Before/after comparison
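Where random assignment is feasible, the mechanics are simple. A minimal sketch, assuming assignment at the community level (the unit names and counts are hypothetical, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical units of intervention: 40 communities in the program area
communities = [f"community_{i:02d}" for i in range(40)]

# Randomly assign half to treatment and half to control
shuffled = rng.permutation(communities)
treatment = sorted(shuffled[:20])
control = sorted(shuffled[20:])
print("treatment:", treatment[:3], "...")
print("control:  ", control[:3], "...")
```

Because assignment does not depend on any characteristic of the communities, the two groups are comparable in expectation, which is what licenses the causal claim.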

Choosing an Appropriate Impact Evaluation Design (cont'd)
Depends on…
3. What participants/stakeholders will consent to
4. Your resources and time constraints

Practical Considerations in Designing Impact Evaluation
- Establishing the program theory (the logic behind a program and the causal chain from inputs to outcomes to impacts)
- Understanding the program setting
- Understanding how participants are selected (to mitigate selection bias)
- Using a decision tree to determine which method is applicable and should be explored

Practical Considerations in Designing Impact Evaluation (cont'd)
Sample size:
- Power calculation: does the setting allow for enough units of intervention and units of observation for a robust design? (See the sketch after this list.)
- Tradeoff between power and cost
Time frame:
- Is there enough time to observe the impact?
Flexibility:
- Strive for rigor, but be flexible…
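A minimal power calculation sketch using statsmodels; the effect size, power, and significance level below are illustrative assumptions, not values from the presentation:

```python
from statsmodels.stats.power import TTestIndPower

# Assumed design parameters (hypothetical):
#   detect a 0.3 standard-deviation effect, 80% power, 5% significance
analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.3, power=0.8, alpha=0.05)
print(f"required sample size per group: {n_per_group:.0f}")  # ~176
```

If the program can only be assigned at the level of groups or communities, the same logic applies at the cluster level, and the number of clusters, rather than the number of individuals, is usually the binding constraint.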

Thank you