
1 Designing Impact Evaluations: What are the Appropriate Research Questions and Methods?
Mywish K. Maredia, Michigan State University
Workshop for Managers of Impact Evaluation, May 13, 2013
InterAction, th St. NW, Suite 210, Washington, DC

2 IMPACT

3 Your Role in Explaining the ‘Miracle’
As Impact Evaluation Managers, your role is to ensure that a plan is in place to bridge the knowledge gap between project outputs and the impacts they produce, and to do so based on evidence generated using a credible methodology.

4 Focus of this presentation
Discuss the development of appropriate impact evaluation questions: what types of impact evaluation designs are most appropriate in different contexts, given the evaluation questions?
Purpose: to share some preliminary thoughts and present a framework, and to facilitate discussion and exchange of ideas and experience.

5 Clarifying the term: Impact evaluation
What it is: impact evaluation is concerned with establishing a causal link between realized impacts (the effect) and an intervention (the ‘cause’), which could be a program, activity, policy change, etc. The goal of the analysis is to ‘rule out’ other possible explanations for the observed effects.
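For reference, this is the standard potential-outcomes formulation of that idea from the causal-inference literature (an addition for clarity, not taken from the slides):

```latex
% Average treatment effect (ATE) in potential-outcomes notation:
% Y_i(1) = outcome for unit i if treated; Y_i(0) = outcome if untreated.
\[
\mathrm{ATE} = E\big[\, Y_i(1) - Y_i(0) \,\big]
\]
% A naive treated-vs-untreated comparison decomposes into a causal term
% plus selection bias, the 'other explanation' a design must rule out:
\[
E[Y_i \mid T_i = 1] - E[Y_i \mid T_i = 0]
 = \underbrace{E[Y_i(1) - Y_i(0) \mid T_i = 1]}_{\text{effect on the treated}}
 + \underbrace{E[Y_i(0) \mid T_i = 1] - E[Y_i(0) \mid T_i = 0]}_{\text{selection bias}}
\]
```

Randomization forces the selection-bias term to zero; the quasi-experimental methods listed later try to estimate it or difference it away.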

6 What do we mean by evaluation design?
Every evaluation is essentially a research or discovery activity. If your results are to be reliable, you have to give the evaluation a structure that will tell you what you want to know. That structure, the arrangement of discovery, is the evaluation’s design. The design depends on what kinds of questions your evaluation is meant to answer.

7 Development of Impact Evaluation Questions
Characteristics of ‘appropriate’ IE questions:
- They should be narrow and specific
- Focus on a small number of questions (~5)
- Focused on ‘summative’ evaluation of a project/intervention
- They should reflect the input of program staff and sponsors

8 Examples of common impact evaluation (research) questions
Overall impact (effectiveness):
- Did it work? Did the intervention produce the intended impacts in the short, medium, and long term?
- For whom, in what ways, and in what circumstances did the intervention work?
- What unintended impacts (positive and negative) did the intervention produce?
Source: Rogers (2012). Introduction to Impact Evaluation. Impact Evaluation Notes No. 1.

9 Examples of common impact evaluation (research) questions (cont’d)
Nature of impacts and their distribution:
- Are impacts likely to be sustainable?
- Did these impacts reach all intended beneficiaries?
Influence of other factors on the impacts:
- How did the intervention work in conjunction with other interventions, programs, or services to achieve outcomes?
- What helped or hindered the intervention in achieving these impacts?

10 Examples of common impact evaluation (research) questions (cont’d)
How it works:
- How did the intervention contribute to intended impacts?
- What were the particular features of the intervention that made a difference?
- To what extent are differences in impact explained by variations in implementation?
Matching intended impacts to needs:
- To what extent did the impacts match the needs of the intended beneficiaries?

11 Common impact evaluation designs (focused on causal analysis)
Methods for examining the factual, e.g.:
- Comparative case studies
- Beneficiary/expert attribution
Methods for creating a counterfactual:
- Experimental designs or RCTs (based on the principle of random assignment)
- Pipeline comparisons
Other methods/approaches (using statistical techniques to form credible comparison groups), e.g.:
- Propensity score matching (PSM)
- Instrumental variables (IV)
- Regression discontinuity (RD)
- Difference in differences (DD); a minimal code sketch of DD follows below
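To make the last approach concrete, here is a minimal difference-in-differences sketch in Python. It is not from the presentation: the data are simulated, the variable names (outcome, treated, post) are hypothetical, and the true effect is set to 2.0 so you can check that the regression recovers it.

```python
# Minimal difference-in-differences (DD) sketch with simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
treated = rng.integers(0, 2, n)    # 1 = program group, 0 = comparison group
post = rng.integers(0, 2, n)       # 1 = observed after the intervention
outcome = (10 + 1.5 * treated      # baseline gap between groups
           + 0.8 * post            # common time trend
           + 2.0 * treated * post  # the program impact DD should recover
           + rng.normal(0, 1, n))

df = pd.DataFrame({"outcome": outcome, "treated": treated, "post": post})
# The coefficient on the interaction term is the DD impact estimate.
fit = smf.ols("outcome ~ treated * post", data=df).fit()
print(fit.params["treated:post"])  # close to the true effect of 2.0
```

The same two-way interaction extends naturally to panel data with unit and time fixed effects.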

12 IE Methods
In theory, there is a multiplicity of methods and approaches that can be used to assess impacts. Each has problems and limitations. There is no ‘one size fits all’ method or approach.

13 Choosing an appropriate impact evaluation design
Depends on…
1. The nature of the research questions

Research question → Appropriate methods
- What is the effectiveness of the program? → Observational and correlational methods
- Can observed effects reasonably be attributed to the intervention and not to other sources? → Experimental and quasi-experimental methods
- What is the net impact of the program? → Cost-effectiveness or cost-benefit analysis, with qualitative methods to summarize the full range of impacts

14 Choosing an appropriate impact evaluation design (cont’d)
Depends on…
2. The nature of your program

Nature of your program → Methods to consider
- Will you roll out your program over time? → Pipeline design
- What is the unit of intervention (individuals, groups, communities)? → Designs that give enough statistical power based on the number of ‘units of observation’
- Is the program assigned to participants, or do they self-select? → RCT or quasi-experimental designs (see the RCT sketch below)
- There is no credible reason to expect other influencing factors (e.g., a water pump) → Before/after comparison
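As an illustration of the RCT row above: with random assignment, a simple difference in means estimates the program's impact. This sketch is not from the presentation; the outcome scale (mean 50) and the effect (+4) are illustrative assumptions.

```python
# Simple analysis of a randomized assignment: difference in means + t-test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
control = rng.normal(50, 10, 200)    # randomly assigned control group
treatment = rng.normal(54, 10, 200)  # randomly assigned treatment group

estimate = treatment.mean() - control.mean()
t_stat, p_value = stats.ttest_ind(treatment, control)
print(f"impact estimate = {estimate:.2f}, p = {p_value:.4f}")
```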

15 Choosing an appropriate impact evaluation design (cont’d)
Depends on…
3. What participants / stakeholders will consent to
4. Your resources and time constraints

16 Practical Considerations in Designing Impact Evaluation
- Establishing the program theory (the logic behind a program and the causal chain from inputs to outcomes to impact)
- Understanding the program setting
- How participants are selected, to mitigate selection bias (see the matching sketch below)
- A decision tree for which method is applicable and should be explored
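One way to mitigate selection bias when participants self-select is propensity score matching, named on slide 11. The sketch below is an illustration only: everything is simulated, and the covariates (age, income) are hypothetical.

```python
# Propensity score matching sketch: match each treated unit to the
# control unit with the closest estimated probability of participating.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(2)
n = 500
age = rng.normal(40, 10, n)
income = rng.normal(2000, 500, n)
# Standardized covariates for the participation model.
X = np.column_stack([(age - age.mean()) / age.std(),
                     (income - income.mean()) / income.std()])

# Self-selection: better-off households are more likely to participate.
p_join = 1 / (1 + np.exp(-(0.03 * (age - 40) + 0.002 * (income - 2000))))
treated = rng.random(n) < p_join
outcome = 5 + 0.002 * income + 3.0 * treated + rng.normal(0, 1, n)

# Step 1: model the probability of participation given observables.
pscore = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Step 2: match each treated unit to the control with the closest score.
nn = NearestNeighbors(n_neighbors=1).fit(pscore[~treated].reshape(-1, 1))
_, idx = nn.kneighbors(pscore[treated].reshape(-1, 1))
matched = outcome[~treated][idx.ravel()]

# Step 3: mean treated-minus-matched-control gap, close to the true
# effect of 3.0 (the naive unmatched difference would be larger).
print(np.mean(outcome[treated] - matched))
```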

17 Practical Considerations in Designing Impact Evaluation (Cont’d)
Sample size
- Power calculation: does the setting allow for enough units of intervention and units of observation for a robust design? (example below)
- Tradeoff between power and cost
Time frame
- Is there enough time to observe the impact?
Flexibility
- Strive for rigor, but be flexible…
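A sketch of the power calculation mentioned above, using statsmodels. The minimum detectable effect (0.3 SD), power (80%), and significance level (5%) are illustrative assumptions, not values from the presentation.

```python
# Required sample size per arm for a two-group comparison of means.
import math
from statsmodels.stats.power import TTestIndPower

n_per_arm = TTestIndPower().solve_power(effect_size=0.3, power=0.8, alpha=0.05)
print(math.ceil(n_per_arm))  # about 176 units of observation per arm
```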

18 Thank you

