
1 What is Impact Evaluation … and How Do We Use It?
Deon Filmer, Development Research Group, The World Bank
Evidence-Based Decision-Making in Education Workshop
Africa Program for Education Impact Evaluation (APEIE)
Accra, Ghana, May 10–14, 2010

2 Some examples
– Should my government distribute free textbooks to students to promote learning?
– Should we distribute scholarships to poor children to promote attendance?
– Should teachers be rewarded for learning improvements of their students?
– Should management decisions be devolved to the school level?
Impact evaluation is a way to start answering those questions using rigorous evidence

3 What is Impact Evaluation? Example
You would like to distribute textbooks to students as a way of improving learning outcomes.
Your intuition tells you that textbooks should matter. But what is that intuition based on?
– Own experience
– “Common sense”
– Observing children in schools
– Comparisons between children in schools with textbooks and in those without
Impact evaluation, in this situation, would aim at providing
– rigorous evidence,
– based on actual experience,
– of what the actual impact of providing textbooks is.

4 What is Impact Evaluation?
How would impact evaluation achieve this aim?
– By establishing the causal impact of textbooks on learning outcomes
– This is the ultimate goal of impact evaluation.
This workshop will be about
– What makes for a good estimate
– How to estimate that impact
– How to interpret the estimate

5 Why do we use impact evaluation?
Understand if policies work
– Might an intervention work (“proof of concept”)?
– Can an intervention be done on a large scale?
– What are alternative interventions to achieve a particular goal, and how do they compare?

6 Why do we use impact evaluation?
Understand the net benefits of the program, and cost-effectiveness of alternatives
– Requires good cost and benefit data
Understand the distribution of gains and losses
→ Budget constraints force selectivity
→ Bad policies and programs are wasteful and can be hurtful
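A toy calculation can make the cost-effectiveness comparison concrete. The Python sketch below is a minimal illustration with invented numbers (the intervention names, costs, and impacts are hypothetical, not results from the workshop): alternatives are compared by dividing the cost per student by the estimated impact per student.

```python
# Minimal cost-effectiveness sketch. All figures are invented for illustration:
# cost per student (in dollars) and estimated impact per student
# (additional years of schooling) for two hypothetical interventions.
interventions = {
    "scholarships": {"cost_per_student": 50.0, "impact_years": 0.25},
    "textbooks": {"cost_per_student": 10.0, "impact_years": 0.02},
}

for name, v in interventions.items():
    # Cost-effectiveness: dollars spent per additional year of schooling gained.
    cost_per_year = v["cost_per_student"] / v["impact_years"]
    print(f"{name}: ${cost_per_year:,.0f} per additional year of schooling")
```

The ranking depends on having credible impact estimates in the first place, which is exactly what the evaluation is meant to provide.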

7 Why do we use impact evaluation?
Demonstrate to politicians, population, donors that a program is effective
→ This can be key to sustainability
→ Informs beliefs and expectations

8 Putting impact evaluation in context
Monitoring: Regular collection and reporting of information to track whether actual results are being achieved as planned.
Evaluation: Analytical efforts to answer specific questions about the performance of a program or its activities.
Impact evaluation: Analytical efforts to relate cause and effect. The key part is establishing “what would have happened in the absence of the intervention”.

9 Monitoring, Evaluation, and Impact Evaluation
Monitoring: Regular collection and reporting of information to track whether actual results are being achieved as planned
– Periodically collect data on the indicators and compare actual results with targets
– Identify bottlenecks and red flags (time-lags, fund flows)
– Point to what should be further investigated

10 Monitoring, Evaluation, and Impact Evaluation
Evaluation: Analytical efforts to answer specific questions about the performance of a program or its activities
– Analyzes why intended results were or were not achieved
– Explores targeting effectiveness
– Explores unintended results
– Provides lessons learned and recommendations for improvement

11 Monitoring, Evaluation, and Impact Evaluation
Impact evaluation: Analytical efforts to relate cause and effect. The key part is establishing “what would have happened in the absence of the intervention”
– What is the effect of the program on outcomes?
– How much better off are beneficiaries because of the intervention?
– How would outcomes change under alternative program designs?
– Does the program impact people differently (e.g. females, the poor, minorities)?
– Is the program cost-effective?

12 The central problem in Impact Evaluation Analysis: The counterfactual
In order to establish the impact of the program, we need to know what would have happened in the absence of the program
– Not in general, but specifically for the people who actually received the program

13 The central problem in Impact Evaluation Analysis: The counterfactual
What is the effect of a scholarship on school enrollment?
We want to observe the units of treatment in two states:
– Elizabeth on 1 July 2010 with scholarship
– Elizabeth on 1 July 2010 without scholarship
What’s wrong with this picture?

14 The central problem in Impact Evaluation Analysis: The counterfactual
This is impossible!
– We never observe the same individual with and without the program at the same point in time
The counterfactual is never actually observed
– It needs to be estimated
Impact evaluation analysis is all about alternative approaches to estimating the counterfactual
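One standard way to write down what these slides describe is the potential-outcomes notation. The symbols below (Y_i(1), Y_i(0), D_i) are conventions from the impact-evaluation literature, not notation introduced in the slides:

```latex
% Potential outcomes for unit i (e.g., Elizabeth):
%   Y_i(1): outcome with the program (with the scholarship)
%   Y_i(0): outcome without the program (without the scholarship)
%   D_i = 1 if unit i actually received the program
\[
  \text{impact for unit } i:\qquad \tau_i \;=\; Y_i(1) - Y_i(0)
\]
\[
  \text{average impact on participants}
  \;=\;
  \underbrace{E\!\left[\,Y_i(1)\mid D_i=1\,\right]}_{\text{observed}}
  \;-\;
  \underbrace{E\!\left[\,Y_i(0)\mid D_i=1\,\right]}_{\text{counterfactual: never observed, must be estimated}}
\]
```

The second term is exactly the “what would have happened in the absence of the intervention” that the slides emphasize: it can only be estimated, never observed directly.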

15 Why is the counterfactual important?
The next session will discuss the counterfactual in more detail
Here, just one illustration

16 Illustration of the importance of the counterfactual
Question: What is the best estimate of the impact of the program on enrollment?
[Chart: enrollment plotted against time, with observed points A (before, 2008) and B (after, 2010), labeled “Program impact?”]

17 Illustration of the importance of the counterfactual
Question: What is the best estimate of the impact of the program on enrollment?
[Chart: the same enrollment-over-time picture, now with candidate counterfactual points C? and D? added between 2008 and 2010, labeled “Program impact?”]
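To make the charts’ point concrete, here is a minimal Python sketch with invented enrollment numbers; the labels A, B, and C loosely mirror the points on the slides. A naive before-after comparison (B − A) mixes the program’s impact with whatever else changed between 2008 and 2010, whereas comparing to an estimated counterfactual (point C) isolates the program’s contribution.

```python
# Invented numbers, only to illustrate the charts on slides 16-17.
enrollment_2008 = 70.0          # point A: enrollment before the program (2008)
enrollment_2010 = 85.0          # point B: enrollment after the program (2010)

# A candidate counterfactual: what enrollment might have been in 2010 without
# the program (point C on the slide). In practice this must be estimated,
# for example from a comparison group.
counterfactual_2010 = 78.0

before_after = enrollment_2010 - enrollment_2008          # B - A = 15.0
impact_estimate = enrollment_2010 - counterfactual_2010   # B - C = 7.0

print(f"Naive before-after change: {before_after:.1f} percentage points")
print(f"Impact against the counterfactual: {impact_estimate:.1f} percentage points")
```

With these made-up numbers, the before-after change more than doubles the estimated impact, because enrollment would have grown even without the program.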

18 What is impact evaluation?
Impact is … the difference between outcomes with the program and without it
Impact evaluation involves … estimating the counterfactual so that changes in outcomes can be attributed to the program

19 What is involved in implementing an impact evaluation?
– Determine why an IE is called for
– Understand the program and its results chain
– Determine what to measure
– Determine the methodology
– Carry out
  – Data collection (baseline, follow-up)
  – Program implementation
  – Analysis and reporting
– Adjust policy

20 Determine why the evaluation is called for
Specific intervention
– Cash transfer to specific students
– Specific teacher training program with a particular curriculum
– School grant program with a particular structure
Alternative interventions / complementary interventions
– Teacher training versus school grants as a way to improve outcomes
– Information and school grants as a way to boost school accountability and performance
Entire program / cluster of activities
Reform program
→ What is the audience (policymakers, technocrats, public at large)?

21 Understand the program using the results chain
Inputs: Teachers, Textbooks, Grants
Activities (what the program does): Teachers’ training, School councils established, Conditional cash transfers, Salary incentives
Outputs (goods & services): Increased enrollment, Lower teachers’ absenteeism
Outcomes: Higher primary education completion rates, Higher student learning achievements
Long-term Results: Lower unemployment, Poverty reduction, Better income distribution
This results chain provides guidance on what to measure

22 Determine what to measure
Based on the results chain, choose indicators for the evaluation
– Carefully defined indicators
– Can be measured in a precise way
– Are expected to be affected by the program…
– … within the timeframe of the evaluation
What are the important sub-populations?
– E.g. age, gender, urban/rural, SES…

23 Determine indicators using the results chain
[Same results chain as slide 21: Inputs → Activities → Outputs → Outcomes → Long-term Results]
Indicators related to the implementation of a program

24 Determine indicators using the results chain
[Same results chain as slide 21: Inputs → Activities → Outputs → Outcomes → Long-term Results]
Indicators related to the results of a program

25 Determine the methodology
We’ll be talking a lot more about this
– Experimental methods
– Quasi-experimental methods
Some principles:
– Prefer method that complements the program best
– Prefer method that does not alter program design or implementation substantively
– Prefer method that does not deny anyone benefits
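As a taste of the experimental approach mentioned above, the Python sketch below simulates random assignment of schools to a program and estimates the impact as the difference in mean outcomes between treated and control schools. Everything here is simulated and generic; it is not taken from the workshop materials.

```python
import random
import statistics

# Simulated illustration of the experimental idea: randomly assign schools to a
# program, then compare mean outcomes of treated and control groups.
random.seed(1)

n_schools = 200
true_impact = 5.0  # effect built into the simulation (test-score points)

schools = list(range(n_schools))
treated = set(random.sample(schools, n_schools // 2))  # random assignment

outcomes = {}
for s in schools:
    baseline = random.gauss(60.0, 10.0)  # score the school would have without the program
    outcomes[s] = baseline + (true_impact if s in treated else 0.0)

treated_mean = statistics.mean(outcomes[s] for s in treated)
control_mean = statistics.mean(outcomes[s] for s in schools if s not in treated)

# With random assignment, the control group's mean estimates the counterfactual,
# so the difference in means estimates the program's impact.
print(f"Estimated impact: {treated_mean - control_mean:.1f} points "
      f"(true impact built into the simulation: {true_impact})")
```

Randomization is what makes the control group a credible estimate of the counterfactual; quasi-experimental methods try to achieve the same thing when randomization is not possible.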

26 The IE “cycle”
Design phase → Baseline data collection → Program implementation → Follow-up data collection → Data analysis and reporting → Program adjustments → back to the design phase

27 The goal of impact evaluation
To improve policies
For example, to find out how to turn this teacher…

28 The goal of impact evaluation
…into this teacher

29 Thank you!

