Randomization This presentation draws on previous presentations by Muna Meky, Arianna Legovini, Jed Friedman, David Evans and Sebastian Martinez.

Presentation transcript:


Objective
To estimate the causal effect of an intervention on the treated.
We need a good counterfactual (control group).

Illustration (chart): distinguishing program impact from the effect of exogenous factors.

Motivation
Statistical analysis typically involves inferring the causal relationship between X and Y from observational data. This poses many challenges and requires complex statistics, and we never know whether we are measuring the true impact.
Impact evaluation:
Retrospectively: faces the same challenges as statistical analysis.
Prospectively: we generate the data ourselves through the program's design, so the evaluation design makes things much easier.

Gold standard: experimental design
Experimental design = randomized evaluation.
The only method that ensures balance in unobserved (and observed) characteristics, so the only difference between groups is the treatment.
Everyone has an equal chance of assignment to treatment or control.
With a large sample, all characteristics average out.
Result: precise and unbiased impact estimates.
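The "characteristics average out" claim can be checked with a small simulation. This is an illustrative sketch, not part of the original slides: the traits "age" (observed) and "motivation" (unobserved) are hypothetical, and the point is only that a coin-flip assignment balances both in a large sample.

```python
import random

random.seed(0)

# Hypothetical population: each person has an observed trait (age) and an
# unobserved trait (motivation). Distributions are illustrative only.
population = [
    {"age": random.gauss(35, 10), "motivation": random.gauss(0, 1)}
    for _ in range(10_000)
]

# Random assignment: each person has an equal chance of treatment or control.
treatment, control = [], []
for person in population:
    (treatment if random.random() < 0.5 else control).append(person)

def mean(group, key):
    return sum(p[key] for p in group) / len(group)

# With a large sample, both observed and unobserved traits average out:
# the group means are close, even for the trait we could never measure.
print(abs(mean(treatment, "age") - mean(control, "age")))                # small
print(abs(mean(treatment, "motivation") - mean(control, "motivation")))  # small
```

No matching or modeling on covariates was needed: balance on "motivation" comes from the randomization itself, which is exactly what non-experimental methods cannot guarantee.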

Options for randomization
Random assignment with and without treatment (partial coverage):
Lottery on the whole population
Lottery on the eligible population
Lottery after assignment to the most deserving populations
Random phase-in (full coverage, delayed entry)
Variation in treatment (full coverage, alternative treatments): across treatments that we think ex ante are equally likely to work
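Two of these options can be sketched in a few lines. The village names and the two-period phase-in below are hypothetical; the sketch only shows that a lottery is a shuffle-and-split, and that a phase-in is the same lottery deciding *when* a unit is treated rather than *whether*.

```python
import random

random.seed(42)

villages = [f"village_{i:02d}" for i in range(20)]  # hypothetical units

# Lottery on the eligible population: shuffle the list, then split in half.
shuffled = villages[:]
random.shuffle(shuffled)
treatment, control = shuffled[:10], shuffled[10:]

# Random phase-in (full coverage, delayed entry): every village is eventually
# treated; the lottery only decides the start period. Period-2 villages serve
# as the control group during period 1.
phase = {
    v: period
    for period, group in enumerate([treatment, control], start=1)
    for v in group
}
```

Under a phase-in, comparisons are only valid before the delayed group enters; once everyone is treated, the experimental control group no longer exists.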

Example: impact of condom distribution
Random assignment: treatment group receives condoms; control group receives nothing.
Random phase-in: treatment group receives condoms today; control group receives condoms in the next period.
Variation in treatment: control group receives condoms and information.
Speaker notes: What are some examples that could be used in the context of education? The analogy is how a new drug is tested. A big difference between medical and social experiments is that medical experiments can be double-blind. What does that mean? Flip a coin.

Operational opportunities for randomization
Resources constrain full coverage: random assignment is fair and transparent.
Limited implementation capacity: phase-in gives everyone the same chance to go first.
No evidence on which alternative is best: random assignment to alternatives with an equal ex ante chance of success.

Operational opportunities for randomization, cont.
Pilot a new program: a good opportunity to test before scaling up.
Operational changes to ongoing programs: a good opportunity to test changes before scaling them up.

At what level should you randomize?
It depends on the unit of intervention:
Individual or household
Group
School
Community or village
Health unit/district or hospital
District/province

Unit of randomization
Individual or household randomization is the lowest-cost option.
Randomizing at higher levels requires much bigger samples because of within-group correlation.
There may be political challenges to unequal treatment within a community, but look for creative solutions.
Some programs can only be implemented at a higher level.
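The "much bigger samples" point has a standard formula behind it: with clusters of size m and within-cluster correlation (ICC) rho, the required sample is inflated by the design effect 1 + (m - 1) * rho. The cluster sizes, ICC values, and baseline sample size below are illustrative, not from the slides.

```python
# Design effect for cluster randomization: randomizing groups of size m with
# within-group correlation rho inflates the required sample size by
# deff = 1 + (m - 1) * rho.

def design_effect(cluster_size: int, icc: float) -> float:
    return 1 + (cluster_size - 1) * icc

# Illustrative comparison: a study that would need 400 people under
# individual randomization needs far more when whole schools of 30 are
# randomized and outcomes are correlated within a school.
n_individual = 400
for m, rho in [(1, 0.0), (30, 0.05), (30, 0.20)]:
    deff = design_effect(m, rho)
    print(f"cluster size {m}, ICC {rho}: need ~{round(n_individual * deff)} people")
```

Individual randomization (m = 1) has a design effect of exactly 1, which is why it is the lowest-cost option whenever the program allows it.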

Basic elements of an experimental evaluation (diagram, based on Orr 1999): from the target population (high-risk groups such as sex workers, truck drivers, and women, particularly poor women), identify the potential participants; from these, draw the evaluation sample; random assignment then splits the sample into a treatment group (participants, some of whom drop out) and a control group.

External and internal validity
External validity: the sample is representative of the total population, so the results in the sample represent the results in the population, and we can apply the lessons to the whole population.
Internal validity: the sample is representative of a segment of the population; the results in the sample are representative for that segment, but we cannot apply the lessons to the whole population. The results do not inform scale-up without additional assumptions.
Example: a condom intervention on sex workers versus a condom intervention on the whole population.

External validity (diagram): random sampling from the national population yields samples that represent the national population.

Internal validity (diagram): the population is first stratified; randomization within a population stratum yields samples of that stratum.
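Randomizing within strata, as in the diagram, is straightforward to implement: shuffle and split separately inside each stratum, so the treatment share is exact within every stratum. The two-stratum frame ("north"/"south") below is a hypothetical example.

```python
import random

random.seed(1)

# Hypothetical sampling frame: 100 units labeled by stratum (e.g. region).
units = [{"id": i, "stratum": "north" if i < 50 else "south"} for i in range(100)]

def stratified_assignment(units, key="stratum"):
    """Randomize separately within each stratum so that exactly half of
    every stratum is treated (balance is guaranteed, not just expected)."""
    strata = {}
    for u in units:
        strata.setdefault(u[key], []).append(u["id"])
    assignment = {}
    for ids in strata.values():
        random.shuffle(ids)
        half = len(ids) // 2
        for uid in ids[:half]:
            assignment[uid] = "treatment"
        for uid in ids[half:]:
            assignment[uid] = "control"
    return assignment

assignment = stratified_assignment(units)
```

Within each stratum this is an ordinary lottery, so internal validity for that stratum is preserved; what stratification adds is exact balance on the stratifying variable.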

Example: condom distribution, internal validity (diagram): random assignment within a sample of truck drivers.

Example: condom distribution in communities near truckers' rest stops
Basic sequence of tasks for the evaluation:
Baseline survey in rest-stop areas
Random assignment of rest stops
Intervention: condom distribution to women in communities neighboring the rest stops assigned to treatment
Follow-up survey

Efficacy and effectiveness
Efficacy (proof of concept): smaller scale, pilot in ideal conditions.
Effectiveness: at scale, under prevailing implementation arrangements. Higher or lower impact? Higher or lower costs?
Speaker notes: Impacts may be lower (e.g. deworming, malaria: things that need a critical mass to be eradicated) or higher (ideal conditions). Costs: economies of scale versus inefficient scale-up (for example, from an NGO to inefficient government systems).

Advantages of experiments
Clear and precise causal impact.
Relative to other methods, experiments are:
Much easier to analyze
Cheaper (smaller sample sizes)
Easier to convey
More convincing to policymakers
Methodologically uncontroversial
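"Much easier to analyze" is concrete: with random assignment, the impact estimate is just a difference in group means. The sketch below simulates an experiment with a known true effect (all numbers are illustrative) and recovers it with no regression or matching.

```python
import random

random.seed(7)

# Simulated experiment: the intervention truly raises the outcome by 5 units.
TRUE_EFFECT = 5.0
treated = [random.gauss(50 + TRUE_EFFECT, 10) for _ in range(2000)]
control = [random.gauss(50, 10) for _ in range(2000)]

# Because assignment was random, the control mean is a valid counterfactual,
# so the analysis is a simple difference in means.
impact = sum(treated) / len(treated) - sum(control) / len(control)
print(f"estimated impact: {impact:.2f}")  # close to the true effect of 5
```

Non-experimental methods would instead have to model selection into treatment; here the design, not the statistics, does the identification work.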

When is randomization really not possible?
The treatment has already been assigned and announced, with no possibility of expanding treatment.
The program is over (retrospective evaluation).
Universal eligibility and universal access. Example: free access to ARVs.

Thank You