Africa Impact Evaluation Program on AIDS (AIM-AIDS)
Cape Town, South Africa, March 8–13, 2009
Steps in Implementing an Impact Evaluation — Nandini Krishnan


Africa Impact Evaluation Program on AIDS (AIM-AIDS)
Cape Town, South Africa, March 8–13, 2009
Steps in Implementing an Impact Evaluation
Nandini Krishnan, Africa Impact Evaluation Initiative, World Bank
March 11, 2009

2 Step 1: Identify priorities
Know your epidemic
Examine the sector plan: poverty reduction, health, long-term strategy
Identify the highest priorities for learning in HIV/AIDS:
– HIV testing: VCT services
– Prevention: education for HIV/AIDS prevention, media campaign, PMTCT
– Treatment and care: ART
– Mitigation: OVC interventions

3 Step 1: Identify priorities (continued)
Priority interventions:
– Unknown benefits
– Costly intervention
– New intervention
– National or regional policy thrust: resources focused on ..
Priority outcomes of interest:
– Intermediate
– Final
Jointly determine the intervention(s) and the outcomes
Arrive at a feasible evaluation question

4 Step 2: Understand the roll-out of the intervention and opportunities for impact evaluation
How will the program be rolled out? Different interventions?
– Piloted in a random sample of communities / schools / regions?
– Rolled out nationwide?
– Rolled out in communities/schools/regions satisfying certain criteria?
– Rolled out to a targeted high-risk population or high-risk areas?
Understand TARGETING and PROGRAM PARTICIPATION
Each roll-out strategy yields distinct opportunities for impact evaluation

5 Step 3: Appropriate design
Keep in mind:
– The needs of the intervention: target population / high-risk areas
– The needs of the evaluation: take advantage of opportunities for random assignment or phased roll-out
Example: 1,000 schools in high-risk/priority areas are to receive education for HIV/AIDS interventions
– Randomly assign 300 to Year 1 and 350 each to Years 2 and 3, or
– Identify the 500 neediest (using clearly defined criteria), assign them to Years 1 and 2, and randomize the rest
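The phased roll-out above can be sketched as a simple randomized assignment. This is a minimal illustration rather than anything from the original slides; the school IDs and phase sizes below are hypothetical.

```python
import random

def assign_phases(school_ids, phase_sizes, seed=42):
    """Randomly assign schools to roll-out phases (Year 1, Year 2, ...).

    phase_sizes must sum to len(school_ids); shuffling first gives every
    school the same chance of landing in any phase.
    """
    if sum(phase_sizes) != len(school_ids):
        raise ValueError("phase sizes must cover every school exactly once")
    rng = random.Random(seed)  # fixed seed -> reproducible, auditable assignment
    shuffled = rng.sample(school_ids, k=len(school_ids))
    phases, start = {}, 0
    for year, size in enumerate(phase_sizes, start=1):
        for sid in shuffled[start:start + size]:
            phases[sid] = year
        start += size
    return phases

# Hypothetical example: 1,000 schools, 300 in Year 1, 350 each in Years 2-3.
schools = list(range(1000))
assignment = assign_phases(schools, [300, 350, 350])
```

Schools waiting for Years 2 and 3 serve as the comparison group for Year 1 schools until they are phased in.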

6 Step 3: More on design
Determine scale: large scale or small pilot?
Universal scale: difficult to evaluate, usually not feasible
At large scale:
– May be representative
– More costly to implement
– Better information about regional or national effectiveness
Small pilot (e.g., in two districts):
– Easier to implement
– Not as informative

7 Step 4: Assignment to treatment and control
Is random assignment feasible? At what level?
– The unit of intervention is often simplest
Example: education for HIV/AIDS prevention can be randomly assigned at the school, clinic, or community level
Trade-off: assigning at a higher level means a bigger sample is needed
Alternative assignment strategies?
– If the intervention must be targeted, use clearly defined criteria and, if feasible, phase in randomly within the eligible population / schools / clinics / communities
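When the unit of intervention is a group (a school, clinic, or community), randomization happens at that level and every individual within a cluster shares its status. A minimal sketch of cluster-level assignment, with hypothetical community names:

```python
import random

def randomize_clusters(clusters, treat_fraction=0.5, seed=7):
    """Assign whole clusters (communities / schools / clinics) to treatment
    or control; all individuals in a cluster get the same status."""
    rng = random.Random(seed)  # fixed seed -> reproducible assignment
    shuffled = rng.sample(clusters, k=len(clusters))
    n_treat = int(len(clusters) * treat_fraction)
    treated = set(shuffled[:n_treat])
    return {c: ("treatment" if c in treated else "control") for c in clusters}

# Hypothetical: six communities, half assigned to treatment.
status = randomize_clusters(["A", "B", "C", "D", "E", "F"])
```

The trade-off in the slide shows up here: fewer, larger clusters means fewer independent units, so more clusters (a bigger sample) are needed for the same statistical power.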

8 Step 5: Collect baseline data
Baseline data are not strictly necessary if assignment is randomized: randomization implies treatment and control are similar
But baseline data:
– Allow us to verify that treatment and control appear balanced
– Allow difference-in-differences with random assignment, regression discontinuity design, and other quasi-experimental methods
– Inform project design and implementation:
1. Who was targeted? Did the program mostly benefit patients who were poor or at high risk at baseline?
2. How well were they targeted? Baseline data allow analysis of targeting efficiency
3. Sometimes they yield unexpected but important information

9 Step 5: Baseline questionnaires
1. Include areas essential to the impact evaluation:
– Ultimate outcomes we care most about: prevalence rates
– Intermediate outcomes we expect to change first: behavior?
– Characteristics that might affect outcomes
2. Take advantage of the opportunity to collect essential sector data
– Kenya: an IE on VCT uptake used a census to get information on activities at a representative VCT centre
3. Who collects it?
– Bureau of Statistics: integrate with existing data
– The ministry concerned: e.g., the Ministry of Health
– A private agency: sometimes easier quality monitoring

10 Step 6: Check for balance on pre-treatment characteristics
Do the treatment and control groups look similar at baseline?
If not, all is not lost! Even in the absence of perfect balance, baseline data can be used to adjust the analysis (difference-in-differences)

            Poverty   Female-headed   Children per   Formal-sector
                      households      household      job
Treatment   70%       64%             3.1            20%
Control     68%       66%             2.9            18%
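A balance check like the table above amounts to a two-sample test on each baseline characteristic. A minimal sketch using a two-proportion z-test: the poverty shares come from the table, but the group sizes (500 per arm) are hypothetical.

```python
import math

def two_proportion_ztest(p1, n1, p2, n2):
    """Two-sided z-test for equality of two proportions.
    Returns (z statistic, approximate p-value via the normal CDF)."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Poverty rates from the baseline table: 70% treatment vs 68% control,
# with a hypothetical 500 households per arm.
z, p = two_proportion_ztest(0.70, 500, 0.68, 500)
```

Here a large p-value is the desired result: it means there is no evidence of imbalance between the arms on that characteristic.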

11 Step 7: Roll out the intervention
Monitor the roll-out to ensure the evaluation is not compromised
What if the benefits are accidentally rolled out to everyone, all at once?
– Example: a radio-based information campaign is rolled out to all districts
– The evaluation is compromised: monitoring was needed!
What if the whole control group receives some other benefit?
– Example: an NGO targets control communities to receive VCT
– This changes the evaluation: add an information campaign in the treatment communities

12 Step 7: Gather information on the roll-out
In reality, who receives which benefits, and when?
– This could affect the impacts measured: variation in exposure to treatment
Does the intervention involve something other than initially planned?
– Example: you learn that those giving resources to VCT clinics also gave detailed guidance on clinic management
– The program impact now includes that guidance

13 Step 8: Collect follow-up data
Collect follow-up data for both the treatment and control groups, at appropriate intervals; consider how long it should take for outcomes to change
– A sub-sample at six months? Captures intermediate changes
– One year: provides initial outcomes; adjust the program if needed
– Two years: changes in longer-term outcomes?
– After the end of the program: do the effects endure? What happens once the education for HIV/AIDS prevention program has ended?

14 Step 9: Estimate program impacts
Randomization: simply compare average outcomes for the treatment and comparison groups
Other methods: make statistical assumptions to estimate the impact of the program
A combination of methods can also be used
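Under randomization the impact estimate is just the difference in mean outcomes; with baseline data, difference-in-differences also nets out any fixed gap between the groups. A minimal sketch with hypothetical outcome values (e.g., the share of respondents reporting protective behavior):

```python
from statistics import mean

def simple_difference(treat_post, control_post):
    """Randomized design: impact = mean(treatment) - mean(control)."""
    return mean(treat_post) - mean(control_post)

def diff_in_diff(treat_pre, treat_post, control_pre, control_post):
    """DiD: change in the treatment group minus change in the control
    group, removing any fixed baseline gap between the two."""
    return (mean(treat_post) - mean(treat_pre)) - (
        mean(control_post) - mean(control_pre))

# Hypothetical data: three treatment and three control community means.
impact = simple_difference([0.62, 0.58, 0.64], [0.51, 0.49, 0.53])

# Hypothetical pre/post means for two communities per arm.
did = diff_in_diff([0.40, 0.42], [0.60, 0.62], [0.41, 0.39], [0.50, 0.48])
```

In the DiD example the treatment group improves by 0.20 and the control group by 0.09, so the estimated impact is 0.11.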

15 Are the effects big enough to matter?
Are the effects statistically significant?
– A basic statistical test tells whether differences are due to the program or to noisy data
Are they policy significant?
– If the anti-HIV media campaign costs a million dollars and has a positive but tiny effect, it may not be worthwhile
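Policy significance can be checked with back-of-the-envelope cost-effectiveness arithmetic: divide total cost by the number of outcomes the program changed. All numbers below are hypothetical, purely to illustrate the calculation.

```python
def cost_per_case_averted(program_cost, effect_size, population_reached):
    """Cost per averted infection: total cost divided by the number of
    cases prevented (effect size x people reached)."""
    cases_averted = effect_size * population_reached
    return program_cost / cases_averted

# Hypothetical: a $1,000,000 campaign reduces incidence by 0.1 percentage
# points among 100,000 people reached -> 100 cases averted.
cost = cost_per_case_averted(1_000_000, 0.001, 100_000)  # $10,000 per case
```

A statistically significant effect can still fail this test: if the cost per case averted far exceeds that of alternative interventions, the program may not be worthwhile.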

16 Disseminate!
If no one knows about the results, they won't make a difference to policy!
Make sure the information gets into the right policy discussions
Build government ownership and capacity
Forums:
– Workshops
– Reports
– Policy briefs

17 Iterate
Re-examine sector priorities: identify the next learning opportunity
Or suppose the effects aren't as large as you hoped:
– Test variations (e.g., different training for the teachers and clinic officers who implement an education for HIV prevention program)
– Test other interventions that affect the same outcomes (e.g., a media campaign for HIV/AIDS awareness versus an education for HIV/AIDS program)

18 Thank You