Impact Evaluation for Real Time Decision Making
Africa Program for Impact Evaluation on HIV/AIDS (AIM-AIDS)
Cape Town, March 2009
Arianna Legovini
Africa Impact Evaluation Initiative (AIM) and Development Impact Evaluation Initiative (DIME), World Bank

Question | Non-aroused (%) | Aroused (%)
Can you imagine … being attracted to a 12 year old? | 23 | 46
Can you imagine … having sex with a 60 year old woman? | 7 | 23
Is just kissing frustrating? | 41 | 69
A condom decreases sexual pleasure | 66 | 78
Would you always use a condom if you did not know the sexual history of the partner? | 88 | 69
Would you use a condom even if you were afraid that the woman might change her mind while you went to get it? | 86 | 60

Impact of family illness on self-reported sex worker behavior

• Do we understand why people do the things they do?
• Prevention in HIV/AIDS is predicated on knowing the right answer
• Do people do what they know is right?
• Do we need to rethink how prevention works if knowledge alone is not enough?

• Contingent transfers
  ▪ Scholarships to stay in school in Kenya
  ▪ Cash transfers conditional on staying HIV negative in Tanzania
  ▪ Health insurance for sex workers?
• Precommitment strategies
  ▪ Legislating condoms in hotel rooms (Eritrea)
  ▪ Abstinence?
  ▪ Carrying condoms?

• The word impact is often misused as a synonym for higher-level outcome
• Impact originally means “effect of something onto something else”
• Here, impact is the portion of the observed change in an outcome caused by the intervention of interest

What is Impact Evaluation?
Counterfactual analysis to single out the causal effect of an intervention on an outcome
• Ideally, compare the same individual with and without “something” at the same point in time (impossible in practice)
• Instead, estimate the counterfactual: find a control or comparison group
Counterfactual criteria
• Treated and counterfactual groups have identical initial average characteristics
• The only reason for the difference in outcomes is the intervention
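A compact way to state the counterfactual idea, as a sketch in potential-outcomes notation (the notation is an addition, not from the original slides):

\[
\tau_i = Y_i(1) - Y_i(0), \qquad \text{ATE} = E\big[Y(1) - Y(0)\big],
\]

where \(Y_i(1)\) and \(Y_i(0)\) are individual \(i\)'s outcomes with and without the intervention; only one of the two is ever observed. If the comparison group satisfies the criteria above, the average treatment effect can be estimated as

\[
\widehat{\text{ATE}} = \bar{Y}_{\text{treated}} - \bar{Y}_{\text{comparison}}.
\]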

What is monitoring? Trend analysis
• Change over time
• Compare results before and after on the set of individuals with “something”
[Figure: outcome Y at t0 (point A) and at t1 (observed point B vs. counterfactual B′); the before-after change (B − A) mixes the impact (B − B′) with what would have happened anyway (B′ − A)]
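A minimal simulation sketch (hypothetical numbers, not from the slides) of why the before-after change reported by monitoring can overstate impact when outcomes would have improved anyway:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Hypothetical setup: everyone improves by 5 points anyway (secular trend),
# and the program adds a further 3 points for treated individuals.
trend, true_impact = 5.0, 3.0

baseline = rng.normal(50, 10, size=2 * n)
treated = np.zeros(2 * n, dtype=bool)
treated[rng.choice(2 * n, size=n, replace=False)] = True  # random assignment

followup = baseline + trend + true_impact * treated + rng.normal(0, 2, size=2 * n)

# Monitoring-style before-after change for the treated: picks up trend + impact.
before_after = followup[treated].mean() - baseline[treated].mean()

# Impact-evaluation-style comparison against the randomized control group.
impact_estimate = followup[treated].mean() - followup[~treated].mean()

print(f"Before-after change (treated): {before_after:.1f}")    # roughly 8 = trend + impact
print(f"Treated-vs-control difference: {impact_estimate:.1f}")  # roughly 3 = impact only
```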

Monitoring and Impact Evaluation
• Monitoring to track implementation efficiency (input to output)
• Impact evaluation to measure effectiveness (output to outcome)
[Figure: results chain from inputs ($$$) to outputs to outcomes (behavior); “monitor efficiency” spans inputs to outputs, “evaluate effectiveness” spans outputs to outcomes]

M&E: monitoring & process evaluation (descriptive analysis)
• Is the program being implemented efficiently?
• Is the program targeting the right population?
• Are outcomes moving in the right direction?

Impact evaluation (causal analysis)
• What was the effect of the program on outcomes?
• How would outcomes change under alternative program designs?
• Is the program cost-effective?

• Are conditional cash transfers being delivered as planned? → M&E
• Does peer-to-peer communication increase awareness? → IE
• What are the trends in HIV prevalence? → M&E
• Does HIV testing affect prevention behavior? → IE

Nutrition & Early Child Development in Uganda
• Strong impact evaluation results: children in the treatment group scored half a standard deviation better than children in the control group
• Failed project: the project ran into financial difficulties, Parliament reacted negatively, and the intervention was stopped
• Recently, the Presidency asked to take a second look at the evaluation: saving the baby?
Separate performance from quality of intervention: babies & bath water

• Improve the quality of programs
  ▪ Separate institutional performance from quality of intervention
  ▪ Test alternatives and inform design in real time
• Increase program effectiveness
  ▪ Answer the “so what” questions
• Build government institutions for evidence-based policy-making
  ▪ Plan for implementation of options, not solutions
  ▪ Find out what alternatives work best
  ▪ Adopt a better way of doing business and taking decisions

[Diagram: how impact evaluation evidence feeds decision making across government]
• PM/Presidency: communicate to constituencies (campaign promises, accountability), using the effects of government programs
• Treasury/Finance: allocate the budget, using the cost-effectiveness of different programs
• Line ministries: deliver programs and negotiate the budget (service delivery), using the cost-effectiveness of alternatives and the effects of sector programs

From retrospective, external, independent evaluation:
• Top down
• Determine whether the program worked or not
To prospective, internal, operationally driven (and externally validated) impact evaluation:
• Set the program learning agenda bottom up
• Consider plausible implementation alternatives
• Test scientifically and adopt the best
• Just-in-time advice to improve the effectiveness of the program over time

• Bottom up requires capacity development for IE in implementing agencies
  ▪ Some formal training
  ▪ Mainly application and learning by doing as part of the evaluation team
• Objective
  ▪ Use impact evaluation as an internal and routine management tool
  ▪ Secure policy feedback

• Question the design choices of the program
  ▪ Institutional arrangements, delivery mechanisms, packages, pricing/incentive schemes
• Use random trials to test alternatives (see the sketch below)
• Focus on short-term outcomes
  ▪ Take-up rates, use, adoption
• Follow-up data collection and analysis months after exposure
• Measure the impact of alternative treatments on short-term outcomes and identify the “best”
• Change the program to adopt the best alternative
• Start over
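A minimal sketch of such a comparison between two hypothetical design alternatives on a short-term take-up outcome (arm names and rates are illustrative assumptions, not from the program):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_per_arm = 500

# Hypothetical take-up rates under two delivery alternatives being tested.
arms = {"clinic-based delivery": 0.35, "peer-outreach delivery": 0.45}

# Simulate random assignment and the binary short-term outcome (took up the service or not).
takeup = {name: rng.binomial(1, p, size=n_per_arm) for name, p in arms.items()}

for name, y in takeup.items():
    print(f"{name}: take-up = {y.mean():.2%}")

# With random assignment, the difference in mean take-up estimates the impact of
# one design relative to the other on this short-term outcome.
a, b = takeup.values()
t, pval = stats.ttest_ind(a, b)
print(f"difference = {a.mean() - b.mean():+.2%}, p-value = {pval:.3f}")
```

In practice the “best” arm on short-term outcomes would then be adopted and the cycle restarted, as the slide describes.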

• How much does the program deliver? Is it cost-effective?
• Use the most rigorous method of evaluation possible
• Focus on higher-level outcomes
  ▪ Educational achievement, health status, income
• Measure the impact of the operation on its stated objectives and on a metric of common outcomes
  ▪ One-, two-, or three-year horizon
• Compare with results from other programs (see the ratio sketched below)
• Inform the budget process and allocations
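For comparisons across programs, a generic cost-effectiveness ratio (an illustration, not a formula given in the slides) is

\[
\text{CE} = \frac{\text{incremental program cost}}{\text{incremental impact on the common outcome metric}},
\]

for example, cost per additional year of schooling or per HIV infection averted; the program with the lowest ratio delivers the most for a given budget.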

Shifting Program Paradigm
From:
• The program is a set of activities designed to deliver expected results
• The program will either deliver or not
To:
• The program is a menu of alternatives with a learning strategy to find out which work best
• Programs change over time to deliver more results

• This is a technical assistance product to change the way decisions are taken
• It is about building a relationship
• Adds results-based decision tools to complement existing sector skills
• The relationship delivers not one but a series of analytical products
• Must provide useful (actionable) information at each step of the impact evaluation

AIM: Africa Impact Evaluation Initiative
• Empower clients to learn about and adopt technologies that work
• Build knowledge and work with operations to scale up success

Working with 86 agencies in 28 countries: 65 experimental, 21 non-experimental

• Creation of learning teams within the national agencies
  ▪ Develop a pool of local researchers
• Multi-country workshops (learn & apply / thematic model)
  ▪ Pilot (Aug): East Africa Seminar, Mombasa, Kenya
  ▪ Ethiopia 2006, South Africa 2006
  ▪ Malaria 2007, Education 2007
  ▪ HIV & Malaria 2008, Education 2008
  ▪ HIV 2009, Agriculture 2009, Community-Driven Development 2009
• In-country workshops
• South-to-South collaboration and virtual network of practitioners and researchers
• North-to-South partnerships
  ▪ Harvard, MIT, Berkeley, UCL, LSHTS, IFPRI

• Develop the team
  ▪ Counterpart, project, and research staff working together throughout design and implementation
• Facilitate design and implementation of evaluations
  ▪ Moderate a process of critical thinking about the government program
  ▪ Identify policy questions, evaluation design, timeline and budget, and prepare concept notes and funding proposals
  ▪ In-country stakeholder consultations, registration of trials, and clearance with national authorities
• Place a field coordinator for day-to-day implementation support
  ▪ Implementation modalities, guidance for data collection, management and analysis

• Coordinating unit
• Technical Advisory Group
  ▪ Develop and harmonize methods, instruments, and best-practice approaches
  ▪ Clearing function for design and data collection protocols
  ▪ Ongoing monitoring
  ▪ Intervention in case of quality failures
• Summarize lessons learned in materials that are accessible and relevant (AIM website, papers, policy notes)

• AIM-CDD (Community-Driven Development): 8 countries, implementation stage
• APEIE (Africa Program for Education Impact Evaluation): 12 countries, implementation stage
• MIEP (Malaria Impact Evaluation Program): 7 countries (AFR/SAR), implementation stage
• AIM-AIDS (HIV/AIDS Impact Evaluation Program): 8 countries, preparatory stage
• AIM-ECD (Impact Evaluation of Early Childhood Development): 4 countries, preparatory stage
• AIM-Water (Impact Evaluation of Water Supply): 8 countries (AFR/LAC), preparatory stage
• AADAPT (Agricultural Adaptation): 5 countries in preparation, 10 countries in discussion stage

• Secure a coordinated policy learning agenda
  ▪ Address knowledge gaps
• Improve comparability and generalizability of findings
  ▪ Harmonization of measurement
• Cost-effectiveness through pooling of resources
• Technical advisory groups provide
  ▪ Governments with access to the best available expertise in a thematic field
  ▪ A strong mechanism for quality assurance
• Facilitate the implementation of effective multi-country capacity development strategies
  ▪ South-South exchange and knowledge sharing

• Objectives
  ▪ Build rigorous country-level evidence
  ▪ Build technical and institutional capacity
  ▪ Focus on national priorities
• Co-leadership
  ▪ ACTafrica
  ▪ Africa Impact Evaluation Initiative / Development Impact Evaluation Initiative

Focus on prevention
• Do our prevention strategies work?
• Are some approaches better than others?
• What innovations should we test to inform the next generation of projects?

[Organization chart]
• AIM-AIDS research teams: lead researcher & field coordinator
• MAP teams and government IE teams
• Coordinating unit: IE leads, sector leads, program coordinator, team support
• Technical Advisory Group: researchers & specialists
• Working groups: impact evaluation; sampling & instruments; epidemiology; prevention & behavioral change; testing and treatment; cost-effectiveness

• Information campaign: Benin, DRC, Mauritania
• Peer-to-peer communication: Eritrea
• VCT services: Kenya
• Testing: Ivory Coast, Malawi
• Conditional cash transfers: Tanzania, Burkina Faso
• Treatment: South Africa, Kenya

Thank You