3ie  Context: the results agenda  The call for evidence  Rigorous impact evaluation  The international response  Reactions.

Presentation transcript:

3ie  Context: the results agenda  The call for evidence  Rigorous impact evaluation  The international response  Reactions

3ie  Methodological developments  Results agenda › US: Government Results and Performance Act, GRPA – USAID adopted six ‘strategic development goals’ e.g. broad-based economic growth, defining outcome indicators for each e.g. per capita growth › UK: Public Service Agreements and Service Delivery Agreements (very similar to Report Cards and Development Indicators in South Africa)  MDGs › Focus on outcomes › Raises questions about attribution

3ie Goals were ‘so broad, and progress affected by many factors other than USAID programmes, that the indicators cannot realistically serve as measures of the agency’s specific efforts’. In 2000 USAID abandoned the goal outcome measures as a basis for monitoring its performance

3ie  Timescale, exacerbated by data lags  Decentralized programs not aligned with higher-level objectives  Data quality  Impact? Development agencies may not know exactly what impact their efforts are having given the wide range of other agencies and external political, economic and social factors involved.

3ie  Use existing data systems, but with a view to data quality and timeliness  Invest in improving existing data systems rather than making new (often parallel) ones  Know the advantages and limitations of different sorts of data (administrative versus survey data)  Avoid indicator proliferation  Don’t be all M and no E

3ie Outcome monitoring is not a valid basis for rigorous performance measurement … hence the need for impact evaluation
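The attribution problem behind this claim can be shown with a small hypothetical simulation (not from the slides; all numbers are invented): if outcomes improve for everyone because of a secular trend, a before-after comparison of participants credits the whole trend to the programme, even when the programme has zero true effect.

```python
import random

random.seed(1)

# Hypothetical numbers for illustration only.
TREND = 5.0        # economy-wide improvement between baseline and follow-up
TRUE_EFFECT = 0.0  # the programme actually changes nothing

# Baseline outcomes for 1,000 programme participants.
participants = [random.gauss(50, 10) for _ in range(1000)]
# Follow-up outcomes: everyone gains the secular trend, nothing more.
followup = [y + TREND + TRUE_EFFECT for y in participants]

mean = lambda xs: sum(xs) / len(xs)
before_after = mean(followup) - mean(participants)

# Outcome monitoring (before vs after) attributes the whole trend
# to the programme, despite a true effect of zero.
print(f"before-after 'impact': {before_after:.1f}")  # ≈ 5.0, all of it the trend
```

Without a comparison group there is no way to separate the programme's contribution from the trend, which is why monitoring outcomes alone cannot establish impact.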

3ie  International agency response › World Bank: DIME and IEG › Inter-American Development Bank*  At country level response strongest in Latin America › Progressa › In Mexico and Colombia social programs legally require impact evaluations to secure funding Note: * See IDS Bulletin March 2008 for discussion

3ie After decades in which development agencies have disbursed billions of dollars for social programs, and developing country governments and nongovernmental organizations (NGOs) have spent hundreds of billions more, it is deeply disappointing to recognize that we know relatively little about the net impact of most of these social programs.

3ie  UNICEF review (1995) found 15% had ‘impact assessments’ but ‘many evaluations were unable to properly assess impact because of methodological shortcomings’  ILO review of 127 studies of 258 community health programs found only 2 which could give robust conclusion on impact on access  Recent (2008) NORAD review found many reports draw impact conclusions on basis of weak evidence (or none at all)

3ie Billions of dollars are being spent with no idea if they fund the best intervention to achieve the desired outcome – or even if they help achieve it at all. Rigorous impact evaluation – and only rigorous impact evaluation – will provide the evidence needed for optimal resource allocation

3ie  When will we ever learn? › An approach which can attribute change in outcomes to the intervention › Requires a well-defined control group › Strongest design is developed at project design stage › Where feasible, a randomized approach should be considered  Poverty action lab › Active promotion of randomized control trials (RCTs) › Evaluations in conjunction with NGOs especially in India and Kenya

3ie Joint AfrEA, NONIE, 3ie impact evaluation conference, Cairo, May 2009

             NONIE                            3ie
History      Origins in DAC meeting           Evaluation gap working group
             November 2006
Purpose      Demand generation, guidance,     Promote quality impact studies,
             coordination                     including financing; advocacy
Membership   Evaluation networks              Governments and NGOs; IE
                                              practitioners are associates
Engagement   OPSC represented                 Sign up, membership and studies

3ie  Websites › World Bank (PREM, IEG and NONIE) › Poverty Action Lab › 3ie –  Training courses › PAL › IPDET › Africa Impact Evaluation Initiative  Opportunities to conduct IEs › DIME › 3ie › Own resources

3ie  Each year two themes and open window  Themes being determined – see 3ie website for details and to submit proposed questions  Southern-driven, issues-based with southern-led evaluation teams  Under each of these three windows › ‘quick wins’ › 6-8 large studies › 6 baseline studies › 6 synthetic reviews

3ie  Only promoting RCTs – not true  RCTs seen as gold standard – not exactly true, issues-led not methods-led, but where feasible an RCT should certainly be considered (best available method)  A positivist approach – true, but evidence-based policy making is an inherently positivist approach  Another northern initiative – partly true, but 3ie offers great potential for (1) Southern ownership of evaluation agenda, (2) harmonization around quality standards

3ie The current policy framework in South Africa implicitly demands quality impact evaluations, and resources are available to help meet that demand

3ie  The move to performance-based management leads to questions being asked about attribution  Impact evaluation is the approach to be adopted to address attribution  Impact evaluation has to adopt certain technical standards if it is to provide proper evidence  The new international initiatives exist to support efforts to expand programs of impact evaluation

3ie VISIT