Update on the Multilateral Effectiveness Initiative James Melanson Director of Evaluation CIDA DAC Network on Development Evaluation June 2013.


The Challenge
–Gap in information on the development effectiveness (DE) of multilateral organizations (MOs)
–Strong need for performance information on MOs, given the current climate of evidence-based decision making on resource allocation
–Variable coverage, quality and reliability of reporting among MOs does not provide a clear picture of their performance
–Independent joint evaluations of MOs are infrequent, lengthy and costly
–Existing efforts focus primarily on organizational assessments of MOs, which do not directly address development effectiveness

The Response: A New Methodology
2009 – DAC EVALNET Task Team established to develop a new methodology that:
–Generates a body of credible information on a common set of criteria, providing a picture of the development effectiveness of MOs
–Builds on evidence (evaluation reports) that is already available
–Uses methods that are modest in time and cost requirements, with limited burden on MOs
2010 – Methodology developed and pilot tested (ADB & WHO)
2011 – DAC EVALNET endorses it as an acceptable methodology

Assessing Development Effectiveness: Common Criteria
The methodology focuses on a description of development effectiveness that maps onto the DAC evaluation criteria:
–Achievement of development objectives and expected results
–Crosscutting themes (environmental sustainability and gender equality)
–Sustainability of results/benefits
–Relevance of interventions
–Efficiency
–Use of evaluation and monitoring to improve effectiveness

Scenarios & Options
A preliminary review (establish the universe, screen reports) leads to one of three scenarios, each with a corresponding option:
–Scenario A: MO reporting on DE is adequate → Option 1: Rely on MO reporting systems
–Scenario B: MO reporting on DE is not adequate, but the evaluation function is → Option 2: Conduct a systematic synthesis of information from available evaluations (apply the meta-synthesis of evaluation results methodology)
–Scenario C: MO effectiveness reporting and available evaluations are inadequate for reporting on DE → Option 3: Implement actions aimed at strengthening the MO evaluation system and DE reporting

Methodology: Preliminary Review
–Establish the universe of evaluation reports prepared by the MO over a three- to four-year time-frame
–Select a sample that provides reasonable coverage of MO programming (geographic, thematic, objectives, sector, technical focus)
–Screen the sampled reports for quality against accepted DAC and UNEG quality standards
–Decide on Scenario A, B or C depending on the screening results
–To validate that an MO falls under Scenario A, it might still be assessed under Scenario B
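The triage logic of the preliminary review can be sketched in code. This is a minimal, hypothetical illustration: the thresholds, field names and scenario rules are assumptions for demonstration, not part of the EVALNET methodology.

```python
def choose_scenario(reports, min_usable=12, min_pass_share=0.6):
    """Screen a sample of evaluation reports and suggest a scenario.

    reports: list of dicts like {"meets_quality_standards": bool}.
    Returns "B" when enough quality evaluations exist to support a
    meta-synthesis, "C" when the available evaluations are inadequate.
    (Thresholds are illustrative assumptions.)
    """
    if not reports:
        return "C"
    usable = [r for r in reports if r["meets_quality_standards"]]
    pass_share = len(usable) / len(reports)
    if len(usable) >= min_usable and pass_share >= min_pass_share:
        # Note: even an MO whose own DE reporting looks adequate
        # (Scenario A) might still be assessed under Scenario B
        # to validate that reporting.
        return "B"
    return "C"
```

A screening run over, say, 15 sampled reports of which 12 pass the quality screen would suggest Scenario B; an empty or low-quality sample falls to Scenario C.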

Methodology: Meta-Synthesis (Scenario B)
–Review, analyze and classify evaluation findings for each criterion (following operational guidelines)
–Identify contextual factors contributing to or inhibiting effectiveness for each criterion
–Prepare a report that summarizes findings and context, and establishes conclusions and recommendations
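The classification step above amounts to tallying extracted findings by criterion. The sketch below is a hypothetical illustration of that bookkeeping; the criterion names mirror the common criteria slide, but the rating labels and data shape are assumptions.

```python
from collections import defaultdict

# Criteria from the "Common Criteria" slide.
CRITERIA = [
    "achievement_of_objectives",
    "crosscutting_themes",
    "sustainability",
    "relevance",
    "efficiency",
    "use_of_evaluation_and_monitoring",
]

def synthesize(findings):
    """findings: iterable of (criterion, rating) pairs, where rating is an
    assumed label such as "positive", "negative" or "mixed".
    Returns per-criterion rating counts for the synthesis report."""
    summary = {c: defaultdict(int) for c in CRITERIA}
    for criterion, rating in findings:
        if criterion in summary:
            summary[criterion][rating] += 1
    return summary
```

In practice each tallied finding would also carry its contextual factors, so the report can explain what contributed to or inhibited effectiveness, not just how often each rating occurred.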

Experience to Date
CIDA led, jointly with the Netherlands, the WFP and UNDP reviews:
–Successful use of the methodology; WFP and UNDP fall under Scenario B (reporting on DE not adequate, but the evaluation function is)
–Reviews provided a good understanding of the organizations' development effectiveness
–Constructive conversations took place at the boards
CIDA prepared reports based on the pilot tests of the ADB and WHO reviews:
–Successful use of the methodology
–Reviews provided a good understanding of the organizations' development effectiveness; however, given the low number of evaluations available from WHO, no generalization at the organization level was possible

Experience to Date (continued)
CIDA led the AfDB review:
–Report is being finalized and will be published on the CIDA and EVALNET websites
The Netherlands led the UNICEF review:
–Report presented to the board in May

The Findings (Scenario B)
–Organization is effective in achieving most of its objectives and expected results
–Programs are highly relevant to the needs of the target groups and developing country governments
–Sustainability and efficiency represent areas for improvement
–Challenges exist with gender equality and environmental sustainability
–Good use of evaluation, but inadequate performance frameworks and weak monitoring

The Findings (Scenario B)
–Organization is effective in achieving most of its objectives and expected results, and in supporting gender equality and environmental sustainability
–Programs are highly relevant to the context in which they work
–Improving the sustainability of benefits remains a challenge
–Efficiency is an area for improvement
–Organization faces issues in strengthening decentralized systems for evaluation, monitoring and results-based management
–Evaluation Office produces high quality evaluations

The Findings (Scenario B)
–Insufficient evidence available to make generalizable conclusions
–The limited number of evaluation reports provide some insights into the effectiveness of those programs:
 –Programs appear to be relevant to stakeholder needs and national priorities, and effective in achieving most of their development objectives and expected results
 –Programs appear to be sustainable, but there are challenges in sustaining the capacity of partners
 –Evaluations have not regularly addressed effectiveness in supporting gender equality or environmental sustainability
 –Systems for evaluation and monitoring appear to be unsatisfactory

The Findings (Scenario A)
–Most programs achieve their objectives and expected results
–Programs are relevant to stakeholder needs and national priorities
–Improving the sustainability of benefits remains a challenge
–Efficiency represents an area for improvement
–Programs contribute to gender equality and environmental sustainability, but improvements are needed with the latter
–Evaluation is effective and well used, but challenges are highlighted in monitoring and results-based management

The Findings (Scenario B)
–Programs are largely effective and highly relevant to the needs of target groups
–A stronger focus on gender equality and environmental sustainability is needed
–Continuity and sustainability of program benefits remain a challenge
–Efficiency appears to be a challenge
–Effective use of evaluation through increased preparation of management responses, but challenges remain with integration and dissemination of evidence
–Challenges remain with respect to monitoring and results-based management

MO Engagement
Donors engage with multilateral organizations throughout the review process:
–Launch of the review
–Establishment of the evaluation universe
–Confirmation of the evaluation sample
–Preliminary findings
–Draft report
–Final report
A donor-neutral version is prepared for each review and published on the OECD EVALNET website, for use by any partner donor.

Utility of this Approach
For CIDA:
–Allowed demonstration of accountability for results in multilateral investments
–Improved the ability to substantiate positions in engagement at the boards
For Multilateral Organizations:
–Optimizes resources and reduces transaction costs of reviews by using published, publicly available evaluation reports that capture evidence-based progress and lessons learned at country, regional and global levels (UNDP)
–Report found interesting and valuable, appropriately nuanced and reasonably reflective of the evaluation reports and findings (WFP)
–Appreciated the meta-synthesis methodology, but noted the need to ensure the source of information is comprehensive and current (ADB)
–Considered a best practice, with calls for its further emulation (UNDP and WFP)
–Constructively discussed at the boards and provided impetus in areas of needed improvement (anecdotes from board participants)

Complementarity with MOPAN
A study of complementarity with MOPAN was completed during the pilot-test phase (2010). Findings included:
–The two approaches focus on different aspects of multilateral effectiveness and rely on different information sources
–Results are complementary rather than duplicative
–Together, they can provide a more complete picture of a multilateral organization's overall performance
However, does this still hold true with the new "results component" of the MOPAN assessment?
–Both undertake document review
–Some similarities in criteria (relevance and achievement of outcomes)
–Differences in methodological approach

Criteria
MOPAN results component:
–Extent of progress towards organization-wide outcomes
–Extent of contribution to country-level goals, priorities and MDGs
–Relevance of objectives and programme of work to stakeholders
EVALNET development effectiveness review:
–Are objectives and expected results at the national and local level in developing countries achieved?
–Are interventions relevant to the needs of target groups and members?
–Are benefits and results achieved sustainable?
–Is programming delivered in a cost-efficient manner?
–Does programming support crosscutting themes (gender equality and environmental sustainability)?
–Is evaluation and monitoring used to improve development effectiveness?

Approach
MOPAN:
–Data sources: strategic plans, performance reports, mid-term reviews, evaluations, reviews; country strategies, work plans, reports; MDG reports, national development strategies and plans; key informants from HQ, donor countries and direct partners
–Methods: document review of all available documentation; survey of a sample of key informants
EVALNET development effectiveness review:
–Data sources: evaluation reports; other documents, including annual reports (e.g., development effectiveness reports, strategic plans), COMPAS entries, DAC/UNEG peer reviews
–Methods: preliminary review; meta-evaluation to establish the quality and coverage of evaluation performance information; meta-synthesis of a sample of evaluation reports to assess development effectiveness and identify conclusions and recommendations

Possible Next Steps
–Independent evaluation of MOPAN now in progress
–Comparison of the MOPAN "results component" and the EVALNET development effectiveness review in the context of the 2013 IFAD assessment:
 –Type/level of information obtained from both approaches
 –Level of effort and resources required
 –Ability to address the information gap on development effectiveness
–Explore how EVALNET can collaborate with MOPAN:
 –Ensure complementarity
 –Possibly integrate the best components of each methodology