
1 Update on the Multilateral Effectiveness Initiative
James Melanson, Director of Evaluation, CIDA
DAC Network on Development Evaluation
June 2013

2 The Challenge
Gap in information on development effectiveness
Great need for performance information on multilateral organizations (MOs)
–Current climate of evidence-based decision making regarding resource allocation
Variable coverage, quality and reliability of reporting among MOs does not provide a clear picture of their performance
Independent joint evaluations of MOs are infrequent, lengthy and costly
Existing efforts focus primarily on organizational assessments of MOs, which do not directly address development effectiveness

3 The Response: New Methodology
2009 – DAC EVALNET Task Team established to develop a new methodology that:
–Generates a body of credible information on a common set of criteria, providing a picture of the development effectiveness of MOs
–Builds on evidence (evaluation reports) that is already available
–Uses methods that are modest in time and cost requirements, with limited burden on MOs
2010 – Methodology developed and pilot tested (ADB and WHO)
2011 – DAC EVALNET endorses it as an acceptable methodology

4 Assessing Development Effectiveness: Common Criteria
The methodology focuses on a description of development effectiveness that maps onto the DAC evaluation criteria:
–Achievement of development objectives and expected results
–Crosscutting themes (environmental sustainability and gender equality)
–Sustainability of results/benefits
–Relevance of interventions
–Efficiency
–Use of evaluation and monitoring to improve effectiveness

5 Scenarios & Options 5 Scenario A MO reporting on DE is adequate Scenario B MO reporting on DE is not adequate but evaluation function is Scenario C MO effectiveness reporting and available evaluations inadequate for reporting on DE Option 1 Rely on MO reporting systems Option 2 Conduct a systematic synthesis of information from available evaluations Option 3 Implement actions aimed at strengthening MO evaluation system and DE reporting Apply the meta-synthesis of evaluation results methodology Preliminary Review Establish Universe, Screen Reports

6 Methodology: Preliminary Review
Establish the universe of evaluation reports prepared by the MO over a three- to four-year time frame
Select a sample that provides reasonable coverage of MO programming (geographic, thematic, objectives, sector, technical focus)
Screen reports from the sample for quality using accepted DAC and UNEG quality standards
Decide on Scenario A, B or C depending on screening results
–To validate that an MO falls under Scenario A, it may still be assessed under Scenario B

7 Methodology: Meta-Synthesis (Scenario B)
Review, analyze and classify evaluation findings for each criterion (operational guidelines)
Identify contextual factors contributing to or inhibiting effectiveness for each criterion
Prepare a report that summarizes findings and context, and establishes conclusions and recommendations

8 Experience to Date
CIDA led, jointly with the Netherlands, the WFP and UNDP reviews
–Successful use of the methodology: WFP and UNDP fall under Scenario B (reporting on DE not adequate but the evaluation function is)
–Reviews provided a good understanding of the organizations' development effectiveness
–Constructive conversations took place at the boards
CIDA prepared reports based on the pilot tests of the ADB and WHO reviews
–Successful use of the methodology
–Reviews provided a good understanding of the organizations' development effectiveness; however, given the low number of evaluations available from WHO, no generalization at the organization level was possible

9 Experience to Date
CIDA led the AfDB review
–Report in the process of finalization; it will be published on the CIDA and EVALNET websites
The Netherlands led the UNICEF review
–Report presented to the board in May 2013

10 The Findings – Scenario B
The organization is effective in achieving most of its objectives and expected results
Programs are highly relevant to the needs of target groups and developing country governments
Sustainability and efficiency represent areas for improvement
Challenges exist with gender equality and environmental sustainability
Good use of evaluation, but inadequate performance frameworks and weak monitoring

11 The Findings – Scenario B
The organization is effective in achieving most of its objectives and expected results, and in supporting gender equality and environmental sustainability
Programs are highly relevant to the contexts in which they work
Improving the sustainability of benefits remains a challenge
Efficiency is an area for improvement
The organization faces issues in strengthening decentralized systems for evaluation, monitoring and results-based management
The Evaluation Office produces high-quality evaluations

12 The Findings – Scenario B
Insufficient evidence was available to draw generalizable conclusions
The limited number of evaluation reports provides some insight into the effectiveness of the programs covered
–Programs appear to be relevant to stakeholder needs and national priorities, and effective in achieving most of their development objectives and expected results
–Programs appear to be sustainable, but there are challenges in sustaining the capacity of partners
–Evaluations have not regularly addressed effectiveness in supporting gender equality or environmental sustainability
–Systems for evaluation and monitoring appear to be unsatisfactory

13 The Findings – Scenario A
Most programs achieve their objectives and expected results
Programs are relevant to stakeholder needs and national priorities
Improving the sustainability of benefits remains a challenge
Efficiency represents an area for improvement
Programs contribute to gender equality and environmental sustainability, but improvements are needed in the latter
Evaluation is effective and well used, but challenges are highlighted in monitoring and results-based management

14 The Findings – Scenario B
Programs are largely effective and highly relevant to the needs of target groups
A stronger focus on gender equality and environmental sustainability is needed
Continuity and sustainability of program benefits remain a challenge
Efficiency appears to be a challenge
Evaluation is used effectively, with increased preparation of management responses, but challenges remain with the integration and dissemination of evidence
Challenges remain with respect to monitoring and results-based management

15 MO Engagement
Donors engage with the multilateral organization throughout the review process:
–Launch of the review
–Establishment of the evaluation universe
–Confirmation of the evaluation sample
–Preliminary findings
–Draft report
–Final report
A donor-neutral version of each review is prepared and published on the OECD EVALNET website, for use by any partner donor

16 Utility of this approach?
For CIDA
–Allowed demonstration of accountability for results in multilateral investments
–Improved the ability to substantiate positions in engagement at the boards
For multilateral organizations
–Optimizes resources and reduces the transaction costs of reviews by using published, publicly available evaluation reports that capture evidence-based progress and lessons learned at country, regional and global levels (UNDP)
–Report found interesting and valuable, appropriately nuanced and reasonably reflective of the evaluation reports and findings (WFP)
–Appreciation for the meta-synthesis methodology, with a need to ensure the source information is comprehensive and current (ADB)
–Seen as a best practice, with calls for its further emulation (UNDP and WFP)
–Constructively discussed at the boards, providing impetus in areas needing improvement (anecdotes from board participants)

17 Complementarity with MOPAN
A study of complementarity with MOPAN was completed during the pilot test phase (2010). Findings included:
–The two approaches focus on different aspects of multilateral effectiveness and rely on different information sources
–Results are complementary rather than duplicative
–Together, they can provide a more complete picture of a multilateral organization's overall performance
However, does this still hold true with the new "results component" of the MOPAN assessment?
–Both undertake document review
–Some similarities in criteria (relevance and achievement of outcomes)
–Differences in methodological approach

18 Criteria
MOPAN results component
–Extent of progress towards organization-wide outcomes
–Extent of contribution to country-level goals, priorities and MDGs
–Relevance of objectives and programme of work to stakeholders
EVALNET development effectiveness review
–Are objectives and expected results at the national and local level in developing countries achieved?
–Are interventions relevant to the needs of target groups and the MO's members?
–Are benefits and results achieved sustainable?
–Is programming delivered in a cost-efficient manner?
–Does programming support crosscutting themes (gender equality and environmental sustainability)?
–Is evaluation and monitoring used to improve development effectiveness?

19 Approach
MOPAN
–Data sources: strategic plans, performance reports, mid-term reviews, evaluations, reviews; country strategies, work plans, reports; MDG reports, national development strategies and plans; key informants from HQ, donor countries and direct partners
–Methods: document review of all available documentation; survey of a sample of key informants
EVALNET development effectiveness review
–Data sources: evaluation reports; other documents, including annual reports (e.g., development effectiveness reports, strategic plans), COMPAS entries, DAC/UNEG peer reviews
–Methods: preliminary review; meta-evaluation to establish the quality and coverage of evaluation performance information; meta-synthesis of a sample of evaluation reports to assess development effectiveness and identify conclusions and recommendations

20 Possible Next Steps
Independent evaluation of MOPAN now in progress
Comparison of the MOPAN "results component" and the EVALNET development effectiveness review in the context of the 2013 IFAD assessment:
–Type/level of information obtained from each approach
–Level of effort and resources required
–Ability to address the information gap on development effectiveness
Explore how EVALNET can collaborate with MOPAN
–Ensure complementarity
–Possibly integrate the best components of each methodology

