Comparative Study of MOPAN and EvalNet Approaches to Assessing Multilateral Organizations’ Development Effectiveness
James Melanson, Director, Development Evaluation Division, Foreign Affairs, Trade and Development Canada
DAC Network on Development Evaluation, February 2014

Introduction

- The DAC EvalNet-endorsed approach to reviewing the development effectiveness of multilateral organizations (MOs) was originally designed to complement MOPAN’s organizational assessment.
- The recent MOPAN Evaluation found duplication, in particular between MOPAN’s more recently developed “results component” and the EvalNet Approach.
- The Evaluation recommended that the two be merged.
- Members of EvalNet and MOPAN felt that further analysis of the objectives, methods, content and costs of the two approaches was needed for an informed decision.
- Canada offered to lead a comparative study.

Study Framework

Objectives
- Provide a comparative assessment of both approaches, with a primary focus on the development effectiveness (results) components.
- Seek to identify insights to resolve the duplication.
- Support potential decision-making by donor countries on the future of the two approaches.

Key questions
- What are the stated goals of the approaches? What needs are they trying to address?
- What are the differences between the approaches? Where do they overlap?
- How well are the approaches meeting their stated goals?
- If the two approaches were to be integrated, how might that be done?

Findings: What are the stated goals of the approaches? What needs are they trying to address?

Both approaches:
- Were developed to address the information needs of donors regarding the effectiveness of MOs. The emphasis of each approach corresponds to its initial focus: organizational effectiveness for MOPAN and development effectiveness for EvalNet.
- Seek to produce credible information to meet domestic accountability requirements and to encourage improvement by multilateral organizations.

Findings: What are the differences? Where do they overlap?

Selection of multilateral organizations
- Both select the multilateral organizations to be assessed based on the same criteria.

Methodology
- MOPAN uses surveys, document review and interviews.
- EvalNet relies primarily on document review.

Sampling
- MOPAN performs sampling at several different levels (country programs, survey recipients, interviewees, and documents).
- EvalNet selects a representative sample of the MO’s own evaluation reports (see the illustrative sketch below).
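The sketch below is an editor’s illustration, not part of the study and not the official EvalNet sampling procedure: it shows, in Python, one way a representative sample of an MO’s own evaluation reports could be drawn with roughly proportional stratification. The report fields ("id", "region") and the stratification key are hypothetical.

    # Minimal illustrative sketch (hypothetical data structure, not the official
    # EvalNet sampling procedure): draw a representative sample of an MO's own
    # evaluation reports, allocating the sample roughly in proportion to strata.
    import random
    from collections import defaultdict

    def stratified_sample(reports, strata_key, sample_size, seed=42):
        """Sample reports roughly in proportion to each stratum's share."""
        random.seed(seed)
        strata = defaultdict(list)
        for report in reports:
            strata[report[strata_key]].append(report)

        sample = []
        for items in strata.values():
            # Proportional allocation with at least one report per stratum,
            # so the final sample can slightly exceed sample_size.
            share = max(1, round(sample_size * len(items) / len(reports)))
            sample.extend(random.sample(items, min(share, len(items))))
        return sample

    # Hypothetical catalogue of an MO's evaluation reports.
    reports = [
        {"id": "EV-01", "region": "Africa"},
        {"id": "EV-02", "region": "Africa"},
        {"id": "EV-03", "region": "Asia"},
        {"id": "EV-04", "region": "Americas"},
    ]
    print(stratified_sample(reports, "region", sample_size=3))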

Findings (cont’d): What are the differences? Where do they overlap?

Data analysis
- MOPAN uses statistical analysis for surveys and content analysis for document review; various rating scales are used.
- EvalNet uses content analysis of evaluation reports and a four-point rating scale (illustrated in the sketch below).

Engagement
- Under MOPAN, the MO is engaged throughout the assessment.
- Under EvalNet, the MO is engaged at the beginning and end of the assessment.

Publication
- All MOPAN and EvalNet reports are published online.
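A minimal sketch of the kind of synthesis a four-point rating scale implies, assuming a hypothetical flat list of content-analysis scores rather than either network’s actual tooling: ratings are averaged by criterion across the sampled evaluation reports.

    # Minimal sketch (assumed data structure, not MOPAN's or EvalNet's actual
    # tooling): synthesize four-point content-analysis ratings by criterion.
    from statistics import mean

    # Hypothetical ratings: 1 = highly unsatisfactory ... 4 = highly satisfactory.
    ratings = [
        {"report": "EV-01", "criterion": "effectiveness", "score": 3},
        {"report": "EV-01", "criterion": "sustainability", "score": 2},
        {"report": "EV-02", "criterion": "effectiveness", "score": 4},
        {"report": "EV-02", "criterion": "sustainability", "score": 3},
    ]

    def synthesize(ratings):
        """Average the four-point scores for each criterion across reports."""
        by_criterion = {}
        for r in ratings:
            by_criterion.setdefault(r["criterion"], []).append(r["score"])
        return {criterion: round(mean(scores), 2)
                for criterion, scores in by_criterion.items()}

    print(synthesize(ratings))  # -> {'effectiveness': 3.5, 'sustainability': 2.5}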

Findings (cont’d): What are the differences? Where do they overlap?

Similar effectiveness indicators
- Strong overlap between the MOPAN “results component” indicators and the EvalNet indicators.

Distinct MOPAN organisational performance indicators
- MOPAN indicators relate to corporate strategy and mandate, harmonizing procedures, and availability of documents.

Distinct EvalNet relevance and sustainability indicators
- EvalNet indicators relate to positive benefits for target groups (achievement of objectives) and sustainability.

Findings (cont’d): What are the differences? Where do they overlap?

Partial match between MOPAN and EvalNet indicators
- Corporate focus on results, resource allocation decisions, contributing to policy dialogue, and disseminating lessons learned.
- Crosscutting themes of environmental sustainability and gender equality; efficiency.

Costs and level of effort
- MOPAN Approach: C$350,000 and 320 days for the entire process.
- EvalNet Approach: C$125,000 and 100 days.
- The burden on the MO is higher with the MOPAN Approach.

Findings: How well are the approaches meeting their stated goals?

MOPAN Approach
- Strong approach to assessing organizational effectiveness.
- Uses multiple lines of evidence.
- Good engagement with MOs.
- More expensive and takes longer.
- High level of effort on the part of MOs and other stakeholders.

EvalNet Approach
- Successful in producing independent, credible, evidence-based information.
- Provides a picture of MOs’ development effectiveness.
- Indicators aligned with the DAC Evaluation Standards.
- Requires fewer financial resources and less time to complete.
- Minimal level of effort on the part of MOs.

Findings: If the two were to be integrated, how might that be done?

- The EvalNet Approach could substitute for MOPAN’s “results component”:
  - Achievement of results
  - Relevance
  - Sustainability
  - Efficiency
- The EvalNet Approach’s performance management indicators could be integrated into either the MOPAN Approach’s knowledge management or operational management sections.
- The EvalNet Approach’s crosscutting themes indicators could be integrated into the MOPAN Approach’s strategic management section.

Conclusions

- The EvalNet approach of using the MO’s own evaluation evidence, pre-screening it for quality and coverage, and synthesizing it against the DAC evaluation criteria (relevance, effectiveness, efficiency, and sustainability) would be a viable equivalent to the current MOPAN “results component”.
- Assessment of an MO’s achievement of its own stated strategic objectives, which is required by the current MOPAN Approach, is not addressed by the current EvalNet Approach.
- Evidence on gender equality and environmental sustainability gathered under the current EvalNet approach would strengthen the organizational effectiveness assessment under MOPAN.

Update and MOPAN 3.0

- The study was presented at the December 2013 MOPAN meeting.
- MOPAN members agreed that MOPAN should draw lessons from the EvalNet approach and consider elements that could be replicated or used for MOPAN 3.0.
- A MOPAN 3.0 Task Team has been created to develop and cost options for the assessment of multilateral organisations from 2015 onwards; it comprises representatives of the MOPAN Technical Working Group, the MOPAN Strategic Working Group, the Secretariat, and consultants.
- Costed options will be presented to the MOPAN Steering Committee for consideration in March 2014.