Assessing Humanitarian Performance: Where are we now? 24th Biannual Meeting, Berlin, 3rd December 2008



24th ALNAP Biannual Meeting, December 2008

Various strands of ALNAP work are working towards assessing system-wide performance:
 Various components of the RHA: evaluation synthesis, meta-evaluation (especially of joint evaluations) and themed chapters
 Facilitation of the TEC and discussions on recommendations
 HPP: data mapping and exploratory analysis of how to assess system-wide performance

What have we learned from the RHA?
 Evaluation synthesis useful but, on its own, not able to assess performance
 Component parts of the RHA are good, but the final product is probably less than the sum of its parts
 Need to strengthen the methodology and produce a more coherent whole

What have we learned from the TEC?
 joint evaluations better than single-agency evaluations in providing a system-wide snapshot
 system-wide joint evaluations provide a one-off picture only
 utilisation and take-up of recommendations very difficult to achieve in practice

What have we learned from the HPP?
 Lots of data collected, but of different types, from different sources, with different uses
 Many methodological and conceptual difficulties
 The majority of data is gathered in the needs assessment phase
 Very little effort given to seeking the views of affected populations / recipients of aid

What have we learned from the Madrid biannual?
a) Be realistic as to what can be achieved now; use existing evidence to assess performance, with a special emphasis on ‘impact’
b) Explore the use of beneficiary surveys in assessment of impact and performance
c) Develop a ‘pilot’ to test these ideas
d) Continue mapping and do not lose sight of developing a more precise way of assessing performance

What are we going to do? The three-track approach:
 Track one (fast track): ‘State of the System’ pilot
 Track two (medium track): learn more about the use of beneficiary surveys and impact assessment, and feed this into future ‘State of the System’ reports
 Track three (slow track): continue mapping and work on developing key performance indicators

Track One: ‘State of the System’ report. What is it for?
 The goal is to assess overall humanitarian performance against agreed criteria
 The pilot will provide a ‘baseline’ against which to track future performance

What problems will we face?
 Analysing a system that is not strictly a system (i.e., not systematic)
 Lack of data relating to outcomes and indicators

How will we address the problems?
a) Break down the system into different units of analysis. Disaggregate the data by looking at:
 state of response in individual crises
 state of response in particular sectors (clusters)
 state of response in particular categories: natural disasters, wars, high-profile crises, neglected crises
 state of response in relation to types of actors: UN, NGOs, donors, governments, etc.

b) For each unit of analysis, performance will be analysed in relation to the OECD-DAC criteria:
 Relevance/Appropriateness
 Connectedness
 Coherence
 Coverage
 Efficiency
 Effectiveness
 Impact

c) Need to identify indicators to apply to the OECD-DAC criteria. For example, was coverage adequate?
 were resources adequate?
 global funding against needs (CAP, beneficiary surveys)
 funding across sectors and emergencies
 staffing coverage in key areas
 and so on..

What do we want the report to tell us?
 emerging themes
 trends: how has sector y, or the response in emergency x, changed over time
 innovations and changes
 performance indicators, if/when they exist
 perceptions of informed stakeholders about effectiveness, impact, etc.

What methods shall we use? Building on the RHA:
 key informant interviews (NGOs, donors, governments), aiming for a mix of HQ and field
 financial data analysis (OECD-DAC and FTS)
 mapping of the global footprint and across current emergencies
 key informant survey: polling opinion about performance to provide a baseline
 evaluation synthesis
 literature review

What now?
 establish a peer review advisory panel
 undertake preliminary interviews/consultations for input on the scope and objectives of the pilot
 design a detailed methodology and research plan in an inception report, to be peer reviewed