The Tsunami Evaluation Coalition: What Worked and What Did Not? European Evaluation Society 2006

Presentation transcript:


Background of the Tsunami Evaluation Coalition
- The TEC is a sector-wide learning and accountability initiative constituted in February 2005.
- It is made up of about 40 UN agencies, donors, NGOs, a not-for-profit organisation and the Red Cross/Red Crescent Movement.
- Participating agencies have worked within a framework that encourages sector-wide information sharing, lesson learning, accountability and transparency.
- The focus is on cross-cutting themes (coordination, needs assessment, local capacities, donor response, LRRD) and on sector-wide performance rather than on individual agency performance.

TEC Timeline
- February 2005: Geneva meeting
- April 2005: first TEC teleconference
- June 2005: ALNAP meeting in The Hague
- July-August 2005: planning phase for all evaluations
- September-November 2005: field visits
- October 2005: Copenhagen meeting on the communication/dissemination strategy
- 8 December 2005: ALNAP meeting/TEC meeting in Brussels; presentation of early findings and the early findings report
- 25 December 2005: publication of the early findings report
- February 2006: team leader validation meeting in London
- February-June 2006: production of the Synthesis Report
- July 2006: launch of the Synthesis Report during ECOSOC
- July 2006 to April 2007: TEC follow-up

Getting started (1)
- This was a voluntary initiative started by a few actors who felt the time was right for a major inter-agency initiative.
- The first meeting in February 2005 did not immediately provide clarity about roles and responsibilities, nor about the actual nature of the various studies.
- Many actors stayed on the fence.
- Much time was initially spent on gaining mutual confidence and building relationships.
- The key initial actors were busy with other things, and the TEC workload was significant for all of them.
- Dedicated time and resources were needed at the beginning of the process: the TEC was jump-started by ALNAP, and the full-time researcher played a pivotal role in keeping the TEC going during the early days.

Getting started (2)
- Key tipping points were the appointment of the researcher, the appointment of the coordinator and the ALNAP Biannual Meeting in The Hague in June 2005.
- The ALNAP meeting in particular brought the necessary buy-in and funding.
- Funding, however, came in slowly and had an adverse impact on the timeliness of the TEC.
- Some agencies had to wait for full funding before the evaluation process took off, which delayed the start-up of TEC missions as they were to be undertaken simultaneously.
- Preparation of terms of reference was not coordinated, leading to duplication: "fishing in the same pond".

Fundraising
- Getting commitments from some major donors brought in others and gave wide buy-in.
- The downside: multiple donors with short time-frames led to short contracts for consultants, shortened field visits and increased administrative costs.
- Raising funds for the core of the TEC and across the studies should have been better coordinated.
- Fundraising for all five studies and the TEC Secretariat was extremely time-consuming and cumbersome; it should have been part of the appeal, or a special trust fund should have been established.
- The results were nevertheless excellent, but can this be replicated?

Implementation Modalities
- The set-up, with a core management group and a broader working group, worked well.
- There was strong commitment by the CMG and sub-groups, with a very harmonious way of working together.
- Three of the five studies had similar set-ups.
- There was a good mix of face-to-face meetings and teleconferences.
- Good use was made of technology: teleconferences, shared documents, mapping and the resource CD.
- The backing of ALNAP, a network with a natural fit to the TEC and an interest in joint evaluations, was critical.
- It was nonetheless a complex arrangement.

Flows: management, coordination and evaluation reports (organisational diagram)
- Core Management Group for the Tsunami Evaluation Coalition and the six thematic evaluations.
- ALNAP Secretariat: hosts the TEC and manages the writing of the Synthesis Report. TEC staff include the Evaluation Advisor and Coordinator (EAC), the Researcher and Deputy Coordinator (RDC), and the TEC Administrator.
- Thematic evaluations: Coordination, led by OCHA; Needs Assessment, led by WHO, SDC and FAO; Impact on Local and National Capacities, led by UNDP with DMI; LRRD, led by Sida; the International Community's Funding Response, led by Danida; Impact Assessment, led by IFRC with the Global Consortium.
- Synthesis Report: written by the synthesis primary author, with contributions from the EAC and the RDC.
- Key Messages Report: written by the EAC.
- TEC online forum (includes the evaluation map).
- Individual agency evaluations (TEC members).
- Longer-term studies (from 2006).

Working through the mandate
- The mandate was assumed by the TEC itself and had no broader clientele, including staff not working in the evaluation units of the respective TEC members.
- There was no real involvement of regional and local actors.
- The five cross-cutting themes were in principle a good idea, but resulted in overlap, uncertainty between the teams, and a confused and overloaded recipient community.
- The TEC did not consider alternative and possibly more cost-effective approaches, for example one team per country.
- It missed out on "impact", although an attempt was made to cover this through an IFRC-planned initiative that took almost a year to materialise.

Some TEC shortcomings
- Not all teams worked well together.
- Some critical expertise was missing.
- Not enough time was spent in the field.
- The studies were weak on hard data.
- There was little information on impact.
- There was a lack of local ownership and buy-in.
- Reports were of varying quality; much work was needed to bring some of them to an acceptable level.
- Country reports were in some cases not very strong; the time needed to do them well was underestimated.
- Too many cooks: team leaders were not fully on board.
- The TEC did not reduce evaluation overload.

Some TEC Achievements
- The first major system-wide humanitarian evaluation since Rwanda.
- The TEC approach can work, and lessons from setting up the TEC will make it easier next time.
- The timing of TEC products was well planned and critical (the initial findings report for 25 December and the Synthesis Report for ECOSOC).
- The TEC is beginning to influence the humanitarian reform debate.
- The Clinton initiative is moving on critical TEC issues in relation to NGOs.
- Much more follow-up lies ahead, but it will need dedicated attention and a sustained effort at various levels.

What should we do differently next time?
- Include a system-wide evaluation mechanism as part of the appeal.
- Get early in-country stakeholder buy-in.
- Establish local support/reference groups.
- Organise regular in-country discussion and follow-up meetings (through a focal-point organisation).
- Promote the early establishment of performance indicators and M&E systems.
- Develop an evaluation framework with agreed performance benchmarks.
- Reduce complexity (funding, multiple teams, etc.).
- Identify good practice, not just what did not work.