The ALNAP Meta-evaluation, Tony Beck. Presentation for the IDEAS Conference, Delhi, 14th April 2005.

Outline
1) Background
2) The ALNAP Quality Proforma
3) Agency visits
4) Findings from the agency visits
5) Findings from the Quality Proforma

What is ALNAP and its meta-evaluation?
- An overview of the quality of evaluation of humanitarian action
- Identification of strengths and weaknesses
- Recommendations for improvement, across the sector and in individual agencies

Process
- Review of evaluation reports against a set of standards
- Visits to and interaction with agency evaluation offices
- Focus: accountability; and good practice, dialogue and interaction

The ALNAP Quality Proforma
- ALNAP's meta-evaluation tool
- Draws on good practice in EHA (evaluation of humanitarian action) and in evaluation in general
- Revised and peer reviewed in 2004

The ALNAP Quality Proforma
Made up of seven sections:
1. Terms of reference
2. Methods, practice and constraints
3. Contextual analysis
4. Analysis of the intervention
5. Assessing the report
6. Overall comments

The ALNAP Quality Proforma
Four-point rating scale: A = good, B = satisfactory, C = unsatisfactory, D = poor
Guidance notes for meta-evaluators, e.g.:
Question: Consideration given to confidentiality and dignity?
Guidance: The evaluation report should detail how the overall approach and methods will protect confidentiality and promote respect for stakeholders' dignity and self-worth.

The ALNAP Quality Proforma
Coverage: 197 evaluations
Process:
- 2 meta-evaluators
- Reconciliation of ratings
- Analysis by section
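The rating and aggregation steps above can be illustrated with a short sketch. This is a hypothetical illustration, not ALNAP's actual tooling: the reconcile-by-agreement rule, the area names and the sample ratings are all assumptions, used only to show how per-section A–D ratings from two meta-evaluators could be rolled up into the "% attaining Good or Satisfactory" figures reported in the slides that follow.

```python
# Illustrative sketch only -- assumed data structures, not ALNAP's own system.
from collections import defaultdict

# A = good, B = satisfactory; C = unsatisfactory and D = poor fall outside this set.
GOOD_OR_SATISFACTORY = {"A", "B"}

def reconcile(rating_1: str, rating_2: str) -> str:
    """Return the agreed rating for one area of enquiry.

    In the meta-evaluation the two meta-evaluators reconciled differences by
    discussion; this sketch simply refuses to average a disagreement away.
    """
    if rating_1 == rating_2:
        return rating_1
    raise ValueError(f"Ratings disagree ({rating_1} vs {rating_2}): reconcile by discussion")

def percent_good_or_satisfactory(ratings_by_area: dict[str, list[str]]) -> dict[str, float]:
    """Percentage of reports rated Good or Satisfactory, per area of enquiry."""
    return {
        area: round(100 * sum(r in GOOD_OR_SATISFACTORY for r in ratings) / len(ratings), 1)
        for area, ratings in ratings_by_area.items()
    }

# Hypothetical ratings for three reports across two areas of enquiry.
ratings_by_area = defaultdict(list)
ratings_by_area["TOR - approach and method"] += [reconcile("A", "A"), "B", "C"]
ratings_by_area["Consultation with primary stakeholders"] += ["C", "D", "B"]

print(percent_good_or_satisfactory(ratings_by_area))
# {'TOR - approach and method': 66.7, 'Consultation with primary stakeholders': 33.3}
```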

Mainstreaming of the Quality Proforma
- By ECHO, to revise its TORs (lesson learning, protection, identification of users, prioritisation, time frame and users of recommendations, etc.)
- DEC Southern Africa evaluation (rated seven agency reports)
- Groupe URD (for planning of evaluations)

Agencies included in dialogue: CAFOD, Danida, ECHO, ICRC, OCHA, OFDA, Oxfam, SC-UK, SIDA, UNHCR, and WHO

Purpose of agency dialogue
- Agency response to the initial two years of use of the Quality Proforma
- To discuss Quality Proforma ratings and agency strengths and weaknesses
- To discuss processes leading to good evaluation practice
- To discuss good practice

Findings from dialogue with evaluation managers
Areas affecting evaluation quality are not currently captured by the QP, e.g.:
- Evaluation quality depends on subtle negotiations within agencies
- Evaluation funds are in most cases not being allocated for follow-up
- Follow-up to recommendations is complex
- More agencies are using tracking matrices

Findings from dialogue with evaluation managers: the EHA market
- The main constraint to improved evaluation quality is agencies' ability to access evaluators with the appropriate skills
- Does the EHA market need further regulation?

Findings from the Proforma
Area of enquiry: TOR – Good practice in approach and method
[Table: % of reports attaining each rating (Good or Satisfactory; Unsatisfactory or Poor), 2004 vs earlier years]

Findings from the Proforma
Area of enquiry: TOR – Intended users and uses
[Table: % of reports attaining each rating (Good or Satisfactory; Unsatisfactory or Poor), 2004 vs earlier years]

Findings from the Proforma
Area of enquiry: Consultation with primary stakeholders
[Table: % of reports attaining each rating (Good or Satisfactory; Unsatisfactory or Poor), 2004 vs earlier years]

Findings from the Proforma
Area of enquiry: Use of the DAC criteria
[Table: % of reports attaining each rating (Good or Satisfactory; Unsatisfactory or Poor), 2004 vs earlier years]

Findings from the Proforma
Area of enquiry: HR and management
[Table: % of reports attaining each rating (Good or Satisfactory; Unsatisfactory or Poor), 2004 vs earlier years]

Findings from the Proforma
Area of enquiry: Coordination
[Table: % of reports attaining each rating (Good or Satisfactory; Unsatisfactory or Poor), 2004 vs earlier years]

Findings from the Proforma
Area of enquiry: Quality of evaluation of protection issues
[Table: % of reports attaining each rating (Good or Satisfactory; Unsatisfactory or Poor), 2004 vs earlier years]

Findings from the Proforma
- Improvement of between 10 and 30 per cent in most of the areas noted above
- Too early to disaggregate, or to suggest why this improvement has taken place
- Still a number of areas of generic weakness

Conclusions: process
- Meta-evaluations need to include interaction with those being meta-evaluated
- Agency visits have been important in discussing constraints to improved evaluation quality
- Meta-evaluations need to maintain an appropriate balance between their accountability function and the need to improve evaluation quality through lesson learning

Conclusions: findings
- EHA demonstrates some areas of strength, and improvement over four years, e.g. use of most of the DAC criteria and analysis of HR
- Many evaluative areas need to be strengthened, e.g. gender, identification of use and users, participation of primary stakeholders, and transparency of the methodologies used