Key points: 1. Planning, Monitoring and Evaluation in the Spanish Cooperation; 2. Manual for the Management of Evaluations of the Spanish Cooperation.

Key points
1. Planning, Monitoring and Evaluation in the Spanish Cooperation
2. Manual for the Management of Evaluations of the Spanish Cooperation
3. Value and use of evaluation quality standards
4. Standards in practice: Challenges and Balances

1. Planning, Monitoring and Evaluation in the Spanish Cooperation

1. History:
- Spanish stakeholders (Ministries, Departments, Municipalities…)
- II Master Plan for Spanish Cooperation ( )
- Manual for the Management of Evaluations of the SC
- Evaluation of the II MPSC (05-08) (EES Lisbon, Oct 2008)
- 25 evaluations ( ) of Spanish Cooperation

2. What did we learn?
- Evaluability (planning, design and monitoring)
- Evaluation culture (institutionalisation)

3. Future challenges: III Master Plan for Spanish Cooperation ( )
- Implementing the Paris Declaration and Accra commitments
- Putting standards into practice
- Learning to learn (knowledge management and an evaluation system)

2. Manual for the Management of Evaluations of the Spanish Cooperation

General Objectives:
- Strengthen the evaluation culture of the Spanish Cooperation.
- Strengthen the implementation of evaluations that are high-quality, systematic, participative and planning-oriented.

Specific Objective: Offer the stakeholders of the Spanish Cooperation a tool to manage evaluations that is instructive, flexible, useful and systematic.

Manual for the Management of Evaluations of the Spanish Cooperation: Learning to do things better

3. Value and use of evaluation quality standards

Principles, Criteria and Standards for assessing the quality of SC evaluation reports: from philosophy (principles) to practice (standards). They are based on those used by the European Commission, and include the DAC evaluation standards and the criteria used by the Evaluation Division of DG POLDE.

PRINCIPLES FOR EVALUATION OF DEVELOPMENT ASSISTANCE

DAC (1992):
- Impartiality and independence
- Credibility
- Usefulness
- Participation of donors and recipients
- Donor co-operation
- Evaluation programming, design and implementation
- Dissemination and feedback
- Other reviews reinforcing with guidance (1998)

SPANISH COOPERATION (MANUAL):
- Participation
- Learning and incorporation of lessons learned
- Usefulness
- Transparency

This implies:
- A comprehensive result-centred approach.
- A pluralistic and participatory approach.
- An analytical, learning-centred and conclusive approach.
- A strategy based on the use of results.

CRITERIA FOR EVALUATION OF DEVELOPMENT ASSISTANCE

DAC:
- Relevance
- Effectiveness
- Efficiency
- Impact
- Sustainability

SPANISH COOPERATION (MANUAL):
- DAC criteria
- Other criteria: Consistency, Ownership, Alignment, Harmonisation, Participation, Coverage

STANDARDS FOR EVALUATION OF DEVELOPMENT ASSISTANCE

DAC:
1. Rationale, purpose and objectives
2. Scope
3. Context
4. Methodology
5. Information sources
6. Independence
7. Ethics
8. Quality assurance
9. Relevance of the evaluation results
10. Completeness

SPANISH COOPERATION (MANUAL):
1. Compliance with requirements – responds to the evaluation questions
2. Context analysis
3. Justification for the methodology used
4. Reliability of the data
5. Robustness of the analysis
6. Credibility of findings
7. Validity of conclusions
8. Usefulness of the recommendations
9. Clarity of the report

SPANISH STANDARDS

Std 1: Compliance with requirements – responds to the evaluation questions
This assesses adherence to the terms of reference. In other words, a good report is one which adequately fulfils the requirements laid down in the terms of reference and satisfactorily answers the evaluation questions. The report is expected to provide a good general overview of how the announced objectives were met and to clarify the logic of the intervention.

Std 2: Context analysis
This assesses whether the report focuses on the intervention as a whole, including its temporal, geographical and regulatory dimensions, and whether it analyses the context surrounding the intervention, i.e. the institutional, political, economic and social situation, and both the foreseen and unforeseen interactions with other related policies and their consequences.

Std 3: Justification for the methodology used
This assesses whether the tools and methodology used are clearly explained and whether they have indeed been applied throughout the process. The methodological choices made must meet the requirements laid down in the ToR. The limits inherent to the evaluation method should also be clearly defined, and arguments should be presented as to why certain options were chosen over others.

Std 4: Reliability of the data
This does not judge the intrinsic validity of the available data, but rather the way in which the evaluation team obtained and used that data. The evaluation team is expected to identify the sources of quantitative and qualitative data and to explain and justify the reliability of the data. To this end, it must clearly explain the collection tools used, which, in turn, must be adapted to the information sought.

Std 5: Robustness of the analysis
A robust analysis of the quantitative and qualitative data should be carried out, closely following the recognised steps appropriate to the type of data analysed. The cause-effect relationships between the intervention and its consequences must be clearly explained, and there must be consistency and a logical sequence between evidence and assessment, assessment and conclusions, and conclusions and recommendations. We would recommend that the steps of the analysis be explained and its validity limits specified.

Std 6: Credibility of findings
The findings made in the analysis are expected to be reliable and balanced. The findings should suitably reflect the reality drawn from the data and documented evidence, on the one hand, and the reality of the intervention as perceived by actors and beneficiaries, on the other. The effects of the evaluated intervention should be isolated from external factors and from contextual restrictions.

Std 7: Validity of conclusions
This does not judge the intrinsic value of the conclusions, but rather the way in which they were reached. In accordance with this criterion, the conclusions must be rooted in the analysis, must be supported by facts and analyses easily identifiable in the rest of the report, and must avoid bias or personal feelings. The report should also indicate the limits and context within which the conclusions are valid.

Std 8: Usefulness of the recommendations
Recommendations should be formulated in a clear and concise manner, should derive from the conclusions, and should be based on balanced and unbiased analyses. They should also be detailed enough to be specifically applicable by the different actors responsible for the evaluated intervention.

Std 9: Clarity of the report
A clear report is one that is easy to read and follows a logical structure, with a brief summary that accurately reflects the report. Annexes should be provided for specialised concepts and technical demonstrations, clearly referenced throughout the text. The report should be brief, concise and easily readable, and its structure should be readily recognisable. It should clearly describe the intervention evaluated, its context and the evaluation findings, and the information furnished should be readily understandable.

Quality standards (Spain) / Quality standards (DAC)

Spanish standards:
1: Compliance with requirements – responds to the evaluation questions
2: Context analysis
3: Justification for the methodology used
4: Reliability of the data
5: Robustness of the analysis
6: Credibility of findings
7: Validity of conclusions
8: Usefulness of the recommendations
9: Clarity of the report

Corresponding DAC standards:
- Formulation of evaluation findings (9. Relevance of the evaluation results)
- 10.1 Evaluation questions answered by conclusions (10. Completeness)
- 3. Context
- 4.1 Explanation of the methodology used (4. Evaluation methodology)
- 5. Information sources
- 4.2 Assessment of results (4. Evaluation methodology)
- 10.2 Clarity of analysis (10. Completeness)
- 10.3 Distinction between conclusions, recommendations and lessons learned
- 9.3 Recommendations and lessons learned (9. Relevance of the evaluation results)
- 10.5 Clarity and representativeness of the summary (10. Completeness)

Quality standards (DAC) / Quality standards (Spanish)

1. Rationale, purpose and objectives → ToRs (Phase I)
2. Scope → ToRs (Phase I)
3. Context → Context analysis (Std 2)
4. Methodology → Justification for the methodology used (Std 3)
5. Information sources → Reliability of the data (Std 4)
6. Independence → Not covered directly (principles, criteria / ToR)
7. Ethics → Not covered directly (principles, criteria / ToR)
8. Quality assurance → Not covered directly (principles, criteria / ToR)
9. Relevance of the evaluation results → Usefulness of the recommendations (Std 8 and Std 1)
10. Completeness → Compliance with requirements (Std 1) and validity of analysis, findings and conclusions (Std 5, Std 6 and Std 7)

4. Standards in practice: Challenges and Balances

1. Rationale: institutional motivation for learning / accountability
2. Scope: ambition vs realism (concentration / prioritisation)
3. Context: the specific context that affects the evaluation
4. Methodology: limitations (budget? evaluators' skills?), perception vs evidence basis, impact and efficiency?
5. Information sources: existence of consistent data for evidence, privacy vs traceability
6. Independence: evaluation culture and institutionalisation
7. Ethics: evaluation culture and institutionalisation
8. Quality assurance: experience with flexible vs rigid approaches
9. Relevance of the evaluation results: utility and use!
10. Completeness: focus vs ambition

Available at: cooperación internacional - evaluación

Thank you