Evaluation methods and tools (Focus on delivery mechanism) Jela Tvrdonova, 2014.

Content
- The evaluation process
- Setting up the evaluation system
- Direct and indirect programme effects and their separation
- Evaluation design
- Evaluation methods
- Securing data
- Answering evaluation questions
- Summing up: key issues to be addressed

The evaluation process
- Ongoing evaluation
- Periodical evaluations: ex-ante, mid-term, ex-post

Setting up the evaluation system
- Administrative tasks and institutional set-up (steering group, monitoring committee, evaluation managers, etc.)
- Terms of reference (for the independent evaluator)
- Preparation of the evaluation

Setting up the evaluation system
Phases of the evaluation (evaluation tasks):
- Structuring (overlaps with preparation)
- Observing
- Analysing
- Judging

Structuring
- Review the intervention logic of the different measures to be evaluated
- Review other topics to be evaluated (e.g. the delivery mechanism)
- Set up the evaluation framework and design

Review intervention logic
- Review objectives, inputs, measures, expected outputs, results and impacts
- Define key terms
- Assess:
  ◦ Relevance
  ◦ Coherence
  ◦ Effectiveness
  ◦ Efficiency (see the numeric sketch below)
  ◦ Intended and unintended factors
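To make effectiveness and efficiency concrete as assessment ratios, a minimal sketch follows; all figures and the example measure are invented for illustration, not CMEF values:

```python
# Minimal sketch of two common assessment ratios, using invented figures.
# Effectiveness compares what was achieved against the target level;
# efficiency compares what was achieved against the resources used.

def effectiveness(achieved: float, target: float) -> float:
    """Share of the target level actually achieved (1.0 = target met)."""
    return achieved / target

def efficiency(achieved: float, inputs: float) -> float:
    """Output or result obtained per unit of input (e.g. per EUR spent)."""
    return achieved / inputs

# Hypothetical measure: 380 farm modernisation projects supported
# against a target of 500, with EUR 19 million of public expenditure.
print(f"Effectiveness: {effectiveness(380, 500):.0%}")                      # 76%
print(f"Efficiency: {efficiency(380, 19_000_000) * 1e6:.1f} projects per EUR million")
```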

[Diagram: RDP intervention logic (source: EENRD 2014). The context description, SWOT and needs assessment inform the hierarchy of objectives: overall objectives (EU/MS, programme level), specific objectives (EU/MS, axis level) and operational objectives (EU/MS, measure level). Measures, projects and their management and implementation turn inputs into outputs, results and impacts. Relevance links objectives to needs; effectiveness links outputs, results and impacts to objectives; efficiency links them to inputs; coherence and complementarity link the programme to EU policy objectives.]

Review other topics
- Identify the evaluation need
- Define key terms
- Establish benchmarks if possible

Set up the evaluation framework
- Define programme-specific evaluation questions, judgment criteria and indicators
- Link the intervention logic with evaluation questions and indicators (see the sketch below)
- Keep in mind intended and unintended factors of the intervention logic
- Identify direct and indirect programme effects
- Consider contextual factors
- Choose the evaluation design and methods to answer the evaluation questions
- Screen data and information sources and ensure their availability
- Decide on the collection of additional data and information to fill data gaps
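One way to make the link between intervention logic, evaluation questions and indicators explicit is to record each question with its judgment criteria, indicators and data sources in a simple structure. A minimal sketch, assuming an invented question, criterion and indicators (not actual CMEF items):

```python
# Minimal sketch: one record per evaluation question, linking it to
# judgment criteria, indicators and data sources. The question,
# criterion and indicators below are invented examples.
evaluation_framework = [
    {
        "question": "To what extent has the measure improved farm competitiveness?",
        "judgment_criteria": [
            "Supported farms show higher labour productivity than before support",
        ],
        "indicators": [
            {"name": "gross value added per AWU", "type": "result",
             "source": "FADN / beneficiary survey"},
            {"name": "number of supported farms", "type": "output",
             "source": "monitoring database"},
        ],
    },
]

# A quick completeness check: every question needs at least one
# judgment criterion and one indicator before data collection starts.
for eq in evaluation_framework:
    assert eq["judgment_criteria"] and eq["indicators"], eq["question"]
```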

Observing
- Create the tools needed for the quantitative and qualitative analysis: interview guides, questionnaires, queries for extractions from databases, requests for maps, guidelines for case studies, focus groups and any other data collection instrument the contractor deems appropriate
- Collect the data and qualitative information needed to answer each evaluation question: databases, studies, people to be interviewed, appropriate case study areas, etc.
- Describe the process of programme implementation, the composition of the programme, priorities, target levels and budget

Analysing
Analyse all available information to assess the effects and impacts of measures, focus areas and the programme in relation to the programme's objectives and target levels. To assess progress made, the link to the baselines provided in the context of the ex-ante evaluation has to be established. Impacts are identified as net contributions to the achievement of the programme's objectives. In this respect, evaluators have to:
- Establish appropriate typologies of measures and/or beneficiaries to reduce the complexity of the empirical analysis
- Process and synthesise available data and information and, where necessary, handle data gaps by modelling or other extrapolations
- Apply a measurement against the counterfactual as well as against target levels (see the sketch below)
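To illustrate "measurement against the counterfactual", a minimal difference-in-differences sketch follows, with invented before/after averages for supported and non-supported holdings; a real evaluation would use matched samples and proper statistical inference:

```python
# Minimal difference-in-differences sketch with invented figures.
# The counterfactual trend is taken from a comparison group of
# non-supported holdings; the net effect is the extra change
# observed among beneficiaries beyond that trend.

# Average farm income (EUR 1000) before and after the programme.
beneficiaries = {"before": 42.0, "after": 51.0}
comparison    = {"before": 40.0, "after": 44.0}

gross_change         = beneficiaries["after"] - beneficiaries["before"]  # +9.0
counterfactual_trend = comparison["after"] - comparison["before"]        # +4.0
net_effect           = gross_change - counterfactual_trend               # +5.0

print(f"Gross change among beneficiaries: {gross_change:+.1f}")
print(f"Counterfactual trend:             {counterfactual_trend:+.1f}")
print(f"Estimated net programme effect:   {net_effect:+.1f}")

# Measurement against the target level set in the programme:
target = 6.0
print(f"Target achievement: {net_effect / target:.0%}")  # 83%
```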

Judging
- Answer all evaluation questions (common and programme-specific)
- Assess the impact, effectiveness and efficiency of the programme
- Assess the balance of measures within the programme
- Judge the degree to which the programme contributes to achieving the objectives set out in the national and Community strategy
- Identify the factors which contributed to the success or failure of the programme
- Draft conclusions and recommendations based on the findings
- Identify possible adjustments necessary to improve rural policy interventions

Evaluation phases and key activities
- Structuring: setting the intervention logic per measure, focus area and programme; setting up the evaluation framework and design
- Observing: development of tools; collecting data (primary, secondary, monitoring)
- Analysing: analysing using various methods (naive, advanced, qualitative, quantitative)
- Judging: developing judgments, answering evaluation questions

Evaluation methods – qualitative
Qualitative approaches are useful during three stages of an impact evaluation:
- When designing the evaluation: focus groups and interviews with key informants help develop hypotheses
- In the intermediate stage, before the quantitative impact evaluation: quick insights into what is happening in the programme
- In the analysis stage: qualitative methods provide context and explanations for the quantitative results (triangulation)
The applicability of qualitative methodologies to constructing valid counterfactuals is considered rather limited, though possible.

Qualitative methods
- Interviews
- Focus groups
- Surveys
- Case studies
- Field observations
- Literature reviews
- Other qualitative approaches

Criteria for the selection of evaluation methods
- Credibility
- Rigour
- Reliability
- Robustness
- Validity
- Transparency
- Practicability
Also:
- Ability to explain causality
- Ability to eliminate a possible selection bias (see the sketch below)
- Ability to isolate the effect of the programme from other factors
- Taking potential indirect effects into account
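As an illustration of one technique that addresses selection bias, a minimal propensity score matching sketch on synthetic data follows; a real evaluation would also check covariate balance and common support:

```python
# Minimal propensity-score-matching sketch on synthetic data.
# Selection bias arises because supported farms differ systematically
# from non-supported ones; matching compares each beneficiary with
# the most similar non-beneficiary instead of the raw group average.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
size = rng.normal(50, 15, n)                                    # covariate: farm size (ha)
treated = rng.random(n) < 1 / (1 + np.exp(-(size - 50) / 10))   # larger farms apply more often
outcome = 20 + 0.3 * size + 5 * treated + rng.normal(0, 3, n)   # true programme effect = 5

# 1. Estimate propensity scores: probability of being supported given farm size.
X = size.reshape(-1, 1)
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# 2. Match each beneficiary to the non-beneficiary with the closest score.
t_idx, c_idx = np.where(treated)[0], np.where(~treated)[0]
matches = c_idx[np.abs(ps[c_idx][None, :] - ps[t_idx][:, None]).argmin(axis=1)]

# 3. The average outcome difference over matched pairs estimates the net effect.
naive = outcome[t_idx].mean() - outcome[c_idx].mean()    # biased upward by selection
matched = (outcome[t_idx] - outcome[matches]).mean()     # close to the true effect of 5
print(f"Naive difference: {naive:.1f}")
print(f"Matched estimate: {matched:.1f}")
```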

Answering evaluation questions
- Evidence-based answers
- Related to the contextual environment (netting out)
- Sound methodology and data
- Drafting conclusions and recommendations