Monitoring Afghanistan, 2015 Food Security and Agriculture Working Group – 9 December 2015.

Overview
 Rationale
 Monitoring
 Project Monitoring Cycle

Monitoring Structure
 Performance Management: a commitment to results-oriented, evidence-based performance management under the transformative agenda.
 Process Monitoring: ensures transparency in an evidence-based approach to programming, as well as internal transparency; articulating this can be a valued strength in competitive situations (e.g. at proposal stage).
 Beneficiary Contact Monitoring: ensures accountability to affected populations and enables refinement of project design.
 Project Monitoring Cycle: guides a standardized approach to monitoring. The cycle starts with the development of a logical framework and monitoring plan.
 Temptation: focusing on project monitoring instead of on beneficiaries.

Project Monitoring Cycle

Step 1 – Prepare/revise performance monitoring narratives and logical frameworks
 In Step 1, the design of the project is summarised in the logframe, which aligns the project with one or more of the strategic objectives in WFP’s Strategic Plan. Baselines and performance targets are set for each indicator in consultation with the host government and other relevant local partners. The logframe is part of the project document and is accompanied by a narrative section giving an overview of how the project will be monitored. Developing the project logframe is an important first step in the monitoring cycle because it establishes the intended results and their accompanying indicators.

Step 2 – Prepare monitoring plan and draft budget
 In Step 2, a monitoring plan is developed to ensure that the indicators in the project logframe are used systematically. Outcome and output indicators, process indicators, and cross-cutting indicators are all included. To ensure that adequate resources are allocated for monitoring, the monitoring plan is accompanied by a budget specifying approved monitoring costs (staff, travel, vehicles, supplies, administrative expenses, IT services, capacity-building expenses, etc.).
 These costs should be clearly indicated in the overall project budget at the time of project design and approval; otherwise, resources for monitoring are less likely to be available when needed.
 Guidelines for the monitoring plan and its link to the implementation plan.
 Temptation: to overcommit. Managing expectations is key.
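A monitoring plan that pairs each indicator with its targets, frequency, and a budget line can be sketched as a simple data structure. This is an illustrative sketch only: the field names and figures below are hypothetical, not WFP's actual template.

```python
# Hypothetical monitoring plan entries (indicator names, frequencies, and
# budget figures are illustrative, not an official WFP template).
monitoring_plan = [
    {
        "indicator": "Food Consumption Score",
        "type": "outcome",
        "baseline": 42.0,
        "target": 55.0,
        "frequency": "quarterly",
        "source": "household survey",
        "budget_usd": 12000,
    },
    {
        "indicator": "MT of food distributed",
        "type": "output",
        "baseline": 0,
        "target": 1500,
        "frequency": "monthly",
        "source": "distribution reports",
        "budget_usd": 3000,
    },
]

# Total monitoring cost to flag explicitly in the overall project budget,
# as the step above recommends.
total_budget = sum(item["budget_usd"] for item in monitoring_plan)
print(total_budget)  # 15000
```

Keeping the budget attached to each indicator makes it straightforward to show monitoring costs in the overall project budget at design and approval time.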

Step 2 – Prepare monitoring plan
 Sampling is key to producing reliable monitoring results and can be as basic or as sophisticated as the project allows.
 Minimum Monitoring Standards, Standard Operating Procedures, etc. should be consulted in the process as appropriate.
 Take the gender split into account, ensuring that female monitors are available, as most key indicators rely on gender balance in the responses. For instance, the Coping Strategy Index is calculated as an average; if only 4% of sampled households are female-headed because of a lack of female monitors, the results will be biased.
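The bias from under-sampling female-headed households can be shown with a small worked example. All numbers below are illustrative (the 4% sample share comes from the example above; the population share and CSI values are assumptions), and post-stratification weighting is one standard correction, not a prescribed WFP method.

```python
# Illustrative Coping Strategy Index (CSI) averages for female- and
# male-headed households, with a sample that under-represents
# female-headed households (4%) relative to an assumed 30% population share.
sampled = {"female": {"mean_csi": 28.0, "n": 4},
           "male":   {"mean_csi": 18.0, "n": 96}}
population_share = {"female": 0.30, "male": 0.70}

# Naive sample average: dominated by male-headed households.
naive = (sum(g["mean_csi"] * g["n"] for g in sampled.values())
         / sum(g["n"] for g in sampled.values()))

# Post-stratified average: reweight each group to its population share.
weighted = sum(sampled[k]["mean_csi"] * population_share[k] for k in sampled)

print(round(naive, 1))     # 18.4
print(round(weighted, 1))  # 21.0
```

The naive average understates coping stress because the harder-hit group is barely present in the sample; reweighting recovers a population-representative figure, though a balanced sample (enough female monitors) is the better fix.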

Example: monitoring plan

Step 3 – Develop methodology, tools and plan
 In Step 3, the indicators presented in the monitoring plan are operationalized: the tools for monitoring and data analysis (checklists, questionnaires and databases) are piloted and finalized. Guidelines and templates provided by the various technical departments are adapted, and database design is aligned with corporate requirements and minimum monitoring standards. Tasks, such as monitoring specific sites, are delegated within the monthly work plans of individual staff or partners. If data is to be stored in a database, the database must be created before data collection starts. If electronic devices are to be used, they must be procured, programmed, and tested.

Example: Monitoring Tools & Checklists

Step 4 – Collect primary data and collate secondary data
 Step 4 involves the ongoing collection and analysis of data as outlined in the monitoring plan. Primary data is collected and secondary data synthesized at the frequency described in the plan.
 During this stage, training the staff (including local government staff) who will be involved in data collection is essential. Once gathered, baseline values are captured and recorded.
 Ongoing quality control is required to ensure that the data collected is valid and reliable.

Step 5 – Capture, compile and validate data
 In Step 5, data is compiled, validated and electronically stored. Where paper questionnaires are used, data needs to be entered into a database; similarly, where electronic devices are used, data has to be uploaded. This requires following up with data collectors to ensure that all data sets are submitted and entered.
 Data validation involves data cleaning and checking the completeness, quality and comparability of the data. In cases where data collection is outsourced, the contract should specify the format in which the data will be received and whether any preliminary analysis is required.
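The completeness and quality checks in this step can be sketched as a small validation routine. The field names (`site_id`, `hh_size`, `fcs`) are hypothetical; the 0–112 range is the standard span of the Food Consumption Score.

```python
# Minimal data-validation sketch: completeness and range checks applied
# before records enter the database (field names are hypothetical).
REQUIRED = {"site_id", "date", "hh_size", "fcs"}

def validate(record):
    """Return a list of problems; an empty list means the record passes."""
    problems = [f"missing field: {f}" for f in REQUIRED - record.keys()]
    if "fcs" in record and not 0 <= record["fcs"] <= 112:
        problems.append("FCS out of range (0-112)")
    if "hh_size" in record and record["hh_size"] < 1:
        problems.append("household size must be >= 1")
    return problems

records = [
    {"site_id": "S01", "date": "2015-11-02", "hh_size": 6, "fcs": 48},
    {"site_id": "S02", "date": "2015-11-03", "fcs": 130},  # missing hh_size, bad FCS
]
clean = [r for r in records if not validate(r)]
print(len(clean))  # 1
```

Rejected records go back to the data collectors with the problem list, which supports the follow-up described above.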

Step 6 – Analyse data and prepare reports and other products
 In Step 6, performance is analysed and reports are prepared. Once data is compiled and validated, it is analysed to discern trends, achievements and challenges. Explanations for performance are identified and suggestions made for improvement.
 Information products are then developed to communicate the findings (e.g. executive summaries for donor briefs, technical reports, operational reports, action sheets for programme managers, SPRs, etc.). These products are circulated as per the monitoring plan for learning, information sharing, decision-making, and accountability.

Step 7 – Make use of findings, take action, and document lessons
 Step 7 ensures that remedial action is taken on the basis of monitoring findings. During regular formal and informal review meetings (at both field and national levels), monitoring reports help to validate findings and agree on the actions required.
 These actions are subsequently reflected in revised logic models and work plans (back to Step 1 of the monitoring cycle), and in adjusted monitoring plans and partnership agreements (back to Step 2).

Step 8 – Conduct mid-term or end-term evaluations or reviews
 Step 8 occurs periodically, not necessarily annually. Findings of evaluations and reviews are used to inform the design and redesign of projects.
 Evaluations measure the results and impacts of interventions and generate learning that informs programme direction.
 In addition to evaluations, project-level reviews can be carried out to assess implementation progress and inform decisions.

Resources

Questions?