Monitoring and Evaluation (M&E) of Projects by the Ministry of Public Works, Work Services Group
Ministry of Public Works, Guyana
5th International Engineering Conference, January 2015
Presenters: Ms. Lloyda Rollins, Ms. Jennifer Rahim
Monitoring and Evaluation in Brief
Monitoring is the routine, daily assessment of ongoing activities and progress on projects of all types. Evaluation is the periodic assessment of the overall achievements of a project. Monitoring looks at what is being done, whereas evaluation examines what has been achieved or what impact has been made.
M&E can help develop the confidence of organizations in making decisions in the following areas: resource allocation and use; programme (and project) direction; and meeting the needs of intended recipients. It is by means of M&E that organizations can determine the impact of their programmes (and projects), through a comprehensive analysis of the intended and unintended outcomes.
M&E also provides information about the performance of a government, of individual ministries and agencies, and of managers and their staff, as well as on the performance of the donors that support the work of government.
Reasons for Monitoring & Evaluating Projects
With the growing number of large projects in Guyana, the rising cost of execution and the continuous delays on projects, donor agencies such as the Inter-American Development Bank and the Caribbean Development Bank have sanctioned the need for better control and management of projects because of the large investments they make. This has caused agencies such as the Work Services Group to increase their focus on improving the efficiency of projects and their expenditures through monitoring and control.
What is Monitoring & Evaluation
Monitoring is not policing or imposing; rather, it is the continuous collection of data and information on specified indicators to assess the implementation of a project in relation to activity schedules and expenditure of allocated funds, and its progress and achievements in relation to its intended outcomes.
Monitoring involves day-to-day follow-up of project activities during implementation to measure progress and identify deviations:
- requires routine follow-up to ensure activities are proceeding as planned and are on schedule
- needs continuous assessment of activities and results
- answers the question, "What are we doing?"
Monitoring activities provide answers to the following questions: Is the programme (or project) achieving its goal and objectives? Is the programme (or project) being implemented as intended? What factors are facilitating/hindering success? What are the unintended outcomes? What are the lessons learned up to this point? Are stakeholders’ priorities being addressed?
Monitoring involves: Reviewing progress towards the achievement of prescribed programme (or project) objectives Setting up systems to collect data for each indicator and for each objective Documenting the contextual issues which impact on programme (or project) implementation Using real-time information to manage a programme (or project)
What is Evaluation
Evaluation is the periodic assessment of the design, implementation, outcome, and impact of a programme (or project). It should assess the relevance and achievement of the intended outcome, implementation performance in terms of effectiveness and efficiency, and the nature, distribution, and sustainability of impact.
Evaluation is a systematic way of learning from experience to improve current activities and promote better planning for future action. It is designed specifically with the intention of attributing changes to the project itself, and answers the question, "What have we achieved and what impact have we had?" Evaluations promote a culture of learning, focused on service improvement through evidence-based practices.
Evaluations promote the replication of successful interventions (using evidence-based practices). Evaluations determine the impact of programmes (and projects) by reporting on the intended as well as the unintended outcomes.
Why Should We Monitor & Evaluate Projects

Monitor
  Why: review progress on set targets, indicators and objectives; identify gaps in planning and implementation; make day-to-day decisions; provide information for evaluation
  When: during implementation; continuous

Evaluate
  Why: judge and assess value; inform major decisions; provide information for planning
  When: before or after implementation; periodic
How Should We Monitor and Evaluate Projects
In order to monitor and evaluate, you must have performance indicators. Indicators are realistic, specific, observable and measurable characteristics that can be used to show the changes or progress a programme or project is making towards a specific outcome.
Indicators provide information/data that answer M&E questions Indicators provide clues, signs and markers that inform how close projects (and programmes) may be to their intended paths Indicators are used to assess inputs, outputs, outcomes and impacts
Input indicators include the financial, material, equipment, human and technical resources required for the project. Output indicators provide information/data on project activities that are completed. Outcome indicators provide information/data on the improvements expected from the project activities. Impact indicators provide information/data on the longer-term, holistic improvements expected from the project (and programme) activities.
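As an illustrative sketch only (not part of any WSG system, and with all indicator names invented for this example), the four indicator levels could be organised in a simple structure:

```python
# Hypothetical sketch: grouping M&E indicators by the four levels
# described above. All indicator names are invented for illustration.

INDICATORS = {
    "input":   ["budget disbursed (USD)", "person-months of consulting time"],
    "output":  ["km of road rehabilitated", "teachers trained in RSE"],
    "outcome": ["average travel time (minutes)", "community satisfaction score"],
    "impact":  ["road fatalities per 100,000 population"],
}

def indicators_at(level):
    """Return the list of indicators tracked at a given results level."""
    return INDICATORS[level]

print(indicators_at("impact"))
```

Keeping indicators grouped by level makes it easy to check, for each level of the results chain, that at least one measurable indicator has been defined.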
Requirements for Monitoring & Evaluating Projects

Impacts (Goal)
  Description: measurable changes over time as a result of the project; related to long-term outcomes
  Time-frame: long term

Outcomes (Objectives)
  Description: changes in behaviours or skills as a result of the implemented project; outcomes lead to impacts
  Time-frame: usually mid- to long-term

Outputs (Deliverables)
  Description: activities or services that the project is providing; outputs lead to outcomes
  Time-frame: milestone dates within the project duration

Inputs
  Description: resources that are put into the project (e.g. person-months of consulting time, cost of materials, equipment, etc.); lead to achievement of outputs
  Time-frame: throughout the project duration
Logical Framework Model for Monitoring and Evaluation
Resources (Inputs) → Activities → Outputs → Outcomes → Impact
Inputs (Resources): what we use to do the work
Activities: what we do
Outputs: what we produce or deliver
Outcome: what we wish to achieve
Impact: what we aim to change
Monitoring and Evaluation Data
For monitoring and evaluation to be effective, there must be data to measure the desired indicators. The data required can be obtained from existing sources or may require new sources via project-related M&E activities. Existing data sources may not always be accessible; confidentiality of data may be an issue; and data may be imprecise, incomplete or of poor quality. Existing data sources are generally inexpensive (often free) and show historic trends.
New data sources may require expensive, time-consuming methods (e.g. surveys), but should provide the precise data required. All data collected must be accurate and of the best quality so as to remove any doubts about the results. There must be confidence in the data collected: quality must be monitored at every step of the data collection, analysis and reporting process, so that the data inspire confidence.
Reliable: data should reflect stable and consistent data collection processes
Easy to collect: feasible to access, given the available resources; routinely collected (when possible)
Relevant: data collected should be relevant to the purposes for which it is to be used
Verifiable: offers confidence in the quality of the information gathered (believable and reliable)
Timely: data should be collected as quickly as possible after the activity and must be available within a short period of time
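As one hedged illustration of the "timely" criterion, a collection record could be checked against a maximum allowed delay; the 30-day window and field names below are assumptions for the example, not WSG policy:

```python
from datetime import date, timedelta

# Hypothetical sketch: flag records that were not collected within a
# short window after the activity (the "timely" criterion above).
# The 30-day default is an invented threshold, not an official rule.

def is_timely(activity_date, collected_date, max_days=30):
    """True if data were collected within max_days after the activity."""
    delay = collected_date - activity_date
    return timedelta(0) <= delay <= timedelta(days=max_days)

print(is_timely(date(2015, 1, 5), date(2015, 1, 20)))  # True
```

A check like this could run at data entry, so untimely records are queried while the activity is still fresh.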
How Can Data Be Collected Quantitative Methods Surveys Exit interviews Record abstraction Checklists Observations Qualitative Methods Key informant interviews Focus group discussions
Quantitative Methods Surveys – data on a group of individuals is collected; snapshot at a defined point in time; measure community satisfaction, travel time surveys, etc Exit interviews – refers to conducting interviews with key beneficiaries following completion of activities/services Record abstraction – collection of data from existing sources (e.g. Traffic police accident data) Checklists – list of activities that should be performed during project implementation, including milestone dates
Observations – watching and recording behavioural patterns and any changes as a result of project activities
Qualitative Methods
Key informant interviews – interviews with selected, knowledgeable individuals about specific aspects of the project (can be used to complement the quantitative data collected)
Focus group discussions – interviews with a small group of persons to gain in-depth understanding of attitudes, perceptions, situations, etc.
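A minimal sketch of how quantitative survey data of this kind might be summarised; the travel-time figures are invented purely for illustration:

```python
# Hypothetical sketch: comparing mean travel times from a baseline
# survey and a follow-up survey (all values invented for illustration).

before = [42, 38, 45, 40, 50]   # minutes per trip, baseline survey
after = [30, 28, 35, 31, 33]    # minutes per trip, follow-up survey

def mean(values):
    """Arithmetic mean of a non-empty list of numbers."""
    return sum(values) / len(values)

reduction = mean(before) - mean(after)
print(round(reduction, 1))  # average minutes saved per trip
```

Even a simple before/after comparison like this gives an outcome indicator (average travel time) something concrete to report against.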
Application of Monitoring & Evaluation of Projects by the WSG
WSG/MPW is developing and using M&E systems as part of the CDB-financed Fourth Road Project, West Coast Demerara, and the IADB-financed East Bank Demerara Four Lane Project, West Bank Demerara, Canal Polder 1 & 2, East Bank Berbice, Sheriff Street Mandela and Grove to Timehri.
4th Road Project includes:
- WCDR road improvements (31 km) from Vreed-en-Hoop to Hydronie (4 yrs)
- Road Safety Education Programme in Schools (2 yrs)
- Road Safety Community and Driver Education Programme (1 yr)
- Road Safety Public Relations Programme (2 yrs)
Goal (Impact – long-term): to strengthen road safety awareness in the curriculum and increase the awareness of children and young people attending schools through the development of a school RSE programme.
Objectives (Outcomes – long-term institutionalisation aimed at sustainability) To raise awareness of road safety and RSE To establish RSE in the curriculum To develop RSE capacity of teachers in the target schools through RSE curriculum seminars To improve the content and delivery of the existing GNRSC school road safety patrol programme and the Traffic Police RSE initiatives
Deliverables (Outputs – short-term, during the 24-month project):
- Completed project scoping study (review of existing RS courses, identification of needed improvements, etc.)
- Report on how RSE can be integrated into the existing curriculum
- Sample lesson plans and training materials for 4 types of schools
- Delivery of classroom and on/off-road practical training for students in target schools
Assessment reports on recipients’ knowledge, awareness, attitudes and perception of road safety before and after the project, including any changes in behaviour Reports on the use of RSE principles and techniques by teachers and students A national RSE guidance document and best practices guide Report on teacher training recommendations to ensure sustainability of RSE in schools
Project Logical Framework (LogFrame)
Shows the project goal, objectives (outcomes), and deliverables (outputs/expected results) in four columns:
Narrative Summary
Measures of Achievement (Performance Indicators)
Means (Sources) of Verification
Critical Assumptions (Risks) for Achieving Expected Results
LogFrame excerpt (columns: Narrative Summary | Performance Indicators (Measures of Achievement) | Means (Sources) of Verification | Critical Assumptions)

Goal or Impact: to improve safe road use by students
Expected Results (Outputs or Deliverables): teachers trained in RSE; new road safety syllabus agreed for teacher training qualifications; road safety included in selected schools' curriculum
Critical Assumption: MOE supports including road safety in the Region 3 schools curriculum
Building Capacity
Introductory and advanced training in M&E for staff through workshops and seminars.
Conclusion There are incentives for implementing monitoring and evaluation systems into any organisation. Such incentives can be achieved through the use of “carrots, sticks, and sermons” (Mackay, 2012).
An example of a carrot is the delivery of greater autonomy to managers who demonstrate (through reliable M&E information) better performance of their programs, projects, or institutions. An example of a stick is to set challenging (but realistic) performance goals to be met by each ministry and program manager. An example of a sermon is a high-level declaration of support for M&E from an influential actor in government, such as the president or an important minister.
Carrots: conduct "How are we doing?" team meetings; awards or prizes for managing results; staff incentives, e.g. recruitment, promotion; output- or outcome-based performance triggers
Sticks: highlight good/bad results (using M&E); set performance goals; require performance "exception reporting"; include information on results when appraising managers
Sermons: high-level statements of endorsement; awareness-raising seminars; pilot rapid evaluations and impact evaluations; highlight examples of useful, influential M&E

Source: Keith Mackay, How to Build M&E Systems to Support Better Government, World Bank, 2007
With the successful implementation of M&E, WSG will have better control of Scope, Cost and Time on each project.
The End
Thank You. Questions, please.