1
FPD M&E Workshop 5 – 6 November 2014
2
Introduction to M&E Components of MERP 5 – 6 November 2014
Sunet Jordaan
3
Workshop Expectations?
What do you expect to learn in the next two days? Specific needs? Key M&E skills you want to acquire?
4
Overall Learning Goal To build participants’ skills in monitoring and evaluation Other goals?
5
Learning Outcomes Describe the basic components of M&E
Describe the role and value of MERP (Monitoring, Evaluation and Reporting Plan). Apply key elements of MERP: Theory of Change Framework and Results Logic Framework; Indicators and indicator development; Data management: data quality and data flow; Stakeholder analysis; Data use and dissemination plan.
6
Overview of Presentation
Overarching definition of M&E (what, why, how) Theory of Change Framework and Logical Framework Indicators Data Management (quality and flow) Stakeholder Analysis Data use and dissemination plan
7
Name M&E Activities: Name one M&E activity linked to this project that is already being implemented. Examples: ART; Tier.Net; registers; patients on treatment; viral load and CD4 count; waiting times; follow-up with defaulting patients; health education monitoring; side-effects monitoring; compliance.
8
Overview of Learning Outcomes
Describe the basic components of M&E. Describe the role and value of MERP (Monitoring, Evaluation and Reporting Plan). Apply key elements of MERP: Theory of Change Framework and Logic Framework; Indicators and indicator development; Data management: data quality and data flow; Stakeholder analysis; Data use and dissemination plan.
9
What is Monitoring and Evaluation?
Monitoring: Routine, on-going assessment of activities to provide managers, decision makers and other stakeholders with regular feedback on progress in implementation, results achieved and early indicators of problems that need to be corrected. Evaluation: Time-bound, periodic assessment that seeks to answer specific questions to guide decisions.
10
Monitoring versus Evaluation
Monitoring: What are we doing? Tracking inputs and outputs to assess whether the programme is implemented according to plan Evaluation: What have we achieved? Assessment of impact of the programme on behaviour or outcome
11
Purpose of M&E: To provide the data needed to guide the planning, coordination and implementation of the response; To assess the effectiveness of the programme; and To identify areas for programme improvement.
12
Purpose of M&E Have to measure results to tell success from failure
Learn from mistakes Demonstrate results to the donor
13
Purpose of M&E: Monitoring versus Evaluation
What is it? Monitoring: ongoing collection and analysis of data on progress towards results, and on changes in the context, strategies and implementation. Evaluation: reviewing what has happened and why, and determining relevance, efficiency, effectiveness and impact.
Why do it? Monitoring: inform day-to-day decision making, adjust project design and inform planning. Evaluation: accountability and reporting; strengthen future planning; provide evidence of success; deepen understanding of what works.
Who does it? Monitoring: programme staff/partners/participants. Evaluation: external consultant/staff/participants.
When to plan? Monitoring: at design stage. Evaluation: core decisions at design stage, refined along the way.
When to implement? Monitoring: continuously. Evaluation: mid-term (formative); completion (summative); after completion (impact).
14
M&E versus other activities
Activity and main aim: Monitoring & Evaluation: project management, planning and justification. Research: test hypotheses; develop new knowledge. Surveillance: disease control and prevention. Audit: control and proof.
15
M&E in Programme Management
Programme Improvement, Data Sharing, Reporting/Accountability
16
M&E Schema (from foundation to goal): Basic Concepts and Principles of M&E → Methodology → Using M&E in Programmes → Decision Making Based on Information
17
Data Management
18
Data Uses: Document whether a service is happening or not. Just because it's not recorded doesn't mean it's not happening, but you have to record it to know. Can provide an indication of the quantity and quality of a service. Can highlight breaks in continuity of service. Can identify 'hot spots' of poor performance. Can suggest where a problem might be (investigations are needed to confirm). Can motivate for one action over another. Can highlight what we're doing right and whether there is improvement.
19
Discussion questions: Are you involved with data management at your organisation? What do you want to do differently at your organisation after this course? It is important to do it right!
20
Overview of Learning Outcomes
Describe the basic components of M&E. Describe the role and value of MERP (Monitoring, Evaluation and Reporting Plan). Apply key elements of MERP: Theory of Change Framework and Logic Framework; Indicators and indicator development; Data management: data quality and data flow; Stakeholder analysis; Data use and dissemination plan.
21
MERP? M: Monitoring, E: Evaluation, R: Reporting, P: Plan
22
Purpose of MERP: To provide a comprehensive monitoring and evaluation plan for tracking performance and evaluating interventions; To lay the basis for the design of a monitoring and evaluation system that provides relevant, accurate and timeous information for informed decision-making; To describe a system which links strategic information from various systems to decisions that improve a programme.
23
Purpose of MERP The system that links strategic information to decisions that will improve programmes Ensures accountability and a measure for success
24
Function of MERP State how the programme is going to measure what it has achieved Encourage transparency and responsibility Guide implementation Preserve institutional memory Living document: adjust when needed
25
Steps Step 1: Understand your project
Step 2: Theory of change framework Step 3: Results Logic Framework Step 4: Data Management
26
Step 1: Understand your project
Understand obligations Research the context Consult with programme team Understand your budget, operations, infrastructure and human resource capacity
27
Step 1: Understand your project
Fit to your programme and needs. Fit capacity, stakeholder requirements and data needs. Understand obligations (contract, programme, etc.). Be flexible. Keep it simple!
28
Step 1: Understand your project
Research the context: Consult experts What was done before? Benchmarks and indicators?
29
Step 1: Understand your project
Consult with programme team Define the problem and the desired change Create a project management plan
30
Step 1: Understand your project
Understand your budget, operations, infrastructure and human resource capacity
31
Elements for MERP Brief project description Purpose of M&E plan
Brief history of M&E plan development Evaluation framework Indicator system Information system (data sources) Impact evaluation design Dissemination and utilisation plan Possible adjustments to M&E plan Example of a MERP
32
Overview of Learning Outcomes
Describe the basic components of M&E. Describe the role and value of MERP (Monitoring, Evaluation and Reporting Plan). Apply key elements of MERP: Theory of Change Framework and Logic Framework; Indicators and indicator development; Data management: data quality and data flow; Stakeholder analysis; Data use and dissemination plan.
33
Step 2 and Step 3 Theory of Change Framework Logical Framework
34
Theory of Change Framework
35
Logical Framework
36
Theory of Change and Results Logical Framework
Shows how a programme functions: the theory and assumptions of the programme. Creates a road map: links outcomes (short and long-term) with programme activities and processes and the theoretical assumptions/principles of the programme.
37
Theory of Change and Logical Framework
Shows where the programme fits into the wider context. Shows relationships. Guides identification of indicators. Guides impact analysis.
38
Theory of Change
39
Theory of Change examples
CBCT Programme: reducing loss-to-follow-up patients. UJ BCURE: using research evidence for policy making.
40
Step 2: Theory of Change It locates a programme or project within a wider analysis of how change comes about. It draws on external learning about development. It articulates our understanding of change - but also challenges us to explore it further. It acknowledges the complexity of change: the wider systems and actors that influence it It is often presented in diagrammatic form with an accompanying narrative summary
41
Logical Frameworks Foundation for M&E Frameworks
Outlines the hierarchy and relationship between project inputs, outputs, outcomes and impact
42
Logical Frameworks Activities Output Outcome Impact
43
Logical Frameworks: E.g.
Activities: Provide quality CHW training. Output: CHWs provide better health services to the population. Outcome: More people accessing VCT services. Impact: Improved health outcomes at national level.
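To make the hierarchy concrete, here is a minimal sketch (in Python, not part of the original slides) of a logical framework as a data structure, using the CHW example above; the class and field names are illustrative only.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class LogFrame:
    """Minimal results chain: activities produce outputs, outputs
    lead to outcomes, and outcomes contribute to the impact."""
    activities: List[str] = field(default_factory=list)
    outputs: List[str] = field(default_factory=list)
    outcomes: List[str] = field(default_factory=list)
    impact: str = ""

# The CHW example from this slide, expressed in the structure above.
chw_logframe = LogFrame(
    activities=["Provide quality CHW training"],
    outputs=["CHWs provide better health services to the population"],
    outcomes=["More people accessing VCT services"],
    impact="Improved health outcomes at national level",
)
```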
44
Theory of Change Framework
45
Logical Framework
48
Logical frameworks Examples CBCT Logical Framework
SIDA Results Logical Framework SIDA Log Frame for Planning
49
Group Work Develop a Logical Framework for the Diabetes Project
Diabetes Awareness Project in Daveyton Activities: Diabetes awareness talk Testing for diabetes Novo Nordisk Diabetes Bus
50
Results Broad term used to refer to the effects of a programme
Most ambitious outcomes planned: this is what you will be held accountable for.
51
Overview of Learning Outcomes
Describe the basic components of M&E. Describe the role and value of MERP (Monitoring, Evaluation and Reporting Plan). Apply key elements of MERP: Theory of Change Framework and Logic Framework; Indicators and indicator development; Data management: data quality and data flow; Stakeholder analysis; Data use and dissemination plan.
52
Indicators What is an indicator?
53
Indicators If you can’t measure it, you can’t manage it Definition:
Measurable and verifiable proof of an action or result. A quantitative or qualitative variable (something that changes) that provides a simple and reliable measurement of one aspect of performance, achievement or change in a programme or project. An indicator should be directly related to the programme or project objective to be measured, with no overlap with other indicators.
54
Indicators Measure change: directly or indirectly
Measure trends over time. Measure progress towards defined targets and/or desired outcomes. Provide information about a broad range of conditions through a single measure.
55
Indicators Reduce a large amount of data to its simplest form
Help direct resources to where needs are greatest and where health care system strengthening is optimal. Provide evidence of achievement (or lack thereof) of results and activities (comparisons).
56
Indicators: Indicators follow the hierarchy of results in the logical framework. Process indicators measure the completion of activities; outcome and impact indicators measure the achievement of change.
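As an illustration (not from the slides), here is a hypothetical mapping of indicator levels to the Daveyton diabetes example used in the group work; all indicator wordings below are invented.

```python
# Hypothetical indicators at each level of the results hierarchy,
# loosely based on the diabetes awareness group-work example.
indicators_by_level = {
    "process": "number of diabetes awareness talks held",
    "output": "number of people tested for diabetes",
    "outcome": "proportion of people with raised glucose referred to care",
    "impact": "prevalence of undiagnosed diabetes in Daveyton",
}

for level, indicator in indicators_by_level.items():
    print(f"{level}: {indicator}")
```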
57
Difference between 'results areas' and 'indicators'
Results area: a broad statement or overarching idea about what you would like to change. Uses words like: increased, decreased, improved. Layman's language for what you want to change; justifies your reason for intervention.
Indicator: individual statements that are measurable and verifiable. Uses words like: number, percentage, rate. M&E language for what you intend to measure in order to document that the desired change has been achieved.
58
Indicator Structure
59
Tools for defining indicators
Review existing indicators and resources from your own country and international bodies (WHO, UNAIDS, MDG, PEPFAR). Align (as much as possible) to existing indicators (also align with existing data management structures). Indicator reference protocols: grouped by programme area and/or goal; unique identifier; definition (inclusion/exclusion criteria); disaggregation components; data quality concerns; reporting frequency.
60
SMART and RAVESS Indicators
RAVESS: Reliable, Appropriate, Valid, Easy, Sensitive, Specific. SMART: Specific, Measurable, Achievable, Realistic, Timebound.
61
Developing SMART Indicators
How to measure your progress
62
Indicator Selection Rules
They should be the optimum set that meets management needs at a reasonable cost. Limit the number of indicators used to track each objective or result to a few (2-3). Select only those that represent the most basic and important dimensions of your objectives. Select indicators that you can measure!
63
Indicator Components Name of indicator Description/definition
Unit of measurement Data source (primary/secondary) Baseline/target values by year Frequency of data collection Responsibility Reporting plan and frequency
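As a sketch (assuming nothing beyond the components listed above), an indicator reference sheet could be captured as a simple record; every value below is hypothetical.

```python
# A hypothetical indicator reference sheet following the components
# listed on this slide; all values are illustrative.
indicator = {
    "name": "Number of clients tested for HIV (VCT)",
    "definition": ("Clients who received pre-test counselling, an HIV "
                   "test and their result in the reporting period"),
    "unit": "count of clients",
    "data_source": "facility VCT register (primary)",
    "baseline": 1200,                      # value at project start
    "targets_by_year": {2015: 1500, 2016: 1800},
    "collection_frequency": "monthly",
    "responsibility": "facility data clerk",
    "reporting": "monthly report to the district M&E officer",
}
```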
64
Priority Indicators
65
3 Questions when designing indicators:
Can I, and will I, use the information collected by this indicator? Does the value outweigh the effort of data collection? Will knowing this information (and using it) improve the quality of my programme?
66
Overview of Learning Outcomes
Describe the basic components of M&E. Describe the role and value of MERP (Monitoring, Evaluation and Reporting Plan). Apply key elements of MERP: Theory of Change Framework and Results Logic Framework; Indicators and indicator development; Data management: data quality and data flow; Stakeholder analysis; Data use and dissemination plan.
67
Reality to Action: Real World → Data (collection and coding) → Information (processing, interpretation, presentation) → Action (politics, commitment)
68
Data Management: Refers to the process of moving data from collection to collation to analysis to reporting. Data management can be paper-based or electronic. It comprises three focus areas: data quality, data sources and data flow. Good data management processes mean that data are translated into information in the time and format required for use and the generation of knowledge.
69
What is Data Quality (DQ)?
Refers to the worth/accuracy of the information collected: how well do the data reflect 'true performance'? It is a direct result of data management (DM): poor DM → poor DQ. Measured by 5 components: validity, reliability, timeliness, precision/accuracy and integrity.
70
Common Problems with DQ
Incomplete or missing data Getting data collected, collated or analysed quickly enough (e.g. census statistics) Getting data in a timely manner Getting honest information
71
Problems with DQ: Technical factors: standard indicators, data collection forms, appropriate IT, data presentation, trained people.
72
Problems with DQ: System and environmental factors: resources, structures of the health system, roles and responsibilities, organisational culture.
73
Problems with DQ: Behavioural factors: motivation, attitudes and values, confidence, sense of responsibility.
74
Factors Affecting Data Quality
75
5 Components of Criterion-Based Evaluation of Data Quality
Validity Reliability Timeliness Precision/Accuracy Integrity
76
Measure of Validity
Good validity: you measure what you intended to measure.
Risks to validity: failing to understand the definition of the indicator in the context of the project (e.g. appropriate training and support defining exactly what we want to measure); failing to recognise the data that must be included in or excluded from a data set (e.g. inclusion and exclusion criteria); failing to recognise where we measure the data (e.g. which tools collect the data, who is responsible, etc.). Clicks?
77
Measure of Reliability
Good reliability: consistently collect data of the same quality over time; trust in the data. Reliability is a prerequisite for validity.
Risks, if we fail to identify them: the system doesn't work (gaps in data collection, unclear roles, not reporting what we need); collection instruments allow for variation over time and place (unclear understanding of how to use registers/forms); new people (inadequate mentoring and training).
78
Measure of Timeliness: Good timeliness: we collect, collate and report data which still have the desired relevance at the time of reporting; data arrive in time to be used for evidence-based planning and decision-making. Frequency: frequent enough to inform programme management decisions (e.g. monthly statistics). Currency: data are reported as soon as possible after collection (this month's statistics reflect last month's activities).
79
Measure of Precision/Accuracy
Good precision/accuracy: we work to make our data as free from bias (accuracy) and error (precision) as possible, and give an indication of the magnitude of the risk (how big is it?) and the direction of the error (under- or over-reporting). Precision/accuracy is a prerequisite for validity.
Sources of errors/biases: instruments used for collection, collation, manipulation and storage produce error or bias (a corrupted spreadsheet, a formula that is no longer accurate); data are biased by time, place, person or under-/over-reporting errors (e.g. the person responsible for the register is not there); transcription errors (e.g. a data capturer entered 29 instead of 92). Are you using precise information?
80
Measure of Integrity
Data integrity: the truthfulness of the data (good or bad).
Poor integrity results from: human error: oops! (e.g. did not count one page of the register); deliberate human interference: "don't report that, it looks bad" (e.g. too little data on TB symptom screening, so an approximation is used instead); manipulation of data: if we over-report, it looks as if we are doing our work; technology failure: viruses, computer crashes, etc. Integrity can be compromised during data collection, cleaning, handling and storage due to a lack of proper controls.
81
So, what is data quality? Good data quality has:
Validity: measures what you want to measure
Reliability: consistent and uniform data management methods over time
Timeliness: consistent with deadlines
Precision/accuracy: minimal error/bias
Integrity: free of human error or manipulation
82
Managing Data Quality
Drive data use at all levels. Less is more. Critically assess your indicators for risks to data quality. Pilot/test your M&E systems: never just assume that they will work. Collaborate with the users of the data. Build checks and balances into your data management process to identify potential DQ issues quickly (see the sketch below).
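As a minimal sketch of such checks and balances (in Python, assuming monthly records shaped like the hypothetical dicts below; none of this comes from the slides), routine rules can flag completeness, plausibility and timeliness problems before the data are reported:

```python
from datetime import date

# Hypothetical monthly records; facility names and fields are illustrative.
records = [
    {"facility": "Clinic A", "month": date(2014, 9, 1),
     "clients_tested": 92, "received_on": date(2014, 10, 3)},
    {"facility": "Clinic B", "month": date(2014, 9, 1),
     "clients_tested": None, "received_on": date(2014, 11, 20)},
]

def check_record(rec, max_lag_days=45, plausible_max=10_000):
    """Flag common DQ risks: missing values (completeness),
    implausible values (accuracy) and late reporting (timeliness)."""
    issues = []
    if rec["clients_tested"] is None:
        issues.append("missing value")
    elif not 0 <= rec["clients_tested"] <= plausible_max:
        issues.append("implausible value")
    lag_days = (rec["received_on"] - rec["month"]).days
    if lag_days > max_lag_days:
        issues.append(f"reported {lag_days} days after month start")
    return issues

for rec in records:
    problems = check_record(rec)
    if problems:
        print(rec["facility"], "->", "; ".join(problems))
```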
83
Data Management Tools: Look at existing tools (DoH, WHO, UNAIDS, partners). Align with existing data management systems (e.g. DHIS, Tier.Net, DoH registers). Only implement new tools if existing tools cannot meet the new data requirements.
84
Information Systems: You must have information systems for collecting data based on your indicators; collect, process, analyse and report on that basis. Various tools exist.
85
Data Sources Files, registers, tick sheets, notification sheets
Surveys (telephonic, self-administered, face-to-face, satisfaction) Questionnaires Interviews Focus groups Lab results Databases Stats SA, HSRC surveys DHIS
86
Evaluation Design: Experimental vs. observational; quasi-experimental.
Based on implementation plan
87
Data Flow ‘Of all things that flow, data is not one of them’
88
Data Flow: Data maps show the flow of indicators through data gathering forms and report formats, and how they are connected. Data maps ensure a process for collecting data for the indicators listed in the project proposal. Depending on the scale and complexity of the project, there may be several data flow maps (a simple sketch follows below).
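As an illustration only (not from the slides), a data flow map for a single indicator can be written down as an ordered list of steps, each with a responsible person and the main DQ risk at that step; all names and risks below are hypothetical.

```python
# A hypothetical data flow map for one indicator, expressed as an
# ordered list of steps; names and risks are illustrative only.
data_flow = [
    {"step": "record client in VCT register", "who": "counsellor",
     "risk": "missing entries"},
    {"step": "collate monthly tally sheet", "who": "data clerk",
     "risk": "transcription errors"},
    {"step": "capture into DHIS", "who": "information officer",
     "risk": "late capture"},
    {"step": "report to district and funder", "who": "M&E officer",
     "risk": "figures changed after sign-off"},
]

for number, step in enumerate(data_flow, start=1):
    print(f"{number}. {step['step']} ({step['who']}) - risk: {step['risk']}")
```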
89
Data Flow by Person
90
Data Flow: Data Quality Steps
91
Data Flow between partners
Important: How should the data be communicated between partners? What is the situation in your organisation? Discuss.
92
Data Quality Management Plan
Criterion-based assessment of indicators + data flow (who does what, risks to DQ per data flow step, ensuring data are being used) = tool development (definitions, quality checks, feedback format), training interventions, general monitoring feedback, a data feedback format (general performance against targets and data quality graphs), and data use forums and demand for quality data.
93
Framework for Enhancing Data Quality
95
Building Sustainable M&E Systems
Alignment of tools based on users'/beneficiaries' data needs + systems to detect data quality issues + timely and relevant feedback structures on performance (based on data) and DQ concerns + training of tool users on data use, data quality and data management = a sustainable M&E system.
96
Overview of Learning Outcomes
Describe the basic components of M&E. Describe the role and value of MERP (Monitoring, Evaluation and Reporting Plan). Apply key elements of MERP: Theory of Change Framework and Logic Framework; Indicators and indicator development; Data management: data quality and data flow; Stakeholder analysis; Data use and dissemination plan.
97
Stakeholder Analysis & Data Use and Dissemination Plan
98
Stakeholder Management
A stakeholder: is anyone who has reason to have an interest in your project; can have an interest in and/or participate in the project, either directly or indirectly; and may act independently or represent groups.
99
Stakeholder Management
Stakeholders include, for example: the Project Manager (you); the Customer (the person/organisation paying for the project); the Sponsor (generally the person who assigned the Project Manager responsibility for the project, often an individual at the senior or executive management level of the organisation); the Manager in your organisation who is required to provide approvals for specific actions/tasks; the Project Team Members; and all external role players, e.g. regulatory authorities, government, trade unions, traditional leaders, NGOs, FBOs, etc. Tip: Manage your stakeholders: inform, inform and inform.
100
Stakeholder Management
101
Stakeholder Analysis Who has a vested interest in your project?
What are their data requirements? Why do they want that data? What should the data format look like?
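Purely as an illustration (the stakeholder and every value below are invented), one row of a stakeholder analysis answering these four questions could look like this:

```python
# A hypothetical stakeholder analysis entry answering the four
# questions above; all values are illustrative.
stakeholder = {
    "who": "District Department of Health",
    "vested_interest": "oversees primary health care in the district",
    "data_required": ["patients on ART", "loss-to-follow-up rate"],
    "why": "quarterly programme review and resource allocation",
    "format": "one-page dashboard, delivered quarterly",
}
```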
102
Stakeholder Analysis
103
Data Use and Dissemination Plan
Reports (examples) Evaluation assessments Logical Frameworks Add to database Presentations Research? Publish Credit to co-partners Discuss the above aspects in the group
104
Tips: Stakeholder Analysis & Data Use and Dissemination Plan
Defines the focus of data collection: data that can and will be used (in other words, data needs). Streamline and standardise as much as possible: minimise parallel systems and redundancies. Consult stakeholders and review the draft analysis and the data use and dissemination plan.
105
Case Study: Stakeholder Analysis and Data Use and Dissemination Plan
As a group identify a stakeholder Complete the stakeholder analysis for that stakeholder (look at the big picture) 15 minutes for discussion, 10 minutes for presentation