Designing and Building a Results-Based Monitoring and Evaluation System: A Tool for Public Sector Management


Slide 1: Designing and Building a Results-Based Monitoring and Evaluation System: A Tool for Public Sector Management

Slide 2: The Power of Measuring Results
– If you do not measure results, you cannot tell success from failure
– If you cannot see success, you cannot reward it
– If you cannot reward success, you are probably rewarding failure
– If you cannot see success, you cannot learn from it
– If you cannot recognize failure, you cannot correct it
– If you can demonstrate results, you can win public support
Adapted from Osborne & Gaebler, 1992

Slide 3: Ten Steps to Designing, Building, and Sustaining a Results-Based Monitoring and Evaluation System
1. Conducting a Readiness Assessment
2. Agreeing on Outcomes to Monitor and Evaluate
3. Selecting Key Indicators to Monitor Outcomes
4. Baseline Data on Indicators: Where Are We Today?
5. Planning for Improvement: Selecting Results Targets
6. Monitoring for Results
7. The Role of Evaluations
8. Reporting Your Findings
9. Using Your Findings
10. Sustaining the M&E System Within Your Organization

Slide 4: Introduction to Results-Based Monitoring and Evaluation: What Are We Talking About?
– Results-based monitoring and evaluation measures how well governments are performing
– Results-based monitoring and evaluation is a management tool!
– Results-based monitoring and evaluation emphasizes assessing how outcomes are being achieved over time

Slide 5: Remember
– Monitoring and evaluation are two separate but interrelated strategies for collecting data and reporting findings on how well (or not) the public sector is performing
– During this workshop, we will be discussing:
– Monitoring as a tool
– Evaluation as a tool
– How the two interrelate to support good public management
– The ten steps to build a results-based monitoring and evaluation system to measure government performance

Slide 6: Reasons to Do Results-Based M&E
– Provides crucial information about public sector performance
– Provides a view over time on the status of a project, program, or policy
– Promotes credibility and public confidence by reporting on the results of programs
– Helps formulate and justify budget requests
– Identifies potentially promising programs or practices

Slide 7: Reasons to Do Results-Based M&E (cont.)
– Focuses attention on achieving outcomes important to the organization and its stakeholders
– Provides timely, frequent information to staff
– Helps establish key goals and objectives
– Permits managers to identify and take action to correct weaknesses
– Supports a development agenda that is shifting towards greater accountability for aid lending

Slide 8: Definition
Results-Based Monitoring (what we will call "monitoring") is a continuous process of collecting and analyzing information to compare how well a project, program, or policy is performing against expected results.

Slide 9: Major Activities Where Results Monitoring Is Needed
– Setting goals and objectives
– Reporting to Parliament and other stakeholders
– Managing projects, programs, and policies
– Reporting to donors
– Allocating resources

Slide 10: A New Emphasis on Both Implementation and Results-Based Monitoring
Traditional monitoring focuses on implementation monitoring:
– This involves tracking inputs (funding, resources, strategies), activities (what actually took place), and outputs (the products or services produced)
– This approach focuses on monitoring how well a project, program, or policy is being implemented
– It is often used to assess compliance with work plans and budgets

Slide 11: A New Emphasis on Both Implementation and Results-Based Monitoring
– Results-based monitoring involves the regular collection of information on how effectively government (or any organization) is performing
– Results-based monitoring demonstrates whether a project, program, or policy is achieving its stated goals

Slide 12: Results-Based Monitoring Requires Attention to Causal Logic, or the Theory of Change
– What is the logic of the overall project, program, or policy design?
– How does each component of the program help to establish an if-then relationship?
– Is there a theory behind the change expected or seen? In other words, does the change follow the logic proposed?
– Does this theory or logic hold during implementation?

Slide 13: Results-Based Monitoring
Implementation:
– Inputs: financial, human, and material resources
– Activities: tasks personnel undertake to transform inputs into outputs
– Outputs: products and services produced
Results:
– Outcomes: intermediate effects of outputs on clients
– Goal (Impacts): long-term, widespread improvement in society

Slide 14: Results-Based Monitoring: Adult Literacy
– Inputs: facilities, trainers, materials
– Activities: literacy training courses
– Outputs: number of adults completing literacy courses
– Outcomes: increased literacy skills; more employment opportunities
– Goal (Impacts): higher income levels; increased access to higher-skill jobs
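To make the chain on slides 13 and 14 concrete, here is a minimal sketch (not part of the original deck) of the adult literacy results chain written as a plain Python data structure. The variable names and groupings are illustrative assumptions, not terminology from the presentation.

# Illustrative sketch only: the adult literacy results chain from slides 13-14
# represented as a simple Python data structure.
RESULTS_CHAIN = {
    "inputs": ["facilities", "trainers", "materials"],
    "activities": ["literacy training courses"],
    "outputs": ["number of adults completing literacy courses"],
    "outcomes": ["increased literacy skills", "more employment opportunities"],
    "impacts": ["higher income levels", "increased access to higher-skill jobs"],
}

# Implementation monitoring covers the lower levels of the chain;
# results monitoring covers the upper levels (see slide 24).
IMPLEMENTATION_LEVELS = ("inputs", "activities", "outputs")
RESULTS_LEVELS = ("outcomes", "impacts")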

Slide 15: Definition
Results-Based Evaluation is an assessment of a planned, ongoing, or completed intervention to determine its relevance, efficiency, effectiveness, impact, and sustainability. The intent is to incorporate lessons learned into the decision-making process.

Slide 16: Evaluation Addresses
– Why questions: What caused the changes we are monitoring?
– How questions: What was the sequence or process that led to successful (or unsuccessful) outcomes?
– Compliance/accountability questions: Did the promised activities actually take place, and as they were planned?
– Process/implementation questions: Was the implementation process followed as anticipated, and with what consequences?

Slide 17: Designing Good Evaluations
– Getting the questions right is critical
– Answering the questions is critical
– Supporting public sector decision-making with credible and useful information is critical

Slide 18: Designing Good Evaluations
"Better to have an approximate answer to the right question than an exact answer to the wrong question." (paraphrased from statistician John W. Tukey)

Slide 19: Designing Good Evaluations
"Better to be approximately correct than precisely wrong." (paraphrased from Bertrand Russell)

Slide 20: Some Examples of Evaluation
Policy evaluations:
– Privatizing water systems: comparing model approaches to privatizing public water supplies
– Resettlement: comparing strategies used for resettlement of rural villages to new areas
Program evaluations:
– Privatizing water systems: assessing fiscal management of government systems
– Resettlement: assessing the degree to which resettled village farmers maintain previous livelihoods
Project evaluations:
– Privatizing water systems: assessing the improvement in water fee collection rates in two provinces
– Resettlement: assessing the farming practices of resettled farmers in one province

Slide 22: Complementary Roles of Results-Based Monitoring and Evaluation
– Monitoring clarifies program objectives; evaluation analyzes why intended results were or were not achieved
– Monitoring links activities and their resources to objectives; evaluation assesses the specific causal contributions of activities to results
– Monitoring translates objectives into performance indicators and sets targets; evaluation examines the implementation process
– Monitoring routinely collects data on these indicators and compares actual results with targets; evaluation explores unintended results
– Monitoring reports progress to managers and alerts them to problems; evaluation provides lessons, highlights significant accomplishments or program potential, and offers recommendations for improvement

Slide 23: Developing a Results Plan
– Once a set of outcomes is identified, it is time to develop a plan for how the organization will begin to achieve those outcomes
– In the traditional approach to developing a plan, the first thing a manager usually did was identify activities and assign responsibilities
– The shortcoming of this approach is that completing all the activities is not the same as reaching the outcome goal

Slide 24: Key Types of Monitoring
Diagram: the results chain runs from inputs to activities to outputs to outcomes to impacts. Implementation monitoring (means and strategies) covers inputs, activities, and outputs; results monitoring covers outcomes and impacts.

Slide 25: Translating Outcomes to Action
– Note: activities are crucial! They are the actions you take to manage and implement your programs, use your resources, and deliver the services of government
– But the sum of these activities may or may not mean you have achieved your outcomes
– The question is: how will you know when you have been successful? (One concrete answer, comparing indicator actuals against targets, is sketched below.)
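One way to make "knowing when you have been successful" operational is to compare each indicator's actual value against its agreed target. The sketch below is an illustrative assumption only; the function name, indicator names, and numbers are hypothetical and do not come from the presentation.

# Hypothetical sketch: flag whether each indicator has met its target.
def check_targets(indicators):
    """Return, for each indicator, its target, actual value, and whether it was met."""
    report = {}
    for name, values in indicators.items():
        report[name] = {
            "target": values["target"],
            "actual": values["actual"],
            "met": values["actual"] >= values["target"],
        }
    return report

# Example usage with made-up adult literacy indicators.
example = {
    "adults completing literacy courses": {"target": 5000, "actual": 4200},
    "participants reporting new employment (%)": {"target": 30, "actual": 33},
}
print(check_targets(example))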

Slide 26: Implementation Monitoring Links to Results Monitoring
Diagram: an outcome is supported by several targets (Target 1, Target 2, Target 3), and each target is linked to means and strategies set out in multi-year and annual work plans.

Slide 27: Achieving Results Through Partnership
Diagram: a goal is supported by an outcome, the outcome by targets (Target 1, Target 2), and each target by a means and strategy delivered jointly by several partners (Partner 1, Partner 2, Partner 3).
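The partnership structure on slide 27 can also be written down as a nested data structure. The sketch below is an assumption for illustration; the goal and outcome strings are generic placeholders, not wording from the deck.

# Illustrative, assumed structure: goal -> outcome -> targets -> means & strategy,
# with several partners contributing to each target's means & strategy.
results_framework = {
    "goal": "long-term improvement in society",        # placeholder text
    "outcome": "intermediate effect on clients",       # placeholder text
    "targets": [
        {
            "name": "Target 1",
            "means_and_strategy": "multi-year and annual work plans",
            "partners": ["Partner 1", "Partner 2", "Partner 3"],
        },
        {
            "name": "Target 2",
            "means_and_strategy": "multi-year and annual work plans",
            "partners": ["Partner 1", "Partner 2", "Partner 3"],
        },
    ],
}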

