Presentation transcript:

Explanation of slide: Logos, to show while the audience arrive.

MONITORING & EVALUATION: The Foundations for Results
This presentation is supporting material for the "Impact Evaluation in Practice" book. It is made freely available, but please acknowledge its use as follows: Gertler, P. J., Martinez, S., Premand, P., Rawlings, L. B. and Vermeersch, C. M. J., 2010, Impact Evaluation in Practice: Ancillary Material, The World Bank, Washington, DC (www.worldbank.org/ieinpractice). The content of this presentation reflects the views of the authors and not necessarily those of the World Bank.

Objectives of this session
1. Global Focus on Results
2. Monitoring vs. Evaluation
3. Using a Results Chain
4. Results in Projects
5. Moving Forward: selecting SMART indicators, collecting data, making results useful


Results-Based Management is a global trend
What is new about results?
- Managers are judged by their programs' performance, not by their control of inputs: a shift in focus from inputs to outcomes.
- Links are established between monitoring and evaluation, policy formulation, and budgets.
- This is critical to effective public sector management.

Objectives of this session
1. Global Focus on Results
2. Monitoring vs. Evaluation
3. Using a Results Chain
4. Results in Projects
5. Moving Forward: selecting SMART indicators, collecting data, making results useful

Monitoring vs. Evaluation

                        Monitoring                           Evaluation
Frequency               Regular, continuous                  Periodic
Coverage                All programs                         Selected programs or aspects
Data                    Universal                            Sample-based
Depth of information    Tracks implementation (the WHAT)     Tailored, often to performance and impact (the WHY)
Cost                    Spread out over time                 Can be high
Utility                 Continuous program improvement       Major program decisions
                        and management

Monitoring: a continuous process of collecting and analyzing information, to compare how well a project, program, or policy is performing against expected results, and to inform implementation and program management.

Evaluation: a systematic, objective assessment of an ongoing or completed project, program, or policy, its design, implementation, and/or results, to determine the relevance and fulfillment of objectives, development efficiency, effectiveness, impact, and sustainability, and to generate lessons learned to inform the decision-making process.

Monitoring: "A continuous process of collecting and analyzing information, to compare how well a project, program, or policy is performing against expected results, and to inform implementation and program management."

Evaluation: "A systematic, objective assessment of an ongoing or completed project, program, or policy, its design, implementation, and/or results, to determine the relevance and fulfillment of objectives, development efficiency, effectiveness, impact, and sustainability, and to generate lessons learned to inform the decision-making process, tailored to key questions."

Impact Evaluation: "An assessment of the causal effect of a project, program, or policy on beneficiaries. It uses a counterfactual to estimate what the state of the beneficiaries would have been in the absence of the program (the control or comparison group), compares it to the observed state of beneficiaries (the treatment group), and determines intermediate or final outcomes attributable to the intervention."
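The definition above can be sketched as arithmetic: the impact estimate is the difference between the mean outcome of the treatment group and the mean outcome of the comparison group standing in for the counterfactual. This is a minimal illustration with invented numbers, not part of the original slide; it assumes the comparison group is a valid counterfactual (for example, chosen by randomization).

```python
# Hypothetical sketch: estimating an average treatment effect as the
# difference in mean outcomes between a treatment group and a
# comparison (counterfactual) group. All numbers are invented.

treatment_outcomes = [72, 68, 75, 80, 71, 77]   # e.g. outcomes of beneficiaries
comparison_outcomes = [65, 70, 62, 66, 68, 63]  # e.g. outcomes of non-beneficiaries

def mean(values):
    return sum(values) / len(values)

# The comparison group estimates what beneficiaries' outcomes would
# have been in the absence of the program (the counterfactual).
impact_estimate = mean(treatment_outcomes) - mean(comparison_outcomes)
print(round(impact_estimate, 2))  # prints 8.17
```

If the comparison group is not a valid counterfactual, this difference mixes the program's effect with pre-existing differences between the groups, which is exactly the problem rigorous impact evaluation designs are meant to solve.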

When to use Impact Evaluation? Evaluate impact when the project is:
- Innovative
- Replicable/scalable
- Strategically relevant for reducing poverty
- Filling a knowledge gap
- Likely to have substantial policy impact
Also use evaluation within a program to test alternatives and improve the program.

Objectives of this session
1. Global Focus on Results
2. Monitoring vs. Evaluation
3. Using a Results Chain
4. Results in Projects
5. Moving Forward: selecting SMART indicators, collecting data, making results useful

Using a Results Chain
A results chain answers three questions:
1. What are the intended results of the program?
2. How will we achieve the intended results?
3. How will we know we have achieved the intended results?

The Results Chain in a Typical Program

INPUTS → ACTIVITIES → OUTPUTS → OUTCOMES → LONGER-TERM OUTCOMES / HIGHER-ORDER GOALS

- Inputs: financial, human, and other resources mobilized to support activities. Examples: budget, staffing.
- Activities: actions taken or work performed to convert inputs into specific outputs. Examples: training, studies, construction.
- Outputs: project deliverables within the control of the implementing agency (the SUPPLY side). Examples: training plan completed, cash transfer delivered, road constructed, school built.
- Outcomes: use of outputs by beneficiaries and stakeholders, outside the control of the implementing agency (the DEMAND side). Examples: new practices adopted, use of the road, school attendance up, health service use up.
- Longer-term outcomes / higher-order goals: changes in outcomes that have multiple drivers. Examples: poverty reduced, income inequality reduced, labor productivity increased.

Inputs through outputs constitute implementation; outcomes and higher-order goals are the results; managing the full chain is results-based management.

[Speaker notes: How many of you are familiar with this? Use it regularly? Feel completely confident about the definitions of the different terms? A results chain answers three questions: What are the intended results of the program? How will we achieve the intended results? How will we know we have achieved the intended results?]

Example 1: Results Chains by Sector

Education
- Activities: teacher training; textbooks developed
- Outputs: teachers trained in new methods; textbooks delivered
- Outcomes: new methods used; increased completion rates; increased test scores
- Longer-term outcomes: increased labor productivity

Health
- Activities: doctors hired; birth attendants trained
- Outputs: new doctors practicing; attendants applying new methods
- Outcomes: increased use of health clinics for deliveries
- Longer-term outcomes: improved maternal mortality

Social Protection and Labor
- Activities: targeting system; MIS; delivery of conditional cash transfers (CCTs)
- Outputs: CCTs delivered to target households in accordance with conditions
- Outcomes: increased food consumption; increased child health visits
- Longer-term outcomes: decreased poverty; lower child mortality

Example 2: Results Chain
Identify the sequence of inputs, activities, outputs, and outcomes:
1. Information is available for parents about the importance of breast feeding.
2. Children in the community are healthier.
3. Fewer children are having diarrheal diseases.
4. Mothers are breast feeding rather than using formula.
5. New funds are available to implement a health project to reduce child malnutrition rates.
6. Design information campaigns on the importance of breast feeding.

Example 2: Results Chain (answer)
Identify the sequence of inputs, activities, outputs, and outcomes:
5. New funds available to implement a health project to reduce child malnutrition rates. (Input)
6. Design information campaigns on the importance of breast feeding. (Activity)
1. Information is available for parents about the importance of breast feeding. (Output)
4. Mothers breast feeding rather than using formula. (Outcome)
3. Fewer children are having diarrheal diseases. (Outcome)
2. Children in community healthier. (Outcome)
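The ordering exercise above can be captured in a small data structure, which also makes the classification easy to check. This is just one possible representation, not something from the original deck; the stage names and statements are taken from the slide.

```python
# Illustrative sketch: the breast-feeding example from the slide,
# ordered along the results chain from input to final outcome.

results_chain = [
    ("input",    "New funds available to implement a health project "
                 "to reduce child malnutrition rates."),
    ("activity", "Design information campaigns on the importance of breast feeding."),
    ("output",   "Information is available for parents about the importance "
                 "of breast feeding."),
    ("outcome",  "Mothers breast feeding rather than using formula."),
    ("outcome",  "Fewer children are having diarrheal diseases."),
    ("outcome",  "Children in community healthier."),
]

# Print the chain, one stage per line.
for stage, statement in results_chain:
    print(f"{stage:>8}: {statement}")
```

Note the general shape: exactly one input and one activity lead to an output, which triggers a cascade of outcomes of increasing breadth.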

Objectives of this session
1. Global Focus on Results
2. Monitoring vs. Evaluation
3. Using a Results Chain
4. Results in Projects
5. Moving Forward: selecting SMART indicators, collecting data, making results useful

Implementing the Results Chain: Jamaica's PATH CCT Program
An example of how well-structured program-level M&E helped shape program design and inform policy decisions.
- Program of Advancement Through Health and Education (PATH)
- A conditional cash transfer (CCT) program aimed at linking social assistance with human capital accumulation
- Primarily child grants to poor children under 19, conditional on school attendance and health care usage

Jamaica PATH CCT Program: M&E Levels and Time Frames

Level                                                 Time frame
Activities: monitoring program execution              Ongoing
Activities: assessing program implementation          Regular basis
Outputs: assessing program effectiveness              Annual (linked to periodic household survey)
Outcomes: evaluating impact of program on outcomes    Baseline and follow-up

Jamaica's PATH M&E System

Level: Activities
- Monitoring program execution
  Instrument: Management Information System (MIS)
  Key indicators: beneficiaries; compliance; payments
- Assessing program implementation
  Instruments: implementation evaluation; internal audits; process evaluation; spot checks
  Key indicators: beneficiary and stakeholder understanding of program requirements, and satisfaction; adherence to regulations

Level: Outputs
- Assessing program effectiveness
  Instruments: special targeting assessment; annual household survey
  Key indicators: coverage; targeting; adequacy of benefits

Level: Outcomes
- Evaluating impact of program on outcomes
  Instrument: impact evaluation
  Key indicators: school attendance; use of preventive health services

Use of PATH M&E Results (Activities Level)

Instrument: Management Information System (MIS)
- Results: some lag in payments; good compliance with conditions; slower take-up rate of program
- Use: adjustments to payment system; intensified outreach

Instrument: implementation evaluations
- Results: application process seen as burdensome; stakeholders not clear on program rules; strong demand for jobs/training
- Use: social workers used as focal points for access to a variety of social services; "Steps to Work", a new program created with a focus on employment and labor market skills development

Instruments: internal audits; process evaluation; spot checks
- Results: problems with payment system; weak system for verifying eligibility of new beneficiaries; delays in appeals processing
- Use: revamping of MIS; revised operations manual; new check-printing machine for timely payments; intensified training of social workers

[Speaker notes: Problems with the payment system included (i) duplicate checks; (ii) poor oversight of uncollected checks; (iii) inadequate engagement/control at the post office (site of check delivery).]

Use of PATH M&E Results (Outputs and Outcomes Levels)

Instruments: special targeting assessment; annual household survey
- Results: PATH better at reaching the poor than other Jamaican safety net programs, but not as good as comparable programs internationally
- Use: improved the beneficiary identification system; expanded training for social workers to help verify eligibility; more frequent recertification

Instrument: impact evaluation
- Results: Education: school attendance improved slightly (by about half a day in a 20-day period), with no impact on enrollment. Health: 30% increase in use of preventive health services
- Use: focused the main education objective on school completion; introduced benefit levels differentiated by gender and age to provide incentives for completion; introduced a bonus for completing high school

[Speaker notes: The annual household survey is the JSLC (Jamaica Survey of Living Conditions), used to assess program coverage, benefit incidence, and adequacy of benefits as a percentage of consumption. Targeting: close to 50% of PATH beneficiaries came from the poorest quintile. The improved beneficiary identification system included a new scoring formula and better data collection.]

Lessons Learned
- A well-articulated approach to M&E is critical to good program management and to informing policy.
- Impact evaluations are powerful tools for informing key program and policy decisions.
- Good monitoring systems allow for results-based planning and management, and facilitate project preparation, supervision, and reform.

Lessons Learned: What does it take to get there?
- Clients willing to learn, take risks, experiment, and collaborate ("from threats to tools")
- Strong support for M&E by senior government champions, and demand for transparency from civil society
- Donor and government desire to focus on M&E processes and goals
- Cross-sectoral collaboration within the government (especially the Ministry of Finance) and with donors

Objectives of this session
1. Global Focus on Results
2. Monitoring vs. Evaluation
3. Using a Results Chain
4. Results in Projects
5. Moving Forward: selecting SMART indicators, collecting data, making results useful

SMART: Identifying Good Indicators
- Specific
- Measurable
- Attributable
- Realistic
- Targeted

Specific: measure as closely as possible what you want to know.
Outcome: children treated for malaria
Indicators:
- Increased utilization of clinics
- Increased use of malaria drugs
Which indicator is more specific?

Measurable: be clear about how the indicator will be measured.
Indicators:
- % of health centers without stocks of drugs x, y, and z for more than a week at a time
- % of health centers with availability of drugs
Which indicator is more measurable?

Source: Kathouri and Kusek, 2006

Attributable: logically and closely linked to the program's efforts.
Indicators:
- Life expectancy
- % of children fully immunized at age 1
Which indicator is attributable?

Realistic: data obtainable at reasonable cost, frequency, and accuracy.
Indicators:
- HIV prevalence among 15-24 year-old pregnant women
- HIV prevalence among the total population
Which indicator is more realistic?

Source: Kathouri and Kusek, 2006

Targeted: specific to the program's target group.
Indicators:
- % increase in employment
- % increase in employment of graduates of technical training center X, in the first year after completion of training
Which indicator is targeted?

Source: Kathouri and Kusek, 2006

Develop a Data Collection Plan
- Identify what specific data are needed.
- Identify how the data will be collected.
- Identify who will be responsible for collecting and reporting the data.
- Identify when the data will be collected and reported, including how frequently.
- Identify costs and sources of financing.

Quick Tips: Making Performance Monitoring Really Useful
- Provide frequent, timely information to program staff.
- Set targets for each performance indicator.
- Provide sub-group data: disaggregate data by customer and service characteristics.
- Do regular, basic analysis of the data, especially comparisons.

- Require explanations for unexpected findings.
- Report findings in a user-friendly way.
- Hold "How are we doing?" sessions after each performance report.
- Use red-yellow-green lights to identify programs/projects needing attention.
- Link outcome information to program costs.

Source: Harry Hatry, Urban Institute

Which Hospital Would You Choose?

                    Mercy Hospital    Apollo Hospital
Surgery patients    2,100             800
Deaths              63                16
Death rate          3%                2%

Which Hospital Would You Choose?

                        Mercy Hospital    Apollo Hospital
All surgery patients    2,100             800
  Deaths                63                16
  Death rate            3%                2%
But, by patient condition:
Good condition          600 patients      600 patients
  Deaths                6                 8
  Death rate            1%                1.3%
Poor condition          1,500 patients    200 patients
  Deaths                57                8
  Death rate            3.8%              4%
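The arithmetic behind the hospital comparison can be verified in a few lines. The figures come from the slide; the code is only an illustration of why disaggregated data matter: Mercy has the worse overall death rate yet the better rate in every patient-condition subgroup, because it treats far more poor-condition patients (a classic Simpson's paradox).

```python
# Each entry maps a condition subgroup to (patients, deaths),
# using the figures from the slide.
hospitals = {
    "Mercy":  {"good": (600, 6), "poor": (1500, 57)},
    "Apollo": {"good": (600, 8), "poor": (200, 8)},
}

def rate(patients, deaths):
    """Death rate as a fraction of patients."""
    return deaths / patients

for name, groups in hospitals.items():
    total_patients = sum(p for p, _ in groups.values())
    total_deaths = sum(d for _, d in groups.values())
    print(name,
          f"overall {rate(total_patients, total_deaths):.1%},",
          f"good condition {rate(*groups['good']):.1%},",
          f"poor condition {rate(*groups['poor']):.1%}")
```

Running the sketch reproduces the slide's numbers: Mercy 3.0% overall versus Apollo's 2.0%, yet 1.0% versus 1.3% among good-condition patients and 3.8% versus 4.0% among poor-condition patients.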

Conclusions
- Monitoring and evaluation are separate, complementary functions, but both are key to results-based management.
- Good M&E is crucial not only to effective project management; it can also be a driver of reform.
- Have a good M&E plan before you roll out your project, and use it to inform the journey.
- Design the timing and content of M&E results to further evidence-based dialogue.
- Good monitoring is essential to good impact evaluation.