Slide 1: Logos, to show while the audience arrive.
Slide 2: MONITORING & EVALUATION: The Foundations for Results

This material constitutes supporting material for the "Impact Evaluation in Practice" book. This additional material is made freely available, but please acknowledge its use as follows: Gertler, P. J., Martinez, S., Premand, P., Rawlings, L. B. and Christel M. J. Vermeersch, 2010, Impact Evaluation in Practice: Ancillary Material, The World Bank, Washington DC (www.worldbank.org/ieinpractice). The content of this presentation reflects the views of the authors and not necessarily those of the World Bank.
Slide 3: Objectives of this session
1. Global Focus on Results
2. Monitoring vs. Evaluation
3. Using a Results Chain
4. Results in Projects
5. Moving Forward: selecting SMART indicators, collecting data, making results useful
Slide 4: Session agenda (repeated; first section: Global Focus on Results).
Slide 5: Results-Based Management is a global trend

What is new about results? Managers are judged by their programs' performance, not by their control of inputs:
- A shift in focus from inputs to outcomes
- Establishing links between monitoring and evaluation, policy formulation, and budgets
- Critical to effective public sector management
Slide 6: Session agenda (repeated; next section: Monitoring vs. Evaluation).
Slide 7: Monitoring vs. Evaluation

Monitoring: a continuous process of collecting and analyzing information, to compare how well a project, program or policy is performing against expected results, and to inform implementation and program management.

Evaluation: a systematic, objective assessment of an on-going or completed project, program, or policy, covering its design, implementation and/or results, to determine the relevance and fulfillment of objectives, development efficiency, effectiveness, impact and sustainability, and to generate lessons learned to inform the decision-making process.

How they compare:
- Frequency: monitoring is regular and continuous; evaluation is periodic.
- Coverage: monitoring covers all programs; evaluation covers selected programs or aspects.
- Data: monitoring uses universal data; evaluation is often sample-based.
- Depth of information: monitoring tracks implementation (the WHAT); evaluation is tailored, often to performance and impact (the WHY).
- Cost: monitoring cost is spread out; evaluation cost can be high.
- Utility: monitoring supports continuous program improvement and management; evaluation informs major program decisions.
Slide 8: Monitoring

"A continuous process of collecting and analyzing information, to compare how well a project, program or policy is performing against expected results, and to inform implementation and program management."
Slide 9: Evaluation

"A systematic, objective assessment of an on-going or completed project, program, or policy, its design, implementation and/or results, to determine the relevance and fulfillment of objectives, development efficiency, effectiveness, impact and sustainability, and to generate lessons learned to inform the decision-making process, tailored to key questions."
Slide 10: Impact Evaluation

"An assessment of the causal effect of a project, program or policy on beneficiaries." It uses a counterfactual:
- to estimate what the state of the beneficiaries would have been in the absence of the program (the control or comparison group), compared to the observed state of beneficiaries (the treatment group), and
- to determine intermediate or final outcomes attributable to the intervention.
Slide 11: When to use Impact Evaluation?

Evaluate impact when the project is:
- Innovative
- Replicable/scalable
- Strategically relevant for reducing poverty
- Likely to fill a knowledge gap
- Likely to have substantial policy impact

Use evaluation within a program to test alternatives and improve programs.
Slide 12: Session agenda (repeated; next section: Using a Results Chain).
Slide 13: Using a Results Chain

A results chain answers three questions:
1. What are the intended results of the program?
2. How will we achieve the intended results?
3. How will we know we have achieved the intended results?
Slide 14: The Results Chain in a Typical Program

INPUTS -> ACTIVITIES -> OUTPUTS -> OUTCOMES -> LONGER-TERM OUTCOMES / HIGHER-ORDER GOALS

- Inputs: financial, human, and other resources mobilized to support activities. Examples: budget, staffing.
- Activities: actions taken or work performed to convert inputs into specific outputs. Examples: training, studies, construction.
- Outputs: project deliverables within the control of the implementing agency (the supply side). Examples: training plan completed, cash transfer delivered, road constructed, school built.
- Outcomes: use of outputs by beneficiaries and stakeholders, outside the control of the implementing agency (the demand side). Examples: new practices adopted, use of the road, school attendance up, health service use up.
- Longer-term outcomes / higher-order goals: changes in outcomes that have multiple drivers. Examples: poverty reduced, income inequality reduced, labor productivity increased.

Inputs, activities and outputs belong to implementation; outcomes and higher-order goals are the results. Results-based management spans the whole chain.

Speaker notes: How many of you are familiar with this? Use it regularly? Feel completely confident about the definitions of the different terms? A results chain answers three questions: What are the intended results of the program? How will we achieve the intended results? How will we know we have achieved the intended results?
Slide 15: Example 1: Results Chain

Education:
- Activities: teacher training; textbooks developed
- Outputs: teachers trained in new methods; textbooks delivered
- Outcomes: new methods used; increased completion rates; increased test scores
- Longer-term outcomes: increased labor productivity

Health:
- Activities: doctors hired; birth attendants trained
- Outputs: new doctors practicing; attendants applying methods
- Outcomes: increased use of health clinics for deliveries
- Longer-term outcomes: reduced maternal mortality

Social Protection and Labor:
- Activities: CCTs delivered; targeting system; MIS
- Outputs: CCTs delivered to target households in accordance with conditions
- Outcomes: increased food consumption; increased child health visits
- Longer-term outcomes: decreased poverty; lower child mortality
Slide 16: Example 2: Results Chain

Identify the sequence of inputs, activities, outputs and outcomes:
1. Information is available for parents about the importance of breast feeding.
2. Children in community healthier.
3. Fewer children are having diarrheal diseases.
4. Mothers breast feeding rather than using formula.
5. New funds available to implement a health project to reduce child malnutrition rates.
6. Design information campaigns on the importance of breast feeding.
Slide 17: Example 2: Results Chain (answer)

In causal order:
- Input (5): New funds available to implement a health project to reduce child malnutrition rates.
- Activity (6): Design information campaigns on the importance of breast feeding.
- Output (1): Information is available for parents about the importance of breast feeding.
- Outcome (4): Mothers breast feeding rather than using formula.
- Outcome (3): Fewer children are having diarrheal diseases.
- Outcome (2): Children in community healthier.
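The ordering logic of the exercise above can be sketched as data plus a simple check. This is a minimal illustration: the items and stage labels come from the slide, while the `STAGES` list and the "never move backwards" check are assumptions of this sketch, not part of the original material.

```python
# Hypothetical sketch: the slide's exercise encoded as (item, stage) pairs.
STAGES = ["input", "activity", "output", "outcome"]

exercise = [
    ("New funds available to implement a health project "
     "to reduce child malnutrition rates.", "input"),
    ("Design information campaigns on the importance of breast feeding.",
     "activity"),
    ("Information is available for parents about the importance "
     "of breast feeding.", "output"),
    ("Mothers breast feeding rather than using formula.", "outcome"),
    ("Fewer children are having diarrheal diseases.", "outcome"),
    ("Children in community healthier.", "outcome"),
]

# A valid results chain never moves backwards through the stages:
# each item's stage index must be >= the previous item's.
positions = [STAGES.index(stage) for _, stage in exercise]
assert positions == sorted(positions), "items are out of causal order"
print("Chain is correctly ordered:", " -> ".join(s for _, s in exercise))
```

The same check could flag a mis-ordered chain, for example an outcome listed before the activity that is supposed to produce it.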
Slide 18: Session agenda (repeated; next section: Results in Projects).
Slide 19: Implementing the Results Chain: Jamaica PATH CCT Program

An example of how well-structured, program-level M&E helped shape program design and inform policy decisions.
- Program of Advancement Through Health and Education (PATH)
- A conditional cash transfer (CCT) program aimed at linking social assistance with human capital accumulation
- Primarily child grants to poor children under 19, conditional on school attendance and health care usage
Slide 20: Jamaica PATH CCT Program

Level, activity, and time frame:
- Activities: monitoring program execution (on-going basis); assessing program implementation (regular basis)
- Outputs: assessing program effectiveness (annual, linked to a periodic household survey)
- Outcomes: evaluating the impact of the program on outcomes (baseline and follow-up)
Slide 21: Jamaica's PATH M&E System

Activities: monitoring program execution
- Instrument: Management Information System (MIS)
- Key indicators: beneficiaries, compliance, payments

Activities: assessing program implementation
- Instruments: implementation evaluation; internal audits; process evaluation; spot checks
- Key indicators: beneficiary and stakeholder understanding of program requirements and satisfaction; adherence to regulations

Outputs: assessing program effectiveness
- Instruments: special targeting assessment; annual household survey
- Key indicators: coverage, targeting, adequacy of benefits

Outcomes: evaluating the impact of the program on outcomes
- Instrument: impact evaluation
- Key indicators: school attendance; use of preventive health services
Slide 22: Use of PATH M&E Results

Management Information System (MIS):
- Results: some lag in payments; good compliance with conditions; slower take-up rate of the program
- Use: adjustments to the payment system; intensified outreach

Implementation evaluations:
- Results: application process seen as burdensome; stakeholders not clear on program rules; strong demand for jobs/training
- Use: social workers used as focal points to access a variety of social services; "Steps to Work", a new program created with a focus on employment and labor market skills development

Internal audits, process evaluation, spot checks:
- Results: problems with the payment system; weak system for verifying the eligibility of new beneficiaries; delays in appeals processing
- Use: revamping of the MIS; revised operations manual; new check printing machine for timely payments; intensified training of social workers

Speaker notes: Problems with the payment system included (i) duplicate checks; (ii) poor oversight of uncollected checks; (iii) inadequate engagement/control at the post office (the site of check delivery).
Slide 23: Use of PATH M&E Results (continued)

Special targeting assessment and annual household survey (outputs):
- Results: PATH better at reaching the poor than other Jamaican safety net programs, but not as good as comparable programs internationally
- Use: improved the beneficiary identification system; expanded training for social workers to help verify eligibility; more frequent recertification

Impact evaluation (outcomes):
- Results: Education: school attendance improved slightly (by about half a day in a 20-day period), with no impact on enrollment. Health: 30% increase in the use of preventive health services
- Use: focused the main education objective on school completion; introduced differentiated benefit levels (by gender and age) to provide incentives for completion; introduced a bonus for completing high school

Speaker notes: The annual household survey is the JSLC (Jamaica Survey of Living Conditions), used to assess program coverage, benefit incidence, and adequacy of benefits as a percentage of consumption. On targeting, close to 50% of PATH beneficiaries came from the poorest quintile. The improved beneficiary identification system included a new scoring formula and better data collection.
Slide 24: Lessons Learned

- A well-articulated approach to M&E is critical to good program management and to informing policy.
- Impact evaluations are powerful for informing key program and policy decisions.
- Good monitoring systems allow for results-based planning and management, and facilitate project preparation, supervision and reform.
Slide 25: Lessons Learned: What does it take to get there?

- Clients willing to learn, take risks, experiment, and collaborate ("from threats to tools")
- Strong support of M&E by senior government champions, and demand for transparency by civil society
- Donor and government desire to focus on M&E processes and goals
- Cross-sectoral collaboration in the government (especially the Ministry of Finance) and with donors
Slide 26: Session agenda (repeated; next section: Moving Forward).
Slide 27: SMART: Identifying good indicators

- Specific
- Measurable
- Attributable
- Realistic
- Targeted
Slide 28: Specific and Measurable

Specific: measure as closely as possible what you want to know.
- Outcome: children treated for malaria
- Indicators: increased utilization of clinics; increased use of malaria drugs
- Which indicator is more specific?

Measurable: be clear about how it will be measured.
- Indicators: % of health centers without stocks of drugs x, y and z for more than a week at a time; % of health centers with availability of drugs
- Which indicator is more measurable?

Source: Kathouri and Kusek, 2006
Slide 29: Attributable and Realistic

Attributable: logically and closely linked to a program's efforts.
- Indicators: life expectancy; % of children fully immunized at 1 year
- Which indicator is attributable?

Realistic: data obtainable at reasonable cost, frequency and accuracy.
- Indicators: HIV prevalence among pregnant women in a specific age group; HIV prevalence among the total population
- Which indicator is more realistic?

Source: Kathouri and Kusek, 2006
Slide 30: Targeted

Targeted: specific to the program's target group.
- Indicators: % increase in employment; % increase in employment of graduates of technical training center X, in the first year after completion of training
- Which indicator is targeted?

Source: Kathouri and Kusek, 2006
Slide 31: Develop a Data Collection Plan

- Identify what specific data are needed.
- Identify how the data will be collected.
- Identify who will be responsible for collecting and reporting the data.
- Identify when the data will be collected and reported, including how frequently.
- Identify costs and sources of financing.
Slide 32: Quick tips on making performance monitoring really useful

- Provide frequent, timely information to program staff.
- Set targets for each performance indicator.
- Provide sub-group data: disaggregate data by customer and service characteristics.
- Do regular, basic analysis of the data, especially comparisons.
Slide 33: Quick tips (continued)

- Require explanations for unexpected findings.
- Report findings in a user-friendly way.
- Hold "How Are We Doing?" sessions after each performance report.
- Use "red-yellow-green lights" to identify programs/projects needing attention.
- Link outcome information to program costs.

Source: Harry Hatry, Urban Institute
Slide 34: Which Hospital Would You Choose?

                     Mercy Hospital    Apollo Hospital
Surgery patients     2,100             800
Deaths               63                16
Death rate           3%                2%
Slide 35: Which Hospital Would You Choose?

                     Mercy Hospital    Apollo Hospital
Surgery patients     2,100             800
Deaths               63                16
Death rate           3%                2%

BUT, disaggregated by patients' condition on admission:

Patients in good condition:
                     Mercy             Apollo
Patients             600               600
Deaths               6                 8
Death rate           1%                1.3%

Patients in poor condition:
                     Mercy             Apollo
Patients             1,500             200
Deaths               57                8
Death rate           3.8%              4%
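The reversal on this slide is an instance of Simpson's paradox: the hospital treating sicker patients looks worse in aggregate even though it does better within every subgroup. A minimal sketch reproducing the slide's arithmetic (the numbers come from the slide; the dictionary layout and function name are illustrative):

```python
# Condition at admission -> (patients, deaths), per the slide's figures.
mercy = {"good": (600, 6), "poor": (1500, 57)}
apollo = {"good": (600, 8), "poor": (200, 8)}

def death_rate(groups):
    """Overall death rate: total deaths divided by total patients."""
    patients = sum(n for n, _ in groups.values())
    deaths = sum(d for _, d in groups.values())
    return deaths / patients

# In aggregate, Apollo looks safer...
print(f"Mercy overall:  {death_rate(mercy):.1%}")   # 3.0%
print(f"Apollo overall: {death_rate(apollo):.1%}")  # 2.0%

# ...yet Mercy has the lower death rate within each condition group,
# because Mercy treats far more patients in poor condition.
for cond in ("good", "poor"):
    m_n, m_d = mercy[cond]
    a_n, a_d = apollo[cond]
    print(f"{cond} condition: Mercy {m_d / m_n:.1%} vs Apollo {a_d / a_n:.1%}")
```

This is why the preceding slide's advice to "provide sub-group data" matters: an aggregate indicator can point in the opposite direction from every one of its components.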
Slide 36: Conclusions

- Monitoring and evaluation are separate, complementary functions, but both are key to results-based management.
- Good M&E is crucial not only to effective project management but can also be a driver for reform.
- Have a good M&E plan before you roll out your project, and use it to inform the journey!
- Design the timing and content of M&E results to further evidence-based dialogue.
- Good monitoring is essential to good impact evaluation.