
1 Regional Centers for Learning on Evaluation and Results

2 Overview
Program: a five-year global program launched in January 2010, supported by bilateral and multilateral donors and the World Bank
Rationale: increasing demand for performance measurement and accountability
Objective: strengthen competitively selected academic institutions to provide capacity development services in monitoring and evaluation (M&E) and results-based management (RBM) to government and civil society

3 Support: The CLEAR Programme is supported by bilateral and multilateral donors and the World Bank.

4 (image-only slide)

5 Observation: a mismatch arises when supply outstrips demand; MONITORING masquerades as EVALUATION (Picciotto, 2009).

6 The question: what evidence is there that African governments are developing a stronger demand for evidence? Six case studies, from Benin, Senegal, Ghana, Uganda, Kenya and South Africa.

7 Definitions
– RESULTS ORIENTATION: planning, budgeting and M&E designed to support valued changes in people's lives (especially the poor)
– DEMAND: created when decision makers want to use evidence (endogenous or exogenous?)

8 Institutional design
– Need to harmonise multiple M&E systems
– Planning, budgeting and M&E not aligned in their results orientation, e.g. no comprehensive performance-based budgeting
– Challenges resulting from merging donor-led and country-led demands

9 Monitoring Systems
– Oldest and best resourced
– Systems continue to mature, but MONITORING still dominates
– Issues with capacity, data quality and timeliness

10 Monitoring can become a bottomless pit!
– Tendency to focus on developing operational supply without meeting demand
– Recommendation to institutionalise knowledge and undertake high-quality approaches in key areas

11 Demand is improving
– Demand for performance monitoring by Cabinet (Uganda and South Africa)
– Monitoring feeding into the budget (Kenya and Ghana)
– Results orientation must be questioned
– MONITORING cannot explain WHY change has taken place

12 Demand is improving, but there is a risk of MONITORING crowding out EVALUATION.

13 Evaluation Systems
– Formalising after early stages of development (Benin, South Africa and Uganda)
– Less evidence of their existence (Ghana, Kenya and Senegal)
– Government agencies not involved in commissioning or in setting standards

14 Evidence
– South Africa undertaking six forms of evaluation across the policy cycle
– Benin and Uganda focused on summative evaluation
– Benin, South Africa and Uganda setting standards
– In Ghana evaluation happens outside government (evaluation accounts for 3% of M&E spending)

15 Challenge
– To elevate EVALUATION from behind the shadow of MONITORING
– High-quality nodes of supply
– Focus on quality standards and on local rather than international supply
– Developing and regulating the market
– Decision makers need to be sure of the quality of the product

16 Risk: evaluation may produce information that politicians and donors don't want to hear!

17 Conclusion: monitoring is still dominant, but systems are adapting to demand while responding to donors.

18 Conclusion
– Well-positioned institutional champions are emerging
– Demand emerging from political need could easily be reversed
– Endogenous, results-based demand must permeate M&E systems
– Flow is currently upwards and internal to the executive, not outwards to citizens
– Still activity- and output-focused rather than outcome-focused
– Importance of commissioning evaluations and using their recommendations

19 www.theclearinitiative.org

