
1 Development agency support for impact evaluation
Harry Jones, ODI Impact Evaluation Conference 2009

2 Recent ODI work
1. Comparative study on evaluation policies & practices in development agencies
2. Improving impact evaluation production & use

3 Comparative study on evaluation policies and practices in development agencies
Marta Foresti with C. Archer, T. O’Neil and R. Longhurst, December 2007

4 Overview of study
Scope and objectives: a descriptive comparative study of evaluation policies and practices in key agencies, to inform the AFD reform process. It covers:
- key features of the evaluation function (e.g. mandate, position, management, roles)
- main aspects of evaluation systems, processes and tools
- practices involved in commissioning, managing and supporting evaluation
Activities:
- Desk case studies: DANIDA, EU, OXFAM, IMF (evaluation units)
- Full case studies: DFID, SIDA, WB, AfDB, KfW (evaluation units), plus key informant interviews
Outputs:
- Case study reports (AFD)
- Final comparative report
- Workshops: mid-term (AFD internal); dissemination: AFD internal, ODI lunchtime meeting (Feb 08) and DAC network in March 08

5 Profiles of evaluation units: overview
- Variability in budget and staffing
- Many evaluation policies being reviewed, updated or created
- Mandate not always clear in policies: lack of clarity across the organisation
- No single/unified methodology

6 Independence vs integration
- Most evaluation units sit outside the management structure or operational departments, reporting to ministers, boards etc.
- Position of the unit is important, but so are rules for budget allocation, appointment of staff and disclosure (WB, IMF).
- All recognise the tension between independence and integration: ‘being involved’ is as important as ‘being detached’.
- Reliance on the ‘usual consultants’: are they ‘really independent’ and ‘free’ to be critical?

7 Staff capacity, roles and responsibilities
- Main responsibilities: tendering, contracts and managing evaluation processes, not conducting evaluations.
- Different levels and intensity of consultation with other departments: more on implementation and dissemination, less at the planning/decision phase.
- Capacity and evaluation skills of evaluation unit staff are a major constraint (DFID and others). Focus is often on specific sectoral skills (e.g. economists at KfW).
- ‘New’ roles and responsibilities: knowledge management and learning, communication, dissemination and capacity building.

8 Communication and dissemination
- Of increasing importance: moving beyond ‘dissemination of findings’ towards effective communication, reach and active engagement of clients/stakeholders (big push at the WB).
- Disclosure policies and transparency: all reports on the website.
- Products go beyond reports: syntheses, briefs, seminars, the internet etc.
- Limited feedback and weak evidence on utilisation (AfDB).

9 Improving Impact Evaluation Production and Use
Nicola Jones, Harry Jones, Liesbet Steer and Ajoy Datta, March 2009

10 Overview of study
Scope and objectives: commissioned by DFID to inform discussions on IE production and use, particularly within NONIE:
- to determine how amenable various methods for IE are to different types of projects, programmes and policies;
- to assess the dynamics around commissioning, production and delivery of IEs;
- to analyse how IEs are disseminated and communicated;
- to assess the use and influence of IEs; and
- to make recommendations to improve the production and use of IEs.
Activities:
- Scoping study
- Literature review
- Annotated database of IEs
- Sector case studies
- Synthesis
Outputs:
- ODI working paper
- Opinion piece

11 Methodologies: suitability and opportunities for IE
Similarities between sectors:
- Projects with simple impact pathways
- Methodological innovation
- Call for pluralism
Differences across sectors:
- Sector history of IE
- Gestation of impact
- Coverage and relevance

12 Demand and supply: commissioning, production and delivery
- Largely supply-driven
- Geared to upward accountability; less to downward accountability and learning
- Some exceptions in social development: a range of implementing agencies, and Southern government demand

13 Communication, use and influence
- Communication varied but difficult at the national level; greater interest and initiatives at the international level
- Use: some direct use; ‘legitimation’ is the most common function; indirect and ‘enlightenment’ use also occur

14 Emerging messages
1. Common aims but a diversity of practices, sector experiences and evaluation questions
2. Need for a plural approach to (I)E quality, and a balance between rigour and coverage for accountability
3. Improving agency learning from IEs is difficult, but crucial to improving programmes
4. Disconnect between the rhetoric on the strategic importance of development evaluation and practice in development agencies: an ‘institutional gap’ that calls for investment in the institutional role of evaluation at different levels
5. How can demand for development evaluation be strengthened?

15 h.jones@odi.org.uk
Other relevant ODI work:
- Development effectiveness: the role of qualitative research in impact evaluation – Martin Prowse (RPGG)
- Re-thinking the impact of humanitarian aid – Karen Proudlock and Ben Ramalingam (ALNAP)
Thank you!

