
1 “Evaluation Follow-Up: Challenges and Lessons.” AEA 2011 Conference Think-Tank Session. Organizer: Scott Chaplowe. Presenters: Osvaldo Feinstein, Bidjan Nashat, Mike Hendricks.

2 Evaluation Follow-up. Osvaldo Néstor Feinstein, AEA 2011 Conference.

3 Themes of this presentation on evaluation follow-up:
- Interest in going beyond reports
- Comparative work that was carried out
- Good practice guidelines
- Issues to think about

4 Motivation:
- The benefit of evaluations depends on their use
- Perception/evidence of too limited use
- Risk of cost without benefit
- Potential benefits of learning from evaluations (“anxiety of influence”)
- Increase the benefit/cost ratio of evaluations

5 Comparative work on evaluation follow-up:
- Done on behalf of the United Nations Evaluation Group (UNEG) in 2008-09; updated for the WBG in 2010
- Literature review and interviews, mainly with UN agencies but also with some other types of organizations
- Focus on procedures/systems/mechanisms for follow-up on evaluation recommendations and management responses

6 Main findings:
- Emphasis on formal procedures, similar to those used by auditors
- High transaction costs, low efficiency
- Concerns about the quality of evaluation recommendations
- Confusion of “adoption of recommendations” with “consistency or alignment between recommendations and actions taken by Management”
- Confirmation that there was scope to improve evaluation follow-up systems

7 Good practice guidelines, endorsed at the UNEG Annual Meeting 2010:
- Good practices in management response to evaluation
- Development of systems for tracking and reporting on the implementation of evaluations’ recommendations
- Mechanisms for facilitating learning and knowledge development from evaluations
http://www.unevaluation.org/GPG/followup

8 Some issues for thinking/discussion:
- Could the engagement of evaluators with the managers who are expected to implement evaluation recommendations jeopardize evaluators’ independence?
- If evaluation recommendations were of really high quality, wouldn’t they be applied even without evaluation follow-up?

9 Balancing Accountability and Learning in Evaluation Follow-Up: Lessons Learned from Reforms at the World Bank Group. Bidjan Nashat, Strategy Officer, World Bank Independent Evaluation Group. AEA Think Tank “Evaluation Follow-Up: Challenges and Lessons,” November 4, 2011.

10 IEG’s mandate on follow-up: IEG recommendations are intended to “help improve the development effectiveness of the World Bank Group’s programs and activities, and their responsiveness to member countries’ needs and concerns.” IEG is also mandated to report “periodically to the Board on actions taken by WBG management in response to evaluation findings.” Source: IEG Mandate.

11 The three stages of reforming IEG’s follow-up system:
1. Upstream process: How do we come up with recommendations?
2. Follow-up process: How do we track whether they are being implemented?
3. Analysis and utilization: What does it mean for IEG, World Bank Management, and the Board?

12 1. Reforming the upstream process:
- Quality control: What is a good recommendation?
- Context: What is the link between recommendations and findings?
- Engagement: What would it take to fix the problem?
- Responsibility: Who is doing what, when, and how?
- Governance: Who decides in the case of disagreements?

13 IEG’s follow-up process reforms:
- Reform: recommendation standards. IEG clearly indicates the link between its findings and how they lead to draft recommendations.
- Reform: engagement on findings and recommendations. The IEG team meets with management counterparts at the working level to discuss findings and get input and suggestions on draft recommendations.
- Reform: actions and timelines. Within 90 days after the CODE meeting, management lays out more specific actions and timelines for each accepted recommendation.
- Reform: reporting to CODE. IEG reports quarterly to CODE on submitted action plans and timelines, and monitors and reports on adoption annually.
Source: IEG Results and Performance Report 2011, ieg.worldbank.org/content/dam/ieg/rap2011/rap2011_vol1.pdf

14 2. Reforming the follow-up process:
- Learning from others: How are other evaluation units following up?
- Simplify and unify ratings and procedures: How can we streamline annual tracking across the WBG?
- Move from a manual process to automation: How can we automate the tracking process?

15 3. Reforming analysis and utilization:
- Ask the right questions: What is the right question to ask before analyzing the data?
- Provide context: What does follow-up data tell us about our theory of change?
- Make it public: How should we disclose follow-up data?
- Utilization: How can we link follow-up data to our organizational impact?
- Close the loop: How do we make follow-up data useful for new evaluations and our work program?

16 Thank you! Contact: bnashat@worldbank.org

17 Broadening Our Thinking About Evaluation Follow-Up. Michael Hendricks. Presented at the American Evaluation Association, November 4, 2011.

18 What Do We Mean By “Follow-Up”?
- We talk about evaluation use and evaluation follow-up
- What exactly are we concerned about? Generally, our findings and recommendations
- Especially... “Have our recommendations been accepted and implemented?”
- But are we thinking broadly enough about this?

19 Four (or More) Different “Models” for Offering Recommendations. For each pair of persons involved, what type of model for offering recommendations is this?
- Electrical inspector to homeowners: advisor to “uninformed”
- Physician to patient: advisor to “personally aware”
- Management consultant to client: advisor to “differently expert”
- Assistant coach to head coach: advisor to “equally expert”
- Others??

20 Is It Reasonable to Expect the Same Type of Use from Each Model? For each model, what type of “use” is it realistic to expect?
- Electrical inspector to homeowners (advisor to “uninformed”): ?
- Physician to patient (advisor to “personally aware”): ?
- Management consultant to client (advisor to “differently expert”): ?
- Assistant coach to head coach (advisor to “equally expert”): ?
- Others???

21 Aren’t There Many Different Ways to “Use” Recommendations?
- Ignore them/pay no attention
- Consider them carefully when offered
- Discuss them later in depth (“learning events”)
- React to them formally
- Implement them in part or in whole
- Monitor their implementation
- Evaluate their effects

22 Are We Asking These Important Questions?
- Which models best describe an evaluator offering recommendations to a program manager or to a governing body? How often is it advisor to the uninformed? Advisor to the equally expert? Something in between?
- Won’t different models lead us to expect different uses? And won’t that give us insights into how to measure, and improve, each type of use?
- Would it be useful to explore these ideas further?

