
1 Donor perspectives on planning and evaluation. Janice Astbury, The J.W. McConnell Family Foundation

2 Current context
Why evaluate? (and once you decide to evaluate, planning is implicit)
Evolution in evaluation thinking
What works

3 Acknowledgement
Most of this PowerPoint was prepared by my colleague Sheherazade Hirji for a presentation we will be giving later this week.

4 CURRENT CONTEXT
“There are a number of interesting social value calculations in use throughout the sector. We also learned about their limitations in terms of data quality and comparability. We now appreciate that while each method generates actionable information for its own users, no one approach has yet emerged as the single best method. In fact, to some extent, it was the discipline and rigor of application that is the most important common ingredient among the methods. Each of the practitioners acknowledged the importance of their calculation model forcing them to make their assumptions explicit and transparent. It is only once the assumptions are laid bare that a true debate about the merits of a program, strategy or grant relative to costs can fully be vetted and debated, even if not fully known with precision.”
Gates Foundation, cover letter to the report Measuring and/or Estimating Social Value Creation, December 2008

5 WHY EVALUATE AT ALL?

6 Accountability
Assessing impact: what difference did you make?
Learning: what works, what doesn’t
Building capacity
Sharing/transferring knowledge
Mahatma Gandhi: “The difference between what we do and what we are capable of doing would suffice to solve most of the world's problems.”

7 EVALUATION 101
Distinction:
1. Monitoring = accountability. Were funds used as agreed? Did the grantee do what they said they would? Mid-grant adjustments.
2. Evaluation = impact. What changed as a result of the funding? What was learned about the issue/intervention? So what? Transferring/disseminating knowledge.

8 Inter-related levels in evaluation:
The organization: vision, mission, mandate, capacity
The program: impact of funding
The issue: what’s different, what needs to change

9 EVOLUTIONARY THINKING IN EVALUATION
Assumptions (low learning potential) → Reality (high learning potential):
Purpose is to prove → Focus is to improve
It’s about the grantee → Involves many stakeholders
Happens at the end → Starts as soon as the program is conceived
Measures everything → Select indicators to help critical decisions
Looking for attribution → Satisfied with contribution and learning
Done by experts, with pre-determined assumptions → Participatory, evolving process
Rigorous methodology → Rigorous thinking
One size fits all → Specific to organization/program age/stage
Focus is accountability, measurement → Also looks for impact and learning
Internally focused → Use learning and knowledge transfer to influence and inform decision-making and policy

10 EVALUATION AS A LEARNING TOOL
Starts with organizational strategy, mission, goals
Integrated into the operational and planning cycle
Anchor and amplify it in existing activities
Allocate time and resources
Encourage “evaluative thinking”

11 EVALUATION AS A LEARNING TOOL
Mutual accountability
Creates space for conversation
Increases transparency and trust
Meets the broader public agenda by sharing what we learn
Greater social impact: knowing what works and what does not
Increases efficiency and effectiveness

12 WHAT WORKS
Clarity of purpose and audiences
A theory of change supported by an evaluation framework that includes impact on individuals and communities
Selecting a few indicators that help assess progress in each area
Focus on contribution rather than attribution
Balance quantitative and qualitative: no stories without numbers, and no numbers without stories
Share learning, celebrate successes
Tell the story as it unfolds; periodically tie themes together

13 DEVELOPING AN EVALUATION PLAN
Develop a logic model and an evaluation work plan.
Identify the information required:
Quantitative outputs
Qualitative impacts: increased awareness/knowledge, changed attitudes, changed behaviours, increased skill levels, improved individual status, improved community status
Decide how, and by whom, these will be measured.
Choose an approach: participatory, developmental, formative, or summative?
Identify the resources required (human and financial).
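As a hypothetical illustration (the program and measures here are invented, not from the deck): for a youth employment program, a quantitative output might be the number of participants completing training, and a qualitative impact might be increased participant confidence, measured through pre- and post-program surveys administered by program staff.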

14 DEVELOPING AN EVALUATION PLAN
Limit your evaluation plan to the actual population/community you will be serving and to the scope of your activity. Avoid factors outside your control (systemic barriers or regulations, for example) unless you intend to address them as part of the program and be accountable for changing them.
Under-promise and over-deliver.
Complex grants need external evaluation help; the cost can range from 5% to 15% of total project costs.
Share the results with staff and board, and draw out what worked and what could be improved in future programming. Use the results to report to your funders.
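As a rough worked example (the budget figure is assumed for illustration, not from the deck): at 5% to 15% of total project costs, a project with a $200,000 budget would set aside roughly $10,000 to $30,000 for external evaluation.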

15 RESOURCES
Community Builder’s Approach to Theory of Change: http://tamarackcommunity.ca/g3s4_7.html
Measuring and/or Estimating Social Value Creation: www.gatesfoundation.org/learning/Pages/december-2008-measuring-estimating-social-value-creation-report-summary.aspx
GrantCraft: Practical Wisdom for Grantmakers, Using a Theory of Change to Guide Planning and Evaluation: www.grantcraft.org/index.cfm?fuseaction=Page.viewPage&pageID=808

