Donor perspectives on planning and evaluation. Janice Astbury, The J.W. McConnell Family Foundation.

Presentation transcript:

Donor perspectives on planning and evaluation
Janice Astbury, The J.W. McConnell Family Foundation

Donor perspectives on planning and evaluation
- Current context
- Why evaluate? (and once you decide to evaluate, planning is implicit)
- Evolution in evaluation thinking
- What works

Acknowledgement: Most of this PowerPoint was prepared by my colleague Sheherazade Hirji for a presentation we will be doing later this week.

CURRENT CONTEXT
"There are a number of interesting social value calculations in use throughout the sector. We also learned about their limitations in terms of data quality and comparability. We now appreciate that while each method generates actionable information for its own users, no one approach has yet emerged as the single best method. In fact, to some extent, it was the discipline and rigor of application that is the most important common ingredient among the methods. Each of the practitioners acknowledged the importance of their calculation model forcing them to make their assumptions explicit and transparent. It is only once the assumptions are laid bare, that a true debate about the merits of a program, strategy or grant relative to costs can fully be vetted and debated, even if not fully known with precision."
Gates Foundation cover letter on the report "Measuring and/or Estimating Social Value Creation", December 2008

WHY EVALUATE AT ALL?

- Accountability
- Assessing impact: what difference did you make?
- Learning: what works, what doesn't
- Building capacity
- Sharing/transferring knowledge
"The difference between what we do and what we are capable of doing would suffice to solve most of the world's problems." (Mahatma Gandhi)

EVALUATION 101
Distinction:
1. Monitoring = Accountability
- Were funds used as agreed?
- Did the grantee do what they said they would?
- Mid-grant adjustments
2. Evaluation = Impact
- What changed as a result of the funding?
- What was learned about the issue/intervention? So what?
- Transferring/disseminating knowledge

Inter-related levels in evaluation:
- The organization: vision, mission, mandate, capacity
- The program: impact of funding
- The issue: what's different, what needs to change

EVOLUTIONARY THINKING IN EVALUATION
Assumptions (low learning potential) vs. Reality (high learning potential):
- Purpose is to prove → Focus is to improve
- It's about the grantee → Involves many stakeholders
- Happens at the end → Starts as soon as the program is conceived
- Measures everything → Select indicators that help critical decisions
- Looking for attribution → Satisfied with contribution and learning
- Done by experts, pre-determined assumptions → Participatory, evolving process
- Rigorous methodology → Rigorous thinking
- One size fits all → Specific to organization/program age and stage
- Focus is accountability, measurement → Also looks for impact and learning
- Internally focused → Use learning and knowledge transfer to influence and inform decision-making and policy

EVALUATION AS A LEARNING TOOL
- Starts with organizational strategy, mission, goals
- Integrated into operational and planning cycle
- Anchor and amplify it in existing activities
- Allocate time and resources
- Encourage "evaluative thinking"

EVALUATION AS A LEARNING TOOL (continued)
- Mutual accountability
- Creates space for conversation
- Increases transparency and trust
- Meets a broader public agenda by sharing what we learn
- Greater social impact: what works and what does not
- Increases efficiency and effectiveness

WHAT WORKS
- Clarity in purpose and audiences
- Theory of change supported by an evaluation framework that includes impact on individuals and communities
- Selecting a few indicators that help assess progress in each area
- Focus on contribution rather than attribution
- Balance quantitative and qualitative: no stories without numbers and no numbers without stories
- Share learning, celebrate successes
- Tell the story as it unfolds; periodically tie themes together

DEVELOPING AN EVALUATION PLAN
Develop:
- Logic model
- Evaluation work plan
Identify the information required:
- Quantitative: outputs
- Qualitative: impact, such as increased awareness/knowledge, changed attitudes, changed behaviours, increased skill levels, improved individual status, improved community status
Decide how, and by whom, these will be measured.
Approach: participatory, developmental, formative, summative?
Resources required (human and financial)

DEVELOPING AN EVALUATION PLAN (continued)
- Limit your evaluation plan to the actual population/community you will be serving and the scope of your activity.
- Avoid things outside your control (systemic barriers or regulations, for example) unless you intend to address them as part of the program and be accountable for changing them.
- Under-promise and over-deliver.
- Complex grants need external evaluation help; the cost can range from 5% to 15% of total project costs.
- Give staff and board the results, and learn about what worked and what could be improved in future programming.
- Use the results to report to your funders.
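As a rough illustration of the 5%-15% guideline (hypothetical figures, not from the slides): a project with a total budget of $200,000 would set aside roughly $200,000 × 0.05 = $10,000 to $200,000 × 0.15 = $30,000 for external evaluation.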

RESOURCES
- Community Builder's Approach to Theory of Change
- Measuring and/or Estimating Social Value Creation (report summary: measuring-estimating-social-value-creation-report-summary.aspx)
- GrantCraft: Practical Wisdom for Grantmakers, "Using a Theory of Change to Guide Planning and Evaluation" (Page&pageID=808)