Boston | Geneva | San Francisco | Seattle
Trends in Foundation Evaluation
Webinar for Assifero, June 16, 2010
© FSG Social Impact Advisors

Slide 2: Goals for this Webinar
- Share three important new trends in evaluation by foundations, especially shared measurement
- Highlight what these trends mean in practice through examples
- Discuss how Assifero members view and use evaluation at their own foundations
Slide 3: Since Its Inception in 1999, FSG Has Combined Consulting, Thought Leadership, and Advocacy to Facilitate Greater Social Impact
- Functional expertise: strategy and program development; evaluation; organizational alignment; strategy implementation
- Topical expertise: global development; global and US health; youth and education; corporate social responsibility; environment; community-based philanthropy
- Client spectrum: corporations, foundations, nonprofits, government
Slide 4: Why Evaluation Matters
"If you look at American foundations as a whole, we gave away something around $40 billion in 2008, which seems like a lot of money in aggregate, but when you compare it to the US government budget and the US GDP, it is really a pittance. If we claim to be a funder whose goal is to produce significant social change, we need to be very strategic. To do that, we need to learn and get better to have more impact."
Stephen Heintz, CEO, Rockefeller Brothers Fund
Slide 5: Why Evaluation Matters
FSG's work is based on the reinforcing relationship between strategy and evaluation, with the goal of increasing social impact: when strategy informs what is evaluated, and evaluation guides the development and refinement of strategy, the potential for social impact increases.
Slide 6: Overview of Evaluation Trends
Our evaluation work is underpinned by three paradigm shifts in evaluation. These shifts are linked and can occur simultaneously:
- Approach: from retrospective to prospective
- Purpose: from judging to learning
- Scope: from individual to shared
Slide 7 (Approach: From Retrospective to Prospective): The Prospective Approach Requires Dynamic, Actionable Evaluation

Retrospective approach:
- Focused on the past
- The goal is typically to prove precisely what the specific impact of a grant or project was, through multi-year academic studies
- This process is expensive and cumbersome for both the foundation and the grantee
- Worse, the results are static, limited, and often come too late to allow any course corrections

Prospective approach:
- Seeks to use evaluation to enable better planning and implementation
- Evaluation takes several forms and is used for planning, improving implementation, and tracking progress
- Methods include establishing a baseline, convening grantees, real-time data gathering, and drawing on publicly available data and studies
Slide 8 (Approach: From Retrospective to Prospective): Example of Prospective Evaluation
Over the next 5 years, Foundation XXX seeks to improve the organizational capacity of out-of-school-time organizations in its city. A critical part of its approach is a dynamic evaluation plan, which will follow the initiative step by step and thereby inform the initiative partners and other cities. To deliver on this plan, a broad range of evaluation approaches will be used:
- Research to select partner organizations, determine performance metrics, and establish baseline data
- Meetings between Foundation XXX and the evaluator to discuss results
- Memos on progress, challenges, and insights for all partners in the initiative
- Presentations on results and insights during meetings of all of Foundation XXX's grantees
- Publicly available reports on the intermediate and final results of the initiative, to allow other cities to emulate it
- Presentations on insights and results of the initiative at national education conferences
Slide 9 (Purpose: From Judging to Learning): Evaluation for Learning Means that Foundations and Grantees Together Answer Specific Learning Questions

Evaluation for judging:
- The focus of traditional evaluation is often on proving to the foundation that the grant or project was a "good investment"
- The grantee wants to position itself for future funding and is therefore keen to be judged successful by the evaluation
- In this way, both the foundation and the grantee can miss the opportunity to improve their strategies based on the evaluation results

Evaluation for learning:
- Imagine instead that the foundation and the grantee together decide that the grant or project serves the purpose of answering specific learning questions
- As in a research experiment, the goal becomes to explore whether and how specific approaches or interventions contribute to solving the challenges of that program area
- The purpose of evaluation moves from "What exactly did the grant or project result in?" to "What practical lessons can we derive from this grant or project?"
- This also makes it easier to decide which data and information should be collected
- If the grant or project does not work, it is not a failure but an opportunity for the foundation and the grantee to refine and improve their strategies
Slide 10 (Purpose: From Judging to Learning): Example of Evaluation for Learning
The Bill & Melinda Gates Foundation applies "learning agendas" to large grants and projects.

What is a learning agenda?
- When program areas within the foundation develop five-year strategies, learning agendas are key components
- The learning agenda captures questions and information gaps whose answers could strengthen the strategy, because they prove or disprove hypotheses or provide new insights and approaches for solving a social problem
- The point of the learning agenda is to learn something concrete from every grant or project, and thereby continually improve the foundation's strategy

What does this mean in practice?
- Every grantee or project partner is allocated specific questions or topics from the learning agenda and is tasked with answering them over the course of the partnership
- It is not about "yes/no" or "success/failure", but about making sure that through each project the foundation improves its knowledge of solutions in that programmatic area

Learning agenda excerpt, Post-Secondary Education for Social Mobility:
- How can we leverage generation-specific reliance on technology, such as peer social networks?
- To what extent will financial instability among our target populations limit the initiative's effectiveness?
- Will a focus on a single geographic area (a place-based strategy) increase prospects for success?
- What are the barriers and facilitators for taking a model developed in one place and replicating it in others?

Through a learning agenda, every single grant or project can play a critical role in the overall strategy.
Slide 11 (Scope: From Individual to Shared): Shared Evaluation Is About Coordination and Collaboration

Individual evaluation:
- Foundations use individual indicators and reporting formats
- Grantees and project partners have to conduct a separate evaluation for every one of their funders
- Neither foundations nor grantees can compare results or learn from one another's experiences
- The activities of both groups are uncoordinated and aimed at different goals

Shared evaluation:
- Several foundations and grantees agree on shared metrics and evaluation methods in a given thematic program area; often, a joint online platform is used to report results
- Such systems have several advantages:
  – Grantees no longer have to track separate metrics for each funder
  – Grantees and funders can directly compare results and learn from them
  – Coordinating metrics leads to coordinating strategies, which means resources can be deployed more effectively
- Disparate foundation and grantee activities thereby become a system that works in harmony, in a targeted way, to solve social problems
Slide 12 (Scope: From Individual to Shared): Shared Measurement Systems Example – Strive
Background: Strive is a large-scale partnership initiative in Greater Cincinnati featuring:
- An evidence-based organizing framework to address education from cradle through career
- More than 300 participating organizations with aligned goals and strategies
- A rich learning environment focused on continuous improvement
- Strong infrastructure and functional support

Participants in the Strive partnership include:
- Hundreds of education-related nonprofits
- The three local public school districts and one diocesan district in the region
- Eight universities and community colleges
- Four key local private and corporate funders
Slide 13 (Scope: From Individual to Shared): Shared Measurement Systems Example – Strive
The partnership is built on a common definition of success and an overarching vision and framework: 10 key indicators, a common report card, and priority strategies.
Slide 14 (Tools & Resources): FSG Resources on Evaluation
- Insights from 100 foundations on prospective evaluation
- Ideas and tools to engage your trustees on the topic of evaluation
- Trends and case studies on shared measurement and evaluation
For follow-up questions, email me anytime.
Slide 15: Discussion Questions
- Are you using aspects of prospective, learning-focused, shared evaluation?
- What do you see as the advantages of these approaches?
- What do you see as the challenges?