OE GOLDMINE RESEARCH FINDINGS
Packard Program Officers Discussion Group
TCC Group, July 13, 2011


2 Agenda
 Introduction
 Background on research methodology
 Some interesting specific findings
 Some final general questions for reflection

3 Background on Research Methodology
 Since 1997, the Foundation has awarded 1,392 "organizational effectiveness" grants to an array of nonprofit groups, most of which have annual operating budgets of $1-10 million and operate in the human services, environmental conservation, population, and arts fields.
 To ascertain outcomes, Packard recently surveyed 274 of these grantees that finished OE projects between 2007 and 2009 and analyzed the responses from 169 of them (a 62% response rate); their grants ranged from $7,000 to $160,000 (90% were between $20K and $50K).
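As a quick sanity check (our addition, not part of the original deck), the reported 62% response rate follows directly from the two counts on this slide; a minimal Python sketch:

```python
# Minimal sketch (not from the slides): verify the reported response rate.
# Both counts are taken directly from the slide above.
surveyed = 274   # grantees surveyed (OE projects completed 2007-2009)
analyzed = 169   # responses analyzed

response_rate = analyzed / surveyed
print(f"Response rate: {response_rate:.1%}")  # prints 61.7%, reported as 62%
```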

4 Background on Research Methodology
 The post-grant survey included both multiple-choice and open-ended questions.
 The data include grantees' self-reported responses as well as data from the OE program's internal documentation.
 Research findings are shared with the public: foundation-oe.wikispaces.com/OE+Goldmine+Research+Project

5 Agenda
 Introduction
 Background on research methodology
 Some interesting specific findings
 Some final general questions for reflection

6 Overview of Some Interesting Specific Findings
1. Linking organizational capacity and programmatic effectiveness
2. Factors influencing project success
3. Greater impact associated with projects related to strategy, adaptability, and leadership
4. Keys to consulting success
5. Difficulties in working with consultants
6. The challenges of implementation

7 Finding #1: Linking Organizational Capacity and Programmatic Effectiveness
 Grantees say they can clearly make the link between organizational capacity building and program service outputs or outcomes (quality/reach, strategy, visibility, resources, etc.) in a measurable way.
 89% of 169 grantees said the OE grant had "some," "significant," or "transformational" and measurable impact on program services.
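As a rough back-of-the-envelope figure (our addition, not stated in the deck), the 89% share corresponds to roughly 150 of the 169 responding grantees:

```python
# Back-of-the-envelope headcount implied by the 89% figure (not reported in the deck).
respondents = 169
share_reporting_impact = 0.89  # "some," "significant," or "transformational" impact

print(round(respondents * share_reporting_impact))  # prints 150 (approximate)
```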

8 Finding #1: Linking Organizational Capacity and Programmatic Effectiveness

9  Do these results seem high or low to you? How do you understand the connection between stronger organizations leading to stronger programs, which lead to greater social impact? What are other ways to document these causal connections?
 Given that grantees overwhelmingly indicate OE has a measurable impact on program services, how confident do you need to be that a specific capacity-building intervention led directly to program improvement, and what would constitute valid evidence? Should OE ask grantees to report on program impact in Final Reports?

10 Finding #2: Factors Influencing Project Success
 The research found that capacity-building project success depends more on organizational readiness and adequate resources to implement the OE project than on whether the organization has resources to implement follow-up after the OE project.

11 Finding #2: Factors Influencing Project Success
This finding confused us. According to grantees' final report data, the number-one challenge for OE projects (reported by 23% of grantees) is implementation after a consultant leaves.
How do you interpret these findings? Do you think the OE Program should require grantees to demonstrate completion of a recent organizational assessment before each grant in order to better determine organizational readiness?

12 Finding #3: Greater Impact Associated with Projects Related to Strategy, Adaptability, and Leadership than Those Related to Fundraising Capacity Development
 Grantees that concentrated on improving fund development capacity reported inferior longer-term outcomes compared to those that focused on strategic planning, organizational learning, or leadership succession. They were less likely to have met their grant objectives and described lower levels of sustainability of their grant results, as well as less impact on program services.

13 Finding #3: Greater Impact Associated with Projects Related to Strategy, Adaptability, and Leadership than Those Related to Fundraising Capacity Development
 How can investments in fundraising capacity have greater impact? How can grants focused on fundraising go beyond building technical skills and management abilities, integrate fundraising holistically with organization-wide activities related to learning, adapting, developing leadership, and decision-making, and embed it in the organization's culture and business model?

14 Finding #4: Keys to Consulting Success
 The most important factors contributing to consultant success were "understanding of grantees' unique needs" (34%) and "ongoing communications and trusting relationships" (29%), while "consulting skills" (23%) and "field knowledge" (15%) ranked lower.
Does this finding surprise you? What do you think the most important success factors should be?

15 Finding #4: Keys to Consulting Success
Satisfaction working with an external consultant for capacity building has more to do with the consultant's 1) knowledge of the grantee's field of work (e.g., conservation) and 2) experience with nonprofits than 3) expertise in the specific area of capacity building (e.g., strategic planning), though all three are important for project success.
How do you interpret these results? What, if any, is the role of the Foundation in helping grantees identify and strategically use consultants?

16 Finding #5: Difficulties Working with Consultants
 The top three reported challenges in working with consultants were:
1) the consultant's availability and accessibility;
2) failure to understand the grantee's unique culture and provide customized approaches;
3) failure to deliver high-quality products that meet the grantee's needs and stay on timeline and budget.

17 Finding #5: Difficulties Working with Consultants
 In general, how can nonprofits, consultants, and funders minimize these challenges?
 In particular, how can nonprofits become better consumers of consulting services, how can consultants avoid providing services that are too "cookie cutter," and how can funders help enhance the quality of consulting services?

18 Finding #6: The Challenge of Implementation
 The survey found that the biggest challenge nonprofits face with capacity building is implementing the often first-rate strategies that are devised. Too much grantmaker support for nonprofit organizational development seems to be geared toward "ready, set," and not enough toward "go."

19 Finding #6: The Challenge of Implementation
 How can nonprofits be supported after the "project" through implementation assistance such as ongoing action-oriented learning, peer exchange, coaching, real-time tools, and hands-on support to act on wise counsel and get the good work done?
 How can nonprofits make more time to execute?
 Can more be done up front with grantees to agree on post-project follow-through and implementation expectations?

20 Agenda
 Introduction
 Background on research methodology
 Some interesting specific findings
 Some final general questions for reflection

21 1. Diagnosing Organizational Development Needs
 At this point, the Packard Foundation does informal organizational assessments based on program officers' sense of grantees' capacity-building needs, as well as responding to grantees' own stated OE priorities.

22 1. Diagnosing Organizational Development Needs
 How formal and in-depth should the upfront organizational assessment process be in order to conduct a comprehensive diagnosis and prioritize needs and a critical path for organizational development?
 What should be the respective roles of the nonprofit, consultant, and funder in identifying the type of capacity that should be strengthened?
 How "responsive" or "directive" should a funder be (with respect to determining capacity-building priorities, consultant selection, identifying a group or individual project, etc.)?

23 2. Broad vs. Focused Capacity-Building Approach
Focused approach:
 Currently, the Packard Foundation organizes most of its OE work around a responsive, "focused" capacity-building approach: one capacity at a time, for one year, aimed at organization/network-wide scope and reach. Many OE grantees receive multiple grants over time to focus on different aspects of their capacity.
 Packard not only gets coordinated input from the primary Program so that the "OE project" is successful, but also makes sure the project is a priority for the Program and that the Program will provide ongoing operations support, follow-up, and integration.

24 2. Broad vs. Focused Capacity-Building Approach
Broad approach:
 Some other funders use a broad approach that integrates the program and organizational development work more closely and follows a more holistic, in-depth, and ongoing approach that is less "project" oriented.
What do you think are some pros and cons of these different approaches?

25 3. Supporting Organizational Effectiveness Work that is NOT Subsidized by a Particular Grant
 Most nonprofit organizational effectiveness work is an inside job: staff and board do the work without the aid of consultants or funding geared toward the particular activity.
How can funders, consultants, and others encourage this "inside" work? How can general operating support be better employed to encourage and support this work?

26 4. Next Steps with the Research
 How useful do you think this set of research findings is for the field?
 How could it be more beneficial?
 What are some ways it can be disseminated to those who would find it helpful?
 What are some areas where additional research and analysis might be helpful?

Engage with us: oe.wikispaces.com/OE+Goldmine+Research+Project