Sylvia Galen & Jon Thorsen AASP Regional Symposium January 21, 2015.

The Evolution of Evaluation: What Gets Measured is What Counts

Starting Out
- We work in a bottom-line business
- Demonstrating ROI is a continuous need
- Tying results to the bottom line is critical
- We can be removed from results:
  - Not tied directly to achievements
  - Not controlling results
- This makes the challenge more difficult
- We have opportunities to improve processes
- Data-based decisions are better than the other kind

Assessing the Environment
- Best actors in a supporting role
- Cost centers in an income stream
- Behind-the-scenes workers in an out-front function
- Process-oriented thinkers in a people-focused industry

Determining Relevant Metrics
The most common approach – counting activity:
- Number of visits
- Dollars raised
- Gifts closed
- Profiles written
- Requests fulfilled
- Events held
- Alumni contacted/engaged
- Message click-through rates

Determining Relevant Metrics
A better approach – measuring outcomes:
- Number of visits – leading to a new stage of relationship
- Dollars raised – for priority needs
- Gifts closed – as a % of rated capacity
- Profiles written – supporting a solicitation
- Requests fulfilled – leading to new prospect assignments
- Events held – reflecting ROI
- Alumni contacted/engaged – leading to new volunteer or donor activity
- Message click-through rates – constituent interaction (completing a call to action, responding to a survey, etc.)
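The shift from counting activity to measuring outcomes can be made concrete. A minimal sketch of one of the outcome metrics above, expressing a closed gift as a percentage of the prospect's rated capacity (the dollar figures are invented for illustration):

```python
# Hypothetical sketch: an outcome metric rather than a raw activity count.
# The gift amount and rated capacity below are invented for illustration.

def pct_of_rated_capacity(gift_amount, rated_capacity):
    """Express a closed gift as a percentage of the prospect's rated capacity."""
    return round(100 * gift_amount / rated_capacity, 1)

# A $250,000 gift from a prospect rated at $1,000,000 capacity:
print(pct_of_rated_capacity(250_000, 1_000_000))  # 25.0
```

Reporting "gifts closed at 25% of rated capacity" tells leadership far more than "one gift closed."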

Determining Relevant Metrics
An even better approach – reporting on impact and effectiveness, using "soft" data in addition to numbers:
- Client satisfaction:
  - Survey information on attitudes and perceptions
  - Hard numbers on turnaround time and deadlines
- Skills and competencies:
  - How close is your team, and each individual, to meeting best-practice expectations?
  - Are gaps being closed?
- Reliability and planning:
  - How successful are we in meeting stated goals?
  - Are we staying loyal to our mission and values?

Soft Data Matters, Too
"Not everything that can be counted counts, and not everything that counts can be counted." – Albert Einstein

Planning Ahead
- What do we want to report?
- How will we measure these factors?
  - Counting – pieces, time, dollars, prospects
  - Linking – our office's activity to others' actions
  - Assessing gaps – where we are vs. where we want to be
  - Tracking change and progress – processes improved, new results achieved
  - ROI – how our investments have paid back

Setting the Right Goals & Objectives
- Clear connection to the mission
- Ties to divisional priorities
- A SMART approach:
  - Specific
  - Measurable
  - Achievable
  - Relevant
  - Time-limited

Asking the Tough Questions
- What tasks take the bulk of our time? Are they the right tasks?
- Which clients demand the bulk of our time and resources? Are they the right clients?
- How will we track and report progress?
- How is our responsiveness? How is the turnaround time?
- How is the work used? How is the quality of our work?
- What can we change to improve results?

What Messages Do We Want to Send?
- We need to build our systems to easily pull the data we need to report progress toward goals
- Think ahead to the points we want to make and the information we'll need to demonstrate them:
  - Increasing staff and budget
  - Changing structure or responsibilities
  - Implementing new processes/systems
  - Letting go of outdated processes/systems
- Data should be systemic, clear, and consistent
- Ideally, it comes from the database of record

Ask for the Data We Want (Input) and Need (Output)
Automated ticketing system for all departments:
- In place: Bio Records, Events, Relationship Management, Research, Technical Services
- In progress: Gift Records, Memos of Understanding, Stewardship
- Allows reporting of Key Performance Indicators

Using KPIs: Example – DAR Digital
- Make a Gift
- Register for an Event
- Log In to Access Services
- Update Contact Information
- Sign Up to Volunteer
- Social Advocacy

Suggested Measures for Events
- Getting the chair of a department or other key academic partner on board with the event, to raise the visibility of the department
- Professor installations – honoring the donor who made the professorship possible; the quantity and quality of the donor's network engaged at the event
- Number of alumni engaged further – newly volunteering, giving, attending more events
- Number of contact-information updates
- Additional knowledge about constituents' interest areas
- Increasing volunteer/donor engagement (stage)
- Increasing gifts

Data for Evaluating Staff
- Evaluation tools
- Using skill sets to break down areas for evaluation
- Establishing target performance levels
- Cumulating the gaps
- Addressing the gaps
- Most useful when many people share the same or similar job descriptions
- Can also help define job paths
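The gap-analysis steps above (establishing target performance levels, then cumulating the gaps) can be sketched in a few lines. The skill names and 1–5 proficiency levels below are hypothetical, not from the presentation:

```python
# Illustrative sketch of skills-based gap analysis: compare each skill
# against a target level and total ("cumulate") the shortfalls.
# Skill names and levels are hypothetical examples.

def skill_gaps(actual, target):
    """Return per-skill shortfalls (target minus actual, floored at zero)."""
    return {skill: max(level - actual.get(skill, 0), 0)
            for skill, level in target.items()}

actual = {"database queries": 3, "profile writing": 4, "presentation": 2}
target = {"database queries": 4, "profile writing": 4, "presentation": 4}

gaps = skill_gaps(actual, target)
print(gaps)                # per-skill shortfalls
print(sum(gaps.values()))  # cumulative gap across the skill set
```

Running the same comparison across everyone who shares a job description highlights where training investment would close the most gaps.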

Data in the Strategic Planning Process
- Analysis (where are we now?)
  - Environment
  - Statistics
  - Strengths and weaknesses
  - Opportunities and threats
- Setting direction (where do we want to be?)
- Action and implications
- Evaluation

Data in the Planning Process
- Set goals
- Decide on strategies, objectives, and action steps
- Assess the resource implications of each objective:
  - Need more staff to implement?
  - Need to purchase more tools?
  - Effect on workload and hours?
  - You can't answer these questions unless you know your data!
- Establish priorities – have a process in place
- Manage change

Sample: Project Prioritization Criteria
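The sample criteria themselves are not reproduced in this transcript. As an illustration only, prioritization criteria of this kind are often applied as a weighted score; the criteria, weights, and 1–5 scores below are entirely hypothetical:

```python
# Hypothetical sketch of applying project-prioritization criteria as a
# weighted score. Criteria, weights, and scores are invented examples.

def priority_score(scores, weights):
    """Weighted sum of 1-5 criterion scores."""
    return sum(scores[criterion] * w for criterion, w in weights.items())

weights = {"mission fit": 0.4, "resource cost": 0.3, "expected ROI": 0.3}

projects = {
    "ticketing system": {"mission fit": 5, "resource cost": 3, "expected ROI": 4},
    "event analytics":  {"mission fit": 4, "resource cost": 4, "expected ROI": 3},
}

# Rank projects from highest to lowest priority score.
ranked = sorted(projects, key=lambda p: priority_score(projects[p], weights),
                reverse=True)
print(ranked)
```

Whatever the actual criteria are, writing them down with explicit weights makes the prioritization process transparent and repeatable.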

Data in Determining Progress – Outcomes
- What does success look like?
- How will we know we've achieved it?
- Applying SMART assessments:
  - Specific
  - Measurable
  - Achievable
  - Relevant
  - Time-limited

Tying Results to the Bottom Line
Example: reporting the contributions of the Research & Relationship Management team to GW's largest-ever gift:
- 14 years from first document
- 15 RRM staff, including freelancers
- Starting in FY12: approximately 1,780 hours; 222 work days; 44.5 work weeks
- 3 big gifts
- Donors who are positively impacting future connections, cultivation, and solicitations across GW
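As a sanity check, the effort figures above are internally consistent if one assumes 8-hour work days and 5-day work weeks (an assumption; the slide does not state the conversion):

```python
# Verify the effort arithmetic from the slide, assuming 8-hour work days
# and 5-day work weeks (assumptions, not stated in the presentation).
hours = 1780                  # approximate hours reported
work_days = hours / 8         # 222.5, reported on the slide as "222 work days"
work_weeks = work_days / 5    # 44.5 work weeks, matching the slide
print(work_days, work_weeks)
```

Translating raw hours into days and weeks is a small step, but it frames the team's investment in units that leadership immediately understands.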

Taking the Show on the Road
- Share an annual report publicly
- Develop a dog-and-pony show for standard staff meetings
- Incorporate the materials into orientations for new staff and key clients
- Communicate in ways that will reach our clients
- Get outside our comfort zones if necessary

Getting the Message Across
- Deliver meaning, not buzzwords
- Focus on results, not issues
- Apply the proper perspective:
  - Don't focus on the provider ("Here's all the cool stuff we do")
  - Connect to the user ("Here's how this helps you in your work")
- Remember the decision-makers
- Push information to the audience in ways that will be absorbed and utilized
- Think (and talk) like the client

NOT…

Talking Like Our Clients
- Know how the most vital information is shared (hint: it's usually not the most common vehicle)
- Use the appropriate methods and settings
- Get in people's faces (in the good sense)
- Recruit advocates, testimonials, fans

In the Final Analysis…
- Metrics are even more important for activity that is hard to count or comprehend
- Everyone likes to have clear goals and expectations
- Evaluation, performance management, and professional development are easier and more effective with clear standards and measures

AASP Best Practices: Questions?