
1 Evaluating Your Campaign: So Much More Than Just Meeting the Goal. Presented at: Minnesota Planned Giving Conference, November 4, 2009

2 Your presenters: Keith Christensen '80, Luther College, Vice President for Development; Ann Sponberg Peterson, Luther College, Director of Development for Principal Gifts and Campaign Co-Director

3 Some Introductions: Our personal and professional backgrounds. Luther College and our recent Higher Calling Campaign. Why we chose to evaluate this campaign effort.

4 Why Evaluate? Our Rationale. We learned that very few organizations evaluate their campaigns. We wanted to study our reach, our case for support, and our staff and volunteer performance, and to ask our stakeholders their opinions of our efforts. What follows is how we managed this evaluation process and what we learned.

5 Where we started. We spent some quality time with our original philanthropic market study, prepared for us by Bentz Whaley Flessner. This helped form some of our "key framing" questions. Since the study was based on hard development data, we felt we could ask timely and pointed questions based on some of its assumptions.

6 7 months and counting... The campaign was due to end on December 31, 2007. We began evaluating in July 2007. We presented a final report to the Board of Regents in February 2008. We're not statisticians (!), and we wanted unique and personal responses, so we asked for written feedback and opinions. Our format was a (non-razzle-dazzle) survey snail-mailed to key constituents.

7 Who we surveyed:
Campaign Cabinet (20)
Development & Alumni Relations Staff Team (24)
Area Campaign Event Hosts (168)
Board of Regents (30)
Luther held 38 campaign events across the country from 2006 to 2008. Each event had six or more host couples, who were usually major donors or key stakeholders.

8 A range of pointed questions regarding: Expectations and responsibilities. The Case. Obligation vs. Joy. Understanding of a comprehensive campaign. Staff preparedness & capacity. Future fundraising potential. Event presentations & follow-through. Outright vs. Planned Giving. Budget for the campaign. Staff vs. Volunteer: moving toward a new Development Committee Model.

9 Key Framing Questions. Did we have a convincing Case for Support, and how well did we articulate it? Was the campaign goal appropriate? Did we have the infrastructure and campaign management in place, and how did our key leaders (staff and volunteers) perform? How do others feel we are poised for the future?

10 And the surveys said? A recitation of some verbatim responses from constituents.

11 Question: "Luther will surpass the financial goals set forth in the campaign." Did we, in your opinion, have a convincing "Case for Support"? Was our message compelling? In what ways was "the case" convincing, or not convincing?

12 Question: What aspects of the campaign were most appealing to you? Do you have a particular passion for one type of giving over another? When the college was striving for 100% Board giving to the campaign, 100% participation in the Annual Fund, and 100% giving to the new Science Center, was giving to one or another easier? Was one a joy and another an obligation?

13 Question: From your vantage point, was Luther staffed well enough for this campaign? Did we have the infrastructure in place for success? Were you ever worried about our ability to succeed? Did it seem as though we were working with an appropriate budget?

14 Question: What did you think of the $90 million goal ($57.25 million in outright gifts + $31.75 million in planned gifts)? Did you believe this goal was attainable or was it a “stretch” in your mind? Did your opinion change at any time during the course of the campaign? If so, how did it change, and what prompted your change of thinking? Do you think we had the right balance between outright and deferred giving?

15 Our response rates.
Campaign Cabinet (July): 75% response rate
Development/Alumni Staff Team (August): 100% response rate
Area Campaign Event Hosts (Fall): 25% response rate
Board of Regents (Year End): 45% response rate

16 Our report to the Board. We explained the evaluation process and why it was critical to ask such questions. We shared actual commentary. We shared our summary assumptions of what went well, what did not go so well, how this informed our management efforts, and what we planned to change. As a result of the surveys, we targeted key concerns the Board should address as Luther moved toward the next campaign.

17 In summary: what went well.
Our surveys revealed we "made the case," though respondents also acknowledged that those surveyed were "in the choir."
Great appreciation for development staff efforts, and belief that the budget "seemed" adequate.
Unanimous agreement that having President Rick Torgerson (a trained higher-education development professional) at the helm was a huge advantage.
Approval of our Campaign DVD, which had great emotional impact.
Acknowledgement that Area Campaigns may be an outdated model, but that they work for Luther.
Overall enthusiasm for the priorities of the campaign and how they were articulated.

18 Continued:
Respondents implied that "we sure did well" despite our perceived not-very-wealthy alumni base.
Staff commented that they felt more accountability in this campaign and that systems, reporting, and technology were put to better use.
Area hosts submitted new referrals of prospects.
General support for a proposed new Development Committee using volunteer solicitors.
Belief that the campaign goal was appropriate.

19 In summary: what did not go so well. There was little or no peer-to-peer donor solicitation by Regents and the Campaign Cabinet, which they all readily acknowledged; this set the stage for our transition to the "new model." We need better clarification and communication of volunteer roles, specific expectations, and follow-through. We should have managed the Campaign Cabinet better.

20 Continued: We learned that not all development officers can manage area campaign events. To save on costs, we never developed campaign letterhead or specialized invitations, so our area campaign event invitations were easily lost in the shuffle. We never did get that original lead gift, though most of our key constituents knew this and had great empathy.

21 Response to the Evaluation. Timing is Everything! Our report to the Board of Regents was in February 2008. The Luther Regents weighed in with comments about the markets and the economy; caution was in the air. Tempered goal-setting was the result.

22 In Summation: This was a new experience for Luther College, and we're glad we embarked on the effort. The evaluation process turned into a worthy donor-cultivation tool and an opportunity for more engaged dialogue with our best supporters. Our Regents were more fully prepared and informed when it came to authorizing the next funding initiative.

23 Questions? We’re happy to share: chriskei@luther.edu petean03@luther.edu

