2014 AmeriCorps State and National Symposium Evaluation: Where We’ve Been and Where We’re Going.


1 2014 AmeriCorps State and National Symposium Evaluation: Where We’ve Been and Where We’re Going

2 Presenters
– Carla Ganiel, Senior Program & Project Specialist, AmeriCorps State and National
– Emily Steinberg, Associate Director, AmeriCorps*Texas, OneStar Foundation
– Stephen Plank, Director of Research and Evaluation, CNCS
– Bethanne Barnes, Program Examiner, Office of Management and Budget

3 AmeriCorps State and National: Lessons Learned Carla Ganiel Senior Program & Project Specialist cganiel@cns.gov

4 Strategies for Building Evidence
– Include evidence in assessment criteria for competitive applicants
– Ensure that grantees/subgrantees are on track with evaluation plans
– Build evaluation capacity through training and technical assistance for national grantees, state commissions, and state subgrantees

5 Grant Application Review Process (GARP)
– Levels of evidence outlined in the NOFO
– Points awarded based on strength of evidence: strong evidence = 8 points; no evidence = 0 points
– Evaluation reports were reviewed concurrently with the staff review and informed the level-of-evidence assessment

6 Competitive Grantees by Level of Evidence
Level – Number of New/Recompete Grantees (Percent of New/Recompete Grantees)
– Strong: 7 (7%)
– Moderate: 21 (22%)
– Preliminary: 26 (28%)
– Pre-Preliminary: 39 (41%)
– No Data: 1 (1%)

7 Evaluation Reports
– Reviewed by CNCS Office of Research & Evaluation
– Staff did not review reports from applicants that were not required to submit an evaluation
– Evaluations were not scored
– Evaluation results were considered in determining an applicant's level of evidence

8 Evaluation Review
Reviewers assessed:
– Alignment of Models
– Methodological Quality
– Strength of Findings
– Recency
– Compliance with ASN evaluation requirements

9 Large Applicants ($500,000+)
– Total evaluations reviewed: 10
– Met evaluation requirements (type of study): 3
– Rated satisfactory quality: 4
– Met evaluation requirements AND rated satisfactory quality: 2

10 Small Applicants (less than $500,000)
– Total evaluations reviewed: 31
– Met evaluation requirements (type of study): 31
– Rated satisfactory quality: 12
– Met evaluation requirements AND rated satisfactory quality: 12

11 Evaluation Reports Submitted
Type of Study: # Large Applicants / # Small Applicants / Total
– Experimental: 3 / 4 / 7
– Quasi-Experimental: 0 / 4 / 4
– Non-Experimental: 5 / 6 / 11
– Process/Implementation: 2 / 17 / 19
– Total: 10 / 31 / 41

12 High Quality Evaluations

13 Lessons Learned
– Less than a third of the portfolio has moderate or strong evidence
– Few evaluations met both ASN evaluation requirements and quality standards
– Some small applicants are conducting impact evaluations
– More in-depth review of evidence by external experts, focused on quality rather than quantity of studies

14 Evaluation Technical Assistance
– One-on-one technical assistance provided to large ($500,000+) national grantees and state subgrantees
– TA focused on developing a plan to conduct an evaluation that would meet CNCS requirements: Quasi-Experimental Design (Comparison Group) or Randomized Controlled Trial (Control Group)

15 Lessons Learned
– Lack of clarity about the difference between performance measurement and evaluation
– Many grantees/subgrantees did not understand CNCS evaluation requirements
– Some grantees did not take full advantage of TA offered

16 Lessons Learned
Grantees/subgrantees need:
– A clear timeline for completing an evaluation within a three-year cycle
– Guidance on budgeting for evaluation
– Guidance on how to hire an external evaluator

17 Lessons Learned
– At the time of grant award, most grantees have considerably more planning to do before they can meet CNCS evaluation requirements
– Certain program designs face challenges when using a quasi-experimental or experimental design
– Alternative evaluation approaches may be acceptable for some grantees (pilot process)
– The conversation must shift from “evaluation as compliance” to “evaluation to strengthen the program and build evidence.”

18 Evaluation Core Curriculum
Completed courses:
– AmeriCorps Evaluation Requirements
– Evaluation 101
– Basic Steps in Conducting an Evaluation
– Overview of Evaluation Designs
– How to Develop a Program Logic Model
– How to Write an Evaluation Plan
– How to Budget for an Evaluation
https://www.nationalserviceresources.gov/evaluation-americorps

19 Next Steps
– Alternative Evaluation Approach Pilot Process
– Feedback to all small 2014 competitive grantees that were required to submit evaluation plans
– One-on-one TA for large 2014 competitive grantees that were required to submit evaluation plans
– Dialogue with commissions on how to build capacity to work with subgrantees
– Develop additional core curriculum courses
– Identify needs of tribal grantees

20 CNCS Grantee Symposium, September 17, 2014 Emily Steinberg, Director of National Service Programs

21 Why evaluation matters
– Accountability
– Evaluative Learning
Our evaluative evolution
– Then & Now
– Capacity Building Evaluations (Other State/Federal Projects)
– Direct Service Evaluations (AmeriCorps)
– Lessons Learned
Current philosophy and approach

22 DUAL ROLE
Funder: Texas State Service Commission
– $8-13 million in AmeriCorps grants/year
– Responsibility to evaluate grantees and portfolio
– Federal reporting requirements
Nonprofit: Governor’s Office of Faith-Based and Community Initiatives
– Think Tank/Incubator
– Responsibility to evaluate self and initiatives
– State/private/internal reporting requirements

23 What is evaluation? … a method for determining what change occurred, and how, why, and for whom that change happened.

24 EVALUATION IS… Important for:
– Accountability/Stewardship
– Learning about Impact – Driving Program Design
– Communicating Impact – to staff, board, volunteers, funders, and community stakeholders
– Multiplying your Impact – increasing the knowledge base of the field

25 ACCOUNTABILITY VS. LEARNING
Accountability
– Did you do what you said you’d do?
– Was the program implemented as designed?
– Cost-benefit analysis – R.O.I. (see the worked example below)
Evaluative Learning
– Game film vs. scoreboard
– What outcomes were achieved, and for whom?
– Did outcomes differ across sub-groups?
– What made the difference?
– Which specific program aspects achieved the most/least impact?
(Peter York, TCC Group)
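As a worked illustration of the cost-benefit/R.O.I. idea above (the figures are invented for the example, not drawn from any OneStar grantee): a program that costs $200,000 and produces $250,000 in monetized benefits has R.O.I. = ($250,000 - $200,000) / $200,000 = 0.25, or 25%.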

26 LEARNING AND LEADERSHIP
Learning is key to leadership. Good leaders are good learners:*
– Only one in four nonprofit organizations is “well led”
– Only one in four nonprofits is an effective “learner”
*York, P. (2010). “Evaluative Learning.” TCC Group: Philadelphia, PA.

27 HISTORY: Our Evaluation Trajectory
2003-04: OneStar became a 501(c)(3) commission
2004-2008: Blissful ignorance / empty enforcement
– Began implementing national performance measures
– Focused more on capacity-building initiatives & evaluation
– Did not have a framework for reviewing AmeriCorps subgrantees’ evaluation plans and final reports (honor system)

28 HISTORY: Our Evaluation Trajectory…
2006-2007: Compassion Capital Fund (federal)
– FBCI: focused on grants to small faith-based and community organizations (FBCOs) ($25K)
– Results: some organizations performed better than others – WHY?
2007-2008: Rural Texas Demonstration Project (state)
– Applied learning to small FBCOs in rural Texas ($10-12K)
– Results: developed indicators of success and critical milestones: readiness to change, trust/rapport, minimum capacity

29 HISTORY: Our Evaluation Trajectory
2008: FBCI Capacity Building Project (state)
– Statewide grantees; honed evaluation and success indicators
– Studied organizational assessment tools; selected the Core Capacity Assessment Tool (CCAT) by TCC Group
2009-2012: AmeriCorps Statewide Evaluation (state)

30 STATEWIDE EVALUATION OVERVIEW
Multi-year (2009-2012)
Competitive RFP
– Selected external evaluator: UT-Austin RGK Center
Based on previous research
– Longitudinal AmeriCorps study by Abt Associates
– Organizational capacity assessments – TCC and RGK
Instruments
– Program director surveys
– AmeriCorps member surveys
– Organizational capacity surveys

31 EVALUATION DESIGN PROPOSAL:
– Analyze the value added of AmeriCorps to Texas
– Assess the organizational/management structures most commonly associated with impactful, value-adding programs
– Assess the different kinds of impact that AmeriCorps programs have on communities

32 EVALUATION MODEL (diagram): Organizational Structure and Program Management Characteristics; Member Assessment of Community Impact and Value; Value Added of AmeriCorps in Communities

33 EVALUATION COMPONENTS
Surveys
– AmeriCorps Program Managers
– AmeriCorps Members
Database of Organizational & Financial Indicators
– Organizational Capacity Surveys
– Grantee Budgets
– 990 Forms
– Organizational Annual Budgets/Financial Reports
Case Studies
– Conducted four (4) site visits

34 FINDING #1: Overall, AmeriCorps Service Has a Positive Impact
Percent of survey respondents choosing the top response (Program Managers and Members, 2009-10 and 2010-11 terms, values as shown on the slide):
– Program/service is “very effective”: 79%, 83%, 63%
– Members “make an important contribution”: 96%, 91%, 75%
– “A lot of change” is observed in the clients served: 67%, 70%, 51%

35 FINDING #2: Most Programs Achieve Goals
– 83% of 2009-10 and 91% of 2010-11 respondents report meeting or exceeding most of the goals reported to OneStar
– Statistically significant differences (p=0.045) exist between organization types (an illustrative test sketch follows this slide)
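The slide reports p = 0.045 without naming the test behind it. As a minimal sketch only, not the evaluators' actual analysis, a difference in goal attainment across organization types could be checked with a chi-square test of independence; every count and category label below is invented for illustration.

```python
# Hypothetical sketch of a goal-attainment comparison across organization types.
# All counts and category labels are invented; they are NOT the OneStar evaluation data.
from scipy.stats import chi2_contingency

# Rows: organization types; columns: [met or exceeded most goals, did not]
observed = [
    [34, 4],   # hypothetical type A (e.g., education-focused programs)
    [22, 8],   # hypothetical type B (e.g., health/human-services programs)
    [15, 9],   # hypothetical type C (e.g., environment/other programs)
]

chi2, p_value, dof, _expected = chi2_contingency(observed)
print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p_value:.3f}")
# A p-value below 0.05 (the slide reports p = 0.045) would be read as a
# statistically significant difference in goal attainment by organization type.
```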

36 FINDING #3: Most Members Believe in Long-Term Impact of Their Service

37 FINDING #4: Three Program Characteristics Are Tied to Member Assessment of Service Impact
Program Characteristics: Service Clarity, Orientation Quality, Communication Quality

38 FINDING #5: Service Has a Positive Financial Impact on Most Organizations…

39 …and a Positive Economic Impact on Communities
Total net value per year: over $10 million
*Program manager assessment of value has been regionally adjusted. Numbers in the chart represent averages for the 2009-10 and 2010-11 terms.

40 LESSONS LEARNED
– Evaluating more than one AmeriCorps program at a time is like comparing apples and oranges.
– You may pay a lot of $$$ to learn things that seem somewhat predictable or ordinary.
– Be ready to edit the final product. Heavily.
– Not all evaluative data is created equal (‘Perceived’ vs. ‘Actual’ Value).
– Case studies are difficult to do well.

41 CURRENT PHILOSOPHY + APPROACH
– Back to basics: you have to get ‘down in the weeds’ at some point – we have to learn each grantee’s program design at least to a basic extent, to empower them to make smart evaluation decisions and to ask the right questions.
– Evaluation begins at accountability, but it should really drive decision-making – both for funders AND grantees.
– Good leadership is what drives a culture of evaluative learning – make it part of everything we do, from the top down.
– ‘Practice what you preach’ to grantees + ‘There’s always room to grow’ (annual grantee survey, training event surveys, member experience and inclusion survey)

42 CURRENT TOOLS + RESOURCES GRANTEE RESOURCE LIBRARY: http://onestarfoundation.org/americorpstexas/grantees/grantee-resources/

44 Contact Information Emily Steinberg – Director, National Service Programs emily@onestarfoundation.org

45 www.onestarfoundation.org

46 CNCS Office of Research & Evaluation: Recent Path, and Coming Steps Stephen Plank, Ph.D. Director of Research & Evaluation splank@cns.gov

47 Overview
– Who we are
– What we do
– What we strive to do in support of AmeriCorps State and National grantees
– Recent accomplishments
– Next priorities & activities

48 About Our Team
– 8 full-time members, plus talented interns, an on-site consultant, & contracted partners
– Strong working relationships with AmeriCorps State and National program leadership
– Expertise in program evaluation, measurement, experimental & quasi-experimental design, community psychology, sociology, economics, and public policy

49 Our Mission
– To build the evidence base for national service programs
– To facilitate the use of evidence within CNCS & among its grantees and subgrantees
– Our vision of what evaluation & evidence can be at their best…
– Our view of what evaluation should not be about…

50 What We Strive to Do in Support of AmeriCorps Grantees
– Modeling & supporting best practices in evidence & evaluation
– Participating in dialogue & problem-solving
– Participating in the grant-making & evaluation-planning process, as appropriate
– Communicating findings & lessons learned to multiple audiences, with clarity & practicality as watchwords

51 Our Portfolio
– R&E about, and directed toward, the national service field
– Grantee support: technical assistance, “evidence exchange,” capacity-building tools
– Helping CNCS, as an agency, use evidence & evaluation more effectively: GARP, organizational accountability, learning strategies

52 Highlights from the Past Year – R&E with AmeriCorps State and National
– Minnesota Reading Corps evaluations
– School Turnaround AmeriCorps evaluation – Design Phase
– AmeriCorps Exit Survey – revisions, focus on response rates and utilization
– Grantee evaluation training and technical assistance
– Evaluation Bundling Project – Pilot Phase

53 For the Coming Year and More – R&E with AmeriCorps State and National
– School Turnaround AmeriCorps evaluation
– justice AmeriCorps evaluation
– R&E in support of other national service partnerships
– Grantee evaluation training & technical assistance
– Evaluation bundling project
– Additional projects, building on past lessons learned

54 Bethanne Barnes, Program Examiner, Office of Management and Budget

55 Q&A

