Creating an Evaluation Plan
Freya Bradford, Senior Consultant, NorthSky Nonprofit Network
May 23, 2013

How is the landscape changing?

Learning has never been more important
Leadership + Adaptability + Program Capacity = Sustainability (the TCC Group Sustainability Formula)
"Anyone who keeps learning stays young." – Henry Ford

Phases of Evaluation
1. Create Evaluation Plan
2. Collect & Analyze Data
3. Make Meaning & Adapt Program

Evaluation Plan: A written document that describes how you will assess the success of a program or project.

Evaluation Planning Process
1. Convene a Team
2. Create a Logic Model
3. Prioritize & Define Eval Questions
4. Create a Data Collection Matrix
5. Write & Share the Eval Plan

Convene a team
- Evaluator/Facilitator (if you have one)
- Project lead
- Board member/leadership
- Key staff person
- Member of target population

Create a logic model

Prioritize & Define Eval Questions
"Not everything that counts can be counted, and not everything that can be counted counts."
- Measure what matters!
- Think about uses for the data: can you identify a decision that could be made as a result of the data?
- Consider both Process & Outcome questions

Example Questions – NS Coaching Program

Process Evaluation Questions
1) Were project activities conducted as planned? Were the outputs achieved?
2) What are the characteristics of the participants who engaged in coaching?
3) How satisfied are participants with the coaching they received? Did it meet expectations?

Outcome Evaluation Questions
1) To what extent did coaching help participants meet individual and organizational goals?
2) In what ways did participants and organizations change or grow as a result of coaching?

Create a data collection matrix
- What information will be collected? (Indicators)
- From whom? (Sources)
- In what way? (Methods)
- By whom? (Person Responsible)
- How will the information be stored and managed? (Data Management)
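
Each row of the matrix answers those five questions for one indicator. A minimal sketch of one hypothetical row, kept as a Python dict; every value here is invented for illustration (a real matrix usually lives in a shared spreadsheet, one row per indicator):

```python
# One hypothetical row of a data collection matrix.
# All field names and values are illustrative, not from the NS program.
matrix_row = {
    "indicator": "# of participants at intro sessions",
    "source": "Session sign-in sheets",
    "method": "Activity tracking",
    "person_responsible": "Project lead",
    "data_management": "Tally entered into shared spreadsheet monthly",
}

# A full matrix is then just a list of such rows, one per indicator.
print(matrix_row["indicator"])
```

Keeping the five column names fixed across rows makes it easy to spot gaps, e.g. indicators with no assigned person responsible.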

Indicators
An indicator is a piece of data that you will collect to help tell you whether you are achieving what you intended.
- Can be quantitative or qualitative
- You may have multiple indicators to measure one part of an Activity or Outcome

Common Activity (Process) Indicators
- FTE/volunteer hours contributed
- # of partners
- Amount of products/services delivered
- #/type of clients served
- #/type of materials produced/disseminated
- Timeliness of service provision
- Quality of services (satisfaction data)

Common Outcome Indicators
- #/% demonstrating increased knowledge/skill
- #/% demonstrating attitude or behavior change
- % changes in conditions (longer-term)

Indicators Example
Activity of Coaching: Two (2) 45-minute introductory information sessions will be led by the coach. They will be promoted through the NS weekly update.
Indicators to measure progress:
- # of times promoted in weekly update
- # clicked through
- # of participants at intro sessions
- Participant demographics – org. role, tenure in position

Indicators Example
Outcome of Coaching: Participants will improve in individual development goal areas (e.g. managing others, self-awareness, self-management, work-life balance).
Indicators to measure progress:
- % of participants improving by goal area
- % of participants meeting and exceeding desired goal
- % change in goal achievement scores overall and by goal type
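
Once pre/post goal ratings are collected, the percentage indicators above reduce to simple arithmetic. A minimal sketch in Python; the rating scale, field names, and sample records are all invented for illustration, not actual NS coaching data:

```python
# Each record: one participant's self-rating (1-5) on one goal area,
# before and after coaching, plus their desired target rating.
# Sample data is invented for illustration.
records = [
    {"goal": "managing others", "pre": 2, "post": 4, "target": 4},
    {"goal": "managing others", "pre": 3, "post": 3, "target": 4},
    {"goal": "self-awareness",  "pre": 2, "post": 3, "target": 3},
    {"goal": "self-awareness",  "pre": 1, "post": 4, "target": 3},
]

def pct(part, whole):
    """Percentage, rounded to one decimal; 0.0 when there is no data."""
    return round(100 * part / whole, 1) if whole else 0.0

# % of participants improving, overall
overall_improving = pct(
    sum(1 for r in records if r["post"] > r["pre"]), len(records))

# % improving and % meeting/exceeding the desired goal, by goal area
by_goal = {}
for r in records:
    by_goal.setdefault(r["goal"], []).append(r)
goal_stats = {
    goal: {
        "pct_improving": pct(sum(1 for r in rs if r["post"] > r["pre"]), len(rs)),
        "pct_met_goal": pct(sum(1 for r in rs if r["post"] >= r["target"]), len(rs)),
    }
    for goal, rs in by_goal.items()
}

# % change in goal achievement scores overall
pre_total = sum(r["pre"] for r in records)
post_total = sum(r["post"] for r in records)
overall_pct_change = pct(post_total - pre_total, pre_total)

print(overall_improving)                               # 75.0
print(goal_stats["self-awareness"]["pct_improving"])   # 100.0
print(overall_pct_change)                              # 75.0
```

The same tallies work for process indicators (counts of sessions, click-throughs, participants); only the numerator and denominator change.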

Methods
- Activity tracking
- Document review
- Surveys
- Tests
- Focus groups
- Interviews
- Case studies

Methods – When to Use a Survey
- You need a little info about a lot of things
- You have a large number in your target population
- Numbers are important to your decision making
- Respondents might not feel comfortable talking about answers in a group
- You need information quickly

Methods – Tips on Surveys
- Have someone available who can help with survey construction
- Use a variety of question types – not too many open-ended
- Consider using an online survey
- Pilot test the survey with a group similar to your target population
- Offer incentives!

Methods – When to Use a Focus Group
- You have deep or complex issues you want to understand more completely
- You want to develop some preliminary understanding of the issues surrounding your topics
- You want to hear people's deep feelings or insights about your topics
- Some questions need to be explained in detail or probed to elicit good feedback
- People need time to ponder questions before responding

Methods – Tips on Focus Groups
- Use a trained facilitator who is as neutral as possible
- Use a recording device (with permission) and/or a note taker in addition to the facilitator
- Limit groups to 6–12 participants
- Give participants background information before the session
- Carefully plan questions – 3–4 in-depth questions, with planned probes if needed
- Offer incentives!
- Send thank-you notes

Write & Share the Eval Plan

Evaluation Plan Outline
I. Introduction
II. Evaluation Design & Questions
III. Evaluation Methods
   - Method description
   - Indicators measured with the method
   (Repeat for each method)
IV. Use of Findings
Attachments:
- Timeline for Evaluation Implementation
- Logic Model

Resources
NorthSky Resource Center – this presentation and other free resources, including:
- University of Wisconsin-Extension, Program Development and Evaluation Unit
- United Way Outcome Measurement Resource Network
- W.K. Kellogg Foundation Logic Model Handbook