Outcome Logic Model Guide


This is the outcome logic model form. A blank version of the model is available as a Word document on The Community Foundation's website: http://www.tcfrichmond.org/receive/apply-for-a-grant/

The Outcome Logic Model illustrates the relationship between the efforts and resources your organization's staff invest in your program and the outcomes your program achieves. A successful model gives our Program Officers insight into how your program works and how your staff monitor its progress. The model is intended to organize metrics your program staff are already collecting in a way that is easily readable for our staff, not to compel your nonprofit to measure more things. A measure is only useful and informative to us if it is useful and informative to your organization. The following pages walk you through the process of completing the model.

Long-term outcomes describe your ambitious vision for your clients, and they are usually hard to measure. For example, an afterschool program providing wrap-around services to middle school students might have a long-term outcome that "children will succeed in school."

Intermediate outcome indicators are usually easier to track than long-term outcomes. These are the short-term program outcomes your staff monitor to track progress toward long-term goals. Measures of program outcomes speak to observable changes in clients':

- Knowledge
- Behavior
- Attitudes
- Resources
- Health

The Immediate Performance & Quality indicators are the short-term program quality and performance metrics your organization tracks to ensure your program is on its way to achieving its short-term outcomes and long-term goals. Example performance measures speak to:

- The quantity and quality of resources invested in the program
- The breadth and depth of services provided
- Client satisfaction
- Client retention/attrition
- Program scale

The performance measure column is where you list exactly how you track and measure program quality. These measures should imply a number (e.g., average __, # __, % __). For example:

- % Children meeting with a counselor 3x a week
- % Parents satisfied with the program
- Program attrition rate
- Average # hours children spend in tutoring
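If it helps to see the arithmetic behind such measures, here is a minimal Python sketch. The field names and the sample records are invented for illustration; your own data system will look different.

```python
# Hypothetical program records; every field name and value here is made up.
records = [
    {"counselor_sessions_per_week": 3, "tutoring_hours": 4.0, "completed": True},
    {"counselor_sessions_per_week": 2, "tutoring_hours": 6.5, "completed": True},
    {"counselor_sessions_per_week": 3, "tutoring_hours": 3.0, "completed": False},
    {"counselor_sessions_per_week": 4, "tutoring_hours": 5.5, "completed": True},
]

n = len(records)

# % children meeting with a counselor at least 3x a week
pct_counselor = 100 * sum(r["counselor_sessions_per_week"] >= 3 for r in records) / n

# Program attrition rate: share of enrolled children who did not complete
attrition_rate = 100 * sum(not r["completed"] for r in records) / n

# Average # hours children spend in tutoring
avg_tutoring = sum(r["tutoring_hours"] for r in records) / n

print(f"% meeting counselor 3x/week: {pct_counselor:.0f}%")  # 75%
print(f"Attrition rate: {attrition_rate:.0f}%")              # 25%
print(f"Average tutoring hours: {avg_tutoring:.2f}")         # 4.75
```

Each computed value maps directly to one row of the performance measure column.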

Outcome Measures indicate how you measure changes in client knowledge, behavior, attitudes, resources, or health. As with performance measures, these measures explain the numbers you'll enter in the "Last Year's Results" and "This Year's Target" columns. Examples:

- % of Children attending ≥95% of school days
- % Students maintaining a 3.0+ GPA or improving their GPA by 1 pt.
- % Students with fewer than one suspension per year
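The outcome measures above follow the same pattern: count the clients who meet a threshold, divide by the total. A short Python sketch, again with invented field names and sample data:

```python
# Hypothetical student records; every field name and value here is made up.
students = [
    {"attendance": 0.97, "gpa": 3.2, "prior_gpa": 3.0},
    {"attendance": 0.91, "gpa": 2.8, "prior_gpa": 1.5},
    {"attendance": 0.96, "gpa": 2.4, "prior_gpa": 2.3},
]

n = len(students)

# % of children attending at least 95% of school days
pct_attendance = 100 * sum(s["attendance"] >= 0.95 for s in students) / n

# % maintaining a 3.0+ GPA or improving their GPA by at least 1 point
pct_gpa = 100 * sum(
    s["gpa"] >= 3.0 or (s["gpa"] - s["prior_gpa"]) >= 1.0 for s in students
) / n

print(f"% attending >=95% of days: {pct_attendance:.1f}%")
print(f"% meeting the GPA measure: {pct_gpa:.1f}%")
```

Note that the second measure is a compound condition (maintain *or* improve), which is why it needs an `or` rather than a single threshold.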

In the Last Year's Results columns, report the most recent statistic for each performance and outcome measure. Understanding how your program has performed in the past helps us put your targets for the current year in context. As before, enter only numbers in these cells (unless it's a new measure, in which case you can enter "baseline year").

In the This Year's Target columns, report your goals for the grant year for each measure. This year's target doesn't necessarily need to be "higher" or more ambitious than last year's results. Rather, the data in these columns should give our staff a sense of what we can reasonably expect from an investment in your program.

In the Data Source & Methods columns, explain how and how often you collect data for each indicator. This information helps us put your results in context and judge whether a statistic is reliable or should be interpreted with caution.

A completed model might look something like this. The form is meant to give you the flexibility you need to enter metrics that your program is already tracking. You can provide as many measures as you like. Feel free to delete or add rows as needed. As a guideline, we usually prefer to see at least two performance measures and one outcome measure.

Please feel free to contact me if you have questions or need assistance.

Kaitlyn Wark
Program Evaluation Officer
(804) 330-7400 ext. 157
kwark@tcfrichmond.org