Changing the Game: The Logic Model


The Government Performance and Results Act of 1993 (Public Law 103-62), often called the Results Act. Budget deficits and an American public demanding a leaner, less costly government prompted a re-examination of what agencies do and of the costs involved. The Act is intended to build consensus among agencies, Congress, and customers on program goals, strategies, and appropriate measures of success.

Accounting for Success Previously, federal agencies used funding allocations, the level of staff deployed, or the number of tasks completed as measures of performance. Today's environment requires a different orientation, one that focuses on results: agencies are held accountable less for inputs and outputs than for outcomes. For example, a federal employment training program traditionally measured its success by the number of training participants (an output). Under GPRA, a more meaningful measure is the change in the wage levels of its graduates (an outcome). A more general name for this is value-added outcomes.

Performance Measurement A means of assessing progress against stated goals and objectives in a way that is unbiased and quantifiable, with an emphasis on objectivity, fairness, consistency, and responsiveness. Performance measurement functions as a reliable indicator of an organization's long-term health, and its impact on an organization can be both immediate and far-reaching.

What does success really mean? GPRA defines success in terms of outputs and outcomes, and it requires managers to examine how operational processes are linked to goals. Program performance is evaluated not on the basis of the amount of money spent or the types of activities performed, but on whether a program has produced real, tangible results. GPRA requires each federal agency to produce a strategic plan covering at least five years. Intended to be the starting point for each agency's performance measurement efforts, these strategic plans should:
1. Include the agency's mission statement
2. Identify the agency's long-term strategic goals
3. Describe how the agency intends to achieve those goals through its activities and through its human, capital, information, and other resources.

Agency Missions The mission statements required by GPRA strategic plans are designed to bring agencies into sharper focus: why the agency exists, what it does, and how it does it. The strategic goals that follow should be an outgrowth of this clearly stated mission. Only when an agency has a true sense of who it is can it align its activities to support mission-related goals and make linkages between levels of funding and their anticipated results. The Federal Emergency Management Agency (FEMA), for example, had traditionally concentrated its efforts on post-disaster assistance. By reexamining its mission, FEMA concluded that all emergencies share common traits and pose common demands, and should therefore be approached functionally. With this new understanding in hand, FEMA restructured its programs and instituted an "all-hazard" mission that takes a multifaceted, sequential approach to managing disasters: mitigation, preparedness, response, and recovery.

A logic model is… A depiction of a program showing what the program will do and what it is intended to accomplish. A series of "if-then" relationships that, if implemented as intended, lead to the desired outcomes. The core of program planning and evaluation.

Simplest form: INPUTS → OUTPUTS → OUTCOMES
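The chain above, and the "if-then" reading of a logic model, can be sketched as a small data structure. This is purely illustrative: the program elements listed here are hypothetical examples, not part of GPRA or any agency's actual model.

```python
# A logic model in its simplest form, as a dictionary mapping each stage
# to its elements. The entries are hypothetical examples.
logic_model = {
    "inputs":   ["staff time", "funding", "training materials"],
    "outputs":  ["workshops held", "participants trained"],
    "outcomes": ["participants adopt new practices", "conditions improve"],
}

# The "if-then" reading: if the inputs are invested, then the outputs are
# produced; if the outputs reach the audience, then the outcomes follow.
stages = ["inputs", "outputs", "outcomes"]
for earlier, later in zip(stages, stages[1:]):
    print(f"IF {', '.join(logic_model[earlier])} "
          f"THEN {', '.join(logic_model[later])}")
```

The point of the structure is the ordering: each stage is a claimed consequence of the one before it, which is exactly the chain an evaluator tests.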

LOGIC MODEL: the two words defined. LOGIC: the principles of reasoning; being reasonable; the relationship of elements to each other and to a whole. MODEL: a small object representing another, often larger object (it represents reality; it isn't reality); a preliminary pattern serving as a plan; a tentative description of a system or theory that accounts for all its known properties.

Logic model, aka: theory of change, program action, model of change, conceptual map, outcome map, program logic.

Era of accountability
- What gets measured gets done.
- If you don't measure results, you can't tell success from failure.
- If you can't see success, you can't reward it.
- If you can't reward success, you're probably rewarding failure.
- If you can't see success, you can't learn from it.
- If you can't recognize failure, you can't correct it.
- If you can demonstrate results, you can win public support.

What a logic model is not… It is not a theory, not reality, and not an evaluation model or method. It is a framework for describing the relationships between investments, activities, and results, and it provides a common approach for integrating planning, implementation, evaluation, and reporting.

Assumptions Assumptions underlie much of what we do, and it is often these underlying assumptions that hinder success or produce less-than-expected results. One benefit of logic modeling is that it helps us make our assumptions explicit: the beliefs we have about the program, the participants, and how the program will work. These include ideas about:
- the problem or existing situation
- program operations
- expected outcomes and benefits
- the participants: how they learn and behave, and their motivations
- resources
- staff
- the external environment and its influences
- the knowledge base, etc.

Indicators How will you know it when you see it? What will be the evidence? What are the specific indicators that will be measured? Indicators are often expressed as numbers (#) and percentages (%), and they can be qualitative as well as quantitative.

Logic model with indicators for outputs and outcomes (farmer-training example)
Chain: Program implemented → Targeted farmers attend → Farmers learn → Farmers practice new techniques → Farm profitability increases
Indicators, step by step:
- number of workshops held; quality of workshops
- number and percent of farmers attending
- number and percent who increase knowledge
- number and percent who practice new techniques
- number and percent reporting increased profits; amount of increase
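The "number and percent" indicators above are simple counts over participant records. A minimal sketch, using invented records and field names for the farmer-training example (none of this data comes from the original program):

```python
# Hypothetical participant records for the farmer-training example.
records = [
    {"attended": True,  "gained_knowledge": True,  "adopted": True},
    {"attended": True,  "gained_knowledge": True,  "adopted": False},
    {"attended": True,  "gained_knowledge": False, "adopted": False},
    {"attended": False, "gained_knowledge": False, "adopted": False},
]

def indicator(records, key):
    """Return (number, percent) of records where `key` is true."""
    n = sum(1 for r in records if r[key])
    pct = 100.0 * n / len(records)
    return n, pct

# Report each step of the outcome chain as "number and percent".
for key in ("attended", "gained_knowledge", "adopted"):
    n, pct = indicator(records, key)
    print(f"{key}: {n} ({pct:.0f}%)")
# Prints: attended: 3 (75%) / gained_knowledge: 2 (50%) / adopted: 1 (25%)
```

Note the funnel shape of the results: each later outcome is reached by fewer participants, which is typical of a logic model chain and is exactly what the step-by-step indicators are meant to reveal.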

Methods of data collection
SOURCES OF INFORMATION:
- existing data
- program records, attendance logs, etc.
- pictures, charts, maps, pictorial records
- program participants
- others: key informants, nonparticipants, proponents, critics, staff, collaborators, funders, etc.
DATA COLLECTION METHODS:
- survey
- interview
- test
- observation
- group techniques
- case study
- photography
- document review
- expert or peer review