Evaluating REACH-Funded Projects Using the REACH Theory of Change

Presentation transcript:

Evaluating REACH-Funded Projects Using the REACH Theory of Change

What is Evaluation?
Evaluation is a systematic method for collecting, analyzing, and using information to answer basic questions about a program.

Why does the Foundation ask that you include evaluation in your proposal?
• Helps you tease out why a program works or doesn't, and under what conditions
• Establishes the evidence base that we all want and need
• Improves your staff's practice with patients, clients, or consumers
• Shows how patients, clients, or consumers benefit from the program
• Helps the Foundation determine which strategies to continue to invest in

Types of Evaluations
• There are hundreds of different types of evaluations; the two most common are:
  • Implementation evaluation assesses whether a program was implemented as planned, whether the intended target population was reached, and the major challenges and successful strategies associated with program implementation.
  • Outcome evaluation determines whether, and to what extent, the expected changes in health outcomes occur (also called Impact) and whether these changes can be attributed to the program.

How does Evaluation relate to REACH's Theory of Change?
• REACH's TOC provides guidance on where to focus your evaluation and what to measure.
• REACH's evaluation requirements for your grant ask you to focus your evaluation on key elements of the TOC:
  • Impact (the change in patient health outcomes)
  • Outcomes (the change in access or quality)
  • Implementation (also known as Measures of Execution)

What is a Theory of Change?
REACH Healthcare Foundation's Framework for Investments and Evaluation

Why we use a Theory of Change Approach

REACH Theory of Change
• The TOC is the Foundation's roadmap: it visually shows how we think our work will lead to our desired impact.
• The TOC is a representation of the Board-approved strategic plan and is informed by published evidence, staff experience, and advice from experts in the field.
• The TOC defines the key building blocks required to bring about a given long-term impact. This set of connected building blocks (impacts, outcomes, strategies, and barriers) is depicted on a map known as a pathway of change or change framework.

Theory of Change
A good TOC tells you:
• what impacts we seek through our investments in your work,
• what outcomes are necessary to attain the desired impact, and
• which strategies are theorized to bring about positive change in outcomes.

Theory of Change
The five components of REACH's TOC:
• Expected Impact of your work
• Expected Outcome of your work
• Evidence-Based or Promising Strategies
• Barriers – what we seek to break through
• Indicators (on page 2 of the TOC) – a general description of the kinds of changes we desire in outcomes and impact

Examples of Theory of Change (From the Simple to the Absurdly Complex)

Communities for Teaching Excellence

Legacy LA: Empowers youth to become leaders in their lives and their communities

Accountability Lab

Involving Parents in their Child’s Education – Theory of Change

TIG Theory of Change 2010

Improving Supply Chains for Community Care Management of Chronic Diseases

Indicators

Digging into the REACH Theory of Change
A deeper dive into the components of the Theory

Definitions
Reading the Theory of Change

Key Terms
• Impact – the desired change in health outcomes in the target population.
• Outcome – the necessary theorized precursors to change in health outcomes. In the REACH TOC, we believe that increasing access and improving the quality of services for our target population will lead to our desired impact.
• Indicator – indicators tell us how success will be recognized. In the REACH TOC, indicators represent a category of potential metrics.
• Metric – the specific behavior, condition, or status that will be measured. Metrics must be operational, meaning they include enough detail for us to be able to measure them.
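
To make the difference between an indicator (a broad category) and an operational metric (something specific enough to measure) concrete, here is a minimal illustrative sketch in Python; the class and field names are hypothetical and are not part of the REACH TOC or reporting template:

```python
# Illustrative sketch only: hypothetical names, not REACH's actual data model.
from dataclasses import dataclass, field

@dataclass
class Metric:
    """An operational metric: detailed enough to actually be measured."""
    description: str  # the specific behavior, condition, or status measured
    sample: str       # who is measured
    baseline: float   # starting value (proportion of the sample, 0-1)
    target: float     # value the project aims to reach (proportion, 0-1)

@dataclass
class Indicator:
    """A broad category of potential metrics that signals success."""
    name: str
    metrics: list[Metric] = field(default_factory=list)

# Example, loosely based on the hypertension example later in this deck.
impact_indicator = Indicator(
    name="Improvements in health outcomes associated with chronic diseases",
    metrics=[
        Metric(
            description="Blood pressure control (SBP/DBP below threshold)",
            sample="120 adult uninsured Hispanic males with suspected hypertension",
            baseline=0.10,  # 10% controlled at the start of the grant
            target=0.90,    # 90% controlled by the end of the grant term
        )
    ],
)
```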

Digging In: Long-Term Impact
• The desired change in health outcomes in the target population
• What our investments are designed to accomplish
• Two impacts we seek: improving health outcomes and achieving equity in outcomes, access, and quality
• Grantees are asked to select one or more
• Impacts may take years to accomplish, but incremental improvements are expected

Measuring your Impact – Indicators
• REACH asks grantees to select an indicator and describe one or more metrics to measure the impact of their project on patient health outcomes.
• The indicators in the TOC represent broad categories of specific metrics you could choose.
• Very often, the metrics you choose are ones you are already using.

Digging into Outcomes
• Early or intermediate outcomes that can be achieved within a few years
• The measurable change in the health care provider organization and the larger health care system – the precursors to impact
• The focus is on two outcomes: increasing access to high-quality services or improving the quality of the services delivered – you must choose at least one

Measuring your Outcomes – Indicators
• REACH asks grantees to measure the change in the outcomes of their project in terms of patient access and/or the quality of services patients receive.
• The indicators in the TOC represent broad categories of specific metrics – you must select at least one indicator and propose at least one metric.
• Very often, the metrics you choose to measure are ones you are already using.

Outcomes Evaluation
Our focus so far – what you measure as a condition of a REACH grant

Digging into Strategies
• Strategies are what you have proposed to implement (and what we fund you to do).
• Their presence in the TOC indicates that they have an evidence base or are promising practices for bringing about change in the outcomes.
• Two kinds of strategies: those that increase access to care and those that improve the quality of care received.

Measuring Strategy Implementation
• Strategies require execution – how well your organization executes the strategy is the focus of implementation evaluation.
• More extensive evaluations of specific strategies will have identified implementation benchmarks and quality thresholds or standards.
• REACH has not imposed a rigorous implementation evaluation requirement – instead, we ask that you document implementation by answering a series of key questions, the Measures of Execution.

Measures of Execution
1. Did you do what you said you would do? Any modifications?
2. What are your standards of quality, and did you meet them?
3. Is the project on pace to be successfully implemented?
4. Were the right patients recruited? (screening criteria)
5. Were your clients satisfied with the services received?
6. Did your partners/collaborators perform as expected?

Preparing the Evaluation Section of your Program Proposal
Looking at the Proposal Template for Program Grants

Your Plan to Measure Impact
Impact (from the REACH Theory of Change): The impact this project will have on the patients served is: Improve health outcomes for uninsured and medically underserved people.
Example row from the template:
  Indicator of Long-Term Impact: 1. Improvements in health outcomes associated with chronic diseases (please specify: hypertension)
  Sample: 120 adult uninsured Hispanic males seen in clinic with other diagnoses and suspected hypertension
  Metric: SBP/DBP
  Baseline: 10% have SBP < 140 mm Hg and/or DBP < 90 mm Hg
  Target Goal: 90% will have SBP < 140 mm Hg and/or DBP < 90 mm Hg
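
As a rough illustration (not part of the REACH template), a grantee could compute this metric from its own clinic records along the lines of the sketch below; the readings are invented, and it assumes "controlled" means both SBP and DBP below the stated thresholds:

```python
# Minimal sketch of computing the blood-pressure impact metric above.
# The readings are invented; real values would come from clinic records.
# Assumption: "controlled" means SBP < 140 mm Hg and DBP < 90 mm Hg.
readings = [(152, 96), (138, 84), (134, 86), (144, 92), (128, 78)]  # (SBP, DBP)

controlled = sum(1 for sbp, dbp in readings if sbp < 140 and dbp < 90)
rate = controlled / len(readings)

print(f"{controlled} of {len(readings)} patients controlled ({rate:.0%})")
print("Target of 90% met" if rate >= 0.90 else "Target of 90% not yet met")
```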

Your Plan to Measure Outcomes
REACH Outcome (from the REACH Theory of Change): The outcome of this project is: Improved quality of health care services.
The REACH strategy this project will implement to achieve this outcome is: Care coordination and/or intensive case management/disease management (for hypertensive adult males).
Example row from the template:
  Outcome Indicator: 1. Increase in patient knowledge, satisfaction, and/or engagement in health care decisions
  Sample and Metric: % of the 120 adult uninsured Hispanic males participating in the care coordination project; metric: patient knowledge, satisfaction, and/or engagement in care decisions
  Baseline: 22% knowledgeable, 18% satisfied, 6% engaged
  Target Goal and Timeframe: By the end of the grant term, 90% will report feeling knowledgeable, satisfied, and engaged
  Source of Data: Patient Satisfaction Survey administered at entry into the project and within 2 weeks of the conclusion of the grant term
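
Similarly, the baseline and target-goal columns for a survey-based outcome can be compared with a short pre/post calculation like the sketch below; the survey items and responses here are invented for illustration and are not REACH data:

```python
# Hypothetical entry and exit survey responses (True = patient answered "yes").
entry_survey = {
    "knowledgeable": [True, False, False, False, False],
    "satisfied":     [False, True, False, False, False],
    "engaged":       [False, False, False, False, False],
}
exit_survey = {
    "knowledgeable": [True, True, True, True, False],
    "satisfied":     [True, True, True, True, True],
    "engaged":       [True, True, True, False, True],
}

def proportion(responses):
    """Share of respondents answering yes to a survey item."""
    return sum(responses) / len(responses)

for item in entry_survey:
    before = proportion(entry_survey[item])
    after = proportion(exit_survey[item])
    status = "met" if after >= 0.90 else "not met"
    print(f"{item}: {before:.0%} at entry -> {after:.0%} at exit (90% target {status})")
```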

Evaluation of the REACH Theory of Change
How REACH evaluates our Investments

How REACH evaluates our Theory of Change
• REACH requests and aggregates data from our grantees and partners on these measures of the components of the theory of change in order to:
  • test the fidelity of the theory of change; and
  • track whether change in the inputs is indeed leading to change "downstream" in the outcomes and impacts.
• Where investments in specific strategies fail to bring about threshold (meaningful) levels of change in outcomes after a sustained period of investment, we ask ourselves six questions:

Questions REACH Asks When No Meaningful Change in Impact or Outcomes is Found
1) Is the organization implementing the strategy the right organization?
2) Was the selected strategy the right strategy to address the needs of the target population?
3) Was the strategy implemented with fidelity and consistency over sufficient time to allow for change to occur?

Questions REACH Asks When No Meaningful Change in Impact or Outcomes is Found (continued)
4) Were the size and duration of the investment sufficient to allow change in outcomes and impact to occur?
5) Were the processes used by the grantee to create the conditions and capacity necessary to implement the strategy successful?
6) Were the pre-existing conditions or influences in the organization, project, and/or community considered and addressed by the grantee such that negative influences were suppressed to allow for the potential effects of the strategy to be realized?