Using Data to Successfully Drive Your Program: Program Evaluation and Evidence-Informed Respite Programs MaryJo Alimena Caruso & Jennifer Abernathy

Some Guiding Questions for Today
▫Why evaluate?
▫What are some promising ways to approach evaluation?
▫What are some examples of program-specific evaluation tools, and tools that can be used cross-program / cross-state?
▫What are some of the goals of evaluation?
▫What are some of the myths surrounding evaluation?
▫How can evaluation be put to good use (and create a win/win situation)?

What is IT you want to evaluate? Implementation science tells us that successful implementation can be measured if we know what our "it" is, and then collect and use data to determine:
▫Did IT make a difference?
▫What kind of difference did IT make?
▫How much of a difference did IT make?
▫Was IT worth it? (cost effectiveness)
Know it, do it, determine the impact… So, what's our strategy?

Some Promising Ways to Approach Evaluation
▫Use a logic model to guide your evaluation strategy
▫Develop indicators that measure progress toward short- and long-term outcomes
▫Identify or create a tool to measure them
▫Interpret the data
▫Use the data ~ inform funders, budgeting, CQI
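As a small illustration of the "develop indicators" step (not part of the original slides), progress toward an outcome indicator can be expressed as the fraction of the distance covered from a baseline to a target. The function name and all numbers below are hypothetical:

```python
# Hedged sketch: tracking progress toward an outcome indicator.
# The indicator, baseline, and target values are invented for illustration.

def progress_toward_target(measured: float, baseline: float, target: float) -> float:
    """Return the fraction of the way from baseline to target (0.0 to 1.0+)."""
    return (measured - baseline) / (target - baseline)

# Example indicator: share of caregivers reporting lower stress after respite.
baseline, target = 0.20, 0.60   # 20% at program start, 60% is the goal
measured = 0.40                 # 40% at the latest data collection point
print(round(progress_toward_target(measured, baseline, target), 3))  # 0.5
```

A value of 0.5 would mean the program is halfway from its baseline to its target, which is the kind of figure that can feed directly into funder reports and CQI reviews.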

What is a Logic Model? A graphic that represents what your program hopes to accomplish, what it is doing, and what its impact is on target participants and the community. A logic model guides a respite program by aiding in strategic planning and the development of effective communication among leadership, staff, constituents, and the community.
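To make the idea concrete, a logic model's "if-then" chain can be sketched as a simple data structure. Every entry below is an illustrative placeholder, not the actual CareBreak or Tennessee Respite Coalition model:

```python
# A minimal sketch of a respite-program logic model, assuming a generic
# inputs -> activities -> outputs -> outcomes chain. All items are
# hypothetical examples, not taken from the presentation.
logic_model = {
    "inputs":              ["staff", "volunteers", "funding", "community partners"],
    "activities":          ["match caregivers with respite volunteers",
                            "run day and overnight respite camps"],
    "outputs":             ["number of families served", "hours of respite provided"],
    "short_term_outcomes": ["reduced caregiver stress"],
    "long_term_outcomes":  ["sustained family caregiving capacity"],
}

# The "if-then" reading: if inputs are in place, then activities can run;
# if activities run, then outputs occur; if outputs occur, outcomes follow.
for stage, items in logic_model.items():
    print(f"{stage}: {', '.join(items)}")
```

Laying the model out this way makes the evaluation question visible: outputs are measures of effort, while the outcome entries are where measures of effect belong.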

University of Wisconsin-Cooperative Extension, Program Development & Evaluation © 2003

How Can a Logic Model Guide Evaluation of Respite Programs?
▫Promotes a process of continuous learning and improvement
▫Helps you identify whether there are logical linkages between inputs and desired outcomes
▫Helps you identify indicators of progress toward outcomes
▫Helps you distinguish between measures of effort and measures of effect

Considerations for Data Collection Programs sometimes confuse counting the number of people they reach with evaluating the actual impact of the program's services. How can they shift to measuring the latter?

What types of data can we collect?
▫Process data (numbers served, services provided, demographics)
▫Outcome data (client changes)
▫Fidelity data
▫Satisfaction data (families, practitioners with implementation assistance)
▫Other?
Is one more important than the others? Do they all have relevance?

Considerations for Data Management
Collect the data
▫Select the tool
▫Train staff to use the tool (including informed consent)
▫Identify data collection points
▫Define sample size for analysis
▫Administer the tool
Enter the data
▫Have / create a database system
▫Train staff on data entry
Analyze the data
▫Develop and disseminate reports
▫Meet with staff to review results
▫Identify necessary changes

Examples of Data Collection Tools

Pre Service (data table) / Post Service (bar graph)
CB: Before you were matched with your CareBreak volunteer, how "stressed" would you say you were? Response categories: moderately stressed / stressed very often / extremely stressed (reported as frequency, percent, valid percent, cumulative percent, and total)
D/O: Before starting the Day & Overnight Respite Camp, how "stressed" would you say you were? Response categories: moderately stressed / stressed very often / extremely stressed (reported as frequency, percent, valid percent, cumulative percent, and total)
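The SPSS-style frequency table on the slide can be reproduced with a few lines of code. The responses below are invented sample data, not the actual CareBreak survey results:

```python
from collections import Counter

# Invented sample responses for illustration only.
responses = [
    "moderately stressed", "stressed very often", "stressed very often",
    "extremely stressed", "moderately stressed", "stressed very often",
]

# Fixed category order, matching the slide's table.
order = ["moderately stressed", "stressed very often", "extremely stressed"]
counts = Counter(responses)
total = len(responses)

# Print frequency, percent, and cumulative percent per category.
cumulative = 0.0
print(f"{'Response':<22}{'Frequency':>10}{'Percent':>9}{'Cum. %':>8}")
for category in order:
    n = counts[category]
    pct = 100 * n / total
    cumulative += pct
    print(f"{category:<22}{n:>10}{pct:>9.1f}{cumulative:>8.1f}")
print(f"{'Total':<22}{total:>10}{100.0:>9.1f}")
```

With no missing responses, percent and valid percent are identical; SPSS reports them separately because valid percent excludes missing cases from the denominator.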

How do we put evaluation data to good use?
Use more than one evaluation measure
Gather qualitative data / input (stories)
Use data as an implementation driver
Identify expectations for CQI
▫Staff training
▫Reporting and dissemination to stakeholders
Document activities for CQI
▫Document service adjustments
▫Revisit your logic model

Other Uses for Evaluation Data
▫Raise awareness of promising practices
▫Support programs' improvement efforts
▫Enhance programs' sustainability
▫Other?

What’s in our future? Challenges? Learnings? Successes? Ongoing Strategies?

Contact Information
Jennifer Abernathy, Tennessee Respite Coalition, 19 Music Square West, Suite J, Nashville, TN
MaryJo Alimena Caruso, CareBreak at the Watson Institute, 301 Camp Meeting Road, Sewickley, PA