1 Outcomes and Program Improvement: Designing Effective Evaluations Presented by Christine A. Ameen, Ed.D. Ameen Consulting & Associates



ameenconsulting & associates 2 Reflection “If you don’t know where you’re going, you’ll end up somewhere…” - Yogi Berra

3 Some “Reasons” for Doing Outcome Evaluation...
- I need to be able to fill out these forms the state department gave me.
- I need a plan for a grant I’m writing.
- I need to get my agency accredited.
- I need to show outcomes for the grant project we just completed.

4 Other Reasons for Doing Outcome Evaluation...
- Our staff need to know the impact they make.
- Our agency needs to know the impact it makes.
- Our funders need to see the return on their investment.
- Our accrediting body needs to see evidence of our intent to provide quality services.

5 The #1 Reason for Doing Outcome Evaluation... Our clients deserve the very best service!

6 How effective is your program? Let me tell you about the kids that are in my group!

7 How effective is your program? Here’s what’s working… Here’s what we’re learning… Here’s how we’ve increased effectiveness…

8 What is Continuous Program Enhancement (CPE)? A commitment to ongoing assessment of service delivery and client outcomes to achieve the best outcomes possible.

9 CPE assumes that:
- Every program has strengths, weaknesses, and opportunities for improvement.
- The organization values learning.
- Administrators nurture a climate of trust and welcome feedback.
- Staff want clients/customers to be successful.
- Staff are open to learning new and different ways of providing service.

10 Determining Purpose. CPE may be used to:
- Assess the impact of a program
- Improve program implementation
- Identify unmet client needs
- Respond to accountability.

11 ARE YOU READY? LET’S FIND OUT!

12 You’re ready if:
- Its purpose is clearly understood by everyone.
- The organization offers a risk-free learning environment where a climate of trust is nurtured and feedback is welcomed.
- Internal collaboration is encouraged and expected.
Adapted from Brinkerhoff, R.O.; Brethower, D.M.; Hluchyj, T.; and Nowakowski, J.R. Program Evaluation: A Practitioner’s Guide for Trainers and Educators. Boston: Kluwer-Nijhoff.

13 You’re ready if:
- Results will be shared with those who have a right to know.
- The results will be useful and used.
- The process of doing CPE is humane.
- The benefits of CPE justify the cost.
Adapted from Brinkerhoff, R.O.; Brethower, D.M.; Hluchyj, T.; and Nowakowski, J.R. Program Evaluation: A Practitioner’s Guide for Trainers and Educators. Boston: Kluwer-Nijhoff.

14 Don’t Ask a Question for Which You Don’t Want the Answer!
- Respond to the results.
- Be prepared to learn that perceptions don’t always reflect reality.
- Accept that CPE is an innovation and there will be some natural resistance to it.
- Be patient; CPE will take some time before its payoffs are realized.
- You must get administrative buy-in first!

15 Planning for CPE
1. A program model, which describes the services delivered to clients and the outcomes clients should achieve
2. An objectives model, which defines objectives for the services and objectives for the outcomes
3. An evaluation model, which defines how data will be collected and used to monitor the attainment of process objectives and outcome objectives.

16 The Processes Used to Develop the CPE Models
1. Review contracts, licensing agreements, and program proposals for service requirements, information requirements, and pre-determined outcomes.
2. Assess what information needs or concerns board members, administrators, staff, clients, customers, volunteers, policy-makers, and the public at large may have about the program.
3. Convene a work group of staff and administrators to develop the CPE models and support implementation.

17 Building Your Program Model Defining Program Services and Client Outcomes

18 Defining Your Program: The Program Model

19 Some Definitions
- Client Conditions: demographic characteristics and other factors that describe client needs, assets, and risk of not being successful in the program
- Program Services: the services delivered to address client needs and risks
- Client Outcomes: a desired effect or impact of a service, intervention, or experience on a client

20 Residential Program Model

21 Residential Program Model

22 Residential Program Model

23 A Conceptual Framework for Behavior Change
1. Awareness: the initial consciousness, perception, or sense of a concept
2. Knowledge/Skills: understanding or comprehension of a concept, and demonstrated ability to apply that understanding
3. Behavior: performance or conduct in a specified way
4. Modeling behavior: demonstration of the specified behavior to others, most often to teach it to others.

24 How Far Does Your Program Go?
- How long are clients in the program?
- How intense is the program?
- How extensive is the content covered in the program?

25 LET’S TRY IT! Build a Program Model

26 Defining Your Program: The Program Model

27 Building Your Objectives Model Defining Objectives for Services and Outcomes

28 Your Program Objectives Model

29 Some Definitions
- Service Objective: a specific statement about how a service is to be delivered, usually including a timeframe and benchmark
- Outcome Objective: a specific statement about how much of the desired effect or impact of a service is to be achieved.

30 Establishing Objectives for Program Services
- What clients?
- What services will be provided or experienced?
- During what time frame?

31 Examples of Service Objectives
- 100% of clients served will have written treatment plans within 30 days of intake.
- 85% of clients served will have weekly visits with their families throughout their treatment stay.
- 85% of clients will attend school 90% of the time throughout their treatment stay.
- 100% of clients will attend all individual therapy sessions specified in their treatment plans throughout their treatment stay.
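Service objectives like those above pair a benchmark (e.g., "100% of clients...") with a measurable criterion. A minimal sketch of checking attainment against a benchmark, with hypothetical client records and field names (not from the presentation):

```python
# Illustrative sketch: compare a service objective's attainment rate
# to its benchmark. The record structure and field names are
# hypothetical examples.

def attainment_rate(clients, met):
    """Percent of clients meeting the objective's criterion."""
    return 100.0 * sum(met(c) for c in clients) / len(clients)

# Hypothetical client records for "written treatment plan within 30 days"
clients = [
    {"id": 1, "plan_within_30_days": True},
    {"id": 2, "plan_within_30_days": True},
    {"id": 3, "plan_within_30_days": False},
]

rate = attainment_rate(clients, lambda c: c["plan_within_30_days"])
benchmark = 100.0  # from the objective: "100% of clients served..."
status = "MET" if rate >= benchmark else "NOT MET"
print(f"Attainment: {rate:.0f}% (benchmark {benchmark:.0f}%) {status}")
```

The same `attainment_rate` helper works for any objective whose criterion can be expressed as a true/false test per client.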

32 Establishing Objectives for Client Outcomes
- Desired effect or impact
- On whom or what
- The specific outcome
- By how much, and by when.

33 Writing Client Outcome Objectives

34 Examples of Client Outcomes
- No youth served will experience a recurrence of abuse/neglect while in care.
- 75% of youth will increase their academic level of functioning by at least one grade level upon release from the program.
- No more than 15% of youth completing treatment will be arrested within six months of release from the program.

35 Residential Program Objectives Model

36 Residential Program Objectives Model

37 Residential Program Objectives Model

38 LET’S TRY IT! Build an Objectives Model

39 Your Program Objectives Model

40 Building Your Evaluation Plan Defining Data Collection to Monitor the Attainment of Service and Outcome Objectives

41 Your Program Evaluation Model

42 Data Collection Plan for the Objectives. Create a data collection plan that includes:
- Information source
- Measurement method
- Timeline for when data are collected
- Identification of parties responsible for the data
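One way to keep those four elements together is a plan row per objective. A hedged sketch, with hypothetical field values (the structure mirrors the list above; the example content is not from the presentation):

```python
# Sketch: one data collection plan row per objective, covering the
# four elements named in the slide. Values are hypothetical examples.
from dataclasses import dataclass

@dataclass
class DataCollectionRow:
    objective: str    # the service or outcome objective being monitored
    source: str       # information source
    method: str       # measurement method
    timeline: str     # when the data are collected
    responsible: str  # party responsible for the data

plan = [
    DataCollectionRow(
        objective="Written treatment plan within 30 days of intake",
        source="Client case file",
        method="Record review",
        timeline="Monthly",
        responsible="Program supervisor",
    ),
]

for row in plan:
    print(f"{row.objective} | {row.source} | {row.method} | "
          f"{row.timeline} | {row.responsible}")
```

Keeping the plan in one structure makes it easy to spot objectives with no assigned source or responsible party before data collection starts.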

43 Some Issues to Resolve Related to Service Objectives Data
- Service objectives usually report the status of service delivery, or the status of the client, at a specific point in time.
- Attendance versus participation.
- Opportunity to participate versus actual participation.
- Using currently existing data, e.g., rating forms, attendance records, incident reports, etc.

44 Some Common Methods for Collecting Outcome Objectives Data
- Formal testing or assessment, e.g., vocational skills
- Attitude measurement, e.g., empathy
- Surveys, e.g., therapeutic environment
- Interviews, e.g., follow-up with the family after the youth returns home

45 Data Collection for Outcomes. Use both qualitative and quantitative information:
- Qualitative: used when the program is aimed at individualized outcomes
- Quantitative: used when the program is aimed at common outcomes for all participants

46 Some Issues to Resolve Related to Outcome Objectives Data
- Will service delivery result in the intended impact on the client?
- Will service delivery result in the client being able to maintain a positive impact after leaving the program?
- An increase or a decrease in something, e.g., empathy, requires that the measure be used at both the beginning and the end of the program.
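The pre/post point above can be sketched in a few lines: a change objective needs the same measure at entry and exit, and the result is the per-client difference. The scores here are hypothetical empathy-scale values, not data from the presentation:

```python
# Sketch: computing change on a repeated measure (entry vs. exit).
# Client IDs and scores are hypothetical.

pre_scores  = {"A": 12, "B": 15, "C": 10}  # measure at program entry
post_scores = {"A": 16, "B": 15, "C": 14}  # same measure at program exit

# Per-client change; positive means an increase on the measure.
changes = {cid: post_scores[cid] - pre_scores[cid] for cid in pre_scores}
improved = [cid for cid, delta in changes.items() if delta > 0]
pct_improved = 100.0 * len(improved) / len(pre_scores)

print(f"{pct_improved:.0f}% of clients increased their score")
```

By contrast, an objective stated as reaching a specific level (e.g., naming three community resources) only needs the exit measurement, as the next slide notes.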

47 Some Issues to Resolve Related to Outcome Objectives Data
- The achievement of a specific level of an outcome, e.g., the family will be able to name 3 community resources they can utilize, requires that the measure be used at the end of the program only.
- Will instruments such as surveys be read to clients? If read to some, read to all.
- Instruments should be administered by staff with whom clients have limited or no interaction, to reduce the potential for unintended bias from clients responding based on their relationship with the staff member administering the instrument.

48 Some Tips for Selecting Measures
- Does the instrument directly measure the target behavior or issue the program is intended to change?
- Is the instrument able to measure change?
- Is the instrument appropriate for the population?
- Can the instrument be used in ways that are respectful of the clients and the program?
- Can the instrument be used consistently?
- What is the reading level?

49 Data Utilization for the Objectives. Create a data utilization plan that includes:
- Who will review the data
- How the data will be interpreted
- Timeline for when the data are reviewed
- How the data will be used

50 Some Tips for Writing the Data Utilization Plan
- Plan on including direct service staff in the review and utilization process.
- Results will be interpreted against the benchmark you set for each objective.
- Try to anticipate how the information will be used.

51 Some Tips for Writing the Data Utilization Plan
- Review often enough to allow for changes to be made if the objectives aren’t being met.
- Consider reviewing data about service objectives on a monthly basis.
- Consider reviewing data about outcome objectives on a quarterly basis.
- Consider reviewing all of the data on an annual basis, to consider setting new benchmarks or new objectives.

52 Residential Program Evaluation Model

53 Residential Program Evaluation Model

54 Residential Program Evaluation Model

55 LET’S TRY IT! Build an Evaluation Model

56 Your Program Evaluation Model

57 Where do I start?
- Rome wasn’t built in a day! Focus on 2-3 key services and 2-3 outcomes.
- Don’t reinvent the wheel! Use available resources; beg, borrow, and “steal.”
- Don’t go it alone! Get together with colleague agencies to learn from one another and pool resources.
- Don’t wait for perfection! Measuring outcomes is an evolving field.

58 Why Quality Improvement Efforts?
- We need to provide the highest quality services possible.
- We need to learn what’s working and what isn’t.
- We need to let our staff know how effective our services are.
- We need to be accountable to our funders.
- We need to make a difference.

59 Thank you for attending my session today! If you have any feedback for how I can improve my training style or content, please email me your suggestions!