
Program Evaluation

Overview and Discussion of:
– Objectives of evaluation
– Process evaluation
– Outcome evaluation
– Indicators & measures
– Small group discussions

Program Evaluation: Here we go again!
Why, why, WHY do evaluation?
– Time
– Resources
– Validity and reliability
– Action

Why Evaluate?
– Increase knowledge & understanding of the program being evaluated
– Identify specific problems/needs, as well as effective practices
– Develop capacity for ongoing assessment and improvement
Evaluation is of little value unless it leads to further decisions and activities regarding program improvement.

Why Evaluate?
In the past… the focus of evaluation was most often on what agencies did – how many clients they saw, how many clinical hours were spent, how many sessions they delivered.
Now… there is an increasing focus on what happens when agencies deliver services – what changes, what improves for the client.

Thorny Issues in Program Evaluation ?

Thorny Issues in Program Evaluation
– Cost and human resources
– Buy-in at all levels
– Logic models that defy logic (overly ambitious)
– Process versus outcomes/impacts
– Choosing the right indicators/measures
– Moving data to action

Framework of Evaluation Capacity (Isabelle Bourgeois; do not cite without the author's permission)
Capacity to do evaluation:
– Human Resources – staffing, technical skills, communication skills, professional development, leadership
– Organizational Resources – budget, ongoing data collection, infrastructure
– Evaluation Planning and Activities – evaluation plan, use of consultants, information sharing, organizational linkages, external supports
Capacity to use evaluation:
– Evaluation Literacy – results-management orientation, involvement-participation
– Organizational Decision-Making – management processes, decision support
– Learning Benefits – instrumental-conceptual use, process use

Ensuring support & buy-in
Efforts to ensure support & buy-in for evaluation from staff & stakeholders should begin as soon as the decision is made to conduct an evaluation:
– Hold preliminary meetings
– Clarify roles & responsibilities over the evaluation
– Identify why the evaluation is being done; clarify any misconceptions; provide reassurance
– Identify who the end-user will be
– Develop agreement on what is to be evaluated
– Assess readiness for evaluation

Logic Models Keep it as simple and as focused as possible!
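To illustrate what "simple and focused" can look like in practice, here is a minimal sketch of a logic model captured as a plain data structure. The program, inputs, activities, outputs, and outcomes below are invented for illustration and are not taken from the presentation.

```python
# Hypothetical, minimal logic model for an illustrative parenting program.
# None of these entries come from the presentation itself.
logic_model = {
    "inputs": ["two trained facilitators", "program budget", "meeting space"],
    "activities": ["deliver eight weekly parenting sessions"],
    "outputs": ["number of sessions delivered", "number of caregivers attending"],
    "outcomes": ["increased caregiver knowledge of positive parenting",
                 "improved child behaviour at six months"],
}

# Print the chain from inputs to outcomes.
for level, items in logic_model.items():
    print(f"{level}: {', '.join(items)}")
```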

Process Evaluations
Key Issues:
– Service Delivery – the extent to which program components are being delivered as planned
– Coverage – the extent to which the target population participates in the program
– Bias – the extent to which subgroups of the designated target population participate (or don't participate)
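Coverage and bias lend themselves to simple rate calculations. The sketch below is a minimal, hypothetical example: the "urban"/"rural" subgroups and all counts are invented for illustration, not drawn from the presentation.

```python
# Hypothetical counts: the eligible (target) population and actual participants,
# broken down by subgroup. The subgroups and numbers are examples only.
target_population = {"urban": 400, "rural": 100}
participants = {"urban": 120, "rural": 10}

# Coverage: the share of the target population reached by the program.
coverage = sum(participants.values()) / sum(target_population.values())
print(f"Overall coverage: {coverage:.1%}")

# Bias: compare participation rates across subgroups of the target population.
for group, eligible in target_population.items():
    rate = participants[group] / eligible
    print(f"{group} participation rate: {rate:.1%}")
```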

Process Evaluations – Strengths & Weaknesses
Strengths:
– Measures how well the program meets client needs
– Measures the extent to which the program was implemented as intended
– Ensures a consumer-driven program
– Easy & inexpensive
Weaknesses:
– Bias can be significant:
   – Few dissatisfied participants return surveys
   – Clients may report inflated levels of satisfaction
   – Very subjective

Outcome Evaluations
Focus: program results, program effectiveness
Examples of Outcomes:
– Change in circumstances
– Change in understanding/knowledge
– Change in attitude
– Change in behaviour
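A common way to look at an outcome such as a change in knowledge is a simple pre/post comparison. The sketch below uses made-up scores and a basic mean-difference calculation; it is one illustrative approach, not a method prescribed in the slides.

```python
# Hypothetical pre- and post-program knowledge scores for the same six clients.
pre_scores = [12, 15, 9, 14, 11, 13]
post_scores = [16, 18, 12, 17, 13, 17]

# Change for each client (positive = improvement on this outcome).
changes = [post - pre for pre, post in zip(pre_scores, post_scores)]

mean_pre = sum(pre_scores) / len(pre_scores)
mean_post = sum(post_scores) / len(post_scores)
mean_change = sum(changes) / len(changes)

print(f"Mean pre-program score:  {mean_pre:.1f}")
print(f"Mean post-program score: {mean_post:.1f}")
print(f"Mean change:             {mean_change:+.1f}")
```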

What Are Some of Your Outcomes? How are you measuring them?

Indicators and Measures – Measurement Strategy
A measurement strategy specifies, for each outcome: the indicator(s), the source of data (records, clients, caregivers, etc.), the method and frequency of data collection, who collects the data, and when the data are collected.
EXAMPLE:
– Outcome: Skill acquisition – self-management, adaptive skills
– Indicator(s): Increase in scores on an administered psychometric test; caregiver survey
– Source of data: Caregivers; psychometric measures
– Method to collect data & frequency: Pre- and post-administered tests, surveys
– Who collects data: Evaluation researcher
– When data are collected: At the beginning and end of the program, and at six months post-program
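One way to keep a measurement strategy organized is to store each row as a structured record. This is only a sketch: the MeasurementPlanRow class and its field names are hypothetical and not part of the Centre's materials; the values restate the example row above.

```python
from dataclasses import dataclass

@dataclass
class MeasurementPlanRow:
    outcome: str
    indicators: list
    data_sources: list
    method_and_frequency: str
    who_collects: str
    when_collected: str

# The example row from the slide, expressed as a structured record.
skill_acquisition = MeasurementPlanRow(
    outcome="Skill acquisition - self-management, adaptive skills",
    indicators=["Increase in scores on administered psychometric test",
                "Caregiver survey"],
    data_sources=["Caregivers", "Psychometric measures"],
    method_and_frequency="Pre- and post-administered tests and surveys",
    who_collects="Evaluation researcher",
    when_collected="Beginning and end of program, and six months post-program",
)

print(skill_acquisition.outcome, "->", ", ".join(skill_acquisition.indicators))
```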

Moving Results Into Action
– Use results
– Develop action plans for improving program effectiveness
– Develop plans to disseminate and share findings – be innovative!
– Plans should be developed from the beginning and continue throughout
– Linkages & partnerships developed in the initial stages of the evaluation form the basis of plans for information sharing

Centre of Excellence Activities
Consultation services
– Development of evaluation framework, logic model, indicators and measures
– Training or workshops
– Facilitating or developing networks/communities of practice (e.g., Triple P)
– Review of grant applications
Evaluation grants

Objectives – Evaluation Grants
Build skills
– Developing evaluation framework and logic model
– Implementing evaluation plan and analyzing results
– Using evaluation findings
Build support
– Frequent communications between the Centre and agencies
– Engaging internal and external stakeholders
Build communities
– Establishing communities of practice

Communities of Practice – Triple P
– Nine agencies – key contacts for Triple P within Ontario
– Developing a common approach to evaluation – measuring the same set of outcomes with common measures
– Facilitating a provincial roll-out of Triple P data

Communities of Practice – Triple P: What made it work?
– Clearly identified purpose
– Willingness to work together
– Face-to-face meetings at strategic points
– Communication portal to facilitate active learning and sharing of ideas
– Centre as facilitator

Small Group Discussion
1. Think of a program in your agency. List all the external and internal stakeholders for the evaluation.
– What are some challenges in meeting the interests of all stakeholders in your program?
– What are some ways your program has been successful in getting buy-in from key stakeholders?

Small Group Discussion
2. In your local area or region, what community of practice would you like to be involved in?
– What topics or issues would you like the community of practice to discuss and meet about?
– How can the Centre of Excellence help you in forming and maintaining this community of practice?

Contact Information:
Susan Kasprzak, Research Associate – (613) ext
Tanya Witteveen, Research Associate – (613) ext

Visit our website for more information: