Evaluation – the whys, the whats and the hows (2014). Dr Basma Ellahi (PI) and Cat Crum (PhD Student)

What is evaluation? Evaluation is a process that critically examines a program. It involves collecting and analysing information about a program’s activities, characteristics, and outcomes. Its purpose is to make judgments about a program, to improve its effectiveness, and/or to inform programming decisions (Patton, 1987).

Why evaluate?
- To answer the objectives of the scheme/project
- To demonstrate change due to the scheme/project
- To determine effectiveness, i.e. causality or association
- To inform improvements

‘Dull but important’: compared with the creative and exciting process of conceiving and initiating a project, evaluation is often forgotten or perceived as dull. However, some form of evaluation or formative feedback is the only thing that will show the effectiveness of the project.

Evaluations fall into one of two broad categories: formative and summative. Formative evaluations are conducted during program development and implementation and are useful if you want direction on how to best achieve your goals or improve your program. Summative evaluations should be completed once your programs are well established and will tell you to what extent the program is achieving its goals.

Types of evaluation and their purposes

Formative
1. Needs assessment – Determines who needs the program, how great the need is, and what can be done to best meet the need. A needs assessment can help determine which audiences are not currently served by programs and provide insight into what characteristics new programs should have to meet these audiences’ needs.
2. Process or implementation evaluation – Examines the process of implementing the program and determines whether the program is operating as planned. Can be done continuously or as a one-time assessment; results are used to improve the program. A process evaluation may focus on the number and type of participants reached and/or how satisfied these individuals are with the program.

Summative
1. Outcome evaluation – Investigates to what extent the program is achieving its outcomes: the short-term and medium-term changes in program participants that result directly from the program, such as improvements in participants’ knowledge, skills, attitudes, intentions, or behaviours.
2. Impact evaluation – Determines any broader, longer-term changes that have occurred as a result of the program. These impacts are the net effects, typically on the entire school, community, organisation, society, or environment.

Adapted from Norland (2004), Pancer and Westhues (1989) and Rossi et al. (2004).

Aims of evaluation
- How has the pilot been received by the different participants in each of the health communities?
- What have been the successes and the issues – lessons learned?
- How successful has the pilot been in moving the health communities towards working on the basis of the 5 key principles?
- How successful has the pilot been in helping the health communities meet their objectives?

Evaluation objectives of the Malnutrition Prevention Pilot Programme
- Confirm outcome and impact indicators
- Document inputs and activities
- Appraise progress
- Identify limiting factors
- Identify unintended consequences of implementation
- Generate an evidence base
- Use self-generated data

Methodology
- Impact evaluation framework
- Theory of change
- Logic model: linkages between inputs, activities, outputs and outcomes
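To make the logic-model linkages concrete, here is a minimal sketch in Python of how the inputs → activities → outputs → outcomes chain might be written down as structured data. All entries are hypothetical placeholders for illustration, not the pilot's actual model.

```python
# A minimal, illustrative logic model for a community nutrition pilot.
# Every entry is a hypothetical placeholder, not the actual pilot's model.
logic_model = {
    "inputs":     ["funding", "project staff", "venues", "training materials"],
    "activities": ["cook-and-eat sessions", "taster sessions", "MUST screening audits"],
    "outputs":    ["number of sessions delivered", "number of participants reached"],
    "outcomes":   ["improved knowledge of malnutrition risk",
                   "changed eating behaviour",
                   "reduced prevalence of malnutrition risk"],
}

# The "if-then" reading of the model: if the inputs are in place, the
# activities can run; if the activities run, the outputs are produced;
# if the outputs are produced, the outcomes should follow.
stages = ["inputs", "activities", "outputs", "outcomes"]
for earlier, later in zip(stages, stages[1:]):
    print(f"IF {earlier} THEN {later}:")
    for item in logic_model[later]:
        print(f"  - {item}")
```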

Evaluation plan: Why? What? Who? How? When? Where?

Complex evaluation
Range of activities:
- One-off information and/or taster sessions
- Promotional stands
- Promotional campaigns
- Health days
- Demonstrations
- Cook-and-eat sessions
- Community enterprise
- Training courses
Combination of evaluation methods: qualitative and quantitative.

The tools of the evaluator
Quantitative:
- Monitoring information
- Questionnaires/surveys
- Experimental and observational study designs – RCTs, case-control, cohort, etc.
Qualitative:
- Observation
- Interviews
- Focus groups
- Case studies
- Documentation
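As a small illustration of the quantitative toolkit, the sketch below summarises pre/post questionnaire scores for a course. The 1-5 scale, the scores and the variable names are all invented for the example.

```python
# Illustrative only: summarising pre/post questionnaire scores for a
# cook-and-eat course. The scores below are made up for the example.
pre_scores  = [3, 4, 2, 5, 3, 4]   # e.g. cooking confidence, 1-5 scale, before
post_scores = [4, 5, 4, 5, 4, 5]   # the same participants after the course

def mean(xs):
    return sum(xs) / len(xs)

change = mean(post_scores) - mean(pre_scores)
print(f"Mean before: {mean(pre_scores):.2f}")   # 3.50
print(f"Mean after:  {mean(post_scores):.2f}")  # 4.50
print(f"Mean change: {change:+.2f} points")     # +1.00
```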

Methods
- Semi-structured telephone interviews
- Action learning sessions
- Audit – baselines and MUST (Malnutrition Universal Screening Tool) scores
- Sampling strategy – purposive sampling
- Pluralistic model
- Ethics
- Frame of reference
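Since the audit uses MUST, a short sketch of its published scoring steps (BAPEN's BMI, weight-loss and acute-disease components) may help as a reference. This is illustrative only; the official BAPEN guidance and charts should be used for real screening.

```python
# Sketch of the MUST (Malnutrition Universal Screening Tool) scoring steps,
# following BAPEN's published rules. Illustrative only - use the official
# BAPEN guidance and charts when screening in practice.

def must_score(bmi, weight_loss_pct, acutely_ill_no_intake):
    """Return (total score, risk category)."""
    # Step 1: BMI (kg/m^2): >20 scores 0, 18.5-20 scores 1, <18.5 scores 2
    if bmi > 20:
        bmi_score = 0
    elif bmi >= 18.5:
        bmi_score = 1
    else:
        bmi_score = 2

    # Step 2: unplanned weight loss in the past 3-6 months:
    # <5% scores 0, 5-10% scores 1, >10% scores 2
    if weight_loss_pct < 5:
        loss_score = 0
    elif weight_loss_pct <= 10:
        loss_score = 1
    else:
        loss_score = 2

    # Step 3: acute disease effect - acutely ill with no nutritional
    # intake for more than 5 days scores 2
    acute_score = 2 if acutely_ill_no_intake else 0

    total = bmi_score + loss_score + acute_score
    risk = "low" if total == 0 else "medium" if total == 1 else "high"
    return total, risk

print(must_score(bmi=19.0, weight_loss_pct=6.0, acutely_ill_no_intake=False))
# -> (2, 'high')
```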

What does good look like?
- Good evaluation is tailored to your program and builds on existing evaluation knowledge and resources.
- Good evaluation is inclusive.
- Good evaluation is honest.
- Good evaluation is replicable, and its methods are as rigorous as circumstances allow.

Common dilemmas
- Intellectual property rights, data protection and data sharing
- Being conscious of multiple roles
- Following informed-consent rules
- Respecting confidentiality and privacy
- Ethics
- Complexity of data-collection sites

How do I make evaluation an integral part of my program? Making evaluation an integral part of your program means evaluation is part of everything you do: you design your program with evaluation in mind, collect data on an ongoing basis, and use these data to continuously improve your program.

To build and support an evaluation system:
- Couple evaluation with strategic planning.
- Revisit and update your evaluation plan and logic model to make sure you are on track.
- Build an evaluation culture.

What are the benefits?
- Better understand your target audiences' needs and how to meet them
- Design objectives that are more achievable and measurable
- Monitor progress toward objectives more effectively and efficiently
- Learn more from evaluation
- Increase your program's productivity and effectiveness

10 reasons to evaluate your project
1. So you know whether it's working
2. So you can be adaptable
3. To know how things are working
4. So you're aware of unintended outcomes
5. To better communicate the value of your work

10 reasons to evaluate your project (cont.)
6. To focus your work
7. To help look after the people you work with
8. To build organisational resilience
9. To know why things are working
10. Because life is complicated

Make evaluation part of your program; don’t tack it on at the end!

Resources
- The Magenta Book – …/uploads/attachment_data/file/220542/magenta_book_combined.pdf
- The Magenta Book: guidance notes for policy evaluation and analysis – …/content/uploads/2011/09/the_complete_magenta_book_2007_edition2.pdf

Contact details
Centre for Ageing Studies, Faculty of Health and Social Care, University of Chester, Riverside Campus, Chester
Dr Basma Ellahi – Reader in Food and Nutrition
Professor Paul Kingston – Director of the Centre for Ageing Studies; Professor of Mental Health and Ageing
Cat Crum – PhD Student, sponsored by Age UK South Staffordshire