Evaluation is a professional and ethical responsibility and a core part of PHN professional practice. Commitment to evaluation helps build the PHN intelligence base about:
- what interventions work
- in what context
- why and how
Evaluation findings help improve practice, making it more effective and efficient.

Evaluation
Evaluation determines the extent to which an intervention has achieved the desired health outcomes and assesses the contribution of the strategies used. The key reasons for evaluating PHN interventions are to assess and improve:
- Effectiveness – has the intervention worked?
- Efficiency – relative effectiveness compared with other interventions
- Efficacy – effectiveness under ideal circumstances
- Economic impact – cost-effectiveness and use of resources
- Intelligence – informing future planning and theory building
- Accountability – justifying resource allocation and use

Action Statements and Evaluation Levels
Goals, objectives and strategies map directly onto evaluation levels:
- The problem is reflected in the goal, which is measured by outcome evaluation.
- Determinants are reflected in objectives, which are measured by impact evaluation.
- Strategies are reflected in strategy activities, which are measured by process evaluation.

Levels of Evaluation
There are several different levels of evaluation in PHN practice:
1. Formative evaluation – data collected prior to intervention implementation, used to inform intervention design and assess capacity
2. Process evaluation – assesses the intervention strategies and capacity-building strategies
3. Impact evaluation – measures whether the intervention objectives have been met
4. Outcome evaluation – measures whether the intervention goal has been met
5. Economic evaluation – measures the cost-effectiveness of the intervention or intervention strategies

Qualitative vs Quantitative Methods
The two main forms of data gathering used in evaluation are qualitative and quantitative.
Quantitative methods focus on numeric data that can be statistically analysed, and can test the extent to which an intervention causes change in health status, health behaviour, knowledge, attitudes, etc.
Qualitative methods attempt to determine the meaning and experience of the intervention for the target group and other participants.
Good quality evaluation usually combines both qualitative and quantitative methods.

Process Evaluation
Process evaluation assesses intervention implementation and is concerned with:
- Intervention exposure – the extent to which the target group is engaged with or aware of the PHN problem
- Reach – the proportion of the target group who participate (see the sketch after this list)
- Participant satisfaction – whether participants are happy with and like the intervention activities
- Delivery – whether activities are implemented as intended
- Fidelity – assessing the performance of intervention materials and components
- Contextual aspects – aspects of the environment that influence intervention implementation
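Reach and exposure are typically reported as simple proportions of the target group. A minimal sketch of the arithmetic in Python, using made-up counts for a hypothetical school-based intervention:

```python
def proportion(numerator: int, denominator: int) -> float:
    """Return a percentage, guarding against an empty target group."""
    return 100.0 * numerator / denominator if denominator else 0.0

# Illustrative counts (hypothetical figures, not from the source).
target_group_size = 400   # eligible students in the target group
participants = 252        # students who attended at least one session
aware = 310               # students aware of the PHN problem (e.g. from a survey)

print(f"Reach: {proportion(participants, target_group_size):.1f}%")     # 63.0%
print(f"Exposure: {proportion(aware, target_group_size):.1f}%")         # 77.5%
```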

Process Evaluation
Both quantitative and qualitative methods are used in process evaluation:
- Quantitative methods measure the reach, delivery and exposure aspects of the intervention
- Qualitative methods assess participant satisfaction, fidelity and contextual elements of intervention delivery
Process evaluation provides rapid feedback on the quality and integrity of the intervention, making it a useful management tool. It is also relatively low cost, which makes it a practical quality-assurance tool.

Evaluating Education Materials
Several tools exist for evaluating education materials:
- Standard protocol for leaflets and audiovisual materials – considers attraction, comprehension, acceptability, personal involvement and persuasion
- SMOG test – a readability formula based on counting polysyllabic words (a sketch follows this list)
- Group leader performance – a true/false questionnaire completed by participants
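The SMOG grade is computed from the number of polysyllabic (three or more syllable) words in a sample of sentences, using McLaughlin's formula: grade = 3.1291 + 1.0430 × √(polysyllables × 30 / sentences). A minimal sketch in Python; the vowel-group syllable count is a rough heuristic, not a true syllabifier:

```python
import math
import re

def smog_grade(text: str) -> float:
    """Estimate the SMOG reading grade of a piece of text.

    grade = 3.1291 + 1.0430 * sqrt(polysyllables * 30 / sentences)
    """
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    if not sentences:
        return 0.0
    words = re.findall(r"[A-Za-z']+", text)
    # Approximate syllables as vowel groups; words with >= 3 are polysyllabic.
    polysyllables = sum(
        1 for w in words if len(re.findall(r"[aeiouy]+", w.lower())) >= 3
    )
    return 3.1291 + 1.0430 * math.sqrt(polysyllables * 30 / len(sentences))

leaflet = "Choose a variety of vegetables every day. Limit sugary drinks."
print(f"SMOG grade: {smog_grade(leaflet):.1f}")
```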

Process Evaluation
Key methodological components to consider in process evaluation:
- Design – the timing of data collection: when and how often data will be collected. Example: observe classroom activities at least twice per semester, with at least 2 weeks of observation; conduct focus groups with participants in the last month of the intervention.
- Data sources – the source of the information (for example, who will be surveyed, observed or interviewed). For both qualitative and quantitative methods, sources include participants, teachers/staff delivering sessions, records, the environment, etc.
- Data collection tools/measures – the instruments, tools and guides used for gathering process-evaluation data, such as surveys, checklists, observation forms, interview guides, etc.
- Data collection procedures – protocols for how each data collection tool will be administered: a detailed description of how to conduct quantitative/qualitative classroom observation, face-to-face or phone interviews, mailed surveys, focus groups, etc.
- Data management – procedures for getting data in from the field and entered, plus quality checks. Example: staff turn in participant sheets weekly; the evaluation coordinator collects and checks surveys and passes them to data entry staff; interviews are transcribed and tapes submitted at the end of the month.
- Data analysis – the statistical and/or qualitative methods used to analyse or summarise the data, including the statistical software that will be used for the quantitative data and the types of qualitative analysis used.
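These components can be captured in a structured evaluation work plan. A minimal sketch, assuming a Python codebase; the class and field names are illustrative, not from any published template:

```python
from dataclasses import dataclass

@dataclass
class ProcessEvaluationPlan:
    """Illustrative container mirroring the checklist above (hypothetical names)."""
    design: str                # timing: when and how often data are collected
    data_sources: list[str]    # who/what will be surveyed, observed, interviewed
    tools: list[str]           # surveys, checklists, observation forms, guides
    procedures: str            # how each tool will be administered
    data_management: str       # field-to-entry workflow and quality checks
    data_analysis: str         # statistical and/or qualitative analysis methods

plan = ProcessEvaluationPlan(
    design="Observe classes twice per semester; focus groups in the final month",
    data_sources=["participants", "teachers/staff delivering sessions", "records"],
    tools=["participant survey", "observation form", "focus group guide"],
    procedures="Face-to-face classroom observation; mailed participant survey",
    data_management="Weekly participant sheets checked by the evaluation coordinator",
    data_analysis="Descriptive statistics for surveys; thematic analysis of transcripts",
)
```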

Process Indicators
Evaluation indicators are the criteria against which data or observations are assessed to judge intervention success or failure. Evaluation indicators may come from:
- Historical comparisons with similar efforts in the past
- Comparisons with contemporary activities
- Professional consensus – using the above together with professional judgement
Finding comparison data may be difficult due to a lack of published results, so collective professional judgement should be applied.