Capacity Building for Program Evaluation in a Local Tobacco Control Program
Eileen Eisen-Cohen, Maricopa County Tobacco Use Prevention Program

Evaluation Overview

Tobacco use is the single most preventable cause of death and disease in the United States, contributing to more than 430,000 deaths annually. [1] Many tobacco prevention and control programs do tremendous work that is never fully recognized by the public, by other health professionals, or even by the people who benefit directly from the program's accomplishments. Why does this happen? Usually it is because program managers and staff strongly believe that their work is doing good, but have no solid evidence with which to prove their success to people outside the program. In other words, such programs are missing one important element: evaluation.

Program evaluation is a necessary function if a local tobacco control program is to achieve its programmatic goals. In many programs, however, evaluation is an afterthought to program planning and implementation, particularly when it is not the highest priority. Moreover, tobacco control staff may not have the requisite skills, resources, familiarity, or desire to conduct evaluation. Evaluation allows us to monitor program implementation against the program design, as the sketch below illustrates.
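A minimal sketch of that idea in Python: compare the activities planned in a program's logic model against monitoring records and flag anything not yet implemented as designed. The program names and data here are invented for illustration; they are not drawn from the Maricopa County program.

```python
# Hypothetical sketch: check a logic model's planned activities
# against monitoring records. All names and data are invented.

# A logic model lists what goes into a program, which activities
# take place, and who is reached as a result.
logic_model = {
    "inputs":     ["county funding", "health educators"],
    "activities": ["school presentations", "quitline referrals",
                   "retailer compliance checks"],
    "outputs":    ["youth reached", "referrals made"],
    "outcomes":   ["reduced youth tobacco initiation"],
}

# Activities that monitoring data show have actually occurred so far.
implemented = {"school presentations", "quitline referrals"}

def unimplemented_activities(model, done):
    """Return planned activities with no implementation record yet."""
    return [a for a in model["activities"] if a not in done]

for activity in unimplemented_activities(logic_model, implemented):
    print(f"Not yet implemented as designed: {activity}")
# -> Not yet implemented as designed: retailer compliance checks
```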
Evaluation also encourages us to examine all the parts of a program: what we put into the program, which activities take place, who conducts the activities, and who is reached as a result. [2] Evaluation provides information that serves a variety of purposes, including:

- Learning whether proposed program materials are suitable for the people who are to receive them.
- Learning whether program plans are feasible before they are put into effect.
- Ensuring that a program is being conducted as it was designed.
- Providing an early warning system for problems that could become serious if left unattended.
- Monitoring whether programs are producing the desired results.
- Learning whether programs have any unexpected benefits or problems.
- Enabling managers to improve services.
- Monitoring progress toward the program's goals.
- Producing data on which to base future programs.
- Demonstrating the program's effectiveness to the target population, to the public, to others who want to conduct similar programs, and to those who fund the program. [3]

The CDC suggests that a good evaluator:

- Has experience in the type of evaluation needed.
- Is comfortable with both qualitative and quantitative data sources and analysis.
- Is able to work with a wide variety of stakeholders.
- Can develop innovative approaches to evaluation while considering the realities affecting a program.
- Incorporates evaluation into all program activities.
- Understands both the potential benefits and risks of evaluation.
- Educates program personnel about designing and conducting the evaluation.
- Gives staff the full findings (i.e., does not gloss over or fail to report certain findings for any reason).
- Has strong coordination and organization skills.
- Explains material clearly and patiently.
- Respects all levels of personnel.
- Communicates well with key personnel.
- Exhibits cultural competency.
- Delivers reports and protocols on time. [4]

Tips for Best Results

For best results with evaluation planning, consider the following tips:

- Invest heavily in planning.
- Communication is key: meet with project teams and individuals on an ongoing basis, and communicate personally with staff at all levels. Make sure line staff as well as management understand the evaluation's goals and its benefits to the organization.
- Use humor.
- Streamline data collection systems for staff's ease whenever possible.
- When training staff on evaluation issues, it is not always beneficial to review the whole forest; one "park" is often enough.
- Expand the evaluator's role to serve as the intermediary among other evaluators, programmers, and program staff.
- Combine the evaluation's data collection needs with what program managers need for program monitoring.
- When a change is coming, introduce it early and often, in different settings.
- Consider the objective of the evaluation at every step along the way.
- Refer to your logic model to be certain that you have considered all parts of the program.
- Plan to evaluate early and often: it is less costly to correct problems as they occur, and the sooner problems are detected and corrected, the more likely the program is to succeed overall. [5]

References

1. Centers for Disease Control and Prevention. (1997). Smoking-attributable mortality and years of potential life lost—United States. MMWR, 46(20), 441–.
2. Centers for Disease Control and Prevention. (2001). Describe the program. In Introduction to program evaluation for comprehensive tobacco control programs. Atlanta, GA: the Author.
3. TTAC website.
4. Centers for Disease Control and Prevention. (2001). Introduction. In Introduction to program evaluation for comprehensive tobacco control programs (pp. 5–13). Atlanta, GA: the Author.
5. Child Outcomes Research and Evaluation Team. (n.d.). Why evaluate your program? In The program manager's guide to evaluation. Retrieved January 25, 2004, from the Administration for Children and Families web site.

Department of Public Health. Funded by the Arizona Department of Health Services.