Planning Evaluation: Setting the Course
Source: Thompson & McClintock (1998). Demonstrating your program's worth. Atlanta: CDC/National Center for Injury Prevention & Control.



Planning Evaluation What’s going on? If these youth were part of a 4-H program, how would you show evidence for program quality and outcomes? What would they (or their parents, teachers, or peers) tell you about their experience?

Why Evaluate? Brainstorm reasons for evaluating programs

Reasons to Evaluate
Prove (scientists "show evidence")
– Program impact (school/college/career success)
– Program outcomes (knowledge-attitude-skills-aspirations)
– Program quality (best practices)
Improve: guidance to reach the audience
Approve: feedback for staff

Rationale for Evaluation
– Demonstrate solid evidence for success
– Allow other programs to learn
– Monitor ongoing quality and outcomes

Summing up evaluation
"…the process of determining whether a program or certain aspects of a program are appropriate, adequate, effective, or efficient, and if not, how to make them so." Source: Thompson & McClintock (1998). Demonstrating your program's worth. Atlanta: CDC/National Center for Injury Prevention & Control.

Evaluation may bring more than you expected
– People talk…and feel good that you listen
– You talk…stakeholders and media listen
– Problems become opportunities
– Programs are sometimes "better than expected"

Begin with the end in mind
– Clear and definite objectives
– Distinctive target population
– Straightforward indicators of success
– Evaluation integrated with programming
– Appropriate, well-tested methods and tools
– Comparison data (population, control)
– Information about process and quality

Shakespeare evaluates
Stage 1, Formative (Implementation): Is it in place?
Stage 2, Formative (Process/Progress): Is it serving the target audience?
Stage 3, Summative (Outcome): Is it getting results?
Stage 4, Summative (Impact): Is it building results?

Planning Evaluation: Formative (Implementation)
Is the project being implemented according to plan? (e.g., participant selection and involvement; activities and strategies; adjustments matching the program plan; capable staff members hired, trained, and well-managed; materials and equipment ready; timelines maintained; appropriateness of personnel; and the development and fulfillment of the management plan)

Planning Evaluation: Formative (Progress)
Is the project progressing toward planned results? (e.g., participant progress on key indicators; activities and strategies fostering progress)

Program Fidelity How can you say that changes in youth knowledge, attitudes, skills, or aspirations result from your program rather than some external factor?

Program Fidelity Keys
Document pre- and post-project scores
Monitor best practices and youth progress via:
– External observers
– Youth participant feedback
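Documenting pre- and post-project scores usually comes down to a paired comparison: each participant's score after the program against their score before it. The sketch below is a minimal illustration, assuming hypothetical survey data (the names and scores are invented, not from any real program):

```python
# Hypothetical pre/post survey scores for the same participants.
# Invented data for illustration; a real program would load actual survey results.
pre_scores = {"Ana": 12, "Ben": 15, "Cy": 9, "Dee": 14}
post_scores = {"Ana": 16, "Ben": 18, "Cy": 14, "Dee": 15}

def average_gain(pre, post):
    """Mean change per participant, pairing each youth's pre and post score."""
    gains = [post[name] - pre[name] for name in pre]
    return sum(gains) / len(gains)

print(average_gain(pre_scores, post_scores))  # → 3.25
```

Pairing by participant (rather than comparing group averages alone) keeps the comparison tied to individual change, which is what the fidelity question above asks about.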

Planning Evaluation Summative: (Short-term) Outcomes At the completion of each/all “units,” how have participants changed? (e.g., knowledge, attitudes, skills, aspirations)

Planning Evaluation Summative: (Long-term) Impacts As a result of program participation, what profound changes occurred in a youth (family, community)? (e.g., behavior, application of program lessons)

Outcome Expectations
– What kinds of changes are significant?
– How much change is enough?
– What if some participants don't change?
– How long will changes "stick"?

Answers on Outcome Expectations It depends

Clarifying Expectations: What kinds of changes are significant?
– Depends on the factor (e.g., attitude toward reading vs. reading comprehension)
– Depends on the audience (e.g., competent readers vs. struggling readers)
– Depends on the program (e.g., one-time/short-term vs. all year/all summer)
– Depends on the context (e.g., stage/pace-appropriate vs. constrained or chaotic)

Clarifying Expectations: How much change is enough?
– Depends on the above (reality, research)
– Depends on funder expectations
Often a critical first step, or progress toward a goal, is the key indicator of continued success (think about staying up on your first bike).

Clarifying Expectations: What if some participants don't change?
– See the above (clarify expectations first)
– Critically examine threshold criteria (e.g., minimal health, safety, and education goals vs. substantial or optimal improvement)
– Critically examine program potential (e.g., relative benefit for specific participants)

Clarifying Expectations: How long will changes "stick"?
– See the above (check research and reason)
– Depends on the nature of the change:
  – Interest in science or practice of healthy eating sustained through life (turning point)
  – Increasing involvement and growth in ongoing programming (cumulative benefits)

So where do we begin?
– Create a "logic model" that describes what results you want and how to get to them
– Check the research to see what others have learned
– Get to know your audience so that you know what results are relevant for them
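A logic model is commonly laid out as a chain of inputs, activities, outputs, and outcomes. As a rough sketch only (the program details below are invented for illustration, not drawn from the source), the structure can be written out like this:

```python
# Minimal logic-model sketch for a hypothetical 4-H reading program.
# All entries are invented placeholders; a real model comes from program planning.
logic_model = {
    "inputs":     ["trained volunteers", "curriculum", "meeting space"],
    "activities": ["weekly reading club", "family literacy nights"],
    "outputs":    ["24 sessions held", "30 youth served"],
    "outcomes":   ["improved attitude toward reading", "higher reading comprehension"],
}

# Print the model as a simple chain, one stage per line.
for stage, items in logic_model.items():
    print(f"{stage}: {', '.join(items)}")
```

Writing the model down in this stage-by-stage form makes the later evaluation steps concrete: the "outputs" row feeds process evaluation, and the "outcomes" row supplies the indicators for summative evaluation.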