10/05 1 Program Evaluation Framework
Before we begin, a little about our format…
- Presentation by seminar speaker (approx. 30 min.)
- Followed by question and answer session (approx. 30 min.)
- Please press *6 on your telephone keypad to mute your line (to un-mute your line, press *6 again)
- If you are experiencing difficulty with your phone connection, dial *0 for the conferencing service operator
- For questions that arise during the presentation, click on the “hand” button and wait to be called on to ask your question over the phone

10/05 2 Program Evaluation Overview: A Conversation with Uyen H. Kao, MPH
November 16, 2006
Center for HIV Identification, Prevention, and Treatment Services

10/05 3 Acknowledgement
- AIDS Project Los Angeles
- Research and Evaluation Core—APLA
- Center for HIV Identification, Prevention, and Treatment Services—UCLA
- The César E. Chávez Institute

10/05 4 About the presenter: Mrs. Uyen Kao works for the Department of Family Medicine at UCLA and the Center for HIV Identification, Prevention, and Treatment Services (CHIPTS). As a Project Director, she manages a NIDA-funded study examining the potential emerging public health problem of treatment-resistant HIV transmission among methamphetamine-abusing MSM in Los Angeles. She provides a broad array of HIV-related technical assistance, including trainings on grant proposal writing, adapting and tailoring evidence-based HIV interventions, and HIV program planning and evaluation. Prior to CHIPTS, she was with AIDS Project Los Angeles, where she managed a capacity building project funded by the Los Angeles County OAPP to provide program evaluation training and technical assistance services. Her research interest is in the sociocultural factors that impact women’s health, adolescent sexuality, and HIV disclosure among those infected. Mrs. Kao received her Master of Public Health from UCLA.

10/05 5 Seminar Objectives
- Define program evaluation and understand how it fits into program planning
- Identify steps for evaluating a program

10/05 6 “I think you should be more explicit here in step two.”

10/05 7 What is Program Evaluation? “The systematic (orderly) collection of information about the characteristics, activities, and outcomes of services or programs to assess the extent to which objectives have been achieved, identify needed improvements, and/or make decisions about future programming.” – HRSA, 1999

10/05 8 Benefits of Evaluation
- Support decision-making and program planning
- Stay on track
- Improve program and service delivery
- Fulfill grant or contract requirements
- Determine cost-effectiveness
- Make budgetary decisions
- Provide evidence for future funding

10/05 9 Barriers to Evaluation
- Lack of skills
- Limited resources
- Lack of support
- Fear of consequences
- Burden on clients and staff

10/05 10 Evaluation Framework
STEP 1 Determine Purpose and Uses
STEP 2 Develop Measurable Objectives
STEP 3 Develop Evaluation Questions
STEP 4 Collect/Gather Credible Evidence
STEP 5 Analyze Info & Develop Conclusion
STEP 6 Report Findings

10/05 11 Case Study
Your agency recently started a Healthy Wellness Program. The goal of the program is to promote physical activity and healthy eating habits among all its employees. As a team leader, you are responsible for coordinating activities as well as evaluating your team’s success. How do you plan to evaluate your program?

10/05 12 STEP 1 Determine Purpose and Uses
1. Who is most likely to need and use the information obtained?
2. What is the primary purpose of the evaluation?
3. How will the information be used?

Case Study
1. Supervisor, Human Resources dept., Executive Director, funder
2. To determine program effectiveness
3. To provide evidence for continuing funds and to encourage more staff participation

10/05 13 STEP 2 Develop Measurable Objectives
Objectives are specific statements that describe what you plan to do with your proposed program within a given time period (CDC, 1999)

10/05 14 Measurable Objectives
- WHEN: Time (date) by or during which it is to occur
- HOW MUCH: Target rate or the amount of change
- FOR WHOM: Refers to the target population
- WHERE: Area in which the target population is located
- IN WHAT: Problem/behavior/outcome to be changed or intervention to be accomplished

10/05 15 Measurable Objectives
Example: By the end of the fiscal year [WHEN], 80% [HOW MUCH] of program participants [FOR WHOM] at G.R.E.A.T. Agency [WHERE] will reach their weight loss goal [IN WHAT].

Measurable objectives should be REALISTIC!

10/05 16 STEP 3 Develop Evaluation Questions
- Help focus the evaluation
- Vary from one program to another
- Based on the purpose, objectives, resources, and timeframe of the evaluation

Case Study
1. Who participated in the program (gender, race, PT/FT, age)?
2. How many participants reached their weight loss goal?
3. What were participants’ satisfaction levels?

10/05 17 STEP 4 Collect/Gather Credible Evidence
1. Identify types of information needed
2. Determine sources for information
3. Select methods to collect information
4. Define procedures to collect information

10/05 18 1. Identify Types of Info Needed
- Variables: observable characteristics of a person, organization, or program that are counted and measured
- Measure: the observable and measurable data or item of information to be collected for a specific variable (also called an indicator)

10/05 19 Examples of Variables & Measures
- Gender: # of males, females, or transgender persons
- Race/Ethnicity: # of persons per race/ethnic category (e.g., Caucasian, African Am., Latino/a, etc.)
- Age: “What is your age?” or “What is your date of birth?” or “What is your age category?”

10/05 20 2. Determine Sources for Info
Sources of information:
- People
- Documents
- Observations

10/05 21 3. Select Methods for Collection
- Methods: document reviews, surveys, interviews, observations, focus groups, case studies
- Selection of data collection methods should be based on:
  - Available resources
  - Desired response rate
  - Timeframe
  - Access to data source
  - Staff experience
  - Reliability and validity

10/05 22 4. Define Procedures for Collection
- When will the information be collected?
- Where will the information be collected?
- Who will collect the information?
- How will the information be collected?

10/05 23 Summary of Step 4 – Collecting Credible Evidence
1. Identify types of information needed
2. Determine sources for information
3. Select methods to collect information
4. Define procedures to collect information

Case Study
1. Weight, height (to calculate BMI)
2. Participants, other staff members, medical history
3. Observation, interview, document review, survey, instrument tool
4. The team leader will obtain weight/height measures from participants using a scale/measuring tape on the 1st of each month between 9 and 10 a.m. in the conference room.
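The BMI calculation the case study relies on can be sketched in a few lines of Python. This is a minimal illustration; the sample weight and height values are hypothetical, and the formula (weight in kilograms divided by height in meters squared) is the standard BMI definition.

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by height in meters squared."""
    return weight_kg / height_m ** 2

# Hypothetical participant: 82 kg, 1.75 m
print(round(bmi(82, 1.75), 1))  # 26.8 -- above the BMI > 25 threshold noted in the case study
```

Collecting height once at baseline and weight monthly, as the procedure above describes, is enough to track this indicator over time.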

10/05 24 STEP 5 Analyze/Develop Conclusions
Data Analysis – the process of categorizing, ordering, manipulating, and summarizing data to obtain answers to evaluation questions

10/05 25 STEP 5 Analyze/Develop Conclusions
1. Enter data and check for errors
2. Tabulate data
3. Analyze data by key characteristics
4. Provide interpretation of findings

10/05 26 1. Enter Data/Check for Errors
- Transfer data into new form
- Check for errors:
  - Look at every nth case
  - Check visually or run frequencies
  - Check if answers make sense

10/05 27 Sample Spreadsheet
Columns: PID; Gender (M / F / T); Race/Ethnicity (WH / BLK / HIS / API / OT); Exercise (Y / N / R). Each row (PIDs 101–107) has one box checked per category, and a Total row sums each column.

10/05 28 Sample Spreadsheet (EXCEL)
Columns: PID, Gender, Ethnicity, Weight, Height, Exercise
Coding: Gender: 1=Male, 2=Female, 3=Transgender; Ethnicity: 1=White, 2=Hispanic, 3=API, 4=Other; Exercise: 1=Yes, 2=No, 3=Refuse

10/05 29 2. Tabulate Data
- Total # of participants
- Frequency
- Percentage
- Ratio
- Mean
- Median
- Mode
- Range

10/05 30 Example
These are the participants’ reported ages: 25, 29, 27, 22, 30, 25, 23, 21, 27, 23, 40, 45, 23, 27, 35
Mean: 422/15 ≈ 28.1 years
Median: 21, 22, 23, 23, 23, 25, 25, 27, 27, 27, 29, 30, 35, 40, 45 → 27 (the middle value)
Mode: 23 and 27 (each appears three times)
Range: 45 - 21 = 24
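These tabulations can be checked with Python's standard statistics module, run on the same ages; this is a convenient way to verify hand calculations before they go into a report.

```python
import statistics

ages = [25, 29, 27, 22, 30, 25, 23, 21, 27, 23, 40, 45, 23, 27, 35]

print(statistics.mean(ages))       # 422/15, about 28.1 years
print(statistics.median(ages))     # 27 -- the 8th of the 15 sorted values
print(statistics.multimode(ages))  # [27, 23] -- bimodal: each appears three times
print(max(ages) - min(ages))       # 24
```

`multimode` (Python 3.8+) reports every tied value, which is why it returns both 27 and 23 here rather than a single mode.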

10/05 31 3. Analyze Data
- Break down data by key characteristics (e.g., age, gender, ethnicity)
- Compare results by key characteristics
- Compare data at different points in time

Case Study
Of the 52 employees, 15 (29%) participated in the Healthy Wellness Program during the first quarter. 73% of the participants were women, and the mean age was 28. Most participants were Hispanic (46%), followed by White (33%) and Black (20%). 47% of participants who had a BMI > 25 at baseline were able to decrease their BMI by at least 1 point. The average weight loss among participants was 7 lbs during the 3-month period.
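Breaking data down by a key characteristic can be sketched as below. The roster and numbers are made up for illustration; the pattern (group records by a characteristic, then summarize each group) is what the slide describes.

```python
from collections import defaultdict

# Hypothetical roster: ethnicity plus baseline and follow-up weight in lbs
participants = [
    {"ethnicity": "Hispanic", "baseline": 180, "followup": 172},
    {"ethnicity": "Hispanic", "baseline": 165, "followup": 160},
    {"ethnicity": "White",    "baseline": 200, "followup": 191},
    {"ethnicity": "Black",    "baseline": 150, "followup": 145},
]

# Group weight loss by ethnicity
loss_by_group = defaultdict(list)
for p in participants:
    loss_by_group[p["ethnicity"]].append(p["baseline"] - p["followup"])

# Average weight loss per group
for group, losses in loss_by_group.items():
    print(group, sum(losses) / len(losses))
```

Running the same breakdown on data collected at different points in time gives the over-time comparison the slide also calls for.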

10/05 32 4. Provide Interpretations
- Helps intended users understand what the numbers may mean
- Increases appreciation for your program
- Shares reasons for why your results are the way they are

10/05 33 STEP 6 Report Findings
Should include a description of:
- The program/services being evaluated
- The purposes of the evaluation
- The methods of data collection
- The results of data analysis
- Discussion of strengths, weaknesses, and implications of the results

10/05 34 STEP 6 Report Findings
Case Study
Based on the evaluation conducted, the Healthy Wellness Program was effective in promoting physical activity and healthy eating among its participants. 67% reported eating more vegetables and fruits than before starting the program, and 53% reported exercising or engaging in physical activity for a total of 1 hour per week post-intervention. The average weight loss was highest among non-Latino participants compared to Latinos. 93% of the participants reported being satisfied or very satisfied with the program. Areas for improvement include outreach and program activities targeted to male employees (e.g., a basketball game during lunch hour, Friday night baseball). The healthy eating workshops and cooking demos need to be more culturally specific, such as including low-carb Latin recipes.

10/05 35 Summary of Six Steps
STEP 1 Determine Purpose and Uses
STEP 2 Develop Measurable Objectives
STEP 3 Develop Evaluation Questions
STEP 4 Collect/Gather Credible Evidence
STEP 5 Analyze Info & Develop Conclusion
STEP 6 Report Findings

10/05 36 QUESTIONS & ANSWERS
1. Click on the “hand” button to raise your hand.
2. Press *6 to unmute your phone.
* Please keep your phone muted at all other times.

10/05 37 Future Acción Mutua web seminars:
- Latina Transgenders & HIV Risk: January 18, 2007
- Program Evaluation Series:
  - January 25, 2007 (overview)
  - February 27, 2007 (part two – process evaluation)
  - March 27, 2007 (part three – outcome monitoring)

10/05 38 Thank You!! Contact info: Uyen Kao, MPH CHIPTS