
1 Copyright © 2006 Healthy Teen Network. All rights reserved. Promoting Evidence-Based Approaches to Teen Pregnancy Prevention – CityMatCH Audio-Conference, Nov. 30, 2006. Mary Martha Wilson, Training Director, Healthy Teen Network, marymartha@healthyteennetwork.org

2 Defining “Evidence-Based” In 2002, the CDC Division of Reproductive Health funded a National Project to promote science-based approaches in teen pregnancy, HIV and STI prevention. Project goal: to decrease teen pregnancy, STI and HIV rates by increasing the use of research-proven practices and programs, or what we call “science-based approaches.” Three national and five state organizations were funded and, together with CDC, agreed on definitions.

3 Six Approaches 1. Using local and state data to select and assess the priority populations for programs; 2. Identifying the risk and protective factors of the priority populations to be served; 3. Using health behavior or health education theory to guide the selection of risk and protective factors that will be addressed by the program, and also to guide the selection of program strategies;

4 Six Approaches 4. Using a logic model to link risk and protective factors with program strategies and outcomes; 5. Conducting process and outcome evaluation of the implemented program, and using evaluation data to make modifications; and 6. Selecting, adapting if necessary, and implementing programs that are either science-based or promising.

5 Evidence-Based Programs Evidence-based programs are those programs proven through rigorous evaluation to be effective in changing sexual risk-taking behavior (e.g., delay onset of sexual intercourse, use condoms and contraception, reduce frequency of sex, reduce number of sexual partners). Criteria: used a comparison design; sample size over 100; changed sexual risk-taking behavior at 3-6 months after the program; published in a peer-reviewed journal.

6 What Are They? Lists of evidence-based programs: Advocates for Youth – Science and Success: www.advocatesforyouth.org. Slightly different criteria: National Campaign to Prevent Teen Pregnancy – What Works: www.teenpregnancy.org; PASHA at Sociometrics: www.socio.com

7 Examples Safer Choices; Reducing the Risk; Making Proud Choices; SiHLE: Health Workshops for Young Black Women; Teen Outreach Program (TOP); HIV Risk Reduction for African American and Latina Adolescent Women; Aban Aya Youth Project for African American Boys

8 Promising Programs Not everyone will choose an evidence-based program and then replicate that program faithfully! Promising programs are those that have not been rigorously evaluated but have most of the characteristics of effective programs. The Tool to Assess for Characteristics of Effective Teen Pregnancy, STD and HIV Programs (The TAC) is available at: www.healthyteennetwork.org

9 No Matter What… …programs we are implementing, we all need to conduct some kind of program evaluation to know if our programs are meeting objectives. Basic program evaluation doesn’t have to be difficult, time-consuming or expensive!

10 Why Don’t We Evaluate? What keeps people from conducting evaluations of their programs? Lack of knowledge; Lack of time, skills, supports, funds; Funder’s requirements for an external evaluator; Fear and anxiety; Other program and organizational priorities

11 It’s Essential! Who wants to know about the effectiveness of your programs? Board and staff; Funders; Potential funders; Partners; Potential partners; Community

12 Purposes of Evaluation Process: think numbers; Outcome: think changes in the program participant; Impact: think long-term changes in a population or community

13 Process Evaluation  Used to assess whether or not the program is being implemented the way it is described on paper.  Conducted throughout the implementation of the final program design.  Can also be used to revise and strengthen a program.

14 Outcome Evaluation  Focuses on changes that occur in the participants shortly after the completion of the program.  Changes are generally categorized into knowledge, attitude and/or behavior changes.  Requires pre- and post-test data.  Can use a control or comparison group.

15 Examples of Outcome Objectives Knowledge (information, facts, stats) –List 5 contraceptive methods and describe how they work. –List 3 common symptoms associated with STDs. Attitude (values, opinions, feelings) –How comfortable are you in talking to your parents about sex? –When do you believe it’s OK for someone to have sex? Behaviors (skills, actions) –Can refuse a sexual advance assertively. –Can demonstrate the correct use of a condom.

16 Impact Evaluation  Similar to Outcome Evaluation, except changes are tracked over a longer period of time.  Follow-up data is generally tracked 12 months or more after the completion of the intervention.

17 BDI Logic Models and Evaluation A BDI logic model links Interventions → Determinants → Behaviors → Health Goal. Interventions help to develop process objectives; determinants help to develop outcome objectives (short term); behaviors and the health goal help to develop impact objectives (long term).

18 Writing Objectives Be specific: Clarify who, what, how much, and by when. Make it measurable: Are there ways to measure your success? Do you have the resources to do so? Be realistic: Don’t over-promise! Focus on funder requirements and your interests.

19 Evaluation Designs 1. Post-only design 2. One-group pretest-posttest design 3. Non-equivalent control or comparison group design 4. Randomized pretest-posttest control or comparison group design

20 Post-Only Design (Program → Measurement) + Easy - Without pretest, don’t really know magnitude of change

21 One Group Pre-Post Design (Measurement → Program → Measurement) + Relatively easy - Without comparison group, limited ability to attribute changes to program

22 Comparison Group Design Experiment group: Measurement → Program → Measurement; Comparison group: Measurement → NO Program → Measurement. + Measurement before and after program + Includes comparison group that is similar to program group (or the same for randomized design) - Comparison group may be biased - More expensive and difficult to manage
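To make the comparison-group idea concrete, here is a minimal Python sketch of how pre/post scores from a program group and a comparison group might be summarized. The scores, group sizes, and the simple difference-in-differences arithmetic are illustrative assumptions, not part of the original presentation.

```python
from statistics import mean

# Hypothetical knowledge-test scores (0-10 scale), collected before and
# after the program from a small program group and a comparison group.
program_pre = [4, 5, 3, 6, 5]
program_post = [7, 8, 6, 9, 7]
comparison_pre = [4, 5, 4, 6, 5]
comparison_post = [5, 5, 4, 6, 6]

# Average change within each group (posttest mean minus pretest mean).
program_change = mean(program_post) - mean(program_pre)
comparison_change = mean(comparison_post) - mean(comparison_pre)

# Simple difference-in-differences: the change in the program group
# beyond whatever change the comparison group showed on its own.
estimated_effect = program_change - comparison_change

print(f"Program group change:     {program_change:.2f}")
print(f"Comparison group change:  {comparison_change:.2f}")
print(f"Estimated program effect: {estimated_effect:.2f}")
```

A real outcome evaluation would use matched or randomized groups and an appropriate statistical test, but this arithmetic is the core of what a pretest-posttest comparison design measures.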

23 Quantitative & Qualitative Methods Qualitative data collection methods emphasize deep and detailed understandings of human experience. Issues are explored openly without the constraint of predetermined categories. Quantitative data collection methods emphasize precise, objective and generalizable findings. These methods require the use of standardized measures so that varying perspectives and experiences can fit into a limited number of predetermined response categories.

24 Qualitative Measurement Methods  Interviews  Focus Groups  Open-ended responses on a survey  Portfolios  Essays  Meeting minutes  Observations/Field notes  Photo Voice

25 Qualitative Measurement Methods Advantages + Captures more depth + Provides insight to why and how Disadvantages - Time consuming to capture and analyze data - More difficult to summarize results - Typically yields smaller sample

26 Quantitative Measurement Methods  Surveys – closed-ended questions (e.g., True/False, multiple choice, matching, Likert scale)  Implementation/activity logs  Performance tests  Clinical tests (e.g., urine and blood tests for STDs)
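As a small illustration of why closed-ended items are easy to summarize, here is a minimal Python sketch that tallies responses to one Likert-scale item into counts and percentages; the item wording and responses are invented for the example and are not from the presentation.

```python
from collections import Counter

# Hypothetical responses to one Likert-scale survey item:
# "I feel comfortable talking to my parents about sex."
scale = ["Strongly disagree", "Disagree", "Neutral", "Agree", "Strongly agree"]
responses = ["Agree", "Neutral", "Agree", "Strongly agree", "Disagree",
             "Agree", "Neutral", "Strongly agree", "Agree", "Disagree"]

counts = Counter(responses)
total = len(responses)

# Print each response option in scale order with its share of the sample.
for option in scale:
    n = counts.get(option, 0)
    print(f"{option:<17} {n:>2}  ({n / total:.0%})")
```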

27 Quantitative Measurement Methods Advantages + Easy to administer + Can include relatively large number of questions + Can yield large samples + Easier to summarize data + More widely accepted as a form of evidence regarding program effectiveness

28 Quantitative Measurement Methods Disadvantages - Data may not be as rich or detailed as qualitative data - Survey taking is difficult for some participants - Large amounts of data require a more sophisticated analysis approach

29 Evaluation Reporting Possible Methods: Written Report (e.g., annual report); Oral Presentation; Newsletter/Journal Article; Poster Board Presentation; Grant Proposal

30 What to Include in Evaluation Report 1. Executive Summary (summary of evaluation findings) 2. Project Background & Description 3. Evaluation Methods 4. Description of Tools Used to Collect Data 5. Report on Data (tables, charts, etc.) 6. Changes made to program as a result of evaluation

31 Where Can You Find: Program Evaluation Info: –Sage Publications: sagepub.com –Sociometrics: socio.com –Philliber Research: philliberresearch.com –American Evaluation Association: eval.org –Management Assistance Program for Nonprofits: mapnp.org Info on State Coalition and Health Dept contacts for each state: –Healthy Teen Network Coalition Directory –Available on the HTN website 1-1-07

32 Resources Proven Programs –Advocates For Youth’s Science and Success –NCPTP’s What Works, Emerging Answers Research –Science Says Research Briefs –Kirby’s Risk and Protective Factor Paper –Kirby’s 17 Characteristics of Effective Programs

33 Resources Healthy Teen Network’s Trainings on Science-Based Approaches –Introduction to Science-Based Approaches, BDI Logic Model, Program Evaluation Basics, Assessing for Program Characteristics, HIV and TPP Integration, Getting to Outcomes, etc. –Coming Up: Adaptation Guidelines

