


Presentation on theme: "CAPP Evaluation: Implementing Evidence Based Programs in NYS Jane Powers ACT for Youth Center of Excellence 2011 A presentation for Comprehensive Adolescent."— Presentation transcript:

1 CAPP Evaluation: Implementing Evidence Based Programs in NYS Jane Powers ACT for Youth Center of Excellence 2011 A presentation for Comprehensive Adolescent Pregnancy Prevention (CAPP) providers in New York State

2 Overview Review Basic Concepts of Evaluation The Science of Implementation CAPP evaluation: Implementing EBPs Evaluation Partnership with COE

3 1) You care about youth 2) You want to make a difference in their lives 3) You want to know whether you have made a difference in their lives

4 My question is, “Are we making a difference?”

5 Program Evaluation can help us answer the question: Are we making a difference???

6

7 What is Program Evaluation? “Program evaluation is the systematic collection of information about the activities, characteristics, and outcomes of programs to make judgments about the program, improve program effectiveness, and/or inform decisions about future programming.” Michael Quinn Patton (1997)

8 Evaluation Terminology

9 Types of Program Evaluation: PROCESS and OUTCOME http://www.actforyouth.net/youth_development/evaluation/

10 Process Evaluation Focuses on: – What happened in the program: Who got how much of what? Was the program implemented as planned? – Participant reactions to the program

11 Examples of Process Questions Which youth are participating in our program? (neighborhood, RHY, LGBTQ, FC) Who are we not reaching? How many sessions were offered? What % of participants attended all of the sessions? What program activities were conducted? Were there any adaptations made to the EBP?

12 Outcome Evaluation Focuses on whether the program made a difference Answers the question: SO WHAT? What difference does the program make for participants, individuals, groups, families, and the community?

13 Examples of Outcome Questions Have adolescents increased knowledge about different types of birth control? Have adolescents learned how to use a condom? Have attitudes toward condom use changed? Are parents more knowledgeable about adolescent sexuality? Do parents talk to their kids about contraception?

14 Process Data Are Foundational In order to get good outcome data, you must obtain good process data

15 CAPP Initiative Goal 1: Promote healthy sexual behaviors and reduce the practice of risky sexual behaviors among adolescents. Core Strategy 1: Provide comprehensive, age-appropriate, evidence-based, and medically accurate sexuality education to promote healthy sexual behaviors

16 We know a lot about what works to prevent teen pregnancy. EBPs increase the age of first intercourse, increase use of condoms, decrease the number of sexual partners, and decrease the frequency of sex, thereby decreasing teen pregnancy and promoting adolescent sexual health

17 The Prevention Research Cycle: 1) Identify problem or disorder and determine its extent 2) Identify risk and protective factors associated with the problem 3) Develop intervention and conduct efficacy trials 4) Conduct large-scale effectiveness trials of the intervention 5) Implement the program in the community and conduct ongoing evaluation, with a feedback loop. Reproduced from Fig. 1, the interactive systems framework for dissemination and implementation (p. 174), published in Wandersman et al., 2008.

18 Just DO IT!

19 What do we know about implementing EBPs in communities?

20 Taking EBPs to Scale Very little is known about the processes required to effectively implement EBPs on a national scale (Fixsen et al., 2005). Research to support the implementation activities that are being used is even rarer. While many EBPs have yielded positive outcomes in research settings, the record at the local level of "practice" is mixed (Wandersman, 2009; Lesesne et al., 2008).

21 What do we know about Implementation? Durlak and DuPre, 2008: Level of implementation influences program outcomes. If EBPs are not implemented with fidelity and quality, they are not likely to produce the outcomes observed in research. Achieving good implementation increases the chances of program success and stronger benefits for participants.

22 Factors Affecting Implementation: Community Level, Facilitator Characteristics, Program Characteristics, Organizational Capacity, Training and TA

23 Need to Document Implementation Assessment of implementation is critical in program evaluation. Evaluations that lack carefully collected implementation data are incomplete. Our understanding of program outcomes rests on knowing how the intervention was delivered.

24 The Fidelity Tension Program developers and prevention researchers are concerned that changes in implementation of an EBP will dilute its effectiveness. Community leaders and practitioners are concerned that "one size does not fit all." US Department of Health and Human Services, 2002

25 HELP NEEDED!!!

26 Data Collection Tools for CAPP Evaluation of Implementation Fidelity Checklist: individualized; keep track of what you did, successes/challenges. Attendance Record: who you reached, where, dosage.

27 Fidelity Checklist

28 Demographic Survey

29 Attendance Record

30 After you have completed an entire cycle of the EBP (i.e., ALL of the EBP sessions or modules): 1) Send all the completed evaluation tools (except the Brief Demo Survey!) to the Center of Excellence. 2) Make sure that you clip all completed documents together so that we can keep track of individual EBP cycles. This includes: Fidelity Checklist (one per EBP cycle) and Attendance Record (one per EBP cycle, with all names removed). 3) Mail these documents to: Amy Breese, Cornell University, ACT for Youth Center of Excellence, Beebe Hall, Ithaca, NY 14853

31 Questions? Amanda Purington: ald17@cornell.edu or 607-255-186

32 Comments? Jane Powers: jlp5@cornell.edu or 607-255-3993

