
1 Looks Good, Feels Good, But Is It Working? Lessons Learned from the Field Based on National Healthy Marriage Models. 13th Annual International Fatherhood Conference, June 9, 2011, Arlington, VA. Presented by Jeanette Stevens

2 Purpose: Evaluating your program is important for anyone considering implementing curriculum-based education within a fatherhood or relationship education program. The purpose today is to provide examples from three years of implementation of healthy relationship/marriage education across three national sites that used the same curriculum.

3 Presentation Overview
– Who Is "AMTC"?
– Developing Effective Program Models
– Choosing Appropriate Curricula and Delivering Curricula with Fidelity
– Evaluation Defined
– Importance of Evaluating Programs
– Results from the Field, Based on the AMTC Evaluation Model

4 AMTC Services: Angela M. Turner Consulting (AMTC) is a full-service evaluation firm that specializes in formative and summative evaluation. AMTC utilizes a team of contracted experts with years of experience in program development, program operations, and independent evaluation services.

5 The AMTC Team: Principal Angela M. Turner is also the CEO of the nonprofit organization, the Center for Self-Sufficiency (CFSS), a current HMI grantee. As a result, AMTC purchases services from CFSS to maximize program expertise and provide comprehensive, yet cost-effective services. CFSS conducts a portion of its evaluation services in-house (data collection and program fidelity) and subcontracts with the same independent university evaluators that AMTC does. This allows for consistency in approach.

6 The AMTC Team (continued)
MANAGEMENT
– Angela M. Turner, Principal
HEALTHY MARRIAGE PROGRAM EXPERTS
– Jeanette Stevens, Program Director
– Various staff and consultants
RESEARCH AND EVALUATION EXPERTS
– Sara Polifka, Director of Research & Planning (R&P)
– Ryan Adomavich, R&P Analyst (Data Collection)
– Meg Houlihan-Block, R&P Analyst (Data Management)

7 The AMTC Team (continued)
INDEPENDENT CONSULTANTS
Summative (Outcome) Evaluation:
– University of Wisconsin, Milwaukee: Dr. Cindy Walker, Lead Researcher
– Consulting Office on Research and Evaluation (CORE): Ph.D. Graduate Students
Formative Evaluation:
– Ann Miller: Theory, Curriculum, Program Fidelity, Training
– Selena Webb-Ebo & Angelique Sharpe: Curriculum and Program Training
– Two Islands Consulting: Database Development, Management and Reporting

8 AMTC Clients
U.S. Department of Health & Human Services
Healthy Marriage:
– Better Family Life, St. Louis, MO
– OIC of America, Philadelphia, PA
– Trinity Church, Miami, FL
– Social Development Commission, Milwaukee, WI (former client)
Responsible Fatherhood:
– Rosalie Manor, Milwaukee, WI

9 AMTC Clients (continued)
Teen Pregnancy Prevention:
– Better Family Life, St. Louis, MO
– Mission West Virginia, WV
– OIC of America, Philadelphia, PA
– OIC of Broward County, FL
– Trinity Church, Miami, FL
Previous clients for teen pregnancy prevention include four organizations in Connecticut, Maryland, Wisconsin and Illinois.

10 AMTC Clients (continued)
Workforce Development: U.S. Department of Labor
Pathways Out of Poverty:
– Better Family Life, St. Louis, MO
– OIC of America, Philadelphia, PA
Energy Training Partnership:
– Broward County Minority Builders Coalition, FL
Offender Reentry Substance Abuse: U.S. DHHS
– OIC of Broward County, FL
Nonprofit Capacity Building, Strengthening Communities Fund: U.S. DHHS
– OIC of America, Philadelphia, PA

11 Link Between Marriage and Fatherhood Programs
– Healthy marriage and relationship education has been used extensively with fathers.
– One of the evaluated programs in this presentation, CFSS, has incarcerated males as a key target group, most of whom are fathers.

12 Developing an Evidence-Based Program Model: Choosing Curricula
Literature Review
– It is important to review the literature on effective programs and curricula designed to provide the skills needed to develop and sustain a healthy relationship.
Identification of Theory of Change
– Identify important theories of behavioral change, which emphasize the impact of environment and cognition on behavior.
– Consider social cognitive theory in selecting curricula.
– Consider the ACF requirement that selected curricula must include communication skills, conflict resolution skills, benefits of a healthy relationship and marriage, and commitment to marriage education.
– Incorporate social cognitive theory into the logic model and survey questionnaire.

13 Developing an Evidence-Based Program Model: Choosing Curricula
In choosing a curriculum, it is important to consider:
– Is the curriculum evidence-based?
– Has the curriculum been tested and shown positive results with the population you are choosing to work with?
– Is the curriculum appropriate in length (or is it too long or too short for the time period you will have with the participants)?
It is important to involve the curriculum author when deciding to exclude or modify lessons.

14 Delivering Curricula with Fidelity To achieve the best results, curriculum lessons should be delivered as designed and in the order that the curriculum is written.

15 Developing an Evidence-Based Program Model: Logic Model
A properly specified logic model allows program designers, program staff, and evaluators to distinguish between failures of implementation and failures of theory.
– Implementation failure means activities were not implemented as planned, and therefore the outcomes analysis will not be very meaningful.
– Failure of the program model occurs when activities are implemented as planned, but outcomes are not achieved. This result calls for reformulation of program design, i.e., program revision or redesign.
Caution: Do not allow logic model development to become an exercise resulting in completed columns on a worksheet. That approach obscures the theoretical linkages between the components (INPUTS → ACTIVITIES → OUTPUTS → OUTCOMES).
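
To make the caution concrete, here is a minimal sketch (in Python) of a logic model held as an explicit data structure, so each activity-to-outcome linkage is stated rather than buried in worksheet columns. The component names and linkages below are hypothetical illustrations, not the actual HMI logic model.

```python
# Hypothetical sketch: a logic model as an explicit structure with named linkages.
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    inputs: list[str] = field(default_factory=list)      # resources invested
    activities: list[str] = field(default_factory=list)  # what the program does
    outputs: list[str] = field(default_factory=list)     # direct products of activities
    outcomes: list[str] = field(default_factory=list)    # changes the theory predicts
    linkages: list[tuple[str, str]] = field(default_factory=list)  # (activity, expected outcome)

model = LogicModel(
    inputs=["trained educators", "curriculum licenses"],             # invented examples
    activities=["deliver 13 weekly relationship-education lessons"],
    outputs=["participants completing all 13 lessons"],
    outcomes=["improved communication self-efficacy"],
    linkages=[("deliver 13 weekly relationship-education lessons",
               "improved communication self-efficacy")],
)

# Implementation failure: outputs fall short of plan.
# Theory (program model) failure: outputs achieved, but outcomes are not.
print(model.linkages)
```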

16 Defining "Evaluation"
Evaluation: "To identify all the changes that are taking place, to measure them, and to assess if the changes are due to the social program, to other extraneous factors, or would have happened anyway."
– Sampson, A. (2007). Developing robust approaches to evaluating social programmes. Evaluation, 13(4), 477-493.

17 Evaluation Is an Investment
Evaluation is an investment in the future sustainability of the program and, in turn, of your organization. Do not skip this step! Build it into every aspect of program planning. It is difficult to add evaluation once program commitments have been made. Evaluation is most effective when multiple years of evaluation are dedicated to testing the model.

18 Purpose of Evaluation
Primary Purpose: Program Impact on Need. Evaluation must measure the impact of the program on the general need identified by the funder and/or the specific need identified by the program administrator. Are outcomes being achieved? Example: Is the HMI program affecting marriage and divorce rates?

19 Purpose of Evaluation
Secondary Purpose: Sustainability. Should funders or donors continue to support your work?
Caveat: this presentation does not address evaluation for the purpose of publicly demonstrating an approach or theory or publishing results, but rather evaluation for internal use.

20 Types of Evaluation
Formative Evaluation is used to design the program and its evaluation, and then to measure the program implementation and evaluation processes once they start, to assure that all is going according to plan.
Summative Evaluation examines the effects of the intervention on the program participants—the outcomes of the intervention.

21 Examples of Formative Evaluation
Before the program
– Community Needs Assessment: Need for Intervention
– Conceptualization of Intervention: Theory of Behavior Change and Logic Model Development
In early stages of program implementation
– Program Fidelity: Implementation as Planned
– Performance Measurement: Systems Needed to Measure Outputs and Outcomes
– Process: Policies and Procedures Designed to Assure Fidelity to Implementation and Evaluation
– Evaluation Feasibility: Timing, Commitment & Resources

22 Timing of Formative Evaluation Formative Evaluation is especially important in years 1 and 2 of the program. Unless program administrators are sure that the program implementation and evaluation processes are carried out according to plan and consistently by all staff, measurement of outcomes is unreliable. Formative Evaluation leads to implementation of the program and evaluation as designed and prepares the way for summative evaluation.

23 Examples of Summative Evaluation
– Have the values of the program participants been impacted?
– After the program, are the intentions of the program participants different from those of a similar group who did not receive the program?
– Are the behaviors of those who participated in the program different from those who did not receive the program?

24 Timing of Summative Evaluation
As soon as formative evaluation assures that program implementation and evaluation processes are carried out with fidelity to the plan, evaluators and program managers can implement summative evaluation. Preferably, summative evaluation can begin after one year, or at most after two. Our philosophy is that you must do formative and summative evaluation together.

25 Tools of Summative Evaluation
Pre- and Post-Surveys
– To measure change in participants, research experts must work with the program staff to decide what will be measured (e.g., values, intentions, self-efficacy, or behaviors) and when those assessments will occur.
– Administer surveys with purpose, or it is a waste of resources. A simple 20-question survey is unlikely to be useful.
– Only an evaluation with a rigorous design can establish causal links between the intervention and the observed outcomes.
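
As an illustration only, a matched pre/post comparison might look like the sketch below, which runs a paired t-test on hypothetical scale scores; the numbers are invented and do not reflect AMTC data, and, as noted above, a significant change by itself does not establish causality.

```python
# Hypothetical sketch: paired t-test on matched pre/post survey scale scores.
from scipy import stats

pre_scores  = [2.8, 3.1, 2.5, 3.4, 2.9, 3.0, 2.7, 3.2]   # invented pre-survey scores
post_scores = [3.2, 3.4, 2.9, 3.6, 3.1, 3.5, 2.8, 3.6]   # same participants, post-survey

t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)
print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}")
# A small p-value suggests a statistically significant pre-to-post change, but
# without a comparison group it does not show that the program caused the change.
```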

26 Tools of Summative Evaluation
Comparison Groups
– A comparison group helps determine whether the program caused the change, as opposed to extraneous factors.
– A design that randomly assigns participants to program and control groups is called experimental.
– A design that instead surveys a similar, non-randomly assigned group is called quasi-experimental.
– Using control groups is expensive and difficult to do.
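
For illustration, the sketch below shows what random assignment for an experimental design could look like; the participant IDs are hypothetical, and a quasi-experimental design would instead survey an existing, similar group rather than assigning at random.

```python
# Hypothetical sketch: random assignment of enrollees to program vs. control.
import random

def assign(participant_ids, seed=42):
    rng = random.Random(seed)      # fixed seed so the assignment is reproducible
    shuffled = list(participant_ids)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"program": shuffled[:half], "control": shuffled[half:]}

groups = assign(["P01", "P02", "P03", "P04", "P05", "P06"])
print(groups)
```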

27 Evaluation Results
Formative and summative evaluation allow the program administrator to look at program implementation, evaluation and outcome aspects of the program where results are not being achieved and make adjustments.
Examples (Area of Need → Adjustment):
– Implementation → Improve quality of educator training
– Evaluation Design → Develop new scales/constructs
– Poor Outcomes → Modify curriculum as needed

28 Examples of Formative Evaluation Used in Phase I for HMI Program Design
Evaluation staff used formative evaluation to discover that the following implementation strategies worked well for measuring attitude and behavioral changes through pre- and post-survey administration, because the length of the program gives participants time to process what they are learning:
No. of lessons | Length of sessions | No. of weeks | Frequency
– 13 | 1 hour | 13 | 1 session/week
– 13 | 1 hour | 2.5 | 1 session/day
– 13 | 2 hours | 6 | 1 session/week

29 Examples of Formative Evaluation Used in Phase I for HMI Program Design
Formative evaluation was used to discover that the following program implementation strategy worked adequately, but not ideally, for measuring attitude and behavioral changes. It is possible to draw some, but limited, conclusions about attitude and behavior changes due to the shortness of the program.
No. of lessons | Length of sessions | No. of weeks | Frequency
– 13 | 2 hours | 2 | 3 sessions/week

30 Examples of Formative Evaluation Used in Phase I for HMI Program Design
Formative evaluation was used to discover that the following implementations are less than ideal for measuring attitude and behavioral changes. Without a follow-up study, it is not possible to determine whether any attitude or behavior changes have been made and sustained during such a short program period.
– Curriculum facilitated in a one-weekend (i.e., Friday and Saturday) format.
– Curriculum delivered within a one-day format.

31 Examples of Formative Evaluation Used in Phase I for HMI Program Design
Conclusions and Recommendations for Timing of Implementation as a Result of Formative Evaluation
Classes should be administered over the course of an extended time period (multiple days/weeks) in order for the evaluation to fully gauge participants' retention. However, post-survey administration can be difficult in extended programs. Without careful planning, not all participants will complete the program from beginning to end. It is important to know this going in and to develop the curriculum implementation plan accordingly.

32 Examples of Formative Evaluation Used in Phase I for HMI Program Design
Conclusions and Recommendations for Timing of Implementation as a Result of Formative Evaluation (continued)
Do not let challenges lead you to skip or short-change the evaluation process and simply provide the service. Remember, evaluation is the best way to make your program work most effectively and to provide evidence of such.

33 A Tale of Two Curricula—Phase I: KEYS and Relationship Smarts Plus
Three HMI programs selected a youth healthy relationship curriculum called KEYS at project inception. KEYS had not been previously evaluated. Year 1 and year 2 results did not show positive movement in established scales. A curriculum change was made in year three to Relationship Smarts Plus by two grantees.

34 A Tale of Two Curricula—Phase I: Curriculum Comparison Chart
Lesson | Keys to Healthy Relationships | Relationship Smarts with Financial Lessons
1 | Thoughts-Feelings-Decisions Cycle | Who Am I? Where Am I Going?
2 | Self-Esteem | Maturity and Values
3 | Balanced Communication | Attractions and Infatuation
4 | Taking Responsibility For Feelings And Actions | Love and Intimacy
5 | Anger: Negative Expression of Thoughts/Feelings | Principles of Smart Relationships
6 | Components of a Healthy Relationship | Low-Risk Approach to Relationships
7 | Introduction To Money Management | How Healthy Is This Relationship?
8 | Choosing A Boyfriend/Girlfriend | Breaking Up and Dating Abuse
9 | Defining Marriage And Understanding Its Benefits | Foundation for Good Communication
10 | Influences On Marriage | Communication Challenges and More Skills
11 | Spending Money | Through the Eyes of a Child
12 | Healthy Behaviors In Relationships And Marriage | Looking Towards the Future: Healthy Relationships
13 | Changing Relationship Patterns | Follow Your North Star
14 | Abusive Relationships | Introduction to Money Management
15 | The Success Sequence - Tying It All Together | Spending Money

35 A Tale of Two Curricula—Phase I: Results Greatly Improved with RS+
Effectiveness of Keys to a Healthy Relationship versus Relationship Smarts+ (an "x" indicates that statistically significant gains within the construct/scale were found)
Theme & Construct/Scale | KEYS | Relationship Smarts
ATTITUDES & BELIEFS
– Future Orientation and Beliefs about Marriage |  | x
– Importance of Marriage |  | x
– Environmental Impact | x | x
PHYSICAL & VERBAL AGGRESSION
– Physical and Verbal Aggression |  | x
SELF-EFFICACY
– Self-Efficacy for Communication |  | x
– Self-Efficacy for Self-Control |  |
– Self-Efficacy for Self-Advocacy |  | x
– Self-Efficacy for Refusal | x | x
FINANCIAL LITERACY
– Financial Literacy | x | x
* Results for the program using Keys did improve in their 4th program year.

36 PHASE II: Program Implementation, Data Management, Fidelity and Process Evaluation
– This phase is designed to address evaluation issues of inconsistency and adherence to program design, and monitoring of key program outputs.
– The evaluation department works with managers and educators to track program stability.
– We provide a yearly program staff training that includes policies and procedures, data tracking, the importance of evaluation and the evaluation strategy, and coordination of curriculum training.
– We provide ongoing curriculum training as necessary to ensure educators are prepared to facilitate the curriculum.

37 A Tale of Two Curricula—Phase II: Formative and Summative Evaluation Used Together
In the KEYS vs. RS+ example in Phase I, we used a well-developed management information system (MIS) to conduct formative evaluation of the year-one implementation of KEYS. The evaluation revealed issues with program fidelity in terms of the percentage of the curriculum delivered and completed. In year two, formative evaluation revealed that fidelity improved, but summative evaluation revealed that outcomes did not! This led to a change in curriculum and points to the importance of using both types of evaluation together.

38 PHASE III (Data Analysis and Reporting): Surveys and Data Analysis
Pre- and post-surveys are used to test for changes in perceptions that are hypothesized based on the literature review and captured in the logic model. Surveys are based not on the curriculum but on expected outcomes. Our surveys are designed to be psychometrically strong (we apply statistical and mathematical techniques to produce surveys with good internal consistency) so that data analysis is based on reliable findings. Questions related to the constructs are asked in different ways to assure reliability of response.
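
The presentation does not name the specific internal-consistency statistic used, but one common choice is Cronbach's alpha; the sketch below computes it on hypothetical item responses (rows are respondents, columns are items on one construct) purely as an example.

```python
# Hypothetical sketch: Cronbach's alpha as one internal-consistency check.
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    k = item_scores.shape[1]                           # number of items
    item_vars = item_scores.var(axis=0, ddof=1)        # variance of each item
    total_var = item_scores.sum(axis=1).var(ddof=1)    # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

responses = np.array([   # invented 1-5 responses: 5 respondents x 4 items
    [4, 4, 5, 4],
    [3, 3, 3, 2],
    [5, 4, 5, 5],
    [2, 2, 3, 2],
    [4, 5, 4, 4],
])
print(f"alpha = {cronbach_alpha(responses):.2f}")  # ~0.7 or higher is commonly treated as acceptable
```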

39 PHASE III (Data Analysis and Reporting): Survey Instrument
A repeated measures design allows program staff and the evaluator to ascertain variances in program results which may be attributed to program implementation, and to begin making recommendations for improvement.
A good sample size is important. Surveys are typically longer than we would like, but this is needed in order to produce valid results! Example: the HMI youth survey consists of 66 items that measure psychological constructs thought to be influenced by program implementation.
Limitation: we are only able to use data from participants who took both a pre-survey and a post-survey. This has proven challenging with the shorter-than-planned HMI programs.
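
The matched-pairs limitation can be illustrated with a small sketch that keeps only participants who completed both surveys and reports the resulting attrition; the participant IDs, column names, and scores below are hypothetical.

```python
# Hypothetical sketch: keep only participants with both a pre- and a post-survey.
import pandas as pd

pre = pd.DataFrame({"participant_id": ["P01", "P02", "P03", "P04"],
                    "scale_score": [2.8, 3.1, 2.5, 3.4]})
post = pd.DataFrame({"participant_id": ["P01", "P03", "P04"],   # P02 has no post-survey
                     "scale_score": [3.2, 2.9, 3.6]})

matched = pre.merge(post, on="participant_id", suffixes=("_pre", "_post"))
attrition = 1 - len(matched) / len(pre)
print(matched)
print(f"usable matched pairs: {len(matched)}, attrition: {attrition:.0%}")
```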

40 Phase Three (Data Analysis and Reporting): Survey Instrument
Designed to be psychometrically strong, with good internal consistency (i.e., reliability).
– The youth survey consists of 66 items that measure psychological constructs thought to be influenced by program implementation.
– Questions related to the constructs are asked in different ways to assess reliability of response.
– This allows one to test for changes in perceptions that are hypothesized based on the literature.
– A good sample size is important, and yes, a survey is typically longer than we would like, but this is needed in order to produce valid results!

41 An AMTC Evaluation Lesson Learned At project inception, three AMTC clients selected a youth healthy relationship curriculum that had not been previously evaluated. Year 1 and year 2 results did not show positive movement in established scales. Curriculum change was made in year three; positive results were achieved!

42 Curriculum Comparison Chart
Lesson | Keys to Healthy Relationships | Relationship Smarts with Financial Lessons
1 | Thoughts-Feelings-Decisions Cycle | Who Am I? Where Am I Going?
2 | Self-Esteem | Maturity and Values
3 | Balanced Communication | Attractions and Infatuation
4 | Taking Responsibility For Your Feelings And Actions | Love and Intimacy
5 | Anger: A Negative Expression Of Thoughts And Feelings | Principles of Smart Relationships
6 | Components Of A Healthy Relationship - Exploring Your Values | Low-Risk Approach to Relationships
7 | Introduction To Money Management | How Healthy Is This Relationship?
8 | Choosing A Boyfriend/Girlfriend | Breaking Up and Dating Abuse
9 | Defining Marriage And Understanding Its Benefits | Foundation for Good Communication
10 | Influences On Marriage | Communication Challenges and More Skills
11 | Spending Money | Through the Eyes of a Child
12 | Healthy Behaviors In Relationships And Marriage | Looking Towards the Future: Healthy Relationships
13 | Changing Relationship Patterns | Follow Your North Star
14 | Abusive Relationships | Introduction to Money Management
15 | The Success Sequence - Tying It All Together | Spending Money

43 Program Impact: Effectiveness of Keys to a Healthy Relationship vs. Relationship Smarts
(an "x" indicates that statistically significant gains within the construct/scale were found)
Theme & Construct/Scale | Keys | Relationship Smarts
ATTITUDES & BELIEFS
– Future Orientation and Beliefs about Marriage |  | x
– Importance of Marriage |  | x
– Environmental Impact | x | x
PHYSICAL & VERBAL AGGRESSION
– Physical and Verbal Aggression |  | x
SELF-EFFICACY
– Self-Efficacy for Communication |  | x
– Self-Efficacy for Self-Control |  |
– Self-Efficacy for Self-Advocacy |  | x
– Self-Efficacy for Refusal | x | x
FINANCIAL LITERACY
– Financial Literacy | x | x
* Results for the program using Keys did improve in their 4th program year.

44 Fidelity Assessment
With a well-developed management information system (MIS), we were able to conduct fidelity monitoring. In the example presented on the prior slides, in year one there were issues with program fidelity in terms of the percent of the curriculum delivered and completed. In year two, fidelity improved, but intermediate outcomes did not! Systems to measure fidelity, process, and outcomes are all needed before the program is modified.
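
As a rough illustration of the kind of fidelity check an MIS can support, the sketch below computes the percent of planned lessons delivered per cohort; the cohort names, counts, and 80% flagging threshold are illustrative assumptions, not AMTC figures.

```python
# Hypothetical sketch: percent of planned curriculum delivered, by cohort.
PLANNED_LESSONS = 13

delivered_by_cohort = {"Cohort A": 13, "Cohort B": 10, "Cohort C": 7}  # invented counts

for cohort, delivered in delivered_by_cohort.items():
    pct = delivered / PLANNED_LESSONS * 100
    flag = "OK" if pct >= 80 else "REVIEW"   # 80% cutoff is an illustrative assumption
    print(f"{cohort}: {pct:.0f}% of curriculum delivered [{flag}]")
```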

45 Impact of Curriculum Delivery on Survey Administration
The following methods are not ideal for an outcome evaluation without a follow-up study, because it is not possible to determine whether attitude or behavior changes have been made and sustained over a day or weekend.
– Curriculum facilitated in a one-weekend (i.e., Friday and Saturday) format.
– Curriculum delivered within a one-day format.

46 Impact of Curriculum Delivery on Survey Administration
Post-survey administration can be difficult in extended programs, as not all participants will complete the program from beginning to end. It is important to know this going in and to develop the curriculum implementation plan accordingly, while keeping evaluation in mind. Do not skip or short-change the evaluation process for the sake of simply providing the service. Remember, evaluation is the best way to provide evidence that your program is working.

47 PHASE IV: Program Improvement
Through a Program and Evaluation Analysis Report, AMTC summarizes results for both outputs and outcomes. We draw meaningful conclusions about the effectiveness of the program design and implementation, and we recommend changes to the client or acknowledge successes. The client reviews the analysis and responds to each recommendation or acknowledgement of success. Program design and implementation adjustments are made as needed.

48 Questions?
Jeanette Stevens, Vice President, Educational Services
4465 North Oakland Avenue, Shorewood, WI 53211
414.326.3142
Jeanette.stevens@centerinc.org

