Presentation on theme: "Denise Kervin, Prevention Coordinator.  Background on Our Program  Evaluation Path  Some Observations."— Presentation transcript:

1 Denise Kervin, Prevention Coordinator

2
- Background on Our Program
- Evaluation Path
- Some Observations

3

4
- Planning year
- DELTA Subcommittee meetings
- Trainings
- Focus Groups

5

6 “Mature” YAC
- What: School-based class about healthy and unhealthy relationships
- Where: County alternative high school in Chippewa Falls
- Who: Juniors and seniors, earning service-learning credit
- Why: Prevent teen dating violence and encourage healthy relationships

7 Evaluation is important and you can do it!

8 Match your goals, your program, and your evaluation.

9
- Logic Models
- Process Evaluation
- Outcome Evaluation

10

11 Something new to me… A detailed “picture” of how a program operates, comparing it to how it was originally envisioned

12 What Does It Do?
- Outlines why a program was created (and so why it should be continued)
- Describes how a program actually operates (e.g., decision-making, resources, context, challenges/opportunities)
- Determines whether a program is operating as originally planned

13 Methods We Used
- Reviewed documents, such as the YAC curriculum and minutes from planning meetings
- Talked with people involved in the DELTA program, such as YAC students and DELTA Subcommittee members

14 How Did It Help Us?
- We found out where our program had changed
- We laid out how the YAC program worked, allowing others to replicate our model
- We showed how our program was working toward its goals

15 To me, this is “traditional” evaluation: did what we were doing have the impact we wanted?

16 Doing Outcome Evaluation
- What changes did we want to see? = Outcomes
- How were we going to measure them? = Tools

17 Evaluating YAC’s Impact:
- Outcome: Increase in knowledge about healthy relationships
  Tools: Pre- and posttests, class assignments, observations
- Outcome: Increase in healthy relationship skills
  Tools: Pre- and posttests, student comments, observations
- Outcome: Rejection of unhealthy gender norms
  Tools: Pre- and posttests, class assignments, student comments, observations

18 What We Ended Up With:
- Quantitative data
- Qualitative data

19 Quantitative Data
Table 4. Descriptive Comparison of Desirable Responses: Bystander Behaviors
I agree with the following statements:
- “If you heard someone you know putting females down, you would ask them to stop”: pre-test 82% (N = 50), post-test 92% (N = 34)
- “If you heard someone you know putting males down, you would ask them to stop”: pre-test 56% (N = 34), post-test 76% (N = 28)
- “If you see a guy you know saying something to a female and you know she doesn’t like it, you would ask him to stop”: pre-test 80% (N = 49), post-test 92% (N = 34)
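The pre/post comparison above is a simple percent-agreeing calculation. A minimal sketch in Python follows; note that the respondent totals used here (roughly 61 pre-test and 37 post-test) are assumptions chosen for illustration because they would reproduce the reported percentages if N counts the students who agreed, and they are not stated in the slides.

```python
def pct_agree(agreeing: int, total: int) -> int:
    """Percent of respondents giving the desirable response, rounded."""
    return round(100 * agreeing / total)

# Assumed totals for illustration only (not stated in the slides):
# about 61 pre-test and 37 post-test respondents would reproduce
# the reported figures for the first bystander item.
pre = pct_agree(50, 61)   # 82
post = pct_agree(34, 37)  # 92
print(f"pre-test: {pre}%, post-test: {post}%")
```

Comparing the two percentages (rather than raw counts) is what makes the pre/post groups comparable when different numbers of students complete each test.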

20 Qualitative Data
- KABB: Knowledge. Aim: Increase knowledge about TDV. Evidence of improvement: Very strong. Example: “I know a lot more about abuse. For example, I learned texting and demanding updates all the time is a form of abuse.”
- KABB: Knowledge. Aim: Increase knowledge about gender norms. Evidence of improvement: Moderately strong. Example: “I always thought that I was never going to match up to what a perfect female was, and understanding what a major role that can play in relationships.”
- KABB: Attitude. Aim: Rejection of unhealthy gender norms. Evidence of improvement: Not as strong as expected. Example: “It’s [attitude toward gender roles] changed completely, I can’t even watch T.V. the same way.”
- KABB: Attitude. Aim: Desire to do social action to prevent TDV. Evidence of improvement: Students expressed this desire less as an attitude and more as a behavior. Example: One young man really enjoyed the YAC’s first middle school presentation and now wants to be a social worker because he thought he could really help people.

21 A Challenge: Our program focused on preventing violence, so how could we measure change in students who were already against violence? One Answer: Look at what changes there were between the pre- and posttests.

22 A Second Answer: We were measuring students’ knowledge and attitudes, but not really their behaviors. Result: We revised our pre- and posttests to ask about behaviors, and also looked for data in our observations and student comments.

23 Using Our Evaluation Results: Pre-tests showed us where students seemed to need more work. Posttests (and other tools) then showed us whether there were changes in beliefs, attitudes, and behaviors.

24 Using Our Evaluation Results: When results were not what we hoped for, we had to figure out what was wrong: the curriculum, the evaluation question, or our own thinking? Example: Music with violent lyrics

25 Conclusion
- Evaluation is challenging
- Evaluation is scary
- Evaluation (doing it and using it) can give us more effective programs and help us reach our goals

