Partnership For Success (PFS) Orientation to Evaluation September 26, 2013.


1 Partnership For Success (PFS) Orientation to Evaluation September 26, 2013

2 Partnership For Success
- Welcome!
- Strategic Prevention Framework - Connecting the Steps
- Why Planning for Evaluation Is Critical

3 Why Evaluate?
- Helps us understand how efforts work
- Ongoing feedback can improve community work
- Gives community members a voice and an opportunity to improve efforts
- Holds accountable the groups doing, supporting, and funding the work

4 Partnership For Success Requirements
Process Measures:
- Number of Evidence-Based Strategies Implemented
- Number of Active Partners
- Number of Individuals Reached by Strategy Type
- Number of Prevention Activities Supported by Leveraging Resources

5 Partnership For Success Requirements
Outcome Measures:
- Youth Past 30-Day Alcohol Use
- Youth Past Two-Week Binge Drinking

6 Data Collection/Reporting Requirements (what is required, how often, and by when; each community identifies who will ensure it gets done)
- Kansas Communities That Care student survey (CTC) (Required): 80% participation goal for all four grade levels (6th, 8th, 10th, 12th). Annually, between December 1 and January 31 each year.
- Online Documentation and Support System (ODSS) (Required): monthly documentation. Ongoing, prior to the 15th of the following month.
- Action Plans for each strategy implemented (Suggested): targeted objectives and list of action steps necessary to achieve appropriate implementation. Before implementation, then annually; prior to strategy implementation with annual updates. Upload to WST.
- Fidelity checklists and Environmental Strategy Checklist (Required): use the fidelity checklist template for curriculum-based programs or the environmental strategy checklist for all other strategies. At least once during implementation, midway through the program. Upload to WST.
- Participation in sense-making sessions: attend scheduled meetings. Twice per year (June 2014 and December 2014).
- Pre/post surveys for curriculum-based programs (Required if implementing programs): administer pre and post surveys to all program participants; scannable forms must be requested in advance. Once per program, at program start and exit. Submit completed surveys to Greenbush for analysis.
- Coalition Roster (Required): spreadsheet of names and member activities. Each coalition meeting/workgroup meeting. Submit with quarterly reports.
- Quarterly Report (Required): report accomplishments and barriers for each reporting period; template provided. Quarterly; upload to WST by March 30, June 30, September 30, and December 30.

7 Evaluation: Documenting Activity
Documentation in the Online Documentation and Support System (ODSS)
- WHAT: Documentation of activities and accomplishments
- HOW OFTEN: Weekly documentation
- WHEN: End of week
- Required
- Training provided by KU Work Group (KUWG)
- Monthly feedback from KUWG

8 Navigating to the ODSS

9 Evaluation: Documenting Outcomes
Participation in the Kansas Communities That Care (KCTC) Student Survey
- 80% participation at grades 6, 8, 10, and 12
- Annually in December and January
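As a rough illustration of how the 80% participation goal can be checked, the sketch below computes a participation rate per grade from hypothetical enrollment and respondent counts. The numbers and field names are made up for the example; they are not taken from the KCTC survey itself.

```python
# Illustrative only: hypothetical enrollment/respondent counts for one district.
enrollment = {"6th": 210, "8th": 195, "10th": 180, "12th": 165}
respondents = {"6th": 180, "8th": 150, "10th": 151, "12th": 120}

GOAL = 0.80  # 80% participation goal for each surveyed grade

for grade, enrolled in enrollment.items():
    rate = respondents[grade] / enrolled
    status = "meets goal" if rate >= GOAL else "below goal"
    print(f"Grade {grade}: {rate:.0%} participation ({status})")
```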

10

11 New Website 

12 Evaluation: Action Planning
Action Planning
- WHAT: Targeted objectives and list of action steps
- HOW OFTEN: Before implementation, then annually
- WHEN: Prior to strategy implementation, then annually
- Upload to Workstation
- Writing SMART+C objectives
- Setting reasonable goals
- Evidence-based strategies

13 Using Standards to Identify Indicators to Support Evaluation
Standards for Good Evaluation:
- Utility (e.g., does the evaluation contribute to improvement?)
- Feasibility (e.g., is it practical?)
- Propriety (e.g., are participants protected?)
- Accuracy (e.g., is the information credible?)

14 Action Plan Template: Objective, Strategy, and Measures
- Objective:
- Strategy:
- Measure(s)/Indicator(s) Related to Strategy:
- Indicator Source(s):
- Person(s) Responsible for Collecting/Reporting Indicators:

15 Setting Targets: From here to where
- The shotgun approach to changing outcomes: ready, shoot, aim
- Importance of identifying appropriate indicators
- Identifying how much a community can move an indicator, balancing:
  - Accuracy
  - Completeness
  - Credibility
  - Resources
  - Time

16 Writing Objectives
- SMART+C objectives: Specific, Measurable, Achievable, Relevant, Timed, and Challenging

17 Action Plan Template: Objective, Strategy, and Measures
Objective:
- By December 31, 2015, increase the proportion of youth in grades 6, 8, 10, and 12 who report that if a kid drank some beer, wine, or hard liquor in their neighborhood, or the area around where they live, they would be caught by the police, to 36% from a baseline of 32%. (Enforcement)
- By December 31, 2011, decrease academic failure by 4% from a baseline of 40%. (Academic Achievement)
Strategy:
Measure(s)/Indicator(s) Related to Strategy:
Indicator Source(s):
Person(s) Responsible for Collecting/Reporting Indicators:

18 Key Concepts for Developing Strategies and Action Plans
Key Concepts:
- Objectives: how much of what by when
- Strategies: how
- Action Plans: what changes will be sought; who will do what by when
- Identified strategies and action plans as tools for evaluation

19 Developing Strategies
Strategy
- Definition: from the Greek strategia, "generalship"; giving overall direction
- How things will be accomplished
Evidence-based Strategies
- Programs
- Policies
- Practices

20 Reviewing and Selecting Evidence-Based Strategies
Evidence-based strategy/approach: an approach (program, policy, or practice) that has been shown, by research, to "work" or be effective in improving a behavior or outcome.

21 Selecting Evidence-Based Strategies
- Search for evidence-based practices.
- Consider the strength of evidence for whether the practice caused the effect.
- Consider whether the practice would achieve results in your situation.
- Consider the presence of conditions for success.
- Consider adaptations for your context.

22 Criteria for Strategies and Changes
- Effectiveness
- Consistency with objectives
- Fit with resources
- Anticipated resistance
- Reach

23 Action Plan Template: Objective, Strategy, and Measures
Objective:
- By December 31, 2015, increase the proportion of youth in grades 6, 8, 10, and 12 who report that if a kid drank some beer, wine, or hard liquor in their neighborhood, or the area around where they live, they would be caught by the police, to 36% from a baseline of 32%. (Enforcement)
- By December 31, 2011, decrease academic failure by 4% from a baseline of 40%. (Academic Achievement)
Strategy: Stay On Track (school opportunities for involvement; truancy/lack of parental involvement)
Measure(s)/Indicator(s) Related to Strategy:
Indicator Source(s):
Person(s) Responsible for Collecting/Reporting Indicators:

24 Gathering Evidence to Address the Evaluation Questions
Evidence: information that could be used to assess the merit or worth of a program
Gathering credible evidence - overview:
- Indicators of Success
- Sources of Evidence
- Quality of Evidence
- Quantity of Evidence
- Logistics for Gathering Information

25 Identifying Indicator Sources
Required sources:
- KCTC student survey
- Pre/post program data
Supplemental local data sources, if available:
- Enforcement (# MIP citations; # safety and sobriety checkpoints)
- Survey
Considerations:
- Data quality
- Data collection and consistency

26 Action Plan Template: Objective, Strategy, and Measures
Objective:
- By December 31, 2015, increase the proportion of youth in grades 6, 8, 10, and 12 who report that if a kid drank some beer, wine, or hard liquor in their neighborhood, or the area around where they live, they would be caught by the police, to 36% from a baseline of 32%. (Enforcement)
- By December 31, 2011, decrease academic failure by 4% from a baseline of 40%. (Academic Achievement)
Strategy: Stay On Track (school opportunities for involvement; truancy/lack of parental involvement)
Measure(s)/Indicator(s) Related to Strategy:
1. Outcome and influencing factor data (academic failure scale, school opportunities for involvement scale)
2. Number of students/youth completing the program
Indicator Source(s):
1. KCTC
2. Program administrator's rosters
Person(s) Responsible for Collecting/Reporting Indicators:
1. Greenbush evaluator
2. Program administrator

27 Developing Action Plans
- Action Plans vs. Logic Models
- Elements of an action plan:
  - Changes (interventions): to be sought or implemented
  - Action Steps: who will do what by when to bring them about

28 Developing Action Plans
Create action steps. When creating a plan with action steps for each activity sought, describe at a minimum:
1. What specific change or aspect of the intervention will occur
2. Who will carry it out
3. When it will be completed, or its duration

29 Action Plan Template: Objective, Strategy, and Measures
Objective:
Strategy:
Measure(s)/Indicator(s) Related to Strategy:
Indicator Source(s):
Person(s) Responsible for Collecting/Reporting Indicators:
Action Steps to Support Implementation of Strategy (Activity / By When / Who Is Responsible):
- Review program and train trainers. January 2014. SOT trainers, Coordinator, trainers.
- With Kansas National Guard staff, pilot SOT. February 2014. Implementation Coordinator/Stay On Track (SOT) Trainer, coalition representatives, NGB.
- Review with SOT implementation coordinators after SOT is piloted at selected sites to determine SOT outcomes. November 2014. Implementation Coordinator/SOT Trainer, coalition representative.
- Implement first in Xavier High School and then in other schools. January 2015. Implementation Coordinator/SOT Trainer, coalition representatives.

30 Implementing and Evaluating Action Plans
Implement and evaluate the action plan using the criteria of:
- Completeness
- Clarity
- Sufficiency
- Currency
- Flexibility with Fidelity

31 Using Objectives, Strategies, and Action Plans to Guide and Enhance Your Work
- Use them to communicate your initiative's purpose to others
- Use them in ongoing evaluation and in organizational and program development
- Review them to identify additional partners
- Review them at regular intervals
- Integrate them into your routine activities

32 Implementation Fidelity
- Strategies have essential components necessary to reproduce the outcomes that led to the program being certified as evidence-based.
- Fidelity consists of implementing these essential components of a program: the degree to which an intervention or program is delivered as intended.

33 Implementation Fidelity
Evaluation of implementation fidelity is important because:
- It may moderate the relationship between an intervention and its outcomes
- Its assessment may prevent potentially false conclusions from being drawn about an intervention's effectiveness
- It may even help achieve improved outcomes

34 Fidelity Checklist (Yes / No / Notes for each component)
- Delivery
- Dosage (number of lessons, length of sessions, frequency of sessions)
- Setting
- Materials
- Target Population
- Implementer Qualifications
- Implementer Training
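One hedged way to summarize a completed checklist is to count the share of components marked "Yes." The sketch below assumes a simple dictionary of Yes/No answers keyed by the checklist items above; the "moderate/high fidelity" cutoffs are placeholders for illustration, not thresholds defined by the fidelity checklist template.

```python
# Hypothetical completed checklist: True = "Yes", False = "No".
checklist = {
    "Delivery": True,
    "Dosage": True,
    "Setting": True,
    "Materials": False,
    "Target Population": True,
    "Implementer Qualifications": True,
    "Implementer Training": False,
}

# Proportion of essential components delivered as intended.
score = sum(checklist.values()) / len(checklist)

# Placeholder cutoffs for reporting; the real definition comes from the checklist template.
if score >= 0.8:
    rating = "high fidelity"
elif score >= 0.6:
    rating = "moderate fidelity"
else:
    rating = "low fidelity"

print(f"{score:.0%} of components implemented as intended ({rating})")
```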

35 Evaluation: Effectiveness of Programs
- Pre/post surveys
- Unique ID
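The scannable pre/post forms are analyzed by Greenbush, so the sketch below is only a conceptual illustration of why the unique ID matters: it lets pre and post responses from the same participant be paired before computing change. Column names and scores are hypothetical.

```python
import pandas as pd

# Hypothetical pre/post survey scores keyed by each participant's unique ID.
pre = pd.DataFrame({"unique_id": ["A01", "A02", "A03"], "score_pre": [12, 15, 9]})
post = pd.DataFrame({"unique_id": ["A01", "A03", "A04"], "score_post": [16, 14, 11]})

# Inner join keeps only participants with both a pre and a post survey.
matched = pre.merge(post, on="unique_id", how="inner")
matched["change"] = matched["score_post"] - matched["score_pre"]

print(matched)
print(f"Matched {len(matched)} of {len(pre)} pre-survey participants; "
      f"mean change = {matched['change'].mean():.1f}")
```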

36

37 Coalition Roster
- Excel spreadsheet to track member activity
- Demonstration
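The roster itself is an Excel spreadsheet, but the idea can be sketched as rows of member-by-meeting activity. The columns below are illustrative only, not the required template.

```python
import csv
import io

# Illustrative roster rows: one row per member per coalition/workgroup meeting.
roster_csv = """name,meeting_date,meeting_type,activity
Jane Doe,2014-01-14,Coalition,Chaired meeting
John Roe,2014-01-14,Coalition,Presented KCTC results
Jane Doe,2014-02-11,Workgroup,Drafted action steps
"""

rows = list(csv.DictReader(io.StringIO(roster_csv)))

# Simple attendance summary per member, e.g., to attach with the quarterly report.
attendance = {}
for row in rows:
    attendance[row["name"]] = attendance.get(row["name"], 0) + 1

for name, meetings in attendance.items():
    print(f"{name}: attended {meetings} meeting(s)")
```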

38 Quarterly Report Questions
- Did your coalition/workgroups hold regular meetings, trainings, or events this quarter? If yes, please attach your coalition spreadsheet.
- Were all strategies implemented with moderate to high fidelity as defined by the fidelity checklist?
- What barriers to implementation fidelity did you experience this quarter? How can these barriers be removed?
- How is cultural diversity being addressed in the implementation of your strategies and in your community?

39 Quarterly Report Questions
- List all strategies (e.g., school or faith-based initiatives, YMCA, BB/BS, etc.), other than those funded through K-SPF, being implemented in your community.
- How did you communicate results and share successes with your community and stakeholders this quarter (e.g., town hall meeting, school newsletter, parent brochure, social networking, newspaper article, etc.)?
- What strategies do you intend to sustain in your community after K-SPF funding?
- What resources (in-kind, cash, personnel) have been obtained this quarter to sustain these strategies?
- List the accomplishments you made this quarter.

40 Quarterly Report Due Dates
Upload reports to the Work Station by:
- March 30
- June 30
- September 30
- December 30

41 Putting It All Together Ain’t nothin’ to it but to do it. ~ Maya Angelou

42 Evaluation: Sensemaking
- Engage in semiannual (January and June) reflection on quarterly community efforts across strategies
- Identify mediating contextual factors in the implementation of action steps
- Addresses the question: "What does it all mean?"

43 Community Change
Defined as a new or modified program, policy, or practice.
- Program. Example: The addition of the SPF program "Middle Littles" (a mentoring program through Big Brothers Big Sisters targeting middle-school-age students) begins in the community.
- Policy. Example: Coalition member Joe Patton introduced H 2165 to establish recklessness as a standard in unlawfully hosting minors in a person's residence (aka the Social Hosting Law).
- Practice. Example: Katie Allen developed a coalition page on Facebook to begin announcing activities about the SPF grant programs.

44

45 Process Questions
Key Evaluation Question: Is the initiative serving as a catalyst for community and system change?
- What is the pattern of community/system change overall? Is the rate of community/system change increasing, decreasing, or staying the same?
- What is the pattern of community/system change recently (e.g., in the past three/six months)?
Key Evaluation Question: What factors or processes are associated with the rate of community/system change brought about by the initiative?
- At what times were there marked increases or decreases in the rate of change?
- What was happening that might have increased the rate of change? What was happening that might have decreased the rate of change?
Implications for Adjustment: Given what we are seeing in these data, what adjustments should be made to enhance the initiative?

46 Sample Report
Objectives summary: 4 objectives for intervening variables; 4 in the desired direction, 0 not in the desired direction; MIP: Yes; 100% of objectives in the desired direction.
Intervening Variables:
- Pro-Social Involvement. Strategies to address targeted influencing factors: Positive Action, Stay on Track. Number participating in strategy: 52, 18. Objective: By December 31, 2014, increase the opportunity-for-pro-social-involvement scale in school by 6% from a baseline of 61.7% (target: 67.7%). Observed change: 6.5%. Status: Exceeds Target.
- Academic Achievement. Strategies to address targeted influencing factors: Positive Action, Stay on Track. Number participating in strategy: 52, 18. Objective: By December 31, 2014, decrease the risk-factor scale for academic failure by 4% from a baseline of 41.1% (target: 37.1%). Observed change: 2.5%. Status: Right Direction.
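The "% objectives in desired direction" summary and the target check can be reproduced with simple arithmetic. The sketch below uses the two objectives above, with "observed" values that are placeholders derived from each baseline plus the reported change (61.7 + 6.5 and 41.1 - 2.5); the actual year-by-year scale values are not shown in the report.

```python
# Each objective: baseline, targeted change (percentage points), and a placeholder
# "observed" value standing in for the most recent KCTC scale score.
objectives = [
    {"name": "Pro-social involvement (school)", "baseline": 61.7, "change": +6.0, "observed": 68.2},
    {"name": "Academic failure risk factor", "baseline": 41.1, "change": -4.0, "observed": 38.6},
]

in_desired_direction = 0
for obj in objectives:
    target = obj["baseline"] + obj["change"]
    moved = obj["observed"] - obj["baseline"]
    desired = (moved > 0) == (obj["change"] > 0)  # moved the way the objective intends
    met = obj["observed"] >= target if obj["change"] > 0 else obj["observed"] <= target
    in_desired_direction += desired
    status = "meets/exceeds target" if met else ("right direction" if desired else "wrong direction")
    print(f"{obj['name']}: target {target:.1f}%, observed {obj['observed']:.1f}% ({status})")

print(f"{in_desired_direction / len(objectives):.0%} of objectives in desired direction")
```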

47 Impact Questions
Key Evaluation Questions:
- Given what we are seeing in the data, what implications can be drawn about the impact of strategy implementation on underage drinking?
- Are the outcome measures increasing, decreasing, or staying the same?
- How do the community's data compare to others in the state?
- Were strategies and action plans implemented with fidelity?
- Are the communities meeting the targets in the stated objectives for each strategy?
- How have the data, and what you know about the community, changed?
- Do some populations show more or less impact than others? If so, why?
- How do process and outcome measures work together to complete the story of community change and underage drinking outcomes?

48 Evaluation Team Lisa Chaney, Southeast Kansas Education Service Center, Greenbush Jerry Schultz and Momina Sims, Work Group for Community Health and Development, Univ. of Kansas

