Logic Models Handout 1.


1 Logic Models Handout 1

2 Morehouse’s Logic Model [handout]

3 YRBS 2009 Key Findings Handout 3

4 YRBS 2009 Key Findings Handout 4

5 Hierarchy of Effects
Handout 5
Bennett's hierarchy, from first to highest level:
- Participation: number and characteristics of people reached; frequency and intensity of contact
- Reactions: degree of satisfaction with the program; level of interest; feelings toward activities and educational methods
- Learning: changes in knowledge, attitudes, skills, and aspirations
- Actions: changes in behaviors and practices
- Social-economic-environmental improvements
Source: Bennett and Rockwell, 1995, Targeting Outcomes of Programs
Many Extension staff will remember the Bennett hierarchy of the 1970s that was so popular and widely used throughout Extension. The Bennett hierarchy is a precursor of the present-day logic model; you can see the similarities in this graphic. Rockwell and Bennett have since developed a toolkit titled Targeting Outcomes of Programs (TOP) that is available on the web; see it for more information. University of Wisconsin-Extension, Program Development and Evaluation

6 Pulling it Together
Handout 6
Theory base:
- Inputs: resources available to operate a program; personnel, fiscal, and other resources
- Activities/Strategies: things the program is doing; list processes, tools, events, and actions
- Outputs: what is delivered (services, types, levels) and who is served
- Short-term/immediate outcomes (1-3 years): changes in knowledge, attitudes, and skills
- Intermediate outcomes (3-5 years): changes in behavior, norms, and/or policies
- Long-term outcomes (4-6 years): changes in organizations and systems
Data sources to document accomplishment

7 Logic model and common types of evaluation
Handout 7
Types of evaluation:
- Needs/asset assessment: What are the characteristics, needs, and priorities of the target population? What are potential barriers and facilitators? What is most appropriate to do?
- Process evaluation: How is the program implemented? Are activities delivered as intended, and with fidelity? Are participants being reached as intended? What are participant reactions?
- Outcome evaluation: To what extent are desired changes occurring? Are goals met? Who is benefiting or not benefiting, and how? What seems to work? Not work? What are unintended outcomes?
- Impact evaluation: To what extent can changes be attributed to the program? What are the net effects? What are the final consequences? Is the program worth the resources it costs?
Note that the questions we might ask line up with the common types of evaluation: needs assessment, process evaluation, outcome evaluation, and impact evaluation (and the type of questions inherent in each). University of Wisconsin-Extension, Program Development and Evaluation

8 Typical Activity Indicators to Track
Handout 8
Activity: SAC interventions
Outcomes and measures:
- Improved grades: GPA; student interview; parent interview; report card
- Improved attendance: UA, school and class
- Reduced disciplinary infractions: ODRs; policy violations
- Increased coping skills: student survey, pre-post
- Increased resiliency: student survey, pre-post
- Increased perceived social support: student survey, pre-post
- Decreased or delayed substance use: state risk survey

9 Parent Education Example: Evaluation Questions, Indicators
Handout 9
Inputs: staff, money, partners, research
Activities: develop parent education curriculum; deliver a series of 8 interactive sessions; facilitate support groups
Participants: parents of 3-10 year-olds
Short-term outcomes: parents increase knowledge of child development; parents better understand their own parenting style; parents identify appropriate actions to take; parents gain skills in new ways to parent; parents gain confidence in their abilities
Intermediate outcomes: parents use effective parenting practices
Long-term outcomes: reduced stress; improved child-parent relations; strong families
EVALUATION QUESTIONS
- What amount of money and time were invested?
- How many sessions were held? How effectively? How many support groups, and of what quality?
- Who and how many attended or did not attend? Did they attend all sessions? Support groups? Were they satisfied, and why or why not?
- To what extent did knowledge and skills increase? For whom? Why? What else happened?
- To what extent did behaviors change? For whom? Why? What else happened?
- To what extent is stress reduced? To what extent are relations improved?
Explain how this fits with collecting data over the course of the program; integrate evaluation into planning and program delivery.
INDICATORS
- # staff; $ used; # partners
- # sessions held; quality criteria
- #, % attended per session; certificate of completion
- #, % demonstrating increased knowledge/skills; additional outcomes
- #, % demonstrating changes in behavior; types of changes
- #, % demonstrating improvements; types of improvements
University of Wisconsin-Extension, Program Development and Evaluation
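As an illustrative aside (not part of the original handout), the columns of a logic model like the parent-education example can be held in a small data structure so that each stage can be paired with its indicators. All names below are hypothetical, a minimal sketch rather than any official tool:

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """Minimal logic-model skeleton: inputs feed activities, which reach
    participants and produce short-, intermediate-, and long-term outcomes."""
    inputs: list[str] = field(default_factory=list)
    activities: list[str] = field(default_factory=list)
    participants: list[str] = field(default_factory=list)
    short_term: list[str] = field(default_factory=list)
    intermediate: list[str] = field(default_factory=list)
    long_term: list[str] = field(default_factory=list)

# Populate with the parent-education example from the slide.
parent_ed = LogicModel(
    inputs=["staff", "money", "partners", "research"],
    activities=["develop curriculum", "deliver 8 sessions", "facilitate support groups"],
    participants=["parents of 3-10 year-olds"],
    short_term=["increased knowledge of child development", "new parenting skills"],
    intermediate=["parents use effective parenting practices"],
    long_term=["reduced stress", "improved child-parent relations", "strong families"],
)

# A simple indicator might just count entries per stage.
print(len(parent_ed.inputs))       # 4 inputs to track ($, staff, partners, research)
print(parent_ed.long_term[0])      # reduced stress
```

The point of the sketch is only that each evaluation question and indicator on the slide maps onto exactly one column of this structure.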

10 Logic Model Resources
Handout 10
Document (Developer):
- Bibliography: Logic Models in Program Evaluation (CDC Evaluation Working Group)
- Everything You Wanted to Know About Logic Models But Were Afraid to Ask (Connie C. Schmitz and Beverly A. Parsons)
- Learning from Logic Models: An Example of a Family-School Partnership Program (Harvard Family Research Project)
- Making Logic Models More Systemic: An Activity (Beverly A. Parsons)
Source: CDC Evaluation Working Group

11 More Resources
Handout 11
Document (Developer):
- Logic Model Development Guide (W.K. Kellogg Foundation)
- Some Practical Tools for Planning and Evaluation (Innonet)
- Successfully Enhancing Program Performance Through Logic Models; Logic Model Tools (Univ. of Wisconsin Cooperative Extension)
Source: CDC Evaluation Working Group

