HNDECA and ECCS Evaluation. Dr. Richard Rathge, Professor and Director, North Dakota State Data Center. Prepared by the North Dakota State Data Center, July 2008.

Presentation transcript:

Prepared by the North Dakota State Data Center, July 2008. HNDECA and ECCS Evaluation: Suggestions and Strategies for Evaluation. Dr. Richard Rathge, Professor and Director, North Dakota State Data Center. Bismarck, ND, Oct. 6, 2008.

Presentation Objectives: 1. Provide an overview of evaluation approaches. 2. Review directions from other states. 3. Offer a recommendation/strategy for North Dakota's approach to the grant.

Typical Logic Model.

OUTPUTS: what we do and who we reach. Activities: train and teach; deliver services; develop products and resources; network with others; build partnerships; assess; facilitate; work with the media. Participation: participants, clients, customers, agencies, decision makers, policy makers. (University of Wisconsin-Extension, Program Development and Evaluation)

OUTCOMES: what results for individuals, families, and communities, arranged as a chain of outcomes. Short-term (learning): changes in awareness, knowledge, attitudes, skills, opinions, aspirations, motivation, and behavioral intent. Medium-term (action): changes in behavior, decision-making, policies, and social action. Long-term (conditions): changes in social (well-being), health, economic, civic, and environmental conditions. (University of Wisconsin-Extension, Program Development and Evaluation)
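The inputs, outputs, and outcomes described above form a small, regular structure. As a minimal sketch (not part of the original slides; all class and field names are illustrative assumptions), the pieces could be captured in Python like this:

    # Sketch of a logic model's components: inputs, outputs (activities and
    # participation), and a chain of short-, medium-, and long-term outcomes.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Outputs:
        activities: List[str] = field(default_factory=list)     # what we do
        participation: List[str] = field(default_factory=list)  # who we reach

    @dataclass
    class LogicModel:
        inputs: List[str] = field(default_factory=list)          # what we invest
        outputs: Outputs = field(default_factory=Outputs)
        short_term: List[str] = field(default_factory=list)      # learning changes
        medium_term: List[str] = field(default_factory=list)     # action changes
        long_term: List[str] = field(default_factory=list)       # changes in conditions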

What is a Theory of Change? A long-term outcome rests on a set of necessary preconditions. Short-term and intermediate outcomes must be achieved BEFORE the long-term outcome, and the model needs to explain WHY.

How are they different? Logic models graphically illustrate program components; creating one helps stakeholders clearly identify outcomes, inputs, and activities. Theories of Change link outcomes and activities to explain HOW and WHY the desired change is expected to come about. (Aspen Institute Roundtable on Community Change)

How are they different? (1) Logic Models usually start with a program and illustrate its components. Theories of Change may start with a program, but work best when they start with a goal, before deciding what programmatic approaches are needed. (Aspen Institute Roundtable on Community Change)

How are they different? (2) Logic Models require identifying program components, so you can see at a glance whether outcomes are out of sync with inputs and activities, but they don't show WHY activities are expected to produce outcomes. Theories of Change also require justifications at each step: you have to articulate the hypothesis about why something will cause something else (it's a causal model). (Aspen Institute Roundtable on Community Change)

How are they different? (3) Logic Models don't always identify indicators (evidence to measure whether outcomes are met). Theories of Change require identifying indicators. (Aspen Institute Roundtable on Community Change)

Logic Model built from a Theory of Change using "so that" chains: INPUTS (program investments; what we invest) lead to OUTPUTS (activities and participation; what we do and who we reach), which lead to OUTCOMES (short, medium, and long-term; what results). At each link you state why you think you should do something... so that the next result follows. (University of Wisconsin-Extension, Program Development and Evaluation)
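Because each link is read as "... so that ...", the chain can be written down mechanically. A minimal sketch, assuming a plain helper function; the function name and the example statements (paraphrased from the parent education example later in the presentation) are illustrative, not from the slides:

    # Join a sequence of steps into a readable "so that" chain, making the
    # reasoning behind each link of the logic model explicit.
    def so_that_chain(steps):
        return "\n  so that ".join(steps)

    print(so_that_chain([
        "Deliver eight interactive parenting sessions",
        "parents increase their knowledge of child development",
        "parents use effective parenting practices",
        "child-parent relations improve",
    ]))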

Evaluation Component. PLANNING: start with the end in mind. EVALUATION: check and verify. What do you want to know? How will you know it? The logic model needs to incorporate outcome-based performance measures for evaluation. (University of Wisconsin-Extension, Program Development and Evaluation)

Logic model and common types of evaluation. Needs/asset assessment: What are the characteristics, needs, and priorities of the target population? What are potential barriers and facilitators? What is most appropriate to do? Process evaluation: How is the program implemented? Are activities delivered as intended, with fidelity of implementation? Are participants being reached as intended? What are participant reactions? Outcome evaluation: To what extent are desired changes occurring? Are goals met? Who is benefiting or not benefiting, and how? What seems to work, and what does not? What are unintended outcomes? Impact evaluation: To what extent can changes be attributed to the program? What are the net effects? What are the final consequences? Is the program worth the resources it costs? (University of Wisconsin-Extension, Program Development and Evaluation)

Logic model for a parent education program (strategy/theory based). INPUTS: staff, money, partners, research. OUTPUTS: assess parent education programs; design and deliver an evidence-based program of 8 sessions; facilitate support groups; parents of kids under age 6 attend. OUTCOMES: parents increase knowledge of child development; parents better understand their own parenting style; parents gain skills in new ways to parent; parents gain confidence in their abilities; parents identify appropriate actions to take; parents use effective parenting practices; improved child-parent relations; improved school readiness; safe, stable, nurturing families. (University of Wisconsin-Extension, Program Development and Evaluation)
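The same example can also be written out as plain data so the structure can be reused for planning or reporting. A minimal sketch, assuming a nested dictionary layout; the field names and groupings are assumptions, not part of the slide:

    # The parent education logic model as plain data.
    parent_ed_logic_model = {
        "inputs": ["staff", "money", "partners", "research"],
        "outputs": {
            "activities": [
                "assess parent education programs",
                "design and deliver an evidence-based program of 8 sessions",
                "facilitate support groups",
            ],
            "participation": ["parents of children under age 6"],
        },
        "outcomes": {
            "short": ["parents increase knowledge of child development",
                      "parents better understand their own parenting style"],
            "medium": ["parents use effective parenting practices"],
            "long": ["improved child-parent relations",
                     "improved school readiness",
                     "safe, stable, nurturing families"],
        },
    }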

Parent Education Example: evaluation questions and indicators. The logic model is the same as above (inputs: staff, money, partners, research; outputs: develop the parent education curriculum, deliver a series of 8 interactive sessions, facilitate support groups, parents of kids under age 6 attend; outcomes: the chain from increased knowledge, understanding, skills, confidence, and appropriate actions through effective parenting practices to improved child-parent relations, improved school readiness, and safe, stable, nurturing families). EVALUATION QUESTIONS: What amount of money and time were invested? How many sessions were held, and how effectively? What were the number and quality of support groups? Who and how many attended or did not attend? Did they attend all sessions and support groups? Were they satisfied, and why or why not? To what extent did knowledge and skills increase? For whom? Why? What else happened? To what extent did behaviors change? For whom? Why? What else happened? To what extent are relations improved? To what extent is school readiness increased? INDICATORS: number of staff, dollars used, number of partners; number of sessions held and quality criteria; number and percent attending per session, certificates of completion; number and percent demonstrating increased knowledge and skills, plus additional outcomes; number and percent demonstrating changes, and the types of changes; number and percent demonstrating improvements, and the types of improvements. (University of Wisconsin-Extension, Program Development and Evaluation)

Data collection plan: a table with columns for Questions, Indicators, and Data collection (Sources, Methods, Sample, Timing). (University of Wisconsin-Extension, Program Development and Evaluation)
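One row of that table pairs an evaluation question and its indicators with the source, method, sample, and timing for collecting the data. A minimal sketch; the record layout is an assumption, and the source, method, sample, and timing values are hypothetical since the slide leaves those cells blank:

    # One row of the data collection plan table.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class DataCollectionRow:
        question: str
        indicators: List[str]
        sources: List[str]    # hypothetical example values below; not in the slides
        methods: List[str]
        sample: str
        timing: str

    row = DataCollectionRow(
        question="To what extent did parents' knowledge and skills increase?",
        indicators=["# and % demonstrating increased knowledge/skills"],
        sources=["participating parents"],
        methods=["post-session survey"],
        sample="all attendees",
        timing="end of the 8-session series",
    )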

Indiana example. Infrastructure: financing, training, communication. Vision: In Indiana, children are safe, healthy, and reach their full potential. Young children birth through five and their families are a policy, program, and resource priority. Every family with young children birth through five has access to quality, comprehensive resources and supports. Resources and supports for young children birth through five are coordinated, cost-effective, linguistically competent, and community-based.

HNDECA Evaluation 2008. Dr. Richard Rathge, Director, North Dakota State Data Center, Fargo, ND. NDSU, IACC 424, Fargo, ND. Phone: (701) Fax: (701) URL: