
1 HNDECA and ECCS Evaluation. Dr. Richard Rathge, Professor and Director, North Dakota State Data Center. Suggestions and Strategies for Evaluation. Bismarck, ND, Oct. 6, 2008. Prepared by the North Dakota State Data Center, July 2008.

2 Presentation Objectives: 1. Provide an overview of evaluation approaches. 2. Review directions from other states. 3. Offer a recommendation and strategy for North Dakota's approach to the grant.

3 Typical Logic Model

4 (University of Wisconsin-Extension, Program Development and Evaluation) OUTPUTS: What we do, who we reach. ACTIVITIES: Train, teach; deliver services; develop products and resources; network with others; build partnerships; assess; facilitate; work with the media... PARTICIPATION: Participants, clients, customers, agencies, decision makers, policy makers.

5 (University of Wisconsin-Extension, Program Development and Evaluation) OUTCOMES: What results for individuals, families, communities... A chain of outcomes: SHORT-TERM (Learning): changes in awareness, knowledge, attitudes, skills, opinion, aspirations, motivation, behavioral intent. MEDIUM-TERM (Action): changes in behavior, decision-making, policies, social action. LONG-TERM (Conditions): changes in social (well-being), health, economic, civic, and environmental conditions.

6 What is a Theory of Change? A long-term outcome rests on a set of necessary pre-conditions. Short-term and intermediate outcomes must be achieved BEFORE the long-term outcome, and you need to explain WHY.

7 How are they different? Logic models graphically illustrate program components; creating one helps stakeholders clearly identify outcomes, inputs, and activities. Theories of change link outcomes and activities to explain HOW and WHY the desired change is expected to come about. (Aspen Institute Roundtable on Community Change)

8 How are they different? (1) Logic models usually start with a program and illustrate its components. Theories of change may start with a program, but work best when they start with a goal, before deciding what programmatic approaches are needed. (Aspen Institute Roundtable on Community Change)

9 How are they different? (2) Logic models require identifying program components, so you can see at a glance whether outcomes are out of sync with inputs and activities, but they don't show WHY activities are expected to produce outcomes. Theories of change also require a justification at each step: you have to articulate the hypothesis about why one thing will cause another (it is a causal model). (Aspen Institute Roundtable on Community Change)

10 How are they different? (3) Logic models don't always identify indicators (evidence used to measure whether outcomes are met). Theories of change require identifying indicators. (Aspen Institute Roundtable on Community Change)


12 Logic Model built from a Theory of Change using "so that" chains (University of Wisconsin-Extension, Program Development and Evaluation). Why we think we should do something... INPUTS (program investments: what we invest) SO THAT we can deliver OUTPUTS (activities and participation: what we do, who we reach) SO THAT we achieve OUTCOMES (short, medium, and long-term: what results).
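The "so that" chain above can be sketched as a simple data structure. This is an illustrative sketch only, not part of the slides: the step texts are drawn from the parent education example later in the deck, and the function name `read_chain` is invented here.

```python
# Illustrative sketch: a "so that" chain is an ordered list of statements
# in which each step is justified by the one that follows it.
so_that_chain = [
    "Deliver eight parent education sessions",          # activity (output)
    "Parents increase knowledge of child development",  # short-term outcome
    "Parents use effective parenting practices",        # medium-term outcome
    "Child-parent relations improve",                   # long-term outcome
]

def read_chain(steps):
    """Join the steps with 'SO THAT' to read the chain as one statement."""
    return " SO THAT ".join(steps)

print(read_chain(so_that_chain))
```

Reading the chain aloud this way is a quick check on the theory of change: if a "so that" link sounds implausible, the WHY for that step has not been articulated.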

13 Evaluation Component (University of Wisconsin-Extension, Program Development and Evaluation). PLANNING: start with the end in mind. EVALUATION: check and verify. What do you want to know? How will you know it? The logic model needs to incorporate outcome-based performance measures for evaluation.

14 Logic model and common types of evaluation (University of Wisconsin-Extension, Program Development and Evaluation). Needs/asset assessment: What are the characteristics, needs, and priorities of the target population? What are potential barriers and facilitators? What is most appropriate to do? Process evaluation: How is the program implemented? Are activities delivered as intended, with fidelity of implementation? Are participants being reached as intended? What are participant reactions? Outcome evaluation: To what extent are desired changes occurring? Are goals met? Who is benefiting or not benefiting, and how? What seems to work or not work? What are the unintended outcomes? Impact evaluation: To what extent can changes be attributed to the program? What are the net effects? What are the final consequences? Is the program worth the resources it costs?
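The four evaluation types pair naturally with stages of the logic model. The mapping below is an assumed summary of the slide, not an authoritative taxonomy; the dictionary name and stage labels are illustrative.

```python
# Illustrative mapping of evaluation type -> (logic model stage it examines,
# one representative question). Content paraphrased from the slide above.
EVALUATION_TYPES = {
    "needs/asset assessment": (
        "situation",
        "What are the needs and priorities of the target population?",
    ),
    "process evaluation": (
        "outputs",
        "Are activities delivered as intended?",
    ),
    "outcome evaluation": (
        "outcomes",
        "To what extent are desired changes occurring?",
    ),
    "impact evaluation": (
        "long-term outcomes",
        "To what extent can changes be attributed to the program?",
    ),
}

for etype, (stage, question) in EVALUATION_TYPES.items():
    print(f"{etype} -> {stage}: {question}")
```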

15 Logic model for a parent education program (strategy/theory based; University of Wisconsin-Extension, Program Development and Evaluation). INPUTS: staff, money, partners, research. OUTPUTS: assess parent education programs; design and deliver an evidence-based program of 8 sessions; facilitate support groups; parents of kids under age 6 attend. SHORT-TERM OUTCOMES: parents increase knowledge of child development; parents better understand their own parenting style; parents gain skills in new ways to parent; parents gain confidence in their abilities. MEDIUM-TERM OUTCOMES: parents use effective parenting practices; parents identify appropriate actions to take. LONG-TERM OUTCOMES: improved child-parent relations; improved school readiness; safe, stable, nurturing families.

16 Parent Education Example: evaluation questions and indicators (University of Wisconsin-Extension, Program Development and Evaluation). EVALUATION QUESTIONS by stage: Inputs (staff, money, partners, research): What amount of money and time were invested? Outputs (develop parent ed curriculum; deliver a series of 8 interactive sessions; facilitate support groups; parents of kids under age 6 attend): How many sessions were held, and how effectively? What were the number and quality of support groups? Who and how many attended or did not attend? Did they attend all sessions and support groups? Were they satisfied, and why or why not? Short-term outcomes (parents increase knowledge of child development; better understand their own parenting style; gain skills in new ways to parent; gain confidence in their abilities): To what extent did knowledge and skills increase? For whom? Why? What else happened? Medium-term outcomes (parents use effective parenting practices; identify appropriate actions to take): To what extent did behaviors change? For whom? Why? What else happened? Long-term outcomes (improved child-parent relations; improved school readiness; safe, stable, nurturing families): To what extent are relations improved? To what extent is school readiness increased? INDICATORS by stage: Inputs: number of staff; dollars used; number of partners. Outputs: number of sessions held; quality criteria; number and percent attending per session; certificates of completion. Short-term: number and percent demonstrating increased knowledge/skills; additional outcomes. Medium-term: number and percent demonstrating changes; types of changes. Long-term: number and percent demonstrating improvements; types of improvements.

17 Data collection plan (University of Wisconsin-Extension, Program Development and Evaluation). The plan is a table with columns: Questions | Indicators | Data collection: Sources | Methods | Sample | Timing.
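One row of the data collection plan can be sketched as a record whose fields mirror the table columns. This is a hypothetical sketch: the class name and the example values (survey method, sample, timing) are invented for illustration and are not from the slides.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of one row in the data collection plan table.
# Field names mirror the slide's columns: Questions, Indicators,
# Sources, Methods, Sample, Timing.
@dataclass
class DataCollectionRow:
    question: str
    indicators: list = field(default_factory=list)
    sources: str = ""
    methods: str = ""
    sample: str = ""
    timing: str = ""

# Example values are illustrative only.
row = DataCollectionRow(
    question="To what extent did knowledge and skills increase?",
    indicators=["number and percent demonstrating increased knowledge/skills"],
    sources="Program participants",
    methods="Pre/post survey",
    sample="All attendees",
    timing="First and last session",
)
print(row.question)
```

Filling in one such row per evaluation question forces the planning discipline the deck recommends: every question gets an indicator, a source, a method, a sample, and a timing before data collection begins.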


25 Vision: In Indiana, children are safe, healthy, and reach their full potential. Young children birth through five and their families are a policy, program, and resource priority. Every family with young children birth through five has access to quality, comprehensive resources and supports. Resources and supports for young children birth through five are coordinated, cost effective, linguistically competent, and community-based. Infrastructure: financing, training, communication.

26 HNDECA Evaluation 2008. Dr. Richard Rathge, Director, North Dakota State Data Center, NDSU, IACC 424, Fargo, ND 58105. Email: Richard.Rathge@ndsu.nodak.edu. Phone: (701) 231-8621. Fax: (701) 231-9730. URL: www.ndsu.edu/sdc

