1 REGIONAL CONFERENCES 2008-2009 Planning, Doing, and Using Evaluation

2 What experiences have you had with evaluation?

3

4 Introduction: 3 Components of Program Evaluation
Planning – design of the evaluation
Doing – collection of data, analysis, and interpretation
Using – applying and disseminating the results

5 Planning Evaluations
Determining Information Needs – the purpose of doing an evaluation; communicating with stakeholders
Developing an Evaluation Strategy – logic model, indicators, measures, questions, logistics

6 Who would you define as a key stakeholder?

7 Doing Evaluation
Collecting and compiling data
Analyzing information
Developing conclusions and recommendations
Writing the report

8 Using Evaluations
Developing a communications plan
Developing strategies and action plans to ensure use of evaluation findings

9 Planning Evaluations
Steps to ensuring readiness – a critical success factor for evaluation:
Identify any barriers and facilitators
Identify and involve key stakeholders with an expressed interest in evaluation
Identify an evaluation champion
Research existing best practices, previous designs and measures, and academic literature pertaining to the program

10 Planning Evaluations
Steps to ensuring readiness (cont'd):
Collaborate with other agencies to build a network of support
Establish an understanding of why the evaluation is being carried out
Develop an evaluation framework, ensuring involvement of key stakeholders and staff
Construct a program logic model

11 Planning an Evaluation: Developing an Evaluation Framework
Steps to ensuring readiness (cont'd):
Develop evaluation questions
Identify indicators for all outcomes being measured
Select appropriate measures and sources of information
Identify responsibilities for data collection

12 In your experience, offer examples of:
1) Two barriers you have encountered in the implementation of evaluation in your organization, and
2) Two facilitators you have encountered in the implementation of evaluation in your organization.

13 Program Logic Model
A logic model is a visual diagram that shows the causal relationships between the various components of a program, i.e., it links program activities with intended results. It is also the foundation for the design of evaluation questions.

14 Program Logic Models – Example
Inputs – What must the program have in order to function well?
Activities – What must you do to achieve your intended results?
Outputs – How much do you do, for how many people, over what time?
Outcomes – What difference does the program make for the target population?

15 Program Logic Model Components:
Inputs – funding and resources
Activities – what is being delivered in the program; the services you provide
Outputs – what is produced; the tangible products that result from the activities
Outcomes – the impacts of the program on the target population that occur as a result of program delivery

16 Program Logic Model Outcomes:
Short-term – client changes that can be directly attributed to program delivery
Intermediate – client changes that can be attributed to program delivery and that link the short-term outcomes to the long-term goal(s)
Long-term goal – the overall program goal; the ideal state where program objectives have been met and the community need/problem has been addressed

17 Program Logic Models – Example: Marathon Runner
Goal: to improve race performance over last year's time

18 Can you identify Inputs, Activities, Outputs, and Outcomes that might be involved in this training program?

19 Program Logic Models – Example
Inputs: new running shoes; dedicated time in daily schedule; training resources; gym membership
Activities: interval training; distance runs; strength training
Outputs: 2 x 1-hour strength training sessions/week; 2 x 1-hour interval sessions/week; 2 x 2-hour distance runs/week
Outcomes: increased speed; increased endurance
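The same example can also be written down as a simple data structure, which makes the component-by-component reading of a logic model explicit. This is an illustrative sketch only; the LogicModel class and its field names are not part of the presentation.

```python
# Minimal sketch (illustrative, not from the presentation): the marathon-runner
# logic model captured as a plain data structure. The fields mirror the four
# logic-model components; the entries are the examples from the slide above.
from dataclasses import dataclass, field
from typing import List

@dataclass
class LogicModel:
    goal: str
    inputs: List[str] = field(default_factory=list)
    activities: List[str] = field(default_factory=list)
    outputs: List[str] = field(default_factory=list)
    outcomes: List[str] = field(default_factory=list)

marathon = LogicModel(
    goal="Improve race performance over last year's time",
    inputs=["new running shoes", "dedicated time in daily schedule",
            "training resources", "gym membership"],
    activities=["interval training", "distance runs", "strength training"],
    outputs=["2 x 1-hour strength training sessions/week",
             "2 x 1-hour interval sessions/week",
             "2 x 2-hour distance runs/week"],
    outcomes=["increased speed", "increased endurance"],
)

# Walking the fields in order reproduces the causal chain of the diagram:
# inputs -> activities -> outputs -> outcomes.
for component in ("inputs", "activities", "outputs", "outcomes"):
    print(component.upper() + ":", "; ".join(getattr(marathon, component)))
```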

20 Indicators and Measures

21 Indicators: evidence that you've attained your goal and that the outcomes have been achieved
Select the particular outcomes you want to measure.
Ask: How will you know when this has been achieved? What is the evidence, and how will it be measured?
Ensure quality – the appropriateness and integrity of the information – and quantity – the power to detect effects with minimal burden on respondents providing the information
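As a toy illustration of turning "How will you know?" into a concrete indicator and measure, consider the marathon example again; the finish times below are hypothetical placeholders, not figures from the presentation.

```python
# Toy illustration (hypothetical numbers): the indicator for the marathon goal
# is "finish time improves on last year's time"; the measure is the timed
# race result recorded for each year.
last_year_minutes = 245.0   # hypothetical baseline measure
this_year_minutes = 232.5   # hypothetical follow-up measure

improvement = last_year_minutes - this_year_minutes
percent_change = improvement / last_year_minutes * 100
goal_achieved = this_year_minutes < last_year_minutes

print(f"Improvement: {improvement:.1f} min ({percent_change:.1f}%); "
      f"goal achieved: {goal_achieved}")
```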

22 [Diagram] Inputs → Activities → Outputs → Short-term Outcomes → Intermediate Outcomes → Final Outcome
Process evaluation spans the Inputs, Activities, and Outputs; outcome evaluation spans the Short-term, Intermediate, and Final Outcomes.

23 Process Evaluations – Key Issues:
Service Delivery – the extent to which program components are being delivered as planned
Coverage – the extent of target population participation in the program
Bias – the extent to which subgroups of the designated target population participate (or don't participate)

24 Outcome Evaluations – Key Issues:
Program Effectiveness/Impact – is the program achieving the intended outcomes within the targeted population, e.g., changes in knowledge/understanding, attitudes, and behaviour/functioning?

25

26 Activity: Logic Model Development
Long-term Goal:
Target Population:
Inputs (resources, e.g., $, staff, equipment)
Activities (services, e.g., counseling, outreach, support groups)
Outputs (products, e.g., # of classes, # of sessions)
Outcomes (impact or effectiveness of the program on the client): Short-term / Intermediate

27 Susan Kasprzak, Research Associate – skasprzak@cheo.on.ca
Tanya Witteveen, Research Associate – twitteveen@cheo.on.ca

28 Visit our website for more information: www.onthepoint.ca

