REGIONAL CONFERENCES 2008-2009: Planning, Doing, and Using Evaluation
What are the experiences you have had in evaluation?
Introduction
Three components of program evaluation:
Planning – design of the evaluation
Doing – collection of data, analysis, and interpretation
Using – applying and disseminating the results
What do you think is involved in each of these stages?
Planning Evaluations
Determining information needs – the purpose of doing an evaluation
Developing an evaluation strategy – logic model, indicators, measures, questions, logistics
Doing Evaluation
Collecting and compiling data
Analyzing information
Developing conclusions and recommendations
Writing the report
Using Evaluations
Developing a communications plan
Developing strategies and action plans to ensure use of evaluation findings
Planning Evaluations
Steps to ensuring readiness – a critical success factor for evaluation:
Identify any barriers and facilitators
Identify and involve key stakeholders with an expressed interest in evaluation
Identify an evaluation champion
Research existing best practices, previous designs and measures, and academic literature pertaining to the program
Planning Evaluations
Steps to ensuring readiness (cont'd):
Collaborate with other agencies to build a network of support
Establish a shared understanding of why the evaluation is being carried out
Develop an evaluation framework, ensuring involvement of key stakeholders and staff
Construct a program logic model
Planning an Evaluation: Developing an Evaluation Framework
Steps to ensuring readiness (cont'd):
Develop evaluation questions
Identify indicators for all outcomes being measured
Select appropriate measures and sources of information
Identify responsibilities for data collection
In your experience, offer examples of: 1) two barriers you have encountered in the implementation of evaluation in your organization, and 2) two facilitators…
Program Logic Model A logic model is a visual diagram that shows causal relationships between the various components of a program, i.e., linking program activities with intended results. It is also the foundation for the design of evaluation questions.
Program Logic Model
Components:
Inputs – funding and resources
Activities – what is being delivered in the program; the services you provide
Outputs – what is produced; the tangible products that result from the activities
Outcomes – the impacts of the program on the target population that occur as a result of program delivery
Program Logic Model
Outcomes:
Short-term – client changes that can be directly attributed to program delivery
Intermediate – client changes that build on short-term outcomes and link them to the long-term goal(s)
Long-term goal – the overall program goal; the ideal state in which program objectives have been met and the community need/problem has been addressed
Program Logic Models - Example
Marathon Runner
Goal: to improve performance over last year's time
Can you identify Inputs, Activities, Outputs, and Outcomes that might be involved in this training program?
Program Logic Models - Example
Inputs: new running shoes; dedicated time in daily schedule; training resources; gym membership
Activities: interval training; distance runs; strength training
Outputs: 2 x 1-hour strength training sessions/week; 2 x 1-hour interval sessions/week; 2 x 2-hour distance runs/week
Outcomes: increased speed; increased endurance
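The marathon-runner logic model above can also be written down as a simple data structure, which makes the causal chain (inputs → activities → outputs → outcomes) explicit. This is a minimal illustrative sketch in Python; the class and field names follow the logic-model components and are not from the original slides.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """A program logic model: inputs -> activities -> outputs -> outcomes."""
    inputs: list[str] = field(default_factory=list)
    activities: list[str] = field(default_factory=list)
    outputs: list[str] = field(default_factory=list)
    outcomes: list[str] = field(default_factory=list)

# The marathon-runner example from the slides:
marathon = LogicModel(
    inputs=["new running shoes", "dedicated time in daily schedule",
            "training resources", "gym membership"],
    activities=["interval training", "distance runs", "strength training"],
    outputs=["2 x 1-hour strength training sessions/week",
             "2 x 1-hour interval sessions/week",
             "2 x 2-hour distance runs/week"],
    outcomes=["increased speed", "increased endurance"],
)

# Walk the causal chain from left to right:
for stage in ("inputs", "activities", "outputs", "outcomes"):
    print(f"{stage}: {', '.join(getattr(marathon, stage))}")
```

Writing the model this way makes it easy to check, for each outcome, that there are activities and outputs upstream of it before evaluation questions are drafted.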
Program Logic Models - Example
Resources: What must the program have in order to function well?
Activities: What must you do to achieve your intended results?
Outputs: How much do you do, for how many people, over what time?
Outcomes: What difference does the program make for the target population?
Indicators and Measures
Indicators: evidence that you've attained your goal and that the outcomes have been achieved
Select the particular outcomes you want to measure.
Ask: How will you know when this has been achieved? What is the evidence, and how will it be measured?
Ensure quality – appropriateness and integrity of the information – and quantity – enough power to detect effects, with minimal burden on the respondents providing the information.
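The indicator-selection step above amounts to answering, for each outcome, "what is the evidence, how is it measured, and where does the information come from?" A minimal sketch of that mapping, continuing the marathon-runner example; every name and value here is an illustrative assumption, not from the slides.

```python
# For each outcome to be measured, record the indicator (evidence),
# the measure (how it will be quantified), and the information source.
# All entries below are illustrative assumptions, not from the slides.
indicators = {
    "increased speed": {
        "indicator": "average pace improves over baseline",
        "measure": "timed 10 km trial, minutes per km",
        "source": "training log",
    },
    "increased endurance": {
        "indicator": "longest continuous run increases",
        "measure": "distance in km without stopping",
        "source": "GPS watch record",
    },
}

# Quality check: every measured outcome must name its evidence,
# its measure, and its information source.
for outcome, spec in indicators.items():
    assert {"indicator", "measure", "source"} <= spec.keys(), outcome
    print(f"{outcome}: {spec['indicator']} ({spec['measure']})")
```

A table like this also makes it obvious who is responsible for each data source when collection responsibilities are assigned.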
Inputs → Activities → Outputs → Short-term Outcomes → Final Outcome
Process evaluation covers inputs, activities, and outputs; outcome evaluation covers short-term outcomes and the final outcome.
Process Evaluations
Key issues:
Service delivery – the extent to which program components are being delivered as planned
Coverage – the extent of target population participation in the program
Bias – the extent to which subgroups of the designated target population participate (or don't participate)
Outcome Evaluations
Key issue:
Program effectiveness/impact – is the program achieving the intended outcomes within the target population, e.g., changes in knowledge/understanding, attitudes, and behaviour/functioning?
ACTIVITY
Part A: Developing evaluation questions
Part B: Developing a framework
Susan Kasprzak, Research Associate
Tanya Witteveen, Research Associate