Program Evaluation Spero Manson PhD


1 Program Evaluation Spero Manson PhD

2 Program evaluation is a careful investigation of a program’s characteristics and merits.
Its purpose is to provide information about the effectiveness of an activity or set of activities in order to optimize the outcomes, efficiency, and quality of health care.

3 Evaluation examines a program’s structure, activities, and organization, as well as the broader social environment in which it operates. Evaluation also appraises the achievement of a program’s goals and objectives, the extent of its impact, and its cost.

4 We study, we plan, we research. And yet, somehow, evaluation remains as much an art as a science.

5 The Basic Steps in Program Evaluation
1. Describe the program or activities to evaluate
2. Define goals and objectives
3. Create a logic model
4. Pose the evaluation question(s)
5. Select the methods - the information needed to answer the question
6. Gather the information
7. Report on and learn from the results

6 Describe the Program or Activities to Evaluate
- Need clarity as to the program and activities that are the focus of the evaluation
- Must be able to distinguish the program from the broader environment in which it operates
- A premium is placed on specifying the nature of the activities: who, what, where, when, how

7 Define the Program’s Goals and Objectives
- Goals: broad, general statements of what the program hopes to accomplish
- Objectives: measurable, time-specific outcomes that are expected to be achieved as a result of the program or activities

8 Define the Program’s Goals and Objectives
- Establishing measurable, time-specific outcomes is a major key to an evaluation’s credibility
- The more specific they are, the easier they are to measure
- The absence of uniformly accepted definitions or levels of performance introduces ambiguity – the bane of program evaluation!

9 Define the Program’s Goals and Objectives
Example:
- Goal: Reduce the risk of diabetes in youth by increasing their participation in physical activity
- Objective: Increase the number of students participating in physical activity in the wellness center by 50% during the next school year
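A measurable, time-specific objective like the one above can be checked mechanically once data are in hand. The sketch below is illustrative only; the baseline and follow-up counts are hypothetical numbers, not data from the program.

```python
# A minimal sketch (hypothetical numbers) of checking whether a
# measurable objective such as "increase participation by 50%" was met.
def objective_met(baseline, followup, target_increase=0.50):
    """True if participation rose by at least the target fraction."""
    return (followup - baseline) / baseline >= target_increase

# Baseline: 40 students; follow-up: 65 students (a 62.5% increase)
print(objective_met(40, 65))  # True
```

The point is that a well-written objective translates directly into a yes/no test, which is exactly what the slide means by "the more specific they are, the easier they are to measure."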

10 Create a Logic Model
- An overall description of program activities and potential outcomes and impacts
- Useful for planning the evaluation, defining questions, and determining the information that you can measure

11 Logic Model Components
- Context: influences, circumstances, resources, stakeholders
- Program goals and objectives
- Outputs, activities: what will be done or result
- Outcomes: potential measurable data and information
- Impacts: short- and long-term benefits

12 Logic Model Example
- Context (influences, circumstances, resources, stakeholders): Tribe, school, health program, Move It! grant resources
- Program goals, objectives: Summer camp to provide nutrition education and increase physical activity
- Outputs, activities (what will be done): daily walks, introduce new activities, prepare meals, classes on nutrition
- Outcomes (potential measurable data, information): increased participation, distance, # steps per day; knowledge of nutrition and fitness; awareness of their risk for diabetes
- Impacts (short- and long-term benefits): weight loss, decreased obesity, prevention of diabetes
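One practical way to use a logic model when planning an evaluation is to capture it as a simple data structure and read evaluation questions off the outcomes. This sketch uses the Move It! summer-camp example from the slide; the field names and the question template are illustrative, not part of the original model.

```python
# A logic model captured as a plain data structure (field names are
# illustrative). Each listed outcome suggests a measurable question.
logic_model = {
    "context": ["Tribe", "school", "health program", "Move It! grant resources"],
    "goals": ["Provide nutrition education", "Increase physical activity"],
    "activities": ["Daily walks", "Introduce new activities",
                   "Prepare meals", "Classes on nutrition"],
    "outcomes": ["Participation rates", "Distance walked", "Steps per day",
                 "Knowledge of nutrition and fitness",
                 "Awareness of diabetes risk"],
    "impacts": ["Weight loss", "Decreased obesity", "Prevention of diabetes"],
}

# Turn each outcome into a candidate evaluation question.
for outcome in logic_model["outcomes"]:
    print(f"Did the program change: {outcome}?")
```

Writing the model down this way makes the next step on the slides concrete: the evaluation questions fall out of the outcomes and impacts columns.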

13 Evaluation: Art as well as Science
They’re harmless when they’re alone, but get a bunch of them together to evaluate a program … watch out!!

14 Pose the Evaluation Question(s)
- The most important step
- Determines the methods of the evaluation
- Can have more than one question
- Pose the question(s) after reviewing the logic model, which helps you see possible questions

15 Types of Evaluation Questions
- Process evaluation questions: Did we do what we said we would do? (Documentation and review of activities)
- Outcome evaluation questions: Did we achieve any short- or long-term outcomes as a result of our program?

16 Typical Evaluation Questions
- To what extent did the program achieve its goals and objectives?
- What are the characteristics of the individuals and groups who participated in the program?
- For which individuals or groups was the program most effective?
- How enduring were the effects?

17 Typical Evaluation Questions
- Which features (e.g., activities, settings, care strategies) of the program were most effective?
- How applicable are the program’s objectives and activities to other participants in other settings?
- What are the relationships among the costs of the program and its effects?
- To what extent did changes in social, political, and/or financial circumstances influence the program’s support and outcomes?

18 Evaluation Questions: Examples
Process evaluation:
- Did the school implement the Move It! campaign activities during the summer school session?
Outcome evaluation:
- Did student awareness of their risk for diabetes increase after the summer school session?
- Did at least half of the student population participate in physical activity each day?
- What percent of students experienced weight loss during the summer school session?

19 Select the Methods
What you need to do to answer your evaluation questions depends on the question! Various methods:
- Pre/post tests on knowledge and attitudes
- Track participation rates in activities
- Track clinical parameters: weight, heart rate, blood glucose
- Evaluate satisfaction with activities
Always consider a comparison if possible:
- Before vs. after comparisons
- Participants vs. non-participants
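The pre/post-test method mentioned above can be sketched in a few lines. The scores below are hypothetical knowledge-test results (0-100), and the sketch reports only the mean change; a real evaluation would add a significance test and, ideally, a comparison group, as the design slides that follow explain.

```python
# A minimal before-vs-after comparison on hypothetical pre/post
# knowledge-test scores. Reports mean change only.
from statistics import mean

pre  = [52, 61, 47, 70, 58, 64]
post = [68, 75, 55, 82, 60, 71]

change = [after - before for before, after in zip(pre, post)]
print(f"Mean change: {mean(change):.1f} points")  # Mean change: 9.8 points
```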

20 Evaluation Designs
Evaluations with concurrent controls in which participants are randomly assigned to groups
- Benefits: If properly conducted, can establish the extent to which a program caused outcomes
- Concerns: More difficult to implement, logistically and methodologically

21 Evaluation Designs
Evaluations with concurrent controls in which participants are not randomly assigned to groups
- Benefits: Easier to implement
- Concerns: A wide range of potential biases may occur because, without an equal chance of selection, participants in the program may be systematically different from those in the control group. The two groups in the evaluation may also be systematically different from other, nonparticipating groups.

22 Evaluation Designs
Evaluations with self-controls require pre-/post-measures and often are referred to as longitudinal evaluations or before-and-after designs
- Benefits: Relatively easy to implement logistically; provides data on change and improvement
- Concerns: Must be certain that measurements are appropriately timed. Without a control group, you cannot tell whether apparent program effects are also present in other, nonparticipating groups

23 Evaluation Designs
Evaluations with historical controls use data collected from participants in other evaluations
- Benefits: Easy to implement, unobtrusive
- Concerns: Must be certain that “normative” comparisons are applicable to participants in the evaluation

24 Threats to Validity of Evaluation
Maturation – as a part of normal human development, individuals mature intellectually, emotionally, and socially. This new maturity may be as important as the program in producing change.

25 Threats to Validity of Evaluation
History – historical events may occur that can bias results or produce similar changes as those intended by the program, e.g., new educational campaigns that encourage the community at large to change its behavior, change in the structure and financing of health care, etc.

26 Threats to Validity of Evaluation
Instrumentation – unless the measures or tools used to collect the data are dependable, one cannot be confident that the data are accurate.

27 Threats to Validity of Evaluation
Attrition – the participants who remain in a program may be (indeed, often are) different from those who drop out.


29 Gather the Information
- Set up a process for gathering the information
- Create forms, tracking sheets, surveys
- Plan for how you will analyze or review the information once it is collected: hand counts, tallies; spreadsheets, statistical databases; common themes
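The hand counts and tallies mentioned above are easy to automate once sign-in sheets are transcribed. The sketch below is illustrative; the activity names are made up, not from the program's actual forms.

```python
# A sketch of tallying daily activity sign-in entries, replacing a
# hand count (activity names are illustrative).
from collections import Counter

signins = ["walk", "walk", "nutrition class", "walk",
           "meal prep", "nutrition class", "walk"]
tally = Counter(signins)
for activity, count in tally.most_common():
    print(f"{activity}: {count}")
```

The same tally feeds directly into the participation-rate outcomes listed in the logic model.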

30 Questions to Ask in Choosing a Data Source
- What variables, constructs, or concepts need to be measured?
- Are they sufficiently well defined to be measured?
- Can I borrow or adapt a currently available measure, or must a new measure be created?
- If an available measure seems appropriate, has it been used in circumstances similar to the current evaluation?


32 Questions to Ask in Choosing a Data Source
- Do I have the technical skills, financial resources, and time to create a valid measure?
- Do I have the technical skills, financial resources, and time to collect information with the chosen measure?
- Are participants likely to be able to fill out the forms, answer the questions, and provide the information called for by the measure?

33 Questions to Ask in Choosing a Data Source
When medical and other confidential records are relevant data sources, are these records likely to:
- Contain the necessary information?
- Be complete?
- Be timely?
- Be reliable?
- Be accessible?

34 Questions to Ask in Choosing a Data Source
To what extent will users of the evaluation’s results have confidence in the nature of the information, the manner in which it was collected, and its sources?

35 Report on and Learn from Results
- The most commonly forgotten step
- Communicate results to stakeholders
- Review results – lessons learned
- Develop a plan to respond to the results and identify future activities to address them
- Communicate with parents and the community
- Include in grant progress reports!
- Publish in appropriate scientific forums to establish best practices specific to our communities


