Evaluation Considerations: Measures & Methods

Presentation on theme: "Evaluation Considerations: Measures & Methods"— Presentation transcript:

1 Evaluation Considerations: Measures & Methods
Shrikant I. Bangdiwala, PhD
Professor of Research in Biostatistics
Injury Prevention Research Center
University of North Carolina at Chapel Hill, USA
Presented at the WHO VIP Webinar, 2011

2 Content
Purpose of evaluation
Cycle of program planning & evaluation
Indicators
Study designs
Statistical modeling
Challenges

3 What are we 'evaluating'?
Actions, programs, activities
Conducted in a community setting, over a period of time
Aimed at reducing deaths, injuries, and/or events and behaviors that cause injuries
Safety 2010 London

4 Example: Suwon, South Korea → area of 'safety promotion'

5 Why do we 'evaluate'?
To know ourselves what works and whether we are doing some good
in performing some activity
in the community
in the country
To convince funders and supporters that their investment is worthwhile
To convince the community about the benefits of the multiple activities and actions carried out

6 Main purposes of evaluation
Evaluation helps determine:
How well a program/policy works relative to its goals & objectives
Why a program/policy did or didn't work, relative to planned process
How to restructure a program/policy to make it work, or work better
Whether to change funding for a program
Throughout this presentation we usually use the word "program"; by that we mean programs, policies, efforts, and interventions. For example, coalition building is not a program per se, but it is clearly an activity that can be assessed and evaluated on how well it's being done relative to your goals for it.
NSC Chicago 2010

7 Methodological complications
Multiplicities:
Multiple components of a program
Multiple populations at risk
Multiple study designs
Multiple types of effects/impacts/outcomes & severities
Multiple audiences/objectives of 'evaluation'
Multiple methods for conducting evaluation

8 When should evaluation be considered?
Evaluation needs to begin in, and be part of, the planning process…
Otherwise, "if you do not know where you are going, it does not matter which way you go, and you will never know if you got there or not!"
Lewis Carroll (1865) Alice in Wonderland
One of our main messages today is that evaluation is not something that you do after the fact. It is best done, and most easily done, when it is part of the planning process.
Adapted from M. Garrettson

9 Types of evaluation depending on program phase
Program planning phase – Formative evaluation: How can the program activities be improved before implementation?
Program implementation phase – Process evaluation: How is/was the program (being) implemented?
Post-program phase – Impact/outcome evaluation: Did the program succeed in achieving the intended impact or outcome?
NSC Chicago 2010

10 Cycle of program planning and evaluation
Adapted from C Runyan

11 Identify population & problem
Surveillance data
Other needs assessment strategies:
key informant interviews
focus groups
surveys
evaluations of past programs
literature
consultation with peers
other info…

12 Cycle of program planning and evaluation
[Cycle diagram: Identify problem & population → Define target audience → Identify resources → Set goals/objectives → Choose strategies → Test & refine implementation (Formative evaluation) → Implement → Evaluation: Process / Impact / Outcome → Disseminate]

13 Define target audience
To whom is the program directed?
Whose injuries need to be reduced?
Who is the target of the program?
at-risk persons
care givers (e.g. parents)
general public
media
decision makers
The injuries you are trying to reduce and the target of the program may not be the same. If you are trying to reduce shaken baby syndrome (SBS), you're targeting the parents, not the babies.

14 Understand target audience
What are their characteristics?
Special needs (e.g. literacy)
Interests, concerns, priorities
Attitudes & beliefs re: problem & solutions to problem
Cultural issues
And you need to know about the population you are targeting.

15 Cycle of program planning and evaluation
[Cycle diagram: Identify problem & population → Define target audience → Identify resources → Set goals/objectives → Choose strategies → Test & refine implementation (Formative evaluation) → Implement → Evaluation: Process / Impact / Outcome → Disseminate]

16 Identify resources
Community partners:
interest in topic
working on similar projects
On-going activities
Sources of financial support
Interests in community

17 Cycle of program planning and evaluation
[Cycle diagram: Identify problem & population → Define target audience → Identify resources → Set goals/objectives → Choose strategies → Test & refine implementation (Formative evaluation) → Implement → Evaluation: Process / Impact / Outcome → Disseminate]

18 Set goals & objectives
Goal: broad statement of what the program is trying to accomplish
Objectives: specific, measurable, time-framed
This takes us back to Alice: where are you trying to get, and how are you planning to try to get there?

19 Cycle of program planning and evaluation
[Cycle diagram: Identify problem & population → Define target audience → Identify resources → Set goals/objectives → Choose strategies → Test & refine implementation (Formative evaluation) → Implement → Evaluation: Process / Impact / Outcome → Disseminate]

20 Choose Strategies
Identify existing strategies/programs:
Literature: evidence-based? promising practice?
WHO manuals
Successes from other communities, regions, countries
Develop new strategies:
Logic model (how would it work)
Haddon matrix

21 Haddon Matrix
Columns (factors): Person | Vehicle/vector | Physical environment | Social environment
Rows (phases): Pre-event | Event | Post-event
There are multiple models that can help you identify strategies. One that is often used in injury prevention is the Haddon Matrix. I'm not going to teach you how to use it, but just show you what it is. This is the kind of thing that we could help you with in the future.
Haddon 1970 Am J Public Health

22 3-dimensional Haddon Matrix
Phases: Pre-event | Event | Post-event
Factors: Person | Vehicle/vector | Physical environment | Social environment
Decision criteria: Feasibility | Preferences | Stigmatization | Equity | Freedom | Cost | Effectiveness | Other??
So the Haddon Matrix lets you look at different phases of an injury event and different kinds of factors that influence it. It can also help you weigh other criteria, like cost and equity, in choosing a strategy. For example, I mentioned champions before as a factor that might influence your choice; they are something that can greatly improve the feasibility of actually carrying out a particular strategy or program.
Runyan 1998 Injury Prevention

23 Cycle of program planning and evaluation
[Cycle diagram: Identify problem & population → Define target audience → Identify resources → Set goals/objectives → Choose strategies → Test & refine implementation (Formative evaluation) → Implement → Evaluation: Process / Impact / Outcome → Disseminate]

24 Formative Evaluation
Questions it answers:
What is the best way to influence the target population?
Will the activities reach the people intended, and be understood and accepted by the target population?
How can activities be improved?
Why it's useful:
Improves (pilot-tests) program activities before full-scale implementation
May increase likelihood the program or policy will succeed
May help stretch resources
Market research example: the Chevy Nova. Another example: a questionnaire with Alaskan native communities about parenting practices, where you can't ask yes/no questions.
What we are doing now is formative evaluation: we are gathering information about our target audience (SC coalitions) to be able to develop a program of evaluation support, and we are piloting this support in the US before trying to roll it out on a more global scale.
* Modified from Thompson & McClintock, 2000

25 Cycle of program planning and evaluation
[Cycle diagram: Identify problem & population → Define target audience → Identify resources → Set goals/objectives → Choose strategies → Test & refine implementation (Formative evaluation) → Implement → Evaluation: Process / Impact / Outcome → Disseminate]

26 Implementation
As planned, with attention to detail
Documented clearly so others can replicate if appropriate
Nuff said.

27 Cycle of program planning and evaluation
[Cycle diagram: Identify problem & population → Define target audience → Identify resources → Set goals/objectives → Choose strategies → Test & refine implementation (Formative evaluation) → Implement → Evaluation: Process / Impact / Outcome → Disseminate]
So now we get to the part of evaluation that most people think about. Let's start with process evaluation.

28 Process evaluation
Purpose is to address:
What was done?
How was it implemented?
How well was it implemented?
Was it implemented as planned?

29 Process evaluation – examples of questions
Who carried out the intervention? Was this the appropriate person/group?
Who supported and opposed the intervention?
What methods/activities were used?
You are asking these questions along the way as part of the continuous quality improvement of your effort. You are also then asking them in retrospect, over a period of time, to help you understand any outcomes that you are or are not getting.

30 Process evaluation – why is it useful?
Allows replication of programs that work.
Helps understand why programs fail.
Process evaluation helps you know exactly what happened, so you can replicate it if you have found positive outcomes. It also helps you understand what did or did not happen in the implementation, so that a "failure" can be attributed to a program that doesn't work versus a program that wasn't actually implemented as planned. In other words, it helps keep you from throwing the baby out with the bathwater.
Close to Home example
* Modified from Thompson & McClintock, 2000

31 The intervention cannot be a black box…
It must be clearly understood.
[Diagram: Idea → ? → Outcome]
Think of it this way: if you are going to have heart surgery, you certainly don't want your surgeon to have been told only to 1. cut you open and 2. replace a valve. You want them to have a very detailed understanding of every little step.

32 Cycle of program planning and evaluation
[Cycle diagram: Identify problem & population → Define target audience → Identify resources → Set goals/objectives → Choose strategies → Test & refine implementation (Formative evaluation) → Implement → Evaluation: Process / Impact / Outcome → Disseminate]
Impact and outcome evaluation: the big hairy beast…

33 Impact evaluation
Purpose is to address changes in:
knowledge
attitudes
beliefs/values
skills
behaviors/practices
Short-term changes; things that may happen in months or a year.

34 Using impact measures for establishing effectiveness
Suppose we have a public safety campaign as our strategy.
We need to show: Campaign → Behavior → Outcome
If we already have demonstrated that Behavior → Outcome,
we simply need to show Campaign → Behavior.
You can then decide if showing these impact changes is sufficient. If the correlation between the impacts and the outcomes is strong, then they can be used as a proxy.

35 Outcome evaluation
Purpose is to address changes in:
injury events (e.g. frequency, type, pattern)
morbidity (e.g. frequency, severity, type)
mortality (e.g. frequency, time to death)
cost (e.g. direct and indirect)

36 Example: Bike helmets
Interventions: physician counseling of parents; enforcement of helmet law; media campaign
Impacts: parental attitudes toward child helmet use; purchase of helmets; use of helmets by children
Outcomes: head injury in bike crashes; deaths from head injury in crashes
NSC Chicago 2010

37 Evaluation – examples of questions for a local policy on smoke alarms
Did the local policy on smoke alarms in apartments…
Get passed?
Were people aware of it?
Did people have access to smoke alarms?
Did people get them installed properly?
Do people keep them maintained?
Did it lead to a reduction in the number or rates of:
events (e.g. apartment fires)
injuries
deaths
costs (e.g. burn center costs, family burden, property loss)
Suppose we are working to pass a policy that mandates that all apartment owners install smoke alarms. There are lots of questions you can ask. What are these first kinds of evaluation questions…

38 Evaluation – selection of measures: 'quantitative indicators'
Process
Impact
Outcome
Health-related
Financial

39 Choice of measure or indicator
We need to choose appropriate impact and outcome measures.
'Soft' (more difficult to measure) outcomes – perception constructs: fear, insecurity, wellbeing, quality of life; knowledge, attitudes and behaviors constructs
Hard outcomes – deaths, hospitalizations, disabilities due to injuries and violence
Societal impacts – local development indicators
Economic outcomes – direct $/€/£/¥, indirect DALYs, QALYs, opportunities lost, burdens

40 Evidence of effectiveness
Obtain qualitative 'evidence' to complement the quantitative 'evidence':
Ex.: Are "multisectoral collaborations and partnerships" friendly and functioning well?
Ex.: Is "community participation" optimal?
Incorporate process indicators
Incorporate narratives & testimonials

41 Cycle of program planning and evaluation
[Cycle diagram: Identify problem & population → Define target audience → Identify resources → Set goals/objectives → Choose strategies → Test & refine implementation (Formative evaluation) → Implement → Evaluation: Process / Impact / Outcome → Disseminate]

42 Dissemination
Dissemination not done well:
Not attempted
Not based on research about how to disseminate information to the intended audience
Dissemination done well:
Defining the audience
How to access the audience
How best to communicate the change message to them
Presentation of clear, straightforward messages

43 Evaluation measures lead to evidence of effectiveness
But only if the research and study methodologies, and the statistical analysis methodologies, are appropriate and understandable enough to convince the funders and supporters, the skeptics, the stakeholders, and the community.
This might be the place to put an exercise so they can think about ways to measure process, impact and outcome for their specific topic example. Then you can help them think about the ease or difficulty of getting those types of measures from secondary sources or primary data collection. But at least get them to understand what they would want to measure.

44 Research methodology approach: evidence of effectiveness
Obtain quantitative 'evidence' that favors the hypothesis that the intervention is effective, as opposed to the (null) hypothesis that the intervention is not effective. How?
Experimental study designs – randomized clinical trials, grouped randomized experiments, community-randomized studies
Quasi-experimental study designs – non-randomized comparative studies, before-after studies
Observational studies – cohort studies, case-control studies and comparative cross-sectional studies

45 Randomized controlled trial (RCT) / experiment – 'strongest' evidence
Randomize →
Intervention Group: O X O
Control Group: O X′ O
Need better colors on these slides. I find that students here are very confused about intervention and control groups vs. case-control designs. Perhaps your audience won't know that jargon, or will know it better; our students are not learning the differences in EPID, so I always have to explain. They also don't understand the difference between random selection and randomization.

46 Quasi-experimental designs – 'qualified' evidence
Intervention Group: O X O
Comparison Group: O O
I always make a point to explain how this can be individuals or can be communities.

47 One group pre/post – 'weak' evidence
Intervention Group: O X O
I used to teach this before the quasi-experimental design, but have found they more easily see the flaws in this design if they work from the ideal backwards.

48 One group – multiple pre / multiple post – better 'weak' evidence
Intervention Group: O O O X O O O O O
I used to teach this before the quasi-experimental design, but have found they more easily see the flaws in this design if they work from the ideal backwards.

49 One group, post only – 'basically ignorable' evidence
Intervention Group: X O
When I present this, I point out that this is often what is done, but DON'T DO IT… and leave it at that. By this point, if they have understood the prior three or four slides, they will get it.
Might want to finish with a review of WHY EVALUATE, and how evaluation has many types and is part of the planning cycle; that is a slide that you or Andres cut that you may want to put back in here.
Also, I think you need to help them understand that this was just a taste: you don't expect them to do evaluation based on this presentation, and they will need help and much more training. Knowing how to add and solve simple equations is necessary but not sufficient for understanding how to do statistics; the same holds for evaluation. There are many elements and subtleties. People take whole courses on this stuff and spend their careers doing it. You want them to have neither a false sense of security nor too much discouragement. It is learnable: their job is to be clear about what they want to accomplish, be sure it is measurable, and engage with experts in evaluation to help with the details, just as one would engage a statistician to help with analysis planning and implementation.

50 Observational designs – cohort study: evidence?
Self-chosen Intervention Group: X X X
Self-chosen Non-intervention Group: O O O
I always make a point to explain how this can be individuals or can be communities.

51 Observational designs – case-control study: evidence?
Cases: X O
Controls: X O
I always make a point to explain how this can be individuals or can be communities.

52 Observational designs – cross-sectional study: evidence?
Injured: X X O X O
Non-injured: O X O O O
I always make a point to explain how this can be individuals or can be communities.

53 Statistical analysis methodologies
The choice is often guided by what has been done previously, what is feasible to do, or what is easy to explain.
The choice should be tailored to the audience and their ability to understand the results, but also to the presenter's ability to explain the methodologies.

54 Statistical analysis
Determined by the research question(s)
Guided by study design – experimental or observational:
Group randomized controlled experiment
Non-randomized comparison study
Single site pre/post; surveillance study
Retrospective or cross-sectional
Guided by whether the outcome is studied at a single time point or multiple time points:
Time series analyses
Guided by audience:
Visual and descriptive appreciation
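As a hedged illustration of the "single site pre/post; surveillance study" case above, a segmented (interrupted) time-series regression can be sketched with numpy alone. Every number and variable name below is invented for illustration, not taken from the presentation:

```python
# Sketch: interrupted time series for a single-site surveillance series.
# Model: rate = b0 + b1*month + b2*after + b3*months_since + error,
# where b2 estimates the immediate level shift when the program starts.
import numpy as np

months = np.arange(24)                         # 12 months pre, 12 months post
after = (months >= 12).astype(float)           # post-intervention indicator
since = np.where(months >= 12, months - 12.0, 0.0)

rng = np.random.default_rng(0)                 # invented rates with a drop of ~8
rates = 50 + 0.2 * months - 8.0 * after + rng.normal(0.0, 1.0, 24)

X = np.column_stack([np.ones(24), months, after, since])
coef, *_ = np.linalg.lstsq(X, rates, rcond=None)
level_shift = coef[2]                          # estimated immediate change at month 12
```

A clearly negative `level_shift` near the simulated drop suggests the model recovers the intervention effect; a real analysis would also handle seasonality and autocorrelation.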

55 Visual and descriptive analysis – longitudinal time series
Example: Espitia et al (2008) Salud Pública de México

56 Visual and descriptive analysis – comparisons over time
Example: [figure not transcribed]

57 Statistical analysis – challenge
What we as a field have not done as well as other fields is to draw strength from numbers → develop collective evidence.
Combine results from multiple studies:
Systematic reviews (of observational studies)
Meta analysis (of experimental & observational studies)
Meta regression (of heterogeneous studies)
Mixed treatment meta regression (for indirect comparisons)

58 Systematic reviews
A protocol-driven comprehensive review and synthesis of data focusing on a topic or on related key questions:
formulate specific key questions
develop a protocol
refine the questions of interest
conduct a literature search for evidence
select studies that meet the inclusion criteria
appraise the studies critically
synthesize and interpret the results

59 Example – systematic review
Shults et al (2001) Amer J Prev Med

60 Systematic reviews
Of particular value in bringing together a number of separately conducted studies, sometimes with conflicting findings, and synthesizing their results.
To this end, systematic reviews may or may not include a statistical synthesis called meta-analysis, depending on whether the studies are similar enough that combining their results is meaningful.
Zaza et al (2001) Amer J Preventive Medicine – motor vehicle
Green (2005) Singapore Medical Journal

61 Meta analysis
A method of combining the results of studies quantitatively to obtain a summary estimate of the effect of an intervention.
Often restricted to randomized controlled trials.
Recently, the Cochrane Collaboration has been 'branching out' to include both experimental and observational studies in meta analyses.

62 Meta analysis
e.g. Liu et al (2008) Cochrane Collaboration

63 Meta analysis
The combining of results should take into account:
the 'quality' of the studies, assessed by the reciprocal of the variance
the 'heterogeneity' among the studies, assessed by the variance between studies

64 Meta analysis – estimation of effect
The estimate is a weighted average, where the weight of a study is the reciprocal of its variance.
To calculate the variance of a study, one can use either a 'fixed' effects model or a 'mixed'/'random' effects model:
Fixed effects model: utilizes no information from other studies
Random effects model: considers variance among and within studies
Borenstein et al (2009) Introduction to Meta Analysis
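The weighting rules above can be sketched in a few lines of plain Python. The study effects and variances below are invented log odds ratios, and the between-study variance uses the DerSimonian-Laird estimator as one common choice (an assumption on our part, since the slide does not name an estimator):

```python
# Sketch: inverse-variance pooling with fixed- and random-effects weights.
effects   = [-0.8, -0.1, -0.6, 0.2]        # invented per-study log odds ratios
variances = [0.04, 0.09, 0.06, 0.12]       # invented sampling variances

weights = [1.0 / v for v in variances]     # weight = reciprocal of variance
fixed = sum(w * e for w, e in zip(weights, effects)) / sum(weights)

# DerSimonian-Laird between-study variance (tau^2)
q = sum(w * (e - fixed) ** 2 for w, e in zip(weights, effects))
c = sum(weights) - sum(w * w for w in weights) / sum(weights)
tau2 = max(0.0, (q - (len(effects) - 1)) / c)

# Random-effects weights add tau^2 to each study's own variance
re_weights = [1.0 / (v + tau2) for v in variances]
pooled_random = sum(w * e for w, e in zip(re_weights, effects)) / sum(re_weights)
```

With heterogeneous studies, `tau2 > 0` and the random-effects weights are more even across studies than the fixed-effects weights.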

65 Meta analysis & meta regression
Dealing with 'heterogeneity' among the studies:
Decompose the total variance into among- and within-study components, using mixed effects models, to get a more precise estimate of the intervention effect.
If there is still residual heterogeneity:
Expand the mixed effects model to include study-level covariates that may explain some of the residual variability among studies → meta regression
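One way to sketch the meta-regression step is weighted least squares with inverse-variance weights; the studies below are invented, and the EU/USA indicator is a made-up study-level covariate:

```python
# Sketch: meta regression of study effects on a study-level covariate.
import numpy as np

effects   = np.array([-0.8, -0.1, -0.6, 0.2, -0.7, -0.3])   # invented
variances = np.array([0.04, 0.09, 0.06, 0.12, 0.05, 0.08])  # invented
region    = np.array([1.0, 0.0, 1.0, 0.0, 1.0, 0.0])        # 1 = EU, 0 = USA

w = 1.0 / variances
X = np.column_stack([np.ones_like(region), region])
WX = X * w[:, None]
# Weighted least squares: solve (X' W X) beta = X' W y
beta = np.linalg.solve(X.T @ WX, WX.T @ effects)
intercept, region_shift = beta   # region_shift: how EU studies differ from USA
```

Here `region_shift` estimates how much of the between-study variability is explained by the covariate, in the spirit of the X1/X2 study variables on the next slide.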

66 Meta regression
[Model diagram: overall mean + study random effect + random error, with study-level covariates X1 (EU/USA) and X2 (population type)]

67 Meta analysis
Standard meta-analytical methods are typically restricted to comparisons of 2 interventions using direct, head-to-head evidence alone. For example, if we are interested in the Intervention A vs Intervention B comparison, we would include only studies that compare Intervention A versus Intervention B directly.
Many times we have multiple types of interventions for the same type of problem, and we rarely have head-to-head comparisons.
We may also have multiple-component interventions.

68 Mixed treatment meta analysis
Let the outcome variable be a binary response:
1 = positive response
0 = negative response
We can tabulate the binomial counts out of a total number at risk on the kth intervention in the jth study.
We can then calculate the estimated probability of the outcome (risk of response) for the kth intervention in the jth study.
Welton et al 2009 Amer J Epid
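As a small sketch of those two calculations (with invented counts, and arm 0 standing in for each study's reference arm):

```python
# Sketch: per-arm risk, log odds, and log odds ratios vs. a reference arm.
import math

r = [[12, 7], [30, 18]]        # invented responders: 2 studies x 2 arms
n = [[100, 100], [200, 200]]   # invented numbers at risk

# Estimated probability (risk) of response for arm k in study j
p = [[rk / nk for rk, nk in zip(rj, nj)] for rj, nj in zip(r, n)]

# Log odds per arm, then log odds ratio relative to the reference arm 0
log_odds = [[math.log(pk / (1.0 - pk)) for pk in pj] for pj in p]
lor = [[lo - arm[0] for lo in arm] for arm in log_odds]
```

These per-study log odds ratios are the quantities that the random effects model on the next slides pools across studies.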

69 Mixed treatment meta analysis
Let each study j have a reference "standard" intervention arm, s_j, with study-specific "standard" log odds of outcome, μ_j. The log odds ratio, δ_{j,k}, of outcome for intervention k relative to standard s_j, is assumed to come from a random effects model with mean log odds ratio d_k − d_{s_j} and between-study standard deviation σ, where d_k is the mean log odds ratio of outcome for intervention k relative to control (so that d_1 = 0).
Welton et al 2009 Amer J Epid

70 Mixed treatment meta analysis
This leads to the following logistic regression model:
logit(p_{j,k}) = μ_j + δ_{j,k}, where δ_{j,s_j} = 0 and, for k ≠ s_j, δ_{j,k} ~ Normal(d_k − d_{s_j}, σ²).
Welton et al 2009 Amer J Epid

71 Mixed treatment meta analysis – multiple-methods interventions
If we have multiple methods in the ith intervention, plus multiple times when the outcome is assessed:
[Model diagram: outcome = study effect + component 1 & 2 effects + time effect + study covariable + error term]

72 Statistical analysis
Methodology does exist for developing stronger collective evidence, evaluating the effectiveness of community-based interventions, using different types of study designs and interventions.
Developing "practice-based evidence"

73 Dissemination
We should not stop at developing the evidence.
We must work alongside economists in developing ways to effectively communicate 'what works'; methodology and cost models do exist for estimating the "return on investment".
Money talks!!

74 Challenges – evaluation requires
Integration of evaluation from the beginning
Appropriate measures that can be collected objectively, without bias, easily, and with completeness
Appropriate qualitative and process information, to complement the quantitative information
Concrete and convincing evidence of what aspects work in individual communities
Formal methodological statistical evaluation of specific elements of programs
Collective evidence of what common elements of programs work
Effective dissemination strategies – "return on investment"

75 We are really looking forward to working with you
We will have a brief session tomorrow to share and discuss what we have to offer more specifically and what that might look like.

