
WHO VIP Webinar 2011: Evaluation Considerations: Measures & Methods. Shrikant I. Bangdiwala, PhD, Professor of Research in Biostatistics, Injury Prevention Research Center, University of North Carolina at Chapel Hill, USA.




1 Evaluation Considerations: Measures & Methods
Shrikant I. Bangdiwala, PhD
Professor of Research in Biostatistics
Injury Prevention Research Center, University of North Carolina at Chapel Hill, USA
Presented at the WHO VIP Webinar, 2011

2 Content
- Purpose of evaluation
- Cycle of program planning & evaluation
- Indicators
- Study designs
- Statistical modeling
- Challenges

3 (Safety 2010, London) What are we evaluating?
- Actions, programs, activities
- Conducted in a community setting, over a period of time
- Aimed at reducing deaths, injuries, and/or events and behaviors that cause injuries

4 Example: Suwon, South Korea, an area of safety promotion

5 Why do we evaluate?
- To know ourselves what works and whether we are doing some good: in performing some activity, in the community, in the country
- To convince funders and supporters that their investment is worthwhile
- To convince the community about the benefits of the multiple activities and actions carried out

6 (NSC Chicago) Main purposes of evaluation
Evaluation helps determine:
- How well a program/policy works relative to its goals & objectives
- Why a program/policy did or didn't work, relative to the planned process
- How to restructure a program/policy to make it work, or work better
- Whether to change funding for a program

7 Methodological complications: multiplicities
- Multiple components of a program
- Multiple populations at risk
- Multiple study designs
- Multiple types of effects/impacts/outcomes & severities
- Multiple audiences/objectives of evaluation
- Multiple methods for conducting evaluation

8 When should evaluation be considered?
Evaluation needs to begin in, and be part of, the planning process. Otherwise, if you do not know where you are going, it does not matter which way you go, and you will never know whether you got there or not!
(Lewis Carroll (1872) Alice in Wonderland; adapted from M. Garrettson)

9 Types of evaluation depending on program phase
- Program planning phase: formative evaluation. How can the program activities be improved before implementation?
- Program implementation phase: process evaluation. How is/was the program (being) implemented?
- Post-program phase: impact/outcome evaluation. Did the program succeed in achieving the intended impact or outcome?

10 Cycle of program planning and evaluation (adapted from C. Runyan)

11 Identify population & problem
- Surveillance data
- Other needs assessment strategies: key informant interviews, focus groups, surveys, evaluations of past programs, literature, consultation with peers, other info…

12 Cycle diagram: Identify problem & population → Define target audience → Identify resources → Set goals/objectives → Choose strategies → Test & refine implementation (Formative evaluation) → Implement (Process evaluation) → Impact/Outcome evaluation → Disseminate

13 Define target audience
To whom is the program directed? Whose injuries need to be reduced? Who is the target of the program?
- at-risk persons
- caregivers (e.g. parents)
- general public
- media
- decision makers

14 Understand target audience
What are their characteristics?
- Special needs (e.g. literacy)
- Interests, concerns, priorities
- Attitudes & beliefs regarding the problem & its solutions
- Cultural issues

15 (Cycle of program planning & evaluation diagram, repeated)

16 Identify resources
- Community partners: interest in topic, working on similar projects
- On-going activities
- Sources of financial support
- Interests in community

17 (Cycle of program planning & evaluation diagram, repeated)

18 Set goals & objectives
- Goal: broad statement of what the program is trying to accomplish
- Objectives: specific, measurable, time-framed

19 (Cycle of program planning & evaluation diagram, repeated)

20 Choose strategies
- Identify existing strategies/programs: literature (evidence-based? promising practice?), WHO manuals, successes from other communities, regions, countries
- Develop new strategies: logic model (how would it work), Haddon matrix

21 Haddon Matrix (Haddon 1970, Am J Public Health)
Rows (phases): Pre-event, Event, Post-event
Columns (factors): Person, Vehicle/vector, Physical environment, Social environment

22 Three-dimensional Haddon Matrix (Runyan 1998, Injury Prevention)
- Phases: Pre-event, Event, Post-event
- Factors: Person, Vehicle/vector, Physical environment, Social environment
- Decision criteria: Feasibility, Preferences, Stigmatization, Equity, Freedom, Cost, Effectiveness, Other?

23 (Cycle of program planning & evaluation diagram, repeated)

24 Formative evaluation
Questions it answers:
- What is the best way to influence the target population?
- Will the activities reach the people intended, and be understood and accepted by the target population?
- How can activities be improved?
Why it's useful:
- Improves (pilot-tests) program activities before full-scale implementation
- May increase the likelihood that the program or policy will succeed
- May help stretch resources
(Modified from Thompson & McClintock, 2000)

25 (Cycle of program planning & evaluation diagram, repeated)

26 Implementation
- As planned, with attention to detail
- Documented clearly so others can replicate if appropriate

27 (Cycle of program planning & evaluation diagram, repeated)

28 Process evaluation
Purpose is to address:
- What was done?
- How was it implemented?
- How well was it implemented?
- Was it implemented as planned?

29 Process evaluation: examples of questions
- Who carried out the intervention? Was this the appropriate person/group?
- Who supported and who opposed the intervention?
- What methods/activities were used?

30 Process evaluation: why is it useful?
- Allows replication of programs that work.
- Helps understand why programs fail.
(Modified from Thompson & McClintock, 2000)

31 The intervention cannot be a black box (Idea → ? → Outcome). It must be clearly understood.

32 (Cycle of program planning & evaluation diagram, repeated)

33 Impact evaluation
Purpose is to address changes in:
- knowledge
- attitudes
- beliefs/values
- skills
- behaviors/practices

34 Using impact measures for establishing effectiveness
Suppose we have a public safety campaign as our strategy. We need to show Campaign → Behavior → Outcome. If we have already demonstrated that Behavior → Outcome, we simply need to show Campaign → Behavior.

35 Outcome evaluation
Purpose is to address changes in:
- injury events (e.g. frequency, type, pattern)
- morbidity (e.g. frequency, severity, type)
- mortality (e.g. frequency, time to death)
- cost (e.g. direct and indirect)

36 Example: bike helmets
- Interventions: physician counseling of parents, enforcement of helmet law, media campaign
- Impacts: parental attitudes toward child helmet use, purchase of helmets, use of helmets by children
- Outcomes: head injury in bike crashes, deaths from head injury in crashes

37 Evaluation: examples of questions for a local policy on smoke alarms
Did the local policy of smoke alarms in apartments…
- get passed?
- Were people aware of it?
- Did people have access to smoke alarms?
- Did people get them installed properly?
- Do people keep them maintained?
- Did it lead to a reduction in the number or rates of: events (e.g. apartment fires), injuries, deaths, costs (e.g. burn center costs, family burden, property loss)?

38 Evaluation: selection of measures
Quantitative indicators:
- Process
- Impact
- Outcome: health-related, financial

39 Choice of measure or indicator
We need to choose appropriate impact and outcome measures:
- Soft (more difficult to measure) outcomes: perception constructs (fear, insecurity, wellbeing, quality of life); knowledge, attitudes, and behaviors constructs
- Hard outcomes: deaths, hospitalizations, disabilities due to injuries and violence
- Societal impacts: local development indicators
- Economic outcomes: direct ($/£/¥), indirect (DALYs, QALYs, opportunities lost, burdens)

40 Evidence of effectiveness
- Obtain qualitative evidence to complement the quantitative evidence. E.g.: Are multisectoral collaborations and partnerships friendly and functioning well? Is community participation optimal?
- Incorporate process indicators
- Incorporate narratives & testimonials

41 (Cycle of program planning & evaluation diagram, repeated)

42 Dissemination
Dissemination not done well:
- Not attempted
- Not based on research about how to disseminate information to the intended audience
Dissemination done well:
- Defining the audience
- How to access the audience
- How best to communicate the change message to them
- Presentation of clear, straightforward messages

43 Evaluation measures
Evaluation measures lead to evidence of effectiveness, but only if the research and study methodologies, and the statistical analysis methodologies, are appropriate and understandable enough to convince the funders and supporters, the skeptics, the stakeholders, and the community.

44 Research methodology approach: evidence of effectiveness
Obtain quantitative evidence that favors the hypothesis that the intervention is effective, as opposed to the (null) hypothesis that the intervention is not effective. How?
- Experimental study designs: randomized clinical trials, grouped randomized experiments, community-randomized studies
- Quasi-experimental study designs: non-randomized comparative studies, before-after studies
- Observational studies: cohort studies, case-control studies, and comparative cross-sectional studies

45 Randomized controlled trial (RCT) / experiment: strongest evidence
Randomize →
Intervention Group: O X O
Control Group: O O
(O = observation, X = intervention)
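The randomization step in the design above can be sketched in a few lines; the community names, group sizes, and seed here are all hypothetical, for illustration only.

```python
import random

# Simple randomization sketch for a two-arm community trial:
# shuffle the units, then split them into intervention and control.
rng = random.Random(2011)  # fixed seed so the allocation is reproducible
communities = [f"community_{i}" for i in range(1, 11)]

shuffled = communities[:]
rng.shuffle(shuffled)
intervention = sorted(shuffled[:5])
control = sorted(shuffled[5:])
print(len(intervention), len(control))  # 5 5
```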

46 Quasi-experimental designs: qualified evidence
Intervention Group: O X O
Comparison Group: O O
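One common way to analyze a two-group pre/post design like the one above is a difference-in-differences contrast (a technique not named on the slide; the injury rates below are hypothetical):

```python
# Difference-in-differences sketch for an O X O / O O design.
# All rates are hypothetical; replace with observed injury rates.
pre_intervention, post_intervention = 42.0, 30.0  # intervention community
pre_comparison, post_comparison = 40.0, 38.0      # comparison community

change_intervention = post_intervention - pre_intervention  # -12.0
change_comparison = post_comparison - pre_comparison        # -2.0

# Netting out the comparison group's change removes the secular trend.
did_estimate = change_intervention - change_comparison
print(did_estimate)  # -10.0
```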

47 One group pre/post: weak evidence
Intervention Group: O X O

48 One group, multiple pre / multiple post: better, but still weak, evidence
Intervention Group: O O O X O O O O O

49 One group, post only: basically ignorable evidence
Intervention Group: X O

50 Observational designs - cohort study: evidence?
Self-chosen Intervention Group: X … O
Self-chosen Non-intervention Group: … O
(exposure is self-selected rather than assigned; outcomes are observed prospectively)

51 Observational designs - case-control study: evidence?
Cases: X O
Controls: X O
(exposure X is ascertained retrospectively for cases and controls)

52 Observational designs - cross-sectional study: evidence?
Injured: X O
Non-injured: X O
(exposure and outcome are measured at the same time)

53 Statistical analysis methodologies
- Choice is often guided by what has been done previously, what is feasible to do, or what is easy to explain
- Choice should be tailored to the audience & their ability to understand results, but also to the ability of the presenter to explain the methodologies

54 Statistical analysis
- Determined by the research question(s)
- Guided by study design, experimental or observational: group-randomized controlled experiment; non-randomized comparison study; single-site pre/post or surveillance study; retrospective or cross-sectional
- Guided by whether the outcome is studied at a single time point or at multiple time points: time series analyses
- Guided by the audience: visual and descriptive appreciation
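As a minimal sketch of the time-series option above, a segmented (interrupted time series) regression can be fit with ordinary least squares; the monthly series below is synthetic, constructed with a built-in level drop of 8 at the intervention point.

```python
import numpy as np

# Segmented regression for a single-site interrupted time series
# (multiple pre / multiple post observations around an intervention).
t = np.arange(24)                       # 24 monthly observations
level = (t >= 12).astype(float)         # 1 after the intervention starts
trend = np.where(t >= 12, t - 12, 0.0)  # time elapsed since intervention

# Synthetic series: baseline 50, slope +0.5/month, level drop of 8 at t=12.
y = 50 + 0.5 * t - 8 * level + 0.2 * trend

X = np.column_stack([np.ones_like(t, dtype=float), t, level, trend])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# beta[2] estimates the immediate level change at the intervention.
print(round(beta[2], 2))  # -8.0
```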

55 Visual and descriptive analysis - longitudinal time series. Example: Espitia et al (2008) Salud Pública de México

56 Visual and descriptive analysis - comparisons over time (figure example)

57 Statistical analysis - challenge
What we as a field have not done as well as other fields is to draw strength from numbers and develop collective evidence by combining results from multiple studies:
- Systematic reviews (of observational studies)
- Meta-analysis (of experimental & observational studies)
- Meta-regression (of heterogeneous studies)
- Mixed treatment meta-regression (for indirect comparisons)

58 Systematic reviews
A protocol-driven comprehensive review and synthesis of data focusing on a topic or on related key questions:
- formulate specific key questions
- develop a protocol
- refine the questions of interest
- conduct a literature search for evidence
- select studies that meet the inclusion criteria
- appraise the studies critically
- synthesize and interpret the results

59 Example of a systematic review: Shults et al (2001) Am J Prev Med

60 Systematic reviews
Of particular value in bringing together a number of separately conducted studies, sometimes with conflicting findings, and synthesizing their results. To this end, systematic reviews may or may not include a statistical synthesis called meta-analysis, depending on whether the studies are similar enough that combining their results is meaningful.
(Green (2005) Singapore Medical Journal; Zaza et al (2001) Am J Preventive Medicine - motor vehicle)

61 Meta-analysis
- A method of combining the results of studies quantitatively to obtain a summary estimate of the effect of an intervention
- Often restricted to randomized controlled trials
- Recently, the Cochrane Collaboration has been branching out to include both experimental and observational studies in meta-analyses

62 Meta-analysis example: Liu et al (2008) Cochrane Collaboration

63 Meta-analysis
The combining of results should take into account:
- the quality of the studies, assessed by the reciprocal of the variance
- the heterogeneity among the studies, assessed by the variance between studies

64 Meta-analysis - estimation of effect
- The estimate is a weighted average, where the weight of a study is the reciprocal of its variance
- To calculate the variance of a study, one can use either a fixed effects model or a mixed/random effects model
- Fixed effects model: utilizes no information from other studies
- Random effects model: considers variance among and within studies
(Borenstein et al (2009) Introduction to Meta-Analysis)
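The inverse-variance weighted average described above can be sketched directly. The effect sizes and variances below are hypothetical, and the between-study variance is estimated with the DerSimonian-Laird method, one common random-effects estimator assumed here (the slides do not name a specific one):

```python
# Inverse-variance pooling: a study's weight is the reciprocal of its
# variance. Fixed-effect pooling first, then a DerSimonian-Laird
# random-effects step that adds the between-study variance tau^2.
# Effect sizes (e.g. log rate ratios) and variances are hypothetical.
effects = [-0.60, -0.05, -0.45, 0.10]   # per-study effect estimates
variances = [0.04, 0.02, 0.09, 0.05]    # per-study sampling variances

w = [1.0 / v for v in variances]        # fixed-effect weights
fixed_effect_est = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)

# Cochran's Q quantifies heterogeneity among the studies.
Q = sum(wi * (yi - fixed_effect_est) ** 2 for wi, yi in zip(w, effects))
C = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (Q - (len(effects) - 1)) / C)   # between-study variance

w_star = [1.0 / (v + tau2) for v in variances]  # random-effects weights
random_effects_est = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
print(round(fixed_effect_est, 3), round(random_effects_est, 3))  # -0.193 -0.229
```

With heterogeneous studies the random-effects estimate pulls the weights toward equality, so the pooled value moves relative to the fixed-effect one.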

65 Meta-analysis & meta-regression
Dealing with heterogeneity among the studies:
- Decompose the total variance into among-study and within-study components using mixed effects models, to obtain a more precise estimate of the intervention effect
- If there is still residual heterogeneity, expand the mixed effects model to include study-level covariates that may explain some of the residual variability among studies: meta-regression

66 Meta-regression, e.g.:
effect in study j = overall mean + beta_1 * X1 (study variable: EU/USA) + beta_2 * X2 (study variable: population type) + study random effect + random error

67 Meta-analysis
- Standard meta-analytic methods are typically restricted to comparisons of two interventions using direct, head-to-head evidence alone. For example, if we are interested in the Intervention A vs Intervention B comparison, we would include only studies that compare Intervention A versus Intervention B directly.
- Many times we have multiple types of interventions for the same type of problem, and we hardly have head-to-head comparisons
- We may also have multiple-component interventions
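When head-to-head A vs C trials are missing but A vs B and B vs C comparisons exist, an indirect contrast can be formed through the common comparator B (the Bucher adjustment, a basic building block of the mixed treatment comparisons discussed next; the log odds ratios below are hypothetical):

```python
import math

# Indirect comparison through a common comparator B.
# Hypothetical direct estimates on the log odds ratio scale:
lor_ab, var_ab = -0.40, 0.05   # B vs A, with its variance
lor_bc, var_bc = -0.25, 0.08   # C vs B, with its variance

# Effects add on the log scale; variances of independent estimates add too.
lor_ac = lor_ab + lor_bc
var_ac = var_ab + var_bc
se_ac = math.sqrt(var_ac)
print(round(lor_ac, 2), round(se_ac, 3))  # -0.65 0.361
```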

68 Mixed treatment meta-analysis
- Let the outcome variable be a binary response: 1 = positive response, 0 = negative response
- We can record the binomial count r_jk out of a total n_jk at risk on the k-th intervention in the j-th study
- We can then calculate the estimated probability of the outcome (risk of response), p_jk = r_jk / n_jk, for the k-th intervention in the j-th study
(Welton et al 2009, Am J Epid)

69 Mixed treatment meta-analysis
Let each study have a reference standard intervention arm, s_j, with study-specific standard log odds of outcome, mu_j. The log odds ratio, delta_jk, of outcome for intervention k relative to standard s_j is assumed to come from a random effects model with mean log odds ratio d_k - d_sj and between-study standard deviation sigma, where d_k is the mean log odds ratio of outcome for intervention k relative to control (so that d_1 = 0).
(Welton et al 2009, Am J Epid)

70 Mixed treatment meta-analysis
This leads to the following logistic regression model:
logit(p_jk) = mu_j + delta_jk * I(k != s_j), where delta_jk ~ Normal(d_k - d_sj, sigma^2)
(Welton et al 2009, Am J Epid)
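A minimal numeric sketch of a model of this form, plugging in hypothetical values for the study baseline log odds mu_j and the basic parameters d_k, and setting each delta_jk to its mean (i.e., ignoring the between-study random effect):

```python
import math

# Evaluate a mixed-treatment logistic model at hypothetical parameters:
# logit(p_jk) = mu_j + delta_jk for non-reference arms.
def inv_logit(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

mu_j = -1.0                      # study j's baseline log odds (standard arm)
d = {1: 0.0, 2: -0.5, 3: -0.8}   # mean log odds ratios vs intervention 1
s_j = 1                          # study j's reference arm is intervention 1

# With delta_jk at its mean d_k - d_{s_j}, the modeled risks per arm are:
risks = {k: inv_logit(mu_j + (d[k] - d[s_j])) for k in d}
print({k: round(p, 3) for k, p in risks.items()})
```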

71 Mixed treatment meta-analysis - multiple-methods interventions
If we have multiple methods in the i-th intervention, plus multiple times at which the outcome is assessed, the model includes: a study effect, a time effect, effects for components 1 & 2, a study covariable, and an error term.

72 Statistical analysis
Methodology does exist for developing stronger collective evidence and for evaluating the effectiveness of community-based interventions, using different types of study designs and interventions: developing practice-based evidence.

73 Dissemination
- We should not stop at developing the evidence
- We must work alongside economists in developing ways to effectively communicate what works: methodology and cost models do exist for estimating the return on investment
- Money talks!

74 Challenges: evaluation requires
- Integration of evaluation from the beginning
- Appropriate measures that can be collected objectively, without bias, easily, and with completeness
- Appropriate qualitative and process information, to complement the quantitative information
- Concrete and convincing evidence of what aspects work in individual communities
- Formal methodological statistical evaluation of specific elements of programs
- Collective evidence of what common elements of programs work
- Effective dissemination strategies: return on investment

75




