Mark Petticrew Faculty of Public Health and Policy London School of Hygiene & Tropical Medicine Complex interventions ‑ Evaluating them & synthesising evidence on their impacts
Implications of complexity for… 1. Refining the research question 2. Thinking about what sort of public health evidence we need – what sort of research we do 3. Social determinants of health and health inequalities 4. Systematic reviews of complex interventions
Why the recent interest in complex interventions in public health? #1 Evidence-based medicine (1990s)… Evidence-based policy and the need to know “what works”… i.e. the outcomes of interventions to improve public health… Various reports from the Acheson Inquiry (1998) onwards lament the lack of evidence on “what works” Growing consensus around the need for better “evidence” on what to do…
Another impetus: “We were saying ‘what counts is what works’, and by that we meant outputs… …it was bunkum as a piece of policy”
The evidence is weak… poor…missing….focussed on description rather than intervention…lost opportunities…etc
So why are bits of the public health jigsaw missing? Many reasons…
Five barriers to public health intervention research Unpopular with researchers (often involves collecting new, representative data in “difficult to access” communities) Methodologically difficult (you don’t control the intervention; it never happens on time; knock-on effects on funding; difficulty of finding controls/comparison areas; ethics) Seen as “biomedical” and “evidence-based” – and the idea of a hierarchy of evidence is misused, widely misunderstood, and a major turn-off for some researchers No guarantee that the findings will be useful, or used: At least 293 barriers to using evidence have been identified (Cabana, 1999) Politically unacceptable or at least unhelpful
“Certainly in British politics, the power of a story beats almost anything.” (Policy advisor, UK) This suggests that the ethical and methodological barriers are only part of the story
Transport: An example A 2004 systematic review sought to identify interventions to promote a “modal shift” in transport choice (interventions to influence people to walk and cycle, rather than using their cars) (Ogilvie et al., 2004)* These ranged from tailored self-help materials delivered to households and individuals to encourage them to leave their car at home, to city-wide initiatives such as publicity campaigns, new cycle routes, financial incentives, and traffic management schemes *BMJ 2004;329(7469):763
The research question What interventions are effective in promoting a shift from using cars towards using physically active modes of transport in urban populations in developed countries?
Types of intervention “Health promotion” activities (agents of change, campaigns, behaviour change programmes) Engineering measures (Bike infrastructure, traffic restraint) Financial dis/incentives (parking charges, road user charges) Providing alternative services (New railway station) Complex urban transport policies
Summary Targeted behaviour change: can be effective Engineering and financial measures: little evidence, or small effects
Study designs, by type of intervention
Randomised controlled trial: 3 studies
Panel survey: 13
Repeated cross-sectional survey: 17
Retrospective or after-only survey: 11
Case study, or uncertain: 20
(Interventions ranged from individual-level change to city-wide change)
An example of the “inverse evidence law” (Nutbeam)
The inverse evidence law The best evidence we have is about the most simple interventions. We have less, or weaker evidence about complex interventions – such as policies But policymakers are often most interested in the complex questions: “Many policy choices cannot be broken down into discrete interventions evaluated in a context-free manner.” (Kouri, 2009)
The causes of poor health and health inequalities are complex, so the remedies will also be complex Move away from downstream (tackling individual health behaviours) towards the upstream causes
Why the current interest in complexity in public health...#2 It recognises that the development of social and public health interventions is not a linear process, unlike drug development
Craig et al. Developing and evaluating complex interventions: the new Medical Research Council guidance. Br Med J 2008;337:a1655
Maybe the answer lies in complexity…? We need to know what works, but nothing seems to work…the evidence is “poor” Individual behaviour change is not enough… Traditional epidemiology (trials etc) isn’t always good at evaluating change in complex systems…
What are complex interventions? “Complex interventions are …interventions that are not drugs or surgical procedures, but have many potential ‘active ingredients’... A complex intervention combines different components in a whole that is more than the sum of its parts.” (Oakley et al., BMJ 2006)
Number of interacting components Number and difficulty of behaviours required by those delivering or receiving the intervention Number of groups or organisational levels targeted by the intervention Number and variability of outcomes Degree of flexibility or tailoring of the intervention permitted (Non-standardisation/reproducibility; Hawe, JECH 2004) Key features of complex interventions (Craig et al. Br Med J 2008;337:a1655; the MRC guidance on complex interventions)
Intervention flexibility “In school health… non-standard interventions cannot be compartmentalised into a predetermined number and sequence of activities… Characterised by activities like capacity building and organisational change, these interventions have specific, theory-driven principles that ensure that non-standard interventions, [which take] different forms in different contexts, conform to standard processes.” Hawe et al. (2004) Complex interventions: how “out of control” can a randomised controlled trial be? JECH 2004
Alliance for Health Policy & Systems Research (2009) “‘Complexity’ arises from a system’s interconnected parts, and ‘adaptivity’ from its ability to communicate and change based on experience... The high degree of connectivity means that a change in one subsystem affects the others... They are governed by feedback (a positive or negative response that may alter the intervention or expected effects), and are non-linear (relationships within a system cannot be arranged along a simple input–output line).”
Theory underlying the intervention Mechanisms and pathways by which the intervention affects (or may affect) outcomes Lack of linear, well-evidenced causal pathways linking the intervention and the health outcomes Mediators (causal mechanisms) and moderators (characteristics of populations, settings etc) of outcomes Aspects of context and setting and how they interact with the intervention Feedback loops, synergies, phase effects Other aspects of complex interventions to consider …
Examples from the literature P4P – a pay-for-performance intervention to improve health service quality An integrated heart care programme involving trials of primary care interventions for risk reduction in cardiovascular disease Community-based health promotion Stroke units Strategies for implementing clinical guidelines Community-based screening for Chlamydia trachomatis infection An adolescent sexual health intervention in rural Zimbabwe …and so on
Implications of complexity for… 1. Defining the research question
Implications for evaluations of complex interventions Getting the research question right is even more important than usual. Population: (Individuals, families, communities?) Intervention: What is the intervention? Delivered at what level? Do we lump or split? (Broad vs narrow questions; components vs whole systems) Comparison: Are controls possible? Are RCTs possible? (RCTs of area-based interventions, or of changes in organisations/systems, are few and far between…) Outcomes: What outcomes? At what levels? Include health and non-health outcomes
Going beyond PICO Context: How do we identify context? How do we assess how it interacts with the intervention? Processes: Where do we find this information, and how do we integrate it? Where does theory fit in? It helps describe the causal pathways linking the intervention through intermediate outcomes to the final health and other outcomes; helps us understand the purpose and assumptions underlying the “components” of interventions and how they interact; and informs evidence synthesis
The main problem: getting the question right Is the research question about the whole system, or about components within the system? Even if the intervention is complex, it doesn’t necessarily follow that all of that complexity needs to be analysed, unless that is a legitimate question – e.g. if your research users want to know about it. And sometimes a simpler question is appropriate
An example: Urban Regeneration Urban regeneration programmes as an example of a complex intervention Which can be seen as being made up of other interventions (housing, environmental change, employment and training opportunities, transport improvements etc) Each of which can in turn be broken down further into component interventions… …which affect numerous outcomes simultaneously, which are prioritised differently by different stakeholders These interventions are not simply “components”; they add to, catalyse and interact with each other The outcomes are experienced at different levels…
…and the “level” of those outcomes also varies
City-level outcomes (external perceptions of the city) – fewer RCTs
Community-level outcomes (networks, well-functioning communities)
Family-level outcomes
Individual-level outcomes (health, health behaviours) – more RCTs
Many evaluations have taken the opposite approach – treating urban regeneration as a “simple” intervention How much did we spend on it? (£10bn over 20 years in the UK to 2005) What did we get in return for that investment? (in terms of health outcomes, jobs, new houses) This is a perfectly reasonable approach to finding out the overall result of investment in regeneration programmes (as a whole) Complex interventions are often amenable to both simple and more analytic approaches, as appropriate – it depends on the research question Thomson H, et al. Do urban regeneration programmes improve public health and reduce health inequalities? A synthesis of the evidence from UK policy and practice. J Epidemiol Community Health 2006;60(2)
Implications of complexity for… 2. Public health evidence
Evaluations of complex social interventions can involve… Randomised controlled trials; other trial designs Quasi-experimental study designs, using control/comparison groups/areas where appropriate Uncontrolled studies (e.g. time series analysis, before-and-after studies) – what we know about the effects of smoking bans comes from interrupted time series (ITS) and uncontrolled before-and-after studies “Case studies”; qualitative research; other types of data Depending on the research question
Mechanisms and pathways by which the intervention brings about the desired intermediate and final outcomes (including unintended adverse effects) Mediators (causal mechanisms) and moderators (characteristics of studies, populations, settings etc) Aspects of context and setting Feedback loops, synergies, phase effects (Maybe) Theory underlying the intervention …Depending on the research question Some things it might be helpful to know about (when looking for evidence of the effects of complex interventions)
Where do we find all this? It may be helpful to think of the sources of complexity (rather than complexity itself), and map each of these onto specific sources of evidence
Mapping aspects of complexity to types of evidence and study designs Source of complexity: Multiple components; interactions between components Evidence needed: Evidence of the independent effects of components of the intervention, and of interactions between those components. This may derive from quantitative or qualitative data. Studies with factorial designs may also explore these effects. Individual studies with different configurations of components may be included in a review, allowing indirect comparisons between studies. Meta-regression may also be of value.
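The meta-regression mentioned above can be sketched in a few lines: effect sizes from several studies are regressed on indicators for which components each study's intervention included, weighting each study by the inverse of its variance. This is a minimal illustration only – the study data, component labels, and effect sizes below are entirely invented.

```python
# Hypothetical sketch of fixed-effect meta-regression on intervention components.
# All numbers are invented for illustration; real analyses would use dedicated
# meta-analysis software and consider random effects.
import numpy as np

# Design matrix: each row is one study -> [intercept, has component A, has component B]
X = np.array([
    [1, 0, 0],
    [1, 1, 0],
    [1, 0, 1],
    [1, 1, 1],
    [1, 1, 0],
    [1, 0, 1],
], dtype=float)

# Observed effect sizes (e.g. standardised mean differences) and their variances
effects = np.array([0.05, 0.30, 0.20, 0.48, 0.28, 0.22])
variances = np.array([0.04, 0.03, 0.05, 0.04, 0.03, 0.05])

# Inverse-variance weighted least squares: solve (X'WX) beta = X'Wy
W = np.diag(1.0 / variances)
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ effects)

print("baseline effect:", round(beta[0], 3))
print("added effect of component A:", round(beta[1], 3))
print("added effect of component B:", round(beta[2], 3))
```

The coefficients estimate how much each component adds to the pooled effect – exactly the kind of indirect, between-study comparison the slide describes.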
Feedback loops Source of complexity: Feedback loops Evidence needed: Evidence of feedback loops may derive from qualitative studies carried out alongside trials, or may be described in standalone qualitative research. Longitudinal studies carried out as part of process evaluations may also be of value.
Phase transitions Source of complexity: Phase transitions Source of evidence: Some longitudinal element is necessary to identify these. For example, changes in the direction or size of effects over time may be observed in studies with multiple data points (e.g., ITS studies). Qualitative data may also be available to describe phase changes.
Source of complexity: Multiple (health and non-health) outcomes Source of evidence: Any type of evaluative study – qualitative studies may also show the range and direction of effects. The Cochrane Handbook: “it is important to include all outcomes that are likely to be important to users, but overall conclusions are more difficult to draw if there are multiple analyses”. It is important to state in the protocol which analyses and outcomes are of interest. Outcomes should be classified in advance as primary and secondary outcomes
Source of complexity: Flexibility/non-standardisation of implementation (Hawe et al., 2004) Source of evidence: Process evaluations; studies describing implementation; policy documents and other sources (Shepperd, 2009)
Source of complexity: Effects at different levels Source of evidence: Cluster RCTs may provide outcome data at both the cluster and the individual level; studies may collect data from individuals about effects not just on themselves, but on their families, communities etc. External data sources (e.g. routine data) may show effects at these levels. Qualitative studies have also been used to explore the effects of interventions at multiple levels.
Etc Not all relevant sources are “scientific” sources
Shepperd et al. (PLoS Med 2009)
Occam’s razor (explanations should be as complex as they need to be, and no more)* Evaluations (and systematic reviews) should also be as complex as they need to be, and no more What dictates the choice of explanation (simple or complex)? …The research question, and …the end user *after William of Occam
3. “What’s all this got to do with the social determinants of health?” The causes of health inequity are complex So are the solutions
Logic model for interventions
INTERVENTIONS: 1. Improve built environment; 2. Other interventions, e.g. policing; 3. Other interventions, e.g. social cohesion
MECHANISMS: 4. Reduce crime; 5. Improve social environment; 6. Reduce fear of crime
OUTCOMES: 7. Improve health behaviours; 8. Improve physical health; 9. Improve mental health and social relationships
Logic models may help to clarify the question and relevant mechanisms Anderson LM et al. Using logic models to capture complexity in systematic reviews. Research Synthesis Methods, 2011; 2(1):33-42.
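A logic model like the one above is essentially a small directed graph, and representing it that way makes the causal pathways explicit. In this sketch the node names are taken from the slide, but the edges are an illustrative reading of the diagram (the arrows are not preserved in the transcript), not the authors' actual model.

```python
# Illustrative only: the logic model as a directed graph. Node names come from
# the slide; the edges are plausible pathways assumed for this example.
model = {
    "Improve built environment": ["Reduce crime", "Improve social environment"],
    "Other interventions e.g. policing": ["Reduce crime"],
    "Other interventions e.g. social cohesion": ["Improve social environment"],
    "Reduce crime": ["Reduce fear of crime"],
    "Improve social environment": [
        "Improve health behaviours",
        "Improve mental health and social relationships",
    ],
    "Reduce fear of crime": ["Improve mental health and social relationships"],
    "Improve health behaviours": ["Improve physical health"],
    "Improve physical health": [],
    "Improve mental health and social relationships": [],
}

def pathways(graph, start, goal, path=None):
    """Enumerate every causal pathway from an intervention node to an outcome node."""
    path = (path or []) + [start]
    if start == goal:
        return [path]
    found = []
    for nxt in graph.get(start, []):
        found.extend(pathways(graph, nxt, goal, path))
    return found

for p in pathways(model, "Improve built environment", "Improve physical health"):
    print(" -> ".join(p))
```

Enumerating pathways this way is one concrete use of a logic model in a review: it identifies the intermediate outcomes for which evidence should be sought.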
Unpredictable, unintended adverse effects A characteristic of complex interventions; interventions to improve public health may cause unintended, unpredictable adverse consequences
Evaluation of increasing the number of police on the beat to reduce crime and increase public safety Result: The public felt less safe, and reported more dissatisfaction with the police (Crawford and Lister, 2003)
Another unintended adverse effect of public health interventions may be the inadvertent widening of inequalities
Macintyre (2007) “Disadvantaged groups tend to be harder to reach, and find it harder to change behaviour.” A dental health education project in Scotland widened inequalities in dental health because it was more successful among higher-SES groups. A mass media campaign intended to reduce socio-economic differences in women’s use of folic acid to prevent neural tube defects resulted in more marked social class differences in use than before the campaign
Finally, complex interventions require more complex approaches to identifying and synthesising evidence... “Generating evidence on what works to reduce health inequities is a complex process... Understanding the impact that context has on health inequities and the effectiveness of interventions requires a rich evidence base that includes both qualitative and quantitative data. Evidence needs to be judged on fitness for purpose – that is, does it convincingly answer the question asked – rather than on the basis of strict traditional hierarchies of evidence.”
Consideration of complexity moves public health away from the idea that there is a hierarchy of evidence, and gold standards which often cannot be met, towards a recognition that real world interventions are “messy” and so the evaluation methods need to be flexible
Implications for systematic reviews of complex interventions
Implications of complexity for… 1. Defining the review question
Getting the review question right is even more important than for other types of review. Population: (Individuals, families, communities?) Intervention: What is the intervention? Delivered at what level? Do we lump or split? (Broad vs narrow questions; components vs whole systems) Comparison: Are there likely to be any RCTs? What if there are not even any controlled trials? (RCTs of area-based interventions, or changes in organisations/systems are few and far between…) Outcomes: What outcomes? At what levels? Health vs non-health outcomes
Is the question about the whole system, or components within the system? Even if the intervention is complex, it doesn’t necessarily follow that all that complexity needs to be analysed unless that is a legitimate question for the review … E.g. if the review’s users want to know about it
The review protocol A detailed protocol is even more important for reviews of complex interventions PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses)
Implications of complexity for… 2. Inclusion/exclusion criteria
Implications for methods/inclusion criteria Feedback loops… Synergies/interactions… Phase changes… Multiple components… Multiple outcomes… Multiple levels… Flexibility/tailoring… Moderating effects of context Theory
Mechanisms and pathways by which the intervention brings about the desired intermediate and final outcomes (including unintended adverse effects) Mediators (causal mechanisms) and moderators (characteristics of studies, populations, settings etc) Aspects of context and setting Feedback loops, synergies, phase effects (Maybe) Theory underlying the intervention …Depending on the research question Some things it might be helpful to search for evidence on (when doing SRs of complex interventions)
In a nutshell
1. Get the question right: What question(s) do you – or your users – really want the review to answer? Is the research question about outcomes? Processes? Causal pathways? All of these? Is it concerned with the whole system, or with components within the system?
2. If you (and your users) really are interested in aspects of complexity, identify the key sources of complexity in the intervention
3. Map each source of complexity to the types of evidence you need to find (outcome evaluations, qualitative evidence etc)
4. Search for, appraise and synthesise that evidence
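Steps 2–3 above amount to a lookup from sources of complexity to evidence types. As a sketch, the mapping from the earlier slides can be written as a small table; the short key names below are my own paraphrases, not terms from the talk.

```python
# Sketch of the complexity-to-evidence mapping from the earlier slides.
# Key names are paraphrased; the evidence types follow the slide text.
EVIDENCE_FOR = {
    "multiple components": ["factorial trials", "indirect comparisons",
                            "meta-regression", "qualitative data"],
    "feedback loops": ["qualitative studies alongside trials",
                       "longitudinal process evaluations"],
    "phase transitions": ["interrupted time series", "qualitative data"],
    "multiple outcomes": ["any evaluative study", "qualitative studies"],
    "flexible implementation": ["process evaluations", "policy documents"],
    "effects at different levels": ["cluster RCTs", "routine data",
                                    "qualitative studies"],
}

def evidence_needed(sources):
    """Given the identified sources of complexity, list the evidence types to search for."""
    needed = []
    for source in sources:
        for evidence in EVIDENCE_FOR.get(source, []):
            if evidence not in needed:  # de-duplicate while preserving order
                needed.append(evidence)
    return needed

print(evidence_needed(["feedback loops", "phase transitions"]))
```

Such a table, agreed in the protocol, makes the search strategy (step 4) auditable: each included study type can be traced back to the aspect of complexity it is meant to illuminate.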
A final few sceptical thoughts: 1. A sceptical view of creativity, innovation, evidence and public health We are frequently urged to develop innovative and creative approaches to developing public health interventions This may be related to the “weak” evidence base – creativity and innovation translate on the ground into “think up something new, with no clear goals, based on no evidence, unevaluated and unevaluatable” Initiative fatigue; evangelism; nothing is learned; nothing cumulates
Health Action Zones Sought to improve health by promoting healthy lifestyles, improving employment, housing, education and tackling substance abuse (among other things)
“Health Action Zones were conceived and implemented too hastily, were too poorly resourced and were provided with insufficient support and clear direction to make a significant contribution to reducing health inequalities in the time that they were given….
“HAZs were born at a time when anything seemed possible for a New Labour Government desperate to make things work, and quickly. But the tide of enthusiasm for change outran the capacity to deliver it. Too many hugely ambitious, aspirational targets were promulgated. The pressure put on local agents to produce ‘early wins’ was debilitating. A sense of disillusionment began to set in relatively early in their lifespan, and HAZs soon lost their high profile as the policy agenda filled with an ever-expanding list of new initiatives to transform public services and promote social justice.”
“…this was supposed to be a seven-year initiative, launched by one secretary of state, dramatically changed by the next, abandoned by the third, subject to different parliamentary and political timetables, where guidance from the centre was not clear, [or was] …contradictory”
2. A sceptical view of complexity... “Complexity is a strategy used by professional elites to maintain control. Proclaiming that a problem is complex is shorthand for saying that you have no role in solving it.” – Ian Roberts & Phil Edwards, “The Energy Glut”, 2011 Just because the problem is complex doesn’t mean that an equally complex explanation or analysis is always helpful In public health, “complexity” is often used simply as a metaphor, with no practical applications
Other possible subtexts... “Complex” = “my research is more challenging than yours, and I eschew your rigid, positivistic, biomedical views of evidence” “Complex” = publishable, fundable (Long may it continue…)
What is “context”? Complex interventions are often said to be context-dependent – but what is “context”? “the social, economic, political, & organisational characteristics of the setting in which interventions take place” (Penny Hawe) “An ecological perspective encompasses...physical, social, cultural, and historical aspects of context (including trends at the local and global level such as globalisation, urbanisation, and large scale environmental change) as well as attributes and behaviours of persons within”. (McLaren & Hawe, 2005)