1 What can implementation research offer?

2 What is implementation?
Broadly speaking, implementation focuses on 'closing the gap' between evidence and service delivery / clinical practice.
- Evidence is conceptualised more broadly than empirical research alone.
- Includes a focus on behaviour change and organisational change, and the importance of organisational context.
- Rejects a linear model of research utilisation in favour of recognising a 'complex mobilisation' of knowledge.

3 What is implementation research?
Implementation research '…is the scientific study of methods to promote the systematic uptake of clinical research findings and other evidence-based practices into routine practice, and hence to improve the quality…of health care. It includes the study of influences on healthcare professional and organisational behaviour.'
Eccles et al. (2009) An Implementation Research Agenda. Implementation Science 4:18

4 Impacts of implementation research
- Understanding implementation challenges within healthcare: implementation provides a broad theoretical basis for deconstructing practical problems.
- Providing a (more or less) evidence-based tool box of interventions to enable implementation / de-implementation: strategies and interventions (e.g. guidelines, incentives, facilitation) that might help increase the uptake of evidence, and which themselves have an evidence base of varying strength.
- Increasing the quality and impact of research: embedding implementation theory and evidence within research programmes.
- Creating evidence with greater implementability? Developments in thinking around implementation point to a more sustained and engaged way of organising the interface between how evidence is produced and how it is applied.

5 Underpinning ideas / theories
- Evidence (research) is necessary but not sufficient for improving care or service delivery.
- People (practitioners, managers, patients etc.) are not rational actors.
- Context matters.
- Implementation requires active processes that work with evidence, people and context.
Relevant disciplines and theories include:
- Psychology (e.g. the Theory of Planned Behaviour, the Behaviour Change Wheel)
- Sociology (e.g. Normalisation Process Theory)
- Aesthetics
- Design science

6 Implementation as forethought in clinical trials?
- How can we encourage clinicians to access the evidence we have created?
- What are the barriers and enablers to the utilisation of the evidence we have created?
- How can we increase the implementability of the evidence we intend to create?
The first question reflects a very linear approach to thinking about implementation, placing trialists and clinicians in quite different communities: evidence producers and evidence users. Many studies, including some we have been involved in, have used implementation theory to identify which barriers and enablers may be relevant in a given trial context, and have drawn on models and frameworks to suggest implementation interventions that might help manage them. Reflecting the MRC framework (including its more recent process evaluation guidance) and embedding implementation across trial design, we can begin to think right up front about how to integrate implementation and clinical trial design so as to generate evidence with the potential for greater implementability (i.e. external validity), and consequently greater impact on healthcare and health. Note the changing nature of 'implementation questions' over time.

7 Evidence challenges?
Implementation researchers are interested in addressing two 'evidence challenges':
- What is the evidence base for healthcare policy, programmes and activities? Research evidence, with a recognition that other forms of evidence are also important: experiential, aesthetic, moral and ethical, cultural etc.
- What is the evidence base for getting this evidence where it needs to be to improve healthcare and patient outcomes? Improvement strategies, tools and techniques etc.

8 Some suggestions
We are going to make some suggestions which build on the 'effectiveness-implementation hybrid design' typology offered by Curran et al. in 2012, and offer some reflections on the typology based on current work we are engaged in.

9 Effectiveness-implementation Hybrid Designs
- Type 1: Identifying implementation issues alongside a clinical trial.
- Type 2: Simultaneous testing of clinical and implementation interventions.
- Type 3: Testing implementation interventions, with evaluation of evidence impact in the 'real world'.
Curran et al. (2012) Effectiveness-implementation hybrid designs. Medical Care 50(3).

10 Type 1: Identifying implementation issues alongside a clinical trial.
- Aims: primary, clinical effectiveness; secondary, understanding of implementation issues.
- Units of randomisation: patient / setting (dependent on the clinical intervention's theory of change).
- Comparisons: placebo / treatment as usual.
- Sampling frames: patient, limited restrictions; setting, subsample.
- Evaluation: primary aim, summative and quantitative; secondary, mixed, interpretive and process oriented.
- Measures: primary, clinical and cost effectiveness; secondary, feasibility / uptake / sustainability.
- Challenges: generating buy-in around implementation aims; additional costs and broader teams; epistemological challenges around the nature of organisational context.
In essence, this is the standard process evaluation alongside the randomised trial. It is best suited where there is:
- Strong face validity of the clinical intervention
- Indirect evidence that the clinical intervention works (e.g. in associated but different populations / systems)
- Minimal risk associated with the clinical intervention

11 Type 2: Simultaneous testing of clinical and implementation interventions.
- Aims: co-primary questions, clinical effectiveness and the feasibility of an implementation strategy / intervention.
- Units of randomisation: patient / setting (dependent on the clinical intervention's theory of change), although implementation interventions may not be randomised (e.g. case study).
- Comparisons: placebo / treatment as usual.
- Sampling frames: patient, limited restrictions; setting, consider optimal cases.
- Evaluation: clinical effectiveness, summative and quantitative; implementation, mixed, formative and summative.
- Measures: primary, clinical and cost effectiveness; secondary, fidelity.
- Challenges: generating simultaneous buy-in; balancing implementation and fidelity of the clinical intervention.
This design recognises that promising interventions from efficacy studies can fail to realise impact in real-world investigations of effectiveness, and involves testing interventions in less-than-ideal situations / settings. It is best suited where there is:
- Strong face validity for both the clinical and implementation interventions
- Strong indirect evidence that both are effective
- Minimal risk from both
- Implementation momentum (e.g. a policy direction)
- A supportable implementation intervention
Balancing implementation optimisation and fidelity of the clinical intervention is key.

12 Type 3: Testing implementation interventions, with evaluation of evidence impact in the 'real world'.
- Aims: primary, utility of an implementation intervention / strategy; secondary, clinical outcomes associated with implementation.
- Units of randomisation: patient / setting (dependent on the clinical intervention's theory of change).
- Comparisons: implementation as usual / competing implementation interventions or strategies.
- Sampling frames: setting, unit or system (dependent on the theory of change); patient, limited (if any) exclusions.
- Evaluation: primary aim, mixed method, formative and summative; secondary, quantitative and summative.
- Measures: primary, adoption / fidelity; secondary, clinical and cost outcomes.
- Challenges: data quality for clinical outcomes (may rely on clinical rather than research data standards); balancing implementation and fidelity of the clinical intervention.
The ideal position is that 'effective' interventions are implemented, but this might not always be the case: prevailing health policy may dictate or encourage implementation of a clinical intervention whose effectiveness is, to varying degrees, still in question. It may not be possible to achieve the same degree of methodological rigour around the 'effectiveness' element. The example in the typology is an integrated primary care and mental health programme mandated by policy in the VA; the methods used there, an internal / external facilitation trial with clinical scores of patient outcomes (in this case depression), shone a light on elements of the programme that were not evidence-based. The design is also useful when clinical interventions may be susceptible to change in different settings. It is best suited where there is:
- Face validity of both the clinical and implementation interventions
- Indirect evidence of effectiveness
- Minimal risk
- Strong implementation momentum (e.g. policy diktats)
- A feasible implementation intervention

13 Additional ideas from our implementation work
- Preparing the ground.
- Balancing the optimisation of implementation and fidelity of the clinical intervention.
- Linking formative evaluation and 'learning effects'.
- Integrating implementation within intervention design.
Preparing the ground: a starting point for the Hybrid Type 2 design is whether we should invest in developing context before simultaneous testing begins. We suggest starting with initial preparation of both context and intervention staff (baseline skill and competence etc.). Implementation shines a light on the contextual factors that may shape delivery, so there is a need to invest in preparatory work and to ask what the critical thresholds are for context and intervention staff competence.
Balancing implementation and fidelity: both Type 2 and Type 3 designs raise the issue of balancing implementation and fidelity of the clinical intervention. Other ideas from implementation research, such as distinguishing core and peripheral intervention elements, can help manage this balance (how pragmatic is pragmatic?).
Learning effects: learning effects are often poorly conceptualised and investigated, and the potential for formative evaluation in Type 2 and 3 designs may help shine a light on this issue. This is particularly important in large studies conducted over time, where intervention staff may 'learn' and where motivation and commitment may change.
Finally, the typology neglects the opportunity to consider implementation issues within the design of the clinical trial intervention itself.

14 Coproducing interventions?
- How can we encourage clinicians to access the evidence we have created?
- What are the barriers and enablers to the utilisation of the evidence we have created?
- How can we increase the implementability of the evidence we intend to create?
There are opportunities to think about strategies that engage stakeholders, from relevant constituencies and systems, in the design of complex interventions. Various approaches support this: the design sciences referred to earlier provide an ordered and structured way of engaging in intervention development and prototype testing, and the experience-based co-design work advocated by the King's Fund offers another systematic approach. Key themes are co-producing interventions, working collaboratively, managing fidelity, and adapting trial design (the intervention), again reflecting the changing nature of 'implementation questions' over time.

15 Emerging questions
- How can we encourage further debate on the interface between implementation research and clinical trials?
- What is the potential for implementation research, evidence and theory to be incorporated into adaptive (and other) trial designs?
- What are the issues around maximising implementation, and balancing fidelity with trial validity?

