What can implementation research offer?

What is implementation?
Broadly speaking, implementation focuses on 'closing the gap' between evidence and service delivery / clinical practice.
- Evidence is conceptualised more broadly than empirical research alone.
- There is a focus on behaviour change and on organisational change, and organisational context is important.
- A linear model of research utilisation is rejected in favour of recognising a 'complex mobilisation' of knowledge.

What is implementation research?
Implementation research '…is the scientific study of methods to promote the systematic uptake of clinical research findings and other evidence-based practices into routine practice, and hence to improve the quality…of health care. It includes the study of influences on healthcare professional and organisational behaviour.'
Eccles et al. (2009) An Implementation Research Agenda. Implementation Science 4:18

Impacts of implementation research
- Understanding implementation challenges within healthcare. Implementation provides a broad theoretical basis for deconstructing practical problems.
- Providing a (more or less) evidence-based toolbox of interventions to enable implementation / de-implementation. Implementation offers a toolbox of strategies and interventions (e.g. guidelines, incentives, facilitation) which may help increase the uptake of evidence, and which themselves have an evidence base of varying strength.
- Increasing the quality and impact of research. Embedding implementation theory and evidence within research programmes, and creating evidence with greater implementability: developments in thinking around implementation point to a more sustained and engaged way of organising the interface between how evidence is produced and how it is applied.

Underpinning ideas / theories
- Evidence (research) is necessary but not sufficient for improving care or service delivery.
- People (practitioners, managers, patients etc.) are not rational actors.
- Context matters.
- Implementation requires active processes that work with evidence, people and context.
Theoretical resources are drawn from:
- Psychology (e.g. the Theory of Planned Behaviour, the Behaviour Change Wheel)
- Sociology (e.g. Normalisation Process Theory)
- Aesthetics
- Design science

Implementation as forethought in clinical trials?
- How can we encourage clinicians to access the evidence we have created?
- What are the barriers and enablers to the utilisation of the evidence we have created?
- How can we increase the implementability of the evidence we intend to create?
The first question reflects a very linear approach to thinking about implementation, placing trialists and clinicians in quite different communities: evidence producers and evidence users. Many studies, including some we have been involved in, have used implementation theory to identify which barriers and enablers may be relevant in a given trial context, and have drawn on models and frameworks to suggest implementation interventions that might help manage them. Reflecting the MRC framework (including its more recent process evaluation guidance), if we embed implementation across trial design we can begin to think right up front about how to integrate implementation and clinical trial design, generating evidence with the potential for greater implementability (i.e. external validity) and, consequently, greater impact on healthcare and health. Note the changing nature of these 'implementation questions' over time.

Evidence challenges?
Implementation researchers are consequently interested in two 'evidence challenges':
- What is the evidence base for healthcare policy, programmes and activities? Primarily research evidence, but with a recognition that other forms of evidence are important: experiential, aesthetic, moral and ethical, cultural etc.
- What is the evidence base for getting this evidence where it needs to be to improve healthcare and patient outcomes? Improvement strategies, tools and techniques etc.

Some suggestions
We are going to make some suggestions which build on the 'effectiveness-implementation hybrid design' typology offered by Curran et al. in 2012, and offer some reflections on the typology based on our current work.

Effectiveness-implementation hybrid designs
- Type 1: Identifying implementation issues alongside a clinical trial.
- Type 2: Simultaneous testing of clinical and implementation interventions.
- Type 3: Testing implementation interventions, with evaluation of evidence impact in the 'real world'.
Curran et al. (2012) Effectiveness-implementation hybrid designs. Medical Care 50(3): 217-226.
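
To make the typology's dimensions easier to compare at a glance, the sketch below encodes the three types as a small data structure. This is our illustration only, not anything from Curran et al.; the field names are our own shorthand for the dimensions elaborated on the following slides.

```python
# Illustrative only: a compact encoding of the Curran et al. (2012)
# typology; field names are our own shorthand, not the paper's terms.
from dataclasses import dataclass

@dataclass
class HybridDesign:
    hybrid_type: int
    primary_focus: str      # what the design primarily tests
    secondary_focus: str    # what it observes alongside
    primary_measure: str

TYPOLOGY = [
    HybridDesign(1, "clinical effectiveness",
                 "understanding of implementation issues",
                 "clinical and cost effectiveness"),
    HybridDesign(2, "clinical effectiveness AND implementation feasibility",
                 "(co-primary)",
                 "clinical and cost effectiveness"),
    HybridDesign(3, "utility of an implementation strategy",
                 "associated clinical outcomes",
                 "adoption / fidelity"),
]

for d in TYPOLOGY:
    print(f"Type {d.hybrid_type}: tests {d.primary_focus}; "
          f"primary measure: {d.primary_measure}")
```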

Type 1: Identifying implementation issues alongside a clinical trial.
- Aims. Primary: clinical effectiveness. Secondary: understanding of implementation issues.
- Units of randomisation: patient / setting (dependent on the clinical intervention's theory of change).
- Comparisons: placebo / treatment as usual.
- Sampling frames. Patient: limited restrictions. Setting: subsample.
- Evaluation. Primary aim: summative, quantitative. Secondary: mixed, interpretive, process oriented.
- Measures. Primary: clinical and cost effectiveness. Secondary: feasibility / uptake / sustainability.
- Challenges: generating buy-in around implementation aims; additional costs and broader teams; epistemological challenges around the nature of organisational context.
In effect this is a standard process evaluation running alongside the randomised trial. It is suited to situations where there is:
- strong face validity of the clinical intervention;
- indirect evidence that the clinical intervention works (e.g. in associated but different populations / systems);
- minimal risk associated with the clinical intervention.

Type 2: Simultaneous testing of clinical and implementation interventions.
- Aims. Co-primary questions: clinical effectiveness; feasibility of an implementation strategy / intervention.
- Units of randomisation: patient / setting (dependent on the clinical intervention's theory of change), although implementation interventions may not be randomised (e.g. case study); see the design-effect sketch after this slide.
- Comparisons: placebo / treatment as usual.
- Sampling frames. Patient: limited restrictions. Setting: consider optimal cases.
- Evaluation. Clinical effectiveness: summative, quantitative. Implementation: mixed, formative and summative.
- Measures. Primary: clinical and cost effectiveness. Secondary: fidelity.
- Challenges: generating simultaneous buy-in; balancing implementation and fidelity of the clinical intervention.
The rationale is a recognition that promising interventions (from efficacy studies) often fail to realise impact in real-world investigations of effectiveness because they are tested in unideal situations / settings. The design is suited to situations where there is:
- strong face validity for both the clinical and the implementation intervention;
- strong indirect evidence that both are effective;
- minimal risk from both;
- implementation momentum (e.g. a policy direction), with an implementation intervention that is supportable.
Balancing implementation optimisation against fidelity of the clinical intervention is key. Two things can be added to this: other ideas from implementation, such as distinguishing core and peripheral intervention elements to manage that balance (and asking how pragmatic is pragmatic?); and a question about the starting point for a Type 2 design: should we invest in developing context before simultaneous testing begins? We suggest starting with initial preparation of both context and intervention staff (baseline skill, competence etc.).
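
As an aside from us (the slides give no formulas): when randomisation is at the setting rather than the patient level, sample-size calculations are usually inflated by the design effect, DEFF = 1 + (m - 1)ρ, where m is the average cluster size and ρ is the intracluster correlation coefficient. A minimal sketch with entirely illustrative numbers:

```python
# Design-effect adjustment for setting-level (cluster) randomisation.
# All numbers are assumptions for illustration, not from the talk.

def design_effect(cluster_size: float, icc: float) -> float:
    """DEFF = 1 + (m - 1) * rho, assuming equal-sized clusters."""
    return 1.0 + (cluster_size - 1.0) * icc

n_individual = 300    # patients needed under individual randomisation (assumed)
m = 20                # average patients per setting (assumed)
icc = 0.05            # intracluster correlation coefficient (assumed)

deff = design_effect(m, icc)
n_clustered = n_individual * deff
print(f"DEFF = {deff:.2f}; inflated total ~= {n_clustered:.0f} patients "
      f"({n_clustered / m:.0f} settings of {m})")
```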

Type 3: Testing implementation interventions, with evaluation of evidence impact in the 'real world'.
- Aims. Primary: utility of an implementation intervention / strategy. Secondary: clinical outcomes associated with implementation.
- Units of randomisation: patient / setting (dependent on the clinical intervention's theory of change).
- Comparisons: implementation as usual / competing implementation interventions or strategies.
- Sampling frames. Setting: unit or system (dependent on theory of change). Patient: limited (if any) exclusions.
- Evaluation. Primary aim: mixed method, formative and summative. Secondary: quantitative, summative.
- Measures. Primary: adoption / fidelity. Secondary: clinical and cost outcomes (see the simulation sketch after this slide).
- Challenges: data quality for clinical outcomes (these may rely on clinical rather than research data standards); balancing implementation and fidelity of the clinical intervention.
The ideal position is that only 'effective' interventions are implemented, but this is not always the case: prevailing health policy may dictate or encourage implementation of a clinical intervention whose effectiveness remains, to varying degrees, in question. It is important to note that it may not be possible to achieve the same degree of methodological rigour around the 'effectiveness' element. The example in the typology is an integrated primary care and mental health programme mandated by policy in the VA: an internal / external facilitation trial used clinical scores of patient outcomes (in this case depression) to shine a light on elements of the programme that were not evidence-based. The design is also useful where clinical interventions may be susceptible to change across different settings. It requires:
- face validity of both the clinical and the implementation intervention;
- indirect evidence for both;
- minimal risk;
- strong implementation momentum (e.g. policy diktats);
- a feasible implementation intervention.
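
To make the inversion of outcomes in a Type 3 design concrete (adoption as the primary outcome, clinical scores as secondary), here is a minimal simulation sketch loosely echoing the facilitation example above. Every parameter is our own invented assumption, purely for illustration:

```python
# Illustrative Type 3 sketch: two implementation strategies (internal vs
# external facilitation) compared on adoption, with depression scores as
# a secondary outcome. Every parameter below is an invented assumption.
import numpy as np

rng = np.random.default_rng(42)
n_sites = 30    # settings randomised to each arm (assumed)

# Primary outcome: per-site adoption, the share of 100 eligible patients
# who actually receive the programme (assumed true rates 0.45 vs 0.60).
adoption_internal = rng.binomial(100, 0.45, n_sites) / 100
adoption_external = rng.binomial(100, 0.60, n_sites) / 100

# Secondary outcome: mean depression score per site (lower is better),
# assumed to improve modestly as adoption rises.
dep_internal = 14 - 4 * adoption_internal + rng.normal(0, 1.5, n_sites)
dep_external = 14 - 4 * adoption_external + rng.normal(0, 1.5, n_sites)

print(f"Adoption: internal {adoption_internal.mean():.2f} "
      f"vs external {adoption_external.mean():.2f}")
print(f"Depression score: internal {dep_internal.mean():.1f} "
      f"vs external {dep_external.mean():.1f}")
```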

Additional ideas from our implementation work
- Preparing the ground. Implementation shines a light on the contextual factors that may shape delivery, so there is a need to invest in preparatory work before simultaneous testing begins: developing context, preparing intervention staff, and asking what the critical thresholds are for context and for staff skill and competence.
- Balancing the optimisation of implementation against fidelity of the clinical intervention. Both Type 2 and Type 3 designs raise this issue; distinguishing core and peripheral intervention elements is one way to manage the balance (and again: how pragmatic is pragmatic?).
- Linking formative evaluation and 'learning effects'. Learning effects are often poorly conceptualised and investigated. Formative evaluation in Type 2 and 3 designs may help shine a light on this issue, which is particularly important in large studies conducted over time, where intervention staff may 'learn' and their motivation and commitment may change.
- Integrating implementation within intervention design. The typology neglects the opportunity to consider implementation issues within the design of the clinical trial intervention itself.

Coproducing interventions?
- How can we encourage clinicians to access the evidence we have created?
- What are the barriers and enablers to the utilisation of the evidence we have created?
- How can we increase the implementability of the evidence we intend to create?
The third question opens up opportunities for strategies that engage stakeholders, from relevant constituencies and systems, in the design of complex interventions. There are various ways to do this: the design sciences referred to earlier provide an ordered and structured way of engaging in intervention development and prototype testing, and the experience-based co-design work advocated by the King's Fund offers another systematic approach. Co-producing interventions means working collaboratively, managing fidelity, and adapting trial design (and the intervention itself). Again, note the changing nature of 'implementation questions' over time.

Emerging questions
- How can we encourage further debate on the interface between implementation research and clinical trials?
- What is the potential for implementation research, evidence and theory to be incorporated into adaptive (and other) trial designs?
- What are the issues around maximising implementation, and balancing fidelity with trial validity?