
1 Transforming Implementation & Improvement Into Science: A skills building series
December 6, 2017

2 Engage with CIIS
- Guide & Innovate: Provide guidance, support & innovation to design projects that rigorously evaluate the effectiveness of efforts to implement change
- Accelerate & Promote Sustainability: Identify strategies that accelerate the adoption & promote sustainability of effective healthcare interventions
- Educate: Provide implementation & improvement sciences education to faculty, trainees, students

Notes: Welcome from CIIS. CIIS is a methodological hub for scientific evaluation of strategies to improve healthcare delivery in safety-net settings like BMC; much of our work focuses on how healthcare is delivered to underserved populations. Our Center goals are to (1) guide and innovate ways to design projects that will rigorously evaluate the effectiveness of efforts to improve health systems, which we achieve by offering support and education about theoretical framing, study design and analytic approach; (2) identify factors and strategies that accelerate adoption and promote sustainability of interventions, which we achieve by rigorously documenting outcomes of tested strategies; and (3) educate the BU community in implementation & improvement sciences (IIS), which I will talk more about on the next slide. CIIS supports IIS within the BU community via a monthly educational series, technical assistance/consulting, and a pilot grant program.

3 Overview: Implementation & Improvement Sciences
Implementation Science
- Focuses on optimal strategies to promote evidence uptake in real-world settings
- Addresses: Did stakeholders perform the desired endeavor? Why or why not? How well?
- Aims: Translate research into practice; systematically implement evidence-based practices; improve the quality of healthcare
Improvement Science
- Focuses on rigorously measuring outcomes associated with efforts to improve care delivery
- Addresses: Did the new endeavor measurably improve desired outcomes?

Notes: Implementation science identifies and tests optimal strategies to promote uptake of evidence-based practices (EBPs), translating research into practice and reducing the translation gap (~17 years to translate ~14% of research into practice). Improvement science measures outcomes associated with efforts to improve care and answers questions about translation: does this intervention work in the real world? Implementation and improvement sciences answer different, but related, questions and have related aims.

4 Identifying High-Quality Projects
Main NIH criteria: Overall Impact; Significance; Innovation; Approach; Investigator Team; Research Environment
Issues applying NIH criteria to implementation & improvement sciences (IIS): the criteria are broad and non-specific to IIS; they could be operationalized to better describe high-quality IIS

Notes: A major goal of CIIS is to help you create high-quality improvement projects. We frame this as research because it gives us a useful framework, but it applies generally to any project whose goal is sustainable, generalizable improvement. Thus we start with the NIH criteria, even though much of this work will not be NIH-focused. The way NIH thinks about research:
- Overall impact: the likelihood the project will exert a sustained influence on the research field
- Significance: Does the project address an important problem in the field? If the project aim is achieved, how will scientific knowledge or clinical practice be improved?
- Innovation: Does the application challenge and aim to shift current research or clinical practice by using theoretical concepts, methodologies, or interventions novel to the field?
- Approach: Are the overall strategy, methodology, and analyses well-reasoned and appropriate to accomplish the specific aims of the project? Are potential problems, alternative strategies, and benchmarks of success presented?
- Investigator team: Are the researchers well-suited to do this project? Are the investigators supported by stakeholders in this endeavor?
- Environment: Will the environment where the work will be done contribute to the probability of success? Are institutional support and resources available to the investigator?
Much of this is applicable to the kind of work we want to do, but not all of it, and it is not well operationalized for IIS, so we considered a different approach.

5 Proctor’s 10 Key Ingredients for Implementation Research Proposals
Key Ingredient | Related NIH Criteria
1. Quality/care gap | Impact; Significance
2. Evidence-based treatment | Significance; Innovation
3. Conceptual model, theoretical justification | Approach; Innovation
4. Stakeholder priorities, engagement in change | Impact; Approach; Research Environment
5. Setting's readiness to adopt new services | Impact; Approach; Environment
6. Implementation strategy/process | Impact; Significance; Innovation
7. Team experience with the setting, treatment, processes | Approach; Investigator Team
8. Feasibility of proposed research design |
9. Measurement & analysis section |
10. Policy/funding environment; support for sustaining change |

Notes: Proctor operationalized the NIH criteria in ways specific to IIS. Recognizing the need for proposal development guidance specific to implementation science, Proctor et al. (2012) used the main NIH criteria to create 10 key ingredients for writing implementation research grant proposals, and aligned these ingredients with the main NIH evaluation areas. These ingredients are a big focus at NIH, as well as in our reviews at CIIS and the CTSI when a proposal is IIS-related. Because we think they matter for any good IIS work, our series focuses on how to do each of them well in a very practical way.
Source: Proctor EK, Powell BJ, Baumann AA, Hamilton AM, Santens RL. Writing implementation research grant proposals: ten key ingredients. Implementation Science. 2012;7:96.

6 Upcoming Sessions
Tentative Date | Session Title | Proposal Areas Addressed
10/25/2017 | Identifying Your Implementation & Improvement Sciences Research Question | Quality/Care Gap, Evidence-Based Practice
12/6/2017 | Using & Discussing Implementation Science Models | Conceptual Model
1/25/2018 | Implementation Strategies Versus Study Interventions | Implementation Strategy
2/28/2018 | Designing an Implementation & Improvement Sciences Study | Study Design
3/22/2018 | Quantitative Methods for Implementation & Improvement Sciences | Measurement, Analytic Methods
4/18/2018 | Qualitative Methods for Implementation & Improvement Sciences | Measurement, Analytic Methods
5/10/2018 | Engaging with Stakeholders to Conduct Feasible & Meaningful Research | Stakeholder Engagement, Feasibility, Team, Policy Environment

Notes: This is an overview of the education series. Many of the key ingredients are complementary, so we've combined some into one session. In addition to discussing Proctor's key ingredients, we've added sessions that provide an overview of study designs and of quantitative and qualitative methodologies for implementation and improvement sciences. These sessions will discuss Proctor's criteria for proposing measurement and analysis plans and give you an overview of productive methodologies in the field. Format: sessions will be interactive and feature different speakers with diverse perspectives and experience in implementation and improvement research.

7 Series Goal
Proctor's 10 Key Ingredients + CIIS Educational Series → High-Quality Implementation & Improvement Sciences → Significant Contributions to Improve Care, Advance Fields

Notes: Through the series, we hope that you will develop the skills necessary to conduct high-quality implementation and improvement sciences, and make significant contributions to these growing fields.

8 Using & Discussing Implementation Science Models
Study proposal areas addressed: Conceptual Model, Theoretical Justification
Mari-Lynn Drainoni, PhD, MEd
Co-Director, CIIS

9 Using and Identifying Conceptual Models
- Gain introductory knowledge of IIS-specific conceptual models & how to use models in research
- Discuss the importance of linking questions, process & outcome measures to a model
- Provide some examples of using models to drive study design and activities

Notes: Today we are going to talk about using models to frame projects, why they matter, and how they can be used. There are three general goals. Please jump in at any point.

10 What is a Conceptual Model/Framework?
An analytic tool capable of identifying a "set of variables and relationships that should be examined in order to explain a phenomenon" (Kitson et al., 2008)
Source: Kitson AL, Rycroft-Malone J, Harvey G, McCormack B, Seers K, Titchen A. Evaluating the successful implementation of evidence into practice using the PARiHS framework: theoretical and practical challenges. Implementation Science. 2008;3:1.

11 Implementation Models
Implementation models ≠ individual behavior change models
Core characteristics:
- Focus on provider or systems levels
- Focus on acceptability, uptake, adoption, sustainability

12 Are Conceptual Models Helpful?
Useful for:
- Making conceptual distinctions
- Organizing ideas
- Representing ideas in visual formats (pictures, diagrams)

Notes: This is the broad sense of why models are helpful.

13 How Can Conceptual Models Help My Research?
- Study design
- Outcome measures
- Recipe for replication: inform future research, scalability
- Generalize knowledge on how to implement & sustain interventions across studies, settings, contexts
- Offer a systematic method for operationalizing, navigating, evaluating complexities of implementation

Notes: Specifically, this is how models can aid research.

14 How Can Conceptual Models Help My Research?
Unfortunately, there is no one model that will guide all research questions Important to think about how a model can apply to the specific question you’re trying to answer

15 Practical Application: Formative Assessment of Narcan Distribution in the Emergency Department
Model: Promoting Action on Research Implementation in Health Services (PARIHS)

16 Quality/Care Gap
- Overdose (OD) is the leading cause of accidental death
- Narcan (naloxone) can reverse overdose… but it has to be accessible & timely (before it's needed)
- Emergency departments (EDs) are often the initial point of contact for care & OD reversal
- Potential high-yield venue for delivering OD prevention
- BUT… only 8% of at-risk patients get a Narcan kit from the ED

17 Program Objectives – What are We Trying to Change?
Program Goal: Ensure that ALL patients at risk for OD who appear in ED receive a Narcan kit

18 Program Design
Expanded initiative & policy to provide 24-hour coverage to ensure all at-risk patients are offered rescue kits, using 3 models:
- Project ASSERT's Licensed Alcohol & Drug Counselors
- Outpatient pharmacy prescriptions
- Inpatient pharmacy distribution by ED

19 Early results – What Is/Is Not Working?
Early results: numbers still low, with extremely low uptake of the non-Project ASSERT components; only 7% of at-risk patients reached

20 Implementation Study
New program not being used: study to identify barriers & facilitators to successful distribution of Narcan kits for all at-risk ED patients
Use an implementation model as a guide:
- Systematic method to identify, understand, operationalize, evaluate implementation
- Identify the set of variables & relationships that can be examined to explain the phenomena
- Consider context: who is involved, disciplines, intervention, implementation process

21 Using the PARIHS Model as a Guide
- Evidence: research, experience, data
- Context: culture, leadership, resources
- Facilitation: skills, style
Together, these three elements drive implementation.

22 PARIHS Model
3 constructs: evidence, context, facilitation
- Implementation success is predicted by high degrees of each construct
- Implementation failure is predicted by low degrees of 1 or more constructs

Notes: This is the same model described in another way, highlighting the flexibility of visual representation.

23 Study Methods
- Formative evaluation: assess the program during development to make early improvements
- Qualitative methods: 7 focus groups, 6 interviews; N=50 (19 MDs, 26 RNs, 3 ASSERT counselors, 2 pharmacists)
- Qualitative analysis: grounded theory, constant comparative methods to identify facilitators & barriers
- Findings linked back to the PARIHS model

24 Facilitators
- Real-world driven, with philosophical, clinician & leadership support for the evidence: "Why would you be against Narcan? I can't even think of a rational argument… It's like it's something like condoms or… That's like being against water…"
- Basic education & training efforts conducted
- Existing ASSERT program; 24-hr availability of ED social workers & pharmacists
- Patients can leave the ED with a Narcan kit in hand

25 Protocol & Policy Barriers
- No input from those tasked with implementation: "the people who design the policy actually don't work nightshifts"
- Important potential targets not included: parents of younger patients may be highly receptive, but there was no pediatric ED involvement
- No clear criteria & lack of consensus on the target population: "I mean, so we're guessing who needs it, who should get it and it's just bad guessing because everybody should be offered"; "So I think you try to focus on the most evident patients who have immediate risk…"; "…in a world of limited resources, you have to decide where you're going to devote your time and effort"
- Need to clearly determine the public health role of the ED

26 Workflow & Logistical Barriers
- Standing verbal order inconsistent with current practice: "we're only allowed to take verbal orders in life-and-death emergencies"
- Paper documentation needed to dispense; no EMR-based option: "(the verbal order is) a misnomer because a piece of paper for documentation is needed and the pharmacy cannot actually dispense without documentation"
- Staff unfamiliar with the process
- No consensus regarding timing of kit distribution
- Difficult to track referrals, paper orders, kit dispensing

27 Patient-Related Barriers
- Challenging population, impulsive & anxious to leave the ED once the OD is reversed: "patients did not want their high to have been reversed by Narcan and so are not interested in getting some 'to go'"
- Frequently unaccompanied in the ED
- Unlikely to fill a prescription due to inconvenience, stigma, potential co-payment
- There may be biases about the "worthiness" of patients to receive a Narcan kit: "…the staff are human… there's the deserving and the undeserving ill and (the feeling) this is self-inflicted and not a disease. I don't think people in healthcare are immune to that. We have our own biases. We grew up in a culture where some behaviors are acceptable and not."

28 Staff Role/Responsibility Barriers
- Clinical staff feel too busy
- Not in anyone's specific job description: if everyone is responsible, no one is responsible
- Everyone thinks it is someone else's role: "…it makes sense to me in that the nurses are the end point of discharge… it's part of their role and responsibility, I think, to make sure the patients have a discharge plan…"; "No other medication is the nurses' responsibility to give without an order from a physician."

29 Education & Training Barriers
- Staff need training in the policy & use of the kit; training needs to be structured, consistent, ongoing
- Patients/families need training on Narcan & use of the kit
- Need to determine how to incorporate patient/family training into ED workflow: "one of the biggest time commitments, educating the patient…"

30 Study Results Linked to PARIHS
- Evidence: belief in effectiveness, but little clinical experience and patients not receptive
- Facilitation: style included episodic & didactic training; no creation of partnerships in development or training
- Context: leadership support and multiple resources, but lack of consensus regarding the ED's public health role

31 Practical Application: Using Community Health Workers for HIV to Improve Linkage & Retention in HIV Care
Model: Reach Effectiveness Adoption Implementation Maintenance (RE-AIM)

32 Project Goals
- Increase utilization of community health workers (CHWs) to improve access, retention & outcomes among people living with HIV (PLWH)
- Strengthen the HIV healthcare workforce & build capacity of Ryan White HIV/AIDS Program recipients to integrate CHWs into the care team
- Evaluate implementation & effectiveness of different CHW models

33 Project Structure & Activities
10 Ryan White Care Act-funded sites across the US will be funded to:
- Implement the program with limited funding & limited staffing
- Receive training
- Participate in the evaluation

34 Project Structure & Activities
3-year project timeline:
- 12 months: BU team planning, including program, curriculum & training development, evaluation design
- 18 months: Program implementation & evaluation, ongoing training, collecting & providing data
- 6 months: Complete evaluation
Evaluation:
- No additional funding for surveys or data provision
- No funding for control/comparison sites

35 Study Goals
Primary focus: Experience implementing the program from multiple staff/organizational perspectives
- Assessed via: client, CHW & site experience with the intervention; integration of the CHW program into the setting
Secondary focus: Does the intervention work?
- Assessed via: changes in clinical markers, adherence, appointment attendance; changes in unmet needs

36 RE-AIM Model Explained
- Reach: who is exposed & benefits
- Effectiveness: impact on the proposed outcome
- Adoption: who applied the program & where
- Implementation: consistency in application
- Maintenance: program maintained over time

Notes: RE-AIM's 5 concepts/constructs ask questions that guide the research. This is the same model described in another way, highlighting the flexibility of visual representation.

37 Using RE-AIM as a Guide
RE-AIM Concept | Key Questions for Concept
Reach | Who is expected to benefit? What percent of those are actually exposed to the intervention? Who are they (demographics)?
Effectiveness | What is the impact of the intervention on the proposed outcome (clinical markers, retention, adherence)?
Adoption | What settings applied the program? Who applied it?
Implementation | How was the program applied? How consistently was it applied in the way it was intended?
Maintenance | Is the program maintained over time?

38 RE-AIM Measures & Data Sources
RE-AIM Concept | Measures | Data Sources
Reach | % eligible who get the CHW intervention; dose of intervention received; demographics | Medical chart data; client survey
Effectiveness | Impact of the intervention on clinical markers, retention, adherence, unmet needs, stigma, self-efficacy, health literacy |
Adoption | Frequency of adoption; where the program is adopted | CHW encounter form; site visit tools
Implementation | Specific activities & dose; integration of CHWs into the team; adaptations to protocol | Fidelity monitoring tool; CHW satisfaction survey; qualitative interview
Maintenance | Consistency over time; budget impact | Cost analysis

39 Practical Application: The Hepatitis C Testing & Assessment Project
Model: Proctor Conceptual Model of Implementation Research

40 Research Question & Setting
Quality gap:
- High rates of HCV in the birth cohort population
- No evidence for routine testing for all (as with HIV)
Research question: Which is the better strategy to improve HCV screening & testing within primary care in settings with a large proportion of high-risk patients?
- Routine birth cohort testing
- Enhanced risk screening with targeted testing for all others
Setting: 3 large community health clinics in the South Bronx, New York

41 Using the Proctor Model as a Guide
The Proctor model is an implementation model that is quite broad: it can be used at multiple levels (individual provider, system, community) and applied to numerous settings.

42 Enhanced Risk Screener Phase

43 Birth Cohort Sticker

44 Implementation Outcomes
Outcome | Definition | Data Source
Acceptability | Agreeable, attitudes | Qualitative
Adoption | Willingness to implement |
Appropriateness | Perception of fit |
Feasibility | Can they do it |
Fidelity | Did they do it | Screener, EMR testing data
Penetration | % eligible that got it | Screeners done, EMR testing data
Sustainability | Does the intervention stick | EMR testing data post-intervention
Client symptomatology | % tested who tested positive | EMR testing data

45 Service Outcomes
Outcome | Definition | Data Source
Efficiency | Did the right people get screened/tested | EMR testing & risk data; screener risk & testing data
Patient centeredness | Patient responses | Qualitative
Timeliness | Getting people to care | EMR referrals, linkage
Equity | Care does not vary by personal characteristics | EMR demographics linked to screener & EMR testing data

46 Results: Screening & Testing Over Time
[Figure: screener adherence and HCV testing over time. Adherence dropped dramatically over the study period, but testing did not; the bottom line shows positive results.]

47 Study Example Using Proctor
Factor | # Identified | # Tested Positive | % of Total Positives | Cumulative %
Ever injected drugs | 56 | 17 | 41.5% |
Ever snorted drugs | 200 | 6 | 14.6% | 56.1%
Elevated ALT (documented in EMR) | 185 | 4 | 9.8% | 65.9%
Transfusion before 1992 | 59 | 3 | 8.0% | 73.1%
20+ lifetime sex partners | 115 | 2 | 4.9% | 78.0%
Maternal hepatitis C | 10 | 1 | 2.4% | 80.5%
Liver diseases | 23 | | | 82.9%
Ever homeless | 66 | | 0.0% |
Ever incarcerated | 67 | | |
Chronic hemodialysis | | | |
Transplant before 1992 | | | |
Total | | 34 | |
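The % and cumulative % columns in this table are simple arithmetic: each factor's positives are divided by the total number of positives, and a running sum gives the cumulative yield. A minimal sketch of that calculation, assuming 41 total positives (implied by the 17 = 41.5% row; the slide's "Total 34" likely counts only positives captured by the listed factors):

```python
# Cumulative yield of HCV risk-screener factors: what fraction of all
# positives is captured as each factor is added, in table order?
# Values are illustrative, taken from the first three table rows.
factors = [
    ("Ever injected drugs", 17),
    ("Ever snorted drugs", 6),
    ("Elevated ALT", 4),
]
total_positives = 41  # implied by 17 positives = 41.5% of total

running = 0
for name, positives in factors:
    running += positives
    print(f"{name}: {positives / total_positives:.1%} of positives "
          f"(cumulative {running / total_positives:.1%})")
```

Running this reproduces the first three table rows: 41.5%, 14.6% and 9.8% of positives, with cumulative yields of 41.5%, 56.1% and 65.9%.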

48 Implementation Feedback
- Good reminder to focus on hepatitis C
- Screener increased staff knowledge about patients
- Screener time-consuming
- General preference for the screener
- Birth cohort phase difficult to buy into: remaining ambivalence; process seen as too difficult and not realistic

49 Practical Issues
Considerations when selecting a model:
- What are the study goals and study type?
- How flexible does the model need to be?
- Am I more interested in dissemination or implementation activities?
- At what level am I intervening?
- What are the implementation strategies? What am I going to do?
- What do I want to learn from the project?

50 Level of Construct Flexibility
Broad:
- Loosely outlined & assigned constructs
- Flexible application across a broad array of contexts
- More responsibility on the researcher to think through how to operationalize constructs & apply the model
Operational:
- Concretely & clearly defined constructs
- Rigid application to a specific context
- Provides the researcher with a detailed, step-by-step process for use
Source: Tabak RG, Khoong EC, Chambers DA, Brownson RC. Bridging research and practice: models for dissemination and implementation research. American Journal of Preventive Medicine. 2012;43(3):337-350. doi:10.1016/j.amepre.2012.05.024.

51 Focus on Dissemination v. Implementation
Dissemination: studying how to spread evidence-based practices to the target audience
Implementation: identifying optimal strategies to promote uptake of evidence-based practices in real-world settings
Models fall along a spectrum: Dissemination Only | D > I | D = I | I > D | Implementation Only
Source: Tabak RG, Khoong EC, Chambers DA, Brownson RC. Bridging research and practice: models for dissemination and implementation research. American Journal of Preventive Medicine. 2012;43(3):337-350. doi:10.1016/j.amepre.2012.05.024.

52 Level of Intervention
Which settings/stakeholders need to be addressed?
- Policy
- Community
- System/multiple organizations
- Organization
- Individual provider
Ex.: If your project aims to change clinician behavior within an individual hospital, then your model should address those levels.

53 Implementation Strategies
Do your strategies & model constructs make sense together?
Ex.: If you plan to use an educational strategy (e.g., training, audit & feedback), does your model suggest knowledge/awareness is an important part of achieving implementation?

54 What Do I Want to Learn?
- What are my most important outcomes?
- How can I measure them?
- Are there outcomes I am forgetting? (the model can help)

55 Poor vs. Quality Use of a Model
Poor use of a model:
- Model inappropriate for the research question
- Model mentioned once but does not drive the work: no clear roadmap or relationship to question, design, outcomes
- Inconsistent labeling
Quality use of a model:
- Model type & constructs respond to the research question
- Model is linked to research question, study design, outcome measures
- Consistent labeling

56 Key Takeaways
- Models are not the be-all and end-all, but they can help guide all parts of a project
- The chosen model should link to the specific question you're trying to answer, the study design & outcome measures; there is no perfect model
- Models promote generalizable knowledge about replication & sustainability

57 Session Feedback Polling Question
How effective was this session at increasing your knowledge of conceptual models for implementation science research?
5 = very effective
4 = somewhat effective
3 = neither effective nor ineffective
2 = somewhat ineffective
1 = very ineffective

58 Session Feedback Polling Question
How can CIIS make these sessions more effective? [text short response]

59 Upcoming Sessions
Tentative Date | Session Title | Proposal Areas Addressed
10/25/2017 | Identifying Your Implementation & Improvement Sciences Research Question | Quality/Care Gap, Evidence-Based Practice
12/6/2017 | Using & Discussing Implementation Science Models | Conceptual Model
1/25/2018 | Implementation Strategies Versus Study Interventions | Implementation Strategy
2/28/2018 | Designing an Implementation & Improvement Sciences Study | Study Design
3/22/2018 | Quantitative Methods for Implementation & Improvement Sciences | Measurement, Analytic Methods
4/18/2018 | Qualitative Methods for Implementation & Improvement Sciences | Measurement, Analytic Methods
5/10/2018 | Engaging with Stakeholders to Conduct Feasible & Meaningful Research | Stakeholder Engagement, Feasibility, Team, Policy Environment

Notes: Our next session is January 25, 2018, and will focus on designing implementation strategies. Resources, including presentation slides, are available on the CIIS website.

60 Thank You! Contact CIIS Website:

