1 Evidence-based Application of Evidence-based Treatments
Peter S. Jensen, M.D., President & CEO
The REACH Institute (REsource for Advancing Children's Health), New York, NY

2 Effect Sizes of Psychotherapies (Weisz et al., 1995)
[Chart: mean effect sizes for children & adolescents and for adults, comparing university-based trials with "real world" clinic settings]
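A note on the metric (an assumption about the chart, not stated in the transcript): mean effect sizes in psychotherapy meta-analyses such as Weisz et al. are typically standardized mean differences of the Cohen's d form
d = \frac{\bar{X}_{\text{treatment}} - \bar{X}_{\text{control}}}{SD_{\text{pooled}}}
so the university versus "real world" bars compare average standardized treatment-control differences across studies.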

3 Barriers vs. "Promoters" to Delivery of Effective Services (Jensen, 2000)
Three levels separate efficacious treatments from "effective" services:
- Child & family factors: e.g., access & acceptance
- Provider/organization factors: e.g., skills, use of EB
- Systemic and societal factors: e.g., organization, funding policies

4 Key Differences, MedMgt vs. CC: initial titration, dose, dose frequency, number of visits/year, length of visits, contact with schools
[Chart: teacher-rated inattention, with CC children separated by medication use]

5 Would You Recommend Treatment? (parent report)
                      MedMgt   Comb   Beh
Not recommend            9%      3%    5%
Neutral                  9%      1%    2%
Slightly recommend       4%      2%    2%
Recommend               35%     15%   24%
Strongly recommend      43%     79%   67%

6 Key Challenges
- Policy makers and practitioners hesitant to implement change
- Vested interests in the status quo
- Researchers often not interested in promoting findings beyond academic settings
- Manualized interventions perceived as difficult to implement or too costly
- Obstacles and disincentives actively interfere with implementation

7 Key Challenges
- Interventions are implemented, but providers "titrate the dose," reducing effectiveness
- "Clients too difficult" and "resources inadequate" are used to justify bad outcomes
- The research population is "not the same" as the youth being cared for at their clinical site
- Having data and "being right" is neither necessary nor sufficient to influence policy makers

8 The Good and the Bad: Effectiveness of Interventions by Intervention Type (Davis, 2000)
[Chart: number of interventions demonstrating positive vs. negative/inconclusive change, by intervention type]

9 Little or No Effect (provider- and organization-focused):
- Educational materials (e.g., distribution of recommendations for clinical care, including practice guidelines, AV materials, and electronic publications)
- Didactic educational meetings
(Bero et al., 1998)

10 Effective Provider & Organizational Interventions:
- Educational outreach visits
- Reminders (manual or computerized)
- Multifaceted interventions
- Sustained, interactive educational meetings (participation of providers in workshops that include discussion and practice)
(Bero et al., 1998)

11 Implications for Changing Provider Behaviors
- Changing professional performance is complex: internal, external, and enabling factors all matter
- There are no "magic bullets" that change practice in all circumstances and settings (Oxman, 1995)
- Multifaceted interventions targeting different barriers are more effective than single interventions (Davis, 1999)
- Few studies are theory-based
- A consensus-guidelines approach is necessary but not sufficient; guidelines often lack fit with health care providers' mental models

12 Additional Perspectives
- Messenger of equal importance as the message
  - Trusted
  - Available
  - Perceived as expert/competent
- Adult learning models
  - Tailored to the learner's needs
  - Learner-defined objectives
  - Hands-on, with ample opportunities for practice
  - Sustained over time
  - Skill-oriented
  - Feedback
- Attention to maintenance and sustaining change

13 Dissemination and Adoption of New Interventions
- Sustained interpersonal contact
- Organizational support
- Persistent championship of the intervention
- Adaptability of the intervention to local situations
- Availability of credible evidence of success
- Ongoing technical assistance and consultation
Sources: Backer, Liberman, & Kuehnel (1986), Dissemination and Adoption of Innovative Psychosocial Interventions, Journal of Consulting and Clinical Psychology, 54:111-118; Jensen, Hoagwood, & Trickett (1997), From Ivory Towers to Earthen Trenches, Journal of Applied Developmental Psychology.

14 Science-Based, Plus the Necessary "-abilities"
- Palatable
- Affordable
- Transportable
- Trainable
- Adaptable, flexible
- Evaluable
- Feasible
- Sustainable

15 Models for Behavior Change (Jaccard et al., 2002)
- The Theory of Reasoned Action (Fishbein & Ajzen, 1975)
- Self-Efficacy Theory (Bandura, 1977)
- The Theory of Planned Behavior (Ajzen, 1981)
- Diffusion of Innovations (Rogers, 1995)

16 Influences on Provider Behavior
- Patient & family factors: stigma, adherence, negative attitudes, rapport and engagement
- Provider factors: knowledge and training, self-efficacy, time pressures, fear of litigation, attitudes and beliefs, social conformity, lack of information
- Economic influences: compensation, reimbursement, incentives
- Systemic & societal factors: organizational standards, staff support/resistance, staff training, funding policy
These influences converge on prescribing practices.

17 First, Use an Atypical vs. Typical -- Descriptives (n=19)
                    Min/Max   Mean (SD)
Favor/Unfavor        0/5      3.73 (1.61)
Easy/Hard           -1/5      4.16 (1.64)
Improve/No           0/5      2.84 (1.57)
Agree/Disagree       0/5      4.05 (1.27)

18 First Use Atypical -- Advantages
Advantage                                     Count   Percent of Responses
Avoids typicals' side effects                  13      59.1%
Better patient approval/compliance              5      22.7%
Atypicals effective in treating aggression      2       9.1%
Other (i.e., looks better politically)          2       9.1%
Total responses                                22     100.0%

19 First Use Atypical -- Disadvantages
Disadvantage                                        Count   Percent of Responses
Typicals may work better for some patients            6      23.1%
Avoids atypicals' side effects                        6      23.1%
If need to sedate patient, typicals may be better     6      23.1%
More is known about typicals in kids                  4      15.4%
Cannot be administered as IMs                         3      11.5%
Other                                                 1       3.8%
Total responses                                      26     100.0%

20 First Use Atypical -- Obstacles
Obstacle                                       Count   Percent of Responses
Cost                                             5      23.8%
More data supporting typicals                    5      23.8%
Patient history of non-response to atypicals     4      19.1%
Patient resistance                               3      14.3%
Less available                                   2       9.5%
Other                                            2       9.5%
Total responses                                 21     100.0%
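The arithmetic behind the "Percent of Responses" column on slides 18-20 is each count divided by the total number of responses (not by the n=19 respondents, who could give more than one response). A minimal sketch in Python, with a hypothetical helper name and the counts from slide 18:

# Hypothetical helper: convert response counts to percent-of-responses.
def percent_of_responses(counts):
    total = sum(counts.values())
    return {item: round(100 * n / total, 1) for item, n in counts.items()}

# Counts from slide 18 ("First Use Atypical -- Advantages"), 22 total responses.
advantages = {
    "Avoids typicals' side effects": 13,
    "Better patient approval/compliance": 5,
    "Atypicals effective in treating aggression": 2,
    "Other (i.e., looks better politically)": 2,
}
print(percent_of_responses(advantages))  # 59.1, 22.7, 9.1, 9.1 -- matches the slide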

21 Limit the Use of Stats & P.R.N.s -- Descriptive Statistics (n=19)
                    Min/Max   Mean (SD)
Favor/Unfavor       -5/5      2.63 (2.89)
Easy/Hard           -5/5     -0.38 (3.22)
Improve/No          -2/5      2.44 (1.92)
Agree/Disagree      -2/5      3.86 (1.81)

22 Limit Stats & P.R.N.s -- Advantages
Advantage                                                Count   Percent of Responses
Other (i.e., avoids traumatizing patient, ...)             6      27.3%
Avoids unnecessary medication                              5      22.7%
Avoids unnecessary side effects                            4      18.2%
Allows doctor to better understand patient's condition     4      18.2%
Patient learns techniques they can apply in "real life"    3      13.6%
Total responses                                           22     100.0%

23 Limiting Stats & P.R.N.s -- Disadvantages
Disadvantage                                            Count   Percent of Responses
Possible safety risk to patient and others                 9      42.9%
Other (i.e., does not address biological factors, ...)     6      28.6%
Difficult for staff, who may feel less in control          4      19.0%
May need to rapidly sedate patient                         2       9.5%
Total responses                                           21     100.0%

24 Limiting Stats & P.R.N.s -- Obstacles
Obstacle                                                         Count   Percent of Responses
Safety                                                             8      33.3%
Other (i.e., patient belief that p.r.n.s condone behavior, ...)    5      20.8%
Staff resistance                                                   4      16.7%
Patient too aggressive                                             4      16.7%
Staff availability and training                                    3      12.5%
Total responses                                                   24     100.0%

25 Monitor Side Effects -- Descriptives (n=19)
                    Min/Max   Mean (SD)
Favor/Unfavor        3/5      4.57 (0.69)
Easy/Hard           -2/5      2.94 (2.40)
Improve/No           1/5      4.00 (1.15)
Agree/Disagree       3/5      4.68 (0.58)

26 Use Standardized Scales for Side Effects -- Advantages
Advantage                                                Count   Percent of Responses
Helps capture side effects you might otherwise miss        8      27.6%
Other (i.e., increases patient compliance; improves
  communication between doctors; helps assess
  severity of side effects)                                6      20.7%
Provides objective measure                                 4      13.8%
Keeps doctors' focus on side effects                       4      13.8%
Determines drug effectiveness for specific symptoms        4      13.8%
Enables doctor to track side effects over time             3      10.3%
Total responses                                           29     100.0%

27 Use Standardized Scales for Side Effects -- Disadvantages
Disadvantage                                               Count   Percent of Responses
Doctor may ignore side effects not on scale                   3      27.3%
May minimize importance of clinical evaluations               3      27.3%
Other (i.e., may make patient more aware of side effects)     3      27.3%
Methodological problems (i.e., inter-rater reliability)       2      18.2%
Total responses                                              11     100.0%

28 Scales for Side Effects -- Obstacles
Obstacle                                                         Count   Percent of Responses
Time                                                               8      25.0%
Scales are complicated/require training                            6      18.7%
Instrument availability                                            5      15.6%
Other (i.e., staff resistance; instrument availability; cost)      5      15.6%
Administrative barriers                                            3       9.4%
Laziness                                                           3       9.4%
Clinician resistance                                               2       6.3%
Total responses                                                   32     100.0%

29 New Models for Behavior Change: TMC, TII (Gollwitzer, Oettingen, Jaccard, Jensen et al., 2002; Perkins et al., 2007)

30 Mental Contrasting / Implementation Intentions
1. Use mental contrasting to strengthen behavioral intentions: "What are the advantages or positive consequences associated with the use of Guideline X?"
2. Identify obstacles: "What gets in the way of implementing Guideline X?"
3. Form implementation intentions to overcome obstacles: "If I encounter obstacle Y, then I will do X."
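As a minimal sketch of the exercise in Python (the class, field names, and example content are hypothetical, not drawn from the study materials), the three steps reduce to pairing each identified obstacle with an "if obstacle, then action" plan:

# Hypothetical worksheet: step 1 lists advantages (mental contrasting),
# steps 2-3 pair each identified obstacle with a planned response.
from dataclasses import dataclass, field

@dataclass
class GuidelineWorksheet:
    guideline: str
    advantages: list[str] = field(default_factory=list)          # step 1
    if_then_plans: dict[str, str] = field(default_factory=dict)  # steps 2-3

sheet = GuidelineWorksheet(
    guideline="Guideline X",
    advantages=["Better tracking of outcomes", "Fewer unnecessary medications"],
    if_then_plans={"staff resistance": "review the rationale at the next team meeting"},
)
for obstacle, action in sheet.if_then_plans.items():
    print(f"If I encounter {obstacle}, then I will {action}.")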

31 Track Target Symptoms
Pre-Intervention: Descriptive Statistics (n=4)
                     Min/Max   Mean (SD)
Favor/Unfavor          1/5      3.5 (1.9)
Easy/Hard              0/4      1.8 (1.7)
Improve/No Improve     1/4      2.5 (1.3)
Agree/Disagree         3/5      4.3 (1.0)
Post-Intervention: Descriptive Statistics (n=4)
                     Min/Max   Mean (SD)
Favor/Unfavor          1/5      3.0 (1.6)
Easy/Hard             -3/1     -0.5 (1.9)
Improve/No Improve     2/3      2.8 (0.5)
Agree/Disagree         3/4      3.3 (0.5)

32 Use a Conservative Dosing Strategy
Pre-Intervention: Descriptive Statistics (n=4)
                     Min/Max   Mean (SD)
Favor/Unfavor          4/5      4.8 (0.5)
Easy/Hard             -5/5      3.3 (2.9)
Improve/No Improve     4/5      4.8 (0.5)
Agree/Disagree         5/5      5.0 (0.0)
Post-Intervention: Descriptive Statistics (n=4)
                     Min/Max   Mean (SD)
Favor/Unfavor          5/5      5.0 (0.0)
Easy/Hard              1/5      3.5 (1.9)
Improve/No Improve     5/5      5.0 (0.0)
Agree/Disagree         5/5      5.0 (0.0)

33 Limit the Use of P.R.N.s
Pre-Intervention: Descriptive Statistics (n=4)
                     Min/Max   Mean (SD)
Favor/Unfavor          3/5      4.5 (1.0)
Easy/Hard              0/4      2.0 (1.8)
Improve/No Improve     1/5      3.3 (1.7)
Agree/Disagree         4/5      4.8 (0.5)
Post-Intervention: Descriptive Statistics (n=4)
                     Min/Max   Mean (SD)
Favor/Unfavor         -3/5      2.5 (3.8)
Easy/Hard             -5/5     -0.8 (4.2)
Improve/No Improve     2/5      3.8 (1.5)
Agree/Disagree         3/5      4.5 (1.0)

34 Intention to Use Guidelines in the Next Month (n=4), Mean (SD)
Guideline                        Pre-Intervention   Post-Intervention
Track Target Symptoms               4.6 (2.89)         8.25 (2.1)
Conservative Dosing Strategy        8.8 (1.30)        10.00 (0.0)
Limit P.R.N.                        5.6 (3.64)         8.75 (0.96)
Track Side Effects                  9.6 (0.89)         8.75 (1.5)
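A minimal sketch (hypothetical code; the values are the means from the table above) of the pre-to-post change in mean intention ratings:

# Mean intention ratings (pre, post) from slide 34; print the change per guideline.
ratings = {
    "Track Target Symptoms": (4.6, 8.25),
    "Conservative Dosing Strategy": (8.8, 10.00),
    "Limit P.R.N.": (5.6, 8.75),
    "Track Side Effects": (9.6, 8.75),
}
for guideline, (pre, post) in ratings.items():
    print(f"{guideline}: {post - pre:+.2f}")
# -> +3.65, +1.20, +3.15, -0.85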

35 Barriers vs. "Promoters" to Delivery of Effective Services (Jensen, 2000)
Three levels separate efficacious treatments from "effective" services:
- Child & family factors: e.g., access & acceptance
- Provider/organization factors: e.g., skills, use of EB
- Systemic and societal factors: e.g., organization, funding policies

36 Clinic/Community Intervention Development and Deployment Model (CID) (Hoagwood, Burns & Weisz, 2000)
Step 1: Theoretically and clinically informed construction, refinement, and manualization of the protocol within the context of the practice setting where it is ultimately to be delivered
Step 2: Initial efficacy trial under controlled conditions to establish potential for benefit
Step 3: Single-case applications in the practice setting, with progressive adaptations to the protocol
Step 4: Initial effectiveness test, modest in scope and cost
Step 5: Full test of effectiveness under everyday practice conditions, including cost-effectiveness
Step 6: Effectiveness of treatment variations, effective ingredients, core potencies, moderators, mediators, and costs
Step 7: Assessment of goodness-of-fit within the host organization, practice setting, or community
Step 8: Dissemination, quality, and long-term sustainability within new organizations, practice settings, or communities

37 Partnerships & Collaborations in Community-Based Research
- Why partnerships? Partnerships not with other scientists per se, but with experts of a different type: experts from families, neighborhoods, schools, and communities
- Only from these experts can we learn what is palatable, feasible, durable, affordable, and sustainable for children and adolescents at risk or in need of mental health services
- "Partnership" means changes in the typical university investigator-research subject relationship
- Practice-based research networks
- Bi-directional learning

38 Partnerships & Collaborations in Community-Based Research
- Traditional approach
  - Research question posed, building on theory and the body of previous research
  - Logical next step in an elegant chain of hypotheses, tests, proofs, and/or refutations
  - Isolation of variables from the larger context, to limit potential confounds and alternative explanations of findings
  - Study designed; the investigator then looks for "subjects" who will be "recipients of the bounty"
  - Cannot answer questions about sustainability
  - Unidirectional
  - Blind to issues of ecological validity

39 Partnerships & Collaborations in Community-Based Research
- Alternative (collaborative) approach
  - Expert-lay distinction dissolved; both partners bring critical expertise to the research agenda
  - Research methods and technical expertise from the university investigator
  - Systems access and local-ecological expertise from the community collaborator
  - So-called "confounds" can provide useful "tests" of the feasibility, durability, and generalizability of the intervention; hence the importance of replication
  - Improved validity of the knowledge obtained?

40 The REACH Institute… Putting Science to Work (a four-step model, Steps I-IV)
- Problem area identification
- Bring key "change agents" and gatekeepers to the table (federal or state partners, consumer and professional organizations)
- Identify "actionable" knowledge among experts and "consumers"
- Identify evidence-based QI procedures that are feasible, sustainable, palatable, affordable, and transportable
- Consumer and stakeholder "buy-in" and commitment to evidence-based practices
- Dissemination via partners across all 3 system levels, "with an edge" (policy/legislative strategy with relevant federal/state partners)
- Training and TA/QI intervention; all sites eventually get the intervention
- Monitoring/fidelity
- Report preparation
- Results fed back into Step II
- Site recruitment and preparation within "natural replicate" settings
- Tool preparation, fidelity/monitoring
- "Skimming the cream": first taking those sites most ready

41 Design Considerations
- "Begin with the end in mind": the CID model
- The enemy of the good is the perfect: raise the floor, not the ceiling
- "Randomized encouragement trials" vs. randomized controlled trials
- Quality improvement group vs. treatment as usual (TAU)
- How does one know the necessary ingredients of change?
  - Attention, expectations, Hawthorne effects? Measure them
  - Attention dose, time in treatment? Measure them
  - Measure change processes
  - Assuring fidelity to the model? Measure it
  - Ensure the therapeutic relationship… and measure it
  - Ensure family buy-in and therapist buy-in; measure it
- Need for two controls? TAU and an attention control group

42 Overcoming Challenges: A Motivational Approach
Change implementation strategies based on motivational approaches (William Miller)
- Practice what you preach
- Express empathy for the challenges of policy makers and practitioners in implementing change with this population
- Develop discrepancy between the ideal and the current
- The success of evidence-based treatment must be explainable, straightforward, simply stated, and meaningful

43 Overcoming Challenges: A Motivational Approach
- Avoid argumentation
  - Clinician scientists must be credible to policy makers and community-based practitioners
  - Avoid overstating the case and "poisoning the well"
- Roll with resistance
  - Develop strategies for engagement; prepare for possible resistance

44 Foundation of Collaborative Efforts
Dimension        Traditional                               Collaborative
Goals            Researcher driven                         Shared; equal investment
Power            Researcher retains; unbalanced            Fairly distributed
Skills           Research skills designated as primary     Recognition of contributions by community members & researchers
Communication    One-way                                   Open; opportunities to discuss & resolve conflict
Trust            Continual suspicion                       Belief in the good faith of partners; room for mistakes

45 Degrees of Collaboration
Focus groups → Community advisors or advisory board → Community partners as paid staff → Collaboration
(+) Identification of pressing community/family needs
(+) Definition of acceptable research projects or service innovations
(+) Ongoing input regarding various stages of the research process
(+) Collaboration regarding implementation of the project
(+) Access to researchers to provide guidance as obstacles are encountered
(+) Co-creation, co-implementation, co-evaluation, co-dissemination

46 Points of Collaboration in the Research Process (options listed from most to least collaborative)
- Study aims: defined collaboratively OR advice sought OR researcher defined
- Research design & sampling: decision made jointly OR researcher educates on methods & advice sought OR methods pre-determined
- Measurement & outcomes: defined within partnership OR advice sought OR researcher defined
- Procedures (recruit, retain, data collection): shared responsibility (e.g., community recruits, research staff collect data) OR designed with input OR designed by researchers
- Implementation: projects are co-directed OR researchers train community members as co-facilitators OR research staff hired for project
- Evaluation: plans for analysis co-created to ensure questions of both community & researchers are answered OR community members assist in interpretation of results OR researchers analyze data
- Dissemination: members of the partnership define dissemination outlets OR members of the community fulfill co-author & co-presenter roles OR researchers present at conferences & publish

47 The REACH Institute… Putting Science to Work (a four-step model, Steps I-IV)
- Problem area identification
- Bring key "change agents" and gatekeepers to the table (federal or state partners, consumer and professional organizations)
- Identify "actionable" knowledge among experts and "consumers"
- Identify evidence-based QI procedures that are feasible, sustainable, palatable, affordable, and transportable
- Consumer and stakeholder "buy-in" and commitment to evidence-based practices
- Dissemination via partners across all 3 system levels, "with an edge" (policy/legislative strategy with relevant federal/state partners)
- Training and TA/QI intervention; all sites eventually get the intervention
- Monitoring/fidelity
- Report preparation
- Results fed back into Step II
- Site recruitment and preparation within "natural replicate" settings
- Tool preparation, fidelity/monitoring
- "Skimming the cream": first taking those sites most ready

