Evidence-based Application of Evidence-based Treatments. Peter S. Jensen, M.D., President & CEO, The REACH Institute: REsource for Advancing Children's Health.

Evidence-based Application of Evidence-based Treatments. Peter S. Jensen, M.D., President & CEO, The REACH Institute: REsource for Advancing Children's Health, New York, NY

Effect Sizes of Psychotherapies (Weisz et al., 1995): chart of mean effect sizes, comparing children & adolescents vs. adults, and university trials vs. "real world" settings.
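The comparison behind this slide is typically a standardized mean difference (Cohen's d): treatment-group mean minus control-group mean, divided by the pooled standard deviation. A minimal sketch, using made-up group summaries for illustration (not the Weisz et al. data):

```python
import math

def cohens_d(mean_tx: float, mean_ctl: float,
             sd_tx: float, sd_ctl: float,
             n_tx: int, n_ctl: int) -> float:
    """Cohen's d: standardized mean difference using the pooled SD."""
    pooled_var = ((n_tx - 1) * sd_tx**2 + (n_ctl - 1) * sd_ctl**2) / (n_tx + n_ctl - 2)
    return (mean_tx - mean_ctl) / math.sqrt(pooled_var)

# Hypothetical symptom-score summaries (lower = fewer symptoms):
d = cohens_d(mean_tx=12.0, mean_ctl=18.0, sd_tx=8.0, sd_ctl=8.0, n_tx=50, n_ctl=50)
print(round(d, 2))  # -0.75: treated group ~0.75 SD below controls
```

An effect of this magnitude is in the range usually labeled "medium to large," which is roughly what university-clinic child psychotherapy trials report; the slide's point is that "real world" settings tend to show much smaller values.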

Barriers vs. "Promoters" to Delivery of Effective Services (Jensen, 2000): moving from efficacious treatments to "effective" services. Three levels:
- Child & family factors: e.g., access & acceptance
- Provider/organization factors: e.g., skills, use of evidence-based treatments
- Systemic and societal factors: e.g., organization and funding policies

Key Differences, MedMgt vs. CC:
- Initial titration
- Dose
- Dose frequency
- # visits/year
- Length of visits
- Contact w/schools
Chart: Teacher-Rated Inattention (CC children separated by med use).

Would You Recommend Treatment? (parent)

                      MedMgt   Comb   Beh
Not recommend           9%      3%     5%
Neutral                 9%      1%     2%
Slightly recommend      4%      2%     2%
Recommend              35%     15%    24%
Strongly recommend     43%     79%    67%

Key Challenges
- Policy makers and practitioners hesitant to implement change
- Vested interests in the status quo
- Researchers often not interested in promoting findings beyond academic settings
- Manualized interventions perceived as difficult to implement or too costly
- Obstacles and disincentives actively interfere with implementation

Key Challenges
- Interventions implemented, but sites "titrate the dose," reducing effectiveness
- "Clients too difficult" and "resources inadequate" used to justify bad outcomes
- Research population "not the same" as youth being cared for at their clinical site
- Having data and "being right" is neither necessary nor sufficient to influence policy makers

The Good and the Bad: Effectiveness of Interventions by Intervention Type (Davis, 2000). Chart: number of interventions demonstrating positive vs. negative/inconclusive change.

Little or No Effect (provider- & organization-focused):
- Educational materials (e.g., distribution of recommendations for clinical care, including practice guidelines, audiovisual materials, and electronic publications)
- Didactic educational meetings
(Bero et al., 1998)

Effective Provider & Organizational Interventions:
- Educational outreach visits
- Reminders (manual or computerized)
- Multifaceted interventions
- Sustained, interactive educational meetings (participation of providers in workshops that include discussion and practice)
(Bero et al., 1998)

Implications re: Changing Provider Behaviors
- Changing professional performance is complex: internal, external, and enabling factors all operate
- No "magic bullets" change practice in all circumstances and settings (Oxman, 1995)
- Multifaceted interventions targeting different barriers are more effective than single interventions (Davis, 1999)
- Few theory-based studies to date
- Consensus-guidelines approach is necessary, but not sufficient: guidelines often lack fit with providers' mental models

Additional Perspectives
- The messenger is of equal importance as the message:
  - Trusted
  - Available
  - Perceived as expert/competent
- Adult learning models:
  - Tailored to the learner's needs
  - Learner-defined objectives
  - Hands-on, with ample opportunities for practice
  - Sustained over time
  - Skill-oriented
  - Feedback
- Attention to maintenance and sustaining change

Dissemination and Adoption of New Interventions
- Sustained interpersonal contact
- Organizational support
- Persistent championship of the intervention
- Adaptability of the intervention to local situations
- Availability of credible evidence of success
- Ongoing technical assistance and consultation
Sources: Backer, Liberman, & Kuehnel (1986), Dissemination and Adoption of Innovative Psychosocial Interventions, Journal of Consulting and Clinical Psychology, 54; Jensen, Hoagwood, & Trickett (1997), From Ivory Towers to Earthen Trenches, Journal of Applied Developmental Psychology.

Science-based, Plus the Necessary "-abilities":
- Palatable
- Affordable
- Transportable
- Trainable
- Adaptable, flexible
- Evaluable
- Feasible
- Sustainable

Models for Behavior Change (Jaccard et al., 2002):
- The Theory of Reasoned Action (Fishbein & Ajzen, 1975)
- Self-efficacy Theory (Bandura, 1977)
- The Theory of Planned Behavior (Ajzen, 1981)
- Diffusion of Innovations (Rogers, 1995)

Influences on Provider Behavior
- Patient & family factors: stigma; adherence; negative attitudes; rapport, engagement
- Provider factors: knowledge, training; self-efficacy; time pressures; fear of litigation; attitudes & beliefs; social conformity; lack of information
- Economic influences: compensation; reimbursement; incentives
- Systemic & societal factors: organizational standards; staff support/resistance; staff training; funding policy; prescribing practices

First, Use an Atypical vs. Typical -- Descriptives (n=19)

                  Min/Max   Mean (SD)
Favor/Unfavor       0/5     3.73 (1.61)
Easy/Hard          -1/5     4.16 (1.64)
Improve/No          0/5     2.84 (1.57)
Agree/Disagree      0/5     4.05 (1.27)
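The Min/Max and Mean (SD) columns in this and the following tables are ordinary descriptive statistics over the 19 clinicians' ratings on semantic-differential scales. A minimal sketch of that summary, using hypothetical ratings (the study's raw ratings are not in the transcript):

```python
import statistics

def summarize(ratings: list[float]) -> str:
    """Min/Max and Mean (SD) summary, as in the descriptives tables."""
    mean = statistics.mean(ratings)
    sd = statistics.stdev(ratings)  # sample SD (n-1 denominator)
    return f"{min(ratings):g}/{max(ratings):g}  {mean:.2f} ({sd:.2f})"

# Hypothetical favorability ratings from 19 clinicians on a -5..+5 scale:
ratings = [0, 2, 3, 3, 3, 4, 4, 4, 4, 4, 4, 4, 5, 5, 5, 5, 5, 3, 4]
print(summarize(ratings))  # -> 0/5  3.74 (1.24)
```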

First Use Atypical -- Advantages (count; percent of responses)
- Avoids typicals' side effects: %
- Better patient approval/compliance: %
- Atypicals effective in treating aggression: 2 (9.1%)
- Other (i.e., looks better politically): 2 (9.1%)
- Total responses: %

First Use Atypical -- Disadvantages (count; percent of responses)
- Typicals may work better for some patients: %
- Avoids atypicals' side effects: %
- If need to sedate patient, typicals may be better: %
- More is known about typicals in kids: %
- Cannot be administered as IMs: %
- Other: 1 (3.8%)
- Total responses: %

First Use Atypical -- Obstacles (count; percent of responses)
- Cost: %
- More data supporting typicals: %
- Patient history of non-response to atypicals: %
- Patient resistance: %
- Less available: 2 (9.5%)
- Other: 2 (9.5%)
- Total responses: %

Limit the Use of Stat's & P.R.N.'s -- Descriptive Statistics (n=19)

                  Min/Max   Mean (SD)
Favor/Unfavor      -5/5     2.63 (2.89)
Easy/Hard          -5/5    -0.38 (3.22)
Improve/No         -2/5     2.44 (1.92)
Agree/Disagree     -2/5     3.86 (1.81)

Limit Stat's & P.R.N.'s -- Advantages (count; percent of responses)
- Other (i.e., avoids traumatizing patient, …): %
- Avoids unnecessary medication: %
- Avoids unnecessary side effects: %
- Allows doctor to better understand patient's condition: %
- Patient learns techniques they can apply in "real life": %
- Total responses: %

Limiting Stat's & P.R.N.'s -- Disadvantages (count; percent of responses)
- Possible safety risk to patient and others: 9 (2.9%)
- Other (i.e., does not address biological factors, …): %
- Difficult for staff, who may feel less in control: %
- May need to rapidly sedate patient: 2 (9.5%)
- Total responses: %

Limiting Stat's & P.R.N.'s -- Obstacles (count; percent of responses)
- Safety: %
- Other (i.e., patient belief that p.r.n.'s condone behavior, …): %
- Staff resistance: %
- Patient too aggressive: %
- Staff availability and training: %
- Total responses: %

Monitor Side Effects -- Descriptives (n=19)

                  Min/Max   Mean (SD)
Favor/Unfavor       3/5     4.57 (0.69)
Easy/Hard          -2/5     2.94 (2.40)
Improve/No          1/5     4.00 (1.15)
Agree/Disagree      3/5     4.68 (0.58)

Use Standardized Scales for Side Effects -- Advantages (count; percent of responses)
- Helps capture side effects you might otherwise miss: %
- Other (i.e., increases patient compliance; improves communication between doctors; helps assess severity of side effects): %
- Provides objective measure: %
- Keeps doctors' focus on side effects: %
- Determines drug effectiveness for specific symptoms: %
- Enables doctor to track side effects over time: %
- Total responses: %

Use Standardized Scales for Side Effects -- Disadvantages (count; percent of responses)
- Doctor may ignore side effects not on scale: %
- May minimize importance of clinical evaluations: %
- Other (i.e., may make patient more aware of side effects): %
- Methodological problems (i.e., inter-rater reliability): %
- Total responses: %

Scales for Side Effects -- Obstacles (count; percent of responses)
- Time: %
- Scales are complicated/require training: %
- Instrument availability: %
- Other (i.e., staff resistance; instrument availability; cost): %
- Administrative barriers: 3 (9.4%)
- Laziness: 3 (9.4%)
- Clinician resistance: 2 (6.3%)
- Total responses: %

New Models for Behavior Change: TMC, TII (Gollwitzer, Oettingen, Jaccard, Jensen et al., 2002; Perkins et al., 2007)

Mental Contrasting / Implementation Intentions
1. Use mental contrasting to strengthen behavioral intentions: "What are the advantages or positive consequences associated with the use of Guideline X?"
2. Identify obstacles: "What gets in the way of implementing Guideline X?"
3. Form implementation intentions to overcome obstacles: "If I encounter obstacle Y, then I will do Z."
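The three MC/II prompts map naturally onto a small worksheet structure: elicited advantages (step 1), plus one if-then plan per identified obstacle (steps 2-3). A hypothetical sketch; the guideline name, obstacle, and planned response below are illustrative, not actual study materials:

```python
from dataclasses import dataclass, field

@dataclass
class ImplementationPlan:
    """One clinician's MC/II worksheet for a single guideline."""
    guideline: str
    advantages: list[str] = field(default_factory=list)    # Step 1: mental contrasting
    if_then: dict[str, str] = field(default_factory=dict)  # Steps 2-3: obstacle -> planned response

    def add_plan(self, obstacle: str, response: str) -> None:
        self.if_then[obstacle] = response

    def render(self) -> list[str]:
        """Render each obstacle as an 'if-then' implementation intention."""
        return [f"If I encounter '{obstacle}', then I will {response}."
                for obstacle, response in self.if_then.items()]

# Hypothetical worksheet:
plan = ImplementationPlan("Monitor side effects with a standardized scale")
plan.advantages.append("captures side effects I might otherwise miss")
plan.add_plan("not enough time in the visit",
              "have nursing staff administer the scale beforehand")
print(plan.render()[0])
```

The point of the structure is the pairing: every obstacle a clinician names gets a concrete, pre-committed response, which is what distinguishes an implementation intention from a general goal.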

Track Target Symptoms

Pre-Intervention -- Descriptive Statistics (n=4)
                     Min/Max   Mean (SD)
Favor/Unfavor          1/5     3.5 (1.9)
Easy/Hard              0/4     1.8 (1.7)
Improve/No Improve     1/4     2.5 (1.3)
Agree/Disagree         3/5     4.3 (1.0)

Post-Intervention -- Descriptive Statistics (n=4)
                     Min/Max   Mean (SD)
Favor/Unfavor          1/5     3.0 (1.6)
Easy/Hard             -3/1    -0.5 (1.9)
Improve/No Improve     2/3     2.8 (0.5)
Agree/Disagree         3/4     3.3 (0.5)

Use a Conservative Dosing Strategy

Pre-Intervention -- Descriptive Statistics (n=4)
                     Min/Max   Mean (SD)
Favor/Unfavor          4/5     4.8 (0.5)
Easy/Hard             -5/5     3.3 (2.9)
Improve/No Improve     4/5     4.8 (0.5)
Agree/Disagree         5/5     5.0 (0.0)

Post-Intervention -- Descriptive Statistics (n=4)
                     Min/Max   Mean (SD)
Favor/Unfavor          5/5     5.0 (0.0)
Easy/Hard              1/5     3.5 (1.9)
Improve/No Improve     5/5     5.0 (0.0)
Agree/Disagree         5/5     5.0 (0.0)

Limit the Use of P.R.N.s

Pre-Intervention -- Descriptive Statistics (n=4)
                     Min/Max   Mean (SD)
Favor/Unfavor          3/5     4.5 (1.0)
Easy/Hard              0/4     2.0 (1.8)
Improve/No Improve     1/5     3.3 (1.7)
Agree/Disagree         4/5     4.8 (0.5)

Post-Intervention -- Descriptive Statistics (n=4)
                     Min/Max   Mean (SD)
Favor/Unfavor         -3/5     2.5 (3.8)
Easy/Hard             -5/5    -0.8 (4.2)
Improve/No Improve     2/5     3.8 (1.5)
Agree/Disagree         3/5     4.5 (1.0)

Intention to Use Guidelines in the Next Month (n=4)

Guideline                      Pre-Intervention   Post-Intervention
Track Target Symptoms             4.6 (2.89)         8.25 (2.1)
Conservative Dosing Strategy      8.8 (1.30)        10.00 (0.0)
Limit P.R.N.                      5.6 (3.64)         8.75 (0.96)
Track Side Effects                9.6 (0.89)         8.75 (1.5)
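With only four raters per guideline, a simple paired summary of the mean change is about all these data support. A sketch using the intention means from the table above:

```python
# Mean intention-to-use ratings, pre and post intervention (n = 4 per guideline)
intentions = {
    "Track Target Symptoms":        (4.6, 8.25),
    "Conservative Dosing Strategy": (8.8, 10.00),
    "Limit P.R.N.":                 (5.6, 8.75),
    "Track Side Effects":           (9.6, 8.75),
}

for guideline, (pre, post) in intentions.items():
    change = post - pre
    direction = "up" if change > 0 else "down"
    print(f"{guideline}: {pre:.2f} -> {post:.2f} ({direction} {abs(change):.2f})")
```

Run this way, three of the four guidelines show sizable gains in intention; Track Side Effects, which already started near ceiling (9.6), drifts slightly down.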

Barriers vs. "Promoters" to Delivery of Effective Services (Jensen, 2000): moving from efficacious treatments to "effective" services. Three levels:
- Child & family factors: e.g., access & acceptance
- Provider/organization factors: e.g., skills, use of evidence-based treatments
- Systemic and societal factors: e.g., organization and funding policies

Clinic/Community Intervention Development and Deployment Model (CID) (Hoagwood, Burns & Weisz, 2000)
Step 1: Theoretically and clinically informed construction, refinement, and manualizing of the protocol within the context of the practice setting where it is ultimately to be delivered
Step 2: Initial efficacy trial under controlled conditions to establish potential for benefit
Step 3: Single-case applications in the practice setting, with progressive adaptations to the protocol
Step 4: Initial effectiveness test, modest in scope and cost
Step 5: Full test of effectiveness under everyday practice conditions, including cost-effectiveness
Step 6: Effectiveness of treatment variations: effective ingredients, core potencies, moderators, mediators, and costs
Step 7: Assessment of goodness-of-fit within the host organization, practice setting, or community
Step 8: Dissemination, quality, and long-term sustainability within new organizations, practice settings, or communities

Partnerships & Collaborations in Community-Based Research
- Why partnerships?
- Partnerships not with other scientists per se, but with experts of a different type: experts from families, neighborhoods, schools, and communities
- Only from these experts can we learn what is palatable, feasible, durable, affordable, and sustainable for children and adolescents at risk or in need of mental health services
- "Partnership" changes the typical university investigator / research subject relationship
- Practice-based research networks
- Bi-directional learning

Partnerships & Collaborations in Community-Based Research: The Traditional Approach
- Research question posed, building on theory and a body of previous research
- Logical next step in an elegant chain of hypotheses, tests, proofs, and/or refutations
- Isolation of variables from the larger context, to limit potential confounds and alternative explanations of findings
- Study designed first; the investigator then looks for "subjects" who will be "recipients of the bounty"
- Cannot answer questions about sustainability
- Unidirectional
- Blind to issues of ecological validity

Partnerships & Collaborations in Community-Based Research: The Alternative (Collaborative) Approach
- Expert-lay distinction dissolved; both partners bring critical expertise to the research agenda
- Research methods and technical expertise from the university investigator
- Systems access and local-ecological expertise from the community collaborator
- So-called "confounds" can provide useful tests of the feasibility, durability, and generalizability of the intervention; hence the importance of replication
- Improved validity of the knowledge obtained?

The REACH Institute….Putting Science to Work

Step I:
- Problem area identification
- Bring key "change agents" and gatekeepers to the table (federal or state partners, consumer and professional organizations)
- Identify "actionable" knowledge among experts and "consumers"
- Identify E-B QI procedures that are feasible, sustainable, palatable, affordable, transportable
- Consumer and stakeholder "buy-in" & commitment to E-B practices

Step II:
- Dissemination via partners across all 3 system levels
- "With an edge" (policy/legislative strategy with relevant federal/state partners)

Step III:
- Site recruitment and preparation within "natural replicate" settings
- Tool preparation, fidelity/monitoring
- "Skimming the cream": first taking those sites most ready

Step IV:
- Training and TA/QI intervention; all sites eventually get the intervention
- Monitoring/fidelity
- Report preparation
- Results fed back into Step II

Design Considerations
- "Begin with the end in mind": the CID model
- The enemy of the good is the perfect: raise the floor, not the ceiling
- "Randomized encouragement trials" vs. randomized controlled trials
- Quality improvement group vs. treatment as usual (TAU)
- How does one know the necessary ingredients of change?
  - Attention, expectations, Hawthorne effects? Measure them
  - Attention dose, time in treatment? Measure them
  - Measure change processes
  - Assuring fidelity to the model? Measure it
  - Ensure the therapeutic relationship... and measure it
  - Ensure family buy-in and therapist buy-in. Measure it
- Need for two controls? TAU plus an attention control group

Overcoming Challenges: A Motivational Approach
Change implementation strategies based on motivational approaches (William Miller):
- Practice what you preach
- Express empathy: for the challenges policy makers and practitioners face in implementing change with this population
- Develop discrepancy between the ideal and the current
- Success of evidence-based treatment must be explainable, straightforward, simply stated, and meaningful

Overcoming Challenges: A Motivational Approach (continued)
- Avoid argumentation
- Clinician-scientists must be credible to policy makers and community-based practitioners
- Avoid overstating the case and "poisoning the well"
- Roll with resistance
- Develop strategies for engagement; prepare for possible resistance

Foundation of Collaborative Efforts

               Traditional                            Collaborative
Goals          Researcher driven                      Shared; equal investment
Power          Researcher retains; unbalanced         Fairly distributed
Skills         Research skills designated as primary  Recognition of contributions of community members & researchers
Communication  One-way                                Open; opportunities to discuss & resolve conflict
Trust          Continual suspicion                    Belief in the good faith of partners; room for mistakes

Degrees of Collaboration
Levels: focus groups; community advisors or advisory board; community partners as paid staff; full collaboration.
What collaboration adds:
(+) identification of pressing community/family needs
(+) definition of acceptable research projects or service innovations
(+) ongoing input regarding the various stages of the research process
(+) collaboration regarding implementation of the project
(+) access to researchers to provide guidance as obstacles are encountered
(+) co-creation, co-implementation, co-evaluation, co-dissemination

Points of Collaboration in the Research Process (options from most to least collaborative)
- Study aims: defined collaboratively OR advice sought OR researcher defined
- Research design & sampling: decision made jointly OR researcher educates on methods & advice sought OR methods pre-determined
- Measurement & outcomes: defined within partnership OR advice sought OR researcher defined
- Procedures (recruitment, retention, data collection): shared responsibility (e.g., community recruits, research staff collect data) OR designed with input OR designed by researchers
- Implementation: projects co-directed OR researchers train community members as co-facilitators OR research staff hired for project
- Evaluation: plans for analysis co-created to ensure questions of both community & researchers are answered OR community members assist in interpretation of results OR researchers analyze data
- Dissemination: members of partnership define dissemination outlets OR members of community fulfill co-author & co-presenter roles OR researchers present at conferences & publish

The REACH Institute….Putting Science to Work

Step I:
- Problem area identification
- Bring key "change agents" and gatekeepers to the table (federal or state partners, consumer and professional organizations)
- Identify "actionable" knowledge among experts and "consumers"
- Identify E-B QI procedures that are feasible, sustainable, palatable, affordable, transportable
- Consumer and stakeholder "buy-in" & commitment to E-B practices

Step II:
- Dissemination via partners across all 3 system levels
- "With an edge" (policy/legislative strategy with relevant federal/state partners)

Step III:
- Site recruitment and preparation within "natural replicate" settings
- Tool preparation, fidelity/monitoring
- "Skimming the cream": first taking those sites most ready

Step IV:
- Training and TA/QI intervention; all sites eventually get the intervention
- Monitoring/fidelity
- Report preparation
- Results fed back into Step II