
1 Evidence-based Practice in Psychology: Epistemological Diversity*
Steven D. Hollon
Member, Presidential Task Force on EBPP
Vanderbilt University
Email: steven.d.hollon@vanderbilt.edu
*Based upon: Report of the 2005 Presidential Task Force on Evidence-Based Practice
Policy statement: http://www.apa.org/practice/ebpstatement.pdf
Complete report: http://www.apa.org/practice/ebpreport.pdf

2 2005 Presidential Task Force on Evidence-Based Practice (APA)
Ronald F Levant EdD (Chair)
Carol D Goodheart EdD (Chair)
David H Barlow PhD
Frederick L Newman PhD
Jean Carter PhD
John C Norcross PhD
Karina Davidson PhD
Doris K Silverman PhD
Kristofer J Hagglund PhD
Brian D Smedley PhD
Steven D Hollon PhD
Bruce E Wampold PhD
Josephine D Johnson PhD
Drew I Westen PhD
Laura C Leviton PhD
Brian T Yates PhD
Alvin R Mahrer PhD
Nolan W Zane PhD
APA Staff: Geoffrey M Reed PhD, Lynn F Bufka PhD, Ernestine Penniman

3 Basic Definition and Process
• Based on IOM definition that emphasized integration of research evidence with clinical expertise and patient values
• Drew on a diverse group with a range of expertise and interests, from research scientists through clinical practitioners
• Produced draft policy statement and position paper that was then posted for comments and subsequently revised
• Approved by vote of Council at the 2005 APA convention

4 EBPP Defined
• Evidence-based practice in psychology (EBPP) is the integration of the best available research with clinical expertise in the context of patient characteristics, culture, and preferences.
• Closely parallels the definition adopted by the Institute of Medicine (2001) as adapted from Sackett and colleagues (2000, p. 14): "Evidence-based practice is the integration of best research evidence with clinical expertise and patient values."

5 Best Research Evidence
• Evidence drawn from basic and applied research
• Hierarchy from clinical observation through randomized controlled trials with respect to efficacy
• Addresses efficacy and effectiveness (utility)
• Absence of evidence is not evidence of absence
  - untested does not mean ineffective
  - claims for efficacy should be tested
*American Psychological Association (2002). Criteria for evaluating treatment guidelines. American Psychologist, 57, 1052-1059.

6 Clinical Expertise
• Encompasses a number of competencies
  - positive therapeutic relationships
  - integration of diverse sources of information
  - recognition of one's own biases and limitations
• Derived from clinical and scientific training
• Used to integrate research evidence with clinical data in the context of patient preferences

7 Patient Characteristics, Values, and Context
• Services most effective when responsive to patient problems, strengths, and preferences
• Important variations in age, gender, race and ethnicity, and culture (among others)
• EBPP seeks to maximize patient choice among effective alternative interventions

8 Integration
• Psychologist determines applicability of research evidence to the particular patient
  - application of research to a given patient always involves probabilistic inferences
  - continuous monitoring of patient progress and adjustment of treatment as needed
• Clinical decisions made in collaboration with the informed patient and in consideration of costs, benefits, and options available (never by untrained persons unfamiliar with the specifics of the case)

9 In Defense of RCTs
• RCTs best way to detect causal influence
  - far from perfect but still the best we have
  - uncontrolled trials confound patients and procedures
  - hormone replacement therapy just latest example
  - need not do therapy like a scientist to evaluate effects
  - Carl Rogers one of the first to do controlled trials
  - good data best way to keep the critics at bay
      no controlled trials before Eysenck's critique
      hundreds of subsequent trials show that psychotherapy works
      some leading therapies still not adequately tested

10 In Defense of ESTs
• ESTs one reasonable way to see what works
  - look for well done studies that show effects
  - need not sacrifice external validity for internal validity
      need not exclude representative patients (and no longer do)
      can be used to test long-term treatments (and starting to do so)
  - do not mandate specificity but can detect it
      treatment needs to work but not for reasons specified
      special case for medications not for psychotherapy
  - treatment manuals neither necessary nor sufficient
      need not constrain clinicians unduly if integrity maintained
      merely useful aid for training and dissemination

11 [figure slide; no transcript text]

12 [figure slide; no transcript text]

13 In Defense of ESTs (outline slide repeated; see slide 10)

14 Must RCTs Exclude Representative Patients?
• 805 patients evaluated
• 240 (30%) randomized
• 565 (70%) excluded
  - 235 (29%) low severity
  - 240 (30%) diagnostic
      96 (12%) psychosis
      63 (8%) sub abuse
      17 (2%) axis I
      19 (2%) axis II
      45 (6%) medical
  - 95 (10%) med refusal
  - 8 (1%) suicide risk
From DeRubeis et al., 2005

15 Must RCTs Exclude Complicated Patients?
• 240 patients randomized
• 40 (16%) depressed only
• 200 (84%) comorbid
  - 146 (73%) axis I
      127 (53%) anxiety disorder
      86 (36%) sub abuse
      40 (16%) eating disorder
  - 125 (52%) axis II
      10 (4%) cluster a
      10 (4%) cluster b
      84 (35%) cluster c
      37 (16%) pd nos
From DeRubeis et al., 2005
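The percentages on the two slides above read as whole-number proportions of the evaluated or randomized samples. A minimal sketch of that arithmetic, assuming those denominators (the pct() helper is illustrative, not part of the original study):

```python
# Sketch only: slide percentages as rounded proportions of the relevant sample.
# pct() is an illustrative helper, not something from DeRubeis et al. (2005).

def pct(count: int, total: int) -> str:
    """Format a count as a rounded whole-number percentage of a total."""
    return f"{count} ({round(100 * count / total)}%)"

evaluated = 805                      # patients evaluated (slide 14)
randomized = 240                     # patients randomized
excluded = evaluated - randomized    # 565 excluded

print(pct(randomized, evaluated))    # -> "240 (30%)"
print(pct(excluded, evaluated))      # -> "565 (70%)"

# Slide 15 appears to use the randomized sample as the denominator,
# e.g. the 127 patients with a comorbid anxiety disorder:
print(pct(127, randomized))          # -> "127 (53%)"
```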

16 In Defense of ESTs (outline slide repeated; see slide 10)

17 [study design diagram: CPT III trial]
1st Randomization: ADM and CT (N=225) vs ADM (N=225)
Phases: Acute Treatment (1-18 months) → Continuation (6-18 months) → Maintenance/Follow-up (36 months)
2nd Randomization: ADM (N=90+) vs No ADM (N=90+)
Visit frequencies noted on the timeline: twice weekly/weekly, weekly/biweekly, monthly, monthly/quarterly
Clinical milestones marked: Response, Remission, Recovery, Relapse, Recurrence

18 [figure slide; data labels: 79%, 64%]

19 [figure slide; data labels: 79%, 69%]

20 [figure slide; data labels: 71%, 41%, 19%, 9%]

21 Sustained Recovery = pRemit × cpRecover × (1 − cpRecurrence)
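The identity above multiplies the probability of remission by the conditional probability of recovery and by the probability of not experiencing a recurrence. A minimal sketch with made-up inputs (the function name and the probability values are hypothetical, not trial results):

```python
# Sketch of the sustained-recovery calculation from the slide above.
# The input probabilities are hypothetical placeholders, not study findings.

def sustained_recovery(p_remit: float, cp_recover: float, cp_recurrence: float) -> float:
    """Probability of remitting, then recovering, and then not recurring."""
    return p_remit * cp_recover * (1 - cp_recurrence)

# Example: 60% remit, 80% of remitters go on to recover,
# and 25% of recovered patients later have a recurrence.
print(sustained_recovery(p_remit=0.60, cp_recover=0.80, cp_recurrence=0.25))  # 0.36
```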

22 In Defense of ESTs (outline slide repeated; see slide 10)

23 Response to Treatment as a Function of Condition

24 Continuation Follow-up

25 Sustained Improvement for All Assigned to Treatment

26 Cumulative Direct Costs of ADM and CT

27 In Defense of ESTs (outline slide repeated; see slide 10)

28 [figure slide; no transcript text]

29 Summary and Conclusions
• Multiple components contribute to outcome
  - patient, therapist, and relationship also matter
• Validate treatments to improve patient care
  - new methods emerge over time
  - pursue other ways to improve care
• Emphasize that which we can reliably teach

30 Putting Things in Perspective
• No one pretends that democracy is perfect or all-wise... indeed, it has been said that democracy is the worst form of Government except all those other forms that have been tried from time to time – Winston Churchill
• The first principle for being a good psychologist is to not kid yourself, the second principle is to not kid anybody else – Paul Meehl

