
1 Evidence-Based Practice in Clinical Psychology: What It Is, Why It Matters, What You Need to Know
Bonnie Spring, Ph.D., ABPP, Northwestern University

2 Why it matters: EBBP Rationale
- Improve quality and accountability of health care practice (IOM, 2001, Crossing the Quality Chasm)
- Shared vocabulary and concepts for transdisciplinary, biopsychosocial research, practice, and health care policy
- Stimulate development of an evidence base for behavioral treatments
- 2005 APA policy

3 Why it matters: Potentially Useful Infrastructure
- Clinical practice guidelines: increasingly based on ongoing systematic review of research, especially RCTs (e.g., USPSTF, Cochrane, CDC/AHRQ)
- Research reporting guidelines (CONSORT, TREND, QUOROM)
- Evidence grading and knowledge-synthesis systems (e.g., GRADE, AHRQ)
- Policy, often with coverage/reimbursement implications (VA/DOD, CMS, NICE) (P4P?)
- Evidence-based practice (lifelong learning): question formulation, search strategies, critical appraisal
- SUMSearch; Clinical Evidence; First Consult; BMJ Updates; Best Evidence Topics; CATCrawler; CATbank – clinical scenario and bottom line

4 Overview
- History of evidence-based practice (EBP)
- Core elements of EBP
- EBP pedagogy in psychology
- EBP pedagogy in other health disciplines
- Useful infrastructure and potential opportunities for synergy

5 Origins of Evidence-Based Practice
Missionary zeal.

6 Emergence of Evidence-Based Medicine
- Flexner report: 155 medical schools surveyed (only 31 judged fit!); 96 by 1915, 76 by 1930
- Archie Cochrane – epidemiology, health services research; Effectiveness and Efficiency: Random Reflections on Health Services
- 1973 – John Wennberg – widespread practice variation
- Clinical epidemiology: the determinants and consequences of health care decisions (McMaster University – David Sackett, Gordon Guyatt)
- 1985 – IOM: 15% of medical practices evidence-based [2001 – Crossing the Quality Chasm]
- Evidence-based medicine; Brian Haynes and Ann McKibbon – search strategies
- Cochrane Collaboration
- Sackett – How to Practice and Teach EBM

Green – US; Yellow – Europe; White – Canada. Flexner – quality control: teaching methods, which treatments, science. Cochrane – epidemiology, but how do you know which treatments work – the yeast vs. vitamin C trial he ran as a prisoner of war; the RCT. Wennberg – nomothetic: different diseases get different treatments in different places; Medicare patients in high-cost areas get more expensive care (more ICU stays, tests, hospitalizations) but no better outcomes, because more academic medical centers are located there. McMaster – idiographic. Abraham Flexner – proprietary medical schools; an educator, not an MD. With the approval of the American Medical Association, Flexner, a Johns Hopkins graduate and educator, undertook a muckraking investigation for the Carnegie Foundation for the Advancement of Teaching of 155 medical schools in the United States and Canada. Medical reformers were convinced that there were too many schools of low quality producing an overabundance of doctors. Flexner's radical conclusion was that only 31 medical schools were fit to survive; his model of the medical school of the future was Johns Hopkins School of Medicine. By 1915 the nation's medical schools had been reduced to 96, and by 1930 only 76 schools were training the nation's physicians. Cochrane – a Scottish epidemiologist who screened entire populations; his reflections on health services research promoted the RCT as the way to determine which treatments are most effective and cost-effective for public health. The methods used to determine best evidence were introduced by these figures.

7 What do we mean by “evidence-based practice?”

8 Alternative Definitions of Evidence-Based Practice
Nomothetic Guidelines: (public health, medicine) – focus on problem/disorder & level of evidence for practices (based on systematic review) (e.g., NICE, VA, apa) ESTs: (psychology) focus on intervention (& disorder) EBP: (psychology, medicine, nursing, social work) focus on decision-making about individual patients Guidelines – Not standards, aspirational, but do tend to influence coverage policies – rationing care, often w/ good reason. Nomothetic; eliminate practice variation; RCTs of treatments for disorders; way to disseminate education; implement best practices. Somewhat different evidence criteria for EST Idiographic……….. Lifelong Learning

9 APA Policy Statement adopted August 2005
“Evidence-based practice in psychology is the integration of the best available research with clinical expertise in the context of patient characteristics, culture, and preferences.” -adapted from IOM, 2001 & Sackett, 2000

10 Clinical Decision-Making
[Diagram: clinical decision-making at the intersection of three circles – best available research evidence, clinical expertise, and the patient's values, characteristics, and circumstances]

11 Syllabus Project
Prompt: "Does anyone on the list teach a course on evidence-based practice (EBP)? Specifically, I am searching for syllabi that cover one or more 'legs' of the three-legged EBP stool: a) research evidence, b) clinical expertise, c) patient values, preferences, characteristics." (November 2006)

12 Listservs Sampled
- ABCT
- APA Division 12; SSCPNET (Section III, Div 12)
- CUDCP
- APA Division 38
- ABMR
- SBM (EBBM, MRBC, Obesity, CA SIGs)

13 Outcome
- 140 requests (November 2006)
- 39 syllabi; 17 additional recommended articles and books
- 273-page document
- Discipline: 30 psychology, 3 public health, 3 medicine, 1 nursing, 1 PE/health/sport studies

14 Evidence-Based Practice
Modal course titles: CBT, EST, EVT, Psychological Interventions, Psychotherapy Research
Texts: Barlow, Clinical Handbook of Psychological Disorders; Bergin & Garfield, Handbook of Psychotherapy and Behavior Change
Content: ESTs
Additional texts: Persons, Case Conceptualization; Dawes, House of Cards
Additional content: assessment; case formulation, functional analysis; clinical judgment; diversity; iatrogenic effects; research methods

(Iatrogenic effects – e.g., critical incident debriefing.)

15

16

17 courtesy of Barbara Walker, Indiana University, 2006

18 Best available research evidence
[Diagram: the three-circle model of clinical decision-making (best available research evidence, clinical expertise, and the patient's values, characteristics, and circumstances), annotated with stakeholder skills –
Researcher: design, conduct, analysis, reporting;
Synthesizer: locate, critically appraise, meta-analysis;
Consumer: appraise quality and relevance, integrate;
Clinician: communicate, assess patient, deliver EBP;
Patient: understanding, preferences, access]

19 Researcher Training in Psychology versus Medicine
Psychology:
- Design: correlational (convenience classes); experimental (from animal studies)
- Conduct: brief, tight control; little missing data; replace dropped cases
- Analysis: completer
- Reporting
Clinical medicine:
- Design: observational (population-based); clinical trial – a test of a policy applied to a population
- Conduct: long, with intercurrent events; missing data
- Analysis: intention-to-treat (ITT)
- Reporting: CONSORT

The experimental tradition follows the animal-study logic (e.g., noise exposure): if a subject didn't experience the full course of the noise, you can't tell anything, so you replace the case. In clinical medicine, you test whether vitamin C (or eating citrus) works as a policy: if people drop out of the citrus group or shift to the other arm, you keep them in the citrus group for analysis, because dropping out or shifting is itself meaningful when you are generalizing to what the policy's effect would be.
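The completer-versus-ITT contrast can be made concrete with a toy example. A minimal sketch, using invented data and one simple ITT convention (missing outcomes counted as failures); real trials use more careful missing-data strategies:

```python
# Toy two-arm trial: each record is (arm, completed_protocol, outcome).
# All data are invented for illustration.
records = [
    ("citrus", True, 1), ("citrus", True, 1), ("citrus", False, None), ("citrus", True, 0),
    ("control", True, 0), ("control", True, 1), ("control", False, None), ("control", True, 0),
]

def completer_rate(arm):
    # Analyze only participants who completed the protocol.
    done = [outcome for a, completed, outcome in records if a == arm and completed]
    return sum(done) / len(done)

def itt_rate(arm):
    # Intention-to-treat: analyze everyone as randomized;
    # here a missing outcome is counted as a failure (0).
    allocated = [outcome if outcome is not None else 0
                 for a, _, outcome in records if a == arm]
    return sum(allocated) / len(allocated)

for arm in ("citrus", "control"):
    print(arm, "completer:", round(completer_rate(arm), 2), "ITT:", round(itt_rate(arm), 2))
```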

20 Researcher, Synthesizer, Consumer Training in Analysis
Psychology: ANOVA/regression
Clinical medicine: odds ratios
Epidemiology terminology:
- Absolute risk: p(disease) in a particular population
- Relative risk: p(disease | exposed) / p(disease | unexposed)
- Attributable risk: p(disease | exposed) - p(disease | unexposed)
- Number needed to harm: 1 / attributable risk
- Odds ratio: odds(disease | exposed) / odds(disease | unexposed)

Why categorical? Diagnosis; yes/no decisions.
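These measures reduce to simple arithmetic on a 2×2 exposure-by-disease table. A minimal Python sketch with hypothetical counts (all numbers are illustrative, not from the presentation):

```python
# Hypothetical 2x2 table: disease (yes/no) by exposure (yes/no). Counts are illustrative only.
a, b = 30, 70   # exposed:   30 with disease, 70 without
c, d = 10, 90   # unexposed: 10 with disease, 90 without

absolute_risk_exposed = a / (a + b)             # p(disease | exposed)   = 0.30
absolute_risk_unexposed = c / (c + d)           # p(disease | unexposed) = 0.10
relative_risk = absolute_risk_exposed / absolute_risk_unexposed       # 3.0
attributable_risk = absolute_risk_exposed - absolute_risk_unexposed   # 0.20
number_needed_to_harm = 1 / attributable_risk                         # 5.0
odds_ratio = (a / b) / (c / d)                                        # (30/70)/(10/90) ≈ 3.86

print(relative_risk, attributable_risk, number_needed_to_harm, odds_ratio)
```

With these counts the attributable risk is 0.20, so the number needed to harm works out to 5 – the same figure used on the next slide.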

21 Clinical Significance
NNH = 5: if 5 patients were treated with TX1, 1 would be more likely to have an adverse event than if all had received TX0.
NNT = the number of patients who would need to be treated with TX1 to see one success not seen with TX0.
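NNT is the mirror image of NNH: it is computed from the absolute difference in success rates rather than adverse-event rates. A standalone sketch with illustrative rates (the rates are assumptions, not figures from the slides):

```python
# Illustrative success rates only; not taken from the presentation.
success_tx1 = 0.45   # p(success) with TX1
success_tx0 = 0.25   # p(success) with TX0

absolute_benefit_increase = success_tx1 - success_tx0     # 0.20
number_needed_to_treat = 1 / absolute_benefit_increase    # 5.0

print(f"Treat {number_needed_to_treat:.0f} patients with TX1 "
      f"to see one success not seen with TX0")
```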

22 Reporting: CONSORT Flow Diagram
Consolidated Standards of Reporting Trials (CONSORT)

23 Excerpt from CONSORT checklist
METHODS
3. Participants: Eligibility criteria for participants and the settings and locations where the data were collected.
4. Interventions: Precise details of the interventions intended for each group and how and when they were actually administered.
5. Objectives: Specific objectives and hypotheses.
6. Outcomes: Clearly defined primary and secondary outcome measures and, when applicable, any methods used to enhance the quality of measurements (e.g., multiple observations, training of assessors).
7. Sample size: How sample size was determined and, when applicable, explanation of any interim analyses and stopping rules.
8. Randomization – sequence generation: Method used to generate the random allocation sequence, including details of any restrictions (e.g., blocking, stratification).
9. Randomization – allocation concealment: Method used to implement the random allocation sequence (e.g., numbered containers or central telephone), clarifying whether the sequence was concealed until interventions were assigned.
10. Randomization – implementation: Who generated the allocation sequence, who enrolled participants, and who assigned participants to their groups.
11. Blinding (masking): Whether or not participants, those administering the interventions, and those assessing the outcomes were blinded to group assignment. When relevant, how the success of blinding was evaluated.

24 Evidence Synthesizer and Consumer Skills

25 Best available research evidence
[Diagram repeated from slide 18; here the "Consumer" role is relabeled "Evidence User": locate, appraise quality and relevance, integrate]

26 Synthesizer: Systematic Reviewer – explicit, systematic, transparent to avoid bias
- Specific research question (PICO)
- Search protocol to select papers: key words; systematic search of the literature (EMBASE, CINAHL, Cochrane Controlled Trials Register, DARE); explicit inclusion and exclusion criteria
- Explicit, transparent rating of methodological quality
- Data extraction
- Analysis: qualitative or quantitative
- Conclusion
- Discussion of strengths and limitations
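Because the protocol is fixed before searching begins, it can be written down as plain data. A hypothetical sketch – the databases come from the slide, but every other field name and value is invented for illustration:

```python
# Hypothetical systematic-review protocol, fixed a priori to limit bias.
protocol = {
    "question_pico": {
        "population": "adults with binge eating disorder",
        "intervention": "interpersonal therapy",
        "comparison": "CBT",
        "outcome": "binge episode frequency",
    },
    "databases": ["EMBASE", "CINAHL", "Cochrane Controlled Trials Register", "DARE"],
    "keywords": ["binge eating", "interpersonal therapy", "randomized"],
    "inclusion": ["randomized controlled trial", "adult participants"],
    "exclusion": ["case reports", "no control group"],
    "quality_rating": "explicit, transparent scale applied to every included study",
    "analysis": "quantitative (meta-analysis) if studies are homogeneous, else qualitative",
}

for step, value in protocol.items():
    print(step, "->", value)
```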

27 The 5-Step EBM Model for Evidence Users (Consumers)
- Ask: formulate the question
- Acquire: search for the evidence/answers
- Appraise: the evidence, for quality and relevance
- Apply: the results
- Assess: the outcome

Think of these as competencies. Appraise covers both the quality of the evidence and its relevance to the patient at hand. The question must have four parts: the patient or problem being addressed, the intervention being considered, the comparison intervention, and the clinical outcomes of interest. Remember that clinical experience, patient preference, and social context also play a role. Individual clinical expertise is the proficiency and judgment the clinician gains from clinical experience and practice. The best available external clinical evidence is clinically relevant research: the accuracy/precision of diagnostic tests, the power of prognostic markers, the efficacy/safety of therapeutic, rehabilitative, and preventive regimens, and the basic medical sciences (genetics, immunology).

28 Asking: Well-Built Clinical Questions
Background question: What are effective treatments for bulimia nervosa?
Foreground question (PICO):
- Patient: in patients with binge eating disorder
- Intervention: does interpersonal therapy
- Comparison: compared to CBT
- Outcome: reduce the frequency of binge episodes?
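A foreground question is simply the four PICO parts assembled in order. A small sketch using the slide's binge-eating example (the dictionary keys and the assembled wording are my own):

```python
# PICO components for the slide's example foreground question.
pico = {
    "patient": "patients with binge eating disorder",
    "intervention": "interpersonal therapy",
    "comparison": "CBT",
    "outcome": "frequency of binge episodes",
}

question = (
    f"In {pico['patient']}, does {pico['intervention']} "
    f"compared to {pico['comparison']} reduce {pico['outcome']}?"
)
print(question)
```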

29 Critically Appraising the Evidence
Use standardized, a priori appraisal methods to answer:
- Is the evidence valid? (internal validity)
- Is the evidence applicable/relevant? (external validity)
- Is the evidence clinically significant?

Internal validity: how true are the study results for the people in the study? This varies with the validity of the chosen study design for answering the study question, the potential biases of that design, and how well they were controlled in the study's execution. Sources of bias: selection, performance, detection, attrition.

External validity: how well do the study's results apply to the person or clinical population you are interested in? Criteria are less well defined across all study types. The historic approach has been a hierarchy of study designs (handout: study designs), and some of the "bias" toward RCTs comes from this. More recent approaches are more sophisticated: they consider the trial design in combination with the study question (how well does this type of study answer the question at hand?), how well the study was executed (adequate randomization and follow-up), and questions of external validity (the spectrum of patients considered), along with summary reports (a systematic review is the highest level of evidence) (handout: levels of evidence and grades of recommendation).

How strong is the evidence? Consistency and bias matter here. Coherence is addressed by looking across the evidence at contradicting as well as consistent results, and by considering the evidence in light of what is known in other sciences (basic sciences/biology, systems research, sociology/psychology, etc.).

What does the summarized evidence suggest about harms and benefits? Common sense dictates: when there is evidence of benefit and value, do it; when there is evidence of no benefit, harm, or poor value, don't do it; when there is insufficient evidence to know for sure, be conservative. Use a balance sheet that includes: the characteristics of the underlying population that affect the question; the most important health (and economic) outcomes, positive and negative; the most important options (choices); the probability or magnitude of each outcome with each option; and the absolute differences in outcomes across options.

Limitations: there can be strong but contradictory evidence (e.g., HRT), and weighing benefits and harms raises the question of whose perspective counts. None of these methods is perfect, and coming up with hard numbers can be a lot of work, but focusing on exact estimates and complete options and outcomes forces precision in our thinking, accurate communication to patients, and consistency of care with patient preferences weighed in. A patient can't really weigh in on a decision involving real trade-offs when given only qualitative words ("This treatment has benefits, but it also has some risks... What do you want to do?").

30 Clinical Decision-Making
Clinical epidemiology: the discipline that studies the determinants and consequences of clinical decisions. Apply EBP, the 5 A's, and critical appraisal at the clinical encounter to overcome automatic, unconscious decision-making biases (aka bad clinical intuition).

31 Barriers between research and practice
- 30 kg of guidelines per family doctor per year
- 25,000 biomedical journals in print
- 8,000 articles published per day
- 95% of studies cannot reliably guide clinical decisions
(Bazian Ltd, 2001)

32 Clinical Decision-Making
Health informatics: the discipline concerned with the infrastructure, resources, devices, and structures (e.g., algorithms, guidelines) needed to store, retrieve, manage, and use health information at the time and place that a decision needs to be made. Decision support.

33 Secondary Synthesized Evidence (AKA "evidence-based capitulation")
Research proliferates rapidly, clinical performance demands increase, and practicing clinicians are too busy to use all the EBM steps with all patients. Hence an increased focus on pithy clinical practice guidelines, synopses, and structured abstracts:
- MD Consult
- ACP Journal Club
- Cochrane Database of Systematic Reviews
- UpToDate
- InfoPOEMs (Patient-Oriented Evidence that Matters)

34 Best available research evidence
[Diagram repeated from slide 18: the three-circle model with stakeholder skills]

35 Clinically Supervised Training in Evidence-Based Treatment
Needs work: papers by Woody and by Weissman

36 Best available research evidence
[Diagram repeated from slide 18: the three-circle model with stakeholder skills]

37 Patient Preferences
Shared decision-making requires information available only to the patient (e.g., valuation of harms/hassles, of alternative outcomes, and of alternative treatments).
Utility assessment: all possible outcomes are assigned a value between 0 (death) and 1 (perfect health).
- Time trade-off approach: the proportion of life in a particular health state (e.g., severe depression) that you would give up to attain perfect health (e.g., 30%); the utility of that health state is 1 - 0.30 = 0.70.
- Standard gamble approach: the point at which you are indifferent between spending the rest of your life in the health state in question and a gamble between perfect health and instant death; the probability of perfect health at that point represents the utility of the health state.
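Both elicitation approaches end in a single number on the 0–1 utility scale. A minimal sketch: the 30% trade-off figure comes from the slide, while the standard-gamble probability is an assumed illustration:

```python
def time_tradeoff_utility(proportion_of_life_traded):
    """Utility = 1 - the proportion of remaining life the patient would
    give up to escape the health state and attain perfect health."""
    return 1.0 - proportion_of_life_traded

def standard_gamble_utility(p_perfect_health_at_indifference):
    """Utility = the probability of perfect health (vs. instant death) at which
    the patient is indifferent between the gamble and the certain health state."""
    return p_perfect_health_at_indifference

# Slide example: willing to give up 30% of remaining life to escape severe depression.
print(time_tradeoff_utility(0.30))    # 0.7
# Assumed illustration: indifferent when the gamble offers a 0.85 chance of perfect health.
print(standard_gamble_utility(0.85))  # 0.85
```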

38 Teaching evidence-based practice = teaching a process
- Didactics
- Small groups, problem-based learning
- Preceptorships/clinical supervision
- Standardized patients and evidence stations
- Embedded throughout the curriculum
- Role models

Teaching EBP is about teaching a process – not just which treatments have empirical support.

39 Medical Decision Making in the NU-FSM curriculum
- MDM-I (first week of medical school): sensitivity, specificity, pre- and post-test probabilities, innumeracy, uncertainty in medicine
- MDM-II (last two weeks of M1 year): epidemiology, statistics
- MDM-III (beginning of M2 spring quarter): decision analysis, meta-analysis, cost-effectiveness analysis, clinical guidelines
- M3 MDM (once a month in M3 year): review papers pertaining to clinical cases; use of CATs (critically appraised topics)

40 Evidence-Based Behavioral Practice (EBBP)
NIH Office of Behavioral and Social Sciences Research contract N01-LM: Resources for Training in Evidence-Based Behavioral Practice. Invite SMDM to the Scientific Advisory Board.

41 OBSSR 5-Year Plan
- Year 1: develop the training website, Council, and Scientific Advisory Board; white paper on training, skills, and competencies reflecting education in evidence-based behavioral practice (EBBP)
- Year 2: develop and implement web-based, research-focused training module(s) on EBBP; field test in graduate curricula
- Year 3: launch interactive web-based training courses; establish a practice network; develop the first EBBP clinical practice training module

42 OBSSR 5-Year Plan
- Year 4: with the practice network, develop modules on applying evidence-based clinical decision-making to intervention with specific cases; field test in internship/residency/post-doctoral training programs and the practice network
- Year 5: link the website to systematic reviews of behavioral interventions, treatment manuals, and outcome assessments; develop and field test clinical decision-making modules that integrate patient preference and clinical competency assessments

43 Suggestions
To enhance the evidence base for psychological treatments and support lifelong learning, clinical psychology training might benefit from enhanced coverage of:
- Researcher skills in methods: clinical trial design, analysis, reporting, synthesis
- Clinician training in the 5-step (5 A's) EBP model – current training covers only 2 of the A's

44 Suggestions
- Psychology informatics could use infrastructure development (PsycINFO and Cochrane; library access; coverage in secondary synthesized sources like UpToDate; practice-based research networks)
- Psychology could use appropriate patient preference measures that support shared decision-making
- A discipline of clinical psychology decision-making needs to develop to systematize the integration of research evidence, clinical expertise, and patient clinical data and preferences

45 Concluding Questions
- What training modules and materials would be helpful?
- Will you partner with us to help develop and try these out?

46

47

48 The Evidence Pyramid for Treatment Effectiveness Questions
***USE THE BEST EVIDENCE AVAILABLE***

49 Alternatives to evidence-based medicine
- Eminence-based medicine
- Eloquence-based medicine
- Vehemence-based medicine
- Nervousness-based medicine

To understand why guidelines exist, consider where practice otherwise comes from. Eminence-based: the more senior expert, making the same mistakes with increasing confidence over an impressive number of years; an art or craft done with "clinical expertise." Vehemence-based: browbeating your colleagues into incorporating your treatment into practice guidelines. Nervousness-based: in the US, fear of litigation – the only bad test or treatment is the one you didn't think of ordering. Confidence-based: restricted to surgeons. (Isaacs and Fitzgerald, 1999, BMJ)

50 Levels of Clinical Evidence in the Primary Literature (PsycINFO, MEDLINE)
(Type of question: methodology; search filters)
- Therapy: double-blind randomized controlled trial; search filters – Randomized Controlled Trial, Double Blind, Clinical Trials
- Prognosis: cohort studies, case control, case series; search filters – Cohort Studies, Prognosis, Survival Analysis
- Etiology: cohort studies; search filters – Cohort Studies, Risk
- Quality improvement: randomized controlled trial; search filters – Randomized Controlled Trial, Practice Guideline, Consensus Development Conference

51 EBM Resources
- Pocket guides with web-linked updates (Sackett; Guyatt & Rennie)
- Cochrane Library
- BMJ
- Centre for Evidence-Based Medicine
- Centre for Evidence-Based Mental Health
