Connective Solutions, 21/08/2012
Evidence-Based Practice in Augmentative & Alternative Communication: How do you do it? What does it mean for individuals who use AAC?
Pammi Raghavendra, Ph.D., Senior Lecturer, Disability & Community Inclusion, School of Health Sciences, Flinders University, Australia
ISAAC-Israel Annual National AAC Conference, Tel Aviv, 8 June 2014. All-day workshop.
Emma Grace
EBP Workshop, Northcott, Sydney
2/11/2009
Workshop Outline
What is EBP?
What does EBP mean for individuals who use AAC and other stakeholders?
Steps involved in EBP (7 steps):
1. Asking a clinically relevant & answerable question
2. Searching for the evidence
3. Critically appraising the evidence
4. Collating & synthesising the evidence
5. Implementing the evidence into practice
6. Evaluating the use of the evidence
7. Disseminating the EBP process & findings
Facilitators and barriers to EBP
Practical suggestions to implement EBP
Nature and extent of evidence in AAC
What can you do to add evidence to the AAC field?
Evidence-Based Practice (EBP): What is it?
Background to Evidence-based Medicine (EBM)
Archie Cochrane: The Cochrane Collaboration
David Sackett: definition of EBM; the "Father of EBM"
Evidence-based medicine
“the conscientious, explicit and judicious use of current best evidence in making decisions about the care of individual patients” (Sackett et al., 1997, p. 2)
What is Evidence Based Practice?
Definition: “the integration of best research evidence with clinical expertise and patient values” (Sackett et al., 2000)
What is Evidence Based Practice in AAC?
Proposed definition (Schlosser & Raghavendra, 2004): EBP is defined as the integration of best and current research evidence with clinical/educational expertise and relevant stakeholder perspectives to facilitate decisions for assessment and intervention that are deemed effective and efficient for a given direct stakeholder.
EBP in AAC (Schlosser & Raghavendra, 2004)
What is Evidence Based Practice? – Key concepts
Integration = joining/synthesis of:
best research evidence
clinical expertise
stakeholder perspectives (patient values)
(Schlosser & Raghavendra, 2004)
Key concepts continued…
Best research evidence = data that is current, verified and replicated
High internal validity
Highest level on the hierarchy of evidence
Adequate external validity and social validity
(Schlosser & Raghavendra, 2004)
Key concepts continued…
Clinical expertise = reasoning, intuition, knowledge and skills related to clinical practice
Educational expertise = reasoning, intuition, knowledge and skills related to educational practice
(Schlosser & Raghavendra, 2004)
Key concepts continued…
Relevant stakeholder perspectives/values = viewpoints, preferences, concerns and expectations relative to the assessment or intervention
Patient/client = direct stakeholder, i.e., the direct recipient of any decision arising from the EBP process
(Schlosser & Raghavendra, 2004)
What EBP is not: Myths about EBP
Myth: EBP is impossible to implement because we do not have enough evidence. Response: "best and most current" research evidence is relative.
Myth: EBP already exists. Response: some practitioners do implement EBP, to some degree.
Myth: EBP declares research evidence the authority. Response: the definition of EBP in medicine/AAC integrates all three components.
Myth: EBP is a cost-cutting mechanism. Response: not always true.
Myth: EBP is cook-book practice. Response: EBP requires not only extensive clinical expertise but also skilful integration of all three aspects of EBP.
Myth: EBP is impossible to put in place. Response: some degree of EBP is achievable by all.
(Sackett et al., 1997; Schlosser, 2004)
What does EBP mean for an individual with complex communication needs?
Individuals with CCN (and their families) are central to EBP outcomes, promoting adoption of effective interventions and preventing adoption of ineffective ones
Active participants in decision making (ASHA, 2004)
What does EBP mean for a practitioner/educator?
Best practice: use of EBP as a framework for best practice
Promotes development of skills in finding, appraising and implementing evidence
Emphasises the need for high-level clinical & communication skills
Provides a framework for self-directed, lifelong learning
Ethical principle: to provide the best available services (assessments & interventions) to consumers & families
Practice-research gap
What does EBP mean for a service provider/ an organisation?
Best practice at the organisational level
Example questions: What do families think of using iPads/tablet technology for communication and participation? What are the most effective post-school options for students with CCN using AAC?
Use of EBP as a framework for effective and efficient services
Facilitate/support staff to implement EBP
What does EBP mean for researchers?
Excellent opportunity to bridge the research-practice gap
High-priority clinical questions in diagnostics, screening, prognosis, intervention
Conduct high-quality research
Disseminate research in ways that can be used in practice
How do you do EBP? Adapted from Schlosser & Raghavendra (2003)
1. Asking a clinically relevant & answerable question
2. Searching for the evidence
3. Critically appraising the evidence
4. Collating & synthesising the evidence
5. Implementing the evidence into practice
6. Evaluating the use of the evidence
7. Disseminating the EBP process & findings
Step 1: Ask a clinically relevant & answerable question
Broad or general questions provide background information and suit a systematic review, e.g.:
What are the potential barriers and facilitators to high-technology AAC provision and its ongoing use?
What are the attitudes toward individuals who use augmentative and alternative communication?
What is the effectiveness of using iPads/iPods with individuals with disabilities?
PICO (Richardson, Wilson, Nishikawa & Hayward, 1995)
Patient or Population/Problem
Intervention
Comparison or Control
Outcome
PICO is used:
To clarify questions related to specific clients
To develop questions for systematic reviews
To identify the information needed to answer the question
To translate the question into searchable terms
To develop and refine your search strategy
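As a sketch of how PICO components can be translated into searchable terms, the hypothetical Python structure below (the class and field names are illustrative assumptions, not part of any standard tool) joins synonyms within a component with OR and joins components with AND; the comparison element is often left out of the search statement itself:

```python
from dataclasses import dataclass

# Illustrative sketch only: a PICO question as a data structure whose
# components can be turned into a Boolean search statement.

@dataclass
class PICOQuestion:
    population: list    # synonyms describing the patient/population group
    intervention: list
    comparison: list    # recorded, but often omitted from the search itself
    outcome: list

    def to_query(self) -> str:
        """OR within a component, AND between components."""
        groups = [self.population, self.intervention, self.outcome]
        clauses = ["(" + " OR ".join(terms) + ")" for terms in groups if terms]
        return " AND ".join(clauses)

q = PICOQuestion(
    population=['"cerebral palsy"', "AAC", "CCN"],
    intervention=["peer training", "classmates"],
    comparison=[],
    outcome=["social networks", "friendships"],
)
print(q.to_query())
# ("cerebral palsy" OR AAC OR CCN) AND (peer training OR classmates) AND (social networks OR friendships)
```

The same structure could carry the PESICO elements by adding environment and stakeholder fields.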
Patient/Population/Problem
Characteristics of the population, e.g., age, gender, diagnosis, ethnic group, etc.
How would you describe your patient/population group?
Balance precision with brevity
Intervention
Defining the intervention: what intervention are you interested in (be specific)?
Therapy, prevention, diagnostic test, exposure/aetiology
Comparison or Control
What alternative or other option are you comparing your intervention/assessment to? Be specific.
You may be comparing your intervention to another intervention, or to no intervention.
Outcome
What measurable outcome(s) are you interested in? What are you hoping to achieve for the client? Be specific.
Example PICO breakdown (Element / Subject / Key words):
P: School-aged children with cerebral palsy and complex communication needs. Key words: child, children, school-aged, 12-13, primary school, transition, high school, paediatric, cerebral palsy, physical disability, spasticity, hemiplegic, CCN, AAC, severe communication/speech impairment
I: Group/peer training of high school students (workshops). Key words: peers, classmates, training, communication, support
C: Visit by student with CCN plus training
O: Increased acceptance, welcoming classroom, social networks. Key words: social networks, friends, friendships
Group Work Activity 1: Ask a clinically/service-relevant answerable question using PICO
PESICO template (Schlosser, Koul & Costello, 2007)
Person/Problem (P): the person & the problem
Environment (E): current/future environment & partners' knowledge/skills, etc.
Stakeholders (S): stakeholders' perspectives and attitudes towards the problem
Intervention (I): steps to change persons, environments, interactions, events, procedures
Comparison/Control (C): comparison between interventions, or intervention & no intervention
Outcome(s) (O): desired outcomes
An example (Schlosser, Koul & Costello, 2007):
Person/problem: In a 7-year-old boy with profound ID who exhibits self-injurious behaviour
Environment: who is currently in a self-contained classroom
Stakeholders: and whose teacher and aides suspect that his behaviour is communication based
Intervention: is it sufficient to rely on informant-based assessment methods
Comparison: or is it necessary to also conduct descriptive and experimental assessments
Outcomes: in order to identify the communicative functions that maintain his problem behaviour, in a valid and reliable manner?
Group Work Activity 2: Ask a clinically/service-relevant answerable question using PESICO
Step 2: Search for the evidence
To start, use the PICO/PESICO question components to identify the search terms
Search for the best evidence
Where to start searching depends on a number of factors:
Available time
Available databases
Subject matter and domain of the question
Currency and level of evidence required
Where do I search? Finding the evidence
Courses, proceedings, books, journals (two million journal articles are published annually in some 20,000 'biomedical' journals)
Secondary research: meta-analyses, systematic reviews
Primary research: individual research studies
Grey literature, e.g., unpublished research, theses
Indexes and databases: general, e.g., CINAHL, Medline, PubMed, ERIC; specialist, e.g., Cochrane Library, DARE, PEDro
Where do I search? Finding the evidence
Hand searches: table of contents
Ancestry search: use the reference list to identify other studies
Forward citation search: who has referred to a particular article?
Finding evidence on the Internet: general search engines, e.g., Google; specialised search engines, e.g., Google Scholar
Types of Databases
Bibliographic/general, e.g., Medline, PubMed
Specialist databases: ERIC, CINAHL, PsycINFO
Databases – Full Text/Specialised
DARE: Database of Abstracts of Reviews of Effects (UK)
Cochrane: Cochrane Database of Systematic Reviews (worldwide)
Expanded Academic ASAP
Science Direct
SCOPUS
Strategies for searching databases
Plan the search for efficiency
Create a search plan
Select and access the right databases
Develop a search statement
Limit, refine and evaluate
Locate the source publication
Strategies for Searching
1) Look for pre-filtered evidence (e.g., EBM reviews, systematic reviews, practice guidelines)
2) Look for reviews before individual studies
3) Look for peer-reviewed before non-peer-reviewed sources
(Schlosser, Wendt, Angermeier & Shetty, 2005)
Create a Search Statement
Search commands: Boolean operators/connectors (OR, AND, NOT); example: labour NOT pregnancy
Truncation: symbols used to substitute for characters at the end of a word, e.g., Ovid uses "$"; example: child$ will find children, childlike
Wildcards: symbols used to substitute for a letter within a word, e.g., Ovid uses "#"; example: wom#n will find woman and women
Check the Help screen for each database
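The operators above can be illustrated by composing an Ovid-style search statement as a string. This is a minimal sketch, and the helper function names are illustrative assumptions; in practice the database itself interprets these symbols:

```python
# Sketch: building an Ovid-style search statement. The database, not our
# code, does the matching; here we only assemble the query string.

def ovid_truncate(stem: str) -> str:
    """Truncation: child$ matches children, childlike, ... (Ovid uses '$')."""
    return stem + "$"

def ovid_wildcard(pattern: str) -> str:
    """Wildcard: wom#n matches woman and women (Ovid uses '#');
    we write '?' in our source and emit Ovid's '#'."""
    return pattern.replace("?", "#")

query = f'({ovid_truncate("child")} OR {ovid_wildcard("wom?n")}) NOT pregnancy'
print(query)  # (child$ OR wom#n) NOT pregnancy
```

Note that truncation and wildcard symbols differ between databases, which is why the slide advises checking each database's Help screen.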
Keyword Searching
Keywords: important words that represent a topic; you have no control over how a word is used in the document
Quotation marks are useful, for example "intellectual disability"
Fields: e.g., author, title, abstract, journal
Limits: by date, by language, to full text, to abstracts, to systematic reviews
Hierarchy of Evidence
University of Illinois at Chicago, http://gollum.lib.uic.edu/applied_health/
Proposed hierarchy of evidence to inform intervention development & selection: participants with disabilities (Schlosser & Raghavendra, 2004)
1. Meta-analysis of a) RCTs*, b) SSEDs, c) quasi-experimental group designs
2a. One well-designed non-RCT; 2b. One SSED with one intervention; 2c. One SSED with multiple interventions...
3. Quantitative reviews that are non-meta-analytic
4. Narrative reviews
5. Pre-experimental group designs & single case studies
6. Respectable opinion
(EACD, Raghavendra et al., 2013)
Levels of Evidence
Based on the idea that different grades of evidence (study designs) vary in their ability to predict the effectiveness of health practices
Reducing biases: sample, measurement/detection, intervention/performance
Higher grades of evidence are more likely to reliably predict outcomes than lower grades
A system for making sure that you are aware of the strengths and weaknesses of different study types
Several evidence grading scales, e.g., Sackett's Hierarchy of Evidence, NHMRC, Cochrane
What are Systematic Reviews?
A synthesis of original research; pre-filtered evidence
A SR aims to synthesize the results of multiple original studies by using strategies to reduce bias (Cook et al., 1997; Schlosser, 2003, cited in Schlosser et al., 2005)
Systematic Review (adapted from the Cochrane Database of Systematic Reviews & the Centre for Reviews & Dissemination)
Transparent process to facilitate replication
Pre-defined, explicit methodology; a strict protocol to include as much relevant research as possible
Original studies appraised and synthesised in a valid way
Minimises risk of bias
Systematic Review
A meta-analysis is a mathematical synthesis of two or more primary studies that addressed the same hypothesis in the same way. (Greenhalgh, 1997, BMJ, 315)
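The "mathematical synthesis" in a meta-analysis can be illustrated with the core arithmetic of fixed-effect, inverse-variance pooling: each study's effect size is weighted by the inverse of its variance, so larger, more precise studies count for more. This is a minimal sketch with made-up numbers, not data from any study cited here:

```python
import math

# Sketch of fixed-effect (inverse-variance) pooling. Effect sizes and
# variances below are invented for illustration only.

def fixed_effect_pool(effects, variances):
    """Return the inverse-variance pooled effect and its 95% CI."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))  # standard error of the pooled effect
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

effects = [0.40, 0.55, 0.30]     # e.g. standardised mean differences
variances = [0.04, 0.09, 0.02]   # smaller variance = more precise study
pooled, ci = fixed_effect_pool(effects, variances)
print(f"pooled effect = {pooled:.2f}, 95% CI {ci[0]:.2f} to {ci[1]:.2f}")
```

Real meta-analyses also test for heterogeneity and may use random-effects models; this sketch shows only the weighting idea.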
Where to find systematic reviews
N-CEP's Compendium of Guidelines and Systematic Reviews (ASHA)
Cochrane Collaboration
Campbell Collaboration
What Works Clearinghouse (US Department of Education)
PsycBITE: Psychological Database for Brain Impairment Treatment Efficacy
National Electronic Library for Health (National Health Service of the UK)
Evidence-based Communication Assessment and Intervention (EBCAI) journal
speechBITE
Step 3: Critically appraising the evidence
What is the evidence telling me? Validity (truth) and usefulness (clinical relevance)
What is Critical Appraisal?
"Appraisal is a technique which offers a discipline for increasing the effectiveness of your reading, by enabling you to quickly exclude papers that are of too poor a quality to inform practice, and to systematically evaluate those that pass muster to extract their salient points." Adapted from Miser, W. F. (1999). Critical appraisal of the literature. Journal of the American Board of Family Practice, 12.
Are the findings applicable in my setting?
Is the quality of the study good enough to use the results?
What do the results mean for my clients?
Difference between reading for content vs. reading for critical appraisal
Abstract
Introduction (background, literature review), aims of the study
Methodology
Results
Discussion
Type of Study
The next step is to work out what study design will best answer your question
Levels of evidence reflect the methodological rigour of the study
Type of Question
Different types of questions are best answered by different types of studies. Your question may be about:
- Intervention or therapy
- Diagnosis/screening
- Prognosis
- Aetiology or risk factors
Questions -> Research Designs
Therapy/intervention effectiveness -> Experimental
Test an association between variables -> Observational (cohort/case-control)
Descriptive information about a relationship in one participant -> Case study
Explore what factors influenced outcomes at one point in time -> Cross-sectional
Describe experiences -> Qualitative
What do you need to look for in studies?
Why was the study done?
What type of study design was used?
What are the study characteristics (PICO/PESICO)?
Reliability: test-retest, intra-rater & inter-rater
Validity (internal validity: what biases exist? participant selection, comparable groups at baseline, blinding, follow-up, drop-outs, outcomes, procedural reliability/treatment integrity)
What are the results (size and precision of the effect)?
External validity: are the results relevant in my clinical situation?
Social validity
Critical Appraisal Tools: Systematic Reviews
EVIDAAC Systematic Review Scale (Schlosser, Raghavendra, Sigafoos, Eysenbach, Blackstone, & Dowden, 2008)
Protocol
Source selection bias
Trial selection, criteria for pooling
Study quality
Data extraction
Statistical analysis
Clinical impact
Appraisal of Systematic Review Paper
Group Work Activity 3
Baxter, S., Enderby, P., Evans, P., & Judge, S. (2012). Barriers and facilitators to the use of high-technology augmentative and alternative communication devices: a systematic review and qualitative synthesis. International Journal of Language & Communication Disorders, 47(2), 115-129.
Critical Appraisal Tools: Randomised Controlled Trials (RCTs & non-RCTs)
PEDro scale (Moseley et al., 1999; Maher et al., 2003), Physiotherapy Evidence Database
Rates RCTs and non-RCTs; not for SRs, case series, or SSEDs
11-item scale, each item scored 1 or 0 (based on information found in the paper)
The 1st criterion addresses external validity; the remaining criteria address internal validity
Maximum 10, minimum 0 (RCT = 10, non-RCT = 8)
Rated for methodological quality (sources of bias); a high rating does not reflect relevance to practice
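The PEDro scoring rule described above (11 yes/no items, with the first, external-validity item excluded from the 0-10 total) can be sketched as a simple tally. The abbreviated item names below are paraphrases for illustration, not the scale's official wording:

```python
# Sketch of how a PEDro total is tallied. Item wording is abbreviated;
# consult the published scale for the actual criteria.

PEDRO_ITEMS = [
    "eligibility criteria specified",  # item 1: external validity, NOT scored
    "random allocation",
    "concealed allocation",
    "baseline comparability",
    "blinded subjects",
    "blinded therapists",
    "blinded assessors",
    "adequate follow-up",
    "intention-to-treat analysis",
    "between-group comparisons",
    "point estimates and variability",
]

def pedro_score(answers):
    """answers: 1/0 per item in PEDRO_ITEMS order; item 1 is excluded."""
    assert len(answers) == len(PEDRO_ITEMS)
    return sum(answers[1:])

# A hypothetical non-randomised trial: it fails the two allocation items
# by design, plus two blinding items, so it satisfies 6 of the 10 scored items.
print(pedro_score([1, 0, 0, 1, 0, 0, 1, 1, 1, 1, 1]))  # 6
```

This also shows why a non-RCT tops out at 8: random and concealed allocation can never be satisfied.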
Critical Appraisal Tools
McMaster forms, developed by Law et al. (1998) and Letts et al.
Quantitative Review Form: Was the sample described in detail? Were the outcome measures reliable? Was the intervention described in detail? Were the analysis methods appropriate?
Qualitative Review Form (ver. 2.0, Letts et al., 2007): Was sampling done until redundancy in data was reached? Was the process of transforming data into themes described adequately? Was member checking used to verify findings?
Appraisal of a Qualitative Study
Group Work Activity 4
Sundqvist, A., & Rönnberg, J. (2010). A qualitative analysis of interactions of children who use augmentative and alternative communication. Augmentative and Alternative Communication, 26(4), 255-266.
References
Auperin, A., Pignon, J.-P., & Poynard, T. (1997). Review article: critical review of meta-analyses of randomized clinical trials in hepatogastroenterology. Alimentary Pharmacology & Therapeutics, 11, 215.
Maher, C. G., Sherrington, C., Herbert, R. D., Moseley, A. M., & Elkins, M. (2003). Reliability of the PEDro scale for rating methodological quality of randomised controlled trials. Physical Therapy, 83.
Moseley, A. M., Maher, C., Herbert, R. D., & Sherrington, C. (1999). Reliability of a scale for measuring the methodological quality of clinical trials. Proceedings of the VIIth Cochrane Colloquium, Rome, Italy: Cochrane Centre, p. 39.
Richardson, W., Wilson, M., Nishikawa, J., & Hayward, R. (1995). The well-built question: a key to evidence-based decisions. ACP Journal Club, 123, A12-A13.
Schlosser, R. W., & Raghavendra, P. (2004). Evidence-based practice in augmentative and alternative communication. Augmentative and Alternative Communication, 20(1).
Schlosser, R., Wendt, O., Angermeier, K., & Shetty, M. (2005). Searching for evidence in augmentative and alternative communication: navigating a scattered literature. Augmentative & Alternative Communication, 21.
Schlosser, R., & O'Neil-Pirozzi, T. (2006). Problem formulation in EBP and systematic reviews. Contemporary Issues in Communication Science and Disorders, 33, 5-10.
Schlosser, R., Koul, R., & Costello, J. (2007). Asking well-built questions for evidence-based practice in augmentative and alternative communication. Journal of Communication Disorders, 40(3).
Online resources: DARE (www.york.ac.uk); Cochrane Collaboration.