Evidence-based Education: Can We Get There From Here?


1 Evidence-based Education: Can We Get There From Here?
Ronnie Detrich, Wing Institute. Association for Behavior Analysis International Evidence-based Education Conference, September 6, 2008.

2 Why Do We Need Evidence-based Education?
From a university in the U.S.

3 Acknowledgements
Randy Keyworth, Jack States, Tom Critchfield, Tim Slocum, Mark Shriver, Teri Lewis-Palmer, Karen Hager, Janet Twyman, Hill Walker, Susan Wilczynski

4 Why Evidence-based Education?
Federal policy emphasizes scientifically based instruction.
No Child Left Behind: over 100 references to scientifically based instruction.
Individuals with Disabilities Education Improvement Act: pre-service and professional development should prepare educators to implement scientifically based instructional practices.

5 Why Evidence-based Education?
Professional organizations began validating interventions as evidence-based:
Mid-1990s: Society for the Study of School Psychology; American Psychological Association.
More recently: What Works Clearinghouse (Institute of Education Sciences); Campbell Collaboration; Coalition for Evidence-based Policy; National Autism Center.

6 Why Evidence-based Education?
Most professional organizations have ethical guidelines emphasizing that services be based on scientific knowledge.
American Psychological Association: Psychologists’ work is based on the established scientific and professional knowledge of the discipline.
National Association of School Psychologists: … direct and indirect service methods that the profession considers to be responsible, research-based practice.
Behavior Analyst Certification Board: The behavior analyst always has the responsibility to recommend scientifically supported, most effective treatment procedures.

7 What is Evidence-based Practice?
At its core, the EBP movement is a consumer protection movement. It is not about science per se; it is a policy to use science for the benefit of consumers. “The ultimate goal of the ‘evidence-based movement’ is to make better use of research findings in typical service settings, to benefit consumers and society….” (Fixsen, 2008)

8 What is Evidence-based Practice?
Evidence-based practice has its roots in medicine. The movement has spread across major human-services disciplines: psychology, school psychology, social work, speech pathology, and occupational therapy.

9 What Is Evidence-based Practice?
EBP integrates three components (Sackett et al., 2000): best available evidence, professional judgment, and client values. EBP is a decision-making approach that places emphasis on evidence to: guide decisions about which interventions to use; evaluate the effects of an intervention.

10 What is Evidence-based Education?
The term “evidence-based” has become ubiquitous in the last decade, but there is no consensus about what it means. At issue is what counts as evidence. The federal definition emphasizes experimental methods, with a preference for randomized trials; it has been criticized as positivistic.

11 What Counts as Evidence?
Ultimately, this depends on the question being asked. Even behavior analysis allows for qualitative evidence (social validity measures). In EBP the goal is to identify causal relations between interventions and outcomes. Experimental methods do this best.

12 What Counts as Evidence?
Even if we accept causal demonstrations as evidence, there is no consensus. Randomized Clinical Trials (RCTs) have become the “gold standard.” There is controversy about the status of single-subject designs, which are most frequently criticized on the basis of external validity.

13 How Are Evidence-based Interventions Identified?
Identification is more than finding a study to support an intervention. Identification involves distilling a body of knowledge to determine the strength of evidence.

14 How Are Evidence-based Interventions Identified?
Distillation requires standards of evidence for reviewing the literature. Standards specify the quantity of evidence and the quality of evidence.

15 Continua of Evidence (Janet Twyman, 2007)
Figure: a threshold of evidence plotted against the quantity of the evidence and the quality of the evidence. Labels include: personal observation; expert opinion; general consensus; single study; uncontrolled studies; repeated systematic measures; various investigations; convergent evidence; single-case designs; single-case replication (direct and parametric); well-conducted clinical studies; semi-randomized trials; meta-analysis (systematic review); high-quality randomized controlled trial (the current “gold standard”).
Notes: The literature bearing on a certain clinical procedure is inventoried for the relevance of findings, the quality of findings, the number of findings, and the consistency of findings for establishing a clear and singular linkage between a certain clinical outcome and a certain clinical procedure applied to members of a certain clinical population. The terms "levels of evidence" or "strength of evidence" refer to systems for classifying the evidence in a body of literature through a hierarchy of scientific rigor and quality. Several dozen of these hierarchies exist (Agency for Healthcare Research and Quality [AHRQ], 2002b). Some systems comprise three levels and others eight or more. The gradations in some hierarchies are based on randomization and experimental controls; the organizing focus for others may center on magnitude of effect sizes, confidence intervals, number of results, consistency of results, sample size, or Type I and Type II error rates (AHRQ, 2002a).

16 How Are Evidence-based Interventions Identified?
Two approaches to validating interventions:
Threshold approach: evidence must be of a specific quantity and quality before an intervention is considered evidence-based.
Hierarchy of evidence approach: strength of evidence falls along a continuum, with each level having differential standards.
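To make the contrast concrete, here is a minimal sketch in Python. The level names, cut-offs, and review fields are invented for illustration and are not the standards of any actual review body; the point is only that the threshold approach returns a yes/no verdict while the hierarchy approach returns a graded level.

```python
from dataclasses import dataclass

@dataclass
class ReviewFindings:
    """Summary of a literature review for one intervention (illustrative fields only)."""
    high_quality_rcts: int      # well-controlled group studies meeting quality criteria
    quality_single_case: int    # well-controlled single-case studies meeting criteria
    uncontrolled_studies: int   # studies without experimental control

def threshold_verdict(f: ReviewFindings) -> bool:
    """Threshold approach: evidence-based only if a fixed quantity/quality bar is met.
    The bar here (2 RCTs or 5 single-case studies) is invented for illustration."""
    return f.high_quality_rcts >= 2 or f.quality_single_case >= 5

def hierarchy_level(f: ReviewFindings) -> str:
    """Hierarchy approach: every intervention lands somewhere on a graded continuum."""
    if f.high_quality_rcts >= 2:
        return "strong evidence"
    if f.high_quality_rcts == 1 or f.quality_single_case >= 3:
        return "promising evidence"
    if f.quality_single_case > 0 or f.uncontrolled_studies > 0:
        return "emerging evidence"
    return "insufficient evidence"

findings = ReviewFindings(high_quality_rcts=1, quality_single_case=4, uncontrolled_studies=6)
print(threshold_verdict(findings))   # False: fails the fixed bar
print(hierarchy_level(findings))     # "promising evidence": still informative to a consumer
```

Under these made-up rules, the same body of evidence produces a flat "no" from the threshold rule and a usable gradation from the hierarchy rule, which is exactly why the two approaches can disagree about whether an intervention is evidence-based.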

17 How Are Evidence-based Interventions Identified?
There are no agreed-upon standards. It is possible for an intervention to be evidence-based under one set of standards and to fail to meet evidence standards under an alternative set. This makes it difficult for consumers and decision makers to sort out competing claims about what is evidence-based.

18

19 Evidence-based Intervention

20 Actual Effectiveness vs. Assessed Effectiveness
Actually effective, assessed as effective: true positive.
Actually effective, assessed as ineffective: false negative (most likely with the threshold approach).
Actually ineffective, assessed as effective: false positive (most likely with the hierarchy approach).
Actually ineffective, assessed as ineffective: true negative.

21 Choosing Between False Positives and False Negatives
At this stage, it is better to have more false positives than false negatives. False negatives: effective interventions will not be selected for implementation, so we are less likely to learn that they are actually effective. False positives: progress monitoring will identify interventions that are not effective.
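The trade-off can be illustrated with a small simulation. Everything here (the 30% base rate of truly effective programs, the effect sizes, the noise, and the two decision rules) is invented for illustration and is not specified anywhere in the presentation; the sketch only shows the direction of the error asymmetry.

```python
import random

random.seed(1)

def simulate(n_interventions=1000, studies_per_intervention=4):
    """Count classification errors under a strict vs. a lenient evidence standard.
    Effect sizes, noise, and decision rules are invented for illustration."""
    strict = {"false_pos": 0, "false_neg": 0}
    lenient = {"false_pos": 0, "false_neg": 0}
    for _ in range(n_interventions):
        truly_effective = random.random() < 0.3          # 30% of programs actually work
        true_effect = 0.5 if truly_effective else 0.0    # standardized effect size
        # Each study estimates the effect with noise; a study "supports" the program if d > 0.2.
        supporting = sum(
            1 for _ in range(studies_per_intervention)
            if random.gauss(true_effect, 0.3) > 0.2
        )
        strict_verdict = supporting == studies_per_intervention  # every study must support it
        lenient_verdict = supporting >= 1                         # any supporting study will do
        for verdict, counts in ((strict_verdict, strict), (lenient_verdict, lenient)):
            if verdict and not truly_effective:
                counts["false_pos"] += 1
            if not verdict and truly_effective:
                counts["false_neg"] += 1
    return strict, lenient

strict, lenient = simulate()
print("strict standard :", strict)    # few false positives, many false negatives
print("lenient standard:", lenient)   # few false negatives, many false positives
```

Running the sketch shows the strict rule missing roughly half of the truly effective programs while endorsing almost none of the ineffective ones, and the lenient rule doing the reverse; that is the asymmetry the slide argues progress monitoring can absorb.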

22 Why Do We Need Evidence-based Education?
Kazdin (2000) identified 550 named interventions for children and adolescents. A very small number of these interventions have been empirically evaluated, and of those, the large majority are behavioral or cognitive-behavioral. Evidence-based interventions are less likely to be used than interventions for which there is no evidence or for which the evidence indicates a lack of impact.

23 Evidence-based Education Roadmap
Research to Practice: Research → Replicability → Practice.

24 Efficacy Research (What Works?)
Efficacy research asks: what works? The primary concern is demonstrations of causal relations, with rigorous experimental control so that threats to internal validity are minimized. Findings are not always easy to translate immediately to practice.

25 Behavior Analysis and Efficacy
Behavior analysis's emphasis on rigorous experimental control has resulted in many important contributions to education: systematic, explicit teaching methods and widespread use of reinforcement systems.

26 Evidence-based Education Roadmap
Research to Practice: Research → Replicability → Practice.

27 Effectiveness Research (When Does it Work?)
Effectiveness research asks: when does it work? It evaluates the robustness of an intervention when “taken to scale” and implemented in more typical practice settings, answering questions related to external validity or generalizability of effects. Effect sizes are typically smaller than in efficacy studies. Efficacy and effectiveness fall on a continuum.
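Because "smaller effect size" rests on a specific statistic, a short sketch of the usual computation may help. Cohen's d (standardized mean difference with a pooled standard deviation) is the assumed metric, and the score lists are invented solely to contrast a tightly controlled efficacy trial with a scaled-up effectiveness trial.

```python
from statistics import mean, stdev

def cohens_d(treatment, control):
    """Standardized mean difference between two independent groups (pooled SD)."""
    n1, n2 = len(treatment), len(control)
    s1, s2 = stdev(treatment), stdev(control)
    pooled_sd = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (mean(treatment) - mean(control)) / pooled_sd

# Hypothetical reading scores: an efficacy trial under tight control vs. a scaled-up rollout.
efficacy_treatment      = [88, 92, 85, 90, 95, 89]
efficacy_control        = [78, 80, 75, 82, 79, 77]
effectiveness_treatment = [84, 80, 86, 79, 88, 82]
effectiveness_control   = [80, 78, 83, 76, 85, 79]

print(round(cohens_d(efficacy_treatment, efficacy_control), 2))            # larger d under tight control
print(round(cohens_d(effectiveness_treatment, effectiveness_control), 2))  # smaller d at scale
```

With these made-up numbers the efficacy contrast produces a much larger d than the scaled-up one, mirroring the pattern the slide describes.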

28 Behavior Analysis and Effectiveness Research
Behavior analysis has not generally concerned itself with external validity questions. It emphasizes the generality of behavioral principles, which has not resulted in the type of research that answers the “actuarial” questions asked by effectiveness research: What percent of a population of students will benefit from a specific program? Which students will benefit?

29 Research to Practice Issues
The lag time from efficacy research to effectiveness research to dissemination is measured in years (Hoagwood, Burns, & Weisz, 2002). Only 4 of 10 Blueprint Violence Prevention programs had the capacity to disseminate to 10+ sites in a year (Elliott & Mihalic, 2004).

30 Good Behavior Game: Efficacy
First efficacy study: a fourth-grade classroom (Barrish, Saunders, & Wolf, 1969). Subsequent replications across: settings (the Sudan, a library, a sheltered workshop); students (general education, special education, 2nd grade, 5th grade, 6th grade, adults with developmental disabilities); behaviors (on-task, off-task, disruptive, work productivity). All efficacy studies were single-subject designs.

31 Good Behavior Game: Effectiveness
A series of effectiveness studies by Kellam et al. examined the Good Behavior Game as a prevention program (special issue of Drug and Alcohol Dependence, 2008). Exposure to the GBG in 1st and 2nd grade reduced young adults' risk of: drug/alcohol abuse; smoking; antisocial personality disorder; subsequent use of school-based services; suicidal ideation and attempts. All studies were RCTs.

32 Good Behavior Game: Validation
The Coalition for Evidence-based Policy reviewed the literature for the Good Behavior Game and determined it was evidence-based. The review included only studies that were RCTs; all single-subject research was ignored.

33 A Consumer Perspective: One Year Follow-up
“…you should give them more good behavior game. Keep on doing what’s good.”

34 Evidence-based Education Roadmap
Research to Practice: Research → Replicability → Sustainability → Practice.

35 Implementation (How do we make it work?)
Implementation asks: how do we make it work? “Identifying evidence-based interventions is one thing, implementing them is an entirely different thing.” (Dean Fixsen, 2008) The primary challenge is how to place an intervention within a specific context. Until implementation questions are answered, the ultimate promise of evidence-based education will go unfulfilled.

36 Implementation is Fundamental
80% of initiatives ended within 2 years; 90% ended within 4 years (data from the Center for Comprehensive School Reform).

37 Behavior Analysis and Implementation
Service delivery in behavior analysis is a mediated model, which requires behavior analysts to address many implementation issues on each project. We have not systematically attended to many of these issues, especially at large scale: What organizational features are necessary to support evidence-based interventions? How do we modify an intervention so it fits local contingencies without diminishing effectiveness?

38 Evidence-based Education Roadmap
Research to Practice: Research → Replicability → Sustainability → Practice.

39 Progress Monitoring (Is it Working?)
Progress monitoring asks: is it working? Research guides us to interventions that are most likely to work, but generalizing from a research base to a specific instance requires a leap of faith, with confidence < 1.0. Progress monitoring assures that an intervention is actually effective in a given setting (practice-based evidence).
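One common way to operationalize "is it working?" is to chart student data against a goal (aim) line and apply a simple decision rule. The sketch below is only illustrative: the four-point rule, the baseline and goal values, and the weekly scores are assumptions, not anything prescribed in the presentation.

```python
def aim_line(baseline, goal, weeks):
    """Expected score each week if the student moves linearly from baseline to the goal."""
    return [baseline + (goal - baseline) * week / (weeks - 1) for week in range(weeks)]

def decision(scores, expected, k=4):
    """Illustrative rule: if the last k data points all fall below the aim line,
    flag the intervention for change; if all fall above, consider raising the goal."""
    recent = list(zip(scores[-k:], expected[len(scores) - k:len(scores)]))
    if all(actual < target for actual, target in recent):
        return "below aim line: modify the intervention"
    if all(actual > target for actual, target in recent):
        return "above aim line: consider a more ambitious goal"
    return "on track: continue and keep monitoring"

weeks = 10
expected = aim_line(baseline=20, goal=60, weeks=weeks)   # e.g., words read correctly per minute
observed = [20, 23, 24, 26, 27, 28, 29, 30]              # eight weeks of hypothetical data
print(decision(observed, expected))                      # below aim line: modify the intervention
```

With these hypothetical data the last four observations all fall below the aim line, so the rule flags the intervention for modification rather than waiting for an end-of-year evaluation.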

40 Behavior Analysis and Progress Monitoring
Progress monitoring is the sine qua non of applied behavior analysis. It is not applied behavior analysis if data are not collected and reviewed. Behavior analysis has made enormous contributions to the direct measurement of behavior. Represents the best example of practice-based evidence about evidence-based practices.

41 Evidence-based Education Roadmap
Research to Practice: Research → Replicability → Sustainability → Practice.

42 Similarities and Differences Between Behavior Analysis and Evidence-based Practice
Evidence-based practice: unit of analysis is populations; evidence is derived from systematic reviews; the practitioner must know how to implement effectively.
Behavior analysis: unit of analysis is the individual; evidence is derived from experiments; the practitioner must know the laws of behavior and how to apply them.
Shared: data-based decision making; the assumption that science produces the best outcomes for consumers.

43 A Prevention Model for Evidence-based Education
The three-tier model applies to both academic systems and behavioral systems:
Universal interventions (80-90% of students): all settings, all students; preventive, proactive.
Targeted group interventions (5-10%): some students (at-risk); high efficiency; rapid response.
Intensive, individual interventions (1-5%): individual students; assessment-based; high intensity; intense, durable procedures.

44 Can We Get There From Here?
Behavior analysis has a great deal to contribute to the discussion about the most effective educational interventions, but the current emphasis on RCTs puts behavior analysis in a difficult position. If we are to have maximum impact on the field of education, then we must change our behavior. “If you are not at the table, then you are on the menu.” (Cathy Watkins, 2008)

45 Can We Get There From Here?
We should begin to conduct RCTs. If we have robust interventions, they will fare well in RCTs. RCTs are well suited to answer the actuarial questions that decision makers care about: “How big a bang will I get for my buck?”

46 Can We Get There From Here?
Sidman, The Behavior Analyst, 2006: “To make the general contributions of which our science is capable, behavior analysts will have to use methods of wider generality, in the sense they affect many people at the same time, or within a short time, without our being concerned about any particular members of the relevant population.”

47 Can We Get There? We should not abandon rigorous single-subject research. We should expand our repertoire to include other methods that answer different types of questions, and engage in a social influence process to assure that single-subject designs (SSDs) are included in evidence standards. This is especially critical in the special education context.

