
Evidence-based Education: It Isn’t as Simple as You Might Think Ronnie Detrich Randy Keyworth Jack States Wing Institute.


1 Evidence-based Education: It Isn’t as Simple as You Might Think Ronnie Detrich Randy Keyworth Jack States Wing Institute

2 Everybody’s Talking In the last few years there has been a great deal of discussion about evidence-based education.  The term has become nearly ubiquitous. Sources of influence:  No Child Left Behind (NCLB).  Individuals with Disabilities Education Improvement Act (IDEIA).  Several initiatives within APA Divisions 12, 16, and 53.

3 What Did They Say? NCLB requires that interventions used to improve educational performance be based on scientific research.  NCLB contains over 100 references to scientific research. IDEIA (2004) requires that interventions be scientifically based instructional practices.

4 Special Education and Evidence-based Education Pre-service and professional development for all who work with students with disabilities to ensure such personnel have the skills and knowledge necessary to improve the academic achievement and functional performance of children with disabilities, including the use of scientifically based instructional practices, to the maximum extent possible.

5 Special Education and Evidence-based Education Scientifically based early reading programs, positive behavioral interventions and supports, and early intervention services to reduce the need to label children as disabled in order to address the learning and behavioral needs of such children.

6 Special Education and Evidence-based Education The Individualized Education Program (IEP) shall include a statement of the special education and related services and supplementary aids and services, based on peer-reviewed research to the extent practicable, to be provided to the child, or on behalf of the child, and a statement of the program modifications or supports for school personnel that will be provided for the child.

7 Ethical Conduct and Evidence-based Interventions Most national psychological and educational organizations have ethical standards requiring science-based practices to address problems.  American Psychological Association Ethical Standard 2.04: Psychologists’ work is based on the established scientific and professional knowledge of the discipline.

8 Ethical Conduct and Evidence-based Education National Association of School Psychologists  Standard III F 4. School psychology faculty members and clinical or field supervisors uphold recognized standards of the profession by providing training related to high quality, responsible, and research-based school psychology services.

9 Ethical Conduct and Evidence-based Education National Association of School Psychologists  Standard IV 4. School psychologists use assessment techniques, counseling and therapy procedures, consultation techniques, and other direct and indirect service methods that the profession considers to be responsible, research-based practice.

10 Ethical Conduct and Evidence-based Education Behavior Analysis Certification Board  Standard 2.09a The behavior analyst always has the responsibility to recommend scientifically supported, most effective treatment procedures. Effective treatment procedures have been validated as having both long-term and short-term benefits to clients and society.  Standard 2.09b Clients have a right to effective treatment (i.e., based on the research literature and adapted to the individual client).

11 Scope of the Problem Kazdin (2000) identified 550 named interventions for children and adolescents.  Only a very small number of these interventions have been empirically evaluated, and the large majority of those that have are behavioral or cognitive-behavioral.  Evidence-based interventions are less likely to be used than interventions for which there is no evidence or for which there is evidence of no impact (Kazdin, 2000).  In many instances practitioners are not aware of evidence-based interventions (Kratochwill, Albers, & Shernoff, 2004).

12 Scope of the Problem The situation may be worse for autism interventions.  Long (2006) reported a Google® search on autism cure: Feb., ,000 hits; Feb., ,290,000 hits.  Science does not move fast enough to evaluate all of these new interventions.  Some of these interventions may be helpful, some may have no impact, and some may be harmful. We do not know, and we cannot afford to implement interventions with unknown effects.

13 What is Evidence-based Practice? Sackett et al. (2000) defined evidence-based practice in medicine as “the integration of best research evidence with clinical expertise and patient values.” EBP is a decision-making approach that places emphasis on evidence to:  guide decisions about which interventions to use.  evaluate the effects of an intervention.

14 What is Evidence-based Practice? Ultimately, EBP is a consumer protection issue.  It assumes that evidence-based interventions are more likely to be effective than interventions that are not evidence-based.  Validating interventions as evidence-based implies that there are standards for reviewing interventions, and those standards should be transparent. EBP is more than identifying evidence-based interventions.

15 Evidence-based Practice

16 Identifying Evidence-based Interventions There are controversies in identifying evidence-based interventions.  There is no consensus about what constitutes evidence. NCLB permits both quantitative and qualitative evidence without specifying the types of questions each approach best answers. In this context, we are most often concerned with evidence that establishes a causal relation between an intervention and a class of social or academic behaviors.

17 Identifying Evidence-based Interventions Even if we accept causal demonstrations as evidence, there is no consensus on acceptable research designs.  Randomized clinical trials have become the “gold standard.”  The status of single-participant designs remains controversial; they are most frequently criticized on the basis of external validity.

18 Identifying Evidence-based Interventions Identification is more than finding a study to support an intervention.  Identification involves an evaluation of the body of knowledge about an intervention.  Standards of evidence specify the quantity and the quality of evidence necessary to validate an intervention as evidence-based.  There are no agreed-upon standards and no single resource for decision makers, which may allow criteria other than evidence to influence decisions.

19 Identifying Evidence-based Interventions Two approaches to establishing standards:  Threshold approach: evidence must be of a specific quantity and quality before an intervention is considered evidence-based.  Hierarchy of evidence (best available evidence) approach: the strength of evidence falls along a continuum, with each level having different standards.

20 Identifying Evidence-based Interventions Several organizations have established standards, but there is limited agreement among them.  What Works Clearinghouse  National Autism Center (proposed)  Society for Prevention Research  Task Force on Evidence-based Interventions in School Psychology  Council for Exceptional Children It is possible for an intervention to meet one standard but not another.

21 Identification of Evidence-based Interventions Types of evidence:  Efficacy trials: the intervention is demonstrated to be effective when implemented under optimum conditions; primarily designed to demonstrate the impact of an independent variable.  Effectiveness trials: interventions are implemented under more typical practice conditions; intervention effects are often compromised when taken to scale.

22 Implementing Evidence-based Interventions Where Good Interventions Go to Die Identification of evidence-based interventions is necessary but not sufficient to assure that interventions will be effective. Non-science-based issues influence the selection of interventions:  Personal experience.  Expert opinion.  Cost: less effective but cheaper interventions may be adopted.  Effort: contextual fit, complexity of the intervention, and training.

23 Implementing Evidence-based Interventions Contextual Fit Logically, it would seem to make sense to always implement the intervention that produces the greatest impact.  There may be exceptions: if a high-impact intervention requires great resources and specialized training and is very different from current practices, it may not be implemented with integrity. It may be better to implement an effective but lower-impact intervention that is a better contextual fit and will be implemented with greater integrity.

24 Implementing Evidence-based Interventions Contextual Fit: Adoption or Accommodation  Adoption: implementing the intervention as it was evaluated to be effective; assures an evidence-based intervention is being implemented.  Accommodation: adjusting the intervention to meet local circumstances; may result in the intervention no longer being evidence-based, but may increase implementation with integrity.

25 Implementing Evidence-based Interventions Complexity of Intervention The level of precision may increase complexity.  Be as precise as necessary but no more. Examples: Catch ’em Being Good, the Good Behavior Game, an individualized intervention plan.

26 Implementing Evidence-based Interventions Training Implementing a new intervention requires training, which presents significant financial and logistical challenges: How is everyone to be trained? Who will provide the training? How will acquisition be assured? How will maintenance be assured? How will new implementers be trained after the initial training?

27 Implementing Evidence-based Interventions Dissemination The issue for dissemination is getting information into the hands of practice-based decision makers.  Journals are a poor means of dissemination: they are not typically read by decision makers, decision makers are not necessarily qualified to interpret results, and articles often lack sufficient detail for replication in a practice setting.  Clearinghouses are in their infancy and often do not have the information a decision maker is seeking.

28 Implementing Evidence-based Interventions Often the evidence for a particular set of problems is inadequate for identifying an evidence-based intervention.  What then becomes the basis for decision making? An alternative to evidence-based:  Evidence-based: meets well-defined standards.  Empirically supported: based on scientific principles of behavior.  When no evidence-based intervention exists, build the intervention exclusively from principles of the science of human behavior; this is not as strong a basis as evidence-based.

29 Implementing Evidence-based Interventions Sustainability If an intervention is effective, it is desirable to assure the sustainability of the program.  Defining features of sustainable programs: maintains over time, maintains across generations of practitioners, and is supported with the existing resources of the system.  There is an emerging science of sustainability. The larger the scale of implementation, the greater the degree of complexity, and the science of large-scale implementation is not well established.

30 Evaluating Evidence-based Interventions Progress Monitoring Implementation of an evidence-based intervention does not assure success.  It is necessary to evaluate impact in the local context: no intervention will be effective for all students, and we cannot predict who will benefit.  Progress monitoring is practice-based evidence about evidence-based practices.  It is consistent with legal requirements and ethical standards.

31 Ethical Standards and Progress Monitoring National Association of School Psychologists  Standard IV C 1b. Decision-making related to assessment and subsequent interventions is primarily data-based.  Standard IV 6. School psychologists develop interventions that are appropriate to the presenting problems and are consistent with the data collected. They modify or terminate the treatment plan when the data indicate the plan is not achieving the desired goals.

32 Ethical Standards and Progress Monitoring Behavior Analysis Certification Board  Standard 4.04 The behavior analyst collects data or asks the client, client-surrogate, or designated other to collect data needed to assess progress within the program.  Standard 4.05 The behavior analyst modifies the program on the basis of data.

33 Legal Requirements for Progress Monitoring Progress monitoring is fundamental to the IEP process.  Progress must be reported on the same schedule that grades are reported in general education. Response to Intervention (RTI) is accepted as an alternative means for determining eligibility for a Learning Disability classification.  Progress monitoring is the heart of RTI: all students are routinely and systematically monitored to assure adequate progress is occurring.

34 Evaluating Evidence-based Interventions Progress monitoring is a systems-level intervention.  Contingencies must be in place to assure that data are collected, data are reviewed, and decisions are based on the data.  If contingencies are not in place, the response effort associated with data collection will compromise data-based decision making.
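The data-based decision making described above can be sketched as a simple rule. This is an illustrative example only, not from the presentation: the function name, the aim-line comparison, and the three-week threshold are hypothetical; real RTI systems set decision rules locally.

```python
# Illustrative sketch of a data-based decision rule for progress
# monitoring. All names and thresholds are hypothetical examples.

def review_progress(scores, aimline, n_recent=3):
    """Return a decision based on collected progress data.

    scores  -- weekly scores actually collected so far
    aimline -- the expected score for each corresponding week
    """
    if len(scores) < n_recent:
        return "collect more data"      # no decision without enough data
    recent = scores[-n_recent:]
    expected = aimline[len(scores) - n_recent:len(scores)]
    below = sum(s < e for s, e in zip(recent, expected))
    if below == n_recent:               # consistently below the aim line
        return "modify intervention"
    return "continue intervention"
```

The point of the sketch is that the decision is a mechanical function of the data: if the data are collected and reviewed, the rule fires; if they are not, no contingency exists and the decision defaults to whatever is easiest.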

35 Evaluating Evidence-based Interventions Curriculum-based measurement (CBM) is a powerful means for evaluating the impact of academic interventions.  Scores on CBM are correlated with scores on high-stakes tests and can be used to predict how students will perform on statewide tests.

36 Evaluating Evidence-based Interventions Monitoring the effects of an intervention on students is necessary, but it is also necessary to systematically assess the accuracy of implementation (treatment integrity).  Implementation without integrity results in an unspecified intervention being implemented, for which there is no evidence base.  If a student is not benefiting from an intervention and integrity is poor, no conclusions can be drawn about the effect of the intervention. If integrity is low, increase integrity before evaluating impact.

37 Where Are We? The legal requirements to implement evidence-based interventions require system-level changes.  System requirements: a process for selecting interventions, a process for training and maintenance, and a process for progress monitoring of both treatment integrity and student performance.  These changes to the system will not occur rapidly.  A behavioral systems perspective can facilitate the change process. Gilbert (1996), Human Competence: Engineering Worthy Performance.

38 The Effects of Being Evidence-based Personal preference for interventions is de-emphasized.  Evidence guides decisions at all levels. Training programs are not necessarily teaching evidence-based interventions (Kratochwill, Albers, & Shernoff, 2004).  To realize the long-term benefits of an evidence-based practice approach, it will be necessary to impact the content of university-level training programs.

