
1 Is PBIS Evidence-based? George Sugai OSEP Center on PBIS University of Oregon Center for Behavioral Education & Research University of Connecticut August 5, 2008 www.cber.org www.pbis.org George.sugai@uconn.edu

2 Purpose Is PBIS an Evidence-based Practice? What is PBIS? How is evidence-based determined? What is PBIS evidence?

3 www.pbis.org Horner, R., & Sugai, G. (2008). Is school-wide positive behavior support an evidence-based practice? OSEP Technical Assistance Center on Positive Behavioral Interventions and Support. http://www.pbis.org/files/101007evidencebase4pbs.pdf.

4 Evidence Basics

5 Why evidence-based? Maximize outcomes Minimize harm Increase accountability Increase efficiency Improve decision making Improve resource use

6 Basic Approach Start w/ what has greatest likelihood of addressing (evidence-based) confirmed problem/question –Explained/supported conceptually/empirically Adapt to local context/culture/need Monitor regularly & adjust based on data Adapt for efficient & durable implementation

7 4 Evaluation Criteria Effectiveness –Has/will practice produce(d) desired outcome? Efficiency –What are costs (time, resources, $) to implement practice? Relevance –Are practice & outcomes appropriate for situation? Conceptual soundness –Is practice based on theory?

8 Basic Practices Evaluation

9

10

11

12 Guidelines for Selecting Practice 1. Define desired outcome 2. Delineate implementation setting 3. Identify evidence-based practice 4. Evaluate relevance of practice against outcome & setting/context 5. Adopt/adapt practice to setting/context 6. Arrange supports for accurate implementation 7. Continuously monitor effectiveness

13 Design Questions Has functional or cause-effect relationship been demonstrated & replicated? Have alternative explanations been accounted & controlled for? Have threats or weaknesses of methodology been controlled for? Was study implemented w/ fidelity/accuracy?

14 Research Designs Experimental - RCT (randomized controlled trial) & SSR (single-subject research) Evaluation - Descriptive w/ baseline Case Study - Descriptive w/o baseline Testimonial - No/Limited data

15 Results Questions Who were subjects? –How much like my participants? Where was study conducted? –How much like where I work? What measures were used? –Do I have similar data? What outcomes were achieved? –Are expected outcomes similar?

16 Effectiveness Logic Significance (“believe”) –Likelihood of same effect by chance Effect Size (“strength”) –Size of effect relative to business as usual Consequential Validity (“meaning”) –Contextually meaningful
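A worked illustration of these three criteria, as a minimal sketch: the ODR values, school counts, and the choice of a two-sample t-test and Cohen's d below are illustrative assumptions, not data or methods from the presentation.

```python
# Minimal sketch of the slide's effectiveness logic: significance ("believe"),
# effect size ("strength"), and consequential validity ("meaning").
# The ODR rates below are invented for illustration only.
import numpy as np
from scipy import stats

swpbs    = np.array([0.25, 0.31, 0.28, 0.22, 0.30, 0.27])  # hypothetical SWPBS schools, ODRs/100/day
baseline = np.array([0.36, 0.41, 0.33, 0.38, 0.35, 0.40])  # hypothetical comparison schools

# Significance: how likely is a difference this large by chance alone?
t_stat, p_value = stats.ttest_ind(swpbs, baseline)

# Effect size (Cohen's d): mean difference scaled by the pooled standard
# deviation, i.e., the effect relative to "business as usual" variability.
n1, n2 = len(swpbs), len(baseline)
pooled_var = ((n1 - 1) * swpbs.var(ddof=1) + (n2 - 1) * baseline.var(ddof=1)) / (n1 + n2 - 2)
cohens_d = (baseline.mean() - swpbs.mean()) / np.sqrt(pooled_var)

print(f"p = {p_value:.4f}, Cohen's d = {cohens_d:.2f}")

# Consequential validity is a judgment, not a statistic: is a reduction of
# this size meaningful for students and staff in the local context?
```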

17 SWPBS/PBIS

18 CONTINUUM OF SCHOOL-WIDE INSTRUCTIONAL & POSITIVE BEHAVIOR SUPPORT Primary Prevention (~80% of students): School-/Classroom-Wide Systems for All Students, Staff, & Settings. Secondary Prevention (~15%): Specialized Group Systems for Students with At-Risk Behavior. Tertiary Prevention (~5%): Specialized Individualized Systems for Students with High-Risk Behavior.

19 Basics: 4 PBS Elements OUTCOMES: Supporting Social Competence & Academic Achievement. DATA: Supporting Decision Making. PRACTICES: Supporting Student Behavior. SYSTEMS: Supporting Staff Behavior.

20 GENERAL IMPLEMENTATION PROCESS Team cycle: Agreements → Data-based Action Plan → Implementation → Evaluation

21 SWPBS Subsystems School-wide, Non-classroom, Classroom, Student, Family

22 School-wide 1. Common purpose & approach to discipline 2. Clear set of positive expectations & behaviors 3. Procedures for teaching expected behavior 4. Continuum of procedures for encouraging expected behavior 5. Continuum of procedures for discouraging inappropriate behavior 6. Procedures for on-going monitoring & evaluation

23 Non-classroom Positive expectations & routines taught & encouraged Active supervision by all staff –Scan, move, interact Precorrections & reminders Positive reinforcement

24 Classroom Classroom-wide positive expectations taught & encouraged Classroom routines & cues taught & encouraged Ratio of 6-8 positive to 1 negative adult-student interactions Active supervision Redirections for minor, infrequent behavior errors Frequent precorrections for chronic errors Effective academic instruction & curriculum

25 Individual Student Behavioral competence at school & district levels Function-based behavior support planning Team- & data-based decision making Comprehensive person-centered planning & wraparound processes Targeted social skills & self-management instruction Individualized instructional & curricular accommodations

26 Family Continuum of positive behavior support for all families Frequent, regular positive contacts, communications, & acknowledgements Formal & active participation & involvement as equal partner Access to system of integrated school & community resources

27 PBS Systems Implementation Logic Leadership Team (Active & Integrated Coordination) Funding Visibility Political Support Training Coaching Evaluation Local School Teams/Demonstrations

28 PBIS Evidence Base

29 VIOLENCE PREVENTION? Surgeon General’s Report on Youth Violence (2001) Coordinated Social, Emotional, & Academic Learning (Greenberg et al., 2003) Center for Study & Prevention of Violence (2006) White House Conference on School Violence (2006) Positive, predictable school-wide climate High rates of academic & social success Formal social skills instruction Positive active supervision & reinforcement Positive adult role models Multi-component, multi-year school-family-community effort

30 90-School RCT Study Horner et al., in press Schools that receive technical assistance from typical support personnel implement SWPBS with fidelity High-fidelity SWPBS is associated with ▫ Low levels of ODRs (.29/100/day v. national mean .34) ▫ Improved perception of school safety (reduced risk factor) ▫ Increased proportion of 3rd graders who meet the state reading standard.

31 RCT Project Target Bradshaw & Leaf, in press PBIS schools (21 v. 16 comparison) reached & sustained high fidelity PBIS improved all aspects of organizational health Positive effects/trends for student outcomes –Fewer ODRs (majors + minors) –Fewer ODRs for truancy –Fewer suspensions –Increasing trend in % of students scoring in advanced & proficient range of state achievement test

32 4J School District Eugene, Oregon Change in the percentage of students meeting the state standard in reading at grade 3 from 97-98 to 01-02 for schools using PBIS all four years and those that did not.

33 SWPBS schools are more preventive (figure: 84% v. 58%, 11% v. 22%, 5% v. 20%)

34 SWPBS schools are more preventive (figure: 88% v. 69%, 8% v. 17%, 4% v. 14%)

35 National ODR/ISS/OSS July 2008

           K-6        6-9        9-12       Total
# Sch      1,756      476        177        2,409
# Std      781,546    311,725    161,182    1,254,453
# ODR      423,647    414,716    235,279    1,073,642
# Expl     0.03       0.29       0.39

36 July 2, 2008 ODR rates vary by level

37 July 2, 2008 A few kids get many ODRs

38

39 SWIS summary 07-08 (July 2, 2008): 2,717 sch; 1,377,989 stds; 1,232,826 Maj ODRs

Grade Range   # Schools   Mean Enroll.   Mean ODRs/100/sch day (std dev.)
K-6           1,756       445            .35 (.45)     1/300/day
6-9           476         654            .91 (1.40)    1/100/day
9-12          177         910            1.05 (1.56)   1/105/day
K-(8-12)      308         401            1.01 (1.88)   1/100/day
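The rate metric in the SWIS summary (major ODRs per 100 students per school day) reduces to simple arithmetic. A minimal sketch follows, assuming a hypothetical school's enrollment, major-ODR count, and a 180-day school year; the function name and all input values are invented for illustration.

```python
# Minimal sketch: "major ODRs per 100 students per school day," the rate
# reported in the SWIS summary table. All inputs below are hypothetical.
def odr_rate_per_100_per_day(total_odrs, enrollment, school_days=180):
    """Major ODRs per 100 enrolled students per school day."""
    return total_odrs / (enrollment / 100) / school_days

# Hypothetical K-6 school: 450 students, 280 major ODRs over the year.
rate = odr_rate_per_100_per_day(total_odrs=280, enrollment=450)
print(f"{rate:.2f} ODRs/100 students/school day")   # ~0.35

# The table also reads rates as roughly "1 ODR per N students per day":
# 0.35/100/day works out to about 1 major ODR per ~290 students each day.
students_per_odr_per_day = 1 / (rate / 100)
print(f"about 1 ODR per {students_per_odr_per_day:.0f} students per day")
```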

