Guiding and Evaluating Positive Behavioral Support Implementation Shawn Fleming.


2 Guiding and Evaluating Positive Behavioral Support Implementation Shawn Fleming

3 AGENDA
 PBS Overview
 Evaluation Instruments
  Benchmarks of Quality
  School-wide Evaluation Tool (SET)

4 Positive Behavior Support…
 A collaborative, assessment-based approach to developing effective interventions for problem behavior
 Emphasizes the use of proactive, educative, and reinforcement-based strategies to achieve meaningful and durable behavior and lifestyle outcomes
 Aims to build effective environments in which positive behavior is more effective than problem behavior

5 [Diagram] Positive Behavior Support integrates four elements toward the outcomes of social competence, academic achievement, and safety: Systems (supporting staff behavior), Information (supporting decision-making), and Practices (supporting student behavior).

6 SWPBS is a process that:
 Establishes an effective and efficient system to address behavioral issues.
 Utilizes proactive, educational, positive practices that support success.
  Defines, teaches, and supports appropriate student and staff behaviors.
 Relies on data-based decisions to target interventions and evaluate progress.

7 What does PBS look like? SW-PBS (Universal)
 SWPBS Team meets regularly
 Administrators are active participants
 Data-driven school-wide decisions regarding behavior
 Behavior is indicated as an objective on the School Improvement Plan (SIP)
 All members and staff are:
  Able to identify the team leader
  Involved with the development of the school-wide plan

8 What does PBS look like? SW-PBS (Universal)
 3-5 positively stated school-wide expectations
 Taught to all students: >80% of students can state the school-wide expectations
 Reinforcement system encourages students to follow expectations
 Teachers/staff are reinforced for implementing the plan
 Positive adult-to-student interactions exceed negative ones
 Effective consequences for rule violations
 Problem behavior is addressed through function-based interventions

9 [Diagram] Continuum of Positive Behavior Support (PBS): designing comprehensive systems. Adapted from the Center for Positive Behavior Interventions and Supports (2002).

10 Levels of PBS (adapted from Levels and Descriptions of Behavior Support; George, Harrower, & Knoster, 2003)
 School-wide – intended for all students and staff, in specific settings and across the campus
 Classroom – reflects school-wide expectations for student behavior, coupled with pre-planned strategies applied within classrooms
 Targeted Group – addresses students who are at risk for school failure or who display a chronic pattern of inappropriate behavior that does not respond to school-wide interventions
 Individual Student – reflects school-wide expectations for student behavior, coupled with team-based strategies applied with individual students based upon child-centered behavior

11 [Diagram] Designing a Universal System. Primary Prevention: school-wide and classroom-wide systems for all students, staff, and settings (100% of students). Adapted from the Center for Positive Behavior Interventions and Supports (2002).

14 PBS is Data-Driven
 Implementation evaluation
  Does the team assess implementation of PBS elements?
  Are team activities guided by assessment and other data sources?
  Assessment of goodness-of-fit and/or social validity of interventions
 Problem identification and outcome evaluation
  Office discipline referrals
  Suspensions/expulsions
  Student/teacher absenteeism and drop-out rates
  Academic performance

15 PBS Implementation Evaluation Instruments
 Benchmarks of Quality
 School-wide Evaluation Tool (SET)

16 SWPBS Benchmarks of Quality: Team Member Rating Form
 53 benchmarks cover 10 critical elements of SWPBS
 SWPBS team members independently rate each item:
  In Place
  Needs Improvement
  Not in Place
 Facilitator collects and tallies the frequency of responses for each item

17 SWPBS Critical Elements
 PBS Team
 Faculty Commitment
 Effective Procedures for Dealing with Discipline
 Data Entry and Analysis Plan
 Expectations & Rules
 Reward/Recognition Program
 Lesson Plans Developed (Expectations/Rules Taught)
 Implementation Plan
 Crisis Plan
 Evaluation

18 Timeline for Completing the Benchmarks of Quality of SWPBS
 When to complete?
  Within 3-6 months after completing SWPBS training
  Every spring (April/May)

19 SWPBS Benchmarks of Quality: Facilitator Scoring Form
 Record the team's most frequent response for each item:
  ++ In Place
  + Needs Improvement
  - Not in Place
 Use the descriptions and exemplars in the Benchmarks of Quality Scoring Guide to complete the Facilitator's Scoring Form
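The facilitator's tally step can be sketched in code. This is an illustrative sketch only, not part of the Benchmarks of Quality materials; the team ratings below are made-up examples.

```python
from collections import Counter

# Hypothetical sketch: for each benchmark item, record the rating given
# most often by team members, and the mark used on the scoring form.
RATING_MARKS = {"In Place": "++", "Needs Improvement": "+", "Not in Place": "-"}

def most_frequent_rating(team_ratings):
    """Return the modal rating for one benchmark item."""
    return Counter(team_ratings).most_common(1)[0][0]

# Example: five team members independently rate one item
item_ratings = ["In Place", "In Place", "Needs Improvement", "In Place", "Not in Place"]
modal = most_frequent_rating(item_ratings)
print(RATING_MARKS[modal], modal)  # ++ In Place
```

Ties would fall to whichever rating was counted first, so a real scoring session would still need the facilitator's judgment for split votes.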

20 Using the Benchmarks of Quality of SWPBS
 What to do with the results:
  Identify areas for improvement (school, district, state)
  Identify successes to celebrate
  Share with the LA Department to guide state collaboration and planning

21 Ville Platte Elementary January 2005 (6 months implementation)

22 Ferriday Jr. H.S. January 2006 (5 months implementation)

23 ZES Feb. 2006 Benchmarks

24 Horseshoe Drive Elementary January 2005 (1 year)

25 St. John Parish SWPBS Benchmark Scores (January 2005)

26 Number of Schools (N = 129) by SWPBS Benchmark Element Score, January 2005

27 Total SWPBS Benchmark Score by year trained (each dot represents a school)

28 The School-wide Evaluation Tool (SET)
 Designed as a research instrument
 Conducted by an outside evaluator
 Requires training to establish reliability
 Takes approximately 2 hours to administer:
  Interviews with administration, staff, and students
  Observations
  Review of school documents

29 SET Administration in LA
 Goal: identify demonstration sites in Louisiana
 Reliability check for select schools reporting 80% on the Benchmarks of Quality
 Many LEAs administer the SET
 Selection criteria considerations:
  Grade-level representation
  Geographic representation

30 The School-wide Evaluation Tool (number of evaluation questions)
 Evaluates seven features of school-wide PBS:
  Behavioral expectations defined (2)
  Behavioral expectations taught (5)
  Behavioral expectations rewarded (3)
  Systematic responses to rule violations (4)
  Information gathered to monitor student behavior (4)
  Local management support for school-wide procedures (8)
  District-level support for school-wide procedures (2)

31 Process for Conducting the SET
 Interviews
  Administrator
  15 randomly selected students
  15 randomly selected staff
  EBS team members
 Observations
  School rules posted in 10 locations
 Permanent product review
  School improvement goal, annual plan, implementation plan, referral form, & other written products
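The random-selection step above can be sketched as follows. The rosters and the helper name are placeholders for illustration, not part of the SET protocol.

```python
import random

# Illustrative sketch of the SET interview sampling step: draw 15 students
# and 15 staff at random for interviews. Rosters here are made-up stand-ins.
def draw_interviewees(roster, n=15, seed=None):
    """Sample n unique people from a roster (all of them if it is smaller)."""
    rng = random.Random(seed)  # a seed makes the draw reproducible for audit
    return rng.sample(roster, k=min(n, len(roster)))

students = [f"student_{i:03d}" for i in range(1, 401)]
staff = [f"staff_{i:02d}" for i in range(1, 61)]
print(len(draw_interviewees(students)), len(draw_interviewees(staff)))  # 15 15
```

Sampling without replacement (`random.sample`) matters here: the same student should not be interviewed twice.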

32 SET Elements
 Representative team with administrator
 Team meets at least once per month during the school year
 Team provides regular reports to staff
 School-wide behavior support curriculum
  Team-based workshops
  At least four workshops/classes each year
  Support for team facilitators
 TEACH, Acknowledge, Correct, Monitor, & Communicate!

33 SET Score(s)
 Percent of features implemented as measured by the SET
  The more features implemented, the higher the SET score
 Percentage of implementation for each of the seven feature areas
 Mean SET score across all feature areas
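A minimal sketch of how those scores combine, assuming the overall score is the unweighted mean across the seven feature areas. The area names follow slide 30; the percentages are invented examples, not real school data.

```python
# Illustrative SET scoring sketch: each feature area yields a percentage
# of implementation, and the overall score is the mean across all seven.
FEATURE_AREAS = [
    "Expectations Defined", "Expectations Taught", "Reward System",
    "Violation System", "Monitoring", "Management", "District Support",
]

def mean_set_score(area_scores):
    """Mean percentage of implementation across the seven feature areas."""
    return sum(area_scores[a] for a in FEATURE_AREAS) / len(FEATURE_AREAS)

# Invented example percentages for one school
example = {
    "Expectations Defined": 100.0, "Expectations Taught": 80.0,
    "Reward System": 100.0, "Violation System": 75.0,
    "Monitoring": 87.5, "Management": 75.0, "District Support": 100.0,
}
print(round(mean_set_score(example), 1))  # 88.2
```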

34 Using SET Scores
 For a school:
  Always link SET scores to self-assessment data
   Team Checklist
   Benchmarks of Quality
  Build an action plan
   What is the smallest change that will produce the biggest effect?
 The goal is always focused on student outcomes.

35 What does a SET Score mean?
 Overall SET score
  General index of school-wide PBIS implementation
 When has a school documented implementation of school-wide PBIS?
  80% total score PLUS 80% on "Expectations Taught"
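The 80/80 criterion above is a simple conjunction, sketched here with invented example values:

```python
# Sketch of the implementation criterion: a school documents school-wide
# PBIS implementation only when BOTH the overall SET score and the
# "Expectations Taught" subscale reach 80%. Values below are examples.
def documents_implementation(total_pct, expectations_taught_pct, threshold=80.0):
    return total_pct >= threshold and expectations_taught_pct >= threshold

print(documents_implementation(88.2, 80.0))   # True
print(documents_implementation(88.2, 60.0))   # False: subscale below 80%
print(documents_implementation(75.0, 100.0))  # False: total below 80%
```

Requiring the "Expectations Taught" subscale separately keeps a school from hitting 80% overall while never actually teaching the expectations.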

36 Horseshoe Drive Elementary SET Scores 2003 & 2005

37 District Planning
 What data are used to evaluate PBS implementation?
 What changes need to be made?
 Who will conduct evaluations?
 Who will analyze data across schools?
 How will schools be identified as demonstration sites?
  Check of self-report accuracy


