Data-based Decision-making: Evaluating the Impact of School-wide Positive Behavior Support. George Sugai, Rob Horner, Anne Todd, and Teri Lewis-Palmer, University of Oregon.

Presentation transcript:

Data-based Decision-making: Evaluating the Impact of School-wide Positive Behavior Support George Sugai, Rob Horner, Anne Todd, and Teri Lewis-Palmer University of Oregon OSEP Funded Technical Assistance Center

Purpose
- Examine the extent to which the logic of School-wide Positive Behavior Support (PBS) fits your real experience in schools
- Define the outcomes for School-wide PBS
  – Is School-wide PBS related to reduction in problem behavior?
  – Is School-wide PBS related to improved school safety?
  – Is School-wide PBS related to improved academic performance?
- Define tools for measuring School-wide PBS outcomes
- Examine a problem-solving approach for using office discipline referral (ODR) data for decision-making
- Provide strategies for using data for decision-making and action planning

To Improve Schools for Children
- Use evidence-based practices
  – Always look for data of effectiveness
- Never stop doing what is working
- Implement the smallest change that will result in the largest improvement
- Measure and compare improvement

Designing School-Wide Systems for Student Success
(parallel Academic and Behavioral Systems, shown as a three-tier triangle)
- Primary Interventions (80-90% of students): all students, all settings; preventive, proactive
- Secondary Group Interventions (5-10%): some students (at-risk); high efficiency; rapid response
- Intensive, Individual Interventions (1-5%): individual students; assessment-based; high intensity; intense, durable procedures

SW Positive Behavior Support
- OUTCOMES: social competence, academic achievement, and safety
- SYSTEMS: supporting staff behavior
- DATA: supporting decision-making
- PRACTICES: supporting student behavior

Improving Decision-Making
- From: Problem → Solution
- To: Problem → Problem-solving (using information) → Solution

Problem-solving Steps
1. Define the problem(s) – analyze the data
2. Define the outcomes and data sources for measuring the outcomes
3. Consider 2-3 options that might work
4. Evaluate each option:
   – Is it safe?
   – Is it doable?
   – Will it work?
5. Choose an option to try
6. Determine the timeframe to evaluate effectiveness
7. Evaluate effectiveness by using the data
   – Is it worth continuing?
   – Try a different option?
   – Re-define the problem?

Key Features of Effective Data Systems
- Data are accurate
- Data are very easy to collect
- Data are used for decision-making
- Data are available when decisions need to be made
- Data collectors must see the information used for decision-making

Guiding Considerations
- Use accessible data
- Handle data as few times as possible
- Build data collection into daily routines
- Establish and use data collection as a conditioned positive reinforcer
- Share data summaries with those who collect it

Types of Questions
- Initial Assessment Questions
  – What type or which program do we need?
  – Where should we focus our efforts?
- Ongoing Evaluation Questions
  – Is the program working?
  – If no: Can it be changed? Should we end the program?
  – If yes: Do we need this program anymore? What do we need to do to sustain success?

What Data Should be Collected?
- Always start with the questions you want to answer
- Make data that will answer your question easy, available, and reliable
- Balance reliability and accessibility (a systems approach)
- Consider logistics: Who? When? Where? How?
- Two levels:
  – What is readily accessible?
  – What requires extra resources?

When Should Data Decisions Be Made?
- Natural cycles and meeting times: weekly, monthly, quarterly, annually
- Level of system addressed:
  – Individual: daily, weekly
  – School-wide: monthly, quarterly
  – District/Region
  – State-level

Basic Evaluation Questions by School or Program
1. What does "it" look like now?
2. How would we know if we are successful?
3. Are we satisfied with how "it" looks?
   – YES: celebrate
   – NO: What do we want "it" to look like? What do we need to do to make "it" look like that?
4. What can we do to keep "it" like that?

Basic School-wide PBS Evaluation Questions by School/District/Region
Are our efforts making a difference?
1. How many schools have adopted School-wide PBS?
2. Are schools adopting School-wide PBS to criterion?
3. Are schools that are implementing School-wide PBS perceived as safe?
4. Are teachers delivering instructional lessons as planned (with fidelity)?
5. Is School-wide PBS improving student outcomes?

Is School-wide PBS Having a Positive Influence on School Culture? Using Office Discipline Referral Data

Office Discipline Referrals and the BIG 5!
- Examine office discipline referral rates and patterns
  – Major problem events
  – Minor problem events
- Ask the BIG 5 questions:
  – How often are problem behavior events occurring?
  – Where are they happening?
  – What types of problem behaviors?
  – When are the problems occurring?
  – Who is contributing?

The BIG 5: How often? Where? What? When? Who?
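As a minimal sketch, the BIG 5 questions can be answered by tallying raw referral records. The record fields and sample data below are hypothetical, not the actual SWIS export format:

```python
from collections import Counter

# Hypothetical ODR records; in practice these would come from a
# school-wide information system such as SWIS.
referrals = [
    {"student": "A", "behavior": "Disrespect", "location": "Cafeteria", "time": "12:00"},
    {"student": "B", "behavior": "Defiance",   "location": "Class",     "time": "10:15"},
    {"student": "A", "behavior": "Disrespect", "location": "Hall",      "time": "12:05"},
]

school_days = 2  # number of school days the records cover

# The BIG 5: how often, where, what, when, who
how_often = len(referrals) / school_days           # events per school day
where = Counter(r["location"] for r in referrals)
what = Counter(r["behavior"] for r in referrals)
when = Counter(r["time"][:2] for r in referrals)   # tally by hour of day
who = Counter(r["student"] for r in referrals)

print(f"How often: {how_often:.1f} ODRs per school day")
print("Most common behavior:", what.most_common(1))
print("Most common hour:", when.most_common(1))
```

Each `Counter` answers one BIG 5 question; `most_common()` surfaces the pattern (e.g., one behavior or one hour dominating) that a team would then turn into a precision problem statement.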

Office Discipline Referral Caution
- Data reflect three factors: students, staff members, and office personnel
- Data reflect overt rule violators
- Data are useful when implementation is consistent
  – Do staff and administration agree on office-managed versus classroom-managed problem behavior?

Staff Managed (minors)
- Tardy
- Unprepared; no homework/materials
- Violation of classroom expectations
- Inappropriate language
- Classroom disruption
- Minor safety violation
- Lying/cheating
- Consequences are determined by staff

Office Managed (majors)
- Repeated minor behaviors
- Insubordination
- Blatant disrespect
- Abusive/inappropriate language
- Harassment/intimidation
- Fighting/physical aggression
- Safety violations that are potentially harmful to self, others, and/or property
- Vandalism/property destruction
- Plagiarism
- Theft
- Skipping classes
- Illegal behaviors: arson, weapons, tobacco, alcohol/drugs

General Procedure for Dealing with Problem Behaviors (flowchart)
Observe problem behavior → Is the behavior major?
- NO (staff-managed): Find a place to talk with the student(s). Problem solve; determine consequence; follow documented procedure; file necessary documentation. Does the student have 3? If NO, done; if YES, send a referral to the office.
- YES (office-managed): Ensure safety. Write a referral and escort the student to the office. Problem solve; determine consequence; follow through with consequences; follow documented procedure; file necessary documentation. Follow up with the student within a week.

Office Discipline Referral Form
Name: ____ Grade: ____ Date: ____
Referring Person: ____ Time: ____
Others involved: None / Peers / Staff / Teacher / Substitute / Unknown / Other

Problem Behavior
- Major: Abusive language / Fighting, physical aggression / Harassment / Overt defiance / Other ____
- Minor: Inappropriate language / Disruption / Property misuse / Non-compliance / Other ____

Location: Hallway / Cafeteria / Library / Restroom / Office / Parking lot / Classroom / On bus / Special event / Common area / Other ____

Possible Motivation: Attention from peers / Attention from adults / Avoid peers / Avoid adults / Avoid work / Obtain items / Don't know / Other ____

Consequence: Lose privilege / Individual instruction / Conference / In-school suspension / Parent contact / Out-of-school suspension / Time in office / Other ____

SWIS™ Compatibility Checklist: Procedure for Documenting Office Discipline Referrals
School: ____ Date: ____
1. Does a clear distinction exist between problem behaviors that are staff managed versus office managed, and is it available for staff reference? Yes / No
2. Does a SWIS™-compatible form exist for SWIS™ data entry that includes the following categories? Yes / No
   a. Student name
   b. Date
   c. Time of incident
   d. Student's grade level
   e. Referring staff member
   f. Location of incident
   g. Problem behavior
   h. Possible motivation
   i. Others involved
   j. Administrative decision
   k. Other comments
   l. No more than 3 extra information fields
3. Does a set of definitions exist that clearly defines all categories on the office discipline referral form? Yes / No
Next review date: ____
Redesign your form until the answers to all questions are "Yes." Readiness requirements 4 and 5 are complete when you have all "Yes" responses.

Tables versus Graphs

[Table: monthly ODR summary (Aug 2001 through Jul 2002) with columns Year, Month, Number of Days, Number of Referrals, Average Referrals Per Day, and totals; values not preserved in the transcript]

Number of ODR per Day and Month

Total versus Rate

Total Number of ODRs per Month

Number of ODRs per Day and Month

Priorities and Rationale: Graphs, Rate

SWIS summary (Majors Only): 3,410 schools; 1,737,432 students; 1,500,770 ODRs
National average for Major ODRs per 100 students per school day, by grade range:
- K-6: about 1 Major ODR every 3 school days, or about 34 every 100 days
- 6-9: a little less than 1 Major ODR per school day, or about 85 every 100 days
- 9-12: more than 1 Major ODR per school day, or about 127 every 100 days
- K-(8-12): about 1 Major ODR per school day, or about 106 every 100 days
(Per-row school counts and average enrollments were not preserved in the transcript.)
Newton, J. S., Todd, A. W., Algozzine, K., Horner, R. H., & Algozzine, B. (2009). The Team Initiated Problem Solving (TIPS) Training Manual. Educational and Community Supports, University of Oregon, unpublished training manual.

SWIS summary (Majors Only): 4,019 schools; 2,063,408 students; 1,622,229 ODRs
[Table by grade range (K-6, 6-9, 9-12, K-(8-12)): number of schools, mean enrollment per school, and median ODRs per 100 students per school day; values not preserved in the transcript]

Interpreting Office Referral Data: Is there a problem?
- Absolute level (depending on size of school)
  – Middle and high schools: > 1 per day per 100 students
  – Elementary schools: > 1 per day per 300 students
- Trends
  – Peaks before breaks?
  – Gradual increasing trend across the year?
- Compare levels to last year
  – Improvement?
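The absolute-level guidelines above can be written as a simple decision rule. This is a sketch; the function name is ours, and it assumes an average-ODRs-per-day figure has already been computed from the referral data:

```python
def odr_flag(level: str, enrollment: int, odrs_per_day: float) -> bool:
    """Flag a possible problem using the absolute-level guidelines:
    middle/high schools: > 1 ODR per day per 100 students;
    elementary schools:  > 1 ODR per day per 300 students."""
    per_students = 100 if level in ("middle", "high") else 300
    threshold = enrollment / per_students  # allowable ODRs per school day
    return odrs_per_day > threshold

# A middle school of 625 students averaging 8 ODRs/day exceeds its 6.25 threshold
print(odr_flag("middle", 625, 8.0))      # True
# An elementary school of 150 students averaging 0.4 ODRs/day is under its 0.5 threshold
print(odr_flag("elementary", 150, 0.4))  # False
```

A flag is only a prompt to look at trends and year-over-year comparisons, not a verdict on its own.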

Application Activity: Absolute Value. Is there a problem?
Middle school of 625 students; compare with the national average:
625 / 100 = 6.25; 6.25 × .92 = 5.75 office discipline referrals per school day

Elementary school with 150 students; compare with the national average:
150 / 100 = 1.5; 1.5 × .35 = .53

High school of 1,800 students; compare with the national average:
1800 / 100 = 18; 18 × 1.06 = 19.08
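The three worked examples above all apply the same formula: enrollment / 100 × the national average rate. A short sketch (rates taken from the slides; the function name is ours):

```python
# National average Major-ODR rates per 100 students per school day,
# as used in the application activities: elementary .35, middle .92, high 1.06
NATIONAL_RATE = {"elementary": 0.35, "middle": 0.92, "high": 1.06}

def expected_odrs_per_day(level: str, enrollment: int) -> float:
    """ODRs per school day this school would see at the national average rate."""
    return enrollment / 100 * NATIONAL_RATE[level]

# The three worked examples:
middle = expected_odrs_per_day("middle", 625)          # ≈ 5.75
elementary = expected_odrs_per_day("elementary", 150)  # ≈ .53
high = expected_odrs_per_day("high", 1800)             # ≈ 19.08
```

Comparing a school's observed daily average to this expected value gives the "absolute level" answer to "Is there a problem?"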

Middle School of 700 students

Middle School N= 495

Is There a Problem? #2 Absolute - Trend - Compare Middle School N= 495

Trevor Test Middle School 565 students Grades 6,7, and 8

[Big 5 graphs for Trevor Test Middle School: referrals by location (cafeteria, class, commons, hall), by time of day (peak near 12:00), and by problem behavior (inappropriate language, defiance, disrespect, harassment, skipping)]

Langley Elementary School 478 Students Kindergarten - Grade 5

What Does a Reduction of 850 Office Discipline Referrals and 25 Suspensions Mean? (Kennedy Middle School)
- Savings in administrative time (ODR = 15 minutes/event; suspension = 45 minutes/event):
  13,875 minutes = 231 hours = 29 eight-hour days
- Savings in student instructional time (ODR = 45 minutes/event; suspension = 216 minutes/event):
  43,650 minutes = 728 hours = 121 six-hour school days
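The arithmetic behind these savings figures generalizes to any reduction. A sketch (the function name and signature are ours; the per-event minute costs are the slide's assumptions):

```python
def time_savings(odr_reduction: int, suspension_reduction: int,
                 odr_minutes: int, susp_minutes: int,
                 hours_per_day: int) -> tuple:
    """Convert a reduction in ODRs and suspensions into
    (minutes, hours, days) of time recovered."""
    minutes = odr_reduction * odr_minutes + suspension_reduction * susp_minutes
    hours = minutes / 60
    days = hours / hours_per_day
    return minutes, round(hours), round(days)

# Administrative time: 15 min/ODR, 45 min/suspension, 8-hour days
print(time_savings(850, 25, 15, 45, 8))   # (13875, 231, 29)
# Student instructional time: 45 min/ODR, 216 min/suspension, 6-hour days
print(time_savings(850, 25, 45, 216, 6))  # (43650, 728, 121)
```

Both printed tuples reproduce the Kennedy Middle School figures above, which makes the minute-cost assumptions easy to swap for local estimates.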

Is Implementation Related to Reduction in Problem Behavior? 767 students

Are Schools Adopting School-wide PBS to Criterion?
Use the:
- Team Implementation Checklist (TIC)
- School-wide Evaluation Tool (SET)
- EBS Self-Assessment Survey (school-wide section)
- Other
Measure and analyze annually. We'll focus on the TIC today!

Team Implementation Checklist (TIC)
- Characterizes the evolution of School-wide PBS implementation: "Achieved," "In progress," or "Not started"
- Assists in:
  – Initial assessment
  – Getting started on an action plan
  – Measuring progress of School-wide PBS implementation
- Assesses team-based response, quarterly or monthly

TIC Feature Areas
1. Establish Commitment
2. Establish and Maintain Team
3. Conduct Self-Assessment
4. Define Expectations
5. Teach Expectations
6. Establish Reward System
7. Establish Violations System
8. Establish Information System
9. Build Capacity for Function-based Support
10. Ongoing Activities

Team Implementation Checklist: Checklist #1 (Start-Up Activity)
Complete and submit monthly. Status for each item: Achieved, In Progress, or Not Started. Date: (MM/DD/YY)

Establish Commitment
1. Administrator's support and active involvement.
2. Faculty/staff support (one of top 3 goals; 80% of faculty document support; 3-year timeline).

Establish and Maintain Team
3. Team established (representative).
4. Team has regular meeting schedule, effective operating procedures.
5. Audit is completed for efficient integration of team with other teams/initiatives addressing behavior support.

Self-Assessment
6. Team/faculty completes EBS self-assessment survey.
7. Team summarizes existing school discipline data.

Team Implementation Checklist (continued)
8. Strengths, areas of immediate focus, and action plan are identified.

Establish School-wide Expectations
9. 3-5 school-wide behavior expectations are defined.
10. School-wide teaching matrix developed.
11. Teaching plans for school-wide expectations are developed.
12. School-wide behavioral expectations taught directly and formally.
13. System in place to acknowledge/reward school-wide expectations.
14. Clearly defined and consistent consequences and procedures for undesirable behaviors are developed.

Establish Information System
15. Discipline data are gathered, summarized, and reported.

Build Capacity for Function-based Support
16. Personnel with behavioral expertise are identified and involved.
17. Plan developed to identify and establish systems for teacher support, functional assessment, and support plan development and implementation.

Team Implementation Checklist: Checklist #2 (On-going Activity Monitoring)
Complete and submit monthly. Status for each item: Achieved, In Progress, or Not Started.
1. EBS team has met at least monthly.
2. EBS team has given status report to faculty at least monthly.
3. Activities for EBS action plan implemented.
4. Accuracy of implementation of EBS action plan assessed.
5. Effectiveness of EBS action plan implementation assessed.
6. EBS data analyzed.

Scoring the TIC
- Implementation points: Achieved = 2; In progress = 1; Not started = 0
- Percentage of items implemented
  – Total: number of items scored as "Achieved" divided by 17 (items)
  – Subscale scores: number of items in each subscale area scored as "Achieved" divided by the number of items in that subscale area
- Percentage of points implemented
  – Total: total number of points divided by 34
  – Subscale scores: total number of points in each subscale divided by (the number of items in that subscale × 2)
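The scoring rules above can be sketched in a few lines. The sample statuses here are hypothetical; a full TIC has 17 items, and subscale scores would apply the same function to each subscale's items:

```python
# Implementation points per status, as defined on the scoring slide
POINTS = {"Achieved": 2, "In progress": 1, "Not started": 0}

def tic_scores(statuses):
    """Return (% of items implemented, % of points implemented)
    for a list of per-item TIC statuses."""
    n = len(statuses)
    pct_items = 100 * sum(s == "Achieved" for s in statuses) / n
    pct_points = 100 * sum(POINTS[s] for s in statuses) / (2 * n)
    return pct_items, pct_points

# Hypothetical team: 10 items achieved, 5 in progress, 2 not started
statuses = ["Achieved"] * 10 + ["In progress"] * 5 + ["Not started"] * 2
items_pct, points_pct = tic_scores(statuses)
# items: 10/17 of items; points: (10*2 + 5*1) / 34 of possible points
```

Tracking both percentages monthly shows whether partially started items ("In progress") are converting to "Achieved" over time.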

Team Implementation Checklist (TIC)

Total Average TIC Scores across Schools

1. What is working well? 2. What are next steps?

1. What is going well? 2. What are next steps?

Main Messages
- Invest in prevention
- Create an effective environment: leadership, teams; hosts for effective practices
- Use different systems for different problems
  – Individual student level alone will be insufficient
  – Collaboration with mental health professionals
- Build a culture of competence: define, teach, monitor, and reward appropriate behavior
- Build sustainable systems: resist person-dependent interventions
- Invest in gathering and using information for decision-making and problem-solving

Action Planning
- Use your self-assessment information
  – Rally school-wide commitment
  – Establish a PBS team
  – Focus on prevention (define, teach, monitor, and reward appropriate behavior)
    - Ask kids tomorrow if they know the expectations
    - Ask kids if they are being acknowledged for appropriate behavior
  – Use the information system to guide implementation efforts
- Build an action plan
  – When will the team meet?
  – What will be reported to faculty?
  – What will be reported to families?

Action Planning
- Which system are you going to work on?
- What are the specific outcomes?
  – When will they be completed?
  – What short-term activities are needed?
  – Who will be responsible?
- Reporting schedule
  – What information will be gathered and by whom?
  – When will information be reported?