School-wide PBIS: Using Data for Effective Coaching
Rob Horner, University of Oregon
www.pbis.org | www.uoecs.org


Goals
- Address questions you face as a PBIS coach
- Building decision systems
  - Basic foundations
  - Access to core data sources
  - Rubric for using data to make decisions
  - On-going review and improvement using data
- Data needed for effective SWPBIS coaching
  - Using fidelity data to assist teams in SWPBIS implementation
- Team problem-solving processes
  - Establishing the foundations
  - Using data for action planning
  - Using data for implementation assessment and adaptation

Using Data for Decision-Making
What decisions are needed?
Who needs what information, at what time, in what form, to make effective decisions?

Improving Decision-Making
[Diagram: from jumping directly from Problem to Solution, to moving from Problem through Information and Problem Solving to Solution]

School-Team Decisions
- Are we implementing SWPBIS?
  - Are we implementing at all three tiers?
  - What would be the next smallest change that would make the biggest effect?
- If we are implementing, are the procedures benefiting students?
  - Are there problems?
  - Define each problem with precision: what, where, when, who, why
  - What proven actions (interventions, practices, packages) would address the problem yet fit with current strengths?
  - Are selected actions being implemented and effective? How do they need to be adapted?
  - Are selected actions producing desired outcomes?

Fidelity of Implementation
- Measuring fidelity is more than accountability. Measuring fidelity and regular action planning are key procedures for "getting it right."
- Help teams use their strengths to find the path that best helps them achieve the SWPBIS core features.
- Be careful about adopting too many strategies/practices. Do a small number of things well.

Establishing a Continuum of SWPBS
- Primary prevention (~80% of students): teach school-wide expectations, proactive school-wide discipline, positive reinforcement, effective instruction, parent engagement, school-wide bully prevention
- Secondary prevention (~15%): check-in/check-out, targeted social skills instruction, peer-based supports, social skills club
- Tertiary prevention (~5%): function-based support, wraparound, person-centered planning

SWPBIS Measures of Fidelity (see PBIS Evaluation Blueprint)

Universal (Tier I):
- Research measure (2-4 hours): School-wide Evaluation Tool (SET)
- Annual self-assessment (45-60 min): Self-Assessment Survey (SAS), Benchmarks of Quality (BoQ)
- Progress monitoring (15 min): Team Implementation Checklist (TIC)

Secondary and Tertiary (Tier II, Tier III):
- Research measure: Individual Student School-wide Evaluation Tool (ISSET)
- Annual self-assessment: Benchmarks of Advanced Tiers (BAT)
- Progress monitoring: Measure of Advanced Tiers Tool (MATT)

Overall implementation:
- Implementation Phases Inventory (IPI), Phases of Implementation (POI), Phases of Implementation Checklist (PIC)

Activity: How knowledgeable/comfortable are you with each measure (1 = low; 5 = high)? SET, SAS, TIC, BoQ, ISSET, BAT, MATT

Schedule of SWPBIS Measures (for shaded measures, select one; administered from pre-implementation through Year 3)

Universal:
- Progress monitoring (TIC): fall, winter, and spring of Years 1 and 2
- Annual self-assessment (BoQ): spring of each year
- Research measure (SET): spring of each year
- Self-Assessment Survey (SAS): annually, beginning pre-implementation

Secondary/Tertiary:
- Progress monitoring (TBA): fall, winter, and spring of Years 2 and 3
- Annual self-assessment (BAT): spring of Years 2 and 3
- Research measure (ISSET): spring of Years 2 and 3

Match Coaching to Stage of Implementation

School-Level Implementation Takes Time: 2-4 Years
Stages of Implementation: Exploration, Installation, Initial Implementation, Full Implementation
How long at a district level? At a state level? At a national level?
Fixsen & Blase, 2012

Coaching Across Stages of Implementation

Exploration
- Coaching challenges: knowledge of SWPBIS, self-assessment, readiness for adoption, commitment
- Measures: Self-Assessment Survey, Team Implementation Checklist, Benchmarks of Quality, SET

Installation
- Coaching challenge: readiness for adoption
- Measure: Team Implementation Checklist

Initial Implementation
- Coaching challenges: fidelity assessment, action planning, adaptation to local context/strengths/culture
- Measure: Team Implementation Checklist

Full Implementation
- Coaching challenges: sustain SWPBIS implementation; expand to Tier II and Tier III
- Measures: Benchmarks of Quality, Measure of Advanced Tiers, Benchmarks of Advanced Tiers

Coaching for Implementation
Are we implementing SWPBIS?
- Team Implementation Checklist (15 min)
- Benchmarks of Quality (45-60 min)
- School-wide Evaluation Tool (2-4 hours)
How often should data be collected?
- Initial implementation: every 3rd or 4th meeting
- On-going: annually
Using the scores:
- Total score: Are we improving?
- Subscale score: What is working? Where do we focus next?
- Item score: Action planning
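The total/subscale/item hierarchy above can be sketched as a small computation. This is an illustrative sketch, not part of the presentation; the subscale names follow the TIC, but the item names and scores are invented for the example.

```python
# Illustrative sketch: summarizing Team Implementation Checklist (TIC)
# results at three levels (total, subscale, item). Each TIC item is
# scored 0, 1, or 2; the items and scores below are hypothetical.

tic = {
    "Establish Commitment": {
        "Administrator support": 1,
        "Faculty/staff support": 1,
    },
    "Establish & Maintain Team": {
        "Team established": 1,
        "Regular meeting schedule": 1,
    },
    "Conduct Self-Assessment": {
        "TIC completed": 2,
        "Discipline data summarized": 1,
    },
}

def percent_of_points(points: int, n_items: int) -> int:
    """Percentage of points earned out of the maximum (2 per item)."""
    return round(100 * points / (2 * n_items))

# Total score: are we improving?
scores = [v for sub in tic.values() for v in sub.values()]
print("Total:", percent_of_points(sum(scores), len(scores)), "%")

# Subscale scores: what is working, and where do we focus next?
for subscale, items in tic.items():
    print(subscale, percent_of_points(sum(items.values()), len(items)), "%")

# Item scores drive action planning: any item scored 0 is a candidate
# action-plan item.
zeros = [name for sub in tic.values() for name, v in sub.items() if v == 0]
print("Action-plan candidates:", zeros)
```

The same three views (one total, one number per subscale, a list of low items) are what the team charts on the following slides show.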

[Charts: elementary and middle school examples]

Team Checklist: Subscale Scores
[Chart: percentage of total points, by subscale]

Your Turn: What should the team focus on?

Time 1 versus Time 2

Demonstration School Exemplar: Zenith, Winnemac Demonstration District
Date collected: /30/2009. Each feature is scored 0, 1, or 2.

Establish Commitment
1. Administrator's support & active involvement: 1
2. Faculty/staff support: 1

Establish & Maintain Team
3. Team established (representative): 1
4. Team has regular meeting schedule, effective operating procedures
5. Audit is completed for efficient integration of team with other teams/initiatives addressing behavior support: 0

Conduct Self-Assessment
6. Team completes the Team Implementation Checklist (TIC)
7. Team summarizes existing school discipline data: 1
8. Team uses self-assessment information to build implementation action plan (areas of immediate focus): 0

Demonstration School Exemplar: Zenith, Winnemac Demonstration District
Date collected: /15/2011. Each feature is scored 0, 1, or 2.

Establish Commitment
1. Administrator's support & active involvement: 2
2. Faculty/staff support: 2

Establish & Maintain Team
3. Team established (representative): 1
4. Team has regular meeting schedule, effective operating procedures
5. Audit is completed for efficient integration of team with other teams/initiatives addressing behavior support: 0

Conduct Self-Assessment
6. Team completes the Team Implementation Checklist (TIC)
7. Team summarizes existing school discipline data: 2
8. Team uses self-assessment information to build implementation action plan (areas of immediate focus): 2

Using Student Impact Data
- Universal screening
- Progress monitoring
- Team meetings and use of data for problem solving

Universal Screening and Progress Monitoring

Universal screening (behavior: Oct, Feb; reading: Sept, Nov, Mar)
- Data sources: SWPBIS fidelity measures; teacher report (SSBD/SSRS/SSIS); office discipline referrals; oral reading fluency (etc.)
- Decisions: Are Tier I supports in place? Which students need more intensive support? Initiate early intervention.

Progress monitoring (fidelity and impact matched to the level of intensity of support)
- Data sources: CICO/ISIS fidelity measures; office discipline referrals; check-in/check-out points; ISIS intensive support; oral reading fluency; phonemic segmentation; comprehension
- Decisions: Are Tier II and Tier III supports implemented as planned? Are supports effective? Are more intensive/individualized supports needed?

Using office discipline referrals as a metric for universal screening of student social behavior:
- ~80% of students: 0-1 office discipline referrals
- ~15%: 2-5 office discipline referrals
- ~5%: 6+ office discipline referrals

Newton, J. S., Todd, A. W., Algozzine, K., Horner, R. H., & Algozzine, B. (2009). The Team Initiated Problem Solving (TIPS) Training Manual. Educational and Community Supports, University of Oregon, unpublished training manual.
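The cutoffs above translate directly into a screening rule. A minimal sketch follows; the thresholds are the 0-1 / 2-5 / 6+ bands from the slide, while the student names and referral counts are made up for illustration.

```python
# Minimal universal-screening sketch using the ODR bands above.
# Student names and referral counts are hypothetical.

def screen_by_odr(odr_count: int) -> str:
    """Map a student's year-to-date ODR total to a screening band."""
    if odr_count <= 1:
        return "Tier I (0-1 referrals): universal supports"
    if odr_count <= 5:
        return "Tier II candidate (2-5 referrals)"
    return "Tier III candidate (6+ referrals)"

referrals = {"Student A": 0, "Student B": 3, "Student C": 8}
for student, count in referrals.items():
    print(student, "->", screen_by_odr(count))
```

In practice a team would run this kind of rule against a Referrals by Student report, as the next slide suggests, rather than against hand-entered counts.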

Using the Referrals by Student report as a Universal Screening Tool

[Chart: cumulative mean ODRs per month for 325+ elementary schools]
Jennifer Frank, Kent McIntosh, Seth May

Helping Teams Use Data at All Three Tiers
Tier I: office discipline referrals

Establishing a Continuum of Data Options within SWPBS
- Primary prevention (~80% of students): fidelity (TIC, BoQ, SET); student outcomes (office discipline referrals)
- Secondary prevention (~15%): fidelity (BAT, MATT); student outcomes (CICO-SWIS ODRs)
- Tertiary prevention (~5%): fidelity (ISSET, BAT, MATT, ISIS-SWIS); student outcomes (ISIS-SWIS)

Coaching Tier III Data Use
Needs:
- A system to assist coordination of the individualized team
- A system for managing (a) assessments, (b) plan development/modification, and (c) team meeting minutes
- A system for building data collection tailored to the individualized plan
  - Collect fidelity data: Are we doing the plan?
  - Collect impact data: Is the plan benefiting the student?
- A system for on-going collection, summarizing, and reporting of data for team decision-making

Helping Teams Use Data for Decision-Making
- Build the team foundation: roles, agenda, schedule
- Use data to define problems with precision
- Use data to build "action plans" that fit the local context and will work
- Use data to assess if action plans are being implemented with effect

Problem-Solving Meeting Foundations
Structure of meetings lays the foundation for efficiency & effectiveness.

Meeting Foundations Elements
Four features of effective meetings:
- Predictability
- Participation
- Accountability
- Communication
Define roles & responsibilities: Facilitator, Minute Taker, Data Analyst.
Use an electronic meeting-minutes format.

What makes a successful meeting?

A. Predictability
1. Defined roles, responsibilities, and expectations for the meeting
2. Start & end on time; if the meeting needs to be extended, get agreement from all members
3. Agenda is used to guide meeting topics
4. Data are reviewed in the first 5 minutes of the meeting
5. Next meeting is scheduled

B. Participation
5. 75% of team members present & engaged in topic(s)
6. Decision makers are present when needed

C. Accountability
7. Facilitator, Minute Taker & Data Analyst come prepared for the meeting & complete their during-meeting responsibilities
8. A system is used for monitoring progress of implemented solutions (review previous meeting minutes, goal setting)
9. A system is used for documenting decisions
10. Efforts are making a difference in the lives of children/students

D. Communication
11. All regular team members (absent or present) get access to the meeting minutes within 24 hours of the meeting
12. Team members support one another in practicing team meeting norms/agreements

Define Roles for Effective Meetings
Core roles:
- Facilitator
- Minute taker
- Data analyst
- Active team member
- Administrator
Identify a backup for each role.
Can one person serve multiple roles? Are there other roles needed? (The facilitator, minute taker, and data analyst are typically NOT the administrator.)



Who Is Responsible?
- Reserve room: Facilitator
- Recruit items for agenda: Facilitator
- Review data prior to the meeting: Data Analyst
- Reserve projector and computer for meeting: Minute Taker
- Keep discussion focused: Facilitator
- Record topics and decisions on agenda/minutes: Minute Taker
- Ensure that problems are defined with precision: Facilitator
- Ensure that solutions have action plans: Facilitator
- Provide "drill down" data during discussion: Data Analyst
- End on time: Facilitator
- Prepare minutes and send to all members: Minute Taker

Activity #1 (7 min)
Select a Facilitator, a Data Analyst, and a Minute Taker, plus a backup for each.

Role (Primary / Backup):
- Facilitator:
- Data Analyst:
- Minute Taker:
Next role review date:

Organizing for an Effective Problem-Solving Conversation
A key to collective problem solving is to provide a visual context that allows everyone to follow and contribute.
[Diagram: teams that jump straight from Problem to Solution run out of time; use data to connect Problem and Solution.]

[Image: TIPS meeting minutes form]
Newton, J. S., Todd, A. W., Horner, R. H., Algozzine, B., & Algozzine, K. (2012). (Version 1.2)

Where in the form would you place:
1. Planning for the next PTA meeting?
2. "There have been five fights on the playground in the past 3 weeks."
3. An update on CICO implementation?
4. Increasing gang recruitment, as an agenda topic for today?
5. A report at the next meeting on lunch-room status?

Activity
Examine the Langley minutes:
1. What is one agenda item for the next meeting?
2. Who will do what by when to get the video system working?
3. For the problem of "disrespectful behavior," how will they know if they achieved their goal?

What needs to be documented?
Meeting demographics
- Date, time, who is present, who is absent
- Agenda
- Next meeting date/time/location/roles
Administrative/general information/planning items
- Topic of discussion, decisions made, who will do what, by when
Problem-solving items
- Problem statement, data used for problem solving, determined solutions, who will do what by when, goal, how/how often progress toward the goal will be measured, how/how often fidelity of implementation will be measured
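One way to see how these documentation elements fit together is as a record structure for the electronic minutes. This is a sketch under assumptions: the field names are illustrative, not a TIPS standard, and the example values (drawn loosely from the playground-fights example earlier) are hypothetical.

```python
# Hypothetical record structure for electronic meeting minutes, mirroring
# the three documentation categories above. All field names and values
# are illustrative assumptions.

minutes = {
    "demographics": {
        "date": "2014-03-10",  # hypothetical
        "present": ["Facilitator", "Minute Taker", "Data Analyst"],
        "absent": [],
        "agenda": ["Review data", "CICO update"],
        "next_meeting": {"date": "2014-03-24", "location": "Room 12"},
    },
    "planning_items": [
        {"topic": "Next PTA meeting", "decision": "Share ODR summary",
         "who": "Facilitator", "by_when": "2014-03-24"},
    ],
    "problem_solving_items": [
        {"problem_statement": "Five fights on the playground in 3 weeks",
         "data_used": "ODRs by location",
         "solution": "Active supervision at recess",
         "who": "Recess staff", "by_when": "2014-03-17",
         "goal": "Zero fights over the next 3 weeks",
         "progress_measure": "Weekly ODR review",
         "fidelity_measure": "Supervision checklist, weekly"},
    ],
}

# A completeness check a minute taker could run before distributing minutes:
# every problem-solving item must carry its goal and both measurement plans.
REQUIRED = {"problem_statement", "data_used", "solution", "who", "by_when",
            "goal", "progress_measure", "fidelity_measure"}
for item in minutes["problem_solving_items"]:
    missing = REQUIRED - item.keys()
    assert not missing, f"problem-solving item missing fields: {missing}"
print("All problem-solving items complete.")
```

The point of the structure is the last loop: a problem-solving item is not "done" until the goal, progress measure, and fidelity measure are all recorded.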

Summary: Team-Based Problem Solving
- Team foundations: roles, minutes form
- Defining problems
- Building solutions
- Action planning / follow-up / adaptation