DEVELOPING AN EVALUATION SYSTEM FOR SWPBS Rob Horner and Bob Algozzine

Goals  Define core features of an evaluation plan for state or district SWPBS evaluation.  Define core features of an evaluation report for state or district SWPBS evaluators  Define current and future Evaluation Blueprint developments  Provide examples of effective state-level evaluation

Evaluating Implementation of SWPBS  Purposes  Evaluation to improve the fidelity of implementation: give teams regular information on how to improve.  Evaluation to assess policy decisions: give state/district decision-makers documentation of (a) fidelity, (b) impact on student outcomes, and (c) sustainability.

A Basic Model  Organize evaluation around decision-making questions.  What support was provided to whom for implementation of SWPBS?  Was SWPBS implemented at a level of fidelity at which we can expect change in student outcomes?  Was SWPBS implementation related to improvement in student outcomes (reduction in problem behavior, increase in academic success)?  Did SWPBS implementation improve the capacity of the district/state to sustain implementation and impact?

The Logic Model for Evaluation

Evaluation Questions  What support was provided to whom for implementation of SWPBS? CONTEXT and INPUT  Was SWPBS implemented at a level of fidelity at which we can expect change in student outcomes? FIDELITY  Was SWPBS implementation related to improvement in student outcomes (reduction in problem behavior, increase in academic success)? IMPACT  Did SWPBS implementation improve the capacity of the district/state to sustain implementation and impact? SUSTAINABILITY
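
A minimal sketch of how these four questions and their categories might be organized for data collection and reporting, assuming a simple Python dictionary; the example measures listed are illustrative and drawn from later slides, and the structure itself is an assumption rather than part of the Evaluation Blueprint.

```python
# Hypothetical organization of the four evaluation questions by category.
# The example measures are illustrative, not an exhaustive or required set.
EVALUATION_FRAMEWORK = {
    "CONTEXT_AND_INPUT": {
        "question": "What support was provided to whom for implementation of SWPBS?",
        "example_measures": ["school demographics", "type and amount of TA", "perceived helpfulness of TA"],
    },
    "FIDELITY": {
        "question": "Was SWPBS implemented at a level of fidelity at which we can expect change?",
        "example_measures": ["SET", "SAS", "BoQ", "TIC", "BAT", "I-SSET"],
    },
    "IMPACT": {
        "question": "Was SWPBS implementation related to improvement in student outcomes?",
        "example_measures": ["office discipline referrals", "attendance", "academic progress monitoring"],
    },
    "SUSTAINABILITY": {
        "question": "Did SWPBS implementation improve district/state capacity to sustain implementation and impact?",
        "example_measures": ["coaching capacity", "district policy", "annual improvement"],
    },
}

if __name__ == "__main__":
    for category, info in EVALUATION_FRAMEWORK.items():
        print(f"{category}: {len(info['example_measures'])} example measures")
```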

Context and Input  Demographic information about schools/districts  Enrollment, SES, Diversity, 504, IEP  Type and amount of Technical Assistance (TA) provided  Extent to which TA was perceived as helpful.

Fidelity  Fidelity = the extent to which the core features of SWPBS are being used.  Issues:  1. Implement with flexibility around form and format, but with consistency and precision around CORE features.  2. Use fidelity assessments as PART of the implementation process as well as for summative evaluation. Make the data useful for teams to develop their action plans.
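
As one concrete illustration of being "at criterion," the sketch below checks a school's School-wide Evaluation Tool (SET) scores against the commonly cited 80/80 rule (total implementation score and the Expectations Taught subscale both at or above 80%). The function name is hypothetical, and treating 80/80 as the cut point is an assumption for illustration rather than something stated on this slide.

```python
def meets_set_criterion(total_pct: float, taught_pct: float,
                        total_cut: float = 80.0, taught_cut: float = 80.0) -> bool:
    """Return True if SET scores meet the (assumed) 80/80 fidelity criterion.

    total_pct:  overall SET implementation score, 0-100
    taught_pct: 'Expectations Taught' subscale score, 0-100
    """
    return total_pct >= total_cut and taught_pct >= taught_cut

# A school at 85% overall but 70% on Expectations Taught is not yet at
# criterion, so teaching the expectations stays on its action plan.
print(meets_set_criterion(85, 70))  # False
print(meets_set_criterion(92, 90))  # True
```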

The Tools to Measure Fidelity
 Universal Tier of SWPBS: School-wide Evaluation Tool (SET) as the research measure; Self-Assessment Survey (SAS) and Benchmarks of Quality (BoQ) as annual self-assessments; Team Implementation Checklist (TIC) for progress monitoring
 Secondary and Tertiary Tiers of SWPBS: Individual Student School-wide Evaluation Tool (I-SSET) as the research measure; Benchmarks of Advanced Tiers (BAT) as the annual self-assessment; progress monitoring measure to be developed
 Overall Summary of Implementation: Implementation Phases Inventory (IPI) and Phases of Implementation (POI)

The Fidelity Tools: A Schedule
[Table: recommended administration schedule across three years of implementation (pre-implementation, then summer, fall, winter, and spring of each year). Universal tier: the TIC is administered repeatedly for progress monitoring, the BoQ and the SET about once per year, and the SAS periodically. Secondary/tertiary tier: a progress monitoring tool (to be announced) is administered repeatedly, with the BAT and the I-SSET administered about twice over the three years.]

Using Fidelity Data  Forms on  New site Spring 2010:

Fidelity Tools: Uses

[Figure: mean score per item across 9 middle schools, by subscale: Commit, Team, Self-Assess, Core Features, Classroom, Data, Sec/Ter]

What coaching advice do you have for this school? [Series of figures: each slide shows a single school's mean score per item on the same subscales: Commit, Team, Self-Assess, Core Features, Classroom, Data, Sec/Ter]

What coaching advice do you have for a district leadership team with these BoQ data? [Figure: Benchmarks of Quality scores for Schools A through G]
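
One way a district leadership team might act on data like these is to flag schools scoring below a fidelity criterion for additional coaching support. A minimal sketch, assuming hypothetical BoQ total scores for Schools A through G and the commonly cited 70% BoQ criterion; both the scores and the cut point here are illustrative, not taken from this slide.

```python
# Hypothetical BoQ total scores (percent of points earned) for seven schools.
boq_scores = {"A": 88, "B": 72, "C": 64, "D": 91, "E": 55, "F": 78, "G": 69}

CRITERION = 70  # commonly cited BoQ criterion (assumed here, not from this slide)

needs_support = sorted(s for s, score in boq_scores.items() if score < CRITERION)
at_criterion = sorted(s for s, score in boq_scores.items() if score >= CRITERION)

print("At or above criterion:", ", ".join(at_criterion))            # A, B, D, F
print("Priority for coaching support:", ", ".join(needs_support))   # C, E, G
```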

States Implementing SWPBS [Figure: number of schools implementing SWPBS by state, with schools in all 50 states; Illinois highlighted]

[Figures: data for Illinois Elementary Schools, Middle Schools, High Schools, and K-(8/12) Schools]

Impact: Is SWPBS benefiting students?  Social Behavior  Attendance  Referrals to Special Education  Proportion of time in typical educational contexts  Office Discipline Referrals (ODR)… (e.g., SWIS)  Individual student points/behavior records  Academic Behavior  Progress monitoring (oral reading fluency… DIBELS)  Universal Screening  Standardized Test Scores

[Figure: Total Office Discipline Referrals as of January 10]

[Figure: Average Office Discipline Referrals per day per month as of January 10]
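
The two graphs above are typically produced from a simple referral log. A minimal sketch of the underlying computation, assuming a plain list of referral dates and a count of school days per month; the dates, counts, and variable names here are hypothetical and not tied to the SWIS export format.

```python
from collections import Counter
from datetime import date

# Hypothetical referral log: one date per office discipline referral.
referrals = [date(2009, 9, 3), date(2009, 9, 3), date(2009, 9, 17),
             date(2009, 10, 1), date(2009, 10, 22), date(2009, 10, 22),
             date(2009, 11, 5)]

# Hypothetical number of school days in each month.
school_days = {(2009, 9): 21, (2009, 10): 22, (2009, 11): 18}

odrs_per_month = Counter((d.year, d.month) for d in referrals)

print("Total ODRs to date:", len(referrals))
for month, days in sorted(school_days.items()):
    avg_per_day = odrs_per_month.get(month, 0) / days
    print(f"{month}: {avg_per_day:.2f} ODRs per school day")
```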

SWIS summary (Majors Only): 2,732 schools; 1,385,191 students; 1,244,026 ODRs
[Table: by grade range (K-6 through K-(8-12)): number of schools, mean enrollment per school, and mean ODRs per 100 students per school day. Mean major ODR rates run from roughly 1 per 300 students per day in elementary grade ranges to roughly 1 per 100 students per day in middle and high school grade ranges.]

Interpreting Office Referral Data: Is there a problem?  Absolute level (depending on size of school)  Middle, High Schools (> 1 per day per 100)  Elementary Schools (>1 per day per 300)  Trends  Peaks before breaks?  Gradual increasing trend across year?  Compare levels to last year  Improvement?
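
A minimal sketch of the "absolute level" decision rule above, with a school's referral rate expressed as ODRs per 100 students per school day so that one rule covers schools of different sizes; the helper names are hypothetical and the cut points simply restate the rough thresholds on this slide.

```python
def odr_rate_per_100_per_day(total_odrs: int, enrollment: int, school_days: int) -> float:
    """Office discipline referrals per 100 students per school day."""
    return total_odrs / school_days / (enrollment / 100)

def exceeds_absolute_level(rate_per_100: float, school_type: str) -> bool:
    """Apply the slide's rough cut points, re-expressed per 100 students:
    > 1 per day per 300 students (elementary), > 1 per day per 100 (middle/high)."""
    cut = 100 / 300 if school_type == "elementary" else 1.0
    return rate_per_100 > cut

# Hypothetical example: a middle school of 625 students with 300 major ODRs
# in its first 90 school days.
rate = odr_rate_per_100_per_day(300, 625, 90)
print(f"{rate:.2f} ODRs per 100 students per day")   # 0.53
print(exceeds_absolute_level(rate, "middle"))        # False: below the cut point
```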

Application Activity: Absolute Value. Is there a problem in a middle school of 625 students? Compare with the national average: 625/100 = 6.25; 6.25 × .92 = 5.75 Office Discipline Referrals per school day

Elementary school with 150 students. Compare with the national average: 150/100 = 1.5; 1.5 × .34 = .51 Office Discipline Referrals per school day
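
Both worked examples follow the same formula: divide enrollment by 100 and multiply by the national mean ODR rate per 100 students per school day for that grade range (.92 for middle schools and .34 for elementary schools in the two activities above). A minimal sketch; the helper name is hypothetical.

```python
def expected_odrs_per_day(enrollment: int, national_mean_per_100: float) -> float:
    """Expected office discipline referrals per school day for a school of this
    size, if it matched the national mean rate (ODRs per 100 students per day)."""
    return enrollment / 100 * national_mean_per_100

print(f"{expected_odrs_per_day(625, 0.92):.2f}")  # middle school example: 5.75
print(f"{expected_odrs_per_day(150, 0.34):.2f}")  # elementary example: 0.51

# A school consistently running well above its expected value is a candidate
# for a closer look at its school-wide systems.
```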

[Figures: office discipline referrals by location (Cafeteria, Class, Commons, Hall), by time of day (e.g., 12:00), and by problem behavior (Language, Defiance, Disruption, Harassment, Skipping)]

[Figure: SWPBS implementation in Oregon (n = 714)]

[Figures: Elementary, Middle, K-(8-12), and High Schools]

[Figure: National Means (N = 343)]

Sustainability: Continuous Regeneration  Building capacity: state, district, school  The critical role of coaching  The importance of regular improvement: stay focused, but make it easier to do each year.

[Figure: PBIS Leadership Team blueprint: a Leadership Team providing Active Coordination, supported by Funding, Visibility, Political Support, and Policy; building capacity through Training, Coaching, Evaluation, and Behavioral Expertise; implemented through Local School or District Teams/Demonstrations]

Sustaining SWPBS in Districts  Build district capacity for sustained effects  Policy statement focused on social behavior  Job recruitment content: “knowledge and experience implementing school-wide positive behavior support systems”  Annual evaluation: demonstrated effectiveness implementing school-wide PBS practices  Fall orientation content: for administrators, classified, and certified staff  Board outcome measures  School improvement goal

Organization of an Evaluation Report  Purpose  Description of SWPBS: core features, implementation process  Context and Input: what TA is being provided, to whom, at what intensity  Fidelity: is the TA resulting in implementation at criterion?  Impact: is the TA resulting in benefits for students? Are the systems and outcomes sustainable?  Cost  Recommendations

North Carolina: An Exemplar