Applying the Research to Maximize Efficiency and to Best Meet Your School and District Needs
Kim Gulbrandson, Ph.D., Wisconsin RtI Center


Objectives
• To provide a general overview of the research behind the tools
• To share strengths and weaknesses of the current assessment tools
• To provide resources to support schools/districts in using these tools in a coordinated way

BoQ
• Sound development process (multiple stages)
• Sound psychometrics
• Good test-retest reliability (.94)
• High inter-rater reliability (above 90%)
• Good internal consistency reliability (.70 or above); PBS Team is the only scale with low reliability
• CFA and EFA: items with low factor loadings were eliminated
  - New Classroom Critical scale added
  - Current 10-factor structure is solid
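The internal-consistency figure above (.70 or above) refers to Cronbach's alpha. A minimal sketch of how that coefficient is computed, using hypothetical team ratings rather than actual BoQ data:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a respondents-by-items score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical ratings (0-2 scale) from 5 teams on a 4-item subscale
ratings = [[2, 2, 1, 2],
           [1, 1, 1, 1],
           [2, 2, 2, 2],
           [0, 1, 0, 1],
           [2, 1, 2, 2]]
alpha = cronbach_alpha(ratings)
```

A subscale whose items move together (as above) produces an alpha well above the .70 benchmark; a scale like PBS Team, with items that disagree, would fall below it.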

BoQ
• Best tool for distinguishing among schools implementing with fidelity
• Detailed scoring criteria (rubric)
• Found to be a valid instrument even when administered using diverse methods
  - When administration varied from the validated method, scores did not change significantly (if the Scoring Guide was used)

BoQ
• Schools with higher BoQ scores tend to have greater decreases in ODRs than schools with lower BoQ scores
• No district support, CR, or coaching items
• Family engagement items
• Highly correlated with the TIC and SET

BoQ and SET
• Offer good cross-comparisons (several subscales represent similar elements)
• BoQ and SET scores are significantly correlated with one another
• BoQ measures PBIS areas with more specificity than the SET
• BoQ measures critical features of implementation not covered by the SET:
  - Faculty buy-in
  - Lesson plans
  - Crisis plans
  - Evaluation
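The "significantly correlated" claim is typically reported as a Pearson correlation between the two instruments' overall percentage scores. A minimal sketch with hypothetical school scores (illustrative values, not data from the studies summarized here):

```python
import numpy as np

# Hypothetical overall implementation percentages for six schools
boq_scores = [85, 72, 91, 60, 78, 88]
set_scores = [82, 70, 95, 65, 74, 90]

# Pearson r between the paired BoQ and SET totals
r = np.corrcoef(boq_scores, set_scores)[0, 1]
```

A strong positive r is what justifies using one tool (e.g., the SET) to validate reporting from the other.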

BoQ and SET
• BoQ is better able than the SET to distinguish among schools that are implementing with fidelity
• SET can be used to validate BoQ reporting
• BoQ can be used to identify additional areas in need of improvement that may not have been identified on the SET (if done within the same time frame)

SET
• Considered more sensitive for initial implementation than for sustained implementation
• Fairly strong psychometrics
• Drawback: a school can score 80% on the SET without having some of the critical features of PBIS in place
• Limited feedback on the implementation process
• Items most appropriate for elementary schools (less interpretable for middle school)

SET
• Use caution with the Expectations Taught and Management subscales
• Time intensive
• Less interpretable and reliable for large schools
• Includes a district support component, but it has only 2 items and yields high scores
• No family engagement, CR, or coaching items

TIC
• Primarily looks at start-up activities (only 6 questions track ongoing development)
• Less useful for fully implementing schools or for looking at sustainability
• Limited empirical research examining its reliability and validity
  - One study examined internal consistency reliability
• Mixed criticisms about being too lenient
• 3 family engagement items
• No district-level, coaching, or CR items

SAS
• The only tool that clearly breaks things down into 4 different systems
• Limited reliability and validity data
• Higher reliability for improvement priority than for current status
• Nonclassroom Settings and Individual Student had the lowest reliability and the greatest variability across staff
• Suggested: look at individual items

SAS
• Item 8: interpret with caution
• Has been used to identify specific strategies associated with reductions in racially disproportionate suspensions
• 3 family engagement items
• No district-level, CR, or coaching components

BAT
• Limited reliability (low test-retest reliability for subscales)
• Not yet validated (Tier 3 most problematic)
• Tier 3 FBA/BIP scores are consistently high/overinflated
• Suggestion: have people with specific knowledge of FBA/BIPs complete the BAT
• 6 family engagement items
• No coaching, CR, or district items

MATT
• No formal reliability or validity work has been done
• 3 family engagement items
• Scoring concerns (inflated implementation scores)
• Suggestion: look at Tier 2 and 3 Organization and Critical Elements subscale scores separately, or at individual items

RtI All Staff Survey
• 5 family engagement items
• 5 CR items
• Aligns with the SIR (29 questions)
• Aligns with the state graphic/model
• Multiple levels

RtI All Staff Survey
• Has reliability and validity information, but less than the SIR
• No coaching items
• Few leadership items

SIR
• Aligns with the RtI All Staff Survey
• 5 family engagement items
• Includes leadership items
• Includes CR items
• Multiple levels

SIR
• Reliable and valid
• Modified CR items have not been re-tested; be careful comparing across years
• Missing district-focused items

Considerations
• Which is most important for you to measure?
  - Initial implementation
  - Sustainability
  - District and/or school level factors
  - Different settings
  - All staff or team perceptions
  - Family engagement
  - Culturally responsive practices
  - Leadership

Assessment Tool Review
• See handout

Using Assessments to Action Plan