Fall Data Review October 2, 2014

Presentation transcript:

Fall Data Review, October 2, 2014
Moodle Site - Drop Box
WARMUP: Review as a team one celebration you would be willing to share out. If you have a challenge to problem-solve, also be ready to share out if willing.

Purpose: Data Review
Leadership teams:
- will have an opportunity to review the status of literacy, mathematics, and behavior/social-emotional systems
- will note celebrations, create a communication plan, and make Action Plans to continue to improve student achievement
Today's presentation has been adapted from the following resources. Thank you!
- MiBLSi, Huron ISD, Ingham ISD, Florida RtI Project, Il Spire
- IES Practice Guides (Math/Reading)
- RtI Innovation Conferences & Kansas MTSS

Materials you will need today
- Data Review (Fall/Winter/Spring) Problem Solving Guide (electronic or handout)
- Worked Example of the Fall Data Review Problem Solving Guide (electronic or handout)
- SIP information; SWIS, SRSS, DIBELS, and Aimsweb reports; Tier 2-3 tracking forms; process surveys from last year (available on pbisapps or electronic PET-Rs)
- Last/this year's Assessment Binder, if the team uses these
- SAS
- PET-R/SWEPT
- Action Plans & Communication Plan

Learning Targets
We can articulate:
- the purpose and value of a Building/District Data Review
- how student outcome data can inform the building/district about student performance
- how process data can inform the building/district about student performance
- how student outcome data and process data can be used together to inform building/district goals
- the purpose and steps in a problem solving process (gather data, identify and analyze problem(s), develop an action plan, evaluate the plan)

Agenda
- Welcome
- Data Overview
- School Improvement Connections
- Team Time
- Lunch (working lunch: sharing)
- Team Time
- Rejoin at 2:45 p.m.
- Data days: May 20 changed to May 21; register ahead
- MTSS News
- Exit Slip & Evaluation

Team Roles
Facilitator: Guide the discussion; keep the team focused; ensure the tasks on the Exit Slip are completed.
Time Keeper: Make sure the team is moving through the process efficiently; have the team gather at 2:45 p.m.
ASSIST Manager: Make any modifications to the SIP/DIP.
Action Plan Recorder: Fill out the Problem Solving Guide and At-a-Glance overviews; record any "To Do's" generated by the team; send the completed Action Plan to team members.

Review Outcome Data
Reading/Math/Writing, SWIS, SRSS: Did the school meet the benchmark?
Review Process Data
Reading/Math: SWEPT/PET-R, M-TIC/PET-M. Behavior: BoQ, BAT, SAS. Social Emotional: SRSS (TBD). Did the school meet the criterion for each measure?
Connect Outcome Data to Process Data
Reading/Math: Could the areas on the process measure that did not meet criterion explain scores below benchmark? Behavior: Could the areas on the process measure that did not meet criterion explain high behavioral incidents? Social Emotional: process measure TBD.
Generate an Action Plan
Identify the items on the process measures for reading/math and behavior to improve; set an action plan for each item.
Dr. Sara Lewandowski
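
The cycle above pairs an outcome check ("Did we meet the benchmark?") with a process check ("Did we implement with fidelity?"). As a rough illustration only, here is a minimal Python sketch of that pairing; the measure names, benchmark values, and the outcome-to-process mapping are placeholder assumptions, not district data.

```python
# Minimal sketch of the outcome/process pairing described on this slide.
# All values below are illustrative placeholders.
outcomes = {"reading_pct_at_benchmark": 72, "math_pct_at_benchmark": 84}
benchmarks = {"reading_pct_at_benchmark": 80, "math_pct_at_benchmark": 80}

process = {"PET-R": 65, "PET-M": 85}             # percent of items in place
criteria = {"PET-R": 80, "PET-M": 80}
related = {"reading_pct_at_benchmark": "PET-R",  # outcome -> process measure
           "math_pct_at_benchmark": "PET-M"}

for outcome, value in outcomes.items():
    if value >= benchmarks[outcome]:
        continue  # benchmark met; nothing to connect
    measure = related[outcome]
    if process[measure] < criteria[measure]:
        print(f"{outcome} below benchmark and {measure} below criterion: "
              f"add the unmet {measure} items to the action plan")
    else:
        print(f"{outcome} below benchmark but process data do not explain it: "
              f"dig deeper")
```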

Data Sets & Purpose: SRESD
Trend / Outcome data:
- Grade level or district (DN, AW) (IE)
- Sub-group (DN, AW) (IE)
- Explore (IE) or (ACT)
- MEAP (IE or MISchoolData.org)
- Aimsweb/DIBELS Next benchmark (P, IE)
- Aimsweb ROI by measure for Reading & Math at building level (P, IE)
- Summary of Effectiveness (DN, P)
- SWIS/behavior, SRSS, suspensions/expulsions/attendance (SWIS or PowerSchool: P)
- Local assessments (P, IE)
Process data:
- PET-R (P) / SWEPT-R (P)
- SAS
- BAT
- BoQ
- PET-M* (P)
- PET-M* Middle School (P)
- MTSS-SA* (P)
- PBIS/Reading/Math TICs
- SET
- ASSIST
Key: AW = AIMSweb; DN = DIBELS Next; IE = Illuminate Ed; P = Team Provided

Schoolwide Overview - Academics: Where to find the academic data! 9

Appendix A-1: Elementary At-a-Glance School-wide Status Overview
Using your data, highlight areas that need support.
Key: PET-R/SWEPT - Planning and Evaluation Tool for reading and math; BoQ - PBIS Benchmarks of Quality; SAS - PBIS Self-Assessment Survey; BAT - PBIS Benchmarks of Advanced Tiers; PA - Phonemic Awareness; PSF - Phoneme Segmentation Fluency; AP - Alphabetic Principle; LSF - Letter Sound Fluency; NWF - Nonsense Word Fluency; FL - Fluency; Comprehension; Vocab - Vocabulary; R-CBM - Reading Curriculum-Based Measure; SWIS - School-wide Information Systems; EWS - Early Warning Systems; MTSS-SA - Building Self-Assessment; OCM - Oral Counting Measure; NIM - Number Identification Measure; Quantity Discrimination Measure; Missing Number Measure.
Adapted from MiBLSi/HISD materials.

Schoolwide Overview - Behavior/Social Emotional

Schoolwide Overview - Behavior

Process Data - Behavior
Benchmarks of Quality (BoQ)
- Completed annually by school leadership teams
- Tier 1 SWPBIS implementation fidelity check
- 53 benchmarks across 10 critical elements of implementation
- Identifies areas of strength and need; informs problem analysis and action planning
- 70% implementation goal
Self-Assessment Survey (SAS)
- Completed annually by building staff
- Fidelity check of PBIS implementation across (a) schoolwide, (b) non-classroom, (c) classroom, and (d) individual student systems
- Seven key elements of the implementation subsystems
- Informs of areas of strength and need, including communication between leadership team and staff
- 70% implementation goal
pbisapps.org
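
To make the 70% goal concrete, here is a minimal Python sketch of how a team might compute an overall implementation percentage from item-level scores and compare it to that goal. The item scores and point values are hypothetical placeholders; this is not the official pbisapps scoring tool.

```python
# Minimal sketch: overall implementation percent from (earned, possible) item
# points, checked against the 70% goal named on the slide. Values are made up.

def implementation_percent(earned_points, possible_points):
    """Percent of possible points earned across all items."""
    return 100.0 * sum(earned_points) / sum(possible_points)

# Hypothetical BoQ-style items: (points earned, points possible) per benchmark.
boq_items = [(3, 3), (2, 3), (1, 2), (2, 2), (0, 3)]

earned = [e for e, _ in boq_items]
possible = [p for _, p in boq_items]
score = implementation_percent(earned, possible)

GOAL = 70.0  # implementation goal from the slide
print(f"BoQ implementation: {score:.1f}% "
      f"({'meets' if score >= GOAL else 'below'} the {GOAL:.0f}% goal)")
```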

Process Data - Behavior 14

Schoolwide Overview - Behavior/Social Emotional

Outcome Data - Behavior 16

Outcome Data - Behavior 17

Outcome Data: SRSS 18

Early Warning Signs 19

Early Warning Signs (EWS)  Routinely available data; available early in the school year  Better predictor than background characteristics  Cut points selected to balance yield and accuracy.  Helps target interventions  Informs of patterns and trends 20

Early Warning Signs (EWS)
ATTENDANCE: Missing more than 10% of instructional time
BEHAVIOR: Suspensions (ISS or OSS); minor or major ODRs
- ISS or OSS: 6 hours of academic instruction lost per day
- ODR: 20 minutes of academic instruction lost per referral
COURSE PERFORMANCE: Course failures, grade point average, credit accrual
- Combinations of academic indicators can reduce graduation likelihood to 55%
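
As a sketch of how a team might apply these cut points, the Python snippet below flags a student on the three EWS indicators and estimates instructional time lost to behavior incidents. The student record and field names are hypothetical; the thresholds (10% of instructional time, 6 hours per suspension day, 20 minutes per ODR) come from the slide.

```python
# Minimal sketch of student-level EWS flags using the slide's cut points.
# The record structure is an assumption, not a specific SIS export.

def ews_flags(student, instructional_days, hours_per_day=6.0):
    flags = []
    if student["days_absent"] / instructional_days > 0.10:
        flags.append("attendance")
    if student["suspension_days"] > 0 or student["odrs"] > 0:
        flags.append("behavior")
    if student["course_failures"] > 0:
        flags.append("course performance")

    # Estimated academic instruction lost to behavior incidents.
    hours_lost = (student["suspension_days"] * hours_per_day
                  + student["odrs"] * 20 / 60)
    return flags, hours_lost

student = {"days_absent": 12, "suspension_days": 2, "odrs": 5, "course_failures": 1}
flags, hours_lost = ews_flags(student, instructional_days=90)  # e.g., one semester
print(flags, f"{hours_lost:.1f} instructional hours lost")
```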

EWS Outcome Data - Building Level
ATTENDANCE: > 90% of students missing less than 10% of instructional time
- State of Ohio retrospective analysis of top/bottom 10% academic outcomes
- 1st-semester 9th grade is a better predictor than grades or failures
BEHAVIOR: > 80% of students with 0 suspensions (ISS or OSS)
- "High Quality Instruction" research
- MTSS targeted intervention
COURSE PERFORMANCE: ACT-Explore data
- Course failures (MTSS model of 80%, corrected for accuracy to 85-90%)
- Credit accrual is building-specific
- Combinations of academic indicators can reduce graduation likelihood to 55%
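
The building-level roll-up implied here is simple percentages against the two targets on the slide (> 90% under the absence cut point, > 80% with zero suspensions). A minimal sketch, assuming a hypothetical roster structure:

```python
# Minimal sketch: building-level EWS roll-up against the slide's targets.
# The roster and field names are illustrative assumptions.
roster = [
    {"days_absent": 3,  "suspension_days": 0},
    {"days_absent": 15, "suspension_days": 1},
    {"days_absent": 6,  "suspension_days": 0},
]
instructional_days = 90

pct_attendance_ok = 100 * sum(
    s["days_absent"] / instructional_days <= 0.10 for s in roster) / len(roster)
pct_no_suspensions = 100 * sum(
    s["suspension_days"] == 0 for s in roster) / len(roster)

print(f"Attendance: {pct_attendance_ok:.0f}% of students (target > 90%)")
print(f"Behavior:   {pct_no_suspensions:.0f}% with 0 suspensions (target > 80%)")
```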

Schoolwide Overview – Behavior Worked Example 23

Process Data Snapshots: Academics & Behavior. What are they good for? 24

We can use the PET/SWEPT to answer the following questions. If we have less than 80% of our students at benchmark:
- What might be happening in our core instruction that could be contributing to that?
- What can we do to improve? Research indicates specific areas that will improve outcomes: Goals/Objectives/Priorities, Assessment, Instructional Practices and Materials, Instructional Time, Differentiated Instruction/Grouping, Administration/Organization/Communication, and Professional Development.
- Does our MTSS system have the most effective steps in place?

PET-M SNAPSHOTS 26

MTSS: Building Self-Assessment
Scale: Not Started (N), In Progress (I), Achieved (A), Maintaining (M)
What does BSA data tell you?

Process Data Snapshots: Behavior
Benchmarks of Quality (BoQ)
- Tier 1 SWPBIS implementation fidelity check
- 53 benchmarks across 10 critical elements
- Identifies areas of strength and need to inform action plans
- Completed annually by school leadership teams
Self-Assessment Survey (SAS)
- Completed annually by building staff
- Fidelity check of PBIS implementation across (a) schoolwide, (b) non-classroom, (c) classroom, and (d) individual student systems
- Seven key elements of the implementation subsystems
- Informs of areas of strength and need, including communication

Process Data Snapshots: PBIS Benchmarks of Quality (BoQ) 29

Process Data Snapshots: PBIS Self-Assessment Survey (SAS)
While summary data from the SAS provides a general sense of a building's PBIS systems, more focused analysis can inform a team of the most vital and influential next steps:
Low Implementation Status + High Staff Priority, within a PBIS Subsystem, point to Targeted Implementation Supports.
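
As a sketch of that focused analysis, the snippet below surfaces features a building's staff rated low on implementation status but high on improvement priority, grouped by PBIS subsystem. The item records, field names, and the 50% cut points are illustrative assumptions, not an official pbisapps report format.

```python
# Minimal sketch: filter SAS items for "low implementation, high priority".
# Records and thresholds are hypothetical placeholders.
sas_items = [
    {"subsystem": "Schoolwide", "feature": "Expected behaviors are taught",
     "pct_in_place": 35, "pct_high_priority": 60},
    {"subsystem": "Classroom", "feature": "Procedures for ODRs are clear",
     "pct_in_place": 80, "pct_high_priority": 20},
]

targets = [i for i in sas_items
           if i["pct_in_place"] < 50 and i["pct_high_priority"] >= 50]

for item in targets:
    print(f"{item['subsystem']}: {item['feature']} "
          f"({item['pct_in_place']}% in place, "
          f"{item['pct_high_priority']}% rate it a high priority)")
```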

Process Data Snapshots: PBIS Self-Assessment Survey (SAS) 31

Barriers?
The PET/SWEPT/BoQ/SAS are good guides for designing and improving a building's MTSS system. However, they can also present many perceived barriers to work around. Remember, think outside of the box! What can we do differently?

Data Connections Activity
- Pick a piece of candy from the bowl on your table.
- Take one Candy Wrapper activity sheet (or a sticky note) from your table and find another individual (not from your district) who has the same piece of candy.
- Ideas to share with one another (and make notes):
  - How are the data sets reviewed so far used in each district?
  - What other types of trend or outcome data are used in your district?
  - What sub-groups does your district look at?
  - How often does your leadership team review data?

Problem Solving Guide 34

Problem Solving Guide: Step 1
Determine your (first) problem to be addressed today based on what you've derived from:
- Previous SIP
- Overviews of Academics and Behavior (At-a-Glances)
- Outcome Data
- Process Data and Process Data Snapshots

Problem Solving Guide: Step 2
Complete a problem analysis: hypothesize what may be contributing to the problem. Again, your data and the Snapshots can inform this discussion.

Problem Solving Guide: Worked Example 37

Problem Solving Guide: Worked Example 38

Problem Solving Guide: Worked Example 39

Problem Solving Guide: Worked Example 40

Problem Solving Guide 41

Problem Solving Guide: Step 3 42

Problem Solving Guide: Steps 3 & 4 43

District/School Improvement Plans
Summary information from Data Reviews (Math, Reading, Behavior, SRSS, EWS) should be used to:
- Inform (create/revise) the plan's measurable objectives, strategies & activities
- Monitor and evaluate SI strategies & activities

Goals, Objectives, Strategies, Activities, Resources
Goals: Student Goal Statement, e.g., "All students will increase proficiency in mathematics/reading/writing, etc."
Objectives: Measurable objective statement(s): what will happen, with which students, by when. At least one must address the state assessment; at least one should address critical objectives across one or more tiers of support.
Strategies: What staff will do to attain the goal and objectives; what teachers will implement; research-based practices; clear connection to the Consolidated Application/Title I budget.
Activities: Activity description, activity type, planned/actual staff, planned/actual timeline; include actions to monitor and evaluate strategy implementation; clear connection to the Consolidated Application/Title I budget.
Resources: Funding source; planned/actual amount.
Improvement Plans + Problem Solving Guide = Student Achievement

Remember…
The Building Leadership Team does not have to solve every problem, but it does need to study building data to determine the school-wide needs it will address, identify grade-level needs, and ensure that the appropriate individuals or teams who will address those needs are identified (e.g., which grade-level teams need to address the identified needs).

Team Time: Complete School-wide Overview Sheets (Behavior/Academic) and/or Problem Solving Guide
- Complete overview sheets
- Review/update the previous action plan
- Identify building Celebrations and Opportunities: share out
- Prioritize "Problems" for today's process
You do!

Team Time: Complete the Problem Solving Process
- Choose a problem; complete the problem solving process and create an action plan.
- Move on to a second (and third) problem, if able.
- Communication Plan
- Lunch at 12:00
- Reconvene at 2:45
- Updates: Next Year
- Exit Slip
You do!

Exit Checklist: Building Data Review
Please have your time-keeper check off each task when completed using the list below. Turn this form in to the MTSS Facilitator at the end of the day, verifying that your completed Data Review Guide was sent to:
Date: ________________ Building: _______________________ District: ____________________________
Check when complete:
- Problem Solving Guide: Literacy
- Problem Solving Guide: Mathematics
- Data Review Problem Solving Guide (Large Packet)
- Special Education Tracking Form (Coaches)
- Action Plan (Appendix C)
- Communication Plan (Appendix D)
- Building Summary Report for District Data Review (Appendix E). This item is critical because it will inform future District-level Data Reviews.
- Completed Data Review Guide emailed to Rebecca Buxton:

Session Evaluation
Rate your knowledge/skills/competence for the following items upon the completion of today's Data Review.