Presentation on theme: "Fall Data Review October 2, 2014 Moodle Site-Drop Box 1 WARMUP: Review as a team one celebration you would be willing to share out. If you have a challenge."— Presentation transcript:

1 Fall Data Review October 2, 2014 Moodle Site-Drop Box 1 WARMUP: As a team, identify one celebration you would be willing to share out. If you have a challenge to problem-solve, also be ready to share that out if willing.

2 Leadership Team Purpose: DATA REVIEW. Teams will have an opportunity to review the status of literacy, mathematics, and behavior/social emotional systems; they will note celebrations, create a communication plan, and make action plans to continue to improve student achievement. Today's presentation has been adapted from the following resources. Thank you! MiBLSi, Huron ISD, Ingham ISD, Florida RtI Project, Il Spire, IES Practice Guides (Math/Reading), RtI Innovation Conferences, and Kansas MTSS 2

3 Materials you will need today  Data Review (Fall/Winter/Spring) Problem Solving Guide (electronic or handout)  Worked example of the Fall Data Review Problem Solving Guide (electronic or handout)  SIP information, SWIS, SRSS, DIBELS, Aimsweb reports & Tier 2-3 tracking forms, process surveys from last year (available on pbisapps or electronic PET-Rs)  Last/this year's assessment binder, if the team uses these  SAS  PET-R/SWEPT  Action Plans & Communication Plan 3

4 Learning Targets We can articulate…  the purpose and value of a building/district data review.  how student outcome data can inform the building/district about student performance.  how process data can inform the building/district about student performance.  how student outcome data and process data can be used together to inform building/district goals.  the purpose and steps in a problem-solving process (gather data, identify and analyze problem(s), develop an action plan, evaluate the plan). 4

5 Agenda  Welcome  Data Overview  School Improvement Connections  Team Time  Lunch (Working Lunch: Sharing)  Team Time  Rejoin at 2:45 p.m.  Data Days 2014-15: May 20 changed to May 21; register ahead.  MTSS News  Exit Slip & Evaluation 5

6 Team Roles Facilitator: Guide the discussion; keep the team focused; ensure the tasks on the Exit Slip are completed. Time Keeper: Make sure the team is moving through the process efficiently; have the team gather at 2:45 p.m. ASSIST Manager: Make any modifications to the SIP/DIP Action Plan. Recorder: Fill out the Problem Solving Guide and At-a-Glance Overviews; record any "To Do's" generated by the team; send the completed Action Plan to team members. 6

7 Review Outcome Data (Reading/Math/Writing, SWIS, SRSS): Did the school meet the benchmark?
Review Process Data (Reading/Math: SWEPT/PET-R, M-TIC/PET-M; Behavior: BoQ, BAT, SAS; Social Emotional: SRSS-TBD): Did the school meet the criterion for each measure?
Connect Outcome Data to Process Data. Reading/Math: Could the areas on the process measure that did not meet criterion explain scores below benchmark? Behavior: Could the areas on the process measure that did not meet criterion explain high behavioral incidents? Social Emotional: process measure TBD.
Generate Action Plan: Identify the items on the process measures for reading/math and behavior to improve, and set an action plan for each item. Dr. Sara Lewandowski 7

8 Data Sets & Purpose: SRESD TrendOutcome Process  Grade Level (DN, AW) (IE) Or District  Sub-group (DN, AW) (IE)  Explore (IE) or (ACT)  MEAP (IE or MISchoolData.org)  Aimsweb/DIBELSNext Benchmark  Aimsweb ROI by Measure for Reading & Math (P, IE) at Building Level (P, IE)  Summary of Effectiveness (DN, P)  SWIS/Behavior, SRSS, Suspensions/Expulsions/Attendance (SWIS or PowerSchool: P)  Local Assessments (P, IE)  PET-R (P)/ SWEPT-R (P)  SAS  BAT  BoQ  PET-M * (P)  PET-M * Middle School (P)  MTSS-SA * (P)  PBIS/Reading/Math TICs  SET  ASSIST AW = AIMsweb; DN = DIBELSNext, IE = Illuminate Ed; P = Team Provided 8

9 Schoolwide Overview- Academics Where to find the academic data! 9

10 Appendix A-1 Elementary At-a-Glance School-wide Status Overview. Using your data, highlight areas that need support. PET-R/SWEPT: Planning and Evaluation Tool for reading and math; BoQ: PBIS Benchmarks of Quality; SAS: PBIS Self-Assessment Survey; BAT: PBIS Benchmarks of Advanced Tiers; PA: Phonemic Awareness; PSF: Phoneme Segmentation Fluency; AP: Alphabetic Principle; LSF: Letter Sound Fluency; NWF: Nonsense Word Fluency; FL: Fluency; Comprehension; Vocab: Vocabulary; R-CBM: Reading Curriculum-Based Measure; SWIS: School-wide Information System; EWS: Early Warning Systems; MTSS-SA: Building Self-Assessment; OCM: Oral Counting Measure; NIM: Number Identification Measure; QDM: Quantity Discrimination Measure; MNM: Missing Number Measure. Adapted from MiBLSi/HISD materials, 2013-14. 10

11 Schoolwide Overview- Behavior/Social Emotional

12 Schoolwide Overview- Behavior

13 Process Data - Behavior
Benchmarks of Quality (BoQ): Completed annually by school leadership teams. Tier 1 SWPBIS implementation fidelity check; 53 benchmarks across 10 critical elements of implementation. Identifies areas of strength and need; informs problem analysis and action planning. 70% implementation goal.
Self-Assessment Survey (SAS): Completed annually by building staff. Fidelity check of PBIS implementation across (a) schoolwide, (b) non-classroom, (c) classroom, and (d) individual students. Seven key elements of the implementation subsystems. Identifies areas of strength and need, including communication between the leadership team and staff. 70% implementation goal. pbisapps.org
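For teams that want to sanity-check a fidelity score against the 70% goal outside of pbisapps.org, the arithmetic is simply points earned over points possible. The sketch below is a minimal, hypothetical illustration of that calculation; the item names and point values are invented and are not actual BoQ or SAS items.

```python
# Minimal sketch (not the official pbisapps.org scoring): implementation
# percentage = points earned / points possible, checked against the 70% goal.
# The item names and point values below are hypothetical, not real BoQ/SAS items.

def implementation_percent(item_scores, item_max_points):
    """Percent of possible points earned across all benchmark items."""
    earned = sum(item_scores.values())
    possible = sum(item_max_points.values())
    return 100.0 * earned / possible

# Hypothetical BoQ-style items: score earned vs. maximum points per item.
scores = {"expectations_posted": 2, "reward_system": 1, "data_entry": 3}
max_points = {"expectations_posted": 3, "reward_system": 3, "data_entry": 3}

pct = implementation_percent(scores, max_points)
print(f"Implementation: {pct:.0f}% -> 70% goal met: {pct >= 70}")
```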

14 Process Data - Behavior 14

15 Schoolwide Overview- Behavior/Social Emotional

16 Outcome Data - Behavior 16

17 Outcome Data - Behavior 17

18 OUTCOME DATA: SRSS 18

19 Early Warning Signs 19

20 Early Warning Signs (EWS)  Routinely available data; available early in the school year  Better predictor than background characteristics  Cut points selected to balance yield and accuracy.  Helps target interventions  Informs of patterns and trends 20

21 Early Warning Signs (EWS) ATTENDANCE: Missing more than 10% of instructional time BEHAVIOR: Suspensions (ISS or OSS); Minor or Major ODRs  ISS or OSS: 6 hours of academic instruction lost per day  ODR: 20 minutes of academic instruction lost for student per referral COURSE PERFORMANCE: Course failures, grade point average; credit accrual  Combinations of academic indicators can reduce graduation likelihood to 55% 21
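To make these cut points concrete, here is a minimal sketch of how a team's data person might flag students against them. It assumes a simple, hypothetical student record (the field names and example data are invented for illustration) rather than any specific student information system export.

```python
# Minimal sketch, assuming a hypothetical student record layout (not from the
# presentation): flags students against the EWS cut points on this slide.

def ews_flags(student):
    """Return the Early Warning indicators a student trips."""
    flags = []
    if student["days_absent"] / student["days_enrolled"] > 0.10:
        flags.append("attendance")          # missing >10% of instructional time
    if student["suspensions"] > 0 or student["odrs"] > 0:
        flags.append("behavior")            # any ISS/OSS or ODR
    if student["course_failures"] > 0:
        flags.append("course performance")  # one or more course failures
    return flags

example = {"days_absent": 9, "days_enrolled": 80, "suspensions": 0,
           "odrs": 2, "course_failures": 1}
print(ews_flags(example))  # -> ['attendance', 'behavior', 'course performance']
```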

22 EWS Outcome Data - Building Level ATTENDANCE: >90% of students missing less than 10% of instructional time  State of Ohio retrospective analysis of top/bottom 10% academic outcomes  1st-semester 9th-grade attendance is a better predictor than grades or failures BEHAVIOR: >80% of students with 0 suspensions (ISS or OSS)  "High Quality Instruction" research  MTSS targeted intervention COURSE PERFORMANCE: ACT-Explore data  Course failures (MTSS model of 80%, corrected for accuracy to 85-90%)  Credit accrual is building-specific  Combinations of academic indicators can reduce graduation likelihood to 55% 22
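A similarly hypothetical sketch for the building-level goals on this slide: aggregate the student records and check whether more than 90% of students miss less than 10% of instructional time and more than 80% have zero suspensions. The roster data below is made up.

```python
# Minimal sketch (hypothetical roster data): checks the building-level EWS
# goals -- >90% of students missing <10% of instructional time, and
# >80% of students with zero suspensions.

students = [  # hypothetical roster: (fraction of time missed, suspension count)
    (0.02, 0), (0.12, 1), (0.05, 0), (0.08, 0), (0.03, 0),
]

pct_attendance_ok = sum(m < 0.10 for m, _ in students) / len(students)
pct_no_suspension = sum(s == 0 for _, s in students) / len(students)

print(f"Attendance goal met: {pct_attendance_ok > 0.90}")  # 4/5 = 80% -> False
print(f"Behavior goal met:   {pct_no_suspension > 0.80}")  # 4/5 = 80% -> False
```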

23 Schoolwide Overview – Behavior Worked Example 23

24 Process Data Snapshots ACADEMICS & BEHAVIOR What are they good for? 24

25 We can use PET/SWEPT to answer the following questions: If we have fewer than 80% of our students at benchmark, what might be happening in our core instruction that is contributing to that, and what can we do to improve? Research indicates specific areas that will improve outcomes, including Goals/Objectives/Priorities, Assessment, Instructional Practices and Materials, Instructional Time, Differentiated Instruction/Grouping, Administration/Organization/Communication, and Professional Development. Does our MTSS system have the most effective steps in place? 25

26 PET-M SNAPSHOTS 26

27 MTSS: Building Self-Assessment. Scale: Not Started (N) – In Progress (I) – Achieved (A) – Maintaining (M). What does BSA data tell you? 27

28 Process Data Snapshots: BEHAVIOR
Benchmarks of Quality (BoQ)  Tier 1 SWPBIS implementation fidelity check  53 benchmarks across 10 critical elements  Identifies areas of strength and need to inform action plans  Completed annually by school leadership teams
Self-Assessment Survey (SAS)  Completed annually by building staff  Fidelity check of PBIS implementation across (a) schoolwide, (b) non-classroom, (c) classroom, and (d) individual students  Seven key elements of the implementation subsystems  Identifies areas of strength and need, including communication 28

29 Process Data Snapshots: PBIS Benchmarks of Quality (BoQ) 29

30 Process Data Snapshots: PBIS Self-Assessment Survey (SAS) While summary data from the SAS provides a general sense of a building's PBIS systems, more focused analysis can inform a team of the most vital and influential next steps: items with Low Implementation Status and High Staff Priority, organized by PBIS Subsystem, point to Targeted Implementation Supports. 30
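One way to picture that focused analysis: filter SAS items to those that a large share of staff rate as not in place and as a high improvement priority, then group them by subsystem. The sketch below is only an illustration; the item records, field names, and 50% thresholds are assumptions, not actual SAS exports or cut scores.

```python
# Minimal sketch of the "focused analysis" idea: surface SAS items rated by
# many staff as not in place AND as a high improvement priority, by subsystem.
# Records, field names, and thresholds are hypothetical.

def targeted_items(items, status_cut=0.50, priority_cut=0.50):
    """Items with low implementation status and high staff priority."""
    return [
        i for i in items
        if i["pct_not_in_place"] >= status_cut       # low implementation status
        and i["pct_high_priority"] >= priority_cut   # high staff priority
    ]

sas_items = [
    {"subsystem": "Classroom", "item": "Consistent consequences",
     "pct_not_in_place": 0.60, "pct_high_priority": 0.70},
    {"subsystem": "Schoolwide", "item": "Expectations taught",
     "pct_not_in_place": 0.20, "pct_high_priority": 0.30},
]

for item in targeted_items(sas_items):
    print(f'{item["subsystem"]}: {item["item"]}')  # -> Classroom: Consistent consequences
```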

31 Process Data Snapshots: PBIS Self-Assessment Survey (SAS) 31

32 Barriers? The PET/SWEPT/BoQ/SAS are good guides for designing and improving a building MTSS system. However, they can also surface many perceived barriers to work around. Remember, think outside of the box! What can we do differently? 32

33 Data Connections Activity  Pick a piece of candy from the bowl on your table.  Take one Candy Wrapper activity sheet (or a sticky note) from your table and find another individual (not from your district) who has the same piece of candy.  Ideas to share with one another (and make notes):  How are the data sets reviewed so far used in each district?  What other types of trend or outcome data are used in your district?  What sub-groups does your district look at?  How often does your leadership team review data? 33

34 Problem Solving Guide 34

35 Problem Solving Guide: Step 1 Determine your (first) problem to be addressed today based on what you've derived from: previous SIP; Overviews of Academics and Behavior (At-a-Glances); Outcome Data; Process Data and Process Data Snapshots. 35

36 Problem Solving Guide: Step 2 Complete a Problem Analysis: Hypothesize what may be contributing to the problem Again, your data and the Snapshots can inform this discussion. 36

37 Problem Solving Guide: Worked Example 37

38 Problem Solving Guide: Worked Example 38

39 Problem Solving Guide: Worked Example 39

40 Problem Solving Guide: Worked Example 40

41 Problem Solving Guide 41

42 Problem Solving Guide- Step 3 42

43 Problem Solving Guide Steps 3 & 4 43

44 Summary information from Data Reviews (Math, Reading, Behavior, SRSS, EWS) should be used to:  Inform (create/revise) the plan's measurable objectives, strategies & activities  Monitor and evaluate SI strategies & activities District/School Improvement Plans 44

45 Goals – Objectives – Strategies – Activities – Resources
Goal: Student Goal Statement, e.g., "All students will increase proficiency in mathematics/reading/writing, etc."
Measurable Objective Statement(s): What will happen, with which students, by when. At least 1 must address the state assessment; at least 1 should address critical objectives across one or more tiers of support.
Strategies: What staff will do to attain the goal & objectives; what teachers will implement; research-based practices; clear connection to Consolidated Application/Title I budget.
Activities: Activity description, activity type, planned/actual staff, planned/actual timeline; include actions to monitor & evaluate strategy implementation; clear connection to Consolidated Application/Title I budget.
Resources: Funding source; planned/actual amount.
Improvement Plans + Problem Solving Guide = Student Achievement 45

46 Remember… The Building Leadership Team does not have to solve every problem, but it does need to study building data to determine the school-wide needs it will address, identify grade-level needs, and ensure that the appropriate individuals or teams who will address those needs are identified (e.g., which grade-level teams need to address the identified needs).

47 Team Time: Complete School-wide Overview Sheets (Behavior/Academic) and/or Problem Solving Guide  Complete overview sheets  Review/update previous action plan  Identify building Celebrations and Opportunities: Share out  Prioritize “Problems” for today’s process You do! 47

48 Team Time: Complete Problem Solving Process  Choose a problem; complete the problem solving process and create an action plan.  Move on to second (and third) problem, if able.  Communication Plan  LUNCH at 12:00  Reconvene at 2:45  Updates: Next Year  Exit Slip You do! 48

49 Exit Checklist: Building Data Review
Please have your time-keeper check off each task when completed, using the table below. Turn this form in to the MTSS Facilitator at the end of the day, verifying that your completed Data Review Guide was sent to buxton@sresd.org.
Check when complete:  Problem Solving Guide: Literacy  Problem Solving Guide: Mathematics  Data Review Problem Solving Guide (Large Packet)  Special Education Tracking Form (Coaches)  Action Plan (Appendix C)  Communication Plan (Appendix D)  Building Summary Report for District Data Review (Appendix E) – this item is critical because it will inform future District-level Data Reviews  Completed Data Review Guide e-mailed to Rebecca Buxton: buxton@sresd.org
Date:________________ Building:_______________________ District:____________________________

50 Session Evaluation Rate your knowledge/skills/competence for the following items upon the completion of today's Data Review. 50

