Spring Data Review Cohort 7 Spring 2013 Elementary Schools


1 Spring Data Review Cohort 7 Spring 2013 Elementary Schools
Trainer Notes: This will be Cohort 7's last core training with MiBLSi.

2 Acknowledgements The materials for this training day were developed with the efforts of… Melissa Nantais Kim St. Martin Anna Harms Jennifer Rollenhagen Tennille Whitmore Content was based on the work of… Roland Good, University of Oregon Stephanie Stollar, Dynamic Measurement Group (DMG) Rob Horner, University of Oregon George Sugai, University of Connecticut Joe Torgesen, Florida Center for Reading Research Dawn Miller, Shawnee Mission School District Michigan’s RtI State Framework and Guidance Committee Trainer Notes: This slide contains acknowledgements of those who contributed to the development of this content along with whose work this content is based upon.

3 Group Expectations To make this day the best possible, we need your assistance and participation Be Responsible Attend to the “Come back together” signal Active participation…Please ask questions Be Respectful Please allow others to listen Please turn off cell phones and pagers Please limit sidebar conversations Share “air time” Please refrain from email and Internet browsing Be Safe Take care of your own needs Trainer Notes: Please take a few moments to review the group expectations at the start of the day.

4 Purpose of Spring Data Review Activities
Review the role of the school leadership team in sustaining the work of MTSS implementation Make the work of MTSS visible within the school improvement plan Provide teams with time and a structure to review school-wide data for the purposes of developing a plan that will improve student outcomes Provide teams with time and a structure to identify and summarize celebrations and areas of need to share with stakeholders Trainer Notes: This slide contains the purpose of Spring Data Review activities. Be sure to review the purpose with participants.

5 Today’s Agenda Introduction
Gather: Ensuring continued accurate and efficient collection and use of data Study: Understanding the data, strengths, areas of need, gap and cause for gap Plan: Integrate improvement priorities into the school improvement plan Do: Monitoring the plan and next steps Trainer Notes: This slide contains the agenda for today’s spring data review. Please consider posting the agenda on chart paper at the front of the room to refer to throughout the day.

6 1.0 Introduction Trainer Notes:
This module provides an introduction to today’s data review. It should begin by 9:05 a.m. and end by 9:30 a.m.

7 Getting Ready for Today
Take a moment to identify the following roles: Facilitator Recorder Timekeeper It will be helpful for the recorders to have access to someone’s computer! Trainer Notes: Take a few moments to allow the implementation teams to determine roles for the day.

8 Do You Have What You Need?
Pink Assessment Binder Paper or electronic copies of your data Follow-up activities worksheets/action plans from Fall and Winter Data Reviews School Improvement Plan Trainer Notes: The intent of this slide is to provide a quick review of materials needed for today’s training.

9 Building-Level Data Review: Big Ideas
Completed three times per year Problem-solving focus tied to School Improvement Plan Focused on both Program Quality/Fidelity data & Student Outcome data Results in an action plan that specifies what needs to be done, by whom & by when, and the resources needed Trainer Notes: This slide provides a review of the big ideas behind building-level data review. Please take a few moments to review the list with the group.

10 Purpose of a Building-Level Data Review
To understand the status of MTSS implementation and impact on student outcomes To identify small and large successes, communicate those successes and capitalize on them To identify where support is needed and begin communicating and organizing resources so that support is provided where needed Trainer Notes: This slide provides a review of building-level data review. Have participants read through these bullets on their own.

11 Team Share At the Winter Data Review we asked each team to share one aspect of their implementation plan that the group could hold them accountable for in May Briefly review your progress with your team Identify at least one aspect of this year’s implementation efforts related to what your team was being held accountable for that has gone well – Be prepared to share Trainer Notes: This team share should begin by 9:10 a.m. and end by 9:30 a.m. It may end earlier depending on the size of your group. There are a variety of ways you can structure the share out of this team time. Please choose an option that will work best for your audience. Here are some suggestions (please note that you do not need to choose one of these options if there is another option you would like to use): For a small to medium group – Same role partner; find someone at another table who has the same job role as you (e.g., teacher, principal, ISD staff, coach, etc.); once you have a partner, have a standing conversation, each sharing your groups’ implementation success; at the signal, thank your partner and return to your team For a medium to large sized group – Go Visual/Museum Tour; create a poster (with a visual) representing what your team was being held accountable for and your success with implementation; designate one team member to stay with your poster and answer questions/ provide clarification; all other team members go on a “museum tour” looking at what other teams are doing and making notes to take back to their team

12 2.0 Gather Trainer Notes: This module should begin by 9:30 a.m. and end at 10:00 a.m.

13 AIMSweb Data Collection
MiBLSi does not have direct access to pull data from your AIMSweb accounts. Cohort 7 schools need to submit summaries of their AIMSweb screening data to MiBLSi using the AIMSweb Data Sheet for 2012_13 spreadsheet (send to or your TAP). Eight teams sent their spreadsheets after Winter screening. We use these data for problem solving to provide the best supports possible. We also use summaries of the data for grant reporting. [Skip this]

14 Cohort 1-7 District Data Sharing Agreements
The letter provides information about how MiBLSi uses data collected from schools. NEED TO ACT: Please have either the UO DIBELS Data System or DIBELSnet form signed by your district’s superintendent or assistant superintendent (someone who can provide permission for the entire district). We need the DIBELSnet form to access data from schools that have switched to this data system for DIBELS Next. We need the UO DIBELS Data System form as part of an update process to comply with recent changes to FERPA requirements. Even though the form is being signed by the district, we will not access data from schools in C1-7 that have not participated with us. [skip]

15 MIBLSI ASKS SCHOOLS AND DISTRICTS TO USE DATA FOR:
Data-based decision making as part of a continuous school improvement process to improve student outcomes through effective/efficient implementation of research-based practices. MIBLSI COLLECTS DATA FROM SCHOOLS AND USES IT FOR: Project-level data-based decision making to inform allocation of resources and effective programming support Accountability to our grant funding sources [Ignore bottom part]. This slide talks about why these data are important. Data are used as part of an ongoing school improvement process, where we look at student outcome data and systems process data to determine if our instruction is working and if it is being implemented with fidelity.

16 Rates of Data Submission: Cohort 7
You can ignore the results of this graph, which looks at what percentage of MiBLSi schools submitted each evaluation piece. [Not SE Data Collection Form]

17 Gather We want to gather information that tells us:
How well we are implementing/doing something: Program Quality/Fidelity Data AND Whether what we’re doing is working: Student Outcome Data The data you’ve collected this year has been a combination of program quality/fidelity/systems process data and student outcome data. So, how well are we implementing something and is it resulting in improved performance? Remember that these two things are closely related, and without fidelity, an otherwise effective system is unlikely to result in student improvement.

18 Why Do We Want Both Types of Data?
Trainer Notes: Download and embed the video “Data in the Classroom” onto this slide. The intent of the video is to emphasize the importance of having both types of data – student outcome data and the program quality/fidelity data. When you watch the video your attention is on the data the announcer asks for – the number of red cards dealt. However, when you watch the video a second time, most people will now notice the additional data, the message on the cards that says “Be sure to use at least 2 types of data.” The two types of data we need to make sure we are looking at together to get the complete picture are student outcome data and program quality/fidelity data.

19 Behavior Data You’ve Collected…
Student Outcome Data: Office Discipline Referral Data; Tier 2/Tier 3 Intervention Tracking form
Program Quality/Fidelity Data: Benchmarks of Quality (BoQ); PBIS Self-Assessment Survey (SAS); Benchmarks for Advanced Tiers (BAT); School-wide Evaluation Tool (SET)
Trainer Notes: Here is a list of the behavior data teams have been collecting since beginning with the project.

20 Literacy Data You’ve Collected…
Student Outcome Data: Reading Curriculum Based Measurement (R-CBM) screening & progress monitoring data; Tier 2/Tier 3 Intervention Tracking form
Program Quality/Fidelity Data: Planning and Evaluation Tool-Revised (PET-R) or School-wide Evaluation and Planning Tool (SWEPT)
Trainer Notes: Here is a list of the literacy data teams have been collecting since beginning with the project.

21 Considerations in Building Sustainable Systems of Data-Based Decision Making
Data Collection Training (Initial and Re-Training) Accuracy Checks for Administration & Scoring Scheduling of Assessments Data Entry Time for Data Entry Accuracy of Data Entry Training for Data Entry (Initial and Re-Training) Data Sharing Training in Interpretation of Data (Initial and Re-Training) Ensuring Timely Access to Data Formal and Informal Data Sharing Trainer Notes: The goal is to ensure accurate collection of data that is accessible in a timely fashion for ongoing decision making at the school-wide, grade level, classroom, and individual student levels. The intent of this slide and the upcoming activity is to help teams to consider how to ensure data-based decision making remains a part of the sustainable system that will continue after their formal partnership and trainings with the project have ended.

22 Sustaining Data Collection & Review
Supports from MiBLSi: Measurement Schedule Measurement page on the MiBLSi website Reading Data Coordinator listserv PBIS Assessment listserv SWIS Facilitator listserv Training materials on the MiBLSi website for data review DIBELS Mentor training (August) SWIS Facilitator trainings (Fall 2013) Trainer Notes: This slide provides a review of the continued supports the project will provide to assist with sustaining data collection and review after formal trainings with the project have ended.

23 Sustaining Data Collection & Review
Trainer Notes: We know that teams often encounter “road blocks” or barriers to the continued collection and use of data for decision-making after their time with the project. Since our focus has always been on sustainable practices, teams need to begin to identify what the potential road blocks might be in order to brainstorm ways to prevent them or to get around them if/when they come up.

24 Example Planning Sheet
Trainer Notes: This is a screen shot of an example Sustaining Data Collection & Analysis Planning Sheet. Walk teams through the example, focusing on the ODR row with the anticipated barrier, brainstorming action items, and identification of the necessary supports to address the potential barrier. Teams will use a similar process during the Team Time on the next slide.

25 Team Time Review the Measurement Schedule and determine how to best ensure that data are collected during the school year Complete the Planning Sheet by identifying any potential barriers to continued data collection and data reviews for the school year Brainstorm strategies to prevent or overcome the potential barriers Identify who can help address any remaining barriers Trainer Notes: Provide teams with 20 minutes to complete this activity. This team time should begin by 9:40 a.m. and end at 10:00 a.m. The intent of this activity is to help teams to consider how to ensure data-based decision making remains a part of the sustainable system that will continue after their formal partnership and trainings with the project have ended. Teams are also asked to anticipate potential barriers to continued data collection and analysis and brainstorm how to address these potential barriers. There is a worksheet in the participant workbook to help teams through this Team Time. The worksheet is titled “Sustaining Data Collection & Analysis Planning Sheet”. Provide a break from 10:00 a.m. to 10:15 a.m.

26 3.0 Study Trainer Notes: This module should begin at 10:15 a.m. and end at 12:00 p.m. when teams break for lunch.

27 Using Data for Decision Making
Trainer Notes: Download and embed the video entitled “School Leadership Videos – Using Data”. The intent of this video is to stress the importance of the data review process and link to school improvement planning.

28 Sometimes, reviewing data can be awkward…
Trainer Notes: Download and embed the video clip “Do you trust your data”. The intent of this video is to convey the message that sometimes reviewing data can be awkward, especially if someone does not trust the data. Teams need to be aware of the emotions that may come along with reviewing data and stay focused on the things that are within their control.

29 Understanding the Parts of a School Improvement Plan
Trainer Notes: This slide is animated. The intent of this slide is to outline the steps of the School Improvement Process and to point out where program quality/fidelity data and student outcome data fit into the School Improvement Planning Process.

30 Partner Activity
Partner 1: Review the definition of goals and objectives and share with Partner 2
Partner 2: Review the definition of strategies and activities and share with Partner 1
Trainer Notes: Provide 5 minutes for this activity.

31 Questions & Data Source for Building-Level Data Analysis
Trainer Notes: These screen shots of the two-page document “Questions and Data Sources for Building-Level Analysis” are intended to help teams organize the study process coming up. Before teams get started studying their data, it is important to point out a few important parts of this document. Have teams open up their workbook to these two pages. On the first page have them highlight the box after question #6 and stress that the answers to items 1-6 should allow teams to identify the gap statement and write precise problem statements. On the second page have participants highlight the last box and stress that the use of the program quality/fidelity data along with the behavior data (remember the integration of reading and behavior) in questions 7-15 will help teams formulate the cause for gap statement. There are three versions of this document in the participant workbook – one for DIBELSnet, one for DDS, and one for AIMSweb. Be sure to have participants locate the appropriate version for the data system they use.

32 Activity Independently:
Review the document “Questions and Data Sources for Building-Level Analysis” As a Team: Identify any questions or data sources that your team needs additional clarification around Trainer Notes: Provide teams with 5-10 minutes for this activity. Be sure to circulate the room while teams are engaged in this activity. Debrief with the group after the activity to determine which data sources teams may need additional supports with. The following 34 slides are hidden; trainers can unhide the slides that are needed for reviewing data reports with teams.

33 Effectiveness of Instructional Support Reports

34 The Effectiveness of Instructional Support Levels report examines the effectiveness of a school’s instructional support by grouping students by their benchmark status category at one assessment period and then determining how well that group did at the next assessment period.

35 Contains data on the performance of the entire grade at the middle of the year based on the composite score. Contains data on the performance of the entire grade at the end of the year based on the composite score.

36 Of the students who were at or above benchmark in the winter, what percent remained at or above benchmark in the spring? The data indicate that of the 56 students who were at or above benchmark in the winter, 84% (47) remained at or above benchmark in the spring.

37 Of the students who were below benchmark in the winter, what percent moved from below benchmark to at or above benchmark in the spring? The data indicate that of the 12 students who were below benchmark in the winter, 67% (8) moved to at or above benchmark in the spring.

38 Of the students who were well below benchmark in the winter, what percent moved to below benchmark or at or above benchmark in the spring? The data indicate that of the 1 student who was well below benchmark in the winter, 100% (1) moved to below benchmark or at or above benchmark in the spring.
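The arithmetic behind these effectiveness questions is simple: of the students in a given benchmark status at one assessment period, what percent reached a target status at the next period? The sketch below illustrates that calculation with hypothetical student records (this is not the actual DIBELS Next export format, and the function name is illustrative):

```python
# Sketch of the Summary of Effectiveness arithmetic: of the students in a
# given benchmark status at one assessment period, what percent reached a
# target status at the next period? Student records here are hypothetical.

def percent_moving(students, from_status, to_statuses):
    """Percent of students starting in from_status whose spring status is in to_statuses."""
    group = [s for s in students if s["winter"] == from_status]
    if not group:
        return None  # no students in this category; avoid dividing by zero
    moved = [s for s in group if s["spring"] in to_statuses]
    return round(100 * len(moved) / len(group))

students = [
    {"winter": "at_or_above", "spring": "at_or_above"},
    {"winter": "at_or_above", "spring": "below"},
    {"winter": "below", "spring": "at_or_above"},
    {"winter": "well_below", "spring": "below"},
]

print(percent_moving(students, "at_or_above", {"at_or_above"}))          # 50
print(percent_moving(students, "well_below", {"below", "at_or_above"}))  # 100
```

Running the same calculation for each starting category reproduces the three questions on slides 36-38.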

39 Status Report by Subgroup

40 DIBELS Next Distribution Report
Trainer Notes: This is a screenshot of a DIBELS Next Distribution Report.

41 DIBELS Next Distribution Report by Subgroup
Trainer Notes: This is a screenshot of the DIBELS Next Distribution Report by subgroup. Remind teams that it is important that they are examining their data disaggregated by subgroups such as race/ethnicity and SES. To be able to run these reports, teams must be sure to enter relevant student information into the data system.

42 DIBELS Next Cross Year Box Plot
Trainer Notes: This is a screen shot of the DIBELS Next Cross Year Box Plot graph. This report is helpful to teams when examining data over time.

43 DIBELS Next Summary of Effectiveness Report
Trainer Notes: This is a screenshot of the DIBELS Next Summary of Effectiveness Report.

44 AIMSweb Tier Transition Report
Trainer Notes: This is a screenshot of the AIMSweb Tier Transition Report.

45 AIMSweb Tier Transition Report by Subgroup
Expand Report Options, select the group, and click Display. Trainer Notes: The AIMSweb Tier Transition Report by subgroup looks identical to the typical Tier Transition Report. This provides a screen shot of how to select the options to run this report by subgroup.

46 How Effective is our Reading System of Support?
Core: meets the needs of at least 80% of all students; supports % of students to make adequate progress
Strategic: meets the needs of 15% of students who need more than just Core; supports 80% of these students to achieve benchmark
Intensive: meets the needs of 5% of students who need intensive intervention; supports 80% of these students to progress to strategic or benchmark support
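The criteria above can be read as a simple set of checks: roughly 80% of students succeed with Core support alone, about 15% need Strategic support, about 5% need Intensive support, and each support tier moves at least 80% of its students forward. A minimal sketch of that check (function and argument names are illustrative, not MiBLSi terminology; the success-rate inputs are invented):

```python
# Hedged sketch of the three-tier effectiveness criteria from the table:
# an 80/15/5 distribution of need, and each support tier moving at least
# 80% of its students forward. Names here are illustrative.

def reading_system_check(pct_core, pct_strategic, pct_intensive,
                         strategic_success, intensive_success):
    return {
        "core_ok": pct_core >= 80,                       # >= 80% at benchmark with Core alone
        "strategic_load_ok": pct_strategic <= 15,        # no more than ~15% need Strategic
        "intensive_load_ok": pct_intensive <= 5,         # no more than ~5% need Intensive
        "strategic_effective": strategic_success >= 80,  # Strategic moves >= 80% to benchmark
        "intensive_effective": intensive_success >= 80,  # Intensive moves >= 80% forward
    }

# The 71% composite comes from the example gap statement's grade 1 data;
# the other numbers are invented for illustration.
print(reading_system_check(71, 20, 9, strategic_success=60, intensive_success=40))
```

A team could run a check like this per grade level to see at a glance which part of the system (tier size or tier effectiveness) is off track.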

47 Electronic Version of the Planning & Evaluation Tool-Revised (PET-R)

48 Electronic Version of the Planning & Evaluation Tool-Revised (PET-R)

49 Self Assessment Survey (SAS)
Total Score Report School-wide

50 Self Assessment Survey (SAS)
Total Score Report Non-Classroom

51 Self Assessment Survey (SAS)
Total Score Report Classroom

52 Self Assessment Survey (SAS)
Total Score Report Individual System

53 Self Assessment Survey (SAS)
Subscale Report

54 Self-Assessment Survey (SAS) Subscale Report

55 Benchmarks of Quality (BoQ) Total Score Report
Target = 70%; school score: 51%

56 Benchmarks of Quality (BoQ) Subscale Report

57 Benchmarks of Quality (BoQ) Items Report

58 Benchmarks for Advanced Tiers
Total Score Report

59 Benchmarks for Advanced Tiers
Subscales Report

60 Benchmarks for Advanced Tiers
Item Report

61 School-wide Information System (SWIS): Average Referrals Per Day Per Month; Referrals by Problem Behavior; Referrals by Location; Referrals by Time of Day; Referrals by Student

62 SWIS Average Referrals Per Day Per Month
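As typically defined, this SWIS metric divides the number of office discipline referrals logged in a month by the number of school days in that month, which makes months of different lengths comparable. A minimal sketch (the sample numbers are invented for illustration):

```python
# Sketch of the "Average Referrals Per Day Per Month" calculation:
# referrals in a month divided by school days in that month.
# Sample numbers below are invented.

def avg_referrals_per_day(referrals_in_month, school_days_in_month):
    return round(referrals_in_month / school_days_in_month, 2)

print(avg_referrals_per_day(11, 20))  # 0.55
```

A value like 0.55 would sit well above the national elementary median of .22 referenced later in the cause-for-gap example.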

63 SWIS Referrals by Problem Behavior

64 SWIS Referrals by Location

65 SWIS Referrals by Time

66 SWIS Referrals by Student

67 SWIS Ethnicity Report

68 Role of the School Leadership Team
Acts on school-wide data (Program Quality/Fidelity and Student Outcomes) on a regular basis Sends grade level specific information to the grade level staff to address during grade level meetings Provides all stakeholders with an overview of the data and areas for celebration and areas targeted for growth. This includes teachers, support staff, volunteers and parents. Utilizes work groups to address relevant needs Sends school-wide information to district level staff Trainer Notes: This slide provides an outline of the role of the School Leadership Team related to data review activities.

69 Cascading Model of Support
MiBLSi ISD Leadership LEA District Building Identifies school-wide concerns and grade level specific concerns; Develops action plan based on building level data and concerns and in alignment with the district goals for MTSS implementation; “Turfs” grade level specific concerns to grade level teams; Responsible for implementing plans and communicating successes/challenges on a regular basis using data anchor information Trainer Notes: This is a screenshot of the cascading model of support. This is an animated slide. The animations highlight the various levels of supports we are focused on along with their specific roles and responsibilities. It is important that teams recall the role of the leadership team in examining school-wide data, which is the purpose of today’s activities. Building Staff Learns the strategies and practices necessary to effectively teach critical skills; Analyzes data at the classroom and grade level to identify areas of success and need; Communicates needs to building team so the needs can be addressed Students Improved academics and behavior

70 Remember… The Building Leadership Team does not have to solve every problem, but it does need to study building data to determine the school-wide needs it will address, identify grade-level needs, and ensure the appropriate individual(s) who will address these needs are identified (e.g., which grade-level teams need to address the identified needs) Trainer Notes: This is a reminder to the teams that they are not being asked to solve every problem related to the data they are reviewing but rather to study the data to determine which school-wide needs they will address and identify any specific grade-level needs that will be addressed by the grade-level teams.

71 Making Sense of Student Outcomes and Program Quality / Fidelity
Stay the course and work to do it better (elaboration and continuous regeneration) Explore accuracy of program quality/ fidelity data. Determine if enough time has passed to expect changes in student outcomes (keep initial implementation on track and move into full implementation) Explore accuracy of data. Examine what else is happening / present that could be contributing to strong student outcomes (keep working to do it right through initial and full implementation) Consider intensive implementation supports (revisit exploration / adoption and installation) Trainer Notes: Let’s unpack this graphic. If we start with the green box in the top left corner, this is a situation when both the program quality/fidelity data and the student outcome data are on track. When this occurs, teams should keep doing what they have been doing and the focus can shift to working to do things even better. Moving to the yellow box in the top right corner, this is a situation when program quality/fidelity data is high but student outcome data are in need of support. In this case, the first step a team would take is to explore the accuracy of the program quality/fidelity data. The team would also want to consider if enough time has passed to be able to expect changes in student outcomes. When it has been verified that the program quality/fidelity data are accurate, then teams should focus on keeping initial implementation on track and expand to full implementation. Teams will also need to keep an eye on student outcome data. Moving to the bottom left corner, this is a situation where the program quality/fidelity scores are low and student outcome data are high. If this is the situation, teams should explore the accuracy of both sets of data. Things to consider at this point include determining what else might be occurring that could be contributing to the strong student outcomes – those are the things we want to keep doing.
Finally, the red box in the bottom right hand corner describes a situation where neither your program quality/fidelity data nor your student outcome data are on track. In this situation, teams should consider how to provide intensive implementation supports. Teams should revisit exploration/adoption and installation. Staff consensus is important in moving the work of MTSS forward.
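The four quadrants described above amount to a small decision table: program quality/fidelity status crossed with student outcome status. A sketch of that mapping (wording condensed from the slide; the function name is illustrative):

```python
# Sketch of the 2x2 decision matrix: fidelity status crossed with student
# outcome status, each combination mapped to the slide's suggested response.

def next_step(fidelity_on_track, outcomes_on_track):
    if fidelity_on_track and outcomes_on_track:
        return "Stay the course and work to do it even better."
    if fidelity_on_track:
        return ("Explore accuracy of fidelity data; ask whether enough time "
                "has passed to expect changes in student outcomes.")
    if outcomes_on_track:
        return ("Explore accuracy of both data sets; examine what else may be "
                "contributing to strong student outcomes.")
    return ("Consider intensive implementation supports; revisit "
            "exploration/adoption and installation.")

print(next_step(fidelity_on_track=True, outcomes_on_track=False))
```

The point of the matrix is that the two data types only make sense together: the same outcome data call for different responses depending on fidelity.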

72 Translating the Analysis into Celebrations and Gap Statements
Be specific by describing the: Big Idea Time of Year Tier 1, 2, 3 Performance Gaps Possible Program Quality / Fidelity Links Data Accuracy

73 Example School-wide Reading Student Outcome Data
Trainer Notes: This set of example school-wide data can be found in the participant workbook. Have participants review the data in their workbook before moving on to the next series of slides, which will provide an example celebration and gap statement based on this data. These are intended to be examples that model the link between the data analysis (questions 1-6 on the Questions & Data Sources for Building-Level Data Analysis) and the celebrations and gap statement.

74 Translating the Analysis into Celebrations
Trainer Notes: This is a screen shot of an example celebrations worksheet. Please point out to the participants that the celebrations are very specific, identifying the time of year, the specific Big Idea, and the tier(s) involved in describing the celebrations.

75 Celebrate Successes!!! Trainer Notes:
This slide is a reminder to teams that they need to celebrate the specific successes at each grade level and at the building level.

76 Recall Our Example School-wide Reading Student Outcome Data
Trainer Notes: This is a quick transition slide to remind teams that we are using the same data set as we gear up to show an example gap statement on the next slide.

77 Translating the Analysis into Gap Statement
Example Gap Statement As of May 2013, K is the only grade level where at least 80% of all students are at or above benchmark on the Composite score and on the Phonemic Awareness measure but not for Alphabetic Principle. Grades 1 through 3 have not established at least 80% or higher on the composite scores (1st – 71%, 2nd – 65%, 3rd – 73% in May 2013) or individual measures for each grade level, indicating needs in the areas of Alphabetic Principle, Fluency, Comprehension, and Vocabulary across these grade levels. There is inconsistent performance across subgroups at all grade levels when the May 2013 composite scores are disaggregated by ethnicity. The data related to the effectiveness of instructional supports indicate that 1st (second semester only), 2nd, and 3rd demonstrate a relative strength in keeping 90% or more students at or above benchmark across the year but do not yet consistently have 80% or more of students at or above benchmark at each assessment period. More than 15% of students demonstrate a need for strategic reading supports, with the exception of 2nd grade second semester and 3rd grade all year. However, at each grade level, with the exception of first semester K, strategic reading supports are not moving enough students (at least 80%) to at or above benchmark at the next assessment period. At each benchmark period there are also more than 5% of students demonstrating a need for intensive reading supports in grades 1-3 and the intensive reading supports are not moving enough students out of the well below benchmark range (at least 80%). Data come from the Summary of Effectiveness Table Data come from the Reading Data Summary Sheet Data come from the Subgroup Performance Sheet Trainer Notes: This example gap statement can be found as a full sheet in the participant workbook.
Prompt participants to read through the example gap statement in their workbook and highlight the areas where the data from the Reading Data Summary Sheet, Summary of Effectiveness Table and the Subgroup Performance Table are found in the example gap statement. Prompt participants to write where the data came from in their participant workbook. This slide is animated. After allowing individual time to review this example, advance the slide and indicate that the information in blue is from the Reading Data Summary Sheet, the information in orange is from the Summary of Effectiveness Table, and the information in purple is from the Subgroup Performance Table.

78 Example Program Quality/Fidelity Data for Reading & Behavior
Trainer Notes: This set of example reading and behavior program quality/fidelity data can be found in the participant workbook. Have participants review the data in their workbook before moving on to the next series of slides, which will provide an example cause for gap. These are intended to be examples that model the link between the data analysis (questions 7-14 on the Questions & Data Sources for Building-Level Data Analysis) and the cause for gap statement.

79 Example Behavior Student Outcome Data
Trainer Notes: This set of example behavior student outcome data can be found in the participant workbook. Have participants review the data in their workbook before moving on to the next series of slides, which will provide an example cause for gap. These are intended to be examples that model the link between the data analysis (questions 7-14 on the Questions & Data Sources for Building-Level Data Analysis) and the cause for gap statement.

80 Translating Analysis into Cause for Gap
Example Cause for Gap Our DIBELS Next data indicate we have weaknesses in our core reading curriculum as well as strategic and intensive reading supports. While our overall PET-R score is 98%, the subscale score for Professional Development has been at 75% for the past two years and we consistently score low on items (3) related to time being systematically allocated for educators to analyze, plan and refine instruction, and (4) professional development efforts are explicitly linked to practices and programs that have been shown to be effective through documented research. In addition, our BoQ score of 51% is below the criterion of 70% and we have an average referrals per day per month that is consistently above the national median of .22, indicating that we do not have full implementation of a three tier model of behavior supports in place, and most of our referrals are coming from the classroom, which means that our behavior systems are likely impacting our academic outcomes. All of these factors are contributing to our gap. Data come from the Behavior Program Quality/Fidelity Measure Data come from the Reading Program Quality/Fidelity Measure Comes from the Behavior Student Outcome Data Trainer Notes: This example cause for gap can be found as a full sheet in the participant workbook. Prompt participants to read through the example cause for gap in their workbook and highlight the areas where the data from the Reading Program Quality/Fidelity Data, Behavior Student Outcome Data, and Behavior Program Quality/Fidelity Data are found in the example gap statement. Prompt participants to write where the data came from in their participant workbook. This slide is animated.
After allowing individual time to review this example, advance the slide and indicate that the information in blue is from the Reading Program Quality/Fidelity Data, the information in purple is from the Behavior Program Quality/Fidelity Data, and the information in orange is from Behavior Student Outcome Data.
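The cause for gap statement leans on the "average referrals per day per month" metric compared against the national median of .22. As a minimal sketch of how that rate is typically derived from monthly office discipline referral counts (all numbers below are hypothetical, not from these materials):

```python
# Hypothetical sketch: computing average office discipline referrals
# per school day for each month and comparing to the national median
# of .22 cited in the training. Monthly counts below are invented.

NATIONAL_MEDIAN = 0.22  # elementary median referrals per day per month

def referrals_per_day(referrals, school_days):
    """Average referrals per school day for one month."""
    return referrals / school_days

# (referrals logged, school days in session) for each month -- hypothetical
months = {
    "September": (9, 20),
    "October": (12, 22),
    "November": (8, 18),
}

for month, (referrals, days) in months.items():
    rate = referrals_per_day(referrals, days)
    status = "above" if rate > NATIONAL_MEDIAN else "at/below"
    print(f"{month}: {rate:.2f} per day ({status} the national median)")
```

A school consistently above the median across months, as in this sketch, would match the pattern described in the example cause for gap.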

81 Team Time Use the Questions and Data Sources for Building-Level Data Analysis with the Data Review Workbook to study your data The intended outcome of this team time is to clearly and specifically identify celebrations, gap statement, and cause for gap on your analysis of both the student outcome data and program quality/fidelity data Trainer Notes: This team time goes until teams break for lunch from 12:00 p.m. to 12:45 p.m. There is not a facilitator guide for this day. Teams are asked to use the “Questions and Data Sources for Building-Level Data Analysis” along with the questions/prompts in the data review workbook to study their data.

82 4.0 Plan Trainer Notes: This module should begin at 12:45 p.m. and end by 2:55 p.m. The intent of this module is for teams to use the information from the study section to develop specific strategies and activities – ideally tied to their School Improvement Plan.

83 School Improvement Plan
What We Want to Avoid… School Improvement Plan MTSS Trainer Notes: The goal is for the work of MTSS and School Improvement to be connected and not in separate silos (as seen in this picture).

84 Making MTSS Visible in Your School Improvement Plan
References to the Program Quality/Fidelity Data and Student Outcome Data collected this year Through strategies and activities related to the core principles of Multi-Tiered System of Supports, a School-wide Reading Model and School-wide Positive Behavioral Interventions & Supports Should reflect the integrated work you are doing in reading and behavior Trainer Notes: This slide provides general ideas of how to make MTSS visible in School Improvement Plans.

85 “We will implement MTSS.” “We will get trained in MTSS.”
Group Discussion Discuss why the following statements are considered non-examples of making MTSS visible in a School Improvement Plan “We will implement MTSS.” “We will get trained in MTSS.” Trainer Notes: Provide 3-5 minutes for this group discussion.

86 The Reality… MTSS Framework Evidence Based Instructional Practices
PLCs Evidence Based Instructional Practices Paragraph Shrinking Explicit vocabulary instruction PLCs, grade level meetings, problem solving process Student engagement strategies Assessments Behavioral Supports Schoolwide & Classroom PBIS Check-in Check-out Evidence Based Interventions K-PALS REWARDS Read 180 Read to Achieve Research Based Core Program Reading Street Prentice Hall Trainer Notes: This slide is animated. The reality is that MTSS is a framework or umbrella under which many evidence-based programs and practices exist. Making MTSS visible in School Improvement will require being much more specific than “we will implement MTSS” or “we will get trained in MTSS.”

87 So… What Should it Look Like?
Trainer Notes: This slide is a transition slide. We’ve talked about what not to do related to making MTSS visible in the school improvement plan. Now let’s look at what it should look like.

88 Recall the Cause for Gap Example
Example Cause for Gap Our DIBELS Next data indicate we have weaknesses in our core reading curriculum as well as strategic and intensive reading supports. While our overall PET-R score is 98%, the subscale score for Professional Development has been at 75% for the past two years and we consistently score low on items (3) related to time being systematically allocated for educators to analyze, plan and refine instruction, and (4) professional development efforts are explicitly linked to practices and programs that have been shown to be effective through documented research. In addition, our BoQ score of 51% is below the criteria of 70% and we have an average referral per day per month that is consistently above the national median of .22 indicating that we do not have full implementation of a three tier model of behavior supports in place which means that our behavior systems are likely impacting our academic outcomes. All of these factors are contributing to our gap. Example Objective: At least 80% of students in all grade levels and all subgroups at Sample School will have basic literacy skills established by May 2014, as measured by DIBELS Next composite scores and subscale scores for each grade level. Trainer Notes: This slide is animated. We return to the cause for gap statement. Now we provide an example objective statement.
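The example objective sets a measurable criterion: at least 80% of students at or above the DIBELS Next composite benchmark. A minimal sketch of checking that criterion for one grade level (the scores and the benchmark cut point are hypothetical, for illustration only):

```python
# Hypothetical sketch: checking the example objective that at least 80%
# of students meet the DIBELS Next composite benchmark. Scores and the
# cut point are invented for illustration.

def percent_at_benchmark(scores, cut_point):
    """Percent of students whose composite score meets the benchmark."""
    at_or_above = sum(1 for s in scores if s >= cut_point)
    return 100 * at_or_above / len(scores)

grade_scores = [130, 145, 98, 160, 152, 110, 171, 140, 125, 155]  # hypothetical
cut = 119  # hypothetical benchmark cut point for one grade/season

pct = percent_at_benchmark(grade_scores, cut)
print(f"{pct:.0f}% at or above benchmark; objective met: {pct >= 80}")
```

The same check would be repeated for each grade level and subgroup named in the objective.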

89 Translating the Cause for Gap into Action
Example Cause for Gap Our DIBELS Next data indicate we have weaknesses in our core reading curriculum as well as strategic and intensive reading supports. While our overall PET-R score is 98%, the subscale score for Professional Development has been at 75% for the past two years and we consistently score low on items (3) related to time being systematically allocated for educators to analyze, plan and refine instruction, and (4) professional development efforts are explicitly linked to practices and programs that have been shown to be effective through documented research. In addition, our BoQ score of 51% is below the criteria of 70% and we have an average referral per day per month that is consistently above the national median of .22 indicating that we do not have full implementation of a three tier model of behavior supports in place which means that our behavior systems are likely impacting our academic outcomes. All of these factors are contributing to our gap. Example Strategy #1: Sample Schools staff will strengthen the MTSS framework for reading by increasing the implementation percentage on the Professional Development section of the Planning & Evaluation Tool – Revised (PET-R). Trainer Notes: This slide is animated. We provide an example strategy and highlight the portions of the cause for gap that directly led to the example strategy.

90 Strategy #1 - Example Activities:
Translating the Cause for Gap into Action Example Cause for Gap Our DIBELS Next data indicate we have weaknesses in our core reading curriculum as well as strategic and intensive reading supports. While our overall PET-R score is 98%, the subscale score for Professional Development has been at 75% for the past two years and we consistently score low on items (3) related to time being systematically allocated for educators to analyze, plan and refine instruction, and (4) professional development efforts are explicitly linked to practices and programs that have been shown to be effective through documented research. In addition, our BoQ score of 51% is below the criteria of 70% and we have an average referral per day per month that is consistently above the national median of .22 indicating that we do not have full implementation of a three tier model of behavior supports in place which means that our behavior systems are likely impacting our academic outcomes. All of these factors are contributing to our gap. Strategy #1 - Example Activities: Monthly grade level team meetings that focus on analyzing DIBELS benchmark, progress monitoring data, and additional classroom data to inform instruction and result in clearly defined action plans will be scheduled for the school year. Staff will be provided with professional development prior to the start of the school year to ensure all staff are able to read, analyze and interpret DIBELS reports and SWIS reports to ensure integration of academic and behavior data during grade level team meetings. Trainer Notes: This slide is animated. Given the example cause for gap and example strategy on the previous slide, here are some potential activities that would be directly related to the strategy.

91 Translating the Cause for Gap into Action
Example Cause for Gap Our DIBELS Next data indicate we have weaknesses in our core reading curriculum as well as strategic and intensive reading supports. While our overall PET-R score is 98%, the subscale score for Professional Development has been at 75% for the past two years and we consistently score low on items (3) related to time being systematically allocated for educators to analyze, plan and refine instruction, and (4) professional development efforts are explicitly linked to practices and programs that have been shown to be effective through documented research. In addition, our BoQ score of 51% is below the criteria of 70% and we have an average referral per day per month that is consistently above the national median of .22 indicating that we do not have full implementation of a three tier model of behavior supports in place which means that our behavior systems are likely impacting our academic outcomes. All of these factors are contributing to our gap. Example Strategy #2: Staff will increase the fidelity of implementation of School-wide PBIS with a score of 70% on the Benchmarks of Quality (BoQ) in order to decrease the average referrals per day per month to at or below the national median. Trainer Notes: This slide is animated. We’ve created a second example strategy linked to the example cause for gap statement.

92 Strategy #2 Example Activities:
Review Benchmarks of Quality data and SWIS data with staff and facilitate a data dialogue using the BoQ Total Score, Subscale, and Items report along with the SWIS Big 5 reports. Facilitate a conversation with staff in order to gain consensus around implementation of SW-PBIS and the integration of academic and behavioral supports. The school staff will use the completed lesson plans for teaching School-wide behavioral expectations following the schedule of dates for the kick-off and review. School staff will review SWIS data at monthly staff meetings and engage in data dialogues to problem solve and action plan based on the needs identified in the school-wide data. The Leadership Team will use this spring’s BoQ data (items report) to monitor progress of implementation efforts. Translating the Cause for Gap into Action Example Cause for Gap Our DIBELS Next data indicate we have weaknesses in our core reading curriculum as well as strategic and intensive reading supports. While our overall PET-R score is 98%, the subscale score for Professional Development has been at 75% for the past two years and we consistently score low on items (3) related to time being systematically allocated for educators to analyze, plan and refine instruction, and (4) professional development efforts are explicitly linked to practices and programs that have been shown to be effective through documented research. In addition, our BoQ score of 51% is below the criteria of 70% and we have an average referral per day per month that is consistently above the national median of .22 indicating that we do not have full implementation of a three tier model of behavior supports in place which means that our behavior systems are likely impacting our academic outcomes. All of these factors are contributing to our gap. Trainer Notes: This slide is animated. It lists example activities that are directly related to the example strategy from the previous slide.

93 School Improvement Support Tool
Building-Level School Improvement Support Tool Do not underestimate the importance of identifying the following information as a part of your planning: Who will do it? By when? How often? Resources needed Plan for Monitoring Trainer Notes: This slide is animated. It provides a screen shot of the School Improvement Support Tool (that is in the pink assessment binder). We want to be very explicit about the link between the example objectives, strategies and activities on the previous slides and this form. We also do not want to underestimate the importance of completing the entire form, including who will do it, by when, how often, resources needed, and the plan for monitoring. Remember, the goal is to have a completed action plan based on the data.

94 Importance of Communication
Trainer Notes: Communication is a key component to planning. Teams need to keep in mind what needs to be communicated out and to whom.

95 Team Time Use the Data Analysis you completed during the Study Phase along with the examples in your Data Review Workbook to develop your plan. The intended outcome of this team time is to develop specific strategies and activities to address the gap statement and the cause for gap as well as to identify what information needs to be communicated and to whom the information will be communicated Trainer Notes: This team time should begin by 1:30 p.m. at the latest and will end by 2:55 p.m. Teams should be prompted to take a 15 minute break during this time.

96 5.0 Do Trainer Notes: This module should begin at 2:55 p.m. and end by 3:00 p.m.

97 Trainer Notes: The intent of this module is to convey this message “just do it!” Teams should have completed a detailed action plan based on the data analysis completed earlier today. The task then becomes to actually implement the plan – just do it!

98 Wrapping Up Our Time with C7
Trainer Notes: Each TAP will insert content to wrap up the time with C7 schools. This should begin at 3:00 p.m. and end at 3:30 p.m.

99 Thank you for your time and your dedication to the hard work of MTSS implementation!

