Schoolwide Systems Review: Module 3.0 Gather Cohort 7 Middle Schools.

1 Schoolwide Systems Review: Module 3.0 Gather Cohort 7 Middle Schools

2 Gather
We want to gather information that tells us:
–How well we are implementing/doing something: Systems/Process/Fidelity Data, AND
–Whether what we're doing is working: Student Outcome Data

3 Data You've Collected So Far This Year:
–PBIS Self Assessment Survey
–PBIS Team Implementation Checklist
–Office Discipline Referrals (SWIS)
–Planning and Evaluation Tool for Effective Schoolwide Reading Programs
–Curriculum Based Measurement Screening Data (AIMSweb)

4 Big Ideas for Gathering Data in an Efficient, Effective Way
–Stay organized (schedule, forms readily available for use).
–Provide training to ensure consistent, accurate data collection.
–Get the data back in front of the people who are doing the work.
–Actively use the data for decision-making.
–Keep the results in a central location.

5 Where to Find the Measures Online
–Measures: MiBLSi Model → Evaluation → Measures
–Measurement Schedule: MiBLSi Model → Evaluation → Measurement Schedule

6

7 Staff Knowledge of SWIS
During the leadership launch and throughout the PBIS trainings, we trained leadership teams to:
–Distinguish between major and minor problem behaviors
–Categorize problem behavior
–Identify motivation
How has that information been delivered to the whole staff?

8 Office Discipline Referral Caution
Data reflect three factors:
–Students
–Staff members
–Office personnel
Data reflect overt rule violators, and are useful when implementation is consistent.
–Do staff and administration agree on which problem behaviors are office-managed versus classroom-managed?

9 Terminology
Office Discipline Referral (ODR) = Major Discipline Referral. Major Discipline Referrals involve behaviors that are seriously disruptive, potentially harmful, and may take more than 2 minutes to address. Office Discipline Referrals are records of behavior; they are not the consequence system.

10 Sharing SWIS data with staff Does your staff have ownership of the data? Everyone who spends time with students needs to understand the discipline procedures in your school and know how to complete the discipline referral form.

11 Sharing SWIS data with staff
Which SWIS reports are shared with your staff for review?
–Big Five Report: average referrals per day per month, referrals by problem behavior, referrals by location, referrals by time of day, referrals by individual student
–Custom reports
How is your staff acting on the data?
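The "average referrals per day per month" figure in the Big Five Report divides each month's referral count by the number of school days in that month. A minimal sketch of that calculation, using made-up counts (real figures come from SWIS, not from this snippet):

```python
# Hypothetical illustration of the "average referrals per day per month"
# metric. All numbers below are invented sample data, not real SWIS output.

# Referrals recorded each month (assumed sample data)
referrals_by_month = {"Sep": 42, "Oct": 55, "Nov": 38}
# Instructional days in each month (assumed sample data)
school_days = {"Sep": 20, "Oct": 22, "Nov": 17}

def avg_referrals_per_day(referrals, days):
    """Average referrals per school day, computed separately for each month."""
    return {month: round(referrals[month] / days[month], 2) for month in referrals}

print(avg_referrals_per_day(referrals_by_month, school_days))
# → {'Sep': 2.1, 'Oct': 2.5, 'Nov': 2.24}
```

Dividing by school days rather than calendar days keeps short months (with breaks) comparable to full ones, which is why the report is "per day per month" rather than a raw monthly count.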

12 SWIS Data Entry & Reports Who is entering discipline referral data into the SWIS website? Have they been trained through a Swift at SWIS training? How often are they entering referrals and how is the workload? Who reviews SWIS reports and how often?

13 Indicators that ODR data might not be accurate:
–Few referral forms, despite many problem behaviors
–Many referrals from some staff, none from others
–Motivation information left blank, or "Don't Know"/"Unknown" options selected
–Time-of-referral report shows that referrals are all occurring at the beginning or end of the school day

14 Action Planning: Improving SWIS Accuracy
Review the SWIS Decision Flowchart on page 10 of your participant workbook to determine your next steps for ensuring the accuracy of your SWIS data.

15 Reading CBM Screening Data AIMSweb

16 Staff Knowledge of Reading Curriculum Based Measures
During the Schoolwide Reading Day 2 training, we trained leadership teams to:
–Understand the foundations (research) behind the early literacy measures
–Understand the purpose of Reading CBM for schoolwide screening
–Understand how Reading CBM data are designed to be used within a schoolwide reading model
How has that information been delivered to the whole staff?

17 Are you minimally disruptive?
Data collection methods:
–Teacher
–Assessment teams
Schedule posted in advance. One person coordinating the flow of students and managing assessment team needs.

18 Support Fidelity Before, During, and After Screening
BEFORE
–Train thoroughly to start.
–Review administration and scoring rules before each screening.
–Practice with sample scripts and sample students (with shadow scoring).
DURING
–Observe using the Integrity Checklists provided as part of the screening materials.
–Conduct shadow scoring.
AFTER
–Check scoring of passages.
–Check the transfer of scores from passages to the front of the booklet.
–Check the transfer of scores from the front of booklets to the website (if applicable).
–Validate scores with the classroom teacher.

19 Data Entry Who is entering Universal Screening data into the website? Does someone verify the accuracy of those entries?

20 Sharing Universal Screening Data with Staff
Does your staff have ownership of the data? People doing the teaching need to be involved in collecting the Universal Screening measures at some level:
–Part of the assessment team
–Collecting progress monitoring data

21 Sharing Universal Screening Data with Staff
Which reports are given to your staff?
–Building level (Tier Transition Report)
–Grade level (Rainbow Reports)
–Individual student (Class List Report)
How is your staff acting on the data?

22 Indicators that your student data might not be accurate
–Multiple teachers question the accuracy of students' scores based on what they know about the students.
–Multiple students do not have any scores in the database.
–Student scores are above the possible range for the measure.
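The last two indicators above can be checked mechanically before the data are shared. A minimal sketch, assuming a hypothetical score ceiling (use the actual maximum published for your measure, e.g. the highest plausible words-read-correct for a one-minute R-CBM passage):

```python
# Sketch of an out-of-range / missing-score check on screening data.
# MAX_PLAUSIBLE is an assumed ceiling for illustration only; substitute
# the real maximum possible score for the measure you administer.
MAX_PLAUSIBLE = 300

def flag_suspect_scores(scores, max_valid=MAX_PLAUSIBLE):
    """Return student IDs whose scores are missing or exceed the
    measure's possible range, in the order they appear."""
    return [student_id
            for student_id, score in scores.items()
            if score is None or score > max_valid]

# Made-up sample data: S02 is impossibly high, S03 has no score entered.
scores = {"S01": 142, "S02": 987, "S03": None}
print(flag_suspect_scores(scores))  # → ['S02', 'S03']
```

Flagged students' booklets can then be pulled and rescored, which is faster than auditing every entry by hand.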

23 Action Planning: Improving Accuracy of Reading Screening Data
Review the Universal Screening Decision Flowchart on page 14 of your participant workbook to determine your next steps for ensuring the accuracy of your Universal Screening data.

24 Systems / Process / Implementation Fidelity Measures
Completed this year:
–PBIS Self Assessment Survey (SAS)
–PBIS Team Implementation Checklist (TIC)
–Schoolwide Evaluation and Planning Tool (SWEPT)
Later this spring:
–Benchmarks of Quality (BOQ) and, for some, the Schoolwide Evaluation Tool (SET)

25 Systems/Process Measures
These measures tell us about our implementation fidelity: are we doing what we said we would do, when and how we said we would do it? When engaged in systems change, we will likely see changes in adult behavior before we see changes in student behavior. Systems/process data help us know early on whether we are on the right track and can reinforce our initial work. Having this information helps us accurately interpret our student outcomes.

26 Strengths and Weaknesses of Self Assessment
Strengths:
–Self-evaluation can prompt improved implementation on its own.
–Typically faster to complete, and less complex.
Weaknesses:
–Self-perceptions might not always be accurate (lacking or changing background knowledge).
–Intentional or unintentional over- or under-inflation of scores.

27 Do you know where you can locate each of the systems/process measures? Do you complete the items as a team, or is it a one-person rush right before the measure is due? Do you know who can help you if you don't understand an item?

28 Indicators that your systems/process data might not be accurate
–First time completing the measure, scores of "fully implemented" across the board.
–A mismatch between your process outcome measures and your student outcome measures.
–A mismatch between the leadership team's perception of implementation and the full staff's perception.

29 Staff Knowledge During and between trainings, we have asked you to complete the SAS, TIC and SWEPT. Does your leadership team fully understand the purpose of these tools and can you communicate the purpose and results to others?

30 Sharing Data with Staff
Help bring staff along with the work the leadership team is spearheading. Share graphs and implementation improvements, especially information that is relevant (scores related to the classroom, results from the SAS). SAS, TIC, and SWEPT data should be presented in conjunction with student outcome data (AIMSweb, SWIS). Leadership team perceptions should be complemented by staff perceptions of implementation.

31 Data Entry and Reports
–Measures need to be completed during the assessment window (see the measurement schedule online or in the Assessment Booklet).
–Behavior systems/process scores should be entered online simultaneously, or within one week of paper scoring.
–Reports should be reviewed online or printed for review with the team and whole staff.
–More than one person should have the skills and necessary information to enter data and access reports online.
–Paper copies of the Reading Measures (SWEPT) should be submitted to MiBLSi for inclusion in the statewide data set.

32 Action Planning: Improving Accuracy of Outcome Process Data
Review the Outcome Process Decision Flowchart on page 17 of your participant workbook to determine your next steps for ensuring the accuracy of your Outcome Process data.

