
1 Activity Self Perception
Think about and share your different data sources: how do you use the information currently? Similarities? Differences? Activity Self Perception

2 Coaching teams to use data for decision making

3 Coaching

4 What is a PBIS Coach? What do we mean by “Coach”
PBIS Coaches are not "trainers"; they support teams who have basic training in PBIS. PBIS Coaches support teams to make data-based decisions toward quality improvement of student, staff, and family outcomes.

5 What do I need to be doing as a coach?
Prevent team members from launching into solutions before they are ready by engaging in active problem solving. Get team members to ask questions, even if they don't have all the information. Don't move forward until a measurable goal is identified and a solution is designed.

6 What do I need to know to coach well?
Desired Outcomes – how will we know if what we’re doing is having a positive effect on students, staff, and families? Practices – what PBIS Interventions are in place? Data – what data do we have and what tools do we have to collect & summarize data? Systems – what do we have in place to support teams to look at our data and use it for quality improvement?

7 Data-Based Decision Making

8 Here's what we know… Decisions are more likely to be effective and efficient when they are based on data. The quality of decision making depends most on the first step (defining the problem to be solved). Define problems with precision and clarity. Why?

9 Data help us ask the right questions…they do not provide the answers
Data help us ask the right questions; they do not provide the answers. Use data to: identify problems, refine problems, and define the questions that lead to solutions. Data help place the "problem" in the context rather than in the students. Why?

10 School-wide PBIS Individualized, Tier III Targeted, Tier II
Tertiary Prevention (Individualized, Tier III): system for students requiring more intensive & individualized supports for academic, social, or mental health services. Secondary Prevention (Targeted, Tier II): systems for targeted or group-based interventions for students needing additional support beyond the Universal or Tier I system. Primary Prevention (Universal, Tier I): school-wide & classroom-wide systems for all students and all staff in all settings. Why? (Power in Continuum)

11 What is DBDM? The process of planning for student success (both academic and behavioral) through the use of ongoing progress monitoring and analysis of data Douglas County School District (Colorado) What?

12 Why Do it? The value of Data-Based Decision Making is:
Quality improvement: a cycle of continuous improvement. Improving what? (Fidelity of implementation, social climate, learning environment, student learning, attendance, grades.)

13 How do we do it? Right Data/Format/Time/People Right Questions
Solution Development & Action Planning How? The first step in any problem-solving process is identifying the problem. SWIS helps schools/facilities move from "I think I have a problem" to "My problem is…."

14 Hallway Noise Study Using SWIS Data for Active Decision Making
A brief vignette to demonstrate how SWIS data is used to support data-based decision making. Internal: charts will be updated with SWIS graphs at some point. Kelsey to change look & feel. Trainer Note: Here’s an example of a school improving decision making through problem-solving. Kartub, D., Taylor-Greene, S., March, R., Horner, R.H. (2000). Reducing Hallway Noise: A Systems Approach. Journal of Positive Behavior Interventions, 2(3)

15 Problem Staff at a middle school (Grades 6-8) in a rural school district with 520 students have identified an issue with student noise in the hallways. Teachers complain that hallway noise is significantly disruptive around lunch. There are three lunch periods (by grade), and students are required to walk past classrooms still in session to access the cafeteria.

16 Problem Solving Process
Team assesses the extent of the problem: a vote during a faculty meeting confirmed it as a priority to address. Review existing practices: students were taught school-wide expectations; a Teaching Assistant in the hall gives out detentions & office referrals for loud noise. Review existing data: referrals by location; hallway ODRs per student. Trainer Note: Here's the process that this team used.

17 Problem Solving Process (cont.)
Build a hypothesis: noise is occurring because students have been in class all morning (low blood sugar) and want to socialize (peer attention); the hallway is loud at the beginning and end of the day. Define problem-solving logic: small number of kids = address group/individually; large number of kids = address system. Define, teach, monitor, and reward BEFORE increasing use of punishment.

18 Office Referrals by Location
Internal: Future – update to SWIS graph. Trainer Note: Here are the office referrals by location. (c in the process)

19 Office Discipline Referrals by Student
Students: 173. Referrals: 530. Trainer Note: Many students are receiving referrals throughout the school, across locations and behaviors. (c in process)

20 Trainer Note: (c in process)
Narrow down to students with 1 or more referrals in the hallway this year. In SWIS we can use drill-down by location, time, and dates! 0-1 referrals: green part of the triangle; 2-5: good candidate for targeted intervention; 6 or more: tertiary intervention.
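For teams that export referral records, the same narrowing can be sketched in a few lines. A minimal illustration, assuming referrals are available as simple dictionaries; the field names, sample data, and thresholds below mirror the slide, but they are not the SWIS export format or API:

```python
from collections import Counter

# Illustrative sketch only: SWIS does this drill-down through its web
# interface. The record layout and example data are assumptions.
referrals = [
    {"student": "S014", "location": "Hallway", "date": "2015-11-03"},
    {"student": "S014", "location": "Hallway", "date": "2015-11-10"},
    {"student": "S027", "location": "Cafeteria", "date": "2015-11-04"},
    {"student": "S033", "location": "Hallway", "date": "2015-11-05"},
    # ... one record per office discipline referral (ODR)
]

# Drill down: keep only hallway referrals, then count per student.
hallway_counts = Counter(
    r["student"] for r in referrals if r["location"] == "Hallway"
)

# Decision rules from the slide: 0-1, 2-5, 6+ referrals.
def tier(count: int) -> str:
    if count <= 1:
        return "0-1: green part of the triangle"
    if count <= 5:
        return "2-5: candidate for targeted (Tier II) intervention"
    return "6+: candidate for tertiary (Tier III) intervention"

for student, count in sorted(hallway_counts.items()):
    print(student, count, tier(count))
```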

21 Drill Down into the Problem
Who? A large number of students across grade levels. What? Disruptive (loud, rowdy) behavior. When? After morning class. Where? Hallway. Why? (a) To gain peer attention, and (b) the behavior is similar to what they do before and after school. *The Teaching Assistant's consequences are not proving effective. Trainer Note: using data from (c) to start building (d).

22 Solution (keep it simple)
Make lunch hallways look different from hallways in the morning and afternoon: change lighting. Review school-wide expectations for the hallway: five-minute review of "quiet." Build a reward for valued behavior: three days of quiet in the hallway results in an extra five minutes of social time (at lunch or at the end of school). Remind students to be quiet just before they are released for lunch. Measure and implement: use a decibel meter to measure noise level; public posting of results.
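The reward rule above is mechanical enough to express directly. A minimal sketch, assuming a hypothetical 70 dB "quiet" cutoff and one decibel-meter reading per lunch period (both the threshold and the readings are invented):

```python
# Three consecutive lunch periods below the quiet threshold earn five extra
# minutes of social time. Threshold and readings are hypothetical values.
QUIET_THRESHOLD_DB = 70.0

def earned_reward(daily_peak_db: list[float], needed_days: int = 3) -> bool:
    """True once the most recent `needed_days` readings are all below threshold."""
    recent = daily_peak_db[-needed_days:]
    return len(recent) == needed_days and all(db < QUIET_THRESHOLD_DB for db in recent)

lunch_readings = [78.0, 72.5, 69.0, 68.1, 66.4]  # one decibel-meter peak per lunch
print(earned_reward(lunch_readings))  # True -> announce the extra five minutes
```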

23 Build Action Plan (Actions, Who, When)
1. Build "Quiet" Curriculum (Ben and Mary, Nov 12)
2. Buy Decibel Meter (Rob, Nov 10)
3. Teach Hallway Expectations/Reminders (Team, Dec 2-3)
4. Collect and Post Data (Reiko, Ongoing)
5. Schedule Lunch Times (Ms. Green)
6. Graph and Report Data
7. Report to Staff (Staff Meeting)

24 Sixth Grade Lunch Noise
Trainer Note: The team collected baseline data, put…..

25 Seventh Grade Lunch Noise

26 Eighth Grade Lunch Noise

27 Improving Decision Making
From: Problem → Solution. To: Problem → Problem Solving → Solution → Action Planning. Trainer Note: We want to interrupt the traditional cycle of admiring the problem (talking and talking about the problem and then quickly launching into solutions) by building a system for improving decision-making.

28 How? Right Data/Format/Time/People What is the right data?
What would be the right format? What is the right time (schedule) to bring the data? Who are the right people to be discussing and using this data to address issues? How? The first step in any problem-solving process is identifying the problem. SWIS helps schools/facilities move from "I think I have a problem" to "My problem is…."

29 How? Right Questions The statement of a problem is important for team-based problem solving. Everyone must be working on the same problem with the same assumptions. Problems often are framed in a "Primary" form. That form creates concern, but is not useful for problem-solving. Frame primary problems based on an initial review of data. Use a more detailed review of data to build "Solvable Problem Statements." How? The second step is to ask the right questions of your data to identify whether there's a problem and develop a precise problem statement.

30 What are the data we need for a decision?
Precise problem statements include information about the following questions: What is the problem behavior? How often is the problem happening? Where is the problem happening? Who is engaged in the behavior? When is the problem most likely to occur? Why is the problem sustaining? Quote: Disaggregation is not problem solving but problem finding

31 Primary versus Precision Statements
Primary Statements: too many referrals; September has more suspensions than last year; gang behavior is increasing; the cafeteria is out of control; student disrespect is out of control. Precision Statement: There are more ODRs for aggression on the playground than last year. These are most likely to occur during first recess, with a large number of students, and the aggression is related to getting access to the new playground equipment. Trainer Notes: Here are examples of several primary statements and a precision statement.

32 Primary versus Precision Statements
There are more ODRs for disruption (loud, rowdy behavior) in the hallway. These are most likely to occur after morning class, with a large number of students across grade levels, and the disruption is related to getting peer attention. What? More ODRs for disruption. Where? In the hallway. Who? A large number of students across grade levels. When? After morning class. Why? To get access to peer attention. Trainer Notes: Change to match Hallway study precise problem. A precise problem statement answers the questions: Who? What? When? Where? Why?
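One way to keep a team honest about precision is to treat the statement as a record that is only complete when every question has an answer. A hypothetical sketch; the type and field names are illustrative, not part of SWIS or any PBIS tool:

```python
from dataclasses import dataclass, fields

@dataclass
class PreciseProblemStatement:
    what: str   # problem behavior
    where: str
    who: str
    when: str
    why: str    # what is sustaining the behavior

    def is_precise(self) -> bool:
        # "Precise" here simply means every question has a non-empty answer.
        return all(getattr(self, f.name).strip() for f in fields(self))

hallway = PreciseProblemStatement(
    what="Disruption (loud, rowdy behavior); more ODRs than last year",
    where="Hallway",
    who="Large number of students across grade levels",
    when="After morning class",
    why="To get access to peer attention",
)
print(hallway.is_precise())  # True -> ready for solution development
```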

33 How? Solution Development & Action Planning
Prevention— how can we avoid the problem context? Who? When? Where? Schedule change, curriculum change, etc. Teaching— how can we define, teach, and monitor what we want? Teach appropriate behavior Use problem behavior as negative example Recognition— how can we build in systematic rewards for positive behavior? Extinction— how can we prevent problem behavior from being rewarded? Consequences— what are efficient, consistent consequences for problem behavior? How will we collect and use data to evaluate: Implementation fidelity? Impact on student outcomes?

34 Solution Development
Solution components and action steps:
Prevention: How can we avoid the problem context? Example: schedule lunch times, change lighting.
Teaching: How can we define, teach, and monitor what we want? Example: build "Quiet" curriculum, buy decibel meter, teach hallway expectations/reminders.
Recognition: How can we build in systematic rewards for positive behavior? Example: three days of quiet in the hallway results in an extra five minutes of social time (at lunch or at the end of school).
Extinction: How can we prevent problem behavior from being rewarded? Example: public posting of results.
Corrective Consequence: What are efficient, consistent consequences for problem behavior? Example: continue current system (Minor/Major ODR).
Data Collection: Implementation fidelity? Example: walkthrough report, observation, self-assessment. Impact on student outcomes? Example: SWIS ODR data.
Trainer Notes: Here is a simple, basic example of how the necessary solution components can be organized alongside their action steps. Organizing the solution development process this way increases the likelihood that action steps will be taken and that all staff and team members are on the same page with regard to solving the problem. We will present another template later on that we encourage teams to use to guide the solution development discussion; it is presented in more of an action-plan format.

35 Corrective Consequence
Precise Problem Statement: Many students across grade levels are engaging in disruptive (loud, rowdy) behavior in the hallway after morning class, and the behavior is maintained by peer attention.
Goal: Reduce hallway ODRs by 50% per month (currently 24 per month average).
Solution components, action steps, who is responsible, by when, and how fidelity will be measured:
Prevention: Schedule lunch times, change lighting. Custodial staff to adjust lighting (ongoing); Principal to adjust schedule (Nov 10). Fidelity: new lunch schedule, walkthrough report.
Teaching: Build "Quiet" curriculum, buy decibel meter, teach hallway expectations/reminders. Ben & Mary (Nov 12). Fidelity: permanent product, staff self-assessment.
Recognition: Continue current acknowledgment system and add an extra five minutes of social time (at lunch or at end of school) after three days of quiet in the hallway. Reiko & Principal (Nov 9: announcements & chart up). Fidelity: announcement made, chart made.
Extinction: Public posting of decibel readings. Reiko (ongoing). Fidelity: posted chart.
Corrective Consequence: Continue current system (Minor/Major ODR). Hallway and cafeteria supervisors. Fidelity: SWIS ODR reports.
Data Collection (what data will we look at, who gathers it, when/how often, where it will be shared, and who will see it): ODR record and supervisor weekly report; SWIS data entry person and Principal share the report with supervisors; weekly; in the supervisor meeting and posted in the faculty lounge on the PBIS bulletin board; all staff.
Most frequently misunderstood and overlooked component! Example: public posting of results will reduce the likelihood of the payoff that previously reinforced this behavior.
Trainer Note: To enhance the likelihood that the solution plan will be used, we've transformed it into an action plan. We have the same information we had on the prior slide, but now it is organized into an action-plan format to help us identify who is responsible for which action steps, when people will complete their tasks, how we will measure fidelity, and any other anecdotal information. You also see in the bottom portion that we've outlined the data collection process. In effect, this is a combined action plan and evaluation plan: one document to help us act on our solution and also to help us evaluate how we did. An action plan says who will do what by when. An evaluation plan specifies how the team will know that their efforts have (1) been implemented as planned (fidelity) and (2) affected student outcomes (decreased rates of problem behavior). Keep in mind, during team discussion each of the solution components should be addressed and discussed. However, the final solution development action plan MAY not have something in each cell. Schools/facilities may already be doing things and may not need to always write them on the plan. It's a matter of fluency: the more fluent a team is with the problem-solving process, the less they might include on this form. Still, recording it on paper helps ensure that it is completed and achieved.
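The goal line above implies simple arithmetic: a 50% reduction from the current average of 24 hallway ODRs per month means a ceiling of 12. A small sketch of that progress check, with invented monthly counts:

```python
# Quick arithmetic behind the goal: half of the 24-per-month baseline is 12.
# The monthly counts below are invented purely to show the check.
baseline_per_month = 24
target_per_month = baseline_per_month * 0.5  # 12.0

monthly_hallway_odrs = {"Dec": 19, "Jan": 14, "Feb": 11}
for month, count in monthly_hallway_odrs.items():
    status = "goal met" if count <= target_per_month else "keep monitoring"
    print(f"{month}: {count} hallway ODRs -> {status}")
```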

36 Value of this work/process
Quality Improvement is Continuous Improvement. The value of Data-Based Decision Making is quality improvement: a cycle of continuous improvement. Improving what? (Fidelity of implementation, social climate, learning environment, student learning, attendance, grades.) Example: the Hallway Study; recouping teaching time; improving social climate (for staff and students).

37 Guided discussion with TIPS activity?
Show Videos

38 Activity Reflection Activity - Shapes

39 Go eat! Lunch Break

40 Activity Data Treasure Hunt
Think about and share your different data sources: how do you use the information currently? Similarities? Differences? Activity Data Treasure Hunt

41 Different Data for Different Decisions

42 Decision-Making for Quality Improvement
Outcome Data. Discipline data for short-term improvement: progress monitoring (formative), universal screening. Discipline data for long-term improvement: annual summarization and review of strengths, weaknesses, and planning (summative). A continuum of frequency is the key… we need to evaluate some things frequently and some infrequently (don't get tripped up on how often counts as short-term vs. long-term).

43 Decision-Making for Quality Improvement
Fidelity Data. Fidelity data for short-term improvement: progress monitoring, universal screening. Fidelity data for long-term improvement: annual assessments.

44 Decision-Making for Quality Improvement
Outcome Data: discipline (e.g., referrals), academic, attendance, climate/culture, school safety. Fidelity Data: team/self assessments, walk-through reports, PBIS Assessment (e.g., SET, Self-Assessment, BoQ, TIC). Instead of segmenting data reviews (looking just at outcome data or just at fidelity data), looking at the shaded section (where the data intersect) gives us a more comprehensive view of reality and lets us make higher-quality decisions.

45 Decision-Making for Quality Improvement
Outcome Data: discipline data for continuous system improvement and progress monitoring (formative); discipline data for universal screening of student needs; discipline data for evaluation (summative) of strengths, weaknesses, and planning. Fidelity Data: fidelity data for continuous improvement; fidelity data for evaluation.

46 Connecting Outcomes & Fidelity
A two-by-two of Outcomes by Fidelity:
Lucky: positive outcomes, low understanding of how we achieved them; replication of success unlikely.
Sustaining: positive outcomes, high understanding of how we achieved them; replication of success likely.
Losing Ground: undesired outcomes, low understanding of how we achieved them; replication of failure likely.
Learning: undesired outcomes, high understanding of how we achieved them; replication of mistakes unlikely.
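The two-by-two reads naturally as a lookup from two questions (are outcomes positive? is our fidelity/understanding high?) to a quadrant. A tiny sketch, simplifying both dimensions to yes/no; the labels mirror the slide:

```python
# Minimal sketch of the outcomes-by-fidelity matrix above.
def quadrant(positive_outcomes: bool, high_fidelity: bool) -> str:
    if positive_outcomes and high_fidelity:
        return "Sustaining: replication of success likely"
    if positive_outcomes:
        return "Lucky: replication of success unlikely"
    if high_fidelity:
        return "Learning: replication of mistakes unlikely"
    return "Losing Ground: replication of failure likely"

print(quadrant(positive_outcomes=True, high_fidelity=False))  # Lucky
```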

47 Sustaining: positive outcomes, high understanding of how we achieved them; replication of success likely.

48 Discipline Data for a cycle of Continuous quality improvement

49 Short-Term vs. Long-Term Quality Improvement
Quality improvement requires two levels of analysis/use: short-term (sometimes called progress monitoring) is using data regularly; long-term (sometimes called evaluation or annual assessment) is summarizing data for the big picture.

50 Harbor Haven Middle School
Trainer Notes: ***The simulation with Harbor Haven Middle School is the first in a series of simulations. It is to be modeled by the trainers and serves as the foundational example of drilling down data. The goal is to demonstrate for participants how you would take the perception of a problem (a primary problem statement) and then begin to explore the issue, beginning with the school-wide information readily available at the SWIS Dashboard.*** Example Think Aloud: "So far in this module, we have: Demonstrated how to locate and identify the various report options in SWIS. Explained what data-based decision making is and how it is valuable. Demonstrated how SWIS is used to help define problems with precision. Demonstrated how SWIS can be used to help develop solutions to problems." "Now we are going to demonstrate through a simulation how all of this can be done in a fluid, systematic way by a team in any school/facility. As we move through our simulations today, be mindful of how we are using data for our decision making." "For our first simulation, we are going to work with Harbor Haven Middle School. It is a typical middle school housing grades 6, 7, and 8, with a combined student enrollment of 565 students." 565 students; Grades 6, 7, 8

51 Harbor Haven Middle School
Is there a problem? If so, what is it? Problem Trainer Notes: Example Think Aloud: “Lately the faculty buzz around Harbor Haven is that students are getting worse. When we started the school year, everything seemed fine. But, several months in, the teachers are complaining that they are constantly dealing with the fall-out from lunch.” “I need to explore this issue in more detail. I start at the SWIS Dashboard. The dashboard gives me a read on how things are going school-wide. I can check my vital signs, so to speak, and see if there are any red flags that need more investigation.” Dashboard

52 Harbor Haven Middle School
Median Trainer Notes: Example Think Aloud: “When I look at my graph for average referrals per day per month, it does show me that we have an increasing trend. There may be some merit to that faculty buzz. Referral incidents are definitely on the rise. Still, I need more information to truly solve the problem in the best way.”
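The graph referenced here, average referrals per day per month, can be approximated as referrals in a month divided by the number of school days in that month; a sketch with made-up numbers (the exact SWIS calculation may differ):

```python
# Sketch of the "average referrals per day per month" metric behind the
# Dashboard graph. All counts and school-day figures are invented.
referrals_per_month = {"Sep": 28, "Oct": 41, "Nov": 52, "Dec": 47}
school_days_per_month = {"Sep": 20, "Oct": 22, "Nov": 18, "Dec": 15}

for month in referrals_per_month:
    rate = referrals_per_month[month] / school_days_per_month[month]
    print(f"{month}: {rate:.2f} referrals per school day")
# A rising sequence of rates is the "increasing trend" the think-aloud notices.
```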

53 Harbor Haven Middle School
School-wide Data Defiance Harassment Theft Trainer Notes: Example Think Aloud: "The SWIS Dashboard gives me more data indicators about how things are going school-wide. I can see that my top problem behaviors are defiance and harassment. I can see that problem behaviors are occurring most often in the hallway and cafeteria. I can also see that behaviors are occurring most often at 10:00 and between 11:00 and 12:30." Hallway Café Bus Gym 10:00 11:00-12:30

54 Harbor Haven Middle School
School-wide Data Majority of referrals come from 6th and 7th grades. Trainer Notes: Example Think Aloud: “The SWIS Dashboard also shows me that the majority of referrals are coming from our 6th grade and 7th grade students, and that 11 students across our school have received more than 2 referrals.” 11 students have more than 2 referrals.

55 What Do I Know? What? Defiance and harassment.
Where? Hallways and cafeteria. Who? A large number of students; the majority of referrals are from 6th and 7th grades. When? 10:00 and 11:00-12:30. Trainer Notes: Example Think Aloud: "So here's everything I've gathered from just the SWIS Dashboard. Our problem behaviors are defiance and harassment. Our problem locations are hallways and cafeteria. Our 6th and 7th grade students account for the majority of referrals. Our problem times are 10:00 and the period from 11:00 to 12:30." "From the SWIS Dashboard, I can get a lot of the pieces to the puzzle."

56 What Do I Know? I know pieces of information.
But, I do not know if any of this information is connected. I need to drill down to look for connections. Trainer Notes: Example Think Aloud: “What I don’t get from the SWIS Dashboard alone is how all of the puzzle pieces fit together. I need to use the Drill Down feature and have a more laser-like focus with the data to really understand how the puzzle pieces are connected.” “I’m going to use the Drill Down feature in SWIS to further explore our issue and see if I can identify how the pieces connect.”

57 Change the graph type to change the analysis.
Data Drill Down Use the information from the SWIS Dashboard to drill down and analyze data. Trainer Notes: Example Think Aloud: "Here I am at the Drill Down feature. To get here, all I had to do was click on the Drill Down icon located in the navigation row." "Now I can begin to include some filters in my dataset and really drill down. You can think of the filters as any of the 'red flags' that we noticed from the SWIS Dashboard. For example, the dashboard indicated that the cafeteria was one of our hot spots. Plus, the faculty buzz has been about dealing with fall-out after lunch. So, I'm going to drill down into our cafeteria data. That's where I'm going to start." "So, I've included the school year and the problem location of 'cafeteria' into the dataset and clicked 'Generate'. Now SWIS is showing me all of our data that matches the school year I identified and the location I identified." "If I change the graph type, I can see different aspects of what is happening in the cafeteria. When I select the graph type of 'Problem Behavior' I can see that physical aggression and harassment are the primary problems." "My SWIS Dashboard did tell me that harassment is a top problem behavior for our school. So, now I see a connection. Harassment is connected to the cafeteria. I will add that to my dataset and drill deeper." Change the graph type to change the analysis.

58 Data Drill Down Cafeteria and harassment are connected.
Trainer Notes: Example Think Aloud: “OK. I’ve added harassment into my dataset and now SWIS has generated for me all of the data that matches my identified filters (e.g., school year, location, problem behavior). Now I can change the graph type again to better analyze the situation. By changing the graph type to ‘Time of Day’ I can get an idea of when harassment is most likely to happen in the cafeteria. This will help me better identify which students we are dealing with.” “I can see that the time range of 11:30-12:00 is problematic and that fits with what I saw on the SWIS Dashboard. So, I’ve got another connection that I’ll include in my dataset.” Change the graph type to change the analysis.

59 Data Drill Down Cafeteria, harassment, and the time range 11:30-12:00 are connected. Trainer Notes: Example Think Aloud: "OK. I've added the time range 11:30-12:00 into the dataset. Now I'm seeing everything that has happened in our cafeteria this year, during the specified time, and that involves harassment." "Now I can change the graph type to identify what the motivation is. I know what the problem is, where the problem is, and when the problem is. But I also need to know why the problem keeps happening." "When I change the graph type, I can see that the primary motivation for this dataset is to obtain peer attention." Change the graph type to change the analysis.

60 Precise Problem Statement & Solution Development
Data Drill Down for Connections Cafeteria Harassment 11:30-12:00 Obtain Peer Attention Precise Problem Statement & Solution Development Trainer Notes: Example Think Aloud: "Alright. By drilling down my data, I have a better idea of how the puzzle pieces connect. I can see that in the cafeteria we have the problem behavior of harassment. Our problem behaviors are happening during the 11:30-12:00 time range, which happens to be 6th grade lunch. I also know that the problem behaviors are being maintained because students are getting peer attention." Share with participants that the value of doing a data drill down is to connect the dots. The SWIS Dashboard is great for keeping your finger on the pulse of what is going on. But to really see if items are connected or isolated, a data drill down is best. Many students are engaging in harassment in the cafeteria during the 11:30-12:00 time range (6th grade lunch), and the behavior is maintained by peer attention.
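The drill-down narrated in this simulation is, at heart, successive filtering of one referral dataset plus regrouping by a different field each time the "graph type" changes. A hedged sketch of that logic, using an assumed record layout rather than the real SWIS data model:

```python
from collections import Counter

# Assumed record layout: one dict per ODR with the fields the drill-down
# filters on. These are illustrative, not the SWIS schema.
referral_records = [
    {"location": "Cafeteria", "behavior": "Harassment",
     "time_block": "11:30-12:00", "motivation": "Obtain peer attention"},
    {"location": "Cafeteria", "behavior": "Physical aggression",
     "time_block": "11:30-12:00", "motivation": "Obtain peer attention"},
    {"location": "Hallway", "behavior": "Defiance",
     "time_block": "10:00-10:30", "motivation": "Avoid task"},
    # ... one dict per ODR
]

def drill_down(records, **filters):
    """Keep only records whose fields match every supplied filter."""
    return [r for r in records if all(r.get(k) == v for k, v in filters.items())]

def group_by(records, field):
    """'Change the graph type': summarize the filtered set by another field."""
    return Counter(r.get(field) for r in records)

cafeteria = drill_down(referral_records, location="Cafeteria")
harassment = drill_down(cafeteria, behavior="Harassment")
lunch_window = drill_down(harassment, time_block="11:30-12:00")
print(group_by(lunch_window, "motivation"))  # peer attention dominates
```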

61 Solution Development Target Area(s): Harassment in the cafeteria
Goal: Reduce referrals for harassment in the hall & cafeteria by 50%.
Prevention: Maintain the current lunch schedule, but shift 6th grade classes to balance numbers.
Teaching: Teach behavioral expectations in the cafeteria.
Recognition: Establish "Friday Five" (an extra 5 minutes of lunch on Friday for five good days).
Extinction: Encourage all students to work for "Friday Five", making a reward for problem behavior less likely.
Corrective Consequence: Active supervision and continued early consequence (ODR).
Data Collection: Maintain ODR record and supervisor weekly report.
Trainer Notes: All of the information gained through the drill down can help begin the process of solution development. Here is the simple template we showed earlier for organizing the solution development process: the solution components are organized in one column and the action steps in another, and all of it is related back to the target area and the goal stated at the top.

62 Precise Problem Statement: Many students are engaging in harassment in the cafeteria during the 11:30-12:00 time range (6th grade lunch), and the behavior is maintained by peer attention.
Goal: Reduce cafeteria ODRs by 50% per month (currently 24 per month average).
Solution components, action steps, who is responsible, by when, and how fidelity will be measured:
Prevention: Maintain the current lunch schedule, but shift 6th grade classes to balance numbers. Principal to adjust schedule and send to staff (January 10). Fidelity: … to staff.
Teaching: Teach behavioral expectations in the cafeteria. Teachers will take their classes to the cafeteria; cafeteria staff will teach expectations (rotating schedule on January 15). Fidelity: sign-up sheet for scheduled times.
Recognition: Establish "Friday Five" (an extra 5 minutes of lunch on Friday for 5 good days). School counselor and Principal will create a chart and staff the extra recess; Principal to give an announcement on the intercom on Monday. Fidelity: announcement made, chart made.
Extinction: Encourage all students to work for "Friday Five", making a reward for problem behavior less likely. All staff (ongoing).
Corrective Consequence: Active supervision and continued early consequence (Minor/Major ODR). Hallway and cafeteria supervisors (ongoing).
Data Collection (what data will we look at, who gathers it, when/how often, where it will be shared, and who will see it): ODR record and supervisor weekly report; SWIS data entry person and Principal share the report with supervisors; weekly; in the supervisor meeting and posted in the faculty lounge on the PBIS bulletin board; all staff.
Trainer Note: To enhance the likelihood that the solution plan will be used, we've transformed it into an action plan. We have the same information we had on the prior slide, but now it is organized into an action-plan format to help us identify who is responsible for which action steps, when people will complete their tasks, how we will measure fidelity, and any other anecdotal information. You also see in the bottom portion that we've outlined the data collection process. In effect, this is a combined action plan and evaluation plan: one document to help us act on our solution and also to help us evaluate how we did. An action plan says who will do what by when. An evaluation plan specifies how the team will know that their efforts have (1) been implemented as planned (fidelity) and (2) affected student outcomes (decreased rates of problem behavior). Keep in mind, during team discussion each of the solution components should be addressed and discussed. However, the final solution development action plan MAY not have something in each cell. Schools/facilities may already be doing things and may not need to always write them on the plan. It's a matter of fluency: the more fluent a team is with the problem-solving process, the less they might include on this form. Still, recording it on paper helps ensure that it is completed and achieved.

63 Discipline Data for Universal Screening

64 Using SWIS Data for Decision Making
Universal Screening Tool: proportion of students with 0-1 Office Discipline Referrals (ODRs), 2-5 ODRs, and 6+ ODRs. Progress Monitoring Tool: compare data across time; prevent previous problem patterns. Trainer Notes: SWIS also assists schools in the decision making process by serving as a universal screening tool. Using the data-based decision rules of 0-1 ODRs, 2-5 ODRs, and 6+ ODRs, schools can quickly identify those students to watch; those students who may benefit from targeted, small-group services and Tier II supports; as well as those students who may benefit from more individualized and intensive supports. By regularly reviewing SWIS graphs and reports, schools/facilities can progress monitor their efforts and be proactive in their actions and decisions.
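The screening rules named here (0-1, 2-5, 6+ ODRs) translate into a simple banding of per-student counts. A sketch with invented counts; the Harbor Haven enrollment of 565 is used only as the denominator:

```python
from collections import Counter

# Illustrative only: per-student ODR counts are invented, and students with
# no referral record are assumed to belong in the 0-1 band.
odrs_per_student = {"S001": 0, "S002": 1, "S003": 3, "S004": 7, "S005": 2}
enrollment = 565

def band(count: int) -> str:
    return "0-1" if count <= 1 else ("2-5" if count <= 5 else "6+")

bands = Counter(band(c) for c in odrs_per_student.values())
bands["0-1"] += enrollment - len(odrs_per_student)  # students with zero ODRs

for label in ("0-1", "2-5", "6+"):
    print(f"{label} ODRs: {bands[label] / enrollment:.1%} of students")
```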

65 Using the Referrals by Student as a Universal Screening Tool
Trainer Notes: Here is an example of how SWIS can be used for universal screening. Students with 2-5 referrals are possible candidates for more support in behavior, academics, or both.

66 Research Study on Early Intervention
Cumulative Mean ODRs Per Month for 325+ Elementary Schools, 08-09. Trainer Notes: A research study by Frank, McIntosh, and May highlights the value of proactive decision making. What the research study found was that as a school year progresses, the cumulative mean of office discipline referrals increases for each of the three tiers. Students who may benefit from Tier III supports see the most significant increase in ODRs over time. However, when schools used SWIS as a tool for universal screening and intervened early, they were able to break the pattern. When the pattern was proactively broken, a significant stabilization in ODRs occurred and future incidents and problems were preempted. Jennifer Frank, Kent McIntosh, Seth May
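One plausible reading of "cumulative mean ODRs per month" is the running total of a group's ODRs divided by the number of students in that group; a rough sketch with invented numbers (these are not data from the study):

```python
from itertools import accumulate

# Hypothetical monthly ODR totals for one group of students, Sep through Jan.
monthly_odrs_for_group = [14, 19, 26, 31, 38]
students_in_group = 12

cumulative_totals = list(accumulate(monthly_odrs_for_group))
cumulative_means = [total / students_in_group for total in cumulative_totals]
print([round(m, 2) for m in cumulative_means])
# The curve keeps climbing unless early intervention (the "October Catch")
# interrupts the pattern.
```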

67 Research Study on Early Intervention
Cumulative Mean ODRs Per Month for 325+ Elementary Schools, 08-09: The "October Catch". Trainer Notes: However, when schools used SWIS as a tool for universal screening and intervened early, they were able to break the pattern. When the pattern was proactively broken, a significant stabilization in ODRs occurred and future incidents and problems were preempted. Jennifer Frank, Kent McIntosh, Seth May

68 Discipline Data for Long-Term Continuous Improvement

69 SWIS Year-End Report

70

71

72 Fidelity Data for Short-Term Continuous Quality Improvement

73 PBIS Assessment Reports

74 Fidelity Data for Universal Screening

75 How would we ensure that the Universal Screening occurred?
Who is responsible for gathering the data for the team? What is the schedule for reviewing these data (for purposes of universal screening)?

76 Fidelity Data for Long-Term Continuous Improvement

77 Two-fold TBD

78 Nad’s slide

79

80 Think about and share your different data sources: how do you use the information currently?
Similarities? Differences? Activity

