Data-based Decision Making: Basics OSEP Center on Positive Behavioral Interventions & Supports February 2006



2 Data-based Decision Making: Basics. OSEP Center on Positive Behavioral Interventions & Supports, February 2006. www.PBIS.org | www.SWIS.org | George.sugai@uconn.edu

3 Purpose
To review systems & practices of school-wide data-based decision making:
– Why?
– Guidelines
– Examples

4 Why?
– Communications
– Effectiveness, efficiency, & relevance of decision making
– Professional accountability
– Prevention
– Use minutes efficiently

5 4 PBS Elements
– OUTCOMES: Supporting Social Competence & Academic Achievement
– DATA: Supporting Decision Making
– SYSTEMS: Supporting Staff Behavior
– PRACTICES: Supporting Student Behavior

6 3 Elements of Data-based Decision Making
1. High-quality data from clear definitions, processes, & implementation (e.g., school-wide behavior support)
2. Efficient data storage & manipulation system (e.g., SWIS)
3. Process for data-based decision making & action planning (e.g., team)

7 Assumptions
– Continuum of school-wide positive behavior support in place
– "Good" data available
– Team-based leadership
– In-building expertise
– School-level decision making needed

8 Start with Questions & Outcomes!
– Use data to verify/justify/prioritize
– Describe in measurable terms
– Specify realistic & achievable criterion for success

9 Data-based Action Planning Process
1. Use team
2. Identify data sources
3. Collect data
4. Summarize data
5. Analyze data
6. Build & implement action plan based on data

10 School-wide PBS Systems Implementation Logic

11 Kinds of Data
– Office discipline referrals
– Behavioral incidents
– Attendance
– Suspension/detention
– Observations
– Self-assessments
– Surveys, focus groups
– Etc.

12 Office Discipline Referral Cautions
– Reflects 3 factors: student, staff member, & office
– Reflects overt rule violations
– Tends to underestimate problem behavior

13 General Approach: "Big 5"
– # referrals per day per month
– # referrals by student
– # referrals by location
– #/kinds of problem behaviors
– # problem behaviors by time of day
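The "Big 5" reports are straightforward tallies over individual referral records. A minimal sketch in Python, assuming each referral is a plain dict; the field names and records below are illustrative, not actual SWIS fields:

```python
# A minimal sketch of the "Big 5" tallies, assuming each office discipline
# referral (ODR) is a plain dict; the field names below are illustrative.
from collections import Counter
from datetime import date

referrals = [
    {"date": date(2001, 10, 3), "student": "S01", "location": "cafeteria",
     "behavior": "disruption", "time": "11:45"},
    {"date": date(2001, 10, 4), "student": "S02", "location": "classroom",
     "behavior": "defiance", "time": "13:10"},
    # ... one record per ODR
]

school_days = {"2001-10": 21}  # instructional days per month, entered by hand

# 1. # referrals per day per month
per_month = Counter(r["date"].strftime("%Y-%m") for r in referrals)
per_day_per_month = {m: n / school_days[m] for m, n in per_month.items()}

# 2-5. # referrals by student, location, problem behavior, and time of day
by_student = Counter(r["student"] for r in referrals)
by_location = Counter(r["location"] for r in referrals)
by_behavior = Counter(r["behavior"] for r in referrals)
by_hour = Counter(r["time"].split(":")[0] + ":00" for r in referrals)

print(per_day_per_month, by_location.most_common(3))
```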

14 Tables versus Graphs

15
Year   Month   Num of Days   Num of Referrals   Avg Referrals Per Day
2001   Aug          0              0                   0.00
2001   Sep         19              5                   0.26
2001   Oct         21             18                   0.86
2001   Nov         18             17                   0.94
2001   Dec         14             21                   1.50
2002   Jan         22             18                   0.82
2002   Feb         17             15                   0.88
2002   Mar         19             26                   1.37
2002   Apr         21             14                   0.67
2002   May         18             13                   0.72
2002   Jun         11              2                   0.18
2002   Jul          0              0                   0.00
Totals:           180            149                   0.83

16 # ODR per Day per Month

17 Total v. Rate

18 Total # ODR per Month

19 # ODR per Day per Month

20 Give priority to:
– Graphs
– Rate
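The per-day rate behind slides 16-20 is simply referrals divided by instructional days, which keeps months of different lengths comparable. A small sketch; the function name is ours, and the example figures come from the slide-15 table:

```python
# Rate (# ODR per school day) rather than raw totals.
def referrals_per_day(num_referrals: int, num_school_days: int) -> float:
    if num_school_days == 0:  # e.g., months with no instructional days
        return 0.0
    return num_referrals / num_school_days

print(round(referrals_per_day(18, 21), 2))    # Oct 2001 -> 0.86
print(round(referrals_per_day(149, 180), 2))  # full year -> 0.83
```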


23 Days: 175   Referrals: 471   Avg: 2.69 per day (majors + minors)

24 Days: 175   Referrals: 86   Avg: 0.49 per day (majors only)

25 Majors v. minors

26 Is action needed?


33 What?


36 Where?


38 Who? Students per Number of Referrals

39 Who?

40 When?


42 “Real” Data “A. E. Newman” Elementary School –~450 K-5 students –~40% free/reduced lunch –Suburban

43
Year   Month   Num of Days   Num of Referrals   Avg Referrals Per Day
2001   Aug          0              0                   0.00
2001   Sep         19            134                   7.05
2001   Oct         22            162                   7.36
2001   Nov         18            135                   7.50
2001   Dec         15             14                   0.93
2002   Jan         17              0                   0.00
2002   Feb         19              0                   0.00
2002   Mar         16              0                   0.00
2002   Apr         22              0                   0.00
2002   May         22              0                   0.00
2002   Jun          9              0                   0.00
2002   Jul          0              0                   0.00
Totals:           179            445                   2.49

44 # Behavior Incidents/Day/Month

45 Increasing Precision
– Majors v. minors?
– Type of behavior infraction?
– Location of BI?
– Time of day?
– Staff member?
– Individual students?
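Each of these questions is a filter-then-count pass over the incident records. A sketch of one such drill-down, assuming hypothetical "severity", "location", and "staff" fields on each record:

```python
# Drill-down sketch for slide 45; field names are assumptions, not SWIS fields.
from collections import Counter

incidents = [
    {"severity": "major", "location": "playground", "staff": "T03"},
    {"severity": "minor", "location": "classroom",  "staff": "T01"},
    # ... one record per behavior incident
]

majors = [i for i in incidents if i["severity"] == "major"]
major_by_location = Counter(i["location"] for i in majors)  # "# Major BI by Location"
major_by_staff = Counter(i["staff"] for i in majors)        # "# Major BI by Staff Member"
print(major_by_location, major_by_staff)
```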

46 # BI by Problem Behavior Type

47 # Major BI/Day/Month

48 # BI by Location

49 # BI by Time of Day

50 # BI by Staff Member

51 # Major BI by Staff Member

52 SW v. Individual: Examine the impact of individual students' behavioral incidents on school-wide behavior incidents
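One simple way to examine that impact is to recompute the school-wide daily rate with and without a single student's incidents. A sketch with placeholder figures (not taken from any slide):

```python
# "SW v. Individual" sketch: school-wide rate with and without one student.
school_days = 180              # placeholder values
total_incidents = 300
one_students_incidents = 25

rate_all = total_incidents / school_days
rate_without = (total_incidents - one_students_incidents) / school_days
print(f"all students: {rate_all:.2f}/day; excluding one student: {rate_without:.2f}/day")
```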

53 # Major BI by Student w/ >1

54 # BI by Student w/ >3

55 SW v. Individual
Referrals per student   Majors + Minors        Majors Only
                        #        %             #        %
1-2                     89       20%           44       10%
3-5                     27        6%           10        2%
>5                      30        7%            4        1%
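The distribution above comes from banding each student's referral count and dividing by enrollment. A sketch, with enrollment and per-student counts as placeholders:

```python
# Band students by referral count and report each band's share of enrollment.
from collections import Counter

enrollment = 450
referrals_per_student = {"S01": 1, "S02": 4, "S03": 7}  # student -> # of referrals

def band(n: int) -> str:
    if n <= 2:
        return "1-2"
    if n <= 5:
        return "3-5"
    return ">5"

band_counts = Counter(band(n) for n in referrals_per_student.values())
for b in ("1-2", "3-5", ">5"):
    students = band_counts.get(b, 0)
    print(f"{b}: {students} students ({students / enrollment:.0%} of enrollment)")
```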

56 What about CLEO?
– 12 BI Dec. 2000 – Jun. 2001
– 19 BI Sep. 2001 – Dec. 2001

Suspensions/Expulsions Per Year
                              2000-01            2001-02
                              Events   Days      Events   Days
In-School Suspensions            0       0          2       2
Out-of-School Suspensions        1       1          3       2.5
Expulsions                       0       0          0       0

57 CLEO: # BI/Day/Month

58 CLEO: # BI by Type

59 CLEO: # BI by Location

60 1. School-wide systems if…
– >40% of students received 1+ ODR
– >2.5 ODR per student
Modify universal interventions (proactive school-wide discipline) to improve the overall discipline system:
– Teach, precorrect, & positively reinforce expected behavior

61 2. Classroom systems if…
– >60% of referrals come from classrooms
– >50% of ODR come from <10% of classrooms
Enhance universal &/or targeted classroom management practices:
– Examine academic engagement & success
– Teach, precorrect for, & positively reinforce expected classroom behavior & routines

62 3. Non-classroom systems if…
– >35% of referrals come from non-classroom settings
– >15% of students referred from non-classroom settings
Enhance universal behavior management practices:
– Teach, precorrect for, & positively reinforce expected behavior & routines
– Increase active supervision (move, scan, interact)

63 4. Targeted group interventions if…
– >10-15 students receive >5 ODR
Provide functional-assessment-based, but group-based, targeted interventions:
– Standardize & increase daily monitoring, opportunities, & frequency of positive reinforcement

64 5. Individualized action team system if…
– >10 ODR
– <10 students continue their rate of referrals after receiving targeted group support
Provide highly individualized, functional-assessment-based behavior support planning
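The threshold rules on slides 60-63 can be expressed as a simple checklist. A minimal sketch; the dictionary keys and function name are ours, and only the thresholds come from the slides:

```python
# Checklist sketch of the slide 60-63 decision rules.
def systems_needing_review(stats: dict) -> list:
    flags = []
    # Slide 60: school-wide systems
    if stats["pct_students_with_1plus_odr"] > 40 or stats["odr_per_student"] > 2.5:
        flags.append("school-wide system")
    # Slide 61: classroom systems
    if (stats["pct_odr_from_classrooms"] > 60
            or stats["pct_odr_from_10pct_of_classrooms"] > 50):
        flags.append("classroom system")
    # Slide 62: non-classroom systems
    if (stats["pct_odr_from_nonclassroom"] > 35
            or stats["pct_students_referred_nonclassroom"] > 15):
        flags.append("non-classroom systems")
    # Slide 63: targeted group interventions (slide gives a 10-15 student range)
    if stats["students_with_more_than_5_odr"] > 10:
        flags.append("targeted group interventions")
    return flags

example = {
    "pct_students_with_1plus_odr": 45, "odr_per_student": 1.8,
    "pct_odr_from_classrooms": 65, "pct_odr_from_10pct_of_classrooms": 40,
    "pct_odr_from_nonclassroom": 20, "pct_students_referred_nonclassroom": 10,
    "students_with_more_than_5_odr": 12,
}
print(systems_needing_review(example))
# -> ['school-wide system', 'classroom system', 'targeted group interventions']
```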

65 “School-wide Information System” (SWIS) SWIS.org

66 Establishing an Evaluation Plan
– Develop evaluation questions: What do you want to know?
– Identify indicators for answering each question: What information can be collected?

67 Establishing an Evaluation Plan (cont.)
– Develop methods & schedules for collecting & analyzing indicators: How & when should this information be gathered?
– Make decisions from the analyzed information: What is the answer to the question?

68 7 Basic Evaluation Questions
1. What does "it" look like now?
2. Are we satisfied with how "it" looks?
3. What would we like "it" to look like?
4. What would we need to do to make "it" look like that?
5. How would we know if we've been successful with "it"?
6. What can we do to keep "it" like that?
7. What can we do to make "it" more efficient & durable?

69 Guidelines: To the greatest extent possible…
– Use available data
– Make data collection easy (<1% of staff time)
– Develop relevant questions
– Display data in efficient ways

70 Guidelines (cont.)
– Develop a regular & frequent schedule/routine for data review & decision making
– Utilize multiple data types & sources
– Establish clarity about office- v. staff-managed behavior
– Invest in local expertise

71 Conclusion
Data are good…but only as good as the systems in place for:
– PBS
– Collecting & summarizing
– Analyzing
– Decision making, action planning, & sustained implementation

