
1 PROGRESS MONITORING FOR SOCIAL BEHAVIOR Cynthia M. Anderson, PhD & Nadia Katul Sampson, MA University of Oregon

2 School-Wide Positive Behavior Support (three-tiered triangle)
  - Intensive Interventions (~5%): specialized, individualized systems for students with high-risk behavior
  - Targeted Interventions (~15%): specialized group systems for students with at-risk behavior
  - Universal Interventions (~80% of students): school-/classroom-wide systems for all students, staff, and settings

3 SWPBS Elements
  - Systems: supporting staff behavior
  - Practices: supporting student behavior
  - Outcomes: measurable outcomes supporting decision making

4 Important Outcomes to Monitor
  - System outcomes
    - What key features of student support are in place?
    - Are key features implemented with fidelity?
  - Individual student outcomes
    - Decision rules for starting an intervention: "Is this intervention a good fit?"
    - Progress monitoring during an intervention: "Is the intervention resulting in the outcomes we want?"
    - Is the intervention being implemented as designed? "Are we doing what we said we would do?"

5 Systems Outcomes: Assessing Process
  - Self-assessment
    - Monitoring progress over time
    - Developing an action plan
  - External evaluation
    - Monitoring progress over time
    - Useful when an outside opinion is warranted

6 Existing Tools for Assessing Process
  - Universal component of SWPBS
    - External: School-wide Evaluation Tool (SET)
    - Self-assessment: Team Implementation Checklist (TIC), Benchmarks of Quality (BoQ), Phases of Implementation
  - Targeted & intensive components of SWPBS
    - External: Individual Student Systems Evaluation Tool (ISSET)
    - Self-assessment: Benchmarks for Advanced Tiers (BAT)

7 ISSET and BAT
  - Key features
    1. Foundations: What needs to be in place?
    2. Targeted interventions
    3. Intensive interventions
  - For each feature (Systems, Practices, Data):
    1. What practices are implemented?
    2. What systems are used?
    3. What outcomes are assessed?

8 Important Outcomes to Assess
  - System outcomes
  - Individual student outcomes
    - Decision rules for starting an intervention: "Is this intervention a good fit?"
    - Progress monitoring during an intervention: "Is the intervention resulting in the outcomes we want?"
    - Is the intervention being implemented as designed? "Are we doing what we said we would do?"

9 Important Outcomes to Assess
  - System outcomes
  - Individual student outcomes
    - Decision rules for starting an intervention: "Is this intervention a good fit?"

10 Is an Intervention a Good Fit?
  - Questions about the student's behavior:
    - What is the problem?
    - What is the hypothesis about why the problem is occurring?
    - What is the goal of intervention?
    - Who will be implementing, and what are their skills and availability?
  - Intervention selection: Is this intervention effective for:
    - Problems like this (severity, intensity, where it occurs, etc.)?
    - Behaviors triggered and maintained by events like this one?
    - Achieving goals like this?
  - What resources are needed to implement?

11 Is This Intervention a Good Fit?
  - Evaluating outcomes requires planning before the intervention begins:
    1. What are the targeted outcomes?
    2. What is the goal (date and outcome)?
    3. How will data be collected?
    4. How will data be analyzed?
    5. How often will progress monitoring occur?
  (Resources: Group Template, Individual Template)

12 Important Outcomes to Assess
  - System outcomes
  - Individual student outcomes
    - Progress monitoring during an intervention: "Is the intervention resulting in the outcomes we want?"

13 Students in IPBS: Is the Intervention Working?
  - Once the intervention has begun:
    - Progress monitoring occurs regularly and frequently
      - Feedback from teacher(s)
      - Team feedback
    - Data are used to guide decision making (a decision-rule sketch follows below):
      - Continue the intervention
      - Modify the intervention
      - Begin a new intervention
      - Fade the existing intervention
  (Resources: Graph System, Behavior Rating Form, Behavior Rating & Fidelity, Team Feedback)
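
The continue/modify/new/fade decision is typically driven by a simple rule applied to the behavior-rating data. Below is a minimal sketch in Python, assuming daily percent-of-points data; the 80% daily goal, the 10-day window, and the decision thresholds are hypothetical placeholders a team would set during planning, not rules from the presentation.

```python
# Minimal sketch of a progress-monitoring decision rule for daily
# behavior-rating data (e.g., percent of daily points earned).
# The 80% goal, 10-day window, and thresholds are hypothetical.

def review_progress(daily_percents, goal=80.0, window=10):
    """Suggest a team decision from the most recent daily ratings."""
    recent = daily_percents[-window:]                # last `window` school days
    days_at_goal = sum(p >= goal for p in recent)    # days meeting the daily goal
    rate = days_at_goal / len(recent)

    if rate >= 0.80:
        return "fade"      # consistently at goal: begin fading the intervention
    if rate >= 0.50:
        return "continue"  # responding: keep the intervention as designed
    return "modify"        # not responding: modify or select a new intervention


# Example: two weeks of percent-of-points data for one student
points = [70, 75, 82, 80, 78, 85, 90, 76, 88, 84]
print(review_progress(points))   # -> continue
```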

15 Important Outcomes to Assess
  - System outcomes
  - Individual student outcomes
    - Is the intervention being implemented as designed? "Are we doing what we said we would do?"

16 Fidelity
  - Documentation that the intervention is being implemented as designed
  - Measurement
    - Teacher-completed
    - Assessed by another person

17 Student Outcomes: Fidelity
  1. What are the key components of the intervention?
  2. How can fidelity be measured? (A minimal scoring sketch follows below.)
  3. Who will collect and analyze the data?
  4. How will data be used?
  (Resource: Sample BSP)
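
As a concrete illustration of question 2, here is a minimal sketch of one common way to score fidelity: the percentage of a plan's key components observed as implemented. The component names and the 80% criterion are invented examples for illustration, not part of the sample BSP.

```python
# Minimal sketch of a fidelity score: the percentage of key intervention
# components implemented on a given day. Component names and the 80%
# criterion are hypothetical examples.

def fidelity_percent(checklist):
    """`checklist` maps each key component of the plan to True/False."""
    return 100.0 * sum(checklist.values()) / len(checklist)

# Example: a teacher- or observer-completed checklist for one day
daily_check = {
    "precorrection delivered at start of class": True,
    "reinforcement delivered on the planned schedule": True,
    "behavior rating card completed": False,
    "ratings entered into the tracking system": True,
}

score = fidelity_percent(daily_check)
print(f"Fidelity: {score:.0f}%")   # Fidelity: 75%
if score < 80:
    print("Below criterion: review implementation with the team")
```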

18 Monitoring Student Progress Over Time
  - System requirements:
    - Efficient
    - Comprehensive
    - Easily accessible
    - Modifiable to meet the needs of individual students

19 Relevant Information for Individual Students
  1. Referral information
  2. Intervention description
  3. Modifications to the intervention
  4. Easily interpretable summary of intervention results/progress

24 Progress Monitoring in Illinois
  - Progress monitoring is critical at all levels
  - Student level
    - Per student, for individual progress monitoring
    - In aggregate, to monitor the effectiveness of the interventions themselves (e.g., is our "problem-solving" group effective?)
  - Building/district level
    - Per school, to monitor building-level systems (e.g., is our high school effective at keeping youth engaged?)
    - In aggregate, to make district-level decisions
      - District as a whole (set goals, allocate resources)
      - Cohort schools vs. non-cohort schools (is an initiative working?)

25 Data-Based Decision Making
  1. Student outcome data are used:
     - To identify youth in need of support and to identify an appropriate intervention (a sketch of one such entry rule follows below)
     - For ongoing progress monitoring of response to intervention
     - To exit or transition youth off of interventions
  2. Intervention integrity or process data are used:
     - To monitor the effectiveness of the intervention itself
     - To make decisions regarding the continuum/menu of interventions and supports
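
A minimal sketch of the first use listed above: applying a data-based decision rule to office discipline referral (ODR) data to flag youth for targeted support. The cutoff of two or more major ODRs in a term is a hypothetical example of a rule a team might adopt, not a rule stated in the slides.

```python
# Minimal sketch of a data-based entry rule: flag students whose count of
# major office discipline referrals (ODRs) meets a cutoff for targeted
# (secondary) support. The 2+ majors-per-term cutoff is a hypothetical
# example of a team-chosen decision rule.

from collections import Counter

def flag_for_secondary_support(major_odr_log, cutoff=2):
    """`major_odr_log` lists one student ID per major ODR in the term."""
    counts = Counter(major_odr_log)
    return sorted(sid for sid, n in counts.items() if n >= cutoff)

# Example: a term's major-ODR log (IDs repeat once per referral)
odr_log = ["s01", "s02", "s01", "s03", "s01", "s02", "s04"]
print(flag_for_secondary_support(odr_log))   # ['s01', 's02']
```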

27 Mean CICO Points per School: 71 Illinois Elementary Schools, 2008-09 [chart]

29 Secondary Systems Planning Team Meeting Agenda
  - Number of youth in CICO (record on TT)?
  - Number of youth responding (record on TT)?
    - Send a Reverse Request for Assistance to the teachers of all youth not responding
  - Number of new youth potentially entering the intervention (share the number of RFAs, universal screening information, and/or the number of youth who met the data-based decision-rule cutoffs for secondary support)?
  - Repeat for S/AIG, Mentoring, and Brief FBA/BIP
  - If fewer than 70% of youth are responding to any of the interventions, the Secondary Systems team should review the integrity of the intervention and make adjustments as needed (a sketch of this rule follows below).
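
The last agenda item is a straightforward percentage check. A minimal sketch, assuming the team records how many youth are enrolled in and responding to each intervention; the enrollment counts below are made-up examples.

```python
# Minimal sketch of the agenda's 70% rule: for each targeted intervention,
# compare youth responding to youth enrolled and flag any intervention
# whose response rate falls below 70%. Counts are illustrative only.

interventions = {
    # intervention: (youth enrolled, youth responding)
    "CICO":          (24, 19),
    "S/AIG":         (10, 6),
    "Mentoring":     (8, 7),
    "Brief FBA/BIP": (5, 4),
}

for name, (enrolled, responding) in interventions.items():
    rate = responding / enrolled
    status = "OK" if rate >= 0.70 else "review intervention integrity"
    print(f"{name}: {responding}/{enrolled} responding ({rate:.0%}) -> {status}")
```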

30 3-Tiered System of Support: Necessary Conversations (Teams) [diagram, Sept. 1, 2009]
  - Universal Team: plans school-wide & class-wide supports (Universal Support)
  - Secondary Systems Team: uses process data; determines overall intervention effectiveness (CICO; SAIG; Group with individual feature; Brief FBA/BIP)
  - Problem Solving Team: standing team; uses FBA/BIP process for one youth at a time (Brief FBA/BIP)
  - Tertiary Systems Team: uses process data; determines overall intervention effectiveness (Complex FBA/BIP; WRAP)

31 Comparison: Elementary School A FY 2009 CISS Data and IS-SET Data

32 FY 2009 IS-SET Data Comparison: Elementary School A - District

33 Mean Percentage of Students by Major ODRs, 2006-07: Elementary School B (677 students) [chart; 164]

34 Mean Percentage of Students by Major ODRs, 2007-08: Elementary School B (707 students) [chart; 71]

35 Mean Percentage of Students by Major ODRs, 2008-09: Elementary School B (695 students) [chart; 61]

36 FY 2009 IS-SET Data Comparison: Elementary School B - District

37 Comments/Questions

