
1 Data-Based Problem Solving and Data Systems – Shelby Robertson, Ph.D.; Therese Sandomierski, MA; Pamela Sudduth, MA

2 This Session: Solidify a vision for problem solving at Tier 1; see some examples of what it looks like for different domains; become familiar with some resources that are available to support DBPS.

3 DBPS Workgroup: Develop a model/template for data-based problem solving across tiers that can be applied by schools and districts. Primary outcomes will be the conceptual framework, training resources, and exemplars for professional development at the district level – a "library" for consultants.

4 What Is Data-Based Problem Solving? Decisions in a Multi-Tiered System of Supports (MTSS) framework are based on student performance data, and data-based problem solving is infused in all components of MTSS practice. At the screening level, data is used to decide which students are at risk of their needs not being met. At the progress monitoring stage, data is used to judge the effectiveness of interventions. Decisions to increase or decrease levels of intervention within the MTSS framework are likewise based on student performance data.

5 Why is Data-Based Problem Solving Important? Data-based decisions regarding student response to intervention are central to the MTSS framework. Important educational decisions about the intensity and likely duration of interventions are based on individual student response to instruction across multiple tiers of intervention and are informed by data on learning rate and level.

6 Why is Data-Based Problem Solving Important? Knowing why and for what purpose data is being collected is imperative. When the purpose and intent of data collection are known, the data can be used to make a variety of decisions.

7 What Should Schools Consider? Three types of data are gathered within an MTSS process: universal screening data, used to identify students who are not making academic or behavioral progress at expected rates; diagnostic assessment data, used to determine what students can and cannot do in important academic and behavioral domains; and progress monitoring data, used to determine whether academic or behavioral interventions are producing the desired effects.

8 Data collection leads to appropriate support and strategic instruction for ALL students. When looking at data, a team may decide: – if the delivery of the core curriculum should be altered, – if more information is needed, – or if supplemental instruction needs to be added. Data that is collected will also inform the school whether or not the problem exists as a result of the classroom environment, intervention, curriculum, instruction, or learner.

9 Problem Solving Process (Response to Intervention – RtI): 1. Define the Problem – What do we want students to KNOW and be able to DO? 2. Problem Analysis – Why can't they DO it? 3. Implement Plan – What are WE going to DO about it? 4. Evaluate – Did it WORK?

10 Step 1/Tier 1 Integrated Guiding Questions – Problem Identification: What do we expect our students to know, understand, and be able to do as a result of instruction? Do our students meet or exceed these expected levels? (How sufficient is the core?) Are there groups for whom the core is not sufficient?

11 Both domains focus on a common goal: Full Option Graduates!

12 What do we expect our students to know, understand, and be able to do as a result of instruction? Academics: NGSSS for all grade levels and content areas. Behavior: school-wide expectations, Character Education traits, school-wide social skills curricula. Both domains: school/district mission statements. To effectively address student outcomes, schools must address both domains.

13 How sufficient is the core?

14 Are there groups for whom core is not sufficient?

15 How sufficient is the core?

16 Are there groups for whom core is not sufficient?

17 How to Answer the Questions (Behavior): attendance, tardies, suspensions, discipline referrals, surveys (locally developed, safety, climate, substance abuse), and the percent of students participating in the Tier 1 system.

18 How sufficient is the core?

19 How sufficient is the core?

20 Are there groups for whom core is not sufficient?

21 Are there groups for whom core is not sufficient?

22 Step 2/Tier 1 Integrated Guiding Questions – Problem Analysis: If the core is NOT sufficient for either a "domain" or a group of students, what barriers have precluded or could preclude students from reaching expected levels?

23 ICEL domains and examples of what each covers:
Instruction – Cognitive complexity of questions and tasks; gradual release of responsibility; appropriate scaffolding; connection to students' personal goals, interests, and life experiences.
Curriculum – Alignment with standards and across grade/school levels; relevancy to students' personal goals; content; pacing; progression of learning; differentiation.
Environment – Reward/consequence system; visual cues; climate/culture; quality of student/adult relationships; quality of peer relationships; high expectations for ALL students; collaboration and voice.
Learner – Reinforcement preferences; perceptions of competence and control; perceived relevancy of instruction/education; integration and affiliation with school; academic/social-emotional skill development.

24 Hypotheses and data source examples:
I – Hypothesis: Instruction did not include modeling and guided practice. Data sources: lesson plans, observations, report/survey data, permanent products.
C – Hypothesis: Skills targeted in the lessons did not align with the NGSSS. Data sources: lesson plans; observations of tasks, assignments, and assessments.
E – Hypothesis: The school-wide reinforcement program includes few developmentally appropriate reinforcement options. Data sources: review of the school-wide behavior plan, student survey and student focus group feedback.
L – Hypothesis: A substantial amount of instructional time is lost due to excessive absenteeism. Data sources: attendance, ODRs, suspensions.

25 Reaching Expected Levels: If the core is NOT sufficient for either a "domain" or a group of students, what barriers have precluded or could preclude students from reaching expected levels?

26 What potential barriers have precluded us from improving student outcomes? Lack of common assessments, common planning, ongoing progress monitoring, curriculum mapping aligned with NGSSS and common assessments, resource availability, administrative support, and professional development.

27 Analyzing Identified Problems – possible data sources for analysis:
I – Lesson plan review, instructional observations, survey data, permanent products.
C – Lesson plans; observations of tasks, assignments, and assessments.
E – Review of the school-wide behavior plan; student survey and student focus group feedback; walk-through assessments; climate surveys; behavior plan/fidelity measures.
L – Attendance, ODRs, suspensions; assessment of academic/social-emotional skill development.

28 The school-wide reinforcement program IS NOT being implemented with fidelity…

29 Step 3/Tier 1 Integrated Guiding Questions – Plan Development & Implementation: What strategies or interventions will be used? What resources are needed to support implementation of the plan? Planning for Step 4: How will the sufficiency and effectiveness of the core be monitored over time? How will the data be displayed? How will fidelity of interventions be monitored over time? How will fidelity of the problem solving process be monitored over time? How will "good," "questionable," and "poor" responses to intervention be defined?

30 What strategies or interventions will be used?

31 Math Resources What resources are needed to support implementation of the plan?

32 Literacy Resources What resources are needed to support implementation of the plan?

33

34 Tier 1 Interventions (Behavior): Based on the function of the problem behavior – teach the skill, reward the skill, and consequate effectively. Breaking referrals down by expectation, context, motivation, and administrative decision can help point to the likely function.

35 How will the sufficiency and effectiveness of the core be monitored over time? Common Assessment Example

36 Monitoring the Core (Behavior): Referrals per Day/Month
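One way a data team might compute the referrals-per-school-day metric for each month is sketched below in Python; the referral dates and school-day counts are illustrative placeholders, and a real school would pull both from its discipline database and calendar.

```python
from collections import Counter
from datetime import date

# Hypothetical office discipline referral (ODR) log: one entry per referral.
odr_log = [
    date(2012, 9, 4), date(2012, 9, 4), date(2012, 9, 17),
    date(2012, 10, 1), date(2012, 10, 8), date(2012, 10, 8),
    date(2012, 10, 22), date(2012, 11, 5),
]

# Assumed number of instructional days in each month (from the school calendar).
school_days = {(2012, 9): 19, (2012, 10): 22, (2012, 11): 17}

# Count referrals by (year, month), then divide by instructional days.
referrals_per_month = Counter((d.year, d.month) for d in odr_log)

for (year, month), days in sorted(school_days.items()):
    count = referrals_per_month.get((year, month), 0)
    print(f"{year}-{month:02d}: {count} referrals, {count / days:.2f} per school day")
```

Dividing by instructional days rather than calendar days keeps months with breaks comparable to full months.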

37 How will fidelity be monitored over time? Fidelity of implementation is the delivery of instruction in the way it was designed to be delivered. Fidelity must also address the integrity with which screening and progress-monitoring procedures are completed and an explicit decision-making model is followed. Fidelity also applies to the problem solving process itself: poor problem solving can lead to decisions to implement otherwise good interventions that do not fit the actual problem.

38 Monitoring the Core (Behavior): Fidelity depends on the intervention! Examples include lesson plans with built-in fidelity checklists, permanent products of lessons, token sign-out logs, counts of positive postcards, parent call logs, implementation measures, and surveys, focus groups, and observations.

39 Implementation Measures: PBS Implementation Checklist

40 Implementation Measures: Benchmarks of Quality

41 How will "good," "questionable," and "poor" responses to intervention be defined? Decision rules:
Positive Response – The gap is closing; we can extrapolate the point at which the target student(s) will "come in range" of the target, even if this is long range.
Questionable Response – The rate at which the gap is widening slows considerably but the gap is still widening, or the gap stops widening but closure does not occur.
Poor Response – The gap continues to widen with no change in rate.
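As a rough sketch of how these decision rules could be applied to numbers, the function below classifies a response from equal-length lists of expected and observed scores; the cutoffs (a gap that shrinks at every check = positive, a gap that keeps widening without slowing = poor, everything else = questionable) are one literal reading of the rules above, not an official formula.

```python
def classify_response(expected, observed):
    """Classify response to intervention from expected vs. observed scores.

    Both arguments are equal-length lists of scores taken at the same points
    in time (e.g., fall, winter, spring).
    """
    gaps = [e - o for e, o in zip(expected, observed)]
    changes = [b - a for a, b in zip(gaps, gaps[1:])]  # positive = widening

    if all(c < 0 for c in changes):
        return "positive"      # gap is closing at every check
    if all(c > 0 for c in changes) and changes[-1] >= changes[0]:
        return "poor"          # gap keeps widening with no slowing
    return "questionable"      # widening slows, or the gap has stalled


print(classify_response([50, 60, 70], [40, 53, 66]))  # positive
print(classify_response([50, 60, 70], [40, 44, 52]))  # questionable
print(classify_response([50, 60, 70], [40, 44, 48]))  # poor
```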

42 Defining Adequate Response: Tier 1 for Behavior – school-wide screenings (< 20% of students identified), ODRs by October (< 2 majors), teacher nominations and ESE (EBD) referrals, a declining trend* in discipline data, attendance, and tardies.
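A minimal sketch of the ODR-based criteria follows, assuming a hypothetical per-student count of major referrals logged by the end of October; the student-level cut (< 2 majors) and the school-level cut (< 20% identified) come from the list above, but how a team weighs them alongside the other indicators is a local decision.

```python
# Hypothetical per-student counts of major ODRs recorded by the end of October.
majors_by_october = {"s01": 0, "s02": 1, "s03": 3, "s04": 0, "s05": 2}

# Student-level criterion: a student with 2 or more majors by October is flagged.
flagged = [sid for sid, n in majors_by_october.items() if n >= 2]
pct_identified = 100 * len(flagged) / len(majors_by_october)

print(f"Flagged students: {flagged}")
print(f"{pct_identified:.0f}% of students identified by the ODR screen")
# School-level criterion: fewer than 20% of students identified.
print("Tier 1 behavior core adequate?", pct_identified < 20)
```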

43 Step 4/Tier 1 Integrated Guiding Questions – Plan Evaluation of Effectiveness: Have planned improvements to the core been effective?

44 [Chart: Positive Response to Intervention – expected vs. observed performance, fall to spring. The gap is closing; we can extrapolate the point at which the target student(s) will "come in range" of the target, even if this is long range.]

45 [Chart: Positive Response to Intervention – expected vs. observed trajectory over time.]

46 [Chart: Questionable Response to Intervention – expected vs. observed performance, fall to spring. The rate at which the gap is widening slows considerably but the gap is still widening, or the gap stops widening but closure does not occur.]

47 [Chart: Questionable Response to Intervention – expected vs. observed trajectory over time.]

48 [Chart: Poor Response to Intervention – expected vs. observed performance, fall to spring. The gap continues to widen with no change in rate.]

49 [Chart: Poor Response to Intervention – expected vs. observed trajectory over time.]

50 Have our interventions been effective?

51

52 Decisions – what to do if RtI is Positive: continue the intervention with the current goal; continue the intervention with an increased goal; or fade the intervention to determine whether the student(s) have acquired functional independence.

53 Decisions – what to do if RtI is Questionable: Was our DBPS process sound? Was the intervention implemented as intended? If no, employ strategies to increase implementation integrity. If yes, increase the intensity of the current intervention for a short period of time and assess impact: if the rate improves, continue; if the rate does not improve, return to problem solving.

54 Decisions – what to do if RtI is Poor: Was our DBPS process sound? Was the intervention implemented as intended? If no, employ strategies to increase implementation integrity. If yes, ask: Is the intervention aligned with the verified hypothesis? (Intervention Design) Are there other hypotheses to consider? (Problem Analysis) Was the problem identified correctly? (Problem Identification)
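The decision flow on slides 52–54 can be restated as a short sketch; `response` is the classification from the decision rules earlier in the session, and the two flags stand for the "Was our DBPS process sound?" and "Was the intervention implemented as intended?" questions. It only restates the slides and is no substitute for team judgment.

```python
def next_steps(response, process_sound, implemented_as_intended):
    """Suggest next steps for a given response to intervention (RtI)."""
    if response == "positive":
        return ["continue with the current goal, raise the goal, "
                "or fade the intervention to test functional independence"]
    if not (process_sound and implemented_as_intended):
        return ["employ strategies to increase implementation integrity"]
    if response == "questionable":
        return ["briefly increase the intensity of the current intervention and assess impact",
                "if the rate does not improve, return to problem solving"]
    # Poor response despite sound problem solving and faithful implementation.
    return ["check alignment with the verified hypothesis (Intervention Design)",
            "consider other hypotheses (Problem Analysis)",
            "re-check whether the problem was identified correctly (Problem Identification)"]


for step in next_steps("poor", process_sound=True, implemented_as_intended=True):
    print("-", step)
```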

55 We CANNOT Continue to Ignore the Data… Will we meet our goal of 100% by 2014?

56 FLDOE Race to the Top – Local Instructional Improvement System (LIIS) Minimum Standards: FLDOE identified nine component areas of a LIIS and specific requirements for each.

57 6. Analysis and Reporting – The system will leverage the availability of data about students, district staff, benchmarks, courses, assessments, and instructional resources to provide new ways of viewing and analyzing data.
8. Data Integration – The system will include or seamlessly share information about students, district staff, benchmarks, courses, assessments, and instructional resources to enable teachers, students, parents, and district administrators to use data to inform instruction and operational practices.

58 The Reciprocal Nature of Academic & Behavior Outcomes: Academics and behavior influence one another in a multitude of ways. Systems and resources are being developed to support DBPS, including the RtI:B database and the workgroup's models and materials.

59 Thank You!

