Data-Based Problem Solving and Data Systems
Shelby Robertson, Ph.D.; Therese Sandomierski, MA; Pamela Sudduth, MA
This Session: Solidify a vision for problem solving at Tier 1 See some examples of what it looks like for different domains Become familiar with some resources that are available to support DBPS
DBPS Workgroup Develop a model/template for data-based problem solving across tiers… – Can be applied by schools and districts Primary outcomes will be the conceptual framework, training resources, and exemplars for professional development at the district level. – “Library” for consultants
What Is Data-Based Problem Solving? Decisions in an MTSSS framework are based on student performance data. Data-based problem solving is infused in all components of MTSSS practice. At the screening level, data are used to decide which students are at risk of their needs not being met. At the progress monitoring stage, data are used to judge the effectiveness of interventions. Decisions to increase or decrease levels of intervention within a Multi-Tiered Systems of Support framework are based on student performance data.
Why is Data-Based Problem Solving Important? Data-based decisions regarding student response to intervention are central to the MTSSS framework. Important educational decisions about the intensity and likely duration of interventions are based on individual student response to instruction across multiple tiers of intervention and are informed by data on learning rate and level.
Why is Data-Based Problem Solving Important? Knowing why and for what purpose data are being collected is imperative. When the purpose and intent of data collection are known, the data can be used to make various decisions.
What Should Schools Consider? Three types of data are gathered within an MTSSS process: Data from universal screening is used to identify those students who are not making academic or behavioral progress at expected rates. Data from diagnostic assessment is used to determine what students can and cannot do in important academic and behavioral domains. Data from progress monitoring is used to determine if academic or behavioral interventions are producing desired effects.
Data collection leads to appropriate support and strategic instruction for ALL students. When looking at data, a team may decide: – if the delivery of the core curriculum should be altered, – if more information is needed, – or if supplemental instruction needs to be added. Data that is collected will also inform the school whether or not the problem exists as a result of the classroom environment, intervention, curriculum, instruction, or learner.
Problem Solving Process (Response to Intervention – RtI)
1. Define the Problem: What do we want students to KNOW and be able to DO?
2. Problem Analysis: Why can't they DO it?
3. Implement Plan: What are WE going to DO about it?
4. Evaluate: Did it WORK?
Step 1/Tier 1 Integrated Guided Questions Guiding Questions: Step 1 – Problem ID What do we expect our students to know, understand, and be able to do as a result of instruction? Do our students meet or exceed these expected levels? (How sufficient is the core?) Are there groups for whom core is not sufficient?
Full Option Graduates! Both domains focus on a common goal:
What do we expect our students to know, understand, and be able to do as a result of instruction?
Academics: NGSSS for all grade levels and content areas
Behavior: School-wide expectations; Character Education traits; school-wide social skills curricula; school/district mission statements
To effectively address student outcomes, schools must address both domains.
Are there groups for whom core is not sufficient? www.flrtib.org
Step 2/Tier 1 Integrated Guided Questions
Guiding Questions: Step 2 – Problem Analysis
If the core is NOT sufficient for either a "domain" or group of students, what barriers have precluded or could preclude students from reaching expected levels?
Instruction: Cognitive complexity of questions and tasks, gradual release of responsibility, appropriate scaffolding, connection to students' personal goals, interests, and life experiences
Curriculum: Alignment with standards and across grade/school levels, relevancy to students' personal goals, content, pacing, progression of learning, differentiation
Environment: Reward/consequence system, visual cues, climate/culture, quality of student/adult relationships, quality of peer relationships, high expectations for ALL students, collaboration and voice
Learner: Reinforcement preferences, perceptions of competence and control, perceived relevancy of instruction/education, integration and affiliation with school, academic/social-emotional skill development
Hypotheses and Data Source Examples
I – Hypothesis: Instruction did not include modeling and guided practice. Data sources: lesson plans, observations, report/survey data, permanent products
C – Hypothesis: Skills targeted in the lessons did not align with the NGSSS. Data sources: lesson plans, observations of tasks, assignments, and assessments
E – Hypothesis: The school-wide reinforcement program includes few developmentally appropriate reinforcement options. Data sources: review of school-wide behavior plan, student survey and student focus group feedback
L – Hypothesis: A substantial amount of instructional time is lost due to excessive absenteeism. Data sources: attendance, ODRs, suspensions
Reaching Expected Levels: If the core is NOT sufficient for either a "domain" or group of students, what barriers have precluded or could preclude students from reaching expected levels?
What potential barriers have precluded us from improving student outcomes? Lack of… Common Assessments Common Planning Ongoing Progress Monitoring Curriculum Mapping Aligned with NGSSS and Common Assessments Resource Availability Administrative Support Professional Development
Analyzing Identified Problems: Possible Data Sources for Analysis
I: Lesson plan review, instructional observations, survey data, permanent products
C: Lesson plans, observations of tasks, assignments, and assessments
E: Review of school-wide behavior plan, student survey and student focus group feedback, walk-through assessments, climate surveys, behavior plan/fidelity measures
L: Attendance, ODRs, suspensions, assessment of academic/social-emotional skill development
The school-wide reinforcement program IS NOT being implemented with fidelity…
Step 3/Tier 1 Integrated Guided Questions
Guiding Questions: Step 3 – Plan Development & Implementation
What strategies or interventions will be used? – What resources are needed to support implementation of the plan?
Planning for Step 4
How will sufficiency and effectiveness of the core be monitored over time? – How will the data be displayed?
How will fidelity of interventions be monitored over time?
How will fidelity of the problem solving process be monitored over time?
How will "good," "questionable," and "poor" responses to intervention be defined?
Tier 1 Interventions (Behavior)
Based on the function of the problem behavior:
– Teach the skill
– Reward the skill
– Consequate effectively
Disaggregating referrals by expectation, context, motivation, and administrative decision will help inform the possible function.
See www.flpbs.fmhi.usf.edu for examples.
How will sufficiency and effectiveness of the core be monitored over time? Common Assessment Example
Monitoring the Core (Behavior): Referrals per Day/Month www.flrtib.org
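Monitoring referrals per day per month normalizes raw ODR counts by the number of school days, so short months do not look artificially calm. A minimal sketch in Python; the referral dates, school-day counts, and function name are hypothetical illustrations, not data from the deck:

```python
from collections import Counter
from datetime import date

# Hypothetical office discipline referrals (ODRs), one date per referral.
referrals = [
    date(2023, 9, 5), date(2023, 9, 5), date(2023, 9, 18),
    date(2023, 10, 2), date(2023, 10, 9), date(2023, 10, 9),
    date(2023, 10, 23), date(2023, 11, 6),
]

# Hypothetical count of school days in each month.
school_days = {(2023, 9): 20, (2023, 10): 22, (2023, 11): 18}

def referrals_per_day_by_month(referrals, school_days):
    """Average referrals per school day, keyed by (year, month)."""
    by_month = Counter((d.year, d.month) for d in referrals)
    return {m: by_month.get(m, 0) / days for m, days in school_days.items()}

rates = referrals_per_day_by_month(referrals, school_days)
for (year, month), rate in sorted(rates.items()):
    print(f"{year}-{month:02d}: {rate:.2f} referrals per school day")
```

Plotting these monthly rates over the year gives the trend line a team would inspect for a declining pattern.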
How will fidelity be monitored over time? Fidelity of implementation is the delivery of instruction in the way in which it was designed to be delivered. Fidelity must also address the integrity with which screening and progress-monitoring procedures are completed and an explicit decision-making model is followed. Fidelity also applies to the problem solving process…bad problem solving can lead to bad decisions to implement otherwise good interventions.
Monitoring the Core (Behavior): Fidelity
Depends on the intervention!
– Lesson plans with built-in fidelity checklists
– Permanent products of lessons
– Token sign-out logs
– Counts of positive post cards
– Parent call logs
Implementation measures: surveys, focus groups, observations
How will "good," "questionable," and "poor" responses to intervention be defined? Decision Rules:
Positive Response
– Gap is closing
– Can extrapolate the point at which target student(s) will "come in range" of the target, even if this is long range
Questionable Response
– Rate at which the gap is widening slows considerably, but the gap is still widening
– Gap stops widening but closure does not occur
Poor Response
– Gap continues to widen with no change in rate
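The three decision rules above can be sketched as a small classifier over the gap between expected and observed performance. A minimal sketch, assuming at least three matched data points (e.g., fall/winter/spring) on a common scale; the function name and the simple comparison standing in for "slows considerably" are illustrative, not part of the deck:

```python
def classify_response(expected, observed):
    """Classify RtI response from matched expected/observed data points.

    A simplification of the slide's decision rules:
    - positive: the gap between expected and observed is closing
    - questionable: the gap is flat, or still widening but more slowly
    - poor: the gap keeps widening with no slowdown
    """
    gaps = [e - o for e, o in zip(expected, observed)]
    changes = [b - a for a, b in zip(gaps, gaps[1:])]  # gap change per interval
    if changes[-1] < 0:
        return "positive"      # gap is closing
    if changes[-1] == 0 or changes[-1] < changes[0]:
        return "questionable"  # widening stopped or slowed considerably
    return "poor"              # gap widens with no change in rate

# Hypothetical fall/winter/spring scores (expected vs. observed).
print(classify_response([50, 60, 70], [40, 52, 66]))  # gap 10, 8, 4: positive
print(classify_response([50, 60, 70], [40, 44, 52]))  # gap 10, 16, 18: questionable
print(classify_response([50, 60, 70], [40, 44, 48]))  # gap 10, 16, 22: poor
```

A real team would of course eyeball the charted trajectories rather than apply a hard rule, but making the rule explicit keeps decisions consistent across grade-level teams.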
Defining Adequate Response: Tier 1 for Behavior
– School-wide screenings (< 20% identified)
– ODRs by October (< 2 majors)
– Teacher nominations, ESE (EBD) referrals
– Declining trend in discipline data
– Attendance, tardies
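The first two indicators above are hard cut points, so they can be checked mechanically. A minimal sketch using the slide's example thresholds (< 20% identified by screening, < 2 major ODRs by October); the function name and return shape are hypothetical:

```python
def tier1_behavior_screen(pct_identified, major_odrs_by_october):
    """Return (adequate, reasons) for the two numeric Tier 1 cut points.

    pct_identified: fraction of students flagged by school-wide screening.
    major_odrs_by_october: major ODR count for a given student by October.
    """
    reasons = []
    if pct_identified >= 0.20:
        reasons.append("screening identified 20% or more of students")
    if major_odrs_by_october >= 2:
        reasons.append("2 or more major ODRs by October")
    return (not reasons, reasons)

ok, reasons = tier1_behavior_screen(0.15, 1)
print(ok)  # True: both cut points met
```

The softer indicators (teacher nominations, declining trends, attendance) still require team judgment and would not fit a simple threshold check like this.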
Step 4/Tier 1 Integrated Guided Questions
Guiding Questions: Step 4 – Plan Evaluation of Effectiveness
Have planned improvements to the core been effective?
Chart: Positive Response to Intervention – expected vs. observed performance, Fall/Winter/Spring. The gap is closing; one can extrapolate the point at which target student(s) will "come in range" of the target, even if this is long range.
Chart: Positive Response to Intervention – expected vs. observed trajectories over time.
Chart: Questionable Response to Intervention – expected vs. observed performance, Fall/Winter/Spring. The rate at which the gap is widening slows considerably, but the gap is still widening; or the gap stops widening but closure does not occur.
Chart: Questionable Response to Intervention – expected vs. observed trajectories over time.
Chart: Poor Response to Intervention – expected vs. observed performance, Fall/Winter/Spring. The gap continues to widen with no change in rate.
Chart: Poor Response to Intervention – expected vs. observed trajectories over time.
Decisions: What to do if RtI is Positive
– Continue intervention with current goal
– Continue intervention with goal increased
– Fade intervention to determine if student(s) have acquired functional independence
Decisions What to do if RtI is: Questionable – Was our DBPS process sound? – Was intervention implemented as intended? If no - employ strategies to increase implementation integrity If yes - – Increase intensity of current intervention for a short period of time and assess impact. If rate improves, continue. If rate does not improve, return to problem solving.
Decisions What to do if RtI is: Poor – Was our DBPS process sound? – Was intervention implemented as intended? If no - employ strategies to increase implementation integrity If yes - – Is intervention aligned with the verified hypothesis? (Intervention Design) – Are there other hypotheses to consider? (Problem Analysis) – Was the problem identified correctly? (Problem Identification)
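The decision logic across the Positive, Questionable, and Poor slides can be condensed into one small helper. A sketch only; the function name, argument labels, and return strings are illustrative paraphrases of the slides, not the deck's wording:

```python
def next_step(response, implemented_with_fidelity):
    """Suggest a next step given the RtI response and intervention fidelity.

    response: one of "positive", "questionable", "poor".
    implemented_with_fidelity: was the intervention delivered as intended?
    """
    if response == "positive":
        # Teams choose among these three; all keep or wind down the plan.
        return "continue, raise the goal, or fade the intervention"
    if not implemented_with_fidelity:
        # Applies to both questionable and poor responses.
        return "employ strategies to increase implementation integrity"
    if response == "questionable":
        return "briefly intensify the current intervention and assess impact"
    # Poor response despite good fidelity: revisit the earlier steps.
    return ("re-check intervention design, problem analysis, "
            "and problem identification")
```

Note the shared branch: for both questionable and poor responses, the fidelity question comes before any change to the intervention itself.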
We CANNOT Continue to Ignore the Data… Will we meet our goal of 100% by 2014?
FLDOE Race to the Top: Local Instructional Improvement System (LIIS) Minimum Standards
FLDOE identified nine component areas of a LIIS and specific requirements for each.
6. Analysis and Reporting – The system will leverage the availability of data about students, district staff, benchmarks, courses, assessments, and instructional resources to provide new ways of viewing and analyzing data.
8. Data Integration – The system will include or seamlessly share information about students, district staff, benchmarks, courses, assessments, and instructional resources to enable teachers, students, parents, and district administrators to use data to inform instruction and operational practices.
The Reciprocal Nature of Academic & Behavior Outcomes
Academics & behavior influence one another in a multitude of ways.
Systems & resources are being developed to support DBPS:
– RtI:B database
– Workgroup models & materials