Data-Based Decision Making

Reading Review: Stanovich (2010); Fuchs & Fuchs, Progress Monitoring

"It ain't so much the things we don't know that get us into trouble. It's the things we know that just ain't so." -Josh Billings, perhaps the second most famous American humor writer and lecturer of the second half of the 19th century, after Mark Twain

We Never Know for Sure... Even practices with the best research base may not work for some students. So if you are using a research-based intervention: implement & COLLECT DATA! And if you are struggling to identify a research-based intervention: implement & COLLECT DATA!

Critical Concept: Data-Based Decision Making. A continuous, purposeful process of collecting, interpreting, presenting, and using data to inform actions that support positive educational outcomes. Data-based decision making considers the learner's progress within the contexts of instruction, curriculum, and environment.

Necessary Components of Assessment. When a student is experiencing difficulty, several related & complementary types of assessment should be performed:
- Assessment of the Learner (Student)
- Assessment of the Instruction (or Intervention)
- Assessment of the Curriculum and Environment

Measuring ICE: Instruction, Curriculum, Environment. What questions might you have about the instruction/intervention or curriculum?
- Are the instructional/intervention methods research based?
- Implementation fidelity?
Is the classroom environment suitable to learning?
- Time on task, instructional time, academic engaged time
- Opportunities to respond & % correct responses
- Positive-to-negative ratio
- Student problem behavior

Models for Data-Based Decision Making: Problem Solving Models & Outcomes Driven Models

[PBIS elements diagram] OUTCOMES: supporting social competence & academic achievement. DATA: supporting decision making. SYSTEMS: supporting staff behavior. PRACTICES: supporting student behavior.

Outcomes Driven Model. In an Outcomes Driven Model, the bottom line is achievement of essential educational or social outcomes.
- What are the desired outcomes?
- Are students attaining the necessary skills to be successful?
- If not, what changes can we make?
- Are the changes increasing student progress?

Research-Based Frameworks Needed. How do we know what to measure & when?
- Reading: RTI & the Big 5 Ideas of Reading
- Math: RTI
- Behavior: PBIS, Function of Behavior & ABA

Big 5 Ideas of Reading:
- Reading Comprehension
- Vocabulary
- Oral Reading Fluency & Accuracy
- Phonics (Alphabetic Principle)
- Phonemic Awareness
(Stages of learning: Acquisition, Fluency, Maintenance & Generalization)

3. Accurately identify those who are on track and those who will need more support. We must identify struggling students BEFORE they fall too far behind. (Good, Simmons, & Smith, 1998)

Response to Intervention (circa 1996): Academic Systems & Behavioral Systems
- Intensive, Individual Interventions (1-5%): individual students; assessment-based; high intensity (academic) / intense, durable procedures (behavioral)
- Targeted Group Interventions (5-10%): some students (at-risk); high efficiency; rapid response
- Universal Interventions (80-90%): all students, all settings; preventive, proactive

Team Initiated Problem Solving (TIPS) Model. The full TIPS model has two parts: (1) implementation of Problem Solving Meeting Foundations, and (2) use of the problem solving process: Identify Problems, Develop Hypothesis, Discuss and Select Solutions, Develop and Implement Action Plan, Evaluate and Revise Action Plan, with Collect and Use Data at the center of the cycle.

Purposes of Assessment
- Screening: "Which students need more support?"
- Progress Monitoring: "Is the student making adequate progress?"
- Diagnostic: "What and how do we need to teach this student?"
- Outcome: "Has our instruction been successful?"

[Outcomes Driven Model cycle: Screening, Diagnostic, Progress Monitoring, Outcome]

Effective Data Collection

Use the right tools for the right job: Screening, Progress Monitoring, Diagnostic Assessment, Outcomes.

Use Good Tools: Technically Adequate. Reliability = Consistency: the extent to which an assessment is consistent in finding the same results across conditions (across different administrators, across time, etc.). If the same measure is given several times to the same person, their scores should remain stable & not randomly fluctuate.

Use Good Tools: Technically Adequate. Validity = the extent to which an assessment measures what it is supposed to measure. First we need to know what we should be measuring! (Research-based frameworks for measurement.) Students who do well on valid reading tests are proficient readers.
- Valid: assessing reading by having the student read a passage aloud and monitoring errors and rate.
- Not valid: assessing reading by having a student match printed letters on a page (this is an assessment of matching visual figures). Example item: draw a line to match the letters: A, f, U, p, w / w, E, A, f, I, U, v, B, p

Use Good Tools: A Concern for Self-Developed Assessments. Technical adequacy can be a problem with self-developed measures. This is a challenge for the Professional Learning Team model, which often relies on teacher-developed assessments to measure important student outcomes & guide decision making.

Low Inference. Students are tested using materials that are directly related to important instructional outcomes.
- Low inference: making judgments about a child's reading skills based on listening to them read aloud.
- High inference: making judgments about a child's emotional state based on pictures they've drawn.

Use the Tools Correctly: Standardized Administration. Administered, scored, and interpreted in the same way:
- Directions given to students are consistent
- Student responses are scored in the same way
- Every student has the exact same opportunity on the assessment

Efficiency. Time is precious in classrooms, so efficiency is an important consideration. When evaluating the efficiency of an assessment tool, we must consider the time & personnel required to design, administer, and score it:
- PNRTs: already designed; time intensive (1-2 hours/child)
- CBA & CBM: some already designed, some teacher-created; quick and easy (1-10 min/child)

Screening

1. Compare ALL students to the same grade-level standard. ALL students are assessed against the grade-level standard, regardless of instructional level. "If you don't know where you are going, you will wind up somewhere else." ~ Yogi Berra

2. Be efficient, standardized, reliable, and valid
- Robust indicator of academic health
- Brief and easy to administer
- Can be administered frequently
- Must have multiple, equivalent forms (if the metric isn't the same, the data are meaningless)
- Must be sensitive to growth

3. Accurately identify those who are on track and those who will need more support. We must identify struggling students BEFORE they fall too far behind. (Good, Simmons, & Smith, 1998)

4. Evaluate the quality of your schoolwide instructional system. Are 80% of your students proficient? Are 80% of students reaching benchmarks and "on track" for the next goal? If not, then the core curriculum needs to be addressed.
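The 80% check above amounts to a simple computation over screening scores; the scores and cut score below are hypothetical, not actual benchmarks from any published tool:

```python
# Schoolwide screening check: are at least 80% of students at benchmark?
# Scores and the cut score are illustrative; districts set their own.
scores = [52, 61, 48, 70, 55, 39, 66, 58, 44, 73]  # hypothetical winter ORF
benchmark = 47                                     # hypothetical cut score

pct_on_track = sum(s >= benchmark for s in scores) / len(scores) * 100
if pct_on_track < 80:
    print(f"Only {pct_on_track:.0f}% at benchmark -- examine core instruction")
else:
    print(f"{pct_on_track:.0f}% at benchmark -- core appears healthy")
```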

What are Screening Tools?
Screening tools:
- DIBELS Oral Reading Fluency
- Maze
- EasyCBM
- CBM Math Computation
- CBM Writing (Story Starters)
- CBM Algebra
- CBM Early Numeracy
Not screening tools:
- Quick Phonics Screener
- QRI-IV
- DRA2
- Running Records
- Report cards
- Meeting OAKS standards
- Core curriculum weekly tests on skills that were just taught

One Page of a 3-Page CBM in Math Concepts and Applications (24 Total Blanks)

Previous year's discipline data: Who needs to be on our radar from Day 1? Who had FBAs/BSPs last year? Which students moved on? Which are returning this year? Can we get data for our incoming class & new students? (Decision rule)

Progress Monitoring

Progress Monitoring Tools Brief & Easy Sensitive to growth Equivalent forms Frequent

Where are we? What is our goal? What course should we follow? How are we doing?
[Navigation diagram: Our Goal, Desired Course, We Are Here, Actual Course]
Notes: In the Northwest, boating is an important recreation and livelihood. Whether you are on a whale-watching tour or fishing, sometimes finding your way back to your port is easy: the sky is clear, the ocean blue, and you can clearly see your home port and the course you should follow to reach a safe harbor. But sometimes the fog rolls in, and our journey to our goal becomes much more difficult and challenging. It is hard to tell where we are, where we want to be, what course to follow, and whether we are getting closer to safety or need to make a course adjustment. So we turn on the GPS and ask where we are. Of course, knowing where we are is only of limited help; the great philosopher Buckaroo Banzai once commented, "No matter where you go, there you are!" We also need to know where the port, our safe harbor, is, and what course to follow to get there. The GPS can tell us to point the boat at 117 degrees and progress for 20 minutes at 10 knots to reach our goal. Now we have a good plan for reaching our safe harbor while avoiding the rocks and cliffs on either side. But sometimes our plans go awry, so we also need to check on our progress in time to make course corrections. If we are off course, the time to modify our plan is early, while we can still reach our safe harbor and not end up on the rocks.

Progress Monitoring: The GPS for Educators!

Purpose of Progress Monitoring Answers the question(s): Are the children learning? How can we tell? Are they making enough progress? Can we remove some of our supports? Do we need to change or intensify our supports?

How often do you progress monitor students? Determined by district decision rules and level of need. Best-practice recommendations:
- Intensive: 1-2 times per week
- Strategic: 1-2 times per month
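One way a district's decision rules might be written down as a lookup; the tier labels follow the slide, but the "benchmark" default line is an assumption added for completeness:

```python
# Monitoring frequency by support level, per the best-practice
# recommendations above. The "benchmark" entry is an assumed default.
def monitoring_frequency(level: str) -> str:
    schedule = {
        "intensive": "1-2 times per week",
        "strategic": "1-2 times per month",
        "benchmark": "screening only (3x per year)",
    }
    return schedule.get(level.lower(), "check district decision rules")

print(monitoring_frequency("Intensive"))
```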

How do we know if a student is making adequate progress? [Graph: correct words per minute over time, with decision rules]

Questions to Consider How many data points below the line before you make a change in instruction/intervention? What do you change? Group size? Time? Curriculum? Other factors?
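A common rule of thumb (not a universal standard; districts set their own decision rules) is to change the intervention after some number of consecutive data points fall below the aimline. A sketch, with hypothetical scores and aimline values:

```python
# Aimline decision rule: flag a needed change after `rule` consecutive
# data points fall below the aimline. Data are hypothetical.
def needs_change(scores, aimline, rule=4):
    run = 0
    for score, expected in zip(scores, aimline):
        run = run + 1 if score < expected else 0
        if run >= rule:
            return True
    return False

weekly_scores = [27, 31, 35, 30, 25, 32, 34, 38]  # hypothetical CWPM
aimline       = [28, 30, 32, 34, 36, 38, 40, 42]  # hypothetical goal line
print(needs_change(weekly_scores, aimline))
```

When the rule trips, the team still has to decide WHAT to change: group size, time, curriculum, or other factors.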

Progress Monitoring: Phonics for Reading. [Graph of weekly scores: 27, 31, 35, 30, 25, 32, 34, 38]
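For weekly scores like these, the rate of improvement is the slope of a least-squares line through the data points. This is a generic calculation, not specific to any particular progress monitoring tool:

```python
# Rate of improvement: least-squares slope (words per week) through
# the weekly progress monitoring scores.
scores = [27, 31, 35, 30, 25, 32, 34, 38]
weeks = range(len(scores))

n = len(scores)
mean_x = sum(weeks) / n
mean_y = sum(scores) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(weeks, scores)) \
        / sum((x - mean_x) ** 2 for x in weeks)
print(f"rate of improvement: {slope:.2f} words/week")
```

The slope can then be compared against the growth rate the aimline requires to judge whether progress is adequate.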

We do not use progress monitoring data to… …select specific short-term instructional goals …take a lot of time away from instruction …diagnose educational problems …assign grades to students …evaluate teachers

What are Progress Monitoring Tools?
Progress monitoring tools:
- DIBELS Oral Reading Fluency
- Maze
- EasyCBM
- CBM Math Computation
- CBM Writing (Story Starters)
- CBM Algebra
- CBM Early Numeracy
Not progress monitoring tools:
- Quick Phonics Screener
- QRI-IV
- DRA2
- Running Records
- Report cards
- Meeting OAKS standards
- Core curriculum weekly tests on skills that were just taught

Progress monitoring data tell us WHEN a change is needed. Progress monitoring data do not always tell us WHAT change is needed.

Point Card

Look at Individual Student graph for Targeted Student(s)

Diagnostic Assessment. Answers the question: Why? WARNING: critical thinking skills may be required.

Collecting Diagnostic Data The major purpose for administering diagnostic tests is to provide information that is useful in planning more effective instruction. Diagnostic tests should only be given when there is a clear expectation that they will provide new information about a child’s difficulties learning to read that can be used to provide more focused, or more powerful instruction.

Diagnostic Assessment Questions “Why is the student not performing at the expected level?” (Defining the Problem) “What is the student’s instructional need?” (Designing an Intervention)

Digging Deeper. In order to be "diagnostic":
- We need to know the sequence of skill development
- Content knowledge may need further development

Enabling Skills. Enabling skills are skills that could be considered prerequisites for proficient performance on larger assessment measures. They represent the sub-skills of higher-order performance. Deficiencies in enabling skills will often result in lower performance on assessments.

Phonemic Awareness Developmental Continuum (vital for the diagnostic process!), from easy to hard:
- Word comparison
- Rhyming
- Sentence segmentation
- Syllable segmentation and blending
- Onset-rime blending and segmentation
- Blending and segmenting individual phonemes
- Phoneme deletion and manipulation
If difficulty is detected at one level, check the level below it.

Reading: Diagnostic assessments may include:
- In-curriculum assessments: Quick Phonics Screener, weekly assessment data, unit and benchmark assessment data
- Survey Level Assessments
- Error Analysis or Running Records
- Any formal or informal assessment that answers the question: Why is the student having a problem?

Survey Level Assessment. Start at the expected level and move backward until specific skill deficits are identified, then match interventions to address those deficits. Example: a 2nd-grade math assignment, a double-digit math facts sheet (+, -, x, /), which the student cannot do. Progress backward in the assessment to see where the student can be successful: cannot do basic-facts division, multiplication, or double-digit subtraction or addition; can do single-digit addition to +5 successfully.
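The back-stepping procedure can be sketched as a loop over a skill hierarchy from hardest to easiest, stopping at the first skill the student performs to criterion. The skill names, accuracy figures, and the 90% mastery criterion below are all hypothetical:

```python
# Survey Level Assessment sketch: step backward through a skill
# hierarchy (hardest to easiest) until a mastery criterion is met.
# Skills, accuracies, and the criterion are hypothetical.
skill_sequence = [
    "double-digit division",
    "double-digit multiplication",
    "double-digit subtraction",
    "double-digit addition",
    "single-digit addition to +5",
]
student_accuracy = {
    "double-digit division": 0.10,
    "double-digit multiplication": 0.20,
    "double-digit subtraction": 0.40,
    "double-digit addition": 0.55,
    "single-digit addition to +5": 0.95,
}

MASTERY = 0.90  # assumed criterion
instructional_level = next(
    (skill for skill in skill_sequence if student_accuracy[skill] >= MASTERY),
    None,
)
print(f"begin intervention at: {instructional_level}")
```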

Error Analysis. Select a 250-word passage on which you estimate the student will be 80-85% accurate. Record the student's errors on your copy of the reading probe. Use at least 25 errors for students in grade 1 and at least 50 errors for students in grade 2 and above. Use an error analysis sheet to conduct the analysis.
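Checking whether a candidate passage lands in the 80-85% accuracy window is simple arithmetic; the word and error counts below are illustrative:

```python
# Error-analysis passage check: accuracy should land near 80-85%,
# hard enough to produce analyzable errors. Counts are illustrative.
def passage_accuracy(total_words: int, errors: int) -> float:
    return (total_words - errors) / total_words * 100

acc = passage_accuracy(total_words=250, errors=45)
print(f"accuracy: {acc:.0f}%")   # 205 correct of 250 words = 82%
print(80 <= acc <= 85)           # inside the target window
```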

Error Analysis

We do not use diagnostic data… …for all students …to monitor progress towards a long-term goal …to compare students to each other

Outcome. Was the goal reached? Often this is the same assessment as your screener. It can be a CBM, state testing (OAKS), or another high-stakes assessment, and it should be linked to district standards and benchmarks.