Linking Data to Instruction
Jefferson County School District
January 19, 2010

RTI Assessment Considerations
Measurement strategies are chosen to:
– Answer specific questions
– Make specific decisions
Give an assessment only with a "purpose" in mind; if you don't know why an assessment is being given, that is a problem.

Types of Assessments
1. Screening Assessments – Used with ALL students to identify those who may need additional support (DIBELS, CBM, office discipline referrals for behavior, etc.)
2. Formative Assessment/Progress Monitoring – Frequent, ongoing assessments that show whether instruction is effective and is building student skills (DIBELS, CBM, etc.)
3. Diagnostic Assessments – Pinpoint instructional needs for students identified through screening (Quick Phonics Screener, survey-level assessments, curriculum-based evaluation procedures, etc.)
All three are part of a single assessment process within RTI.

Universal Screening Assessments
Universal screening occurs for ALL students at least three times per year. Screening procedures identify which students are proficient (roughly 80%) and which are not (roughly 20%).
Good screening measures:
– Are reliable, valid, repeatable, brief, and easy to administer
– Are not intended to measure everything about a student, but provide an efficient and unbiased way to identify students who will need additional support (Tier 2 or Tier 3)
– Help you assess the overall health of your core program (are 80% of your students at benchmark/proficiency?)
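The "overall health of the core" question can be checked with a quick calculation. Below is a minimal Python sketch (not part of the original presentation; the student scores are hypothetical and the 77 wcpm benchmark is borrowed from the third-grade example later in this deck) showing how a team might compute the percentage of students at benchmark and compare it to the roughly 80% expectation.

    # Hypothetical sketch: is roughly 80% of the class at or above benchmark?
    def core_health(scores, benchmark, target=0.80):
        """Return the proportion at benchmark and whether the core meets the target."""
        at_benchmark = sum(1 for s in scores if s >= benchmark) / len(scores)
        return at_benchmark, at_benchmark >= target

    # Example fall ORF screening scores (words correct per minute) for one class.
    fall_orf = [58, 59, 90, 85, 90, 102, 74, 81, 95, 66]
    proportion, core_ok = core_health(fall_orf, benchmark=77)
    print(f"{proportion:.0%} at benchmark; core instruction healthy: {core_ok}")

In practice the benchmark and the 80% target come from the district's chosen screening tool and decision rules, not from this sketch.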

Why Use Fluency Measures for Screening?
– Oral reading fluency and accuracy in reading connected text are among the best indicators of overall reading comprehension (Fuchs, Fuchs, Hosp, & Jenkins, 2001).
– Always examine fluency AND accuracy; without accuracy scores, we are missing a big piece of the picture.
– Students must be accurate with any skill before they can be fluent.
– Oral reading fluency (ORF) does not tell you everything about a student's reading skill, but a child who cannot read fluently cannot fully comprehend written text and will need additional support.

Linking Screening Data to Instruction
Questions to consider:
– Are 80% of your students proficient based on set criteria (benchmarks, percentiles, standards, etc.)?
– If not, what are the common instructional needs (e.g., fluency, decoding, comprehension, multiplication, fractions, spelling, capitalization, punctuation)?
– What is your plan to meet these common instructional needs schoolwide or grade-wide? Improved fidelity to the core? More guided practice? More explicit instruction? Improved student engagement? More professional development for staff?

Progress Monitoring Assessments
Help us answer the question: is what we're doing working? Good progress monitoring measures:
– Are robust indicators of academic health
– Are brief and easy to administer
– Can be administered frequently
– Have multiple, equivalent forms (if the metric isn't the same from probe to probe, the data are meaningless)
– Are sensitive to growth
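One common way to answer "is what we're doing working?" is to compare a student's rate of growth on equivalent progress-monitoring probes to the rate needed to reach the goal. The sketch below is illustrative only (the weekly scores, the 77 wcpm goal, and the 18-week timeline are hypothetical, and actual decisions should follow district decision rules); it fits an ordinary least-squares trend line to weekly ORF scores.

    # Illustrative sketch: compare observed growth to the growth needed to reach a goal.
    def slope(xs, ys):
        """Ordinary least-squares slope of ys regressed on xs."""
        n = len(xs)
        mean_x, mean_y = sum(xs) / n, sum(ys) / n
        num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
        den = sum((x - mean_x) ** 2 for x in xs)
        return num / den

    weeks = [1, 2, 3, 4, 5, 6]
    wcpm = [42, 45, 44, 48, 51, 53]        # hypothetical weekly ORF probe scores
    needed = (77 - 42) / 18                # wcpm per week to reach 77 wcpm in 18 weeks

    observed = slope(weeks, wcpm)
    print(f"Observed growth: {observed:.1f} wcpm/week; needed: {needed:.1f} wcpm/week")
    print("On track" if observed >= needed else "Consider adjusting the intervention")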

Screening/Progress Monitoring Tools: Reading
DIBELS PSF, NWF
– Pros: free; quick and easy to administer; good research base; benchmarks; linked to instruction
– Cons: only useful in grades K-2
ORF (DIBELS, AIMSweb, etc.)
– Pros: free; good reliability and validity; easy to administer and score
– Cons: may not fully capture comprehension for a few students
MAZE
– Pros: quick to administer; may address comprehension more than ORF; can be administered to large groups simultaneously; useful at the secondary level
– Cons: time-consuming to score; not as sensitive to growth as ORF
OAKS
– Pros: already available; tied to state standards
– Cons: just passing isn't good enough; not linked directly to instruction; needs to be used in conjunction with other measures

Screening/Progress Monitoring Tools: Math
CBM Early Numeracy Measures
– Pros: good reliability and validity; brief and easy to administer
– Cons: limited sensitivity to growth; only useful in K-2
Math Fact Fluency
– Pros: highly predictive of which students will struggle
– Cons: no benchmarks; only a small piece of math screening
CBM Computation
– Pros: quick and easy to administer; sensitive to growth; face validity
– Cons: predictive validity questionable; not linked to current standards
CBM Concepts and Applications
– Pros: quick and easy to administer; good predictive validity; linked to NCTM Focal Points (AIMSweb)
– Cons: not highly sensitive to growth; newer measures
easyCBM
– Pros: based on NCTM Focal Points; computer-based administration and scoring
– Cons: untimed (does not account for fluency); lengthy (administer no more than once every 3 weeks); predictive validity uncertain

Screening/Progress Monitoring Tools: Writing
CBM Writing
– Pros: easy to administer to large groups; multiple scores can be obtained from a single probe
– Cons: time-consuming to score; does not directly measure the content of the writing
– Correct Writing Sequences (CWS, %CWS)
  Pros: good reliability and validity; sensitive to growth at some grade levels
  Cons: time-consuming to score; not as sensitive to growth in upper grades; %CWS not sensitive to growth
– Correct Minus Incorrect Writing Sequences (CIWS)
  Pros: good reliability and validity; sensitive to growth in upper grades
  Cons: time-consuming to score; not sensitive to growth in lower grades

Screening & Progress Monitoring Resources
– National Center on Response to Intervention
– Intervention Central
– AIMSweb
– DIBELS
– easyCBM
– The ABCs of CBM (Hosp, Hosp, & Howell, 2007)

Diagnostic Assessments
The major purpose of administering diagnostic tests is to provide information that is useful in planning more effective instruction. Diagnostic tests should be given only when there is a clear expectation that they will provide new information about a child's difficulties in learning to read, information that can be used to provide more focused or more powerful instruction.

Diagnostic Assessment Questions
– "Why is the student not performing at the expected level?"
– "What is the student's instructional need?"
Start by reviewing existing data.

Diagnostic Assessments
– Quick Phonics Screener (Hasbrouck)
– DRA
– Error analysis
– Survey-level assessments
– In-program assessments (mastery tests, checkouts, etc.)
– Curriculum-based evaluation procedures: "any set of measurement procedures that use direct observation and recording of a student's performance in a local curriculum as a basis for gathering information to make instructional decisions" (Deno, 1987)
– Any informal or formal assessment that answers the question: why is the student having problems?

The Problem-Solving Model
1. Define the problem: What is the problem and why is it happening?
2. Design the intervention: What are we going to do about the problem?
3. Implement and monitor: Are we doing what we intended to do?
4. Evaluate effectiveness: Did our plan work?

Using the Data to Inform Interventions
What is the student missing? What do your data tell you? Start with what you already have, and ask, "Do I need more information?"
Areas to consider: phonemic awareness, phonics, fluency & accuracy, vocabulary, comprehension.

Using Your Data to Create Interventions: An Example

Organizing Fluency Screening Data: Making the Instructional Match
– Group 1: Accurate and Fluent
– Group 2: Accurate but Slow Rate
– Group 3: Inaccurate and Slow Rate
– Group 4: Inaccurate but High Rate
Regardless of the skill focus, organizing student data by looking at accuracy and fluency will assist teachers in making an appropriate instructional match.

Digging Deeper with Screening Data
Is the student accurate?
– You must define an accuracy expectation; the consensus in reading research is 95%.
Is the student fluent?
– You must define a fluency expectation.
Fluency measuring tools:
– Curriculum-based measures (CBM)
– AIMSweb (grades 1-8)
– Fuchs reading probes (grades 1-7)
– DIBELS (grades K-6)

Organizing Fluency Data: Making the Instructional Match
– Group 1 (Accurate and Fluent): Continue core instruction and dig deeper in the areas of reading comprehension, including vocabulary and specific comprehension strategies.
– Group 2 (Accurate but Slow Rate): Build reading fluency skills (repeated reading, paired reading, etc.). Embed comprehension checks/strategies.
– Group 3 (Inaccurate and Slow Rate): Conduct an error analysis to determine the instructional need. Teach to that need, paired with fluency-building strategies. Embed comprehension checks/strategies.
– Group 4 (Inaccurate but High Rate): Use the table-tap method. If the student can correct errors easily, teach the student to self-monitor reading accuracy. If the reader cannot self-correct errors, complete an error analysis to determine the instructional need, then teach to that need.
In short: Group 1 = core instruction with comprehension checks; Group 2 = add fluency building; Group 3 = add decoding, then fluency; Group 4 = self-monitoring.

Data Summary: 3rd Grade Class, Fall DIBELS (ORF benchmark: 77 wcpm or higher)
Student | Accuracy | WCPM
Jim     | 97%      | 58
Nancy   | 87%      | 59
Ted     | 89%      | 90
Jerry   | 98%      | 85
Mary    | 99%      | 90

Activity
– Group 1: Accurate and Fluent
– Group 2: Accurate but Slow Rate
– Group 3: Inaccurate and Slow Rate
– Group 4: Inaccurate but High Rate
ACTIVITY: Based on the criteria for the grade level, place each student's name into the appropriate group. Organizing data by performance helps in grouping students for instructional purposes. Students who do not perform well on comprehension tests have a variety of instructional needs.

Match the Student to the Appropriate Group (criteria: >95% accuracy and at least 77 wcpm)
– Group 1 (Accurate and Fluent): Jerry, Mary
– Group 2 (Accurate but Slow Rate): Jim
– Group 3 (Inaccurate and Slow Rate): Nancy
– Group 4 (Inaccurate but High Rate): Ted
Student | Accuracy | WCPM
Jim     | 97%      | 58
Nancy   | 87%      | 59
Ted     | 89%      | 90
Jerry   | 98%      | 85
Mary    | 99%      | 90
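For a whole grade level, the same sorting can be scripted. The following Python sketch is an illustration, not part of the original activity; it applies the 95% accuracy and 77 wcpm criteria from the slides to the five students above and prints each student's group.

    # Sketch: sort students into the four accuracy/fluency groups.
    def fluency_group(accuracy, wcpm, acc_cut=0.95, rate_cut=77):
        accurate = accuracy > acc_cut      # criterion from the slide: >95% accuracy
        fluent = wcpm >= rate_cut          # criterion from the slide: ORF goal of 77 wcpm
        if accurate and fluent:
            return "Group 1: Accurate and Fluent"
        if accurate:
            return "Group 2: Accurate but Slow Rate"
        if fluent:
            return "Group 4: Inaccurate but High Rate"
        return "Group 3: Inaccurate and Slow Rate"

    students = {
        "Jim": (0.97, 58),
        "Nancy": (0.87, 59),
        "Ted": (0.89, 90),
        "Jerry": (0.98, 85),
        "Mary": (0.99, 90),
    }

    for name, (accuracy, rate) in students.items():
        print(f"{name}: {fluency_group(accuracy, rate)}")

Running this reproduces the grouping shown above: Jerry and Mary in Group 1, Jim in Group 2, Nancy in Group 3, and Ted in Group 4.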

Regardless of the Skill…
The same accuracy-and-fluency lens applies to:
– Phonemic awareness
– Letter naming
– Letter sounds
– Beginning decoding skills
– Sight words
– Addition
– Subtraction
– Fractions

Instructional "Focus" Continuum
1. Is the student accurate at the skill? If no, teach the skill. If yes, move to fluency.
2. Is the student fluent at the skill? If no, teach fluency/automaticity. If yes, move to application.
3. Is the student able to apply the skill? If no, teach application. If yes, move to a higher-level skill/concept.
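The continuum above is essentially a small decision procedure. As a hedged illustration (the function name and boolean inputs are hypothetical, not from the presentation), it could be expressed like this:

    # Sketch of the focus continuum: accuracy, then fluency, then application.
    def instructional_focus(accurate, fluent, can_apply):
        if not accurate:
            return "Teach the skill (build accuracy)"
        if not fluent:
            return "Build fluency/automaticity"
        if not can_apply:
            return "Teach application of the skill"
        return "Move to a higher-level skill/concept"

    print(instructional_focus(accurate=True, fluent=False, can_apply=False))
    # -> Build fluency/automaticity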

Digging Deeper
In order to be "diagnostic":
– Teachers need to know the sequence of skill development.
– Content knowledge may need further development.
– How deep you dig depends on the intensity of the problem.

Phonemic Awareness Developmental Continuum (from easiest to hardest)
– Word comparison
– Rhyming
– Sentence segmentation
– Syllable segmentation and blending
– Onset-rime blending and segmentation
– Blending and segmenting individual phonemes
– Phoneme deletion and manipulation
If difficulty is detected at one level, check the earlier (easier) skills on the continuum. Knowing this continuum is vital for the diagnostic process.

Screening Assessments: Not Always Enough
Screening assessments do not always go far enough in answering the question of why a student is struggling. We will need to "dig deeper" with tools such as:
– Quick Phonics Screener
– Error analysis
– Curriculum-based evaluation

Tier 1 Meetings: When does this happen?
– How frequent: 2-3 times per year (after benchmarking/screening occurs)
– How long: 1-2 hours per grade level
– Who attends: all grade-level teachers, SPED teacher, principal, Title staff, specialists, instructional coach
– Focus: talk about schoolwide data; evaluate the health of the core and needed adjustments for ALL students
– Data used: screening

Tier 2 Meetings: When does this happen?
– How frequent: every 4-6 weeks (by grade level)
– How long: 30-45 minutes
– Who attends: all grade-level teachers, SPED teacher, principal, Title teacher, specialists, instructional coach
– Focus: talk about intervention groups; adjust, continue, or discontinue interventions based on district decision rules
– Data used: screening, progress monitoring, sometimes diagnostic

Tier 3 (Individual Problem-Solving) Meetings: When does this happen?
– How frequent: as needed, based on individual student need and district decision rules
– How long: 30-60 minutes
– Who attends: general education teacher, SPED teacher, principal, specialists, school psychologist, instructional coach, parents
– Focus: problem-solve individual student needs; design individualized interventions using data
– Data used: screening, progress monitoring, and diagnostic

Useful Resources
– What Works Clearinghouse
– Florida Center for Reading Research
– National Center on Response to Intervention
– Center on Instruction
– Oregon RTI Project
– Curriculum-Based Evaluation: Teaching and Decision Making (Howell & Nolet, 2000)
– The ABCs of CBM (Hosp, Hosp, & Howell, 2007)