Reading Goals and Reading Growth A Proposal for Cohort A


Reading Goals and Reading Growth A Proposal for Cohort A Oregon Reading First Center Oregon Reading First Cohort A Leadership Session March 3, 2006 © 2006 by the Oregon Reading First Center Center on Teaching and Learning

Purpose of the Presentation
- Consider two ways to think about student reading goals with our DIBELS data:
  - Benchmark Goals: e.g., 90 words per minute at the end of Grade 2 on ORF
  - Individualized Performance Goals: e.g., 75 words per minute at the end of Grade 2 on ORF
- Present the rationale for these complementary goal systems
- Outline how we might set student goals in the fall using this type of approach

How We Currently Think About Reading Goals and Adequate Reading Progress or Growth

Reaching Benchmark Goals in 1 Year
- The probability may be very low for some students, especially in the higher grades (e.g., Grades 2 and 3)
- Important differences in REAL student growth may occur yet NOT result in a reduction of risk status (e.g., Intensive to Strategic, or Strategic to Benchmark)
- More sensitive ways to set goals for some students may lead to greater achievement and a more accurate way to acknowledge school success

What do we know from research about setting student achievement goals?
- Setting goals leads to better student achievement
- Ambitious goals are better than meager goals
- When student progress is monitored toward measurable goals, achievement is enhanced
- When data decision rules are used in conjunction with progress-monitoring data, achievement is optimal

Our Current System of Defining Adequate Progress Can Be Improved
A real example of data from our Cohort A Reading First schools can help frame the issue.

Mean Rates of Progress for All 2nd Grade Students, Based on Fall Level of Performance
[Figure: Oral Reading Fluency goals]

Mean Rates of Progress of Those 2nd Grade Students Most At-Risk in the Fall
Means for all student groups are in the High Risk category at the beginning of the year AND at the end of the year.

Strong Rates of Growth in High-Performing Schools
- Schools in the upper quartile of the DIBELS Data System (literally thousands of schools are represented)
- These schools were successful with 2nd grade Intensive students
- Data from these types of schools can help us set realistic and challenging Individualized Performance Goals

Growth of Students Most At-Risk in High-Performing Schools
[Figure: mean growth of the top third, middle third, and lowest third]

Goal Setting Proposal for Cohort A, 2006-2007
Use fall screening data to set one of two types of goals for all students, K-3:
- For Benchmark students: the goal will be the Benchmark Goal or higher in the spring
- For Strategic and Intensive students: the goal will be EITHER the Benchmark Goal or an Individualized Performance Goal in the spring
  - The decision will be based on the probability of attaining the Benchmark Goal
  - Probabilities will be determined using high-performing schools in the DIBELS Data System and Oregon Reading First schools
  - Individualized Performance Goals will likewise be based on high-performing schools in the DIBELS Data System and Oregon Reading First schools
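The decision rule above can be sketched in code. This is a minimal illustration, not the ORFC's actual procedure: the 0.5 probability cut-point, the function name, and the Grade 2 example values are my own assumptions; only the 90 wpm benchmark and the Benchmark/Strategic/Intensive categories come from the proposal.

```python
# Hypothetical sketch of the Cohort A goal-setting decision rule.
# The probability threshold (0.5) is an illustrative assumption;
# the proposal says only that probabilities would come from
# high-performing schools' data.

BENCHMARK_GOAL_ORF_G2 = 90  # spring Grade 2 ORF benchmark (wpm)

def assign_goal(risk_status, p_reach_benchmark, individualized_goal):
    """Return the spring ORF goal (wpm) for one Grade 2 student.

    risk_status: 'benchmark', 'strategic', or 'intensive' (fall screening)
    p_reach_benchmark: estimated probability of reaching the benchmark
    individualized_goal: fallback Individualized Performance Goal (wpm)
    """
    if risk_status == "benchmark":
        # Benchmark students keep the Benchmark Goal (or higher)
        return BENCHMARK_GOAL_ORF_G2
    # Strategic/Intensive: choose goal type by attainment probability
    if p_reach_benchmark >= 0.5:  # assumed cut-point
        return BENCHMARK_GOAL_ORF_G2
    return individualized_goal

print(assign_goal("benchmark", 0.9, 75))   # 90
print(assign_goal("intensive", 0.2, 60))   # 60
```

A real implementation would replace the fixed cut-point with the empirically derived probabilities the proposal describes.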

Hypothetical Example of Setting Benchmark and Individualized Performance Goals with 5 Grade 2 Students
[Figure: five students (S1-S5) plotted against the Benchmark Goal of 90 wpm or above; Individualized Performance Goals of 88, 86, and 60 wpm]

Hypothetical Performance of 3 Intensive Students with Similar Fall ORF Scores
[Figure]

Analyzing Patterns of Growth May Help Us Distinguish Between Student-Level and Systems-Level Issues
[Figure: Alex, Ben, and Lori in School A; Xavier, Jordan, and Cassi in School B; each plotted against a growth goal line]
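The "growth goal line" in the slide above is commonly computed as a straight line from a student's fall score to the spring goal. This sketch assumes that standard aimline formula; the function name and the example numbers (30 wpm start, 30 weeks) are illustrative, not from the presentation.

```python
# Minimal sketch of a growth goal line (aimline) for progress monitoring.
# Assumes the conventional straight-line formula:
#   slope = (spring goal - fall score) / number of weeks

def goal_line(fall_score, spring_goal, n_weeks):
    """Expected score for each week, on a line from fall score to goal."""
    slope = (spring_goal - fall_score) / n_weeks  # wpm gained per week
    return [round(fall_score + slope * w, 1) for w in range(n_weeks + 1)]

# A Grade 2 student starting at 30 wpm, aiming for 90 wpm over 30 weeks:
line = goal_line(30, 90, 30)
print(line[0], line[15], line[30])  # 30.0 60.0 90.0
```

Plotting each student's weekly scores against this line is what lets a team separate a student-level problem (one student below the line) from a systems-level one (most students in a school below it).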

DIBELS Benchmark Goals

Grade          Winter Goal    Spring Goal
Kindergarten   PSF >= 18      NWF >= 25
Grade 1        NWF >= 50      ORF >= 40
Grade 2                       ORF >= 90
Grade 3                       ORF >= 110
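For screening many students at once, the goals in the table above can be held in a simple lookup. The data structure and function name here are my own; the measure names and cut scores are the ones on the slide.

```python
# Sketch of a benchmark lookup built from the slide's goal table.
# Keys are (grade, season); values are (measure, minimum score).

BENCHMARKS = {
    ("K", "winter"): ("PSF", 18),
    ("K", "spring"): ("NWF", 25),
    ("1", "winter"): ("NWF", 50),
    ("1", "spring"): ("ORF", 40),
    ("2", "spring"): ("ORF", 90),
    ("3", "spring"): ("ORF", 110),
}

def meets_benchmark(grade, season, measure, score):
    """True/False if a goal exists for this grade/season/measure, else None."""
    goal = BENCHMARKS.get((grade, season))
    if goal is None or goal[0] != measure:
        return None  # no goal defined for this combination
    return score >= goal[1]

print(meets_benchmark("2", "spring", "ORF", 95))  # True
```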

Concept of Individualized Performance Goals for Some Students

Grade          Intermediate Goals          Year-End Goals
Kindergarten   PSF = X (individualized)    NWF = X (individualized)
Grade 1                                    ORF = X (individualized)
Grade 2
Grade 3

Next Steps: What the ORFC Will Do by the Summer
- Determine realistic and challenging Individualized Performance Goals for students who are NOT likely to meet spring DIBELS Benchmark Goals
  - Based on high-performing schools nationwide and Oregon Reading First schools
- Determine child, school, and other factors that may influence attainment of Individualized Performance Goals
  - e.g., percent of Intensive students in the school; English Language Learner status

What the ORFC Will Do by the Beginning of School in the Fall
- Establish specific criteria for schools to be considered high performing, average performing, or low performing on the primary DIBELS reading measures (i.e., NWF and ORF)
  - Criteria set by risk status, i.e., Intensive, Strategic, and Benchmark students
- Establish school-based goals for performance on the SAT-10 and OSAT
  - Based on prior performance and national Reading First goals
  - Establish performance criteria for high-, average-, and low-performing schools

What the ORFC Will Do as Soon as Your Fall Data Are Collected
- Provide Individualized Performance Goals for students who are not likely to meet DIBELS Benchmark Goals
- Link fall data to the criteria for schools to be high performing, average performing, or low performing