Putting the “R” in RtI: Assessing Student Responsiveness through Norming, Screening and Progress Monitoring Summer RtI Institute July 30-31st, 2007 Amanda Albertson, M. A. Courtney LeClair, M. A. Stephanie Schmitz, Ed.S.

Agenda
Assessment: Curriculum-Based Measurement
Norming: Uses; Strengths & Limitations; Procedures and Tips
Screening: Choosing a measure; Decisions
Progress Monitoring: Procedures; Data examples; Decisions
RtI and Special Education Placement

Direct Assessment of Academic Skills Curriculum-Based Measurement (CBM) Contents of the assessment are based on the instructional curriculum. Measures are presented in a standardized format. Material for assessment is controlled for difficulty by grade levels. Measures are generally brief. Shapiro, E. S. (2004). Academic skills problems: Direct assessment and intervention (3rd ed.). New York: The Guilford Press.

Curriculum Based Measurement (cont.) Advantages Can be used efficiently by teachers Produces accurate, meaningful information to index growth Answers questions about the effectiveness of programs in producing academic growth Provides information to help teachers plan better instructional programs Fuchs, L. S., & Fuchs, D. (1997). Use of curriculum-based measurement in identifying students with disabilities. Focus on Exceptional Children, 30(3), 1-15.

Norming (a.k.a. Obtaining Normative Data)

Normative Data “Provide information on student levels and range of performance at different grades, by indexing achievement cross-sectionally” Provide “appropriate standards for weekly rates of academic growth” Fuchs, L. S., & Fuchs, D. (1993). Formative evaluation of academic progress: How much growth can we expect? School Psychology Review, 22(1), 1-30.

Uses of Local Normative Data Make decisions about referred students Report individual and/or group scores to teachers, parents, or other agencies Identify students proactively who aren’t keeping up with peers or benchmarks Detect academic and behavioral trends over time Bollman, K. & Johnson, C. Used with permission from FSDS.org. Based on Stewart, L. H., & Kaminski, R. (2002). Best practices in developing local norms for academic problem-solving. In A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology IV (pp. 737-752). Bethesda, MD: NASP.

Strengths of Local Normative Data Decrease the likelihood of bias in decision making Provide meaningful comparison group Promote identification of educational needs in a systematic problem-solving orientation Follow changing patterns of local performance Clear expectations of what is expected and ranges in performance Bollman, K. & Johnson, C. Used with permission from FSDS.org. Based on Stewart, L. H., & Kaminski, R. (2002). Best practices in developing local norms for academic problem-solving. In A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology IV (pp. 737-752). Bethesda, MD: NASP.

Limitations of Local Normative Data Threat of Misinterpretation Sample & measurement tasks must be defined Small sample can cause the norms to be unstable Local performance is not necessarily acceptable May use empirically derived benchmark rates to determine if students’ performance is acceptable Local norms may not necessarily advocate the use of certain curricula Norms show level of performance and rate of growth in curricula Bollman, K. & Johnson, C. Used with permission from FSDS.org. Based on Stewart, L. H., & Kaminski, R. (2002). Best practices in developing local norms for academic problem-solving. In A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology IV (pp. 737-752). Bethesda, MD: NASP.

Steps in Developing Local Norms 1. Identify norm sample 2. Choose materials 3. Decide who and how many students will be assessed 4. Collect the data 5. Organize the data for use Bollman, K. & Johnson, C. Used with permission from FSDS.org. Based on Stewart, L. H., & Kaminski, R. (2002). Best practices in developing local norms for academic problem-solving. In A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology IV (pp. 737-752). Bethesda, MD: NASP.

1. Identify norm sample 3 Basic Levels: Classroom, School-Building, School-District Consider… Decisions for which data shall be used Amount of curriculum chaos in the district Political and economic structure of the area Characteristics of the population Economic and other resources available Bollman, K. & Johnson, C. Used with permission from FSDS.org. Based on Stewart, L. H., & Kaminski, R. (2002). Best practices in developing local norms for academic problem-solving. In A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology IV (pp. 737-752). Bethesda, MD: NASP.

2. Choosing Norming Measurement Tools Tools should… Be reliable Be accurate Have relatively normal distributions Be sensitive to change Provide enough opportunities to respond (limit ceiling effects) Have standardized administration and scoring Reliably differentiate student level of skill Be time efficient Be affordable Provide data important to general education expectations Bollman, K. & Johnson, C. Used with permission from FSDS.org. Based on Stewart, L. H., & Kaminski, R. (2002). Best practices in developing local norms for academic problem-solving. In A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology IV (pp. 737-752). Bethesda, MD: NASP.

Examples of Norming Measurement Tools Dynamic Indicators of Basic Early Literacy Skills (DIBELS; http://dibels.uoregon.edu/) Reading K-6 Spanish and English Aimsweb (www.aimsweb.com) Math Written Expression K-8

3. Implement a Sampling Plan Balance the resources available, representativeness of the sample, and the information desired Some questions can be answered without testing every child every year Your questions should drive the sampling plan! Bollman, K. & Johnson, C. Used with permission from FSDS.org. Based on Stewart, L. H., & Kaminski, R. (2002). Best practices in developing local norms for academic problem-solving. In A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology IV (pp. 737-752). Bethesda, MD: NASP.

Implement a Sampling Plan Classroom Norms Minimum of 7-10 students Selected randomly (every nth student on list) Selected randomly from a pool of “typical students” Building Norms Minimum of 15-20% of students in each grade Minimum of 20 students per grade Selected randomly To compute percentile ranks, a minimum of 100 students per grade is needed District Norms Random sample of 100 students per grade Bollman, K. & Johnson, C. Used with permission from FSDS.org. Based on Stewart, L. H., & Kaminski, R. (2002). Best practices in developing local norms for academic problem-solving. In A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology IV (pp. 737-752). Bethesda, MD: NASP.
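
The "every nth student" selection described above can be sketched in a few lines of Python. The roster, class size, and sample size below are invented for illustration, not prescribed by the sampling guidelines.

```python
import random

def systematic_sample(roster, n_needed):
    """Select roughly every nth student from a roster (illustrative helper)."""
    step = max(1, len(roster) // n_needed)   # the "n" in every-nth-student
    start = random.randrange(step)           # random starting point in the list
    return roster[start::step][:n_needed]

roster = [f"Student {i}" for i in range(1, 26)]   # a hypothetical 25-student class
print(systematic_sample(roster, 8))               # 8 students, evenly spaced through the roster
```

The same idea scales up to building or district norms with the larger minimums (20 students per grade; 100 per grade if percentile ranks are needed).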

National vs. Local Norms National norms require less time and effort Don’t have to collect normative data National norms are readily accessible Local norms are more representative of your population Local norms are more sensitive Local norms allow you to choose the materials that are most appropriate to your building/district

4. Collect the Data Trimester norming (Fall, Winter, Spring) Use equivalent but not identical materials each time Prepare student and examiner materials ahead of time Examiners should be trained to administer and score Determine suitable locations for testing Determine appropriate dates for testing Bollman, K. & Johnson, C. Used with permission from FSDS.org. Based on Stewart, L. H., & Kaminski, R. (2002). Best practices in developing local norms for academic problem-solving. In A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology IV (pp. 737-752). Bethesda, MD: NASP.

5. Organize Data for Use Data Can be Summarized at Four Levels Individual student raw scores Classroom ranges of scores, medians, and rank orderings Building ranges of scores, medians, rank orderings, and percentile ranks District ranges of scores, descriptive statistics, within grade frequency distributions, percentile ranks, and across grade comparisons Bollman, K. & Johnson, C. Used with permission from FSDS.org. Based on Stewart, L. H., & Kaminski, R. (2002). Best practices in developing local norms for academic problem-solving. In A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology IV (pp. 737-752). Bethesda, MD: NASP.
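
As a minimal sketch of the classroom-level summary (ranges, medians, rank orderings), assuming a handful of made-up oral reading fluency scores:

```python
import statistics

# Hypothetical classroom scores (words read correctly per minute).
scores = {"Student A": 3, "Student B": 5, "Student C": 6, "Student D": 8}
values = sorted(scores.values())

print("range:", (values[0], values[-1]))                       # lowest and highest score
print("median:", statistics.median(values))
print("rank order:", sorted(scores, key=scores.get, reverse=True))
```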

Computing Percentile Ranks 1. Construct a frequency distribution of the raw scores 2. For a given raw score, determine the cumulative frequency for all scores lower than the score of interest 3. Add half the frequency for the score of interest to the cumulative frequency value determined in Step 2 4. Divide the total by N, the number of examinees in the norm group, and multiply by 100 Crocker, L., & Algina, J. (1986). Introduction to classical and modern test theory. New York: Holt, Rinehart and Winston.
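
The four steps translate directly into code. This sketch uses a small, invented score set:

```python
from collections import Counter

def percentile_rank(scores, score):
    freq = Counter(scores)                                    # Step 1: frequency distribution
    cf_below = sum(f for s, f in freq.items() if s < score)   # Step 2: cumulative frequency below
    half_at = freq[score] / 2                                 # Step 3: half the frequency at the score
    return (cf_below + half_at) / len(scores) * 100           # Step 4: divide by N, multiply by 100

orf_scores = [3, 5, 6, 8, 9, 11, 13, 14, 15, 16, 17, 18, 19]  # hypothetical norm group
print(round(percentile_rank(orf_scores, 11), 1))
```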

Organizing Data for Use

Universal Screening

Universal Screening A classroom-wide, school-wide, or district-wide assessment in which all students are assessed to identify those who are at risk for academic failure or behavioral difficulties and who could potentially benefit from specific instruction or intervention. National Association of State Directors of Special Education, Inc. (2005). Response to Intervention: Policy considerations and implementation. New York, NY: The Guilford Press.

Choosing a Screening Measure Compatibility with local service delivery needs Alignment with constructs of interest Theoretical and empirical support Population fit Practical to administer Glover, T. A., & Albers, C. A. (2007). Considerations for evaluating universal screening assessments. Journal of School Psychology, 45, 117-135.

Choosing a Screening Measure Appropriately standardized for use with the target population Consistent in measurement Accurate in its identification of individuals at risk

Examples of Screening Measures CBM Dynamic Indicators of Basic Early Literacy Skills (DIBELS; http://dibels.uoregon.edu/) Aimsweb (www.aimsweb.com) Teacher recommendations Classroom assessments National assessments (e.g., MAT) Report card rubrics

Pre-Screening Procedures with CBM 1. Decide who will conduct the screening. 2. Ensure that the individuals who are administering the screening have been trained in using the chosen CBM materials. 3. Organize CBM materials (e.g., make sure there are enough, write student names on them, etc.). 4. Decide whether to use local or national (published) norms to determine which students need additional academic assistance. 5. Ensure that you give the type of probe recommended for that specific grade level and time of year.

Possible DIBELS probes Example of DIBELS chart

CBM Screening Tips Reading measures need to be administered individually; it is best to have several administrators and to bring entire classrooms into a central location at one time. Math and writing probes can be administered to students as a group, so administer these to entire classrooms. It is also helpful to prepare materials ahead of time so that each student has their own copy with their name on it.

Post-Screening Procedures 1. Enter student scores into a computer program (e.g., Excel) that can easily sort the data. 2. Sort the data so that students are rank-ordered. 3. Determine which students fell below the previously specified cut-off.
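
Steps 1-3 amount to a sort and a filter. Any spreadsheet can do this, but as a Python sketch (the student names and the cut-off of 10 are invented for illustration):

```python
# Step 1: scores entered as a simple mapping (stand-in for an Excel sheet).
scores = {"Student A": 3, "Student G": 11, "Student C": 6, "Student K": 15}
cutoff = 10   # hypothetical previously specified cut-off

# Step 2: rank-order students from lowest to highest score.
ranked = sorted(scores.items(), key=lambda item: item[1])

# Step 3: flag students below the cut-off for additional assistance.
below_cutoff = [name for name, score in ranked if score < cutoff]
print(below_cutoff)
```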

Example Spreadsheet

Student      Median ORF
Student A    3
Student B    5
Student C    6
Student D    8
Student E    —
Student F    9
Student G    11
Student H    —
Student I    13
Student J    14
Student K    15
Student L    16
Student M    17
Student N    18
Student O    19

Screening Results Example

Screening Decisions Students who fall below pre-specified cutoff Based on scores, supporting documentation, and prior knowledge of student abilities, determine the necessary educational intervention. Decide who is going to implement the intervention(s). Decide who is going to monitor student progress over time.

Progress Monitoring

Progress Monitoring The practice of assessing students to determine if academic or behavioral interventions are producing desired effects. Provides critical information about student progress that is used to ensure the use of effective educational practices and to verify that students are progressing at an adequate rate. National Association of State Directors of Special Education, Inc. (2005). Response to Intervention: Policy considerations and implementation. New York, NY: The Guilford Press.

Progress Monitoring Those students who did not make the screening cutoff will be monitored on a frequent (generally once per week) basis. It is recommended that the same form of CBM be used for screening and progress monitoring. Use the recommended form for the student's grade and time of year.

Progress Monitoring Typically occurs at least once per week Provides ongoing information regarding student progress Can be used to determine whether interventions need to be strengthened or modified

Progress Monitoring Procedures 1. Based upon the norms you have decided to use and each student’s screening results, set a goal for each student. This goal should reflect an average gain per week as determined by the norms that you are using. 2. Once the student’s intervention has begun, monitor the student’s progress once per week.
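
Step 1's arithmetic is simply baseline plus expected weekly gain times the number of weeks. The gain of 1.5 words per week and the 18-week period below are illustrative values, not norms taken from this presentation.

```python
def set_goal(baseline, weekly_gain, weeks):
    """End-of-period goal = baseline + expected gain per week x weeks."""
    return baseline + weekly_gain * weeks

# Hypothetical: 25 WRC/min at screening, normative gain of 1.5 WRC/week, 18 weeks.
print(set_goal(25, 1.5, 18))
```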

Progress Monitoring Procedures (cont.) 3. Graph the student’s scores (e.g., correct read words/minute, correct writing sequences, digits correct) on a chart. 4. Periodically review the chart to determine whether progress is being made. 5. After the student has been in an intervention for a specified amount of time, hold a meeting with your decision making team. Look at the level, and the rate of progress Determine whether the goal was attained and/or exit criteria met
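
When the team reviews rate of progress (step 5), a rough estimate is the average gain between the first and last weekly data points; teams may instead fit a least-squares trend line. The data below are invented.

```python
def weekly_rate(weekly_scores):
    """Average gain per week across consecutive weekly data points."""
    return (weekly_scores[-1] - weekly_scores[0]) / (len(weekly_scores) - 1)

progress = [20, 22, 21, 24, 26, 27]   # six weeks of hypothetical WRC/min scores
print(weekly_rate(progress))          # compare against the goal's expected weekly gain
```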

Progress Monitoring: Example 1 [Graph: baseline phase followed by intervention phase]

Progress Monitoring Decisions (Example 1) What you can do in this situation Continue with the intervention and monitoring. Continue with the intervention and monitor less frequently. Discontinue intervention but monitor to ensure that progress doesn’t cease/reverse.

Progress Monitoring Example 2 [Graph: baseline phase followed by intervention phase]

Progress Monitoring Decisions: Example 2 Decision that needs to be made in this situation: 1. Modify the current intervention, or 2. Implement a different intervention in place of the current intervention.

Progress Monitoring Examples In example 1, adequate rate and level were being achieved The team will decide whether or not to continue to monitor student progress. The student will still be involved in universal screenings.

Progress Monitoring Examples In example 2, neither adequate rate nor level were being achieved. It is necessary to modify the current intervention or introduce a new intervention. Progress monitoring is still necessary.

Progress Monitoring: Example 2 Establish a new goal based on the last three data points obtained by the student. After the intervention is modified or a new intervention is implemented, progress monitoring continues until the next evaluation period.
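
One way to sketch the re-goaling step is to take the median of the last three data points as the new baseline. The median is a common choice but an assumption here; the slide only specifies basing the new goal on the last three points, and the gain and weeks values are invented.

```python
import statistics

def rebaseline_goal(progress_scores, weekly_gain, weeks_remaining):
    """New goal built from the median of the student's last three data points."""
    new_baseline = statistics.median(progress_scores[-3:])
    return new_baseline + weekly_gain * weeks_remaining

print(rebaseline_goal([12, 13, 12, 14, 15, 14], 1.0, 10))
```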

Progress Monitoring: Example 2a [Graph: baseline phase followed by intervention 1 and intervention 2 phases]

Progress Monitoring: Example 2a What you can do in this situation: Continue with the intervention and monitoring Continue with the intervention and monitor less frequently Discontinue intervention but monitor to ensure that progress doesn’t decrease

Progress Monitoring: Example 2b

Progress Monitoring: Example 2b After two periods of intensive, empirically based intervention in which the student has not achieved the level and rate goal established from baseline data, the team should consider special education placement.

RtI and Special Education Placement

RtI Is Not a Special Education Initiative! Assessment is conducted within an RtI framework first and foremost to improve instruction and enhance student growth. RtI is NOT a stand-alone special education initiative, a means for increasing or decreasing special education numbers, or focused primarily on disability determination and documented through a checklist. RtI is about determining the intensity of support needed to help students succeed! Nebraska Department of Education. (2006). Technical Assistance Document.

Special Education Placement Before placing a student in special education using the RtI model, several factors need to be considered: 1. Was the measurement of progress accurate? 2. Was the intervention appropriate for the child? 3. Were high rates of treatment integrity observed? 4. Did the student attend sessions regularly? 5. Does the student’s ELL status or other cultural/language factors need to be considered? 6. Is there evidence that the student could benefit from special education?

What About IQ Tests? The regulations implementing the Individuals with Disabilities Education Act of 2004 became effective October 13, 2006. They state that the severe discrepancy approach “shall not be required” to identify students with specific learning disabilities: “When determining whether a child has a specific learning disability as defined under this Act, the local education agency shall not be required to take into consideration whether a child has a severe discrepancy between achievement and intellectual ability…”

IDEA 2004 Continues… “In determining whether a child has a specific learning disability, a local educational agency may use a process which determines if a child responds to scientific, research-based intervention.” Thus, IQ tests are an option, but not necessary, for LD verification IQ tests are still necessary for MH verification For more information, see: http://idea.ed.gov/explore/home

Conclusions Norming, universal screening and progress monitoring are important components of the RtI process. Each process is used to ensure that students receive the services that they need to increase performance.

Additional Resources/References
Bollman, K., & Johnson, C. Used with permission from FSDS.org. Based on Stewart, L. H., & Kaminski, R. (2002). Best practices in developing local norms for academic problem-solving. In A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology IV (pp. 737-752). Bethesda, MD: NASP.
Crocker, L., & Algina, J. (1986). Introduction to classical and modern test theory. New York: Holt, Rinehart and Winston.
Edformation. (2004). AIMSweb. Retrieved from www.edformation.com/
Fuchs, L. S., & Fuchs, D. (1993). Formative evaluation of academic progress: How much growth can we expect? School Psychology Review, 22(1), 1-30.
Fuchs, L. S., & Fuchs, D. (1997). Use of curriculum-based measurement in identifying students with disabilities. Focus on Exceptional Children, 30(3), 1-15.
Glover, T. A., & Albers, C. A. (2007). Considerations for evaluating universal screening assessments. Journal of School Psychology, 45, 117-135.
Good, R. H., & Kaminski, R. A. (Eds.). (2002). Dynamic Indicators of Basic Early Literacy Skills (6th ed.). Eugene, OR: Institute for the Development of Educational Achievement. Retrieved from dibels.uoregon.edu/
National Association of State Directors of Special Education, Inc. (2005). Response to intervention: Policy considerations and implementation. New York, NY: The Guilford Press.
Nebraska Department of Education. (2006). Technical Assistance Document.
Shapiro, E. S. (2004). Academic skills problems: Direct assessment and intervention (3rd ed.). New York: The Guilford Press.