RTI Data Challenge: Setting Individual RTI Academic Goals Using Research Norms for Students Receiving ‘Off-Level’ Interventions


1 RTI Data Challenge: Setting Individual RTI Academic Goals Using Research Norms for Students Receiving ‘Off-Level’ Interventions Source: Shapiro, E. S. (2008). Best practices in setting progress monitoring goals for academic skill improvement. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp ). Bethesda, MD: National Association of School Psychologists.

2 Setting Individual Student RTI Academic Goals Using Research Norms
To set a goal for student academic performance, four elements are needed:
1. The student’s baseline academic performance. Prior to starting the intervention, the teacher calculates baseline performance by assessing the target student several times with the academic measure that will be used to measure that student’s progress once the intervention begins.
2. Estimate of ‘typical’ peer performance. The teacher has a reliable estimate of expected or typical peer performance on the academic measure that will be used to measure the target student’s progress.

3 Setting Individual Student RTI Academic Goals Using Research Norms
To set a goal for student academic performance, four elements are needed (cont.):
3. Estimate of expected weekly progress. The teacher selects a rate of weekly academic progress that the target student is expected to attain if the intervention is successful.
4. Number of weeks for the intervention trial. The teacher decides how many weeks the RTI intervention will last, as the cumulative, final academic goal can be calculated only when the entire timespan of the intervention is known.
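Taken together, the four elements combine into a single cumulative goal. Here is a minimal sketch in Python; the numbers below are hypothetical illustrations, not AIMSweb norms:

```python
# Cumulative RTI goal = baseline + (expected weekly growth x intervention weeks).
# Illustrates the four-element calculation described above.

def final_goal(baseline_wrc: float, weekly_growth: float, weeks: int) -> float:
    """Return the end-of-intervention goal in Words Read Correct (WRC)."""
    return baseline_wrc + weekly_growth * weeks

# Example: baseline of 40 WRC, 1.5 WRC/week expected growth, 10-week trial.
print(final_goal(40, 1.5, 10))  # 55.0
```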

4 How to Set a Goal for an ‘Off-Level’ Intervention
Comparing Student Performance to Benchmarks and Flagging Extreme Discrepancies. The student is administered reading fluency probes equivalent to his or her current grade placement, and the results are compared to peer norms. If the student falls significantly below the level of peers, he or she may need additional assessment to determine whether to receive intervention and assessment ‘off grade level’.

5 Example of Progress-Monitoring Off-Level: Randy
In January, Mrs. Chandler, a 4th-grade teacher, receives her classwide reading fluency screening results. She notes that a student who has recently transferred to her classroom, Randy, performed at 35 Words Read Correct (WRC) on the 1-minute AIMSweb Grade 4 fluency probes. Mrs. Chandler consults AIMSweb reading-fluency research norms and finds that a reasonable minimum reading rate for students by winter of grade 4 (25th percentile) is 89 WRC.

6 Example of Progress-Monitoring Off-Level: Randy
AIMSweb Norms: ‘Typical’ reader (25th percentile) in Gr 4 at mid-year (winter norms): 89 WRC. Target Student Randy: 35 WRC. Conclusion: Randy’s grade-level performance is in the ‘frustration’ range. He requires a Survey-Level Assessment to find his optimal ‘instructional’ level. Source: AIMSweb® Growth Table: Reading-Curriculum Based Measurement: Multi-Year Aggregate: School Year

7 How to Set a Goal for an ‘Off-Level’ Intervention
Conducting a Survey Level Assessment (SLA). For students with large discrepancies when compared to benchmarks, the teacher conducts an SLA to determine the student’s optimal level for supplemental intervention and progress-monitoring. The teacher administers AIMSweb reading probes from successively earlier grade levels and compares the student’s performance to the benchmark norms for each grade level. The student’s ‘instructional’ level for intervention is the first grade level at which his or her reading-fluency rate falls at or above the 25th percentile according to the benchmark norms.

8 Example of Progress-Monitoring Off-Level: Randy
Because Randy’s reading fluency rate is so far below the grade-level norms (a gap of 54 WRC), his teacher decides to conduct a Survey Level Assessment to find the student’s optimal grade level placement for supplemental reading instruction.

9 Example of Progress-Monitoring Off-Level: Randy
Survey Level Assessment. The teacher conducts a Survey Level Assessment with Randy, assessing him using CBM reading fluency probes from successively earlier grades until he performs at or above the 25th percentile according to the AIMSweb norms. On Grade 3-level probes, Randy attains a median score of 48 WRC. The AIMSweb winter norm (25th percentile) for a 3rd-grade student is 69 WRC. The student is still in the ‘frustration’ range and the Survey Level Assessment continues. On Grade 2-level probes, Randy attains a median score of 64 WRC. The AIMSweb winter norm (25th percentile) for a 2nd-grade student is 53 WRC. The student is now in the ‘instructional’ range and the Survey Level Assessment ends. Source: AIMSweb® Growth Table: Reading-Curriculum Based Measurement: Multi-Year Aggregate: School Year

10 How to Set a Goal for an ‘Off-Level’ Intervention
Selecting a Progress-Monitoring Goal. To set a progress-monitoring goal, the teacher looks up the benchmark WRC for the 50th percentile at the student’s off-level ‘instructional’ grade level previously determined through the Survey Level Assessment.

11 Example of Progress-Monitoring Off-Level: Randy
Goal-Setting. To find the progress-monitoring goal for Randy, his teacher looks up the benchmark WRC for the 50th percentile at Grade 2 (his off-level ‘instructional’ grade level)—which is 79 WRC. This becomes the progress-monitoring goal for the student. Source: AIMSweb® Growth Table Reading-Curriculum Based Measurement: Multi-Year Aggregate: School Year

12 How to Set a Goal for an ‘Off-Level’ Intervention
Translating a Progress-Monitoring Goal into Weekly Increments. The teacher’s final task before starting the progress-monitoring is to translate the student’s ultimate intervention goal into ‘ambitious but realistic’ weekly increments. One useful method for determining weekly growth rates is to start with research-derived growth norms and to then use a ‘multiplier’ to make the expected rate of weekly growth more ambitious.

13 How to Set a Goal for an ‘Off-Level’ Intervention
Translating a Progress-Monitoring Goal into Weekly Increments (Cont.). The teacher first looks up the average rate of weekly student growth supplied in the research norms. (NOTE: If available, a good rule of thumb is to use the growth norms for the 50th percentile at the ‘off-level’ grade at which the student is receiving intervention and being monitored.) The teacher then multiplies this grade norm for weekly growth by a figure between 1.5 and 2.0 (Shapiro, 2008). Because the original weekly growth rate represents a typical rate of student improvement, using this multiplier to increase the target student’s weekly growth estimate is intended to accelerate learning and close the gap separating that student from peers.
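A sketch of the multiplier method, using the 1.5 to 2.0 range that Shapiro (2008) suggests; the growth-norm value in the example is illustrative, not a published norm:

```python
def ambitious_weekly_growth(norm_growth: float, multiplier: float = 1.5) -> float:
    """Scale a typical weekly growth rate upward to set an ambitious goal.
    Shapiro (2008) suggests a multiplier between 1.5 and 2.0."""
    if not 1.5 <= multiplier <= 2.0:
        raise ValueError("multiplier should fall in the 1.5-2.0 range")
    return round(norm_growth * multiplier, 2)

# e.g., a 1.1 WRC/week growth norm made more ambitious with a 1.8 multiplier:
print(ambitious_weekly_growth(1.1, 1.8))  # 1.98
```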

14 Example of Progress-Monitoring Off-Level: Randy
Determining Weekly Rate of Improvement (ROI). Randy is to be monitored on intervention at grade 2. The teacher finds, according to AIMSweb norms, that a typical student in Grade 2 (at the 50th percentile) has a rate of improvement of 1.1 WRC per week. She multiplies the 1.1 WRC figure by 1.8 (teacher judgment) to obtain a weekly growth goal for Randy of about 2.0 additional WRC. Randy’s ultimate goal is 79 WRC (the 50th percentile norm for grade 2). During the Survey Level Assessment, Randy was found to read 64 WRC at the 2nd-grade level, leaving a 15-WRC gap to be closed. At 2 additional WRC per week on intervention, Randy should close the gap within about 8 instructional weeks. Source: AIMSweb® Growth Table: Reading-Curriculum Based Measurement: Multi-Year Aggregate: School Year
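Randy’s timeline follows directly from the gap and the weekly growth goal; a quick check of the arithmetic:

```python
import math

def weeks_to_goal(goal_wrc: float, current_wrc: float, weekly_growth: float) -> int:
    """Whole instructional weeks needed to close the gap at the given growth rate."""
    return math.ceil((goal_wrc - current_wrc) / weekly_growth)

# Randy: 79 WRC goal, 64 WRC current performance, ~2.0 WRC/week growth goal.
print(weeks_to_goal(79, 64, 2.0))  # 8
```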

15 How to Set a Goal for an ‘Off-Level’ Intervention
Advancing the Student to Higher Grade Levels for Intervention and Progress-Monitoring.
The teacher monitors the student’s growth in reading fluency at least once per week (twice per week is ideal). When the student’s reading fluency exceeds the 50th percentile in Words Read Correct for his or her ‘off-level’ grade, the teacher reassesses the student’s reading fluency using AIMSweb materials at the next higher grade. If the student performs at or above the 25th percentile on probes from that next grade level, the teacher advances the student and begins to monitor at the higher grade level. The process repeats until the student eventually closes the gap with peers and is being monitored at the grade of placement.
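The two-part advancement rule above can be expressed as a small decision function. This is a sketch; the percentile values come from whatever norm table the school uses:

```python
def should_advance(wrc_off_level: float, p50_off_level: float,
                   wrc_next_grade: float, p25_next_grade: float) -> bool:
    """Advance monitoring to the next grade only when the student (1) exceeds
    the 50th percentile at the current off-level grade AND (2) performs at or
    above the 25th percentile on probes from the next higher grade."""
    return wrc_off_level > p50_off_level and wrc_next_grade >= p25_next_grade

# Randy: 82 WRC vs. a Grade 2 50th percentile of 79;
# 72 WRC vs. a Grade 3 25th percentile of 69.
print(should_advance(82, 79, 72, 69))  # True
```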

16 Example of Progress-Monitoring Off-Level: Randy
Advancing the Student to Higher Grade Levels of Progress-Monitoring. Mrs. Chandler notes that after 7 weeks of intervention, Randy is now reading 82 WRC, exceeding the 79 WRC for the 50th percentile of students in Grade 2 (winter norms). So Mrs. Chandler assesses Randy on AIMSweb reading fluency probes for Grade 3 and finds that he reads on average 72 WRC, exceeding the Grade 3 25th-percentile cut-off of 69 WRC. Therefore, Randy is advanced to Grade 3 progress-monitoring and his intervention materials are adjusted accordingly. Source: AIMSweb® Growth Table: Reading-Curriculum Based Measurement: Multi-Year Aggregate: School Year


18 How to Set a Student Goal: Example (Cont.)
The reading intervention planned for Randy would last 8 instructional weeks. Mrs. Chandler consulted the research norms and noted that a typical rate of growth in reading fluency for a 4th-grade student is 0.9 additional words per week. Mrs. Chandler adjusted the 0.9-word growth rate for Randy upward by multiplying it by 1.5 because she realized that he needed to accelerate his learning to catch up with peers. When adjusted upward, the weekly growth rate for Randy increased from 0.9 to 1.35 additional words per minute.

19 How to Set a Student Goal: Example (Cont.)
Multiplying the expected weekly progress of 1.35 additional words by the 8 weeks of the intervention, Mrs. Chandler found that Randy should acquire at least 11 additional words of reading fluency by the conclusion of the intervention. She added the 11 words per minute to Randy’s baseline of 70 words per minute and was able to predict that—if the 8-week intervention was successful—Randy would be able to read approximately 81 words per minute.
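Mrs. Chandler’s arithmetic, reproduced step by step:

```python
weekly_norm = 0.9   # typical Grade 4 weekly growth (research norm cited above)
multiplier = 1.5    # upward adjustment to accelerate learning
weeks = 8           # length of the intervention trial
baseline = 70       # Randy's baseline words per minute

adjusted_growth = weekly_norm * multiplier      # 1.35 additional words/week
expected_gain = round(adjusted_growth * weeks)  # 10.8, rounded to 11 words
goal = baseline + expected_gain                 # 81 words per minute
print(adjusted_growth, expected_gain, goal)     # 1.35 11 81
```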

20 How to Set a Student Goal: Example (Cont.)
Because Randy would not be expected to fully close the gap with peers in 8 weeks, Mrs. Chandler regarded her intervention goal of 81 words per minute as an intermediate rather than a final goal. However, if the intervention was successful and the student continued to add 1.35 words per week to his reading fluency, he could be expected to reach an acceptable level of fluency soon.

21 How to Monitor a Student Off-Level
Conduct a ‘survey level’ assessment of the student to find his or her highest ‘instructional’ level (between the 25th and 50th percentile).
The student is monitored at the ‘off level’ during the intervention (e.g., weekly). The ‘goal’ is to move the student up to the 50th percentile.
Once per month, the student is also assessed at grade level to monitor grade-appropriate performance.
When the student moves above the 50th percentile on off-level probes, the interventionist tests the student on the next higher level. If the student performs above the 25th percentile on the next level, monitoring starts at the new, higher level.

22 Sample Reading Fluency Norms
Source: Tindal, G., Hasbrouck, J., & Jones, C. (2005). Oral reading fluency: 90 years of measurement [Technical report #33]. Eugene, OR: University of Oregon.

23 Activity: Academic Goal-Setting
At your tables: Review the guidelines for academic goal-setting using research-based norms. How can you promote the use of this goal-setting approach in your school?


25 How to Set a Student Goal
Determining Weekly Growth Rate: Method 2: If research norms with ‘ambitious’ rates of student growth are available, these can be used to determine the student’s weekly expected Rate of Improvement.

26 Example of Research-Based Norms for Weekly Reading Fluency Growth
Predictions for Rates of Reading Growth by Grade (Fuchs, Fuchs, Hamlett, Walz, & Germann, 1993). [Table: Increase in Correctly Read Words Per Minute for Each Instructional Week, listing a Realistic Weekly Goal and an Ambitious Weekly Goal for Grades 1 through 6.]


29 RTI Lab: Creating District Decision Rules for Analyzing RTI Data to Determine LD Eligibility Jim Wright

30 RTI Data & LD Determination: Agenda…
Learning Disabilities in the Age of RTI: Introduction
Analyzing Student Academic Risk: Performance Level and Rate of Improvement
Evaluating a Student’s ‘Non-Responder’ Status: A Comprehensive Checklist
Developing Your District’s Decision Rules for Using RTI Data to Determine ‘Non-Response’ Status: First Steps

31 The process by which public schools identify students as learning disabled often appears to be confusing, unfair, and logically inconsistent. In fact, G. Reid Lyon of the National Institute of Child Health and Human Development has suggested that the field of learning disabilities is a sociological sponge whose purpose has been and is to clean up the spills of general education. (Gresham, 2001) Source: Gresham, F. M. (2001). Responsiveness to intervention: An alternative approach to the identification of learning disabilities. Paper presented at the Learning Disabilities Summit, Washington, DC.

32 RTI & Special Education Eligibility

33 Special Education Eligibility & RTI: Establishing Confidence at Every Link
Special Education Eligibility Teams review the CUMULATIVE RTI information collected in general education (‘intervention audit’). If that Team lacks confidence in any one of the links in the RTI chain, it will be difficult to identify the student as an RTI ‘non-responder’. The goal of this workshop is to help schools to identify each link in the RTI chain and to know how to measure the quality of that link.

34 RTI Assumption: Struggling Students Are ‘Typical’ Until Proven Otherwise…
RTI logic assumes that:
A student who begins to struggle in general education is typical, and that
It is general education’s responsibility to find the instructional strategies that will unlock the student’s learning potential.
Only when the student shows through well-documented interventions that he or she has ‘failed to respond to intervention’ does RTI begin to investigate the possibility that the student may have a learning disability or other special education condition.

35 RTI ‘Pyramid of Interventions’
Tier 1: Universal interventions. Available to all students in a classroom or school. Can consist of whole-group or individual strategies or supports.
Tier 2: Individualized interventions. A subset of students receive interventions targeting specific needs.
Tier 3: Intensive interventions. Students who are ‘non-responders’ to Tiers 1 & 2 are referred to the RTI Team for more intensive interventions.

36 What previous approach to diagnosing Learning Disabilities does RTI replace?
Prior to RTI, many states used a ‘Test-Score Discrepancy Model’ to identify Learning Disabilities. A student with significant academic delays would be administered a battery of tests, including an intelligence test and academic achievement test(s). If the student was found to have a substantial gap between a higher IQ score and lower achievement scores, a formula was used to determine whether that gap was statistically significant and ‘severe’. If the student had a ‘severe discrepancy’ [gap] between IQ and achievement, he or she would be diagnosed with a Learning Disability.

37 ‘Dual-Discrepancy’: RTI Model of Learning Disability (Fuchs, 2003)
[Figure: the target student is compared with average classroom academic performance on two dimensions. Discrepancy 1: Skill Gap (Current Performance Level). Discrepancy 2: Gap in Rate of Learning (‘Slope of Improvement’).]

38 Current NYS Definition of ‘Learning Disabled’

39 RTI Information: What It Does and Doesn’t Do
The primary purpose for the special education eligibility team to evaluate general-education RTI information is to rule out instructional explanations for the struggling student’s academic concerns. RTI information does not in and of itself provide detailed information to allow schools to draw conclusions about a student’s possible neurological differences that make up the construct ‘learning disabilities’. Therefore, RTI information allows for a rule-out (the learning problem resides within the student, not the classroom) but does not in and of itself provide positive evidence of a learning disability.

40 Using RTI Information to Identify the ‘Non-Responding’ Student: Goodbye, Gate
As a special education eligibility team adopts a process for using a student’s RTI information to evaluate that student as a ‘non-responder’ to intervention as part of an evaluation for learning disabilities, the team will discover that there is no longer a single ‘actuarial number’ or gate to determine ‘risk’ of LD in the manner of a test-score discrepancy analysis. Therefore, the special education eligibility team must have confidence in the quality of the intervention and assessment programs available to the struggling student in the general education setting. Today’s workshop is about increasing that level of confidence.

41 Team Activity: What Are Your Major Challenges in Using RTI Data to Help to Determine Special Education Eligibility? What are the major challenge(s) that your school or district faces as you make the transition to using RTI data to help to make special education eligibility decisions?

42 Evaluating a Student’s ‘Non-Responder’ Status: An RTI Checklist


44 Evaluating a Student’s ‘Non-Responder’ Status: An RTI Checklist
Interventions: Evidence-Based & Implemented With Integrity
Tier 1: High-Quality Core Instruction
Tier 1: Classroom Intervention
Tier 2 & 3 Interventions: Minimum Number & Length
Tier 2 & 3 Interventions: Essential Elements
Tier 1, 2, & 3 Interventions: Intervention Integrity

45 Evaluating a Student’s ‘Non-Responder’ Status: Activity
At your table: Review these ‘RTI Non-Responder’ elements:
Tier 1: High-Quality Core Instruction
Tier 1: Classroom Intervention
Tier 2 & 3 Interventions: Minimum Number & Length
Tier 2 & 3 Interventions: Essential Elements
Tier 1, 2, & 3 Interventions: Intervention Integrity
Select the element that you see as your school or district’s greatest challenge. Brainstorm ideas to positively address that challenge.


51 Evaluating a Student’s ‘Non-Responder’ Status: An RTI Checklist
Academic Screenings: General Outcome Measures and Skill-Based Measures
Selection of Academic Screening Measures
Local Norms Collected via Gradewide Academic Screenings at Least 3 Times Per Year


54 Evaluating a Student’s ‘Non-Responder’ Status: An RTI Checklist
Dual Discrepancy Cut-Offs: Academic Skill Level and Student Rate of Improvement
Cut-Point Established to Define ‘Severely Discrepant’ Academic Performance
Cut-Off Criterion Selected to Define Discrepant Slope


57 Evaluating a Student’s ‘Non-Responder’ Status: An RTI Checklist
Data Collection
Use of Both ‘Off-Level’ and Enrolled Grade-Level Benchmarks & Progress-Monitoring Measures to Assess Student Skills and Growth
Student Baseline Calculated
Student Goal Calculated
Regular Progress-Monitoring Conducted


62 Evaluating a Student’s ‘Non-Responder’ Status: An RTI Checklist
Application of RTI Decision Rules to a Particular Student Case
Despite the Tier 2/3 Interventions Attempted, the Student’s Skills Continue to Fall Below the Boundary of ‘Severely Discrepant’ Academic Performance
Despite the Tier 2/3 Interventions Attempted, the Student’s Rate of Improvement (Slope) Continues to Be Discrepant


64 Special Education Eligibility Team & RTI Information: Recommendations
Create guidelines for general education to use to determine whether a student is a ‘non-responder’ under RTI. NOTE: Such guidelines are for the sole use of general education and should not be interpreted as RTI ‘special education eligibility criteria’. Create a checklist for schools to collect, collate, and ‘package’ RTI information for presentation to the Special Education Eligibility Team.

65 Special Education Eligibility Team & RTI Information: Recommendations (Cont.)
RTI information should be reviewed prior to the initial LD eligibility meeting. If there is questionable information, the Special Education Eligibility Team should contact the school to clarify questions. At the actual eligibility meeting, any concerns or questions about the RTI information should be framed in neutral terms and tied to the dual discrepancy RTI LD model. Whenever possible, schools should not feel ‘blamed’ for shortcomings of RTI information and should feel that the identification process is transparent.

66 Special Education Eligibility Team & RTI Information: Recommendations (Cont.)
It should be an expectation that at eligibility meetings: The Special Education Eligibility Team can ask for clarification of any building RTI information presented The Team is able to articulate how it interprets information and why it reaches its decision.

67 Evaluating a Student’s ‘Non-Responder’ Status: Activity
At your table: Discuss how your school or district may use the document Evaluating a Student’s ‘Non-Responder’ Status: An RTI Checklist to: increase compliance at every link in the ‘RTI chain’ develop specific decision rules for determining whether a student referred for a possible Learning Disability is a ‘non-responder’ to intervention Be prepared to share the main points of your discussion with the large group.

68 Assessing Intervention Integrity Jim Wright www.interventioncentral

69 Why Assess Intervention Integrity?
When a struggling student fails to respond adequately to a series of evidence-based interventions, that student is likely to face significant and potentially negative consequences, such as failing grades, long-term suspension from school, or even placement in special education. It is crucial, then, that the school monitor the integrity with which educators implement each intervention plan so that it can confidently rule out poor or limited implementation as a possible explanation for any student’s ‘non-response’.

70 Intervention Integrity Check: Direct Observation
Intervention integrity is best assessed through direct observation (Roach & Elliott, 2008). The key steps of the intervention are defined and formatted as an observational checklist. An observer watches as the intervention is conducted and checks off on the checklist those steps that were correctly carried out. The observer then computes the percentage of steps correctly carried out.
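The checklist computation described above is straightforward; a minimal sketch:

```python
def integrity_percentage(checklist: list[bool]) -> float:
    """Percent of intervention steps the observer checked off as correctly done."""
    return 100 * sum(checklist) / len(checklist)

# Hypothetical 5-step intervention plan; one step was omitted during observation.
observed = [True, True, False, True, True]
print(integrity_percentage(observed))  # 80.0
```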

71 Limitations of Direct Observation as an Intervention Integrity Check
Direct observations are time-consuming to conduct. Teachers who serve as interventionists may, at least initially, regard observations of their intervention implementation as evaluations of their job performance rather than as a child-focused RTI “quality check”. An intervention-implementation checklist typically does not distinguish between, or differentially weight, the intervention steps that are more important and those that are less so. If two teachers implement the same 10-step intervention plan, for example, with one instructor omitting a critical step and the other omitting a fairly trivial step, both can still attain the same implementation score for steps correctly completed. Source: Gansle, K. A., & Noell, G. H. (2007). The fundamental role of intervention implementation in assessing response to intervention. In S. R. Jimerson, M. K. Burns, & A. M. VanDerHeyden (Eds.), Response to intervention: The science and practice of assessment and intervention (pp ).
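One possible remedy, sketched below, is to weight the steps so that omitting a critical step costs more than omitting a trivial one. The weights here are hypothetical, not part of any published checklist:

```python
def weighted_integrity(steps: list[tuple[bool, float]]) -> float:
    """steps: (completed, weight) pairs; heavier weights mark critical steps."""
    total = sum(weight for _, weight in steps)
    earned = sum(weight for done, weight in steps if done)
    return round(100 * earned / total, 1)

# Two teachers each omit one step of the same 3-step plan:
plan_a = [(True, 3.0), (False, 3.0), (True, 1.0)]  # omits a critical step
plan_b = [(True, 3.0), (True, 3.0), (False, 1.0)]  # omits a trivial step
print(weighted_integrity(plan_a), weighted_integrity(plan_b))  # 57.1 85.7
```

An unweighted checklist would score both teachers identically (2 of 3 steps); the weighted version separates them.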

72 Intervention Script Builder
Intervention Script Builder: a ‘Yes/No’ step-by-step intervention check, with each step marked ‘Negotiable’ or ‘Non-Negotiable’.

73 Supplemental Methods to Collect Data About Intervention Integrity
Teacher Self-Ratings: As a form of self-monitoring, directing interventionists to rate the integrity of their own interventions may prompt higher rates of compliance (e.g., Kazdin, 1989). However, because teacher self-ratings tend to be ‘upwardly biased’ (Gansle & Noell, 2007, p. 247), they should not be relied upon as the sole rating of intervention integrity. One suggestion for collecting regular teacher reports on intervention implementation in a convenient manner is to use Daily Behavior Reports (DBRs; Chafouleas, Riley-Tillman, & Sugai, 2007). Sources: Chafouleas, S., Riley-Tillman, T. C., & Sugai, G. (2007). School-based behavioral assessment: Informing intervention and instruction. New York: Guilford Press. Gansle, K. A., & Noell, G. H. (2007). The fundamental role of intervention implementation in assessing response to intervention. In S. R. Jimerson, M. K. Burns, & A. M. VanDerHeyden (Eds.), Response to intervention: The science and practice of assessment and intervention (pp ). Kazdin, A. E. (1989). Behavior modification in applied settings (4th ed.). Pacific Grove, CA: Brooks/Cole.

74 Teacher Intervention Integrity Self-Rating
Intervention Contact Log

75 Supplemental Methods to Collect Data About Intervention Integrity
Intervention Permanent Products: If an intervention plan naturally yields permanent products (e.g., completed scoring sheets, lists of spelling words mastered, behavioral sticker charts), these products can be periodically collected and evaluated as another indicator of intervention integrity (Gansle & Noell, 2007). Source: Gansle, K. A., & Noell, G. H. (2007). The fundamental role of intervention implementation in assessing response to intervention. In S. R. Jimerson, M. K. Burns, & A. M. VanDerHeyden (Eds.), Response to intervention: The science and practice of assessment and intervention (pp ).

76 Intervention Integrity: Verify Through a Mix of Information Sources
Schools should consider monitoring intervention integrity through a mix of direct and indirect means, including direct observation and permanent products (Gansle & Noell, 2007), as well as interventionist self-ratings (Roach & Elliott, 2008). Source: Gansle, K. A., & Noell, G. H. (2007). The fundamental role of intervention implementation in assessing response to intervention. In S. R. Jimerson, M. K. Burns, & A. M. VanDerHeyden (Eds.), Response to intervention: The science and practice of assessment and intervention (pp ). Roach, A. T., & Elliott, S. N. (2008). Best practices in facilitating and evaluating intervention integrity. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp ).

77 ‘Selecting Methods to Track Intervention Integrity’…

81 Team Activity: Measuring ‘Intervention Follow-Through’
At your table: Brainstorm ways that your school or district will use to measure intervention integrity for math and writing interventions. What preparations are necessary to introduce these methods for measuring ‘intervention follow-through’ to your faculty?

82 Student Academic Performance: How to Determine ‘Discrepancy’

83 ‘Dual-Discrepancy’: RTI Model of Learning Disability (Fuchs, 2003)
[Figure: the target student is compared with average classroom academic performance on two dimensions. Discrepancy 1: Skill Gap (Current Performance Level). Discrepancy 2: Gap in Rate of Learning (‘Slope of Improvement’).]

84 Definition: Parallax. “an apparent displacement or difference in the apparent position of an object viewed along two different lines of sight, and is measured by the angle or semi-angle of inclination between those two lines.” Source: Parallax. (2010, August 11). In Wikipedia, The Free Encyclopedia. Retrieved 09:33, August 13, 2010.

85 Parallax and Academic Norms
When comparing the academic performance of a struggling student to peers, schools should pay attention to both external (research) academic norms and local norms. This dual perspective can simultaneously compare the student’s skills to ‘national’ norms as well as determine how discrepant the student’s skills are when compared to other children in his or her classroom. Source: Shapiro, E. S. (2008). Best practices in setting progress monitoring goals for academic skill improvement. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp ). Bethesda, MD: National Association of School Psychologists.

86 Defining ‘Discrepant’ Academic Performance: Do We Use External Norms or Local Norms?
External (Research or Benchmark) Norms: Used to compare the performance of a student or instructional program to objective external/research/national norms. External norms can help to answer these questions: Is the school’s core program successful (comparison of local to research norms)? Is a child performing at a minimum level of competency in the academic skill to allow us to predict future success? What objective academic performance cut-off should be set to determine student entry into and exit from Tier 2 and 3 intervention programs?

87 Defining ‘Discrepant’ Academic Performance: Do We Use External Norms or Local Norms?
Local Norms: Rank-ordered compilation of scores of students within a particular grade level/school. Local norms are used to help answer these questions: What is the typical range of student ability in the grade level or school? How is a particular student performing relative to other children in the grade level or school? How much effort must a teacher exert to instruct this student relative to other students in the class?

88 Local Norms Example
Baylor Elementary School: Grade Norms: Correctly Read Words Per Min: Sample Size: 23 Students: Book 4-1: Raw Data. Twenty-three 4th-grade students were administered oral reading fluency Curriculum-Based Measurement passages at the 4th-grade level in their school. In their current number form, these data are not easy to interpret, so the school converts them into a visual display (a box-plot) to show the distribution of scores and to convert the scores to percentile form. When Billy, a struggling reader, is screened in CBM reading fluency, he shows a SIGNIFICANT skill gap when compared to his grade peers.

89 Group Norms Converted to Box-Plot: Baylor Elementary School: Grade Norms: Correctly Read Words Per Min: Sample Size: 23 Students: January Benchmarking: Book 4-1. Low Value = 31; 1st Quartile = 43; Median (2nd Quartile) = 71; 3rd Quartile = 108; High Value = 131. Billy = 19. National Reading Norms: 112 CRW Per Min. Source: Tindal, G., Hasbrouck, J., & Jones, C. (2005). Oral reading fluency: 90 years of measurement [Technical report #33]. Eugene, OR: University of Oregon.
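The conversion from raw screening scores to quartiles and percentile ranks can be sketched as follows. The slide’s 23 raw scores were not preserved, so the list below is hypothetical, chosen only so its summary statistics match the box-plot values reported above:

```python
from statistics import quantiles

# Hypothetical local-norm sample (n = 23), constructed to reproduce the
# reported summary statistics: low 31, Q1 43, median 71, Q3 108, high 131.
scores = [31, 33, 36, 38, 40, 42, 44, 50, 55, 60, 65, 71,
          78, 85, 92, 99, 106, 110, 115, 120, 124, 128, 131]

q1, median, q3 = quantiles(scores, n=4, method='inclusive')

def percentile_rank(score: float, norms: list[float]) -> int:
    """Percent of the local norm sample scoring at or below the given score."""
    return round(100 * sum(s <= score for s in norms) / len(norms))

print(q1, median, q3)               # 43.0 71.0 108.0
print(percentile_rank(19, scores))  # Billy, at 19 WRC, falls below every local score
```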

90 How Do We Define ‘Discrepant Academic Performance’ as One of the RTI ‘Dual Discrepancies’?
Using norms (benchmarks or local), the Special Education Eligibility Team can set a cut-point, below which a student is considered ‘severely discrepant’ from their peers in a specific academic skill.

91 How Do We Define ‘Discrepant Academic Performance’ as One of the RTI ‘Dual Discrepancies’? (Cont.)
Based on either external benchmarks or building-wide screening results, schools need to quantify the lower and upper range of academic performance that identifies a student as requiring Tier 2 or 3 supplemental intervention services. A student with screening results:
below the LOW cut-point is defined as having ‘severely discrepant’ academic skills when compared with these norms and would benefit from Tier 3 intervention services.
between the LOW and HIGH values is at lesser academic risk and would benefit from Tier 2 intervention services.
above the HIGH value does not require supplemental interventions.
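These decision rules reduce to a three-way comparison; a sketch using hypothetical cut-points:

```python
def assign_tier(score: float, low_cut: float, high_cut: float) -> int:
    """Map a screening score to an RTI tier using LOW/HIGH cut-points.
    Boundary handling here is one reasonable reading of the rules above."""
    if score < low_cut:
        return 3  # severely discrepant: Tier 3 intervention
    if score <= high_cut:
        return 2  # lesser academic risk: Tier 2 intervention
    return 1      # no supplemental intervention needed

# Hypothetical oral-reading-fluency cut-points of 31 and 51 WPM:
print(assign_tier(25, 31, 51), assign_tier(40, 31, 51), assign_tier(60, 31, 51))  # 3 2 1
```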

92 EasyCBM Norms: Selecting Performance ‘Cut-Points’ for Tier 2/3 Services. [Figure: RTI Tiers 1-3 mapped to EasyCBM cut-points: 10th percentile = 31 WPM; 20th percentile = 51 WPM.] Source: EasyCBM (2010). Interpreting the EasyCBM progress monitoring test results. Retrieved February 22, 2011, from

93 Defining ‘Discrepant Academic Performance’: Example
Baylor Elementary School conducted a winter screening of its 3rd-grade students in Oral Reading Fluency (ORF). The school set the 10th percentile as the low academic cut-point, defining the cross-over to Tier 3 support, and the 30th percentile as the high cut-point, defining a student who would need no supplemental intervention. The Tier 2 range, then, was the 10th to 30th percentile. Baylor compared its local norm results to research/external ORF norms (Tindal, Hasbrouck & Jones, 2005).

94 Example: Baylor Elementary School/Smithtown School District
Grade 3: Winter Screening: Oral Reading Fluency: Words Correct Per Minute. [Figure: Tier 2 ranges (10th-30th percentile, in correct words per minute) under External/Research Norms vs. Local (Building) Norms.]

95 Using Screening Results to Define ‘Discrepant’ Academic Performance: Local vs. External Fluency Norms

96 Defining ‘Discrepant Academic Performance’: Example
Because the external/research norms were higher than its local norms, Baylor School used those external norms to identify students who needed Tier 2 and 3 services. Using external/research norms prevented the school from under-identifying students with serious academic deficits simply because they happened to be in an underperforming school. Of course, the school also realized, based on the low local norms, that it needed to concentrate on improving core instruction to increase the reading performance of all students. Source: Tindal, G., Hasbrouck, J., & Jones, C. (2005). Oral reading fluency: 90 years of measurement [Technical report #33]. Eugene, OR: University of Oregon.

97 RTI & Special Education Eligibility: Sample Case
A student is found to fall within the ‘some risk’ Tier 2 level in oral reading fluency using external norms (student performed above the 20th percentile). The student was found to fall at the 8th percentile using local performance norms. The school believes that the student is learning disabled. How might you respond to this issue if you were a member of the Special Education Eligibility Team at an initial eligibility meeting?

98 Team Activity: What is the ‘Reachable, Teachable’ Range?
What do you believe should be the percentile cut-off (5%? 10%? 20%?) that your district might set to demarcate the boundary between Tier 2 reading services (the student can be maintained with strong core instruction and supplemental RTI support) and Tier 3 (the student needs intensive support and may eventually be a candidate for a special education referral)?

99 Student ‘Rate of Improvement’ (Slope): How to Determine ‘Discrepancy’

100 Estimating Student Rate of Improvement: What Are the Relative Advantages of External vs. Local Growth Norms? External (Research-Derived) Growth Norms: Provide a general estimate of the expected academic growth of a ‘typical’ student that can be applied across many academic settings. However, these norms may not be representative of student performance at a particular school. NOTE: Research-derived growth norms are likely to serve as the primary standard at the Special Education Eligibility Team meeting for judging whether a student is ‘severely discrepant’ in rate of learning (slope of improvement).

101 Research-Derived Rates of Academic Growth/Slope: Example
Source: Tindal, G., Hasbrouck, J., & Jones, C. (2005). Oral reading fluency: 90 years of measurement [Technical report #33]. Eugene, OR: University of Oregon.

102 Estimating Student Rate of Improvement: What Are the Relative Advantages of External vs. Local Growth Norms? Local Growth Norms: Provide an estimate of typical growth for students within a particular school’s population. This provides insight into current levels of student achievement and the effectiveness of instruction in that building. However, these results cannot easily be applied to other dissimilar academic settings. NOTE: Local growth norms may be used to identify students in general education who are falling behind their classmates in academic skills.

103 Estimating Rates of LOCAL Academic Growth/Slope
Methods for estimating student academic growth can include: Calculation of typical rates of student progress from LOCAL academic fluency norms (e.g., fall, winter, and spring oral reading fluency norms collected in a school). All scores can be entered into Excel to generate individual student slopes (estimated rate of improvement) and mean/median slope for the group. Source: Hasbrouck, J., & Tindal, G. (2005). Oral reading fluency: 90 years of measurement. Eugene, OR: Behavioral Research & Teaching. Retrieved from
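A least-squares slope like the one Excel produces can also be computed directly. The benchmark scores below are hypothetical; the slope is expressed as score gain per week.

```python
def weekly_slope(weeks, scores):
    """Ordinary least-squares slope of scores regressed on week numbers."""
    n = len(weeks)
    mean_x = sum(weeks) / n
    mean_y = sum(scores) / n
    numerator = sum((x - mean_x) * (y - mean_y) for x, y in zip(weeks, scores))
    denominator = sum((x - mean_x) ** 2 for x in weeks)
    return numerator / denominator

# Hypothetical fall/winter/spring ORF benchmarks collected at weeks 0, 18, 36:
print(weekly_slope([0, 18, 36], [70, 88, 106]))  # 1.0 WCPM gained per week
```

Running this for every student in a norming sample, then taking the mean or median of the individual slopes, yields the local growth norm described above.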

104 How Can a Student’s Rate of Improvement (Slope) Be Analyzed to See If It is ‘Discrepant’?
There is not yet consensus in the research literature about how to set specific criteria to judge whether a student’s actual rate of academic progress is discrepant from expectations. Two possible methods to determine slope discrepancy are direct comparison of the student slope to the local or external slope estimate—with a cut-off value to signal that the student is discrepant. calculation of the standard deviation for the slopes of students in a local norming sample to determine whether a student falls at least one standard deviation below the mean slope of the group.
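The second method above (flagging a slope that falls one standard deviation or more below the group's mean slope) can be sketched as follows; the sample slopes are hypothetical.

```python
from statistics import mean, stdev

def slope_is_discrepant(student_slope, group_slopes):
    """Flag a student slope falling more than one standard deviation
    below the mean slope of the local norming sample."""
    cutoff = mean(group_slopes) - stdev(group_slopes)
    return student_slope < cutoff

# Hypothetical weekly growth slopes from a local norming sample:
group = [0.7, 0.9, 1.0, 1.1, 1.3]
print(slope_is_discrepant(0.4, group))   # True
print(slope_is_discrepant(1.0, group))   # False
```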

106 How Can a Student’s Rate of Improvement (Slope) Be Analyzed to See If It is ‘Discrepant’? (cont.)
GROWTH QUOTIENT: Direct Comparison With Research, Local Norms. The student slope can be divided by the local or external slope estimate, with a cut-off value (e.g., 0.75 or below) selected to indicate when this comparison value (quotient) is discrepant. NOTE: Any slope comparison quotient < 1.0 indicates that the student did not meet growth expectations. Example: A 4th-grade student, Darlene, was increasing reading fluency by 0.8 words per week on an intervention. Comparison external research norms (Hasbrouck & Tindal, 2005) suggest that typical rates of student growth are 0.9 words per week. Darlene’s rate of progress (0.8 words per week) divided by the research growth norm (0.9 words per week) yields a quotient of 0.89. This figure shows that Darlene has fallen below the target for growth (< 1.0) and is making progress at about 90 percent of the growth target.
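Darlene's growth quotient reduces to a one-line calculation:

```python
def growth_quotient(student_slope, norm_slope):
    """Student's weekly growth rate divided by the comparison growth norm.
    A quotient below 1.0 means the student did not meet growth expectations."""
    return student_slope / norm_slope

# Darlene: 0.8 WCPM/week actual growth vs. a 0.9 WCPM/week research norm:
print(round(growth_quotient(0.8, 0.9), 2))  # 0.89
```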

107 Direct Comparison With Research, Local Norms.
How Can a Student’s Rate of Improvement (Slope) Be Analyzed to See If It is ‘Discrepant’? (cont.) Direct Comparison With Research, Local Norms. Advantage: Calculating a quotient that divides the student’s actual rate of growth by a comparison growth norm (either external or local slope estimate) provides quick visual feedback about whether the student is exceeding, meeting, or falling behind the growth represented by the comparison slope. This quotient figure also gives a visual indication of the magnitude of the student progress relative to the comparison slope. Limitation: There are no research-generated guidelines for setting a ‘cut-point’ below which this ‘growth quotient’ figure would be seen as indicating a significantly discrepant rate of student growth.

108 Example: Direct Comparison With Research, Local Norms.
A student, Roy, was found with intervention to increase his reading fluency at a rate of 0.92 words per week. When Roy’s slope was divided by external/research norms, the resulting quotient was 1.02. When Roy’s slope was divided by local growth norms, the resulting quotient was 0.83. At a Special Education Eligibility Team meeting, the Team determined that there was not a disability but acknowledged that Roy may be seen as ‘struggling’ in the context of his high-performing classroom.

109 Setting Individual Student RTI Academic Goals Using Research Norms Jim Wright

110 Setting Individual Student RTI Academic Goals Using Research Norms
To set a goal for student academic performance, four elements are needed: The student’s baseline academic performance. Prior to starting the intervention, the teacher calculates baseline performance by assessing the target student several times with the academic measure that will be used to measure that student’s progress once the intervention begins. Estimate of ‘typical’ peer performance. The teacher has a reliable estimate of expected or typical peer performance on the academic measure that will be used to measure the target student’s progress.

111 Setting Individual Student RTI Academic Goals Using Research Norms
To set a goal for student academic performance, four elements are needed (cont.): Estimate of expected weekly progress. The teacher selects a rate of weekly academic progress that the target student is expected to attain if the intervention is successful. Number of weeks for the intervention trial. The teacher decides on how many weeks the RTI intervention will last, as the cumulative, final academic goal can be calculated only when the entire timespan of the intervention is known.

112 How to Set a Student Goal
Step 1: The teacher collects at least 3 baseline observations from the target student using alternate forms of the progress-monitoring measure (e.g., CBM oral reading fluency passages) and selects the median baseline observation to serve as the student’s starting (baseline) performance. Step 2: The teacher subtracts the student’s baseline from the estimate of typical peer performance for that grade level supplied by the research norms to calculate the academic performance gap that is to be closed during the intervention.

113 How to Set a Student Goal
Step 3: The teacher decides how many instructional weeks the intervention will be in place (e.g., 8 weeks). Step 4: The teacher selects grade-appropriate norms for academic growth per week supplied by the research norms.

114 How to Set a Student Goal
Step 5: The teacher multiplies the grade norm for weekly growth (selected in Step 4) by a figure between 1.5 and 2.0 (Shapiro, 2008). Because the original weekly growth rate represents a typical rate of student improvement, the target student’s weekly growth estimate should be adjusted upward to accelerate learning and close the gap separating that student from peers. Multiplying the original weekly growth rate by an amount ranging between 1.5 and 2.0 accomplishes this adjustment.

115 How to Set a Student Goal
Step 6: The teacher next multiplies the accelerated weekly growth figure by the total number of weeks that the intervention will be in place. This figure represents the minimum academic growth expected during the intervention. Step 7: The teacher adds the expected academic growth calculated in the previous step to the student baseline from Step 1. This figure represents the final student goal if the intervention is successful.
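The goal-setting steps above can be combined into one sketch. The probe scores, growth norm, and intervention length in the demonstration are hypothetical.

```python
from statistics import median

def set_intervention_goal(baseline_probes, weekly_growth_norm,
                          weeks, acceleration=1.5):
    """Median baseline plus (typical weekly growth x an acceleration
    multiplier of 1.5-2.0 x number of intervention weeks)."""
    baseline = median(baseline_probes)
    expected_growth = weekly_growth_norm * acceleration * weeks
    return baseline + expected_growth

# Hypothetical case: three baseline ORF probes, a 0.9 WCPM/week growth
# norm, and an 8-week intervention trial:
goal = set_intervention_goal([68, 70, 73], 0.9, 8)
print(round(goal, 1))  # 80.8: baseline of 70 plus 10.8 words of growth
```

Raising `acceleration` toward 2.0 sets a more ambitious gap-closing goal for the same intervention window.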

116 How to Set a Student Goal
Recommendations for using this approach: Research norms for student performance and academic growth are the ‘gold standard’ in goal-setting, as they provide fixed, external standards for proficiency that are not influenced by variable levels of student skill in local classrooms. When setting academic goals for struggling students, schools should use research norms whenever they are available. In particular, research norms should be used for high-stakes RTI cases that may be referred at some point to the Special Education Eligibility Team.

117 How to Set a Student Goal: Example
In December, Mrs. Chandler, a 4th-grade teacher, collected three baseline assessments of a student, Randy, in oral reading fluency using 4th-grade passages. She found that Randy’s median reading rate in these materials was 70 words per minute. Mrs. Chandler consulted research norms in oral reading fluency (Tindal, Hasbrouck & Jones, 2005) and decided that a reasonable minimum reading rate for students by winter of grade 4 (25th percentile) is 87 words per minute. Randy would need to increase his reading rate by 17 words per minute to close this academic achievement gap with peers.

118 Sample Reading Fluency Norms
Source: Tindal, G., Hasbrouck, J., & Jones, C. (2005). Oral reading fluency: 90 years of measurement [Technical report #33]. Eugene, OR: University of Oregon.

119 How to Set a Student Goal: Example (Cont.)
The reading intervention planned for Randy would last 8 instructional weeks. Mrs. Chandler consulted the research norms and noted that a typical rate of growth in reading fluency for a 4th-grade student is 0.9 additional words per week. Mrs. Chandler adjusted the 0.9-word weekly growth rate upward for Randy by multiplying it by 1.5, because she realized that he needed to accelerate his learning to catch up with peers. When adjusted upward, the weekly growth rate for Randy increased from 0.9 to 1.35 additional words per minute.

120 How to Set a Student Goal: Example (Cont.)
Multiplying the expected weekly progress of 1.35 additional words times the 8 weeks of the intervention, Mrs. Chandler found that Randy should acquire at least 11 additional words of reading fluency by the conclusion of the intervention. She added the 11 words per minute to Randy’s baseline of 70 words per minute and was able to predict that—if the 8-week intervention was successful—Randy would be able to read approximately 81 words per minute.

121 How to Set a Student Goal: Example (Cont.)
Because Randy would not be expected to fully close the gap with peers in 8 weeks, Mrs. Chandler regarded her intervention goal of 81 words per minute as an intermediate rather than a final goal. However, if the intervention was successful and the student continued to add 1.35 words per week to his reading fluency, he could be expected to reach an acceptable level of fluency soon.

122 How to Monitor a Student Off-Level
Conduct a ‘survey-level’ assessment of the student to find his or her highest ‘instructional’ level (between the 25th and 50th percentiles). The student is monitored at this off-level placement during the intervention (e.g., weekly); the goal is to move the student up to the 50th percentile. Once per month, the student is also assessed at grade level to monitor grade-appropriate performance. When the student moves above the 50th percentile at the off-level, the interventionist tests the student at the next higher level. If the student performs above the 25th percentile at that next level, monitoring starts at the new, higher level.
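One way to sketch the move-up decision rule above, assuming percentile ranks have already been computed at both the current off-level and the next higher level:

```python
def next_monitoring_level(current_level, off_level_pctile, higher_level_pctile):
    """Advance monitoring one level when the student tops the 50th percentile
    at the current off-level AND exceeds the 25th percentile on a probe at
    the next higher level; otherwise stay at the current level."""
    if off_level_pctile > 50 and higher_level_pctile > 25:
        return current_level + 1
    return current_level

print(next_monitoring_level(3, 62, 30))  # 4: both criteria met
print(next_monitoring_level(3, 62, 20))  # 3: next-level probe still too low
```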

123 Sample Reading Fluency Norms
Source: Tindal, G., Hasbrouck, J., & Jones, C. (2005). Oral reading fluency: 90 years of measurement [Technical report #33]. Eugene, OR: University of Oregon.

124 RTI Lab: Writing a Tier 2/3 Intervention Summary Report: Guidelines Jim Wright

125 Team Activity: Review a Sample Intervention Summary Report
At your table: Review the sample Intervention Summary Report for Brian Haskell, a 5th-grade student (pp. 1-4). What key points does the report contain that may support or hinder your effort to determine if the student is a ‘non-responder’ to intervention? How useful does your team find this Intervention Summary Report format to be?

126 Tier 2/3 Intervention Summary Report: Sample Introductory Paragraph

127 Writing a Tier 2/3 Intervention Summary Report…
Introductory Paragraph. This opening section presents essential background information about the RTI case, including: Current grade level of the student Information about how the student was identified for supplemental RTI intervention (e.g., student performance on fall/winter/spring academic screening placing them in the ‘at risk’ range) [If the student received an ‘off-level’ supplemental intervention] information about the grade level selected for the intervention and progress-monitoring

128 Tier 2/3 Intervention Summary Report: Sample Intervention Summary Paragraph/Section

129 Sample Reading Fluency Norms
Source: Tindal, G., Hasbrouck, J., & Jones, C. (2005). Oral reading fluency: 90 years of measurement [Technical report #33]. Eugene, OR: University of Oregon.

130 Writing a Tier 2/3 Intervention Summary Report…
Intervention Summary Paragraphs. This section provides summary information about each intervention plan or ‘trial’. A separate paragraph or section is written for each intervention plan/trial. Every intervention summary should include: • Name and brief description of the instructional program(s) or practices that make up the intervention.

131 Writing a Tier 2/3 Intervention Summary Report…
Intervention Summary Paragraphs (Cont.). Details about intervention delivery, such as: Start and end dates of the intervention Total number of instructional weeks of the intervention plan Number of sessions per week Length of each intervention session Intervention group size

132 Writing a Tier 2/3 Intervention Summary Report…
Intervention Summary Paragraphs (Cont.). Details about the student’s progress, such as: Student baseline at the start of the intervention. TIP: When a student is starting an intervention plan and was previously monitored on an earlier, recent intervention plan, baseline can easily be computed for the new intervention by selecting the median value from the last three progress-monitoring data points collected during the previous intervention.
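The baseline tip above, taking the median of the last three progress-monitoring points from the prior intervention, might look like this with hypothetical data:

```python
from statistics import median

# Hypothetical WCPM progress-monitoring series from the prior intervention:
prior_monitoring = [62, 65, 63, 68, 66, 71, 69]

# New baseline = median of the last three data points collected:
new_baseline = median(prior_monitoring[-3:])
print(new_baseline)  # 69
```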

133 Writing a Tier 2/3 Intervention Summary Report…
Intervention Summary Paragraphs (Cont.). Details about the student’s progress, such as: Weekly ‘accelerated’ goal for student improvement. This ‘accelerated’ goal can be calculated by taking a research-based estimate of typical weekly student progress at a grade level and multiplying that typical growth rate by a figure between 1.5 and 2.0 (Shapiro, 2008). Cumulative performance goal at the end of the intervention. This end-goal is computed by multiplying the weekly accelerated goal by the number of weeks that the intervention will take place. That product is then added to the student’s baseline performance to compute the cumulative goal.

134 Writing a Tier 2/3 Intervention Summary Report…
Intervention Summary Paragraphs (Cont.). Details about the student’s progress, such as: Typical weekly growth rates in the academic skill and grade level in which the student is receiving a supplemental intervention. These ‘typical’ weekly growth rates are usually derived from research norms. Comparison of actual student performance to goals and norms. The student’s actual weekly growth rate is compared to both the accelerated weekly goal and the typical peer rate of weekly progress. Additionally, the student’s actual cumulative progress during the intervention is compared to the original cumulative goal to determine if the intervention was successful.

135 Writing a Tier 2/3 Intervention Summary Report…
Intervention Summary Paragraphs (Cont.). Information about intervention integrity. The total possible number of intervention sessions available to the student is computed as the number of sessions per week multiplied by the number of weeks of the intervention. The number of intervention sessions actually attended by the student is also presented. The number of sessions attended is divided by the total number of possible sessions; this decimal is then multiplied by 100 to yield a percentage of ‘intervention integrity’. Intervention-integrity figures should exceed 80 percent.
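The integrity calculation above reduces to one formula; the session counts in the demonstration are illustrative.

```python
def intervention_integrity(sessions_attended, sessions_per_week, weeks):
    """Sessions attended as a percentage of all scheduled sessions."""
    possible_sessions = sessions_per_week * weeks
    return 100.0 * sessions_attended / possible_sessions

# A student attending 20 of 24 scheduled sessions (3/week for 8 weeks):
print(round(intervention_integrity(20, 3, 8), 1))  # 83.3 -- exceeds the 80% bar
```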

136 Writing a Tier 2/3 Intervention Summary Report…
Intervention Summary Paragraphs (Cont.). Number of data points collected to monitor student progress during the intervention. For Tier 2 interventions, monitoring information should be collected at least every two weeks. Tier 3 interventions should be monitored at least weekly.

137 Tier 2/3 Intervention Summary Report: Sample Intervention Series Analysis Section

138 Tier 2/3 Intervention Summary Report: Sample Intervention Series Analysis Section

139 Writing a Tier 2/3 Intervention Summary Report…
Intervention Series Analysis: This concluding section summarizes the findings of the several intervention trials and reaches a conclusion about whether the student was adequately responsive to general-education Tier 2/3 interventions. The section includes: An analysis of whether the student hit the accelerated goal(s) for each of the interventions discussed.

140 Writing a Tier 2/3 Intervention Summary Report…
Intervention Series Analysis (Cont.): An analysis of whether, during each intervention, the student exceeded, met, or fell below the typical peer growth norms at the grade level of the supplemental intervention. This information will be useful in determining whether a student has a significant discrepancy in academic growth compared to typical peer growth norms.

141 Writing a Tier 2/3 Intervention Summary Report…
Intervention Series Analysis (Cont.): Summary of the student’s performance at his or her current grade placement on recent schoolwide academic screenings. Ideally, the student’s screening results are presented with corresponding percentile rankings. Students who continue to perform below the 10th percentile on school screenings at their grade of record despite several intensive RTI interventions demonstrate that the interventions have failed to generalize to significant improvements in classroom academic skills. Students meeting this profile can be considered to have a severe discrepancy in academic skills.

142 Writing a Tier 2/3 Intervention Summary Report…
Intervention Series Analysis (Cont.): Conclusion about the student’s ‘response status’. Based on the student’s response to intervention across the full intervention history, the report reaches a conclusion about whether the student meets criteria as a ‘non-responder’ to intervention.

143 Team Activity: How Can You Create Intervention Summary Reports in Your District?
At your table: Review the outline, example, and recommendations provided at today’s workshop for writing an Intervention Summary Report. Discuss how you might develop a process in your own district to create high-quality Intervention Summary Reports for use by the Special Education Eligibility Team and others.

144 RTI & Special Education Eligibility: ‘Challenge’ Scenarios Jim Wright www.interventioncentral.org

145 RTI & Special Education Eligibility: ‘Challenge’ Scenarios
The school presents clear information suggesting that the student has not responded adequately to general-education reading interventions. The school believes that the student should be designated Learning Disabled. In your review of records, however, you note that the student passed the most recent state ELA test. How might you respond to this issue at the meeting?

146 RTI & Special Education Eligibility: ‘Challenge’ Scenarios
When information is presented from the school on the student’s response to a series of math intervention programs, you see that they have tried 3 interventions at Tiers 2 and 3. However, there is little information about the second of the 3 interventions, including what research-based strategies were used or what progress-monitoring data were collected. The intervention is instead described only as ‘the student saw the math intervention teacher for 30 minutes, 3 times per week.’ How might you respond to this issue at the meeting?

147 RTI & Special Education Eligibility: ‘Challenge’ Scenarios
A student is referred to Special Education because she has apparently failed to respond adequately to the school’s Tier 2 and 3 interventions. As you review the student’s RTI information, you realize that the student fell within the average range on the external reading fluency performance norms, even though the classroom teacher has made an emphatic case that the student has severely limited reading skills in the classroom. How might you respond to this issue at the meeting?

148 Planning Your District’s ‘Next Steps’ in Developing RTI Decision Rules: Activity
At your table, review the content discussed at today’s workshop: Dual Discrepancy: Determining discrepant performance and rate of growth RTI ‘Non-Responder’ Checklist Intervention integrity Intervention Summary Report Decide on at least 3 key ‘next steps’ that your district should adopt to move forward in developing RTI ‘decision rules’ for CSE. Be prepared to share the main points of your discussion with the large group.

