Determining Eligibility for Special Education in an RTI System


1 Determining Eligibility for Special Education in an RTI System
Joseph F. Kovaleski, D.Ed., NCSP Indiana University of PA Indiana, PA Caitlin S. Flinn, M.Ed., NCSP Exeter Township School District Reading, PA

2 Acknowledgements This presentation is based on a training module developed in collaboration with the Pennsylvania Training and Technical Assistance Network (PaTTAN) as part of the RTI Pilot Project. Amy Smith, Ed Shapiro, and other PaTTAN consultants contributed to the development of these materials. Thanks to Andrew McCrea for contributing to the development of the Rate of Improvement slides.

3 Learning Objectives Participants will:
Identify assessment procedures for RTI that are embedded in a three-tier model of service delivery
Graph and calculate rate of improvement data
Articulate how RTI is used in the procedure to determine eligibility for special education
Conceptualize new report writing language for composing evaluation reports in an RTI model

4 Today’s Perspective Assume knowledge of RTI and the three-tier model.
Determining eligibility for special education using RTI presupposes that the RTI infrastructure has been built. This session is about using RTI as an alternative to ability-achievement discrepancy, not in addition to it. The perspective will be based on law/regulations and best practices.

5 Most relevant for those ready to use RTI.
Some aspects of today’s presentation are relevant to the SLD requirements, even if you’re not using RTI. Application of some procedures and principles can begin now as effective practices.

6 Response to Intervention
Standards-aligned core instruction
Universal screening
Interventions of increasing intensity
Research-based practices
Progress monitoring
Data analysis teaming
Parental engagement

7 Specific Learning Disability
1. Inclusionary: Failure to meet age- or grade-level State standards in one of eight areas: oral expression, listening comprehension, written expression, basic reading skill, reading fluency skill, reading comprehension, mathematics calculation, mathematics problem solving.
2. Inclusionary: Either Discrepancy (a pattern of strengths and weaknesses, relative to intellectual ability as defined by a severe discrepancy between intellectual ability and achievement, or relative to age or grade) OR RTI (lack of progress in response to scientifically based instruction).
3. Exclusionary: Rule out vision, hearing, or motor problems; mental retardation; emotional disturbance; cultural and/or environmental issues; limited English proficiency.
4. Exclusionary: Rule out lack of instruction by documenting appropriate instruction by qualified personnel and repeated assessments.
(Plus: observation.)

8 Inclusionary Criteria
Criterion #1: Does the child achieve adequately for the child's age or meet State-approved grade-level standards?
The group may determine the child has an SLD if the child does not achieve adequately for the child's age or to meet State-approved grade-level standards in one or more of the following areas, when provided with learning experiences and instruction appropriate for the child's age or State-approved grade-level standards:
(i) Oral expression (ii) Listening comprehension (iii) Written expression (iv) Basic reading skill (v) Reading fluency skills (vi) Reading comprehension (vii) Mathematics calculation (viii) Mathematics problem solving
Inclusionary Criteria § (a)

9 Specific Learning Disability
(Overview diagram repeated; see slide 7.)

10 Sources of Data to Document Lack of Achievement
Existing data: performance on benchmark assessments; terminal performance on progress monitoring measures; performance on statewide and district-wide assessments.
New data to collect (if necessary): norm-referenced tests of academic achievement; curriculum-based evaluation (cf. Howell et al.).

11 Lack of achievement is in relation to age or grade-level standards.
The student’s assessed achievement on all measures should be significantly behind age- or grade-peers. Measures should be reflective of state standards. Achievement here is related to age or grade, not intellectual level.

12 Normative Comparisons
The choice of normative group is an important decision. National normative data sets for CBM include AIMSweb, Hasbrouck & Tindal, and DIBELS.

13 Who sets the parameters for being 'deficient'?
How deficient must a student be in order to demonstrate inadequate performance/achievement? It is the responsibility of individual school districts to establish or define appropriate assessment parameters.

14 How deficient should a student be to qualify? An opinion…
Contemporary research has indicated that a score at the 30th percentile on nationally normed benchmark tests or individual tests of academic achievement is equivalent to a proficient score on most statewide tests. Therefore, to demonstrate inadequate achievement relative to this standard, a student should be significantly below this level (e.g., 10th percentile) to meet the SLD qualification under this component.

15 2.0X calculation Divide norm group mean by student’s score
The result is expressed as a ratio of deficiency. Example: 100 wpm / 50 wpm = 2.0X
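As an illustration only (not part of the original training materials), the same ratio can be expressed in a few lines of Python; the function name is hypothetical:

def deficiency_ratio(norm_mean, student_score):
    # How many times deficient the student is relative to the norm group mean.
    return norm_mean / student_score

# Example from this slide: norm group at 100 wpm, student at 50 wpm.
print(f"{deficiency_ratio(100, 50):.1f}X")  # prints 2.0X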

16 DIBELS benchmarks (with ROI in parentheses based on 18 weeks between benchmarks, 36 total weeks):
K – ISF (0.9)
K – PSF 35 (1.0)
K – NWF 25 (0.7)
1 – NWF 50 (1.4)
1 – ORF 40 (1.1)
2 – ORF 90 (1.3)
3 – ORF 110 (0.9)
4 – ORF 118 (0.7)
5 – ORF 124 (0.6)
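As a rough illustration of where a parenthetical ROI can come from (an assumed interpretation, not stated on the slide): taking the 3rd-grade ORF target of 110 as the end-of-year benchmark and 77 as the fall benchmark (the value used in the Excel exercise later in this session), the expected weekly growth over 36 weeks works out to roughly the 0.9 shown above:

# Sketch: expected weekly ROI implied by two benchmarks 36 weeks apart (assumed interpretation).
fall_benchmark, spring_benchmark, weeks = 77, 110, 36
print(round((spring_benchmark - fall_benchmark) / weeks, 2))  # ~0.92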

17 Consider John, a third grader
Consider John, a third grader. We'll compare his scores (denominators) with the scores of the norm group (numerators), using the 3rd grade norms for ORF and the 1st grade norms for NWF. ORF: 110 wpm / 55 wpm = 2.0X. NWF: 50 nwpm / 20 nwpm = 2.5X.

18 May we use norm-referenced tests of academic achievement in determining the extent of the deficiency? May we? Yes! There is nothing legally that prevents a team from doing so. Should we? It depends on how secure you are with other data regarding the student's deficiency in relation to standards. If you have a preponderance of other data, you may choose not to use other norm-referenced measures. If you don't, or if there are other questions that can be answered with norm-referenced measures, use them.

19 Example of report language:
Documentation of Deficiency in Level of Performance
John has displayed documented deficiencies in reading skills since kindergarten. He has been at the below basic level on district-wide and statewide tests. His most recent universal screening using DIBELS (January) indicated an oral reading fluency score of 55 words per minute. Compared to typical peers for John's age and grade level (110 wpm), John's deficiency ratio is 2.0X. The Nonsense Word Fluency subtest of DIBELS was also administered. John attained a score of 20 nonsense words per minute on the subtest. Compared to the terminal score achieved by first-graders (50 nwpm), John has a deficiency ratio of 2.5X. Progress monitoring of John's oral reading fluency has indicated that John continues to have difficulty reading in spite of intensive intervention; his terminal score during the last week of March was 53 words per minute for oral reading fluency. John also attained a 20% accuracy rate on the 4Sight test, which is considerably below the 80% mark that is typically attained by students in his grade.

20 Implications to consider
The student's IQ level is not considered the criterion against which the student's academic performance is compared. Students with intelligence levels in the 'slow learner' range may not be excluded from having SLD if they display significantly inadequate academic achievement and if they meet the other criteria (e.g., RTI). Conversely, students with high levels of intelligence must display inadequacies in relation to their age or the state standards for their grade in order to meet this criterion.

21 Criterion #2: Does the child demonstrate a pattern of strengths and weaknesses or a lack of progress in response to scientifically based instruction?
(i) The child does not make sufficient progress to meet age or State-approved grade-level standards in one or more of the areas identified ... when using a process based on the child's response to scientific, research-based intervention; or
(ii) The child exhibits a pattern of strengths and weaknesses in performance, achievement, or both, relative to age, State-approved grade-level standards, or intellectual development, that is determined by the group to be relevant to the identification of a specific learning disability, using appropriate assessments, consistent with §§ and

22 Specific Learning Disability
(Overview diagram repeated; see slide 7.)

23 Overview of RoI
Define rate of improvement (RoI)
Review the importance of RoI within the context of RtI
Establish a need for consistency when graphing and calculating RoI
Model how to graph and calculate RoI in Excel

24 With Progress Monitoring Data…
How do we know if a student is learning?
Look at the data points: Where are they on the graph? Are the data points getting closer to the goal or benchmark?
Is there a way to measure growth? Make an aimline toward the goal, look to see where the data points are compared to the aimline, and calculate the rate of improvement.

25 RoI Definition
Rate of Improvement can be described algebraically as the slope of a line. Slope is defined as the vertical change over the horizontal change on a Cartesian plane (an x-axis and y-axis graph), also called rise over run. Formula: m = (y2 - y1) / (x2 - x1). Slope describes the steepness of a line (Gall & Gall, 2007).

26 RoI Definition
Finding a student's RoI is determining the student's learning: creating a line that fits the data points, a trendline. To find that line, we use linear regression (ordinary least squares).
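To make the idea concrete, here is a minimal ordinary-least-squares slope in plain Python; this is a sketch for illustration (the function name is ours), and it computes the same quantity as Excel's SLOPE function used later in this session:

def ols_slope(weeks, scores):
    # Ordinary least squares slope: average change in score per week.
    n = len(weeks)
    mean_x = sum(weeks) / n
    mean_y = sum(scores) / n
    numerator = sum((x - mean_x) * (y - mean_y) for x, y in zip(weeks, scores))
    denominator = sum((x - mean_x) ** 2 for x in weeks)
    return numerator / denominator

# Example: probes at weeks 1, 2, and 3 scoring 40, 43, and 47 wpm.
print(round(ols_slope([1, 2, 3], [40, 43, 47]), 2))  # 3.5 wpm per week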

27 Progress Monitoring Frequent measurement of knowledge to inform our understanding of the impact of instruction/intervention. Measures of basic skills (CBM) have demonstrated reliability & validity (see table at

28 Classroom Instruction (Content Expectations)
The flowchart on this slide: classroom instruction (content expectations) leads to measuring impact (a test). If the student is proficient, instruction continues. If the student is not proficient, use a diagnostic test to differentiate a content need from a basic skill need. A content need leads to intervention with progress monitoring; a basic skill need leads to intervention with progress monitoring using CBM (if CBM is appropriate) and measuring the rate of improvement. (McCrea, 2010)

29 So… Rate of Improvement (RoI) is how we understand student growth (learning). RoI is reliable and valid (psychometrically speaking) for use with CBM data. RoI is best used when we have CBM data, most often when dealing with basic skills in reading/writing/math. RoI can be applied to other data (like behavior) with confidence too! RoI is not yet tested on typical Tier I formative classroom data.

30 RoI is usually applied to…
Tier One students in the early grades at risk for academic failure ('low green' kids)
Tier Two and Three intervention groups
Special education students (and IEP goals)
Students with behavior plans

31 RoI Foundations Deno, 1985 Curriculum-based measurement
General outcome measures
Technically adequate
Short
Standardized
Repeatable
Sensitive to change

32 RoI Foundations Fuchs & Fuchs, 1998
Hallmark components of Response to Intervention: ongoing formative assessment, identifying non-responding students, and treatment fidelity of instruction.
Dual discrepancy model: one standard deviation from typically performing peers in level and rate.

33 RoI Foundations Ardoin & Christ, 2008
Slope for benchmarks (3x per year): more growth from fall to winter than winter to spring. It might be helpful to use one RoI for fall to winter and a separate RoI for winter to spring.

34 RoI Foundations Fuchs, Fuchs, Hamlett, Walz, & Germann, 1993
Typical weekly growth rates in oral reading fluency and digits correct, and the growth needed to remediate skills. Students who had 1.5 to 2.0 times the slope of typically performing peers were able to close the achievement gap in a reasonable amount of time.

35 RoI Foundations Deno, Fuchs, Marston, & Shin, 2001
The slope of frequently non-responsive children approximated the slope of children already identified as having a specific learning disability.

36 How many data points? 10 data points are a minimum requirement for a reliable trendline (Gall & Gall, 2007) Is that reasonable and realistic? How does that affect the frequency of administering progress monitoring probes? How does that affect our ability to make instructional decisions for students?

37 How can we show RoI? Speeches that included visuals, especially in color, improved recall of information (Vogel, Dickson, & Lehman, 1990) “Seeing is believing.” Useful for communicating large amounts of information quickly “A picture is worth a thousand words.” Transcends language barriers (Karwowski, 2006) Responsibility for accurate graphical representations of data (Flinn, 2008)

38 Skills for Which We Compute RoI
Reading: Oral Reading Fluency, Word Use Fluency
Reading Comprehension: MAZE/DAZE, Retell, Word Use
Early Literacy Skills: Initial Sound, Letter Naming, Letter Sound, Phoneme Segmentation, Nonsense Word
Spelling
Written Expression: TWW, CWS, WSC
Math: Math Computation, Math Concepts, Math Facts
Early Numeracy: Oral Counting, Missing Number, Number Identification, Quantity Discrimination
Behavior

39 Guidelines? Visual inspection of slope Multiple interpretations
Instructional services Need for explicit guidelines

40 Ongoing Research
Using RoI for instructional decisions is not a perfect process. Research is currently addressing sources of error: Christ, 2006 (standard error of measurement for slope); Ardoin & Christ, 2009 (passage difficulty and variability); Jenkins, Graff, & Miglioretti, 2009 (frequency of progress monitoring).

41 Future Considerations
Questions yet to be empirically answered What parameters of RoI indicate a lack of RtI? How does standard error of measurement play into using RoI for instructional decision making? How does RoI vary between standard protocol interventions? How does this apply to non-English speaking populations?

42 Multiple Methods for Calculating Growth
Visual Inspection Approaches: “Eye Ball” Approach; Split Middle Approach
Quantitative Approaches: Tukey Method; Last point minus First point Approach; Split Middle “plus”; Linear Regression Approach

43 The Visual Inspection Approaches

44 Eye Ball Approach

45 Split Middle Approach Drawing “through the two points obtained from the median data values and the median days when the data are divided into two sections” (Shinn, Good, & Stein, 1989)

46 Split Middle (graph with marked points: X(83), X(63), X(9))

47 The Quantitative Approaches

48 Tukey Method Divide scores into 3 equal groups
Divide the groups with vertical lines. In the 1st and 3rd groups, find the median data point and median week and mark each with an “X”. Draw a line between the two “Xs”. (Fuchs et al., Summer Institute: Student progress monitoring for math)

49 Tukey Method (graph with marked points: X(74), X(62))

50 Calculating Slope: Tukey Method
3rd-section median point minus the 1st-section median point, divided by the number of data points minus one: (74 - 62) / (11 - 1) = 12/10 = 1.2
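A small Python sketch of the arithmetic described above (an illustration only; it splits 11 points as 3/5/3 and divides by the number of data points minus one, which reproduces the 1.2 on this slide):

from statistics import median

def tukey_slope(scores):
    # Median of the last third minus median of the first third,
    # divided by the number of data points minus one (as on this slide).
    n = len(scores)
    third = n // 3
    return (median(scores[-third:]) - median(scores[:third])) / (n - 1)

scores = [41, 62, 63, 75, 64, 80, 83, 83, 56, 104, 74]  # sample data used later in this session
print(round(tukey_slope(scores), 1))  # (74 - 62) / 10 = 1.2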

51 Last minus First
IRIS Center: the last probe score minus the first probe score, over the last administration period minus the first administration period. (Y2 - Y1) / (X2 - X1) = RoI

52 Last minus First

53 Last Minus First
(Y2 - Y1) / (X2 - X1) = RoI: (74 - 41) / (18 - 1) = 33/17 = 1.9

54 Split Middle “Plus” (graph with marked points: X(83), X(63), X(9))

55 Split Middle “Plus”
(Y2 - Y1) / (X2 - X1) = RoI: (83 - 63) / 9 = 20/9 = 2.2

56 Linear Regression

57 RoI Consistency?
Any method of visual inspection: ???
Last minus First: 1.9
Tukey Method: 1.2
Split Middle “Plus”: 2.2
Linear Regression: 2.5
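For illustration, the three quantitative values in this table can be reproduced on the sample progress-monitoring data used in the Excel exercise later in this session, reusing the ols_slope and tukey_slope sketches above (weeks without a probe are simply omitted):

weeks  = [1, 8, 9, 10, 11, 12, 13, 14, 15, 17, 18]
scores = [41, 62, 63, 75, 64, 80, 83, 83, 56, 104, 74]

last_minus_first = (scores[-1] - scores[0]) / (weeks[-1] - weeks[0])  # ~1.9
tukey = tukey_slope(scores)                                           # 1.2
regression = ols_slope(weeks, scores)                                 # ~2.5
print(round(last_minus_first, 1), round(tukey, 1), round(regression, 1))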

58 RoI Consistency? If we are not all using the same model to compute RoI, we continue to have the same problems as past models, where under one approach a student meets SLD criteria, but under a different approach, the student does not. Without a consensus on how to compute RoI, we risk falling short of having technical adequacy within our model.

59 So, Why Are There So Many Other RoI Models?
Ease of application Focus on Yes/No to goal acquisition, not degree of growth How many of us want to calculate OLS Linear Regression formulas (or even remember how)?

60 Literature shows that Linear Regression is Best Practice
Student’s daily test scores…were entered into a computer program…The data analysis program generated slopes of improvement for each level using an Ordinary-Least Squares procedure (Hayes, 1973) and the line of best fit. This procedure has been demonstrated to represent CBM achievement data validly within individual treatment phases (Marston, 1988; Shinn, Good, & Stein, in press; Stein, 1987). Shinn, Gleason, & Tindal, 1989

61 Growth (RoI) Research using Linear Regression
Christ, T. J. (2006). Short-term estimates of growth using curriculum based measurement of oral reading fluency: Estimating standard error of the slope to construct confidence intervals. School Psychology Review, 35, Deno, S. L., Fuchs, L. S., Marston, D., & Shin, J. (2001). Using curriculum based measurement to establish growth standards for students with learning disabilities. School Psychology Review, 30, Good, R. H. (1990). Forecasting accuracy of slope estimates for reading curriculum based measurement: Empirical evidence. Behavioral Assessment, 12, Fuchs, L. S., Fuchs, D., Hamlett, C. L., Walz, L. & Germann, G. (1993). Formative evaluation of academic progress: How much growth can we expect? School Psychology Review, 22,

62 Growth (RoI) Research using Linear Regression
Jenkins, J. R., Graff, J. J., & Miglioretti, D.L. (2009). Estimating reading growth using intermittent CBM progress monitoring. Exceptional Children, 75, Shinn, M. R., Gleason, M. M., & Tindal, G. (1989). Varying the difficulty of testing materials: Implications for curriculum-based measurement. The Journal of Special Education, 23, Shinn, M. R., Good, R. H., & Stein, S. (1989). Summarizing trend in student achievement: A comparison of methods. School Psychology Review, 18,

63 Incorporating Research
More growth from fall to winter than winter to spring for benchmarks (3x per year): Christ & Ardoin (2008); Christ, Yeo, & Silberglitt (in press); Fien, Park, Smith, & Baker (2010)
More growth from winter to spring than fall to winter: Graney, Missall, & Martinez (2009)

64 Actual Student Data & Benchmark 3rd grade DIBELS ORF
(Two example graphs: student slope = 2.5 versus benchmark ROI = 0.88, and student slope = 1.89 versus benchmark ROI = 1.06.)

65 McCrea (2010) Looked at Rate of Improvement in small 2nd grade sample
Found differences in RoI when computed separately for fall and spring: Ave RoI for fall: WCPM; Ave RoI for spring: 1.21 WCPM. (Unpublished data)

66 DIBELS (6th Ed.) ORF Change in Criteria
Fall to Winter / Winter to Spring:
2nd: 24 / 22
3rd: 15 / 18
4th: 13
5th: 11 / 9
6th: 5

67 AIMSweb Norms (based on the 50th percentile)
Fall to Winter / Winter to Spring:
1st: 18 / 31
2nd: 25 / 17
3rd: 22 / 15
4th: 16 / 13
5th:
6th: 12

68 Speculation as to why Differences in RoI within the Year
Relax instruction after high-stakes testing in March/April; a state test effect.
Depressed BOY benchmark scores due to summer break; a rebound effect (Clemens).
Instructional variables could explain differences between the Graney (2009) and Ardoin (2008) & Christ (in press) results (Silberglitt).
Variability within progress monitoring probes (Ardoin & Christ, 2008) (Lent).

69 Get Out Your Laptops! Open Microsoft Excel. (I love ROI)

70 Graphing RoI For Individual Students
Programming Microsoft Excel to Graph Rate of Improvement: Fall to Winter

71 Setting Up Your Spreadsheet
In cell A1, type 3rd Grade ORF
In cell A2, type First Semester
In cell A3, type School Week
In cell A4, type Benchmark
In cell A5, type the student's name (Swiper, in this example)

72 Labeling School Weeks Starting with cell B3, type numbers 1 through 18 going across row 3 (horizontal). Numbers 1 through 18 represent the number of the school week. You will end with week 18 in cell S3.

73 Labeling Dates Note: You may choose to enter the date of that school week across row 2 to easily identify the school week.

74 Entering Benchmarks (3rd Grade ORF)
In cell B4, type 77. This is your fall benchmark. In cell S4, type 92. This is your winter benchmark.

75 Entering Student Data (Sample)
Enter the following numbers, going across row 5, under corresponding week numbers. Week 1 – 41 Week 8 – 62 Week 9 – 63 Week 10 – 75 Week 11 – 64 Week 12 – 80 Week 13 – 83 Week 14 – 83 Week 15 – 56 Week 17 – 104 Week 18 – 74

76 *CAUTION* If a student was not assessed during a certain week, leave that cell blank. Do not enter a score of zero (0); it will be calculated into the trendline and interpreted as the student having read zero words correct per minute during that week.

77 Graphing the Data Highlight cells A4 and A5 through S4 and S5
Follow the Excel 2003 or Excel 2007 directions from here

78 Graphing the Data
Excel 2003: Across the top of your worksheet, click on “Insert”; in that drop-down menu, click on “Chart”.
Excel 2007: Click Insert, find the icon for Line, and click the arrow below Line.

79 Graphing the Data
Excel 2003: A Chart Wizard window will appear.
Excel 2007: Six graphics appear.

80 Graphing the Data
Excel 2003: Choose “Line”, then choose “Line with markers…”
Excel 2007: Choose “Line with markers”

81 Graphing the Data
Excel 2003: On the “Data Range” tab, choose “Columns”.
Excel 2007: Your graph appears.

82 Graphing the Data
Excel 2003: Enter the “Chart Title”, “School Week” for the X axis, and “WPM” for the Y axis.
Excel 2007: To change your graph labels, click on your graph; your options appear at the top; click on one of the Chart Layouts.

83 Graphing the Data
Excel 2003: Choose where you want your graph.
Excel 2007: Your chosen layout is applied to the graph; you can click on the labels to change them.

84 Graphing the Trendline
Excel 2003: Right-click on any of the student data points. Excel 2007: (screenshot)

85 Graphing the Trendline
Excel 2003: Choose “Linear”. Excel 2007: (screenshot)

86 Graphing the Trendline
Excel 2003: Choose “Custom” and check the box next to “Display equation on chart”. Excel 2007: (screenshot)

87 Graphing the Trendline
Clicking on the equation highlights a box around it Clicking on the box allows you to move it to a place where you can see it better

88 Graphing the Trendline
You can repeat the same procedure to add a trendline for the benchmark data points. Suggestion: label this trendline Expected ROI and move its equation under the first.

89 Individual Student Graph: Fall to Winter

90 Individual Student Graph
The equation indicates the slope, or rate of improvement. The number, or coefficient, before "x" is the average improvement, which in this case is the average number of words per minute per week gained by the student.

91 Individual Student Graph
The rate of improvement, or trendline, is calculated using linear regression, a simple least-squares equation. To add additional progress monitoring or benchmark scores once you've already created a graph, enter the additional scores in Row 5 under the corresponding school week.
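For readers who prefer to script the chart rather than build it by hand, a rough Python/matplotlib equivalent of the graph described above might look like the following (a sketch only, using the sample weeks, scores, and benchmarks from this exercise):

import numpy as np
import matplotlib.pyplot as plt

weeks  = [1, 8, 9, 10, 11, 12, 13, 14, 15, 17, 18]
scores = [41, 62, 63, 75, 64, 80, 83, 83, 56, 104, 74]
bench_weeks, bench_scores = [1, 18], [77, 92]  # fall and winter benchmarks

slope, intercept = np.polyfit(weeks, scores, 1)          # least-squares trendline
exp_slope, _ = np.polyfit(bench_weeks, bench_scores, 1)  # expected (benchmark) RoI

plt.plot(weeks, scores, "o-", label=f"Student (RoI = {slope:.2f} wpm/week)")
plt.plot(bench_weeks, bench_scores, "s--", label=f"Benchmark (RoI = {exp_slope:.2f} wpm/week)")
plt.plot(weeks, np.polyval([slope, intercept], weeks), label="Student trendline")
plt.xlabel("School Week")
plt.ylabel("WPM")
plt.title("3rd Grade ORF, First Semester")
plt.legend()
plt.show()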

92 Individual Student Graph
The slope can change depending on which week (where) you put the benchmark scores on your chart. Enter benchmark scores based on when your school administers their benchmark assessments for the most accurate depiction of expected student progress.

93 Programming Excel First Semester
Calculating Needed RoI Calculating Benchmark RoI Calculating Student’s Actual RoI

94 Quick Definitions
Needed RoI: the rate of improvement needed to “catch up” to the next benchmark.
Benchmark RoI: the rate of improvement of typically performing peers, according to the norms.
Student's Actual RoI: based on the available data points, the student's actual rate of improvement per week.

95 Calculating Needed RoI
In cell T3, type Needed RoI
Click on cell T5
In the fx line (at the top of the sheet), type this formula: =((S4-B5)/18)
Then hit Enter. Your result should read approximately 2.83.
This formula simply subtracts the student's actual beginning-of-year (BOY) score from the expected middle-of-year (MOY) benchmark, then divides by 18 for the first 18 weeks (1st semester).

96 Calculating Benchmark RoI
In cell U3, type Benchmark RoI
Click on cell U4
In the fx line (at the top of the sheet), type this formula: =SLOPE(B4:S4,B3:S3)
Then hit Enter. Your result should read approximately 0.88.
This formula considers the 18 weeks of benchmark data and provides an average growth or change per week.

97 Calculating Student Actual RoI
Click on cell U5
In the fx line (at the top of the sheet), type this formula: =SLOPE(B5:S5,B3:S3)
Then hit Enter. Your result should read approximately 2.51.
This formula considers the 18 weeks of student data and provides an average growth or change per week.
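As an optional cross-check (an illustration, not part of the original module), the three values can be mirrored in Python using the sample data entered earlier; weeks with no probe are simply left out, mirroring the blank cells:

import numpy as np

weeks  = [1, 8, 9, 10, 11, 12, 13, 14, 15, 17, 18]   # weeks with a probe; blanks omitted
scores = [41, 62, 63, 75, 64, 80, 83, 83, 56, 104, 74]
fall_benchmark, winter_benchmark = 77, 92

needed_roi = (winter_benchmark - scores[0]) / 18                # like =((S4-B5)/18)     -> ~2.83
benchmark_roi = (winter_benchmark - fall_benchmark) / (18 - 1)  # like =SLOPE(B4:S4,...) -> ~0.88
actual_roi = np.polyfit(weeks, scores, 1)[0]                    # like =SLOPE(B5:S5,...) -> ~2.51
print(round(needed_roi, 2), round(benchmark_roi, 2), round(actual_roi, 2))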

98 Making Decisions: Best Practice
Research has yet to establish a blueprint for 'grounding' student RoI data. At this point, teams should consider multiple comparisons when planning and making decisions: national user norms (AIMSweb, DIBELS) and local norms (district, grade level, school building).

99 Looking at Percent of Expected Growth
Decision grid comparing percent of expected growth across Tier I, Tier II, and Tier III. Growth bands: greater than 150%; between 110% & 150%; between 95% & 110%; between 80% & 95%; below 80%. Decision labels in the grid include Possible LD, Likely LD, May Need More, and Needs More. (Tigard-Tualatin School District)

100 Making Decisions: Lessons From the Field
When tracking on grade level, consider an RoI that is 100% of expected growth as a minimum requirement, and consider an RoI that is at or above the needed RoI as optimal. So, 100% of expected growth and parity with needed growth become the limits of the range within which a student should be achieving.
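One way to express that guideline in code, as a sketch only (the function name and return format are ours; the comparisons are the ones stated above):

def growth_check(actual_roi, benchmark_roi, needed_roi):
    # Compare a student's actual RoI to expected (benchmark) growth and needed growth.
    pct_of_expected = actual_roi / benchmark_roi * 100
    meets_minimum = pct_of_expected >= 100          # at least 100% of expected growth
    at_or_above_needed = actual_roi >= needed_roi   # optimal: on pace for the benchmark
    return pct_of_expected, meets_minimum, at_or_above_needed

# Using the sample values from the Excel exercise above.
print(growth_check(actual_roi=2.51, benchmark_roi=0.88, needed_roi=2.83))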

101 What about Students Not on Grade Level?
Determining instructional level: independent / instructional / frustrational. The instructional level is often between the 40th or 50th percentile and the 25th percentile; the frustrational level is below the 25th percentile. AIMSweb: Survey Level Assessment (SLA).

102 Setting Goals off of Grade Level
100% of expected growth is not enough: needed growth only gets the student to the instructional-level benchmark, not to grade level, so there is a risk of not being ambitious enough. There are plenty of ideas, but limited research, regarding best practice in goal setting off of grade level (see the Shapiro chapter in Best Practices V).

103 Possible Solution (A) Weekly probe at instructional level and compare to expected and needed growth rates at instructional level. Ambitious goal: 200% of expected RoI (twice the expected RoI)

104 Possible Solution (B) Weekly probe at instructional level for sensitive indicator of growth. Monthly probes (give 3, not just 1) at grade level to compute RoI. Goal based on grade level growth (more than 100% of expected).

105 When to make a change in instruction and intervention?
Enough data points (6 to 10)? Less than 100% of expected growth. Not on track to make benchmark (needed growth). Not on track to reach individual goal.

106 How deficient is the student’s ROI? The 2.0X calculation
Divide the norm group mean ROI by the student's ROI. The result is expressed as a ratio of deficiency. Example: 1.0 wpm/wk / 0.5 wpm/wk = 2.0X

107 2.0X calculation: Examples
Divide the norm group mean ROI by the student's ROI; the result is expressed as a ratio of deficiency (example: 1.0 wpm/wk / 0.5 wpm/wk = 2.0X).
Joe: 0.9 wpm/wk / 2.1 wpm/wk = 0.44X
Elliot: 0.9 wpm/wk / 0.3 wpm/wk = 3.0X
Elliot's deficiency in ROI exceeds 2.0X.

108 Example of Report Language:
Documentation of Deficiency in Rate of Improvement
Throughout the current intervention period, Elliot has displayed little progress. At the beginning of the intervention, Elliot scored 56 wpm on oral reading fluency probes. His last score at the end of the intervention was 59 wpm. Elliot's calculated rate of improvement during this period was 0.3 wpm/week. Compared to the typical rate of improvement for students in Elliot's grade (0.9 wpm/week), Elliot's rate of improvement is 3.0X deficient.

109 How low is low? How slow is slow?
How deficient does the student need to be to qualify? There is not a research consensus on this issue at this time. Note that there never was a research consensus on the extent of the ability-achievement discrepancy. However, there is a good deal of research underway addressing this question (e.g., Christ, Ardoin, et al.).

110 In the meantime… The decision on how deficient a student needs to be to qualify rests with the MDE. A rough guide: A student with a learning disability should be severely deficient in level and display a poor response to research-based interventions (slope) such that he or she is not likely to meet benchmarks in a reasonable amount of time without intensive specially designed instruction.

111 Criterion #3: Rule out other factors or conditions
The group may determine the child has an SLD if: 3. The group determines the results are not primarily the result of: (i) a visual, hearing, or motor disability; (ii) mental retardation; (iii) emotional disturbance; (iv) cultural factors; (v) environmental or economic disadvantage; (vi) limited English proficiency. Exclusionary Criteria § (a)

112 Specific Learning Disability
(Overview diagram repeated; see slide 7.)

113 Rule Out: Vision
Screening procedure: check vision records (school nurse).
If positive, assess: optometric or ophthalmological exam.
Possible extraneous factor or condition that could account for the learning problem: visual impairment.
Adapted from Reschly (2005)

114 Rule Out: Hearing
Screening procedure: check hearing records (school nurse).
If positive, assess: audiological exam.
Possible extraneous factor or condition that could account for the learning problem: hearing impairment.
Adapted from Reschly (2005)

115 Rule Out: Motor
Screening procedure: check school health records (school nurse); observations of motoric problems.
If positive, assess: physical or occupational therapy exam; medical examination.
Possible extraneous factor or condition that could account for the learning problem: physical disability or health impairment.
Adapted from Reschly (2005)

116 Example of Report Language:
Documentation of Rule-out of Other Disabilities and Conditions Sensory Impairments: John's vision has been screened on an annual basis by the school. No visual problems have been detected. Vision problems are ruled out as a possible reason for John's academic difficulties.

117 Rule Out: Mental Retardation
Screening procedure: review of school records indicating typical functioning in other academic and adaptive behavior.
If positive, assess: intelligence test; test of adaptive behavior.
Possible extraneous factor or condition that could account for the learning problem: mental retardation.
Adapted from Reschly (2005)

118 Example of Report Language
Documentation of Rule-out of Other Disabilities and Conditions
Mental Retardation: John displays many indications of typical intellectual ability. He has scored in the proficient range on tests of arithmetic skills since kindergarten, including state tests and universal screenings. His developmental milestones were age-appropriate, and he displays adaptive skills that are appropriate for his age and grade level according to both his parents' and his teacher's reports on the Behavior Assessment System for Children, Second Edition (BASC-2). Based on this information, mental retardation can be ruled out as a possible reason for John's academic difficulties.

119 Rule Out: Emotional Disturbance
Screening procedure: behavioral checklists.
If positive, assess: behavior rating scales and other assessments of behavior and affect.
Possible extraneous factor or condition that could account for the learning problem: emotional disturbance.
Adapted from Reschly (2005)

120 Example of Report Language:
Documentation of Rule-out of Other Disabilities and Conditions
Emotional Disturbance: John displays appropriate behavior in the classroom. He is attentive and tries hard. He gets along well with his peers and teachers. According to the results of the Behavior Assessment System for Children, Second Edition (BASC-2), his parents and teacher report typical behavior on both externalizing and internalizing subscales. John is often frustrated by his difficulties in learning to read, but these emotions appear to be secondary to his reading disability. Based on these data, emotional disturbance can be ruled out as a possible reason for John's academic difficulties.

121 Rule Out: Cultural Factors
Screening procedure: assess cultural status (e.g., Acculturation Quick Scale).
If positive, assess: interview with the family.
Possible extraneous factor or condition that could account for the learning problem: level of acculturation; cultural differences.
Adapted from Reschly (2005)

122 Rule Out: Environmental or Economic Disadvantage
Screening procedure: school records.
If positive, assess: “social work” interview with the family.
Possible extraneous factors or conditions that could account for the learning problem: child abuse, lack of sleep, poor nutrition, etc.
Adapted from Reschly (2005)

123 Rule Out: Limited English Proficiency
Screening procedure: home language screening (required by law).
If positive, assess: primary language assessment.
Possible extraneous factor or condition that could account for the learning problem: the student may not have the BICS (basic interpersonal communication skills) or CALP (cognitive academic language proficiency) necessary for learning academic content.
Adapted from Reschly (2005)

124 Example of Report Language:
Documentation of Rule-out of Other Disabilities and Conditions
Culture and Language: John is an African-American student whose primary home language is English. Although he participates in the free and reduced lunch program, it is not believed that acculturation, language, or environmental circumstances are the primary cause of John's academic difficulties.

125 Criterion #4: RULE OUT LACK OF INSTRUCTION
A child must not be determined to be a child with a disability under this part— (1) If the determinant factor for that determination is— (i) Lack of appropriate instruction in reading, including the essential components of reading instruction (as defined in section 1208(3) of the ESEA); (ii) Lack of appropriate instruction in math, or (iii) Limited English proficiency; (§ [b])

126 Exclusionary Criteria
To ensure that underachievement is not due to lack of appropriate instruction in reading or math the group must consider: Data that demonstrate that prior to, or as a part of, the referral process, the child was provided appropriate instruction in regular education settings delivered by qualified personnel Data-based documentation of repeated assessments of achievement at reasonable intervals, reflecting formal assessment of student progress during instruction, which was provided to the child’s parents Exclusionary Criteria § (b)

127 Specific Learning Disability
(Overview diagram repeated; see slide 7.)

128 NCLB §1208(3) (3) ESSENTIAL COMPONENTS OF READING INSTRUCTION.—
The term ‘essential components of reading instruction’ means explicit and systematic instruction in— (A) phonemic awareness; (B) phonics; (C) vocabulary development; (D) reading fluency, including oral reading skills; and (E) reading comprehension strategies.

129 IDEA Language § (b): To ensure that underachievement in a child suspected of having a specific learning disability is not due to lack of appropriate instruction in reading or math, the group must consider, as part of the evaluation described in §§ through —  (1) Data that demonstrate that prior to, or as a part of, the referral process, the child was provided appropriate instruction in regular education settings, delivered by qualified personnel; and  (2) Data-based documentation of repeated assessments of achievement at reasonable intervals, reflecting formal assessment of student progress during instruction, which was provided to the child’s parents. 

130 Key Questions to Address
Is a Standards-Based Curriculum in Place (Tier 1)? Is it based on scientific research? If a scientifically validated curriculum is in place, is there evidence that it is being delivered at a sufficient level of fidelity?

131 Was the student effectively taught?
Key Questions to Address Has the student been provided with individualized supports in the general education classroom (Tier 1)? Has the student been provided with a sufficiently intense individualized intervention using research-based instructional procedures (Tier 2)?

132 Core Reading Program General Principles
Serves as the base of reading instruction Provides complete instruction in the key components of reading Designed for all settings and all students Is preventive and proactive Incorporates a high probability of student proficiency (80%)

133 Core Reading Program Program Design
Aligned student materials and assessments Small and large group instructional activities Scaffolding to support initial learning and transference of skills Cumulative review

134 Q. What do we do in those situations in which core programs are recommended, but the review of the literature does not identify a solid research base?
A. Supplemental reading programs provide additional instruction in one or more areas of reading to support the core. One size does not fit all; a school may need to supplement or modify (Oregon Reading First, 2004). Options include: core; core plus supplemental; core plus intervention; intervention; intervention plus supplemental.

135 Effective Instructional Design
Allocation of time; connection to supplemental materials; grouping strategies (implemented, flexible); active student engagement; effective classroom management; high levels of academic learning time.

136 If a scientifically validated curriculum is in place, is there evidence that it is being delivered at a sufficient level of fidelity?

137 Tier 1 Fidelity Check: Process
How long has the curriculum been in place? Were teachers adequately trained? Are teachers using the prescribed materials? Is the curriculum being delivered for a sufficient amount of time? How long has the student been taught in this curriculum? Is the curriculum being delivered according to prescribed directions?

138 Considerations to assess the provision of appropriate instruction
Principal's observation of teacher performance through classroom visits and observations conducted on a regular basis during the instructional period for the targeted content/subject area.
Checklists of integrity of instruction completed by teachers as self-check measures.
Checklists of integrity of instruction completed among teachers as peer-check measures.
Completion of checklists by content specialists or curriculum supervisors working with teachers.

139 Fidelity Check Options
Use of a prepared checklist of critical features of the instructional program: Teacher self-monitoring Peer coaching Lesson plan review by principal Observation by principal Many programs leave permanent products that reflect fidelity.

140 Tier 1 Fidelity Check: Outcomes
Has the general education curriculum succeeded in bringing a high percentage of students to proficiency? The sufficiency of the general education curriculum should be judged by its outcomes in terms of overall student performance.

141 Keshawn (green) performs well below expectations. However, so do all of his classmates. (Graph of words per minute versus expected performance; adapted from Witt, 2006.)

142 Is the supplemental program based on research?
Next Question: Has the student been provided with individualized supports in the general education classroom? Has a plan been developed that targets the student’s deficiency through supplemental intervention in the general education classroom (differentiated instruction)? Is the supplemental program based on research?

143 Has the student been provided with a sufficiently intense individualized intervention using research-based instructional procedures (Tier 2)? Has a plan been developed that targets the student’s deficiency through supplemental intervention in the general education classroom (differentiated instruction)? Is the supplemental program based on research? Have the interventions used featured a research- based “standard protocol”?

144 A Standard Protocol Intervention …
is scientifically based. has a high probability of producing change for large numbers of students. is usually delivered in small groups. is designed to be used in a standard manner across students. is often scripted or very structured.

145 Tier 2 Process Analysis (cont.)
Has the intervention been implemented with a high degree of fidelity? Has progress monitoring occurred at least weekly during the course of the intervention? Has a building-level team (e.g., IST) helped to design and guide the implementation of the intervention?

146 Tier 2 Analysis: Outcomes
Is there evidence that the individualized intervention provided to the student has facilitated meaningful progress for other students receiving the same supports?

147 Adapted from Witt (2006)

148 Examples of Report Language:
Documentation of Effective Instruction and Intervention
John has received appropriate instruction in reading throughout his four years at Lincoln Elementary School (K-3). Since kindergarten, John's teachers have used the SRA Reading Mastery reading series, which uses explicit instructional procedures to teach the “big ideas” in reading. This research-based program has been successful in bringing 80% of the current third graders to proficiency. All of John's teachers have had extensive training with SRA. Fidelity checks conducted by reading coaches and the school principal indicate that the SRA program has been used with a high degree of fidelity. (Documentation of the fidelity checks is on file in the principal's office.)

149 (cont.) John has been provided with intensive reading interventions at tier 2 of Lincoln's three-tier model since September. He has been provided with small-group interventions to address his difficulties in phonemic awareness and decoding skills, using the Early Reading Intervention (ERI) program (Scott Foresman). ERI has been identified by the Florida Center for Reading Research as a research-based practice, and has been shown to significantly increase the proficiency of students at tiers 2 and 3 in Lincoln School. Fidelity checks conducted by the district's reading coordinator indicate that the reading teachers who implemented the ERI program have done so with a high degree of fidelity. (Documentation of the fidelity checks is on file in the principal's office.)

150 Repeated Assessments Repeated assessments of achievement or behavior, or both, conducted at reasonable intervals, reflecting formal monitoring of student progress during the interventions. Information regarding the student’s progress should be periodically provided to the student’s parents.

151 Frequency of Repeated Assessments
Repeated assessment information may come from:
Universal screening: typically conducted 3 times a year
Strategic intervention: typically progress monitored once a month
Intense intervention (tier 2): typically progress monitored once a week

152 Examples of Report Language:
Documentation of Repeated Measures of Assessment Since kindergarten, John has been assessed during the universal screening in reading three times per year (fall, winter, spring). Since his involvement with tier two interventions this year, John's progress has been monitored using curriculum-based measurement (CBM) on a weekly basis. Results of both universal screening and progress monitoring have been provided to his parents through written reports and periodic parent conferences.

153 May other instruments be administered?
Yes: tests of cognitive processing, tests of visual-motor integration, tests of auditory processing, tests of receptive and expressive language, etc. When conducting a comprehensive evaluation, the MDT determines what is needed.

154 Should other instruments be administered? Consider treatment validity.
The selection of any assessment instrument or procedure is solely dependent on its ability to provide specific information about scientifically validated instructional strategies that have a high probability of producing meaningful change in the student's academic or social-emotional skills.

155 Can you use both models? According to an OSEP letter to the field, a district may use both the RTI model and the discrepancy model in particular situations. A district with a plan to phase in RTI over a three-to-five-year period may use RTI in one building and the discrepancy model in another. Districts may also choose to use RTI for SLD determination at the elementary level and the discrepancy model at the secondary level. These and other exceptions must be documented and approved through the special education plan approval process.

156 However… If a district chooses RTI as its procedure for a particular school, all students identified with SLD in that school must meet the RTI eligibility criteria, in addition to what may be indicated on other assessments. Conversely, if a district chooses the ability-achievement (A-A) discrepancy as its procedure for a particular school, all students identified with SLD in that school must meet the A-A eligibility criteria, in addition to what other assessments or the student’s RTI indicate.

157 Protecting Parents’ Rights
The public agency must promptly request parental consent to evaluate:
If, prior to referral, a child has not made adequate progress after an appropriate period of time when provided instruction, and
Whenever a child is referred for an evaluation
§ (c)

158 Contact Information: Joseph F. Kovaleski, D.Ed., NCSP Indiana University of PA Indiana, PA / Caitlin S. Flinn, MEd, NCSP Exeter Township School District Reading, PA

