Responsiveness to Instruction (RtI)



Presentation on theme: "Responsiveness to Instruction (RtI)"— Presentation transcript:

1 Responsiveness to Instruction (RtI)
Problem-Solving Model Tier III North Carolina Department of Public Instruction 2011 1

2 Four Tiers of Support Continue Tier I and Tier II Support 2

3 The NC Problem-Solving Model
Tier IV: Consideration for EC referral. Tier III: Consultation with the problem-solving team. Tier II: Consultation with other resources. Tier I: Consultation between teachers and parents. Problem-solving cycle at every tier: Identify area(s) of need, Develop a plan, Implement plan, Evaluate. (Vertical axis: Student Needs.) 3

4 Diagnostic Assessment and Universal Screening Across the Tiers
Universal screening for ALL students 3x per year and diagnostic assessment (Tier I); progress monitoring 1-2x per month (Tier II); progress monitoring 1-2x per week (Tiers III and IV). (Vertical axis: Student Needs.)

5 Tier III - PSM: Repeat steps of the cyclical problem-solving model
Student need drives who serves on the problem-solving team: parent, teacher, teaching peer, counselor, school psychologist, curriculum specialist, data/assessment specialist, administrator, social worker, nurse, etc. Problem-solving model forms are completed to document the process. 5

6 Tier III - PSM: Small percentage of students
Formalized, systematic process. Intervention and assessment increase in intensity/frequency. Individual goals - short and long term. 6

7 Layered Support 7

8 Step 1: Define the Problem
Develop a behavioral (observable) definition of the problem. TEAM FIRST MEETS TO DEFINE PROBLEM. 8

9 Step 1: Define the Problem
Essential step. Develop a behavioral/academic definition: concrete, observable and measurable. Stranger test? Most difficult step! Defining the problem is an essential first step in a systematic approach to problem-solving. Research by Bergan and Tombari (1976) shows that when a problem is correctly identified and agreed upon by staff, a solution to the problem almost invariably results. Without a clear and operationalized definition, effective problem-solving is not likely to occur. The definition must be stated in concrete, measurable and observable terms. Can the identified problem pass the stranger test? Can someone else observe the student and identify the defined problem? This is frequently the most difficult step in the process. 9

10 Step 2: Develop an Assessment Plan - HOW DO WE ANSWER THESE QUESTIONS?
Step 1: Define the Problem (develop a behavioral, observable definition of the problem). Step 2: Develop an Assessment Plan (generate a hypothesis and assessment questions related to the problem). The second step in the PS process is to develop an assessment plan. This step helps to ensure your problem has been correctly defined. 10

11 Step 2: Develop an Assessment Plan
We must ask questions to form a hypothesis regarding “What is the problem? Why is it occurring?” We ask questions across four domains: Environment, Instruction, Learner, Curriculum. 11

12 Instruction Possible Questions Has the instruction been consistent?
Has the student received instruction in constituent skill areas? Does the student respond more effectively to a different pace? Has the student received descriptive feedback? 12

13 Curriculum Possible Questions Are the deficits in the core?
Does the curriculum include the needed skills? Has the student had enough time in the curriculum skill areas? 13

14 Environment Possible Questions
Is the student “on-task” during instruction? How is his/her behavior in class and out of class? Home and school environment? (past and present) 14

15 Learner Possible Questions
Any medical issues? Background information in the cumulative record? Language issues? 15

16 Step 2: Develop an Assessment Plan
Domains: Environment, Instruction, Learner, Curriculum. Procedures: Review, Interview, Observe, Test. As you look at those four domains (E, C, I, L), you will decide what type of procedure to use: review, interview, observe, test. These four procedures can be utilized in each domain. Here are some examples: Review: textbooks (Curriculum); products such as work samples (Instruction); pacing guides or school rules (Environment); cumulative and/or health records, work samples (Learner). Interview: teachers, curriculum directors (Curriculum); teachers (Instruction); parents (Environment); learner (student, checklists or rating scales). Observe: lesson plans (Curriculum); anecdotal recording or checklist (Instruction); systematic observation (Environment); checklists or anecdotal notes (Learner). Test: reliability of texts or core assessment (Curriculum); informal assessments (Instruction); review of assessment/behavior during assessment (Environment); curriculum-based assessments, formative assessments (Learner).

17 Review Examples Review records Review grades
Review teachers’ anecdotal records/instructional artifacts/work samples 17

18 Interview Examples Teacher interview Parent interview
Interview past teachers/previous school Student interview (older grades) 18

19 Observe Examples Student observation
Student/teacher interaction observation Instructional observation Core Intervention 19

20 Test Examples
CBM in area of concern (survey level, grade level); CBM in other areas; common assessments; diagnostic - informal or formal. 20

21 Diagnostic Assessment
Further investigation to help determine which intervention is most appropriate Can be CBM or other common assessments Examples Running Record, CBM, Informal reading assessments, Single skill math computation tests, Writing samples, Student interviews Diagnostic - further investigate academic difficulties for individual students - helps to identify areas for further teaching or reteaching 21

22 Diagnostic Assessment: Myths
Comes in a box. Used to identify the presence of a reading disorder. Must be administered by specialists. Are formal. Time consuming and impractical. Many RtI schools underuse diagnostic assessments, and it hinders their problem solving. Diagnostic assessments help us to define the problem correctly. 22

23 Resources for Free Diagnostic Assessments
Diagnostic reading assessments for pre-K through high school. g/readinggradeleve.html - diagnostic phonics assessments for all grade levels. Math: Intervention Central can generate single-skill math computation probes. 23

24 Case Study – Tier III 24


26 Case Study - Page 1 Student Name: Chris Date: 9/6/XX
Areas targeted for instruction/intervention: Reading Specific Problem: Reading Fluency

27 Case Study: Tier III Background Information
Record Review: Vision/Hearing - pass; Retentions - none; Absences/tardies - no concerns; Transferred schools - midyear first grade (VA); Prior interventions - Tier I and II in second grade (see data). In a standard protocol model, a standard set of empirically supported instructional approaches are implemented to prevent and remediate academic problems. Such approaches might include partnered reading activities, direct instruction of phonological or phonics skills, or reinforcement of skills through computer programs (Case, Speece, & Molloy, 2003). A key feature is that standard instruction/intervention protocols are used with minimal analysis of the deficit skill (e.g., Peer-Assisted Learning Strategies; Fuchs, Fuchs, Mathes, & Simmons, 1997; McMaster, Fuchs, Fuchs, & Compton, 2005). 27

28 Case Study: Tier III Background Information
Record Review (cont.): Universal screening data - R-CBM 34, MAZE (Fall, current school year); math - on grade level; DRA2 = 20. 28

29 Remember Chloe. She is significantly below her peers in the fall.
We put more intense intervention in place and then also need more frequent assessment to see if progress is being made. 29

30 Case Study: Tier III Background Information
Tier I (during second grade): Define the problem - “Chris’ R-CBM score of 29 (Winter universal screening) is below the 10th percentile when compared to national norms.” Error analysis (teacher) indicates several decoding weaknesses, including multisyllabic words, digraphs, blends. Hypothesis: Chris’ low score on R-CBM is because of his weakness in decoding, specifically multisyllabic words. 30

31 Tier III - Case Study: Background Information
Tier I (cont.): 8 weeks intervention (2/01/xx - 3/28/xx). Syllable pattern activities from FCRR - 2x/week, 15 min./session. Progress monitoring data: R-CBM 28, 30, 36, 30. 31

32 Tier I Interventions
Graph of weekly R-CBM progress-monitoring scores (baseline 29; scores 28, 30, 36, 30) across the 8-week Tier I intervention. 32

33 Case Study: Tier III Background Information
Tier II: Revised hypothesis - Chris’ low score on R-CBM is because of his weaknesses in decoding, including multisyllabic words and digraphs. 8 weeks intervention (3/28/xx - 5/30/xx): Syllable pattern activities from FCRR - 2x/week, 15 min./session; letter-sound correspondence activities from FCRR - 2x/week, 10 min./session. 33

34 Case Study: Tier III Background Information
Tier II (cont.): Progress monitoring data - R-CBM 35, 40, 42, 43. 34

35 Tier II Interventions
Graph of weekly R-CBM progress-monitoring scores (35, 40, 42, 43) across the Tier II intervention, continuing from the Tier I data. 35

36 Case Study - Page 1 Create a hypothesis statement for each domain
Why do you think the problem is occurring? Instruction Curriculum Environment Learner What information do you need? Review Interview Observe Test 36

37 RED LINES! Highlight person responsible
37


39 Step 3: Analysis of the Assessment Plan - WHAT DID WE FIND?
Step 1: Define the Problem (develop a behavioral, observable definition of the problem). Step 2: Develop an Assessment Plan (generate a hypothesis and assessment questions related to the problem). Step 3: Analysis of the Assessment Plan (determine if the problem is correctly defined). Once the team has gathered more data and reconvened, the data are analyzed to ensure the problem has been correctly defined. Thus, an appropriate intervention plan will be developed. 39

40 Tier III - Analyze the Assessment Plan
Team reconvenes (within 2 weeks) to discuss the assessment results. Is our problem correctly defined? What is our hypothesis (based on the data we gathered) and how will we “test” it? 40

41 Tier III - Analyze the Assessment Plan
Environment Structured observation results Attention to task was age appropriate compared to peers in classroom Hypothesis rejected 41

42 Tier III - Analyze the Assessment Plan
Curriculum: CBM error analysis confirms multisyllabic words, digraphs, blends. Review: DRA2 - score of 18. Interview: instructional specialist from previous school; previous school utilized different materials and instructional methods. Hypothesis accepted. 42

43 Tier III – Analyze the Assessment Plan
Instruction Teacher Interview Determined review of early literacy skills is not an intentional component of reading instruction Instructional observation Review of early literacy skills was not observed Review instructional materials 3rd grade materials do not include a review of early literacy skills Hypothesis accepted 43

44 Tier III - Analyze the Assessment Plan
Learner: Speech-Language Screening - passed. Social/Developmental History - uneventful social/medical development. Hypothesis rejected. 44


46 Step 4: Generate a Goal Statement - WHERE DO WE WANT THEM TO BE?
Step 1: Define the Problem (develop a behavioral, observable definition of the problem). Step 2: Develop an Assessment Plan (generate a hypothesis and assessment questions related to the problem). Step 3: Analysis of the Assessment Plan (determine if the problem is correctly defined). Step 4: Generate a Goal Statement (specific description of the changes expected in student behavior). The fourth step is to generate a goal statement. 46

47 Letter - Major Term [Minor Terms]
S - Specific [Significant, Stretching, Simple]. M - Measurable [Meaningful, Motivational, Manageable]. A - Attainable [Appropriate, Achievable, Agreed, Assignable, Actionable, Action-oriented, Ambitious, Aligned]. R - Relevant [Realistic, Results/Results-focused/Results-oriented, Resourced, Rewarding]. T - Time-bound [Time-oriented, Time-framed, Timed, Time-based, Timeboxed, Timely, Time-specific, Timetabled, Time-limited, Trackable, Tangible]. How about SMARTER goals? E - Evaluate [Ethical, Excitable, Enjoyable, Engaging]. R - Reevaluate [Rewarded, Reassess, Revisit, Recorded, Rewarding, Reaching]. Doran, George T. "There's a S.M.A.R.T. way to write management's goals and objectives." Management Review, Nov 1981, Volume 70, Issue 11. 47

48 Tier III - Goal Setting: Short Term and Long Term Goals
Short Term: length determined by intervention period (example: 8-10 weeks from intervention start date); set goal based on baseline data. Long Term: length determined by grade-level expectations (example: end of school year). 48

49 Tier III - Goal Setting Norm Referenced Goal
Rate of Improvement (ROI)/Growth Rate Percentile Rank 49

50 Tier III – Goal Setting: Norm Referenced
A standard, model or pattern regarded as typical Typical performance of peers? Classroom Grade District State Nation 50

51 Tier III – Goal Setting: Norm Tables
Percentile ranks Average score (range 25th-75th percentile) Let’s practice reading a norm table (Example is for practice only) 51

52 Average Range Fall 52

53 Average Range Winter 53

54 Average Range Spring 54

55 Case Study – Goal Setting
Baseline Data Reading CBM 33, 36, 34 DRA2 level 18 Goal setting for Chris 55

56 Chris’ score in the fall was 34 wcpm - What percentile is this?
The 10th percentile 56

57 What is the goal for Chris to be in the average range for reading fluency in the winter?
79 wcpm 57

58 Norm-Referenced Goal
“Chris will read 79 words correctly per minute, from a third-grade reading passage, by January.” (Specific, Measurable, Attainable, Relevant, Time-Bound) 58

59 Rate of Improvement/Growth Rates
Use for shorter time periods. Double the growth rate when a student is significantly below peers and/or targets: students who are below need to grow even faster than a typical student in order to close the gap. 59

60 Average Rate of Improvement/
Growth Rate Double the average ROI- 1.1 x 2 = 2.2 60

61 Tier III - Goal Setting: Double growth rate for Chris
Intervention phase (end of short-term goal): 8 weeks. 2.2 x 8 = 17.6 (gain score). Gain score + baseline = short-term goal: 17.6 + 34 = 51.6 wcpm. 61
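
A minimal Python sketch of the goal arithmetic above (the function name short_term_goal is illustrative; the 1.1 words-per-week figure, the doubling, and the 8-week period come from the slides):

    def short_term_goal(baseline, average_roi, weeks, multiplier=2.0):
        """Short-term goal = baseline + (average ROI x multiplier x weeks)."""
        gain = average_roi * multiplier * weeks
        return baseline + gain

    # Chris' case: median baseline of 34 wcpm, typical growth of 1.1 wcpm/week,
    # doubled over an 8-week intervention: 34 + (1.1 * 2 * 8) = 51.6 wcpm
    print(round(short_term_goal(baseline=34, average_roi=1.1, weeks=8), 1))  # 51.6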

62 Tier III - Long Term Goals
Set 1-2 years beyond baseline Focuses on grade level functioning Can utilize different assessment tools for short and long term goal setting Example: Chris – DRA2 Chris will read Level 34 text independently by the end of third grade. 62



65 Tier III - Case Study: Set a goal for Chris
Use a norm reference or rate of improvement. Document on page 2: hypothesis statement, supports needed to close the gap in performance, goal statement. 65

66 Step 5: Develop an Intervention Plan - HOW DO WE INTERVENE?
Step 1: Define the Problem (develop a behavioral, observable definition of the problem). Step 2: Develop an Assessment Plan (generate a hypothesis and assessment questions related to the problem). Step 3: Analysis of the Assessment Plan (determine if the problem is correctly defined). Step 4: Generate a Goal Statement (specific description of the changes expected in student behavior). Step 5: Develop an Intervention Plan (base interventions on best practices and research-proven strategies). Intervention should be directly linked to the function of the problem. 66

67 Tier III – Intervention Plan
Continue Tier I and II Plan based on the gathered data Correct Problem Definition = Effective Interventions 67

68 Tier III - Intervention Plan
Detailed - should pass the stranger test (who, what, when, how long, where). Measurement strategy: how will you progress-monitor? (Match the intensity of the intervention.) Decision rules. 68

69 Diagnostic Assessment and Universal Screening Across the Tiers
Universal screening for ALL students 3x per year and diagnostic assessment (Tier I); progress monitoring 1-2x per month (Tier II); progress monitoring 1-2x per week (Tiers III and IV). (Vertical axis: Student Needs.)

70 Frequency of Assessment Directly Related to Student Achievement
Similar results found by Fuchs & Fuchs (1986). Assessment and instruction need to be balanced: too frequent assessment causes a loss of instructional time; too little assessment means students stay in instructional programs that are not sufficient. This is a summary of research related to frequency of assessment. More frequent assessment produces a direct gain in scores for students. At the high end, just two times a week with short measures provides the highest percentile gain and effect size regardless of instructional program. Notes about this research: this is a meta-analysis of many studies regarding frequent classroom assessments. It brought together many different formative assessments and many different summative assessments to show the percentile gain. 70

71 Progress Monitoring Necessities
Target skill/behavior = goal; baseline; intervention; graph; measurement strategy. Research recommends: 3-4 consecutive data points below the aimline, or 4-6 consecutive data points above the aimline - consider problem-solving; 6-10 consecutive data points showing a negative trend - consider problem-solving. These are the things you need before progress monitoring. The decision-making plan will make more sense once you practice progress monitoring. 71

72 Tier III - Case Study How will you intervene?
What resources do you have in your building to meet Chris’ needs? How will you monitor progress? Complete page 3 72



75 Step 6: Implement the Intervention Plan - IMPLEMENT THE PLAN!
Step 1: Define the Problem (develop a behavioral, observable definition of the problem). Step 2: Develop an Assessment Plan (generate a hypothesis and assessment questions related to the problem). Step 3: Analysis of the Assessment Plan (determine if the problem is correctly defined). Step 4: Generate a Goal Statement (specific description of the changes expected in student behavior). Step 5: Develop an Intervention Plan (base interventions on best practices and research-proven strategies). Step 6: Implement the Intervention Plan (provide strategies, materials, and resources; include progress monitoring). Finally, the plan is implemented. 75

76 Tier III - Implement the Plan
Monitor progress throughout intervention Graph progress against short term goal Check fidelity of intervention 76

77 Tier III - Case Study: Reading CBM baseline scores 33, 36, 34
Median score: 34. Graph Chris’ baseline scores. Short-term goal: 52 for 8 weeks of intervention. Draw the aimline. Taking the median is only for Oral Reading Fluency and only on the baseline (the first time you assess). 77
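
A minimal Python sketch of how the baseline median and aimline values could be computed for the graph (the function name aimline is illustrative; the baseline scores and the goal of 52 come from the slide):

    import statistics

    def aimline(baseline_scores, goal, weeks):
        """Connect the median baseline score to the goal in equal weekly steps."""
        start = statistics.median(baseline_scores)   # median(33, 36, 34) = 34
        step = (goal - start) / weeks
        return [round(start + step * week, 1) for week in range(weeks + 1)]

    # Chris: baseline median 34 wcpm, short-term goal 52 wcpm over 8 weeks
    print(aimline([33, 36, 34], goal=52, weeks=8))
    # nine weekly points rising from 34.0 to 52.0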


79 Semi-Log Chart
Designed for skills where equal-interval monitoring is not sufficient to convey growth. Example: for young children learning basic skills, the growth between 1 and 5 letter sounds is of greater significance than the growth between 20 and 50. 79
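
A small matplotlib sketch contrasting an equal-interval chart with a semi-log chart (the weekly letter-sound counts are invented purely for illustration):

    import matplotlib.pyplot as plt

    weeks = list(range(1, 9))
    letter_sounds = [1, 2, 3, 5, 8, 13, 20, 32]   # hypothetical weekly counts

    fig, (linear_ax, semilog_ax) = plt.subplots(1, 2, figsize=(8, 3))
    linear_ax.plot(weeks, letter_sounds, marker="o")
    linear_ax.set_title("Equal-interval chart")

    semilog_ax.semilogy(weeks, letter_sounds, marker="o")   # log-scaled y-axis
    semilog_ax.set_title("Semi-log chart")

    for ax in (linear_ax, semilog_ax):
        ax.set_xlabel("Week")
        ax.set_ylabel("Letter sounds correct")
    plt.tight_layout()
    plt.show()

On the semi-log chart the early, small gains occupy as much visual space as the later, larger ones, which is the point the slide makes.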


81 Tier III - Initial Graph
Baseline is the first time you assess (pre-test). The goal is set using a guide such as a national norm or a local norm. The goal line or aimline is the connection of the two: this is the road you want the student to travel to meet their goal. (Graph labels: Baseline, Goal, Aimline.) 81

82 Plot Week One: Tuesday- 36; Thursday- 34
Plot Week two: Monday- 28; Wed- 40; Fri- 38 82

83 Tier III Case Study - Progress Monitoring
Week One: Tuesday 36; Thursday 34. Week Two: Monday 28; Wednesday 40; Friday 38. The red dots along the way are Chris’ scores each time on the oral reading fluency assessment. The dashed line is the trend line: a prediction of where Chris will end up at the end of the intervention if the data continue on this course. These are five data points, and the trend line shows he won’t meet his goal. Next slide: talk about that decision-making rule. 83
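
A minimal sketch of how the trend line's slope could be computed from the plotted scores (the helper name trend_slope is illustrative; a plain ordinary-least-squares slope is assumed, which may differ from the split-middle or Tukey methods some CBM tools use):

    def trend_slope(scores):
        """Ordinary least-squares slope of scores plotted against sessions 1..n."""
        n = len(scores)
        xs = range(1, n + 1)
        mean_x = sum(xs) / n
        mean_y = sum(scores) / n
        numerator = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, scores))
        denominator = sum((x - mean_x) ** 2 for x in xs)
        return numerator / denominator

    # Chris' first five progress-monitoring scores
    print(trend_slope([36, 34, 28, 40, 38]))  # 1.0 word gained per session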

84 Decision Making
Research recommends: 3-4 consecutive data points below the aimline, or 4-6 consecutive data points above the aimline - consider problem-solving; 6-10 consecutive data points showing a negative trend - consider problem-solving. So with those five data points, although his trend shows he is not going to meet his goal, research says this is too few data points to make a decision; 6-10 data points is the safest if you are using a trend line. The exception to this 6-10 rule is if there are 3-4 consecutive points below the aimline: then you may wish to consider problem solving. Problem solving may mean assessing whether you defined the problem correctly, whether you set a reasonable goal, or whether you need a more intense intervention (or a whole new intervention). 84
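
A minimal sketch of one way the "3-4 consecutive points below the aimline" rule could be checked automatically (the function name and the sample aimline values are illustrative assumptions, not part of the NC model):

    def consecutive_below_aimline(scores, aimline, threshold=4):
        """True if `threshold` consecutive scores fall below the matching aimline values."""
        run = 0
        for score, expected in zip(scores, aimline):
            run = run + 1 if score < expected else 0
            if run >= threshold:
                return True
        return False

    # Hypothetical example: weekly scores vs. the aimline values for the same sessions
    scores = [36, 34, 28, 40, 38, 34, 26, 36]
    aim = [36, 38, 40, 42, 45, 47, 49, 51]
    print(consecutive_below_aimline(scores, aim))  # True -> consider problem-solving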

85 Week Three: Tuesday- 34; Thursday- 26
Week Four: Mon- 36; Wed- 38; Fri- 38 85

86 Tier III Case Study – Progress Monitoring
Week Three: Tuesday- 34; Thursday- 26 Week Four: Mon- 36; Wed- 38; Fri- 38 Next two weeks of data. Now we have ten data points plus the baseline. What decision should you make? 86

87 Tier III Case Study – Intervention Change
Yes, we are going to problem solve and this time, we decided to change the intervention. We draw a vertical line to signify this change. Then we monitor some more. Intervention Change 87

88 Week Five: Tuesday- 42; Thursday- 40
Week Six: Mon- 45; Wed- 47; Fri- 49 88

89 Tier III Case Study - Progress Monitoring
Week Five: Tues 42; Thurs 40. Week Six: Mon 45; Wed 47; Fri 49. Two more weeks of data for Chris. How does the data look now? Look at the trend line and the data points. We do have some that are hovering around the aimline, but the trend line is positive, so let’s keep on course. 89

90 Week Seven: Tuesday- 50; Thursday- 55
Week Eight: Mon- 56; Wed- 54; Fri- 57 90

91 Tier III Case Study - Progress Monitoring
Week Seven: Tues 50; Thurs 55. Week Eight: Mon 56; Wed 54; Fri 57. Two more weeks. He has met and exceeded his goal. 91

92 Step 7: Analysis of the Intervention Plan - ANALYZE THE PLAN
Step 1: Define the Problem (develop a behavioral, observable definition of the problem). Step 2: Develop an Assessment Plan (generate a hypothesis and assessment questions related to the problem). Step 3: Analysis of the Assessment Plan (create a functional and multidimensional assessment to test the hypothesis). Step 4: Generate a Goal Statement (specific description of the changes expected in student behavior). Step 5: Develop an Intervention Plan (base interventions on best practices and research-proven strategies). Step 6: Implement the Intervention Plan (provide strategies, materials, and resources; include progress monitoring). Step 7: Analysis of the Intervention Plan (make a team decision on the effectiveness of the intervention). 92

93 Tier III - Analysis of the Intervention Plan
Was the goal met? How does the student compare to the norm: School Subgroup District Nation What is the student’s growth rate? 93

94 Tier III - Analysis of the Intervention Plan
EVALUATE the DATA Progress monitoring is essential Examine student performance Evaluate the effectiveness of instruction 94

95 Final decision? 8 weeks was the end of this intervention but it may be 10 or 20. There is no set time frame at each tier but the data should always guide your decision making. We always want to compare Chris’ growth rates with the rest of his norm group. Yes, he is still below but he is growing at a much faster rate than his same age peers. He is likely not a student that has a disability but instead an “instructional casualty” 95

96 Tier III Case Study - Decision Making
Goal: 51. Chris: 56. National 25th percentile for winter is 79. Average growth rate is 1.1 per week; Chris’ growth rate is about 2.75 per week ((56 - 34) / 8 weeks). Make a decision. 96
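
A minimal sketch of the growth-rate comparison behind this decision (the function name is illustrative; the 1.1 figure and Chris' scores come from the slides):

    def weekly_growth_rate(start_score, end_score, weeks):
        """Observed rate of improvement, in words correct per minute per week."""
        return (end_score - start_score) / weeks

    chris_roi = weekly_growth_rate(start_score=34, end_score=56, weeks=8)
    typical_roi = 1.1  # average growth rate cited earlier in the deck

    print(f"Chris: {chris_roi:.2f} wcpm/week vs. typical {typical_roi} wcpm/week")
    # Chris: 2.75 wcpm/week vs. typical 1.1 wcpm/week -> the gap is closing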

97 Tier III Case Study – Decision Making
What about his subgroup? Whole school population? Consider these scenarios. Sometimes, if the entire peer group is underperforming, you will want to look at intervening with that group rather than on an individual child basis. What if the 25th percentile for that subgroup in that school was 55? Would continuing intervention on an individual child basis be the smartest choice? 97



100 ‘Dual-Discrepancy’: RTI Model of Learning Disability (Fuchs 2003)
Discrepancy 1: skill gap between the target student’s current performance level and the average classroom academic performance level. Discrepancy 2: gap in rate of learning (‘slope of improvement’). 100
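
A minimal sketch of a dual-discrepancy check under assumed cut-points (the one-standard-deviation criterion and the classroom values below are illustrative assumptions, not Fuchs' published thresholds):

    def dual_discrepancy(student_level, peer_level, student_slope, peer_slope,
                         level_sd, slope_sd, cut=1.0):
        """Flag a student discrepant in both performance level and rate of learning."""
        level_gap = (peer_level - student_level) / level_sd
        slope_gap = (peer_slope - student_slope) / slope_sd
        return level_gap >= cut and slope_gap >= cut

    # Hypothetical values: Chris is still well below peers in level, but his
    # post-intervention slope (2.75) exceeds the typical slope (1.1), so only
    # one discrepancy remains and he would not be flagged.
    print(dual_discrepancy(student_level=56, peer_level=79,
                           student_slope=2.75, peer_slope=1.1,
                           level_sd=15, slope_sd=0.5))  # False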

101 Tier IV - PSM: Continue problem solving
Review all available data and either continue interventions at Tier III OR refer for consideration of special education. If a referral is made: define the problem; use progress monitoring data as the baseline on the IEP; the IEP (intervention) is developed based on data. Continue problem solving. The team identifies areas to be addressed as concerns and determines that the intensity of interventions requires more than can be addressed in the regular classroom. Problem-solving model forms are completed to document the process. 101

