2 Beyond Tier II: Assisting Students Who Still Do Not Respond Gary L. Cates, Ph.D. Illinois School Psychology Association January, 2011

3 Housekeeping Timeline (8:30 AM - 5:00 PM) 10 Minute Breaks (About 10 & 3:30) Lunch (1 hour 12:30-1:30)

4 Acknowledgments Cates, Blum, & Swerdlik (2010). Authors of Effective RTI Training and Practices: Helping School and District Teams Improve Academic Performance and Social Behavior and this PowerPoint presentation

5 Overview of Session Who am I? Quick Review of Three-Tiered System of Support – Emphasis on Tier III General framework for understanding behavior problems General framework for understanding learning problems Factors to consider when selecting, developing, implementing, and evaluating an intervention plan Progress Monitoring/Plan Evaluation Intervention Integrity

6 Tier III Individualized Intervention 5% Tier II Standard Protocol 15% Tier I Universal Instruction 80%

7 RTI Steps Step I: Solid Universal behavioral expectations for all students Step II: Reliable, Valid, and Brief School wide Screening of behavior 3 times per year. Step III: Data review by Problem Solving Team. Step IV: Targeted interventions and progress monitoring for low responsiveness Step V: Intense interventions and progress monitoring for low responsiveness Step VI: Entitlement to special education when student demonstrates little or no response to both targeted and intense interventions

8 Tier I Behavior Curriculum 3-5 School Wide Expectations

9 If we understand that behavioral skills are learned, it is necessary to teach expected behaviors as we would academic skills.

10 Example: Mark Twain Behavior Matrix (expectations by location)
RESPECT Self – Hallways: Walk at all times. Cafeteria: Eat your food only. Walk carefully to return trays. Stay in assigned area. Playground: Get help when it is needed. Bathroom: Quietly wait your turn. Keep to yourself.
RESPECT Others – Hallways: Voices off and arms folded. Single file lines. Jaguar waves only. Stay in order when in line. Cafeteria: Be polite and use good manners. Use kind words and quiet voices. Stay in order when in line. Playground: Play by the rules. Take turns and share equipment. Use polite language. Bathroom: Walk in and out quietly. Voices off. Open stall doors slowly.
RESPECT the Environment – Hallways: Eyes only on displays. Cafeteria: Be quiet after ten-minute warning. Clean up your own space. Line up when signal is given. Playground: Pick up litter that you see. Bathroom: Use toilets, sinks, and dryers correctly. Keep bathroom clean.

11 Average Referrals Per Day Per Month

12 ODR Data by Behavior

13 ODR Data by Location

14 ODR Data by Time of Day

15 ODR Data by Student

16 Tier II Behavior Check In-Check Out and similar standard protocols

17 Daily Report Card Date _________ Teacher _________ Student _________ Scoring: 0 = No, 1 = Yes. Expectations rated: Be Safe (keep hands, feet, and objects to self), Be Respectful (use kind words and actions), Be Ready (follow directions; have needed materials), plus Teacher Initials. Periods rated 0/1: Reading, Recess, Math, Lunch, Social Studies, Recess, Language Arts, Science. Total Points = ____ Today ____% Goal ____% Points Possible = 32 Comments: ____ Parent's signature ____

18

19 Tier III Individualized Assessment –Functional Behavior Assessment/Analysis –Determine behavioral function (not cause!) Individualized Intervention –Linked to “behavioral function” –Based on basic principles of behavior

20 What are Scientifically Based Interventions? Employs systematic, empirical methods Ensures that studies and methods are presented in sufficient detail and clarity Obtains acceptance by a peer-reviewed journal or approval by a panel of independent experts through scientific review Uses research designs and methods appropriate to the research question

21 Evidence-Based Interventions School-based professionals have a responsibility for both promoting and implementing interventions that are evidence- based AND objectively evaluating the effectiveness of those interventions through the data-based decision-making process. (NCLB & IDEA 2004)

22 Be Good Consumers Does the intervention meet the standards for Research-Based Interventions (internal validity)? Has the published research been peer- reviewed? Have the results been replicated? Does this apply to my population (External Validity)?

23 Selecting Research-Based Strategies Maalox Approach: attempt high-probability strategies that have demonstrated research support and are likely to show quick and effective results before conducting lengthy evaluations that may not lead to beneficial interventions.

24 Linking Behavioral Assessment to Intervention Through Problem Solving

25 To be successful in implementing Effective Behavioral Interventions you need to: Have a conceptual framework for what behavior REALLY is Know what expectations are for the student Know whether what you are doing is helpful, detrimental, or having no impact Be focused on the variables that you can immediately change

26 Typical Hypotheses What are common hypotheses for: –Not completing homework? –Talking out during class? –Out of Seat? –Inappropriate Touching of others? Group Example

27

28 Behavioral Function as a Framework for Understanding Behavior Determining what the antecedents and consequences are for a given behavior Focuses on what is maintaining the behavior not the cause!

29 What you must keep in mind Behavior has a function You are trying to identify the function You cannot be circular in your logic (e.g., ADHD).

30 4 General Functions of Behavior: To Get Something or To Get Out of Something Tangible Reinforcement: Food, Stickers, Toys Social Reinforcement: Teacher or Peer Escape/Avoidance: Get out of or terminate something Sensory Reinforcement: Visual, auditory primarily (Touch, smell)

31 Typical Hypotheses What are common hypotheses for: –Not completing homework? –Talking out during class? –Poor math test scores? –Inappropriate touching of others? Group Example

32 Approaches to Functional Assessment Questionnaire: Have others tell you what happens Observational Assessment: Watch and describe A-B-C’s Experimental Functional Analysis: Do a test of hypothesis. I usually do a bit of all three of the above.

33 Example of Functional Analysis: Talking out in class
Potential Function | Test Condition
Tangible R+ | Access to tangible contingent upon talking out
Attention | Reprimand contingent upon talking out
Escape | Escape from demand contingent upon talking out
Self-Stimulation | Leave isolated in room
Control Condition | Play with attention and no demands

34 What is the primary function of Behavior?

35 A More Applied Classroom Example Teacher & Student Behavior affect each other

36

37

38 Special Note An experimental functional analysis should be conducted if the FBA does not lend itself to an effective intervention.

39 Selecting and Developing Behavioral Intervention Plans A focus on behavioral function

40 Phases of Problem-Solving 1. Problem Identification 2. Problem Analysis 3. Plan Development 4. Plan Implementation 5. Plan Evaluation

41 Phase 1: Problem Identification What is the problem? & Is it a real problem?

42 4 Steps of Phase 1: Problem Identification 1.Operationally Define problem 2.Collect Baseline Data 3.State discrepancy between what is expected (typical peer performance) and what is occurring. 4.Identify a replacement behavior

43 Step 1 of Phase 1: Operationally Defining the Problem Must be observable Must be measurable Must pass the dead man's test Cannot be circular (e.g. ADHD is why Johnny acts that way)

44 Data can be collected from a number of sources: –R = Record Review –I = Interview –O = Observation –T = Testing Step 2 of Phase 1: Collect Baseline Data

45 RIOT Tips Collect only what you need to determine the discrepancy between what is expected (peer performance) and what is occurring (target student performance). Use existing data when possible: –ODR –Records (e.g., attendance, permanent products) Collect additional information when needed: –Interview –Observation (e.g., Frequency Count, On-task).

46 Step 3 of Phase 1: State Discrepancy Calculate the Behavior Discrepancy Ratio (BDR) –Include statement of student’s current level of performance. –Include statement of the expected level of performance (e.g., peer data, teacher expectation). –You are essentially specifying “the gap”

47 Behavior Discrepancy Ratios Formula: Target Student Behavior ÷ Peer Behavior –Example: For disruptive talking out during 7th grade math class, Jessica has engaged in the behavior on average 12 times per class period, while her 7th grade math class peers engage in disruptive talking out on average 3 times per class period. Target Student Behavior ÷ Peer Behavior = 12 ÷ 3 = 4x discrepant
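The ratio above is simple enough to compute by hand, but a small helper makes the team's calculation explicit. A minimal sketch (function name is my own):

```python
def behavior_discrepancy_ratio(target_rate, peer_rate):
    """Behavior Discrepancy Ratio: the target student's rate of the
    problem behavior divided by the typical-peer rate."""
    if peer_rate <= 0:
        raise ValueError("peer rate must be positive")
    return target_rate / peer_rate

# Jessica: 12 talk-outs per class period vs. a peer average of 3.
ratio = behavior_discrepancy_ratio(12, 3)
print(f"{ratio:.1f}x discrepant")  # 4.0x discrepant
```

A ratio of 2x or more is the rule of thumb for bringing a student to the problem-solving team.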

48 Discrepancy Ratios Enables team to make decisions about levels of support and resource from the start. Generally speaking… –A student who is 2x discrepant from his/her peers is appropriate for the problem-solving team. –If a student is significantly discrepant from peers, additional problem-solving and intervention resources may be appropriate. –Example: Jessica is 4 x discrepant from peers and MAY benefit from problem solving.

49 Provides a way to evaluate student outcomes
Name | Grade | Area | Initial Performance Discrepancy | Follow-Up Performance Discrepancy | Outcome Decision
Bill | 3 | Talking out | 3.5X | 1.9X | Satisfactory; Maintain/Fade Intervention
Susie | 2 | Out of Seat | 1.5X | NA | No Severe Problem
Rob | 4 | Homework Completion | 4.2X | 3.8X | No Progress; Recycle through process

50 Step 4: List Problem Behaviors and Prioritize Teams should tackle one or two problems at a time. Consider the following problems first: –Dangerous/Severe behaviors –High frequency behaviors –Foundational/Keystone behaviors (e.g., reading) –Chronic problem behaviors

51 Step 5 of Phase 1: Identify a Replacement Behavior State specifically what you want the student to do instead Example: Initiate compliance with 85% of requests within 5 seconds. Example: Raise hand 100% of time during independent seatwork when the student requires attention from the teacher. Example: Remain on-task for 7 minutes Example: Complete 3 digit by 2 digit multiplication problems with 95% accuracy in 9 weeks.

52 Group Example 1.Using the information below, how would you prioritize this 4th grade team’s list of generated concerns regarding William? –William is on-task during 40% of observed intervals compared to peers on-task 90% of intervals. –They think he may have ADHD –Makes inappropriate comments in class that disrupt students. –Inconsistent homework completion –Keeps a messy work area. 2. Based on the primary area of concern, how would you define the behavior in observable and measurable terms?

53 What can go wrong at Phase 1 (Problem Identification)? Cannot select one problem to focus on. Cannot empirically quantify the behavior/too vague or general about the concern. Jumping to solutions Cannot establish ‘typical peer’ behavior and discrepancy between what is expected and what is occurring Problem Naming or “Admiring the problem” Lack quantitative baseline data (verbal report only)

54 How to stay on track during Problem Identification… –Interview the teacher before the meeting to allow for venting time and facilitate the description of the problem. –Proactively collect school-wide benchmark data. –Collect baseline data before meeting. –Prioritize keystone behaviors. –State discrepancies before meeting –Identify replacement behavior before meeting

55 Phase 2: Problem Analysis What may be contributing to the behavior?

56 2 Steps of Phase 2: Problem Analysis 1.Collect enough of the right data. 2.Generate hypotheses of controllable variables related to the behavior.

57 Step 1: Collecting Enough of the Right Data Verbal Reports (e.g. interviews) Rating Scales (e.g. BASC) Record Review (e.g. Cumulative file, homework – permanent products) Observation Systems (e.g. BOSS) Direct Systematic Behavioral Observation (e.g. Interval recording, frequency counts)

58 RIOT (R)ecord Review (I)nterviews (O)bservation (T)est

59 Verbal Reports Reliability is a concern Can be used to generate hypotheses Get direct data (i.e. independent observation) to corroborate

60 Specific Questions to Teachers: Behavior Problems What does the behavior look like? How often does it occur What happens immediately before the behavior? What happens immediately after? What have you tried so far? What behavior would you rather see?

61 Rating Scales More reliable than verbal report Used only as a “screener” DO NOT USE ALONE FOR INTERVENTION OR DIAGNOSIS! Broadband versus Narrow band

62 Observations This is not an anecdotal report of what someone observed for a class period

63 General guidelines for observations: –Don’t be intrusive. –Agree upon a clearly defined and observable behavior first. –Observe across days/times/settings to increase reliability. –Use with other forms of assessment to increase validity. –Carefully consider the goal of the observation before selecting an observation tool. –Always note the environmental context of the behavior. –Observe students in their natural environments. –Always observe peers for a comparison.

64 Observation “systems” Save your money Very limited Use direct behavioral systematic observation methods

65 Direct Behavioral Observations ABC Logs Frequency Tabulation Logs Systematic Interval Recording

66 Examples of Direct Observations ABC Recording Antecedents - what occurs right before the behavior. Behavior - problem behavior (observable and defined) Consequences - what happens right after the behavior

67 Data-Based Decision Making Using Antecedent-Behavior-Consequence Logs

68

69 Practice Analyzing an ABC Log See handout Why do you think the behavior is occurring? What might you do for an intervention? What is an acceptable alternative behavior? How would you monitor progress?

70 1. What patterns do you see here? 2. What is the likely function of behavior?

71 Office Discipline Referral Form
Student: ____ Grade: ____ Referring Staff: ____ Date of Referral: ____ Time of Behavior: ____
Location: Classroom # ____ / Hallway / Cafeteria / Library / Bathroom ____ / Bus / Open Yard / In Front of School / Parking Lot / Other (please ID)
External Behavior: Abusive language / Physical contact / Sexual language toward peer or adult / Lying / Cheating / Vandalism / Smoking / Truant from class / Other (please ID)
Internalizing Behavior (Note: For internalizing referrals send the form, but do not send the student to the office unless necessary): Does not talk with peers / Excessively shy or withdrawn / Avoids social interaction / Appears fearful in non-threatening situations / Fails to assert self / Unresponsive to social situations / Doesn’t participate in social activities / Other (please ID)
Antecedent | Behavior | Consequence

72 Data-Based Decision Making Using Frequency Counts

73 Examples of Direct Observations: Frequency Count (RATE MEASURE!) –A measure of how often a clearly defined behavior occurs within a given period of time. –Examine the frequency of the behavior by tallying or counting the behavior as it occurs. –Use this when the behavior is discrete (has an obvious beginning and ending) and does not occur at very high rates. –This information is helpful at ALL steps of the problem-solving process. –ALWAYS MEASURE AS RATE WHEN POSSIBLE!
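Measuring as rate rather than raw count is what makes observations of different lengths comparable. A minimal sketch (function name and numbers are illustrative):

```python
def rate_per_minute(count, minutes):
    """Convert a raw frequency count into behaviors per minute so
    that observations of different lengths can be compared."""
    return count / minutes

# 9 talk-outs in a 30-minute observation vs. 4 in a 20-minute one:
# the raw counts suggest the first day was worse, the rates agree.
print(rate_per_minute(9, 30))   # 0.3 per minute
print(rate_per_minute(4, 20))   # 0.2 per minute
```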

74

75 Practice Using A Frequency Count/Rate Measure Log See Handout Determine the rate of behavior Determine Discrepancy Ratio –The average child does this on average 1.8 times per day. Write a hypothesis: Remember ICEL Develop a method for hypothesis testing: Remember RIOT

76 1.What day does the behavior most often occur? What day is it least likely to occur? 2.What time of day does the behavior most often occur? Least often? 3.When should someone come to visit if they wanted to witness the behavior? Note: It is just as important to look at when the behavior occurs as it is to look at when it doesn’t.

77 Data-Based Decision Making Using Direct Behavioral Observations

78 Examples of Direct Observations: Systematic Data Recording –Examine the percentage of target behavior by recording whether the selected student is engaging in the target behavior during 10-second intervals for 15 minutes. Peers are observed in the same way as a comparison. –Requires more training than the other observation tools. –This information is helpful at all steps of the problem-solving process.

79 Systematic Direct Behavioral Observations: Interval Recording Partial Interval Recording: Occurs anytime within interval Whole Interval Recording: Occurs majority of Interval Momentary Time Sampling: Within 3 seconds Duration Recording: How long behavior occurs
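Whichever interval method is used, the summary statistic is the same: the percent of intervals scored as containing the behavior, for the target student and for a comparison peer. A minimal sketch with made-up data:

```python
def percent_intervals(scored):
    """Percent of observation intervals scored as containing the
    target behavior (1 = scored, 0 = not scored)."""
    return 100 * sum(scored) / len(scored)

# Ten 10-second intervals scored with partial-interval recording;
# the data below are hypothetical.
target = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
peer   = [0, 0, 1, 0, 0, 0, 0, 1, 0, 0]
print(percent_intervals(target))  # 70.0
print(percent_intervals(peer))    # 20.0
```

Comparing the two percentages is what supports the discrepancy statement in Phase 1.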

80 [Interval-recording grids comparing the target child and a composite (peer) child across three coded behaviors.]

81 1.What can you get from this? 2.Are all of these behaviors severe enough to warrant individualized intervention?

82 Step 2: Writing a Hypothesis Provide the discrepancy statement Add because… at the end of the discrepancy statement and insert your hypothesis. The hypothesis should be specific, observable, and measurable. –Example: Beth is on-task for 35% of intervals while peers are on-task 87% of intervals during a 20-minute observation during direct instruction in Math class, because she is escaping the Math work which is above her instructional level.

83 Plan for the Collection of Additional Data Needed to Support Hypotheses Your hypotheses should be supported by at least 2 convergent sources of RIOT data, with at least one piece being objective. If you develop a hypothesis that you don’t have enough data to support, plan for the collection of the additional data you need to validate or refute the hypothesis. Data collection should be planned, not random!

84 Challenges/Barriers/What can go wrong at Phase 2: Problem Analysis Don’t consider appropriate variables Choosing variables you can’t change Get ‘stuck’ searching for the cause The Filibuster Individual team members focused on their own agenda. Problem analysis is skipped altogether Hypotheses selected are not supported by 2 forms of data

85 How to stay on track during Problem Analysis (Phase 2)… –Focus on behavioral function –Insist that a hypothesis needs at least two supporting pieces of evidence (one must be quantitative). –Enforce the agenda. –Verbally redirect those who provide solutions before developing a hypothesis.

86 Phase 3: Plan Development Linking Assessment to Intervention

87 2 Steps of Phase 3: Plan Development 1.Set a Goal 2.Develop a plan based on the hypothesized behavioral function

88 Step 1: Set a goal Goal should be to bring the student’s behavior into acceptable levels relative to peers. Discrepancy Ratios Criterion Based

89 Writing and Evaluating Measurable Goals Behavior Goals

90 Goals/Objectives Should state: Performance, condition, criteria and when possible the date. e.g. Given a verbal prompt to complete a mastered task, the student will initiate compliance within 5 seconds 90% of the time within 6 weeks. e.g. The student will “respect others” within 30% of average peer performance within 6 weeks.

91 Step 2: Develop Intervention based on Behavioral Function Extinction is difficult to manage An attention function should not get ignoring alone Remember to focus on the alternative replacement behavior Consider noncontingent reinforcement (NCR), differential reinforcement of other behavior (DRO), and response effort as starting points for brainstorming.

92 Phase 4: Plan Implementation Support & Integrity

93 3 Steps of Phase 4: Plan Implementation Specify when & Where (Steps 1 & 2) Not the entire day in the beginning. Not everywhere in the beginning. Specify who & What (Steps 3 & 4) Implementer(s) Integrity Monitor Data collector/analyzer Materials Specify How (Step 5) Articulation Form

94

95

96 Phase 5: Plan Evaluation

97 3 Steps of Phase 5: Plan Evaluation Answer the following questions: Is the intervention plan effective? A. Is the student making progress toward the goal? B. Is the student decreasing the discrepancy between him/her and the general education peers? C. Can the plan be maintained in the general education/current setting with the current level of support? You cannot evaluate an intervention if integrity is not maintained: no implementation, no evaluation, no change.

98 Questions

99 Tier I Academics

100

101 Instructional Analysis Form
Columns: Skill | Teaching Strategy | Materials | Format | Allocated Time | Reward or Reinforcer | Method of Assessment
Rows (skills): Phonemic Awareness, Phonics, Reading Fluency, Vocab, Comp

102 Tier II Academics Standard Protocol: Scripted with Monitoring of Progress

103

104 Tier III Individualized Assessment –Curriculum Based Evaluation (CBE) –Determine “instructional level” (CBA) Individualized Intervention –Linked to “instructional level” –Based on basic principles of learning

105 A word about “instructional level” Instructional level is not: –25th to 50th percentile (or something like that). Instructional level should be based on a criterion level of accuracy and rate. See Shapiro (2004), Howell & Nolett (2000). Instructional level can also be based on the skill level (i.e., stage of learning).

106 Linking Academic Assessment to Intervention Through Problem Solving

107 To be successful in implementing Effective Academic Interventions you need to: Have a conceptual framework for what learning REALLY is Know what expectations are for the student Know whether what you are doing is helpful, detrimental, or having no impact Be focused on the variables that you can immediately change

108 Typical Hypotheses What are common hypotheses for: –Not completing homework? –Reading “really choppy” out loud? –Performing poorly on tests only? –Inconsistent performance with subtraction? Group Example

109 Learning from an Instructional Hierarchy Perspective A framework for Linking Assessment to Intervention

110 The ABC’s of Learning Antecedent –Instructional pace/Materials/Methods –Location, Demands, etc. Behavior –Topography –Rate/Accuracy/Level/Trend/Expectation Consequences –Delayed versus immediate –Feedback versus none/ R+/P

111 The Instructional Hierarchy 4 Stages of Learning Development * Acquisition, Fluency, Generalization, Adaptation Similar to other “Stage Theories” with regard to pros and cons

112 Stage 1: Acquisition General Question: Accuracy of responding General Variable: Percent Correct General Strategies: 1. Modeling 2. Demonstration 3. Prompting * Often requires a task analysis

113 Modeling Presenting example of a skill e.g. Mathematics “here is a problem for you to look at”

114 Demonstration Active performance of a skill e.g., Mathematics “Watch me work this problem here”

115 Prompting Providing a cue to perform a target response e.g., Mathematics “Don’t forget to carry the 1”

116 Example of using Demonstration Subtraction with regrouping

117 The bottom number is bigger than the top number in the right column, so we must borrow from the left column.

118 Cross out the top number in the left column and write the number that is one less above it.

119 Now put a 1 in front of the top number in the right column.

120 Now subtract starting in the right column

121 Now subtract the left column

122 Example of Demonstration Prompting and Modeling Telling time to the nearest minute

123 What time is it?

124 Write down the number that the small hand is pointing to: 11 Hint: If in between two numbers, it is always the smaller number.

125 Now count by 5’s, starting with the number 1, and write down the number that the big hand is on: 11:45

126 Sometimes big hands are also between numbers. Let’s tell time.

127 Write down the number that the small hand is pointing to: 1: Hint: If in between two numbers, it is always the smaller number.

128 Now count by 5’s, starting with the number 1, and write down the count for the smaller of the two numbers the big hand is between, next to the clock: 1:__ :15

129 Now count each little tick mark after the smaller number and add it to the number you wrote down. 1:18

130 Stage 2: Fluency General Question: Accurate response rate General Variable: Behavior per minute (e.g. wrcpm) General Procedures: 1. Drill: Active repeated responses 2. Overlearning (Maintenance)

131 Example of Drill Basic Addition Facts

132 Flashcard Drill Procedure All possible combinations 0-12 Start timer Present first stimulus (wait time) If correct put in correct pile with feedback If incorrect put in incorrect pile with corrective feedback. Repeat procedure with incorrect pile until all cards are put into correct pile Graph Data and show student
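The drill procedure above can be sketched as a loop: present each card, sort into correct and incorrect piles, and re-run the incorrect pile until every card is answered correctly. This is only an illustration; for simplicity it assumes the corrective feedback teaches the missed fact, which is the optimistic case.

```python
def flashcard_drill(cards, known):
    """Present every card; correct cards go to the correct pile with
    praise, missed cards get corrective feedback and are re-presented
    until all cards end up in the correct pile.  Assumes a missed
    card is known on its next presentation (optimistic learner).
    Returns the total number of card presentations."""
    presentations = 0
    pile = list(cards)
    while pile:
        missed = []
        for card in pile:
            presentations += 1
            if card in known:
                continue             # correct pile, with feedback
            known.add(card)          # corrective feedback
            missed.append(card)      # retry on the next pass
        pile = missed
    return presentations

# All addition combinations 0-12 (169 cards); the starting set of
# "known" facts is a made-up stand-in for a real student's mastery.
cards = [(a, b) for a in range(13) for b in range(13)]
total = flashcard_drill(cards, known={(a, b) for a, b in cards if a + b < 5})
print(total)  # 323: 169 first-pass presentations + 154 retries
```

Graphing `total` (or time per pass) across sessions gives the data to show the student, as the slide suggests.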

133 Stage 3: Generalization General Measurement: Generalization/Transfer General Procedures: Practice (new response with other responses). Discrimination Training: Behavior in presence of one stimulus but not another. Differentiation: reinforce responses to stimulus while slowly varying one essential aspect of the stimulus

134 Example of Discrimination Training Letter Reversal b and d

135 b or d? Present a single stimulus to the student: “b” Ask: What letter is this? Correct response = praise Incorrect response = corrective feedback After 10 consecutive correct responses, fade in “d” After 10 consecutive correct responses, stop and start over with “d” After 10 consecutive correct responses, fade in “b” Alternate between the two letters, fading in others as needed Graph performance

136 Differentiation Learning to count money under the stimulus “How much is this?” (multiple coins placed in front of the child). Modify by placing coins heads up/tails up Modify by changing the prompt (is this more or less than 30 cents?) Use in multiple environments

137 Stage 4: Adaptation Changing the form of a response when needed, very efficiently “What’s up?” versus “How are you?” Making change Problem solving Multiple experiences in multiple environments with heavy feedback

138 Important Variables in understanding Instruction and Learning ABC’s and 123’s of learning and instruction

139 Types of Academic Time Allocated Time: - How much time in school we have Instructional Time - How much time teacher spends providing instruction Engaged Time - How much time student spends engaged * This is the best predictor of student performance

140 Question 1 Should we focus on increasing academic engaged time?

141 Yes and No Yes if completing the ABC’s with correct responses No if not completing ABC’s

142 ABC’s of Learning Antecedents: Instructional Directions Stimulus to respond in the presence of Pace of instruction

143 ABC’s Continued Behavior: Topography: Written, verbal, typed Response rate Inter-trial interval Wait times

144 ABC’s continued Consequences: Feedback (negative/positive) Immediate Contingent Change behavior

145 123’s Rate of accurate responding – This is what you graph as often as possible GPA, Grade, Accuracy – This is what you graph, report, measure as general long term goal attainment.

146 Phases of Problem-Solving 1. Problem Identification 2. Problem Analysis 3. Plan Development 4. Plan Implementation 5. Plan Evaluation

147 Phase 1: Problem Identification What is the problem? & Is it a real problem?

148 4 Steps of Phase 1: Problem Identification 1.Operationally Define problem 2.Collect Baseline Data 3.State discrepancy between what is expected (typical peer performance) and what is occurring. 4.Identify a replacement behavior

149 Phase 2: Problem Analysis What may be contributing to the behavior?

150 2 Steps of Phase 2: Problem Analysis 1.Collect enough of the right data. 2.Generate hypotheses of controllable variables related to the behavior.

151 Step 1: Collecting Enough of the Right Data Verbal Reports (e.g. interviews) Rating Scales (e.g. SMALSI) Record Review (e.g. Cumulative file, homework – permanent products) Direct Systematic Behavioral Observation (e.g. Interval recording, frequency counts, IAF)

152 Interviews: Specific Questions to Teachers related to Academics How are instructional assignments presented? What is expected? Where is the student currently? How are opportunities for practice presented? How is feedback provided? What has or has not worked?

153 Rating Scales In my opinion they are useless. A problem exists because the data tell us so. Use them only to support or generate a hypothesis. They are not “outcome measures.”

154 Observations Direct Observations of: –Academic Engaged Time –Instructional Analysis Form As an integrity tool

155 Testing This is NOT WJ, KTEA, WIAT etc. IQ NOT needed nor is it helpful. Curriculum Based Evaluation is the process.

156 Hypothesis: Won’t do Versus Can’t Do Provide reinforcer for reading accurately (50% increase?)

157 Reading Hypothesis 1: Error not important to meaning Tally errors and get percent of words that violate meaning (i.e. would give you a different sentence understanding). Shouldn’t be out of specified range (~5%).
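The tally above reduces to a single percentage checked against the ~5% range. A minimal sketch (function name and numbers are illustrative):

```python
def meaning_error_percent(meaning_violations, total_words):
    """Percent of words read that violate passage meaning (i.e.,
    errors that would change the sentence's sense); the slide
    suggests this should stay within roughly 5%."""
    return 100 * meaning_violations / total_words

# 4 meaning-changing errors in a 120-word passage:
print(round(meaning_error_percent(4, 120), 1))  # 3.3 -- within range
```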

158 Reading Hypothesis 2: Code Structure is the issue Read a passage and note errors. Are errors related to a pattern in the words? Be sure to base this on opportunity for error, not just percentage of errors.

159 Reading Hypothesis 3: Word Substitutions are? Related to phonics? –Misses phonetically regular portions of words –Can’t read nonsense words Not related to phonics? –Provide assisted self-monitoring –Maybe not a problem (check whether it affects meaning)

160 CBE: Comprehension

161 Let’s Change our Thinking Comprehension is a complex process Let’s talk about how a reader “reacts” to their reading. –Answering questions, retelling, paraphrasing, cloze, maze, t/f etc.

162 9 Causes of Comprehension Failure These are 9 things that a good reader does that a poor reader doesn’t. If you want a cool round number (the top “10” reasons), the 10th is insufficient reinforcement.

163 Strategies of Comprehension Monitor for meaning and self-correct Selective attention to text: Skimming, going over closely Adjust for Text Difficulty: Change rate, rereading, highlighting Connect with Prior Knowledge: Clarify: Figure it out in some way to make it make sense (Ask for help?; Google)

164 Enablers of Comprehension Decoding: 140 wcpm (after 3 rd grade) Vocabulary (Semantics) – 70% of the variability! –Definitions –Determining Word Meaning Grammar (Syntax): Rare, but could be ESL Prior Knowledge

165 CBE of Math

166 Mathematics Areas Computation: Accurately and quickly responding with symbols of quantity Concepts: Rules Strategies: Need to be efficient Facts: Numerical statements Application: Using math –Sub-domains: Tool use, content knowledge, and Vocabulary Problem-Solving: Using both computation and application.

167 Math Assessments Irrelevant standards Irrelevant formats Lack empirically validated sequencing Inadequate samples of student behavior Provide little insight into why errors are made Not aligned with instructional objectives

168 Interviewing & Error Analysis 2 ways of collecting information for the development of a hypothesis Interviewing: See Instructional Analysis Form, Previous questions to ask teachers Error Analysis: Need a lot of problems of the same type (Facts, operations, applications)

169 Example of CBE: Tammy Fourth-grade student Did not make adequate progress with the Tier II standard protocol intervention in winter School psychologist administered an individual probe (i.e., diagnostic tool) and observed Tammy’s completion of this probe An analysis of responding yielded a diagnosis of the problem This diagnosis of the problem informs intervention selection

170 1. What seems to be the problem? 2. What should the intervention target? 3. Describe something a teacher could do to target this problem. 4. Do you have to buy an expensive program just for Tammy?

171 Setting Goals with BMC considered Consider Basic Movement Cycle (BMC) –Think of it as a “handicap” Task Mastery Rate (TMR)= 50/minute Current BMC = 75/minute Expected BMC = 100/minute Formula: (TMR * Current BMC)/(EBMC) –(50*75)/100 = 37.5 –With current BMC student should be able to make 37.5 DCPM
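The BMC "handicap" adjustment above is a single proportion; a helper makes the scaling explicit (function name is my own):

```python
def adjusted_mastery_rate(task_mastery_rate, current_bmc, expected_bmc):
    """Scale the task mastery rate (TMR) criterion by the student's
    basic movement cycle (BMC), treating a slower-than-expected BMC
    as a handicap on the fluency goal."""
    return task_mastery_rate * current_bmc / expected_bmc

# TMR = 50/min, current BMC = 75/min, expected BMC = 100/min:
print(adjusted_mastery_rate(50, 75, 100))  # 37.5 DCPM
```

So with the current BMC the student should be expected to produce 37.5 digits correct per minute rather than the unadjusted 50.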

172 Step 2: Writing a Hypothesis Provide the discrepancy statement Add because… at the end of the discrepancy statement and insert your hypothesis. The hypothesis should be specific, observable, and measurable. –Example: Beth is on-task for 35% of intervals while peers are on-task 87% of intervals during a 20-minute observation during direct instruction in Math class, because she is escaping the Math work which is above her instructional level.

173 Consider Multiple Domains: ICEL InstructionCurriculum Environment Learner

174 Phase 3: Plan Development Linking Assessment to Intervention

175 2 Steps of Phase 3: Plan Development 1.Set a Goal 2.Develop a plan based on the hypothesis (ICEL)

176 Step 1: Set a goal Goal should be to bring the student’s performance into acceptable levels relative to peers or Criterion.

177 Determining Long-Range Goal Multiply the number of weeks that you will be monitoring by the criterion (expected ROI). Add this number to the median baseline point. Example: –Median baseline point = 35 –Number of weeks = 10 –Expected rate of growth (based on norms or suggestion)
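The two-step calculation above is: goal = median baseline + (weeks × expected ROI). The slide leaves the expected rate of growth blank, so the 1.5-per-week value below is purely illustrative:

```python
def long_range_goal(median_baseline, weeks, weekly_roi):
    """Long-range goal: median baseline point plus the expected
    rate of improvement (ROI) accumulated over the monitoring period."""
    return median_baseline + weeks * weekly_roi

# Median baseline = 35, 10 weeks of monitoring, hypothetical ROI of
# 1.5 units per week (substitute a norm-based ROI in practice):
print(long_range_goal(35, 10, 1.5))  # 50.0
```

Plotting this goal as an aim line against weekly progress-monitoring data supports the Phase 5 evaluation questions.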

178 [Graph showing baseline and intervention phases.]

179 Writing IEP Goals Long-range goal: In ___ (total # weeks), when presented with math problems from ____ (curriculum and grade level), ____ (student’s name) will perform ____ (long-range goal) with _____ errors or fewer.

180 Writing IEP Goals Short-term objective: Each successive week, when presented with a random selection from _____ (curriculum and grade level), ____ (student’s name) will perform at an average increase of _____ DCPM and no increase in errors.

181 Step 2: Develop Intervention based on Instructional Level Rule out motivation deficits Consider multiple topographies Consider stage of learning Match learning stage principle to instructional components.

182 Phase 4: Plan Implementation Support & Integrity

183 3 Steps of Phase 4: Plan Implementation Specify when & Where (Steps 1 & 2) Not the entire day in the beginning. Not everywhere in the beginning. Specify who & What (Steps 3 & 4) Implementer(s) Integrity Monitor Data collector/analyzer Materials Specify How (Step 5) Articulation Form

184 Instructional Analysis Form
Columns: Skill | Teaching Strategy | Materials | Format | Allocated Time | Reward or Reinforcer | Method of Assessment
Rows (skills): Phonemic Awareness, Phonics, Reading Fluency, Vocab, Comp

185

186 Phase 5: Plan Evaluation

187 3 Steps of Phase 5: Plan Evaluation Answer the following questions: Is the intervention plan effective? A. Is the student making progress toward the goal? B. Is the student decreasing the discrepancy between him/her and the general education peers? C. Can the plan be maintained in the general education/current setting with the current level of support? You cannot evaluate an intervention if integrity is not maintained: no implementation, no evaluation, no change.

188 Questions

