
1 1 Data Analysis within an RTI² Framework: Linking Assessment to Intervention. Aimee R. Holt, PhD, Middle Tennessee State University

2 2 What is RTI²? A systematic and data-based method for addressing academic concerns: –identifying –defining & –resolving. Brown-Chidsey & Steege (2010)

3 3 RTI² is a general education initiative. Components of RTI²: –High-quality instruction –Frequent assessment of academic skills –Data-based decision making. Brown-Chidsey & Steege (2010)

4 4 Problem Solving. At each tier within RTI², a problem-solving model is employed to make decisions: –Define the Problem and Develop an Assessment Plan (Problem Identification) –Analyze the Assessment Plan Results and Develop an Intervention Plan (Problem Analysis) –Implement the Plan and Progress Monitor –Analyze the Results of Implementation and Determine Next Steps (Plan Evaluation)

5 5 What would Assessment at Tier I look like?

6 6 Universal Screeners. LEAs are required to administer a nationally normed, skills-based universal screener to students at their grade level.

7 7 –For K-8, universal screeners should be administered 3X per year. –In grades 9-12, there are multiple sources of data that can be reviewed, such as: EXPLORE, PLAN, and ACT; the Tennessee Comprehensive Assessment Program (TCAP), which includes the Writing Assessment (TCAP-WA), End of Course (EOC) exams, 3-8 Achievement, and, in subsequent years, the Partnership for Assessment of Readiness for College and Careers (PARCC); and TVAAS.

8 8 Characteristics of Appropriate Universal Screening Tools –Helps answer questions about the efficiency of the core program –Aligns with the curriculum for each grade level –Skills mastery aligns with the state-mandated year-end assessment. Ikeda, Neessen, & Witt (2008)

9 9 3 Types of CBMs: General Outcome Measures (GOMs), Skills-Based Measures (SBMs), Subskill Mastery Measures (SMMs)

10 10 General Outcome Measures. GOMs –sample performance across several goals at the same time (capstone tasks) –Ex. oral reading fluency. Can be used for: –screening (benchmarking) –survey & specific level assessment –progress monitoring

11 11 Skills-Based Measures. SBMs are similar to GOMs but can be used when capstone tasks are not available –Ex. math computation. Can be used for: –screening (benchmarking) –survey & specific level assessment –progress monitoring

12 12 Subskill Mastery Measures SMMs are very narrow in focus –Ex. Names of letters Should not be used for benchmarking –(exception… early skills such as Letter Naming Fluency, Letter Sound Fluency, Number Naming Fluency)

13 13 Example Reading Skills Typically Assessed by Universal Screeners, by grade:
–6th: Oral Reading Fluency; Reading for understanding
–5th: Oral Reading Fluency; Reading for understanding
–4th: Oral Reading Fluency; Reading for understanding
–3rd: Oral Reading Fluency; Reading for understanding
–2nd: Oral Reading Fluency; Reading for understanding
–1st: Letter Naming Fluency (beginning); Phonemic Awareness; Phonics; Word Identification Fluency; Oral Reading Fluency (end)
–K: Letter Naming Fluency; Phonemic Awareness; Early Phonics Skills, including Letter Sound Fluency

14 14 What would Data Analysis at Tier I look like?

15 15 Making Decisions about Group Data. Review universal screening data to answer the following questions: –Is there a class-wide problem? –Who needs a Tier II intervention? Be sure to examine students at the margin. –Does anyone need Tier III now?

16 16

17 17 Who needs Tier II or Enrichment?
Winter benchmarks for ORF: 90th percentile = 153; 25th percentile = 72
Winter benchmarks for Maze: 90th percentile = 25; 25th percentile = 9
Instructional level criteria: –for contextual reading, 93-97% correct –for most other academic skills, 85-90% correct
Sample student data (score / accuracy):
–ORF 154 / 100%; Maze 26 / 98%
–ORF 154 / 85%; Maze 26 / 79%
–ORF 68 / 95%; Maze 9 / 94%
–ORF 68 / 88%; Maze 8 / 80%

18 18 Examining Students at the Margins
Winter benchmarks for ORF: 90th percentile = 153; 25th percentile = 72
Winter benchmarks for Maze: 90th percentile = 25; 25th percentile = 9
Instructional level criteria: for contextual reading, 93-97% correct
Sample student data (score / accuracy):
–ORF 75 / 96%; Maze 11 / 100%
–ORF 80 / 100%; Maze 10 / 97%
–ORF 73 / 82%; Maze 11 / 75%

19 19 Identifying Who Needs Tier III
Winter benchmarks for ORF: 25th percentile = 72; 10th percentile = 44
Winter benchmarks for Maze: 25th percentile = 9; 10th percentile = 6
Instructional level criteria: for contextual reading, 93-97% correct
Sample student data (score / accuracy):
–ORF 46 / 76%; Maze 6 / 80%
–ORF 42 / 83%; Maze 5 / 75%
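
The decision rules on slides 17-19 can be expressed as a small screening routine. The sketch below is illustrative and not part of the presentation: the winter cutoffs and the instructional-accuracy ranges come from the slides, while the function name, the data layout, and the way the two measures are combined are assumptions.

```python
# Minimal sketch of the tier-placement logic from slides 17-19.
# The cutoff scores and accuracy ranges come from the slides; how the two
# measures are combined is a local team decision and is assumed here.

ORF_P25, ORF_P10 = 72, 44    # winter ORF 25th / 10th percentile (wcpm)
MAZE_P25, MAZE_P10 = 9, 6    # winter Maze 25th / 10th percentile (correct)


def recommend_tier(orf_wcpm, orf_accuracy, maze_correct, maze_accuracy):
    """Return a rough tier recommendation from winter universal screening data."""
    below_10th = orf_wcpm < ORF_P10 and maze_correct < MAZE_P10
    below_25th = orf_wcpm < ORF_P25 or maze_correct < MAZE_P25
    # Contextual reading below ~93% accuracy, or other skills below ~85%,
    # is at frustration level even when the rate looks adequate (slide 18,
    # "students at the margin").
    frustration_level = orf_accuracy < 0.93 or maze_accuracy < 0.85

    if below_10th:
        return "Consider Tier III now"
    if below_25th or frustration_level:
        return "Tier II intervention"
    return "Core instruction (consider enrichment near the 90th percentile)"


# The second student on slide 19: ORF 42 wcpm at 83%, Maze 5 at 75%.
print(recommend_tier(42, 0.83, 5, 0.75))   # -> Consider Tier III now
```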

20 20 Referral to Tier II Decision Tree
–Core literacy instruction has been implemented with fidelity
–≥80% of student needs are met by core instruction
–Differentiated instruction has been provided in a small group within core literacy instruction
–Student has been present for ≥75% of instructional days
–Student has passed vision and hearing screening
–Data indicates performance below the 25th percentile on universal screening of student achievement compared to national norms
–Additional assessment data supports universal screening data

21 21 What do we mean by linking assessment to intervention?

22 22 Linking Assessment to Interventions. Research has shown that effective interventions have certain features in common: –Correctly targeted to the student's deficit –Appropriate level of challenge (instructional range) –Explicit instruction in the skill –Frequent opportunities to practice (respond) –Immediate corrective feedback (e.g., Brown-Chidsey & Steege, 2010; Burns, Riley-Tillman, & VanDerHeyden, 2013; Burns, VanDerHeyden, & Boice, 2008)

23 23 Academic Instruction in Reading Both NCLB and IDEA require that instruction in the general education setting cover all 5 areas of reading identified by the National Reading Panel

24 24 Linking the 5 skill areas to 3 SLD areas

25 25 Phonological Awareness. A metacognitive understanding that the words we hear have internal structures based on sound –Research on PA has shown that it exerts an independent causal influence on word-level reading (Berninger & Wagner, 2008) –Phoneme: the smallest unit of speech; the English language has approximately 44 phonemes

26 26 Phonics. Alphabetic principle: linking phonological (sound) and orthographic (symbol) features of language (Joseph, 2006) –Important for learning how to read and spell. National Reading Panel: –students who received explicit alphabetic principle instruction showed benefits through the 6th grade –Phonological awareness is a prerequisite skill

27 27 Word Reading Skills (McCormick, 2003) –Word identification: the instance when a reader accesses one or more strategies to aid in reading words (e.g., applying phonic rules or using analogies); decoding: blending sounds in words or using letters in words to cue the sounds of others in a word (Joseph, 2006) –Word recognition: the instant recall of words or reading words by sight; automaticity

28 28 Fluency: "The ability to read a text quickly, accurately, and with proper expression" (NRP, 2000, p. 3-5). Most definitions of fluency include an emphasis on prosody: the ability to read with correct expression, intonation, and phrasing (Fletcher et al., 2007). National Reading Panel: good reading fluency skills improved recognition of novel words, expression during reading, accuracy, and comprehension.

29 29 Vocabulary & Text Comprehension Skills –Vocabulary knowledge, including understanding multiple meanings of words, figurative language, etc. –Identifying stated details –Sequencing events –Recognizing cause and effect relationships –Differentiating facts from opinions –Recognizing main ideas (getting the gist of the passage) –Making inferences –Drawing conclusions

30 30 What Would Assessment at Tier II Look Like?

31 31 So you have identified your "at-risk" students. Now what? You will need to conduct a Survey Level Assessment (SLA) for these students. Survey Level Assessment (SLA) –Can be used to: (a) provide information on the difference between prior knowledge and skill deficits, which is used to plan instructional interventions, & (b) serve as a baseline for progress monitoring

32 32 Why is it important to conduct Survey Level Assessments before beginning Tier II interventions? The primary question being addressed by the survey level assessment at Tier II is: –"What is the CATEGORY of the problem?" –(What is the specific area of academic deficit?) (e.g., Riley-Tillman, Burns, & Gibbons, 2013)

33 33 An Example of Survey Level Assessment Using DIBELS
Grade / CBM assessed / when benchmarked:
–6th: Oral Reading Fluency (Fall, Winter, Spring)
–5th: Oral Reading Fluency (Fall, Winter, Spring)
–4th: Oral Reading Fluency (Fall, Winter, Spring)
–3rd: Oral Reading Fluency (Fall, Winter, Spring)
–2nd: Oral Reading Fluency (Fall, Winter, Spring)
–1st: Oral Reading Fluency (Winter, Spring); Nonsense Word Fluency (Fall, Winter, Spring); Phoneme Segmentation Fluency (Fall, Winter, Spring); Letter Naming Fluency (Fall)
–K: Nonsense Word Fluency (Winter, Spring); Phoneme Segmentation Fluency (Winter, Spring); Letter Naming Fluency (Fall, Winter, Spring); Initial Sound Fluency (Fall, Winter)
1) Start at the student's grade level. 2) Test backwards by grade until the student has reached the "low risk" benchmark for a given skill. Low risk/established indicates the student has "mastered" that skill.

34 34 For example, in reading: –low comprehension with adequate fluency = comprehension intervention –low comprehension and low fluency, but adequate decoding = fluency intervention –low comprehension, low fluency, and low decoding, but adequate phonemic awareness skills = decoding intervention. Riley-Tillman et al. (2013)
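
Read as a decision rule, the hierarchy above targets the lowest reading skill that is not yet adequate. The sketch below is a minimal illustration of that logic, not the authors' procedure; the function name and the boolean "adequate" flags are hypothetical.

```python
# Minimal sketch of the survey-level "category of deficit" logic on slide 34.
# It walks down the reading hierarchy (comprehension -> fluency -> decoding ->
# phonemic awareness) and targets the lowest skill that is not yet adequate.

def intervention_category(comprehension_ok, fluency_ok, decoding_ok, phonemic_awareness_ok):
    if not phonemic_awareness_ok:
        return "phonemic awareness intervention"
    if not decoding_ok:
        return "decoding (phonics) intervention"
    if not fluency_ok:
        return "fluency intervention"
    if not comprehension_ok:
        return "comprehension intervention"
    return "no reading intervention indicated"


# A student with low comprehension and low fluency but adequate decoding and
# phonemic awareness gets a fluency intervention, as on the slide.
print(intervention_category(False, False, True, True))  # -> fluency intervention
```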

35 35 Let's look at Michael, a 2nd grade student. At the fall benchmark, he was identified on ORF as being in the "some risk" range; his score was 30 wcpm. Survey level assessments were conducted using: –DORF, 1st grade (fluency) –DNWF, 1st grade (decoding) –DPSF, 1st grade (phonemic awareness). Problem Identification; Problem Analysis

36 36 Michael's Scores: –DORF: 35 wcpm –DNWF: 28 scpm –DPSF: 38 pcpm
DIBELS Scores Representing Skills Mastery (by benchmark period):
–DORF: > 20 (Winter); > 40 (Spring)
–DNWF: > 24; > 50
–DPSF: > 35
–DLNF: > 37 (Fall)
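
Here is a brief worked comparison of Michael's survey-level scores against the mastery cutoffs above, assuming the end-of-year (spring) criteria represent skills mastery; the numbers come from the slides, while the dictionary layout and the summary comment are editorial.

```python
# Worked comparison of Michael's survey-level scores (slide 36) against the
# end-of-first-grade mastery cutoffs shown on the same slide.

mastery_cutoffs = {"DORF": 40, "DNWF": 50, "DPSF": 35}   # assumed spring criteria
michael = {"DORF": 35, "DNWF": 28, "DPSF": 38}

for measure, score in michael.items():
    status = "mastered" if score > mastery_cutoffs[measure] else "not yet mastered"
    print(f"{measure}: {score} vs > {mastery_cutoffs[measure]} -> {status}")

# DPSF (phonemic awareness) is established, but DNWF (decoding) and DORF
# (fluency) are not, so the lowest unestablished skill, decoding, becomes the
# intervention target and DIBELS NWF becomes the progress monitoring measure
# (consistent with the ROI worksheet on slide 45).
```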

37 37 What next…. You link your assessment data to an intervention that targets the category of skill deficit that was identified You select progress monitoring probe(s) that assess that skill You set the student’s goal for improvement – You can use ROI & Gap Analysis Worksheets to help with this

38 38 What progress monitoring is not… It is NOT an instructional method or intervention Think of progress monitoring as a template that can be laid over goals and objectives from an assortment of content areas

39 39 What Would Data Analysis at Tier II Look Like?

40 40

41 41 Referral to Tier III Decision Tree
–Tier II intervention(s) have occurred daily for 30 minutes in addition to core instruction
–Intervention logs attached (3)
–Fidelity checks completed and attached
–Implementation integrity has occurred with at least 80% fidelity
–Student has been present for ≥75% of intervention sessions
–Tier II intervention(s) adequately addressed the student's area of need

42 42 –Tier II intervention was appropriate and research-based. Research-based interventions are: □ Explicit □ Systematic □ Standardized □ Peer reviewed □ Reliable/valid □ Able to be replicated
–Progress monitoring has occurred with at least weekly data points OR bi-monthly data points
–Gap analysis indicates that the student's progress is not sufficient for making adequate growth with current interventions

43 43 Does a student require Tier III intervention? Step 1: Check to see if the data can be interpreted. –A minimum of 8-10 data points (if progress monitoring every other week), or ___ data points (if progress monitoring weekly), are needed to make a data-based decision to change to Tier III.

44 44 Step 2: Examine Rate of Improvement –You can compare the student’s actual ROI to the goal that was established –You can use the ROI worksheets Let’s complete one for Michael

45 45 Completing the ROI Worksheet for Michael
Assessment used: DIBELS NWF
Student's score on first probe administered: 28
Student's score on last probe administered: 37
Fall benchmark expectation: 24
Spring benchmark expectation: 50
Step 1: (Spring benchmark expectation - Fall benchmark expectation) / Number of weeks = Typical ROI (slope)
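
Below is a hedged worked version of Step 1 of the worksheet, plus the analogous calculation for Michael's actual growth. The benchmark values and probe scores come from the slides; the week counts are assumptions, since the worksheet leaves those blanks to be filled in locally.

```python
# Hedged worked version of the ROI worksheet on slide 45. NWF benchmarks
# (fall 24, spring 50) and Michael's first/last probe scores (28, 37) come
# from the slides; the week counts below are illustrative assumptions.

WEEKS_FALL_TO_SPRING = 36      # assumed length of the fall-to-spring benchmark period
WEEKS_OF_INTERVENTION = 9      # assumed weeks between Michael's first and last probe

typical_roi = (50 - 24) / WEEKS_FALL_TO_SPRING    # Step 1: expected growth per week
michael_roi = (37 - 28) / WEEKS_OF_INTERVENTION   # actual growth per week

print(f"Typical ROI: {typical_roi:.2f} sounds/week")    # ~0.72 with the assumed 36 weeks
print(f"Michael's ROI: {michael_roi:.2f} sounds/week")  # 1.00 with the assumed 9 weeks

# If the student's actual ROI falls short of the typical (or goal) ROI, the
# team considers intensifying the intervention or moving to Tier III.
```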

46

47

48 48 You also can visually analyze the graphed progress monitoring data: –Calculate the trend line of the intervention data points and compare it to the aim (goal) line. –If the slope of the trend line is less than the slope of the aim line, the student may need to be moved to Tier III, especially if it appears that, given the student's current ROI, they will not meet year-end grade-level standards.
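
To make the trend-versus-aim comparison concrete, here is a minimal sketch (not from the presentation) that fits an ordinary least-squares trend line to hypothetical weekly probe scores and compares its slope to an aim-line slope. The scores, the goal of 50, and the 18-week timeline are illustrative assumptions; statistics.linear_regression requires Python 3.10+.

```python
# Minimal sketch of the visual analysis on slide 48: fit a trend line to the
# progress monitoring data and compare its slope to the aim (goal) line.
import statistics

weeks = [1, 2, 3, 4, 5, 6, 7, 8]
scores = [28, 29, 31, 30, 32, 33, 33, 35]   # weekly NWF probe scores (hypothetical)

# Ordinary least-squares slope of the trend line (Python 3.10+).
trend_slope = statistics.linear_regression(weeks, scores).slope

# Aim line slope: (goal score - baseline score) / weeks to the goal date.
aim_slope = (50 - 28) / 18                  # hypothetical goal of 50 in 18 weeks

if trend_slope < aim_slope:
    print(f"Trend {trend_slope:.2f} < aim {aim_slope:.2f}: consider moving to Tier III")
else:
    print(f"Trend {trend_slope:.2f} >= aim {aim_slope:.2f}: continue current intervention")
```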

49 49 Dual Discrepancy –A student should be deficient in level and have a poor response to evidence-based interventions (slope), to the degree that he/she is unlikely to meet benchmarks in a reasonable amount of time without intensive instruction, in order to move: –from Tier II to Tier III, as well as –from Tier III to a referral for a comprehensive special education evaluation. (e.g., Brown-Chidsey & Steege, 2008; Lichtenstein, 2008)
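
A dual-discrepancy check reduces to two comparisons. The sketch below is illustrative only: the slides do not prescribe specific cutoffs, so the 25th-percentile level criterion and the typical-ROI slope criterion used here are common conventions, and the function name is hypothetical.

```python
# Minimal sketch of a dual-discrepancy check (slide 49): the student should be
# discrepant in both level and slope before intensifying to the next tier.

def dual_discrepancy(current_score, benchmark_25th, actual_roi, typical_roi):
    low_level = current_score < benchmark_25th   # deficient in level
    low_slope = actual_roi < typical_roi         # poor response (slope)
    return low_level and low_slope


# A student scoring 42 wcpm against a 72 wcpm cutoff and growing 0.4 wcpm/week
# against a typical 1.0 wcpm/week is discrepant on both dimensions.
print(dual_discrepancy(42, 72, 0.4, 1.0))   # -> True
```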

50 50 What Would Assessment at Tier III Look Like?

51 51 Specific Level Assessment (functional analysis of skills) –Is used to: (a) identify specific skill deficits; (b) identify the student's prior knowledge; & (c) serve as a baseline for progress monitoring –Specific level assessments rely primarily on subskill mastery measures to "drill down" to specific deficits.

52 52 Functional Analysis: RIOT/ICEL Matrix –R: review; I: interview; O: observe; T: test –I: instruction; C: curriculum; E: environment; L: learner

53 53 Linking Assessment Data to Intervention at Tier III –The learner: focus on alterable learner variables; identify academic entry-level skills –The task: the level of the material the student is expected to master –The instruction: research-based methods and management strategies used to deliver the curriculum. A match among instruction, student, and task = success.

54 54 Targets for Academic Instructional Materials. Instructional level: –contextual reading: 93-97% correct –other academic skills: 85-90% correct –Produces larger gains more quickly. Gravois, T. A., & Gickling, E. E. (2008). Best practices in instructional assessment. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology (5th ed.). Bethesda, MD: National Association of School Psychologists.

55 55 Daly, Chafouleas, & Skinner (2005) Phonemic Awareness Hierarchy

56 56 Let's look at Michael again. Specific Level Assessment: –Phonics: Decoding Skills Test; Developmental Spelling Analysis –Sight words: graded word list –Phonemic Awareness: LAC-3. Problem Analysis

57 57 Linking specific level assessment data to interventions. Basing interventions on direct samples of a student's academic skills has been shown to result in larger effect sizes than interventions derived from other data –This is also known as a skill-by-treatment interaction (Burns, Codding, Boice, & Lukito, 2010)

58 58 What Would Data Analysis at Tier III Look Like?

59 59 Need to look at 3 areas »Level »Slope »Variability

60 60 Level –The central location of data within a phase –Often compared to a benchmark (goal/aim line) –Can also look at the mean or median for each phase (e.g., Daly et al., 2010; Hixson et al., 2008; Riley-Tillman & Burns, 2009) –Can conduct a Gap Analysis using the worksheet
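
As a small illustration of examining level, the sketch below compares the median of each phase to a benchmark; the phase values and the benchmark are made-up numbers, not data from the presentation.

```python
# Minimal sketch of examining "level" (slide 60): compare the median of the
# data within each phase to the benchmark.
import statistics

baseline_phase = [28, 26, 30]                     # hypothetical scores
intervention_phase = [31, 33, 32, 35, 36, 34]     # hypothetical scores
benchmark = 50

for name, phase in [("baseline", baseline_phase), ("intervention", intervention_phase)]:
    print(f"{name}: median {statistics.median(phase)} vs benchmark {benchmark}")
```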

61 61 Slope/Trend –How the central location changes over time –With academic data we are usually looking for an increase in skills –The target student's ROI can be compared with a peer group's ROI or a benchmark (e.g., Daly et al., 2010; Hixson et al., 2008; Riley-Tillman & Burns, 2009)

62 62 2 approaches for analyzing slope: –Calculate ROI and compare it to an identified peer group using the ROI worksheet –Plot the trend line and compare the aim (goal) line to the slope (trend) line

63 63 Variability –Should be examined both within and between phases –General rule: most of the variability in the data should be explained by the trend line; 80% of the data points should fall within 15% of the trend line
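
The 80%/15% rule can be checked directly once a trend line has been fit. The sketch below is an illustration with hypothetical scores; statistics.linear_regression requires Python 3.10+.

```python
# Minimal sketch of the variability check on slide 63: count how many points
# fall within 15% of the fitted trend-line value and verify at least 80% do.
import statistics

weeks = list(range(1, 11))
scores = [28, 31, 29, 33, 30, 35, 34, 36, 38, 37]   # hypothetical probe scores

fit = statistics.linear_regression(weeks, scores)
predicted = [fit.slope * w + fit.intercept for w in weeks]

within = [abs(s - p) <= 0.15 * p for s, p in zip(scores, predicted)]
share = sum(within) / len(within)

print(f"{share:.0%} of points fall within 15% of the trend line")
if share < 0.80:
    print("Variability is high; interpret level and slope cautiously")
```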

64 64

65 65 Referral for SLD Evaluation Decision Tree
–Tier III intervention(s) have occurred daily for 60 minutes in addition to core instruction
–Intervention logs attached (5)
–Fidelity checks completed and attached
–Implementation integrity has occurred with at least 80% fidelity
–Student has been present for ≥75% of intervention sessions
–Tier III intervention(s) adequately addressed the student's area of need

66 66 Referral for SLD Evaluation Decision Tree
–Tier III intervention was appropriate and research-based. Research-based interventions are: □ Explicit □ Systematic □ Standardized □ Peer reviewed □ Reliable/valid □ Able to be replicated
–Progress monitoring has occurred with at least weekly data points OR bi-monthly data points at Tier III
–Gap analysis indicates that the student's progress is not sufficient for making adequate growth with current interventions

67 67 Referral for SLD Evaluation Decision Tree The following have preliminarily been ruled out as the primary cause of the student’s lack of response to intervention □ Visual, motor, or hearing disability □ Emotional disturbance □ Cultural factors □ Environmental or economic factors □ Limited English proficiency □ Excessive absenteeism

68 68 Deciding to refer for an SLD evaluation. As part of the team's decision to refer for an SLD evaluation, a Gap Analysis should be conducted. Let's look at how to complete the Gap Analysis worksheet with Michael.

69 69 Gap Analysis
Assessment used: 2nd grade ORF
Student's current benchmark performance: 66
Student's current rate of improvement (ROI): 1.3
Current benchmark expectation: 90
End-of-year benchmark expectation: 90
Number of weeks left in the school year: 5
Is the gap significant? Current benchmark expectation / Current performance = Current gap. □ Yes □ No
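
Here is a hedged worked version of the worksheet. Michael's numbers come from the slide; the follow-up calculation of the ROI needed to close the gap is added for illustration and is not a requirement stated on the slide.

```python
# Hedged worked version of the gap analysis worksheet on slide 69.

current_performance = 66      # 2nd grade ORF, words correct per minute
current_roi = 1.3             # words per week
benchmark_expectation = 90    # current and end-of-year expectation
weeks_left = 5

gap = benchmark_expectation / current_performance
print(f"Current gap: {gap:.2f}")                       # ~1.36

# Illustrative follow-up: growth needed per week to close the gap by the end
# of the year, compared with the growth the student is actually making.
needed_roi = (benchmark_expectation - current_performance) / weeks_left
print(f"ROI needed: {needed_roi:.1f} vs current ROI {current_roi}")   # 4.8 vs 1.3
```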

70 70 Conducting a Gap Analysis

71 71 Additional Consideration

72 72 SEM. Additionally, we cannot ignore issues such as interpreting CBM scores in light of the standard error of measurement (SEM) or confidence intervals (CI) when those scores are used for purposes such as diagnoses and eligibility determinations. For a more detailed discussion, including suggested SEM guidelines for oral reading fluency scores in grades 1-5, see: –Christ, T. J., & Silberglitt, B. (2007). Estimates of the standard error of measurement for curriculum-based measures of oral reading fluency. School Psychology Review, 36.
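
As an illustration of the point, the sketch below places a confidence interval around an observed ORF score; the SEM value is an assumed placeholder, since the grade-specific estimates belong to Christ & Silberglitt (2007) and are not reproduced here.

```python
# Minimal sketch of interpreting a CBM score in light of SEM (slide 72).

observed_orf = 66    # words correct per minute
sem = 8              # assumed SEM in wcpm (placeholder, not from the source)
z = 1.645            # multiplier for a ~90% confidence interval

low, high = observed_orf - z * sem, observed_orf + z * sem
print(f"ORF {observed_orf} wcpm, 90% CI roughly {low:.0f}-{high:.0f} wcpm")

# If a cut score (e.g., the 25th percentile) falls inside this interval, the
# team should be cautious about treating the score as clearly above or below it.
```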

73 73 Use of Progress Monitoring in Special Education. Because CBM data –can be directly tied to the skill development necessary to be successful in the curriculum, –possess a higher level of sensitivity, and –allow for graphic representation, –they allow for the development of a higher quality IEP. Progress monitoring should continue after the IEP is initiated. Exit criteria can be set to determine if an early reevaluation can be completed due to student success.

74 74 Helpful Resources

75 75 Helpful Resources from NASP

76 76

77 77 Additional Helpful Resources Guilford Press

78 78

79 79

80 80

81 81

