
1 Universal Screening Illinois ASPIRE Alliance for School-based Problem-solving & Intervention Resources in Education Illinois ASPIRE is a State Personnel Development Grant-funded initiative of the Illinois State Board of Education. All funding is from federal sources.

2 Illinois ASPIRE Alliance for School-based Problem-solving & Intervention Resources in Education
Project Goal: Establish and implement a coordinated, regionalized system of personnel development that will increase school systems’ capacity to provide early intervening services [with an emphasis on reading], aligned with the general education curriculum, to at-risk students and students with disabilities, as measured by improved student progress and performance. Illinois ASPIRE is a State Personnel Development Grant-funded initiative of ISBE. All funding is from federal sources.

3 Illinois ASPIRE Alliance for School-based Problem-solving & Intervention Resources in Education
Objectives: Deliver research-based professional development and technical assistance in Problem-Solving Service Delivery Systems, Response-to-Intervention (RTI), scientifically based reading instruction, and Standards Aligned Classrooms (SAC). Increase the participation of parents in decision-making across district sites. Incorporate professional development content into higher education general and special education preservice & graduate level curricula. Evaluate the effectiveness of project activities. Illinois ASPIRE is a State Personnel Development Grant-funded initiative of ISBE. All funding is from federal sources.

4 Intended Participant Outcomes
Understand Key Concepts and Vocabulary of Universal Screening Distinguish Between Universal Screening and Referral-Driven Problem-Solving Be Able to Organize and Implement a Benchmark Assessment Process for Universal Screening Using CBM (or a Member of the CBM “Family” Like DIBELS)

5 UNIVERSAL SCREENING AND BENCHMARKING…
always need to occur in the context of all the problem-solving/RtI components in place in the school.

6 Foundational Concepts, Vocabulary, and Tools of RtI
IT’S ALL ABOUT A MAJOR CHANGE IN HOW WE GO ABOUT OUR BUSINESS OF HELPING KIDS. Response-to-Intervention and Problem-Solving are about: Thinking Differently About Problems, Causes, and Solutions (Concepts); Talking Differently About Problems, Causes, and Solutions (Vocabulary); Doing Some Things Differently (Tools and Behaviors). “You can’t do something different in the same way.” (Dr. George Batsche)

7 All the principles and components of RtI are about building a better support system for general education. DOING IT BETTER DOING IT DIFFERENTLY PROVIDING MORE LEVELS OF SUPPORT TO HELP ALL STUDENTS REALLOCATING RESOURCES/SKILLS IN DIFFERENT WAYS

8 Foundational Concepts: The Data, The Interventions, The Problem Solving Process

9 What is Response to Intervention (RtI)?
What is Response to Intervention (RtI)? (Batsche, Elliott, Graden, Grimes, Kovaleski, Prasse, Reschly, Schrag, Tilly, 2005) • Identifying and providing high-quality instruction and research-based interventions matched to students’ needs • Measuring rate of improvement (ROI) over time to make important educational decisions • Educators using ongoing student performance data to determine if an intervention is working; if it is not, it is time to do something different

10 An RtI Vision Any School, USA
Efficient Teaming and Problem Solving at each Tier Data-Based Decisions at each Tier Intervention-rich environment at each Tier

11 And… For a child suspected of having a specific learning disability,
the group MUST consider, as part of the evaluation, data that demonstrate …data-based documentation of repeated assessments of achievement at reasonable intervals, reflecting formal assessment of student progress during instruction, which was provided to the child’s parents. (Individuals with Disabilities Education Improvement Act of 2004, IDEIA)

12 Problem Solving Method
Problem Identification Is there a problem? What is it? Plan Evaluation Did our plan work? Problem Analysis Why is it happening? Plan Development What shall we do about it?

13 Purposes of Assessment
Who has problems? (Problem Identification): Screening
Why is the problem occurring? (Problem Analysis): Diagnostic
Is our instruction working to fix the problem? (Plan Development & Implementation): Progress Monitoring
How well are we doing overall? (Plan Evaluation): Outcome/Accountability
Taken from Heartland AEA 11

14 Assessment Systems Used in RtI Models
Taken from Heartland AEA 11

15 Assessment Systems Used in RtI Models
ISAT, MAP, Aimsweb, DIBELS, ITBS, Terra Nova; Aimsweb, DIBELS, CBE-R, SLA, ISEL, QRI, MAP, Running Records, Informal Phonics; Aimsweb, DIBELS, Sopris West tools, Functional Behavioral Assessment. Taken from Heartland AEA 11

16 Use Scientifically Based Problem Identification & Progress Monitoring Tools
NATIONAL CENTER ON STUDENT PROGRESS MONITORING (RTI4SUCCESS.ORG)

17 Standards for Scientifically Based Problem ID and Progress Monitoring Have Been Established
Reliability: a quality of a good test
Validity: a quality of a good test
Sufficient number of alternate forms of equal difficulty: essential for progress monitoring
Evidence of sensitivity to improvement or to effects of intervention: critical for progress monitoring
Benchmarks of adequate progress, and goal-setting rates of improvement, are specified
Evidence of impact on teacher decision making, instruction, or student achievement: critical for formative evaluation
Evidence of improved instruction and student achievement: the gold standard for progress monitoring
Logistically feasible (low cost, efficient, accurate): critical for IMPLEMENTATION

18 ASSESSMENT TOOLS
Not All Assessment Tools Schools Use Meet Accepted Psychometric Standards. Members of the CBM “Family” Do.

19 SAME MEASURES FOR UNIVERSAL SCREENING & BENCHMARKING AND PROGRESS MONITORING
TIER I: Universal screening and benchmarking 3x per year, using early literacy measures such as DIBELS or AIMSweb CBM (key critical indicators)
TIER II: Strategic monitoring monthly (ROI)
TIER III: Progress monitoring every week or 2 (ROI); systematic problem solving; pinpointing the specific area of difficulty; diagnostic information

20 Integrated Assessment Systems
Aligning Assessment and Instruction. Not this: assessment and instruction disconnected. This is what we want: assessment and instruction aligned.

21 CBM-GOM are used as Universal Screeners- What is a Universal Screening?
Given to everyone. Measures critical skills. Brief. Repeatable. Cheap and easy to administer and score. Tells us who needs intervention and at which tier.

22 Aimsweb Literacy Measures:
Early Literacy: Letter Naming Fluency, Letter Sound Fluency, Phoneme Segmentation Fluency, Nonsense Word Fluency, Oral Reading Fluency

23 HEY,…WHY AREN’T WE CALLING THIS DIBELS?!
DIBELS (Dynamic Indicators of Basic Early Literacy Skills) are early literacy measures developed at the University of Oregon. AIMSweb also has a set of early literacy measures that are almost identical to the DIBELS measures; the differences are not significant. Most RtI ASPIRE sites are using the administration, scoring, and probes from AIMSweb.

24

25 1. Aimsweb Letter Naming Fluency
(Measures the number of letters a student can name in one minute.) Directions: “Here are some letters. Begin here (point to the first letter) and tell me the names of as many letters as you can. If you come to a letter you don’t know, I’ll tell it to you. Are there any questions? Put your finger under the first letter. Ready, begin.” In general, does the student have automaticity and fluency of naming? (Sample probe: rows of random upper- and lower-case letters, e.g., g N E Y R l V d H Z …)

26 2. Aimsweb Letter Sound Fluency
(Measures the number of letter sounds a student can name in one minute.) Directions: “Here are some letters. Begin here and tell me the sounds (with emphasis) of as many letters as you can. If you come to a sound you don’t know, I’ll tell it to you. Are there any questions? Put your finger under the first letter. Ready, begin.” In general, can the student efficiently convert the visual symbol into an auditory one, with automaticity? (Sample probe: the same rows of random letters used for Letter Naming Fluency.)

27 3. Aimsweb Phoneme Segmentation Fluency
(Measures the number of phonemes students can segment in 1 minute.) Directions: “I am going to say a word. After I say it, you tell me all the sounds in the word. So, if I say ‘sam,’ you would say /s/ /a/ /m/. Let’s try one. (one-second pause) Tell me the sounds in ‘mop.’ OK. Here is your first word.” In general, does the student understand that a word can be broken into its component phonemes/parts?

28 4. Aimsweb Nonsense Word Fluency
(Measures the number of phonemes students can read in 1 minute.) Here are some more make-believe words (point to the student probe). Start here (point to the first word) and go across the page (point across the page). When I say, “begin”, read the words the best you can. Point to each letter and tell me the sound or read the whole word. Read the words the best you can. Put your finger on the first word. Ready, begin. In general, does the student have automaticity with mapping/recalling the sound-letter relationships? Can they ‘CRACK THE CODE’?

29 5. Aimsweb CBM Oral Reading Fluency
(Measures student’s ability to read grade level passages accurately and fluently.) Please read this (point) out loud. If you get stuck, I will tell you the word so you can keep reading. When I say, “stop” I may ask you to tell me about what you read, so do your best reading. Start here (point to the first word of the passage). Begin. In general, has the student developed automatic phonemic awareness, phonics skills, and word recognition skills to be a fluent reader?

30 6. Aimsweb Maze (Measures student’s ability to read grade level passages accurately and fluently and comprehend.) When I say ‘Begin’, I want you to silently read a story. You will have 3 min. to read the story and complete the task. Listen carefully to the directions. Some of the words in the story are replaced with a group of 3 words. Your job is to circle the 1 word that makes the most sense in the story. Only 1 word is correct. In general, has the student developed automatic phonemic awareness, phonics skills, and word recognition skills to be a fluent and comprehending reader?

31 Why THESE Literacy Measures?
Torgesen says that “Measures of letter knowledge continue to be the best single predictor of reading difficulties.” Marilyn J. Adams, in her article, “The Elusive Phoneme”, says that “a child’s level of phonemic awareness on entering school is widely held to be the strongest single determinant of the success that he or she will experience in learning to read.” Research has shown that Oral Reading Fluency is the best reading General Outcome Measure (GOM).

32 BIG IDEAS IN READING (National Reading Panel)
PHONEMIC AWARENESS PHONICS FLUENCY VOCABULARY COMPREHENSION

33 BIG IDEAS IN EARLY LITERACY SKILLS
Phonemic Awareness: the awareness and understanding of the sound structure of our language; that ‘cat’ is composed of the sounds /k/ /a/ /t/. Alphabetic Principle: based on 2 parts: (1) Alphabetic Understanding: words are composed of letters that represent sounds, and (2) Phonological Recoding: using systematic relationships between letters and phonemes (letter-sound correspondence) to retrieve the pronunciation of an unknown printed string or to spell. Accuracy and Fluency with Connected Text: readers who are not fluent at decoding are not able to focus their additional resources on comprehension.

34 Big Ideas Drive the Train
Big ideas drive the curriculum and instruction. Big ideas drive the measures we use:
Phonemic Awareness: Phoneme Segmentation Fluency
Alphabetic Principle: Letter Sound Fluency, Nonsense Word Fluency
Accuracy and Fluency with Connected Text: CBM Oral Reading Fluency
Risk indicator that acquisition of crucial skills may be difficult: Letter Naming Fluency

35 IN GENERAL, ORAL READING FLUENCY MEASURES PROVIDE QUALITATIVE INFORMATION ABOUT 3 BROAD COMPETENCIES:
1. RATE: Words read correctly
- Above the 75th percentile: consider differentiating instruction
- Below the 25th percentile: consider need for Tier 2 interventions
- Below the 10th percentile: further assess, do problem analysis, and consider need for Tier 2 and/or 3 interventions
2. ACCURACY: Error rates
- 0-5% error rate = acceptable accuracy (skilled readers are 95% or better accurate)
- 5-10% error rate = accuracy in question (about 90% accuracy)
- >10% error rate = unacceptable accuracy (<90% accuracy)
3. COMPREHENSION
- Adequate fluency and rate correlate strongly with adequate comprehension
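The rate and accuracy rules above can be sketched as a small decision helper. This is an illustrative sketch only, not part of any Aimsweb or DIBELS software: the function name, the inputs (a percentile rank and an error rate expressed as a fraction), and the advice strings are assumptions.

```python
def orf_decision(percentile, error_rate):
    """Apply the slide's ORF rate and accuracy rules.

    Illustrative helper (not an Aimsweb/DIBELS API): `percentile` is the
    student's percentile rank for words read correctly; `error_rate` is
    errors divided by words attempted (e.g., 0.07 for 7%).
    """
    advice = []
    # RATE: percentile rank on words read correctly
    if percentile > 75:
        advice.append("consider differentiating instruction")
    elif percentile < 10:
        advice.append("further assess; problem analysis; consider Tier 2 and/or 3")
    elif percentile < 25:
        advice.append("consider need for Tier 2 interventions")
    # ACCURACY: skilled readers are 95% or better accurate
    if error_rate <= 0.05:
        advice.append("acceptable accuracy (95% or better)")
    elif error_rate <= 0.10:
        advice.append("accuracy in question (about 90%)")
    else:
        advice.append("unacceptable accuracy (below 90%)")
    return advice
```

For example, a student at the 8th percentile with a 12% error rate would flag both the most intensive rate rule and unacceptable accuracy.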

36 Students with Some (Limited) Reading Skills
Linking oral reading fluency with comprehension. These are in______and cHallinGinG times for anyone whose pRoFEshuNle res________ are ________in any way to liTiRucY outcomes among school children. For, in sport of all our new NaWLEGe about reading and reading iNstRukshun, there is a wide-speeded con______ that public EdgUkAshuN is not as eFfEktIve as it shood be in tEecHiNg all children to read.

37 These are interesting and challenging times for anyone whose professional responsibilities are related in any way to literacy outcomes among school children. For, in spite of all our new knowledge about reading and reading instruction, there is a widespread concern that public education is not as effective as it should be in teaching all children to read.

38 ORF Informs Qualitative Features of Good Reading
1. Is highly fluent (rate and accuracy)?
2. Uses effective strategies to decode words? (effective word attack; context of word(s); syntax (word order); semantics (word meaning))
3. Adjusts pacing (i.e., slows down and speeds up according to level of text difficulty)?

39 ORF Informs Qualitative Features of Good Reading
4. Attends to prosodic features? inflection (pause, voice goes up and down) reads with expression punctuation (commas, exclamation points, etc.) predicts level of expression according to syntax 5. Possesses prediction-orientation? seems to look ahead when reading reads at a sentence or paragraph level

40 ORF Informs Qualitative Features of Good Reading
6. Self-monitors what she/he is reading? self-corrects if she/he makes meaning-distortion errors 7. Makes only meaning-preservation errors? more errors that preserve meaning (e.g., “house” for “home”); fewer meaning-distortion errors (e.g., “mouse” for “house”) 8. Automaticity on reread words? words that appear throughout the text are read automatically (e.g., become “sight words”)

41 Qualitative Features Worth Noting Source: AIMSweb/Mark Shinn

42 What Does R-CBM Measure?
ALL These Skills = General Reading Skill

43 General Outcome Measures (GOMs) From Other Fields
Medicine measures height, weight, temperature, and/or blood pressure as the best indicators of general health. The Federal Reserve Board measures the Consumer Price Index. Wall Street measures the Dow-Jones Industrial Average. Companies report earnings per share. McDonald’s measures how many hamburgers they sell. Reading measures Oral Reading Fluency as the best indicator of general reading health/achievement. These are reliable measures that give us good feedback and a good baseline/starting point, and we can screen people with them very quickly. Every doctor does not use every measure. For many students, we can determine in an easy, efficient manner that they are doing well; they don’t all need extensive assessment. Don’t worry about the wrong things. CBM: assess everyone using quick measures, and constantly monitor the efficacy of instructional programs on an ongoing basis. Simple, quick, reliable measures give a good indicator of how things are progressing over time.

44 Things to Always Remember About CBM- GOM
Are sensitive to improvement over brief intervals of time. Also tell us how students earned their scores (qualitative information). Designed to be as short as possible to ensure their “do-ability.” Are linked to decision making for promoting positive achievement and problem solving.

45 Once Screening Data Are Collected, You Begin to Make Informed Decisions…
Data-Based Decisions! District or School Level Decisions Classroom or Group Decisions Individual Student Decisions

46 ALWAYS THINK ABOUT STUDENT NEEDS IN THIS FRAMEWORK:
TIER I: District needs, school needs, grade-level needs, class needs
TIER II: Small-group needs
TIER III: Individual needs

47 Old System of Problem Solving
(Diagram: a triangle spanning General Education, General Education with Support, and Special Education, with axes “Amount of Resources Needed to Benefit” and “Severity of Educational Need or Problem.”) The FLEX or problem-solving model looked like a TAT: we created a team to address these students, with special education staff typically leading the data collection and intervention implementation. Students identified for individual problem solving were typically teacher- or parent-referred.

48 Why hasn’t this old system of problem solving been very effective?
Because we’ve been trying to solve students’ problems one student at a time. This has been impractical and too time intensive to be effective.

49 Bridging the Gap
(Diagram: tiers Core, Core + Supplemental, and Core + Intensive, monitored 3x/year, weekly-monthly, and weekly, respectively; axes are “Amount of Resources Needed to Benefit” and “Severity of Educational Need or Problem.”) System-wide problem-solving model: what is the difference?

50 Bridging the Gap
(Diagram: tiers Core, Core + Supplemental, and Core + Intensive, monitored 3x/year, weekly-monthly, and weekly, respectively; axes are “Amount of Resources Needed to Benefit” and “Severity of Educational Need or Problem.”) Do you have this in place? What components do you have in place now?

53 Data-Based Decision Making Steps
Problem Identification What is the Problem and Is it Significant? Plan Evaluation Did our plan work? Problem Analysis Why is it happening? Plan Development What shall we do about it?

54 Data Based Decision Making
Universal: 80%-90%. Targeted/Supplemental: 7%-15%. Intensive: 3%-5%. (Applies to both ACADEMICS and BEHAVIOR.)

55 Data Based Decision Making
Universal: 80%. Targeted/Supplemental: 15%. Intensive: 5%. We want these percentages. Tier 1: at or above the 50th percentile on Aimsweb norms. Tier 3: at or below the 25th percentile on Aimsweb norms. Tier 2: everyone in between.
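The percentile cutoffs above amount to a one-look tier rule. A minimal sketch, assuming a national-norm percentile rank as input; the function name is made up, and the Tier 3 cutoff of “25th percentile or lower” is taken exactly as stated on the slide.

```python
def tier_placement(percentile):
    """Assign an RtI tier from an Aimsweb-norm percentile rank,
    using the cutoffs stated on this slide (illustrative helper)."""
    if percentile >= 50:
        return 1  # Tier 1: 50th percentile or better
    if percentile <= 25:
        return 3  # Tier 3: 25th percentile or lower
    return 2      # Tier 2: everyone in between
```

Running every screened student through a rule like this is what turns benchmark data into the group-level decisions the next slides describe.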

56 School-Wide Reading Improvement in a School Using Problem-Solving
Courtesy of Christine Martin, Indian Prairie School District, IL

57 Tier 2: Strategic Monitoring of At Risk

58 Data Review Intervention Group 1
Median GOAL ROI = 1.3 Median TREND ROI = 4.71 Intervention Effective?
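A trend ROI like the one in this data review is typically a slope fitted through the student’s repeated probe scores, compared against the goal ROI from the aimline. The deck does not specify the computation, so this is a sketch using an ordinary least-squares slope over hypothetical weekly ORF scores (chosen so the slope lands near the 4.71 on the slide).

```python
def trend_roi(scores):
    """Least-squares slope of weekly probe scores: words correct gained
    per week. One common way to compute a trend ROI; illustrative only."""
    n = len(scores)
    mean_w = (n - 1) / 2                      # mean of weeks 0..n-1
    mean_s = sum(scores) / n
    num = sum((w - mean_w) * (s - mean_s) for w, s in enumerate(scores))
    den = sum((w - mean_w) ** 2 for w in range(n))
    return num / den

goal_roi = 1.3                                # from the aimline
scores = [20, 25, 29, 35, 38, 44]             # hypothetical weekly ORF scores
print(round(trend_roi(scores), 2))            # 4.71
print(trend_roi(scores) >= goal_roi)          # True
```

When the trend ROI meets or exceeds the goal ROI, as here (4.71 vs. 1.3), the data support the conclusion that the intervention is effective.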

59 Student Progress Monitoring: Is the student making progress
from the intervention?

60 Educational Need is Measured by a PERFORMANCE DISCREPANCY
(Graph: student performance close to the expected level.) No significant discrepancy means no significant educational need.

61 Significant Performance Discrepancy
A significant performance discrepancy indicates educational need; a more severe significant discrepancy indicates more severe educational need.

62 Some Potential Educational Need, Significant Educational Benefit: Maintain the General Education Program (Tier 2) IS THIS STUDENT REDUCING THE DISCREPANCY BETWEEN HIMSELF AND GRADE LEVEL PEERS? Rate of Improvement that is REDUCING the Gap
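The question on this slide, whether the student’s rate of improvement is reducing the gap, can be checked directly from benchmark data. An illustrative sketch (the function name and the fall/winter/spring numbers are made up): compare the discrepancy between peer and student scores at the first and last benchmark.

```python
def gap_is_closing(student, peers):
    """True if the performance discrepancy (peer score minus student
    score) shrank between the first and last benchmark. Illustrative
    helper; assumes both lists come from the same probe schedule."""
    start_gap = peers[0] - student[0]
    end_gap = peers[-1] - student[-1]
    return end_gap < start_gap

# Hypothetical fall/winter/spring benchmark scores:
student = [20, 34, 49]
peers = [50, 58, 67]
print(gap_is_closing(student, peers))   # True: gap shrank from 30 to 18
```

A closing gap, as in this example, is the pattern that supports maintaining the Tier 2 general education program rather than intensifying.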

63 BIG IDEA Use assessment data to determine
student need and link that to research-based interventions that match the need. DATA informs NEED informs INTERVENTION

