1
Intro to AIMSweb Progress Monitoring
Shirley Jirik, Ed.D. SVVSD RtI Coordinator June 10, 2014
2
Purpose: To introduce participants to AIMSweb progress monitoring and how these data can be used to drive the RtI/MTSS process
3
Learning Targets
Use AIMSweb software to record progress monitoring data
Administer and score AIMSweb probes
Use these data to drive the RtI process
Locate resources for future reference
4
Essential Question: How can formative assessment (AIMSweb) help guide a teacher's instructional decisions?
5
Working Agreements
Limit side conversations and technology distractions
Focus on the topic
Hand signal to regain your attention
6
Seven Norms of Collaborative Work
Pausing
Paraphrasing
Posing questions
Putting ideas on the table
Providing data
Paying attention to self and others
Presuming positive intentions
7
Agenda
Introductions
Formative/Summative Assessment
Overall Introduction to AIMSweb
Administration & Scoring of M-CAP & M-COMP
Progress Monitoring Set-up
Administration & Scoring of MAZE & R-CBM
8
Introductions
Name
Site and position
Your past experience with AIMSweb
What you hope to learn from this class
9
Evaluations to Inform Teaching— Summative & Formative Assessment
Summative Assessment: A culminating measure; assessment of mastery after instruction. Pass/fail-type assessments that summarize the knowledge students have learned. Typical summative assessments include: end-of-chapter tests, high-stakes tests (e.g., state assessments), the GRE, ACT, SAT, and GMAT, driver's license tests, and final exams.
Formative Assessment: The process of assessing student achievement frequently during instruction to determine whether an instructional program is effective for individual students. It informs instruction: when students are progressing, continue using your instructional programs; when tests show that students are not progressing, you can change your instructional programs in meaningful ways.
10
Assessment Sort: What type of assessment is it?
In small groups, use the list of assessments to sort summative assessments from formative assessments. A note catcher is provided in the handout.
11
Summative & Formative Assessment
Summative Assessment: Characterized as assessment of learning. Summative assessment tells you what happened.
Formative Assessment: Characterized as assessment for learning. Formative assessment tells you what's happening.
12
Evaluations to Inform Teaching— Diagnostic Assessment
Diagnostic Assessments: Measures that indicate specific skill strengths and areas needing improvement. Results may indicate skill areas needing intervention/instruction; programming may then address students' needs.
Examples: criterion-referenced assessments, cognitive assessments, rating scales, and norm-referenced, standardized assessments.
Tests may be based on the assessment of cognitive skills, academic skills, behavior, health, social-emotional wellbeing, etc.
13
Common Characteristics of General Outcome Measures
Powerful measures that are:
Simple and accurate
Efficient indicators of performance that guide and inform a variety of decisions
A generalizable thermometer that allows for reliable, valid cross-comparisons of data
14
General Outcome Measures (GOMs) from Other Fields
Medicine measures height, weight, temperature, and/or blood pressure. Department of Labor measures the Consumer Price Index. Wall Street measures the Dow-Jones Industrial Average. Companies report earnings per share. McDonald’s® measures how many hamburgers they sell.
15
CBM is Used for Scientific Reasons Based on Evidence
Reliable and valid indicator of student achievement
Simple, efficient, and of short duration to facilitate frequent administration by teachers
Provides assessment information that helps teachers plan better instruction
Sensitive to improvement in students' achievement over time
Easily understood by teachers and parents
Improves achievement when used to monitor progress
16
Things to Always Remember About CBM
CBMs are designed to serve as "indicators" of general reading achievement: CBM probes don't measure everything, but they measure the important things.
They are standardized tests, to be given, scored, and interpreted in a standard way.
They are researched with respect to psychometric properties to ensure accurate measurement of learning.
17
Items to Remember (continued)
They are sensitive to improvement over brief intervals of time.
They can also tell us how students earned their scores (an opportunity to gather qualitative information).
They are designed to be as short as possible (2-4 minutes) to ensure "do-ability."
They are linked to decision making for promoting positive achievement and problem-solving.
18
Professional Reading: Form a trio. All read the Introduction.
Person A reads "A CBM Skeptic"; Person B reads "Becoming a CBM Advocate"; Person C reads "An Example" (stop at p. 3).
As you read the article, mark the text for purposes such as:
* Affirms prior knowledge
! Surprises you
? You wish to know more about this
When you are all ready, teach your section to your partners.
19
Formative Assessment
Effect size for using formative assessment = 0.90; higher when data were graphed (Hattie, 2009).
20
Three-Tiered Assessment Model
Tier 3 (PROGRESS MONITOR): Intensive monitoring toward specific goals for students at significant risk for failure.
Tier 2 (STRATEGIC MONITOR): Monthly monitoring for students at mild to moderate risk for failure.
Tier 1 (BENCHMARK): Universal screening for all students.
21
Measures Currently Available via AIMSweb®:
Early Literacy (K-1 benchmark; Progress Monitor (PM) any age): Letter Naming Fluency, Letter Sound Fluency, Phoneme Segmentation Fluency, Nonsense Word Fluency
Early Numeracy (K-1 benchmark; PM any age): Oral Counting, Number Identification, Quantity Discrimination, Missing Number
Oral Reading (K-8; PM any age)
MAZE (reading comprehension) (1-8; PM any age)
Math Computation (1-6; PM any age)
Math Facts (PM any age)
Spelling (1-8; PM any age)
Written Expression (1-8; PM any age)
All students in an academic curriculum are "benchmarked" three times per year across any/all of these assessment areas.
22
Three-Tiered Assessment Model
Tier 3 (PROGRESS MONITOR): Intensive monitoring toward specific goals for students at significant risk for failure.
Tier 2 (STRATEGIC MONITOR): Monthly monitoring for students at mild to moderate risk for failure.
Tier 1 (BENCHMARK): Universal screening for all students.
23
BIG IDEA: Curriculum-Based Measurement is different from regular classroom assessment. There are standardized directions and scoring guidelines/rules.
24
Training Accounts: Navigating the System
26
Box & Whiskers Graphs (box plots): A 3-Step Explanation
AIMSweb commonly uses box plots to report data. An AIMSweb box plot is somewhat similar in shape and representation to a vertical bell curve. Reading the plot from top to bottom, all in relation to a user-defined comparison group:
Above the 90th percentile: above average range
90th percentile: top whisker
75th percentile: top of the box
Median (50th percentile)
25th percentile: bottom of the box
10th percentile: bottom whisker
Below the 10th percentile: below average range
The box spans the average range of the population included in the sample (the middle 50%). A target line and an individual student's score can be overlaid on the plot.
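To make the banding concrete, here is a minimal Python sketch (not AIMSweb's code; the function name and all cutoff values are invented for illustration) that places a score into the bands described above:

```python
# Illustrative sketch only (not AIMSweb code): place a score into the
# box-plot bands described above, relative to percentile cutoffs from a
# user-defined comparison group. All cutoff values here are invented.

def band(score, p10, p25, p75, p90):
    if score > p90:
        return "above average range (above 90th percentile)"
    if score < p10:
        return "below average range (below 10th percentile)"
    if p25 <= score <= p75:
        return "average range (middle 50% of the comparison group)"
    return "between the average range and the 10th/90th percentile cutoffs"

# Invented Grade 3 winter R-CBM cutoffs, in words read correctly per minute:
print(band(82, p10=40, p25=62, p75=107, p90=128))  # -> average range
```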
29
AIMSweb National Aggregate Norm Table
30
AIMSweb Aggregate Norm Table
32
AIMSweb Administering & Scoring Math Computation (M-COMP)
Preparing trainees to understand the use of general indicators (including CBM) and how they apply within a tiered service delivery model.
33
Purpose of M-COMP: Mathematics Computation (M–COMP) is a series of assessments that yield general math computation performance and rate-of-progress information. You can use the initial M–COMP benchmark probe as a screening tool to make Response to Instruction/Intervention (RtI) decisions, and to compare the results to normative- or standards-based data.
Test Name: M-COMP
Area Assessed: General math computation skills
Test Arrangements: Individual, small group, or large group
What is Scored: Number of correct items
34
NCTM Curriculum Focal Points by Grade
35
Domains Evaluated by M-COMP by Grade
36
An Overview of M-COMP Test Development
Facts about M-COMP:
Developed by professional test developers and math experts.
Items are based on grade-level and domain-specific criteria.
An anchor probe was developed for each grade level (1–8).
Item placement on each anchor probe was based on increasing item difficulty within a domain: easier items were generally placed at the beginning of each probe, with more difficult items following. Additionally, skills are presented in a similar order, frequency, and sequence across each probe and grade level.
37
Sample Grade 2 M-COMP
38
Sample Grade 5 M-COMP
39
M-COMP Administration is flexible and can include:
Small group administration
Individual administration
Whole class administration
40
Getting Started: Planning and Preparation
41
Paper/Pencil Administration: Prepare the AIMSweb® M-COMP Test
Before Testing
Student copies: 2 pages (or a duplex copy), one copy per student
Examiner copy: scoring key
42
Additional tools needed
Before Testing
Additional tools needed:
List of students to be tested (optional)
Printed directions
Digital stopwatch (or timer) preferred
(Presenter note: the little reading timers that beep work even better than a stopwatch.)
43
Students: Prohibited Items
Before Testing
Items prohibited for students:
Calculators
Rulers, protractors, etc.
Math-related posters in the room
Phones and texting (risk of calculating or cheating)
44
Obtain the Necessary Test Materials
During Universal Screening you will use a different probe for the fall, winter, and spring sessions.
MEASURE    FALL        WINTER      SPRING
M-CAP      Probe #1    Probe #2    Probe #3
M-COMP     (available on AIMSweb)
You do NOT need to give 3 probes and take the median for any of the M-COMP or M-CAP measures; each is one probe per Universal Screening window.
45
Setting up the Assessment Environment
Give a backup probe in case an assessment is "spoiled." Examples: for Universal Screening, use an alternate progress monitoring probe; for progress monitoring, use an alternate progress monitoring probe. Ensure the child is alert and well for testing.
46
Before Testing
The M-CAP and M-COMP measures are both standardized tests. You must:
Administer M-CAP and M-COMP the same way, each time.
Adhere to the exact standardized directions at ALL times.
Remember, for 8 minutes it's about testing, not teaching: do not teach or correct the student.
Avoid practice effects: do not allow students to pre-read the probes, use the probes for practice, or use the probes for review after testing.
Prepare to proctor the assessment carefully.
No rulers, guides, or other manipulatives are to be used during universal screening.
47
Before Testing Additional standard procedures:
To make valid normative (national, state, district, or school) decisions, all directions must be followed carefully. Actions that may invalidate decisions and conclusions made about student performance include, but are not limited to:
Changes in the presentation of the probes (e.g., order, size, etc.)
Alterations of the instructions given to students
Inappropriate use of probes as teaching instruments
Use of any accommodations, modifications, or supplementary aids (e.g., rulers, calculators, etc.)
Before administering a probe, the examiner should become familiar with these administration directions.
48
Directions: Give teachers the assessment now (3rd and 7th).
49
Directions (Continued)
50
Directions: (During Testing)
51
Time to Test!
52
M-COMP Scoring: Rules & Examples
53
After Testing: Scoring
Collect tests immediately.
Scoring: All problems receive either full credit or no credit. If any part of an answer is incorrect, whether it is a one-part or multi-part question, the score for that item is 0.
Examiners circle the score value (i.e., 1, 2, or 3) for a correct answer and circle "0" for an incorrect answer. The total score is then summed at the bottom of the page.
54
Scoring: Awarding Points
Least difficult items: 1 point
Medium difficulty items: 2 points
Most difficult items: 3 points
Point values are pre-assigned to avoid scorer subjectivity.
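As a concrete illustration of this all-or-nothing arithmetic, here is a minimal Python sketch (the items and responses are invented; this is not the AIMSweb software):

```python
# Invented example of M-COMP all-or-nothing scoring: each item carries a
# pre-assigned value (1, 2, or 3 points) and earns either full or no credit.

items = [
    {"value": 1, "correct": True},   # least difficult item, fully correct
    {"value": 2, "correct": False},  # any error on any part -> 0 points
    {"value": 3, "correct": True},   # most difficult item, fully correct
]

total = sum(item["value"] for item in items if item["correct"])
print(f"M-COMP total score: {total}")  # -> 4
```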
55
Scoring: Variant Answers
Variant Answers: An examiner may determine whether an answer that varies slightly from the one on the scoring sheet deserves credit. This should be based on best practices and professional judgment.
Primary goal: Determine whether the answer reflects an understanding of the task presented (e.g., a money-task response indicates whether the student knows how to properly express monetary amounts). Credit may be given for a clearly correct response conveyed in a manner other than the one indicated.
56
Scoring: Variant Answers
Variant Answers (continued): Answer keys are not exhaustive. Give credit if the answer is numerically equivalent to the listed answer, as long as it meets the item requirements.
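One way to picture the numeric-equivalence check is with Python's fractions module. This is only an illustration with invented sample answers; item requirements (e.g., "write the answer in lowest terms") can still disqualify an equivalent form:

```python
# Sketch: is a student's variant answer numerically equivalent to the keyed
# answer? Sample values are invented; item requirements may still apply.
from fractions import Fraction

def equivalent(student_answer, keyed_answer):
    return Fraction(student_answer) == Fraction(keyed_answer)

print(equivalent("2/4", "1/2"))  # True
print(equivalent("0.5", "1/2"))  # True
print(equivalent("3/4", "1/2"))  # False
```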
57
Scoring: Final Answer is What Counts
When the target answer is a fraction, some students may choose to reduce/simplify the answer, even when no instruction to do so has been given. If reduced/simplified properly and calculated correctly, give the student credit for the answer. If the student initially provides the correct response and then makes an error when reducing/simplifying, the final answer presented is what you score.
58
Scoring: Final Answer is What Counts
When directions are open to interpretation (e.g., the student is instructed, "Write the answer in the lowest terms," the target answer is a mixed number, but the student provides an improper fraction), you may score a properly simplified improper fraction as correct.
59
Scoring: No Partial Scoring
All problems receive either full credit or no credit. (Scored example from the probe: 394 = no credit; 395 = full credit.)
60
Let's Practice! M-COMP Practice Exercises
Use the handout to score the practice exercises.
61
Progress Monitoring: Strategies for Writing Individual Goals in General Curriculum and More Frequent Formative Evaluation
Mark Shinn, Ph.D.; Lisa A. Langell, M.A., S.Psy.S.; V. Scott Hooper, Ph.D.; Scott E. Linner, Ed.S., NCSP
Summer 2009
62
Progress Monitoring is:
Research-Based Best Practices: Systematic formative evaluation that requires the use of standard assessment tools that are of the same difficulty and given the same way each time. (AIMSweb® offers these features.)
*All data and identifying information presented are fictitious. Summer 2009
63
Lunch Break
64
Connect Two: In your handout, connect any concepts or ideas that go together. Then share your connections with a partner across the room.
65
Adding a Student to Progress Monitoring
Page 29 in the SVVSD AIMSweb Training Manual for Progress Monitoring
66
Adding data to Progress Monitoring Schedules
Starts on page 35 in the SVVSD AIMSweb Training Manual for Progress Monitoring.
You need your norm charts first. Where can you find them? You need the chart for the measure you administered.
67
Break Time
68
Logistics: Entering New Students
Use the Google Doc for transfers. If a student is brand new to the district, you add them. You MUST have the student ID number (unique identifier); this avoids duplicates.
69
Administration and Scoring of READING CURRICULUM-BASED MEASUREMENT (R-CBM) for Use in General Outcome Measurement
Mark R. Shinn, Ph.D.; Michelle M. Shinn, Ph.D.
70
Administration & Scoring of R-CBM
Videos: What examiners need to do…
Before testing students
While testing students
After testing students
71
Things you Need Before Testing
1. Standard Reading Assessment Passage, Student Copy:
No numbers between words (exception: 1st grade)
An informative first sentence
Same font style and size
Text without pictures
Obtain from your LAM.
72
Things you Need Before Testing
2. Standard Reading Assessment Passage Examiner Copy: Pre-numbered so they can be scored quickly and immediately. Obtain from your LAM.
73
3. Tier 1 (Benchmark) R-CBM Probes:
AIMSweb Manager provides staff with copies of three grade-level probes (teacher and student copies).
FALL (Sept.): Staff administer three grade-level probes to each student. Report the median score.
WINTER (Jan.): Repeat administration of the same three probes to each student. Report the median score.
SPRING (May): Repeat administration of the same three probes to each student. Report the median score.
Example (Grade 3, each season): P01 "It rained all day." P02 "Billy was sitting." P03 "Mama frog carried."
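The median reporting step is simple arithmetic; a minimal sketch with invented probe scores:

```python
# Minimal sketch of benchmark reporting: give three grade-level probes
# and report the median score. Probe labels and scores are invented.
from statistics import median

fall_wrc = {"P01": 92, "P02": 85, "P03": 101}  # WRC on each probe
print(f"Reported fall benchmark: {median(fall_wrc.values())} WRC")  # -> 92
```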
75
Additional Assessment Aids Needed Before Testing
A list of students to be assessed
Stopwatch (required; digital preferred)
Clipboard
Pencil
Transparencies or paper copies of examiner passages
Dry marker or pencil
Wipe cloth (for transparencies only)
76
Setting up Assessment Environment
Assessment environments are flexible and could include:
A set-aside place in the classroom
A reading station in the hallway
Reading stations in the media center, cafeteria, gym, or empty classrooms
77
Things You Need to do While Testing
Follow the standardized directions: R-CBM is a standardized test; administer the assessment with consistency.
Remember it's about testing, not teaching: don't teach or correct, and don't practice reading the passages.
Remember: best, not fastest, reading.
Sit across from, not beside, the student.
78
R-CBM Standard Directions for 1 Minute Administration
Place the unnumbered copy in front of the student. Place the numbered copy in front of you, but shielded so the student cannot see what you record.
Say: "When I say 'Begin,' start reading aloud at the top of this page. Read across the page (DEMONSTRATE BY POINTING). Try to read each word. If you come to a word you don't know, I will tell it to you. Be sure to do your best reading. Are there any questions?" (PAUSE)
Say "Begin" and start your stopwatch when the student says the first word. If the student fails to say the first word of the passage after 3 seconds, tell them the word, mark it as incorrect, then start your stopwatch.
Follow along on your copy. Put a slash (/) through words read incorrectly.
At the end of 1 minute, place a bracket (]) after the last word and say, "Stop."
Score and summarize by writing WRC/Errors.
79
Items to Remember Emphasize Words Read Correctly (WRC). Get an accurate count. 3-Second Rule. No Other Corrections. Discontinue Rule. Be Polite. Best, not fastest. Interruptions.
80
Accuracy of Implementation (AIRS)
81
Things to do After Testing
Score immediately! Determine WRC. Put a slash (/) through incorrect words. If doing multiple samples, organize your impressions of qualitative features.
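The WRC computation itself is simple arithmetic; here is a minimal sketch with invented numbers:

```python
# Minimal scoring sketch with invented numbers: WRC is the number of
# words attempted in 1 minute minus the errors slashed on the copy.

words_attempted = 104  # last word before the closing bracket ( ] )
errors = 6             # slashed words: mispronunciations, omissions, etc.
wrc = words_attempted - errors
print(f"Summary: {wrc}/{errors} (WRC/Errors)")  # recorded as 98/6
```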
82
What is a Word Read Correctly?
Correctly pronounced words within context. Self-corrected incorrect words within 3 seconds.
83
What is an Error?
Mispronunciation of the word
Substitutions
Omissions
3-second pauses or struggles (examiner provides the correct word)
84
What is not Incorrect? (Neither a WRC nor an Error)
Repetitions Dialect differences Insertions (consider them qualitative errors)
85
Let’s Practice! Practice Exercises in your Handout Videos
86
Administration and Scoring of READING-MAZE (R-MAZE) for Use in General Outcome Measurement
PowerPoint authored by Jillyan Kennedy. Based on Administration and Scoring of Reading R-MAZE for Use with AIMSweb Training Workbook, by Michelle M. Shinn, Ph.D., and Mark R. Shinn, Ph.D.
87
Making Data-Based Decisions With Progress Monitor
You typically need at least 7-10 data points (Shinn & Good, 1989) before making a programming decision, and you may need to collect more if uncertain. Christ & Silberglitt (2007) recommended 6-9 data points. As the number of data points increases, the effect of measurement error on the trend line decreases.
88
What further research says about growth rates
89
Adding data to Progress Monitoring Schedules
Starts on page 35 in the SVVSD AIMSweb Training Manual for Progress Monitoring Remember to check your norm charts first
90
What do I do if… a student is reading well below grade level and can't access R-CBM or M-COMP? Other assessments are available that can help you determine the student's needs:
Tests of Early Literacy (TEL): Letter Naming, Letter Sound, Phoneme Segmentation, and Nonsense Word Fluency
Tests of Early Numeracy (TEN): Oral Counting, Number Identification, Quantity Discrimination, Missing Number
Video
91
Curriculum Based Measurement Reading R-MAZE
CBM R-MAZE is designed to provide educators a more complete picture of students' reading skills, especially when comprehension problems are suspected.
Test: CBM R-MAZE
Area: Reading
Timing: 3 minutes
Test Arrangements: Individual, small group, or classroom group
What is Scored: Number of correct answers
92
Curriculum Based Measurement Reading R-MAZE (Continued)
R-MAZE is a multiple-choice cloze task that students complete while reading silently. Students are presented with word passages. The first sentence is left intact; after the first sentence, every 7th word is replaced with three word choices in parentheses. The three choices consist of: the exact match, a near distracter, and a far distracter. (A rough construction sketch follows.)
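To make the construction rule concrete, here is a rough Python sketch. It is illustrative only: the function names are invented, and the distracter functions are crude placeholders, whereas real probes select distracters far more carefully.

```python
# Sketch of the R-MAZE construction rule described above: leave the first
# sentence intact, then turn every 7th word into a three-choice item
# (exact match plus a near and a far distracter).
import random

def build_maze(first_sentence, rest_words, near, far):
    items = []
    for i, word in enumerate(rest_words, start=1):
        if i % 7 == 0:  # every 7th word becomes a three-choice item
            choices = [word, near(word), far(word)]
            random.shuffle(choices)
            items.append("(" + ", ".join(choices) + ")")
        else:
            items.append(word)
    return first_sentence + " " + " ".join(items)

words = "cat ran fast up the big hill and hid under the old porch stairs".split()
print(build_maze("The dog chased the cat.", words,
                 near=lambda w: w + "s",    # placeholder near distracter
                 far=lambda w: "banana"))   # placeholder far distracter
```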
93
Sample Grade 4 R-MAZE Passage
94
Examples of R-MAZE: R-MAZE Workbook, page 9
95
Observation Questions
What did you observe about Emma’s and Abby’s R-MAZE performances? What other conclusions can you draw?
96
Items Students Need Before Testing
What the students need for testing:
CBM R-MAZE practice test
Appropriate CBM R-MAZE passages
Pencils
97
Items Administrators Need Before Testing
What the tester uses for testing:
Stopwatch
Appropriate CBM R-MAZE answer key
Appropriate standardized directions
List of students to be tested
98
Things You Need to do While Testing
Follow the standardized directions:
Attach a cover sheet that includes the practice test so that students do not begin the test right away.
Do a simple practice test with younger students.
Monitor to ensure students are circling answers instead of writing them.
Be prepared to "prorate" for students who may finish early (one common formula is sketched below).
Try to avoid answering student questions.
Adhere to the end of timing.
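The slide does not spell out the prorating formula. A common rate-based approach, sketched below under that assumption, scales the raw score up to the full 3-minute limit; confirm the official rule in your AIMSweb manual before using it operationally.

```python
# Sketch of a common rate-based prorating formula (an assumption here,
# not quoted from the manual): scale the raw score to the 180-second limit.

def prorate(correct, seconds_elapsed, time_limit=180):
    return round(correct * time_limit / seconds_elapsed)

print(prorate(correct=24, seconds_elapsed=150))  # -> 29
```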
99
CBM R-MAZE Standard Directions
Pass R-MAZE tasks out to students. Have students write their names on the cover sheet, so they do not start early. Make sure they do not turn the page until you tell them to.
Say this to the student(s): "When I say 'Begin,' I want you to silently read a story. You will have 3 minutes to read the story and complete the task. Listen carefully to the directions. Some of the words in the story are replaced with a group of 3 words. Your job is to circle the 1 word that makes the most sense in the story. Only 1 word is correct."
Decide if a practice test is needed. Say: "Let's practice one together. Look at your first page. Read the first sentence silently while I read it out loud: 'The dog, apple, broke, ran after the cat.' The three choices are apple, broke, ran. 'The dog apple after the cat.' That sentence does not make sense. 'The dog broke after the cat.' That sentence does not make sense. 'The dog ran after the cat.' That sentence does make sense, so circle the word ran." (Make sure the students circle the word ran.)
100
CBM R-MAZE Standard Directions (Continued)
"Let's go to the next sentence. Read it silently while I read it out loud: 'The cat ran fast, green, for up the hill.' The three choices are fast, green, for. Which word is the correct word for the sentence?" (The students answer fast.) "Yes, 'The cat ran fast up the hill' is correct, so circle the word fast." (Make sure students circle fast.) "Silently read the next sentence and raise your hand when you think you know the answer." (Make sure students know the correct word. Read the sentence with the correct answer.) "That's right. 'The dog barked at the cat' is correct. Now what do you do when you choose the correct word?" (Students answer "Circle it." Make sure the students understand the task.) "That's correct, you circle it. I think you're ready to work on a story on your own."
101
CBM R-MAZE Standard Directions (Continued)
Start the testing by saying: "When I say 'Begin,' turn to the first story and start reading silently. When you come to a group of three words, circle the 1 word that makes the most sense. Work as quickly as you can without making mistakes. If you finish the page or first side, turn the page and keep working until I say 'Stop' or you are all done. Do you have any questions?" Then say "Begin" and start your stopwatch.
Monitor students to make sure they understand that they are to circle only 1 word. If a student finishes before the time limit, collect the student's R-MAZE task and record the time on the student's test booklet.
At the end of 3 minutes say: "Stop. Put your pencils down. Please close your booklet." Collect the R-MAZE tasks.
102
CBM R-MAZE Familiar Directions
After the students have put their names on the cover sheet, start the testing by saying: "When I say 'Begin,' turn to the first story and start reading silently. When you come to a group of three words, circle the 1 word that makes the most sense. Work as quickly as you can without making mistakes. If you finish the page or first side, turn the page and keep working until I say 'Stop' or you are all done. Do you have any questions?" Then say "Begin" and start your stopwatch.
Monitor students to make sure they understand that they are to circle only 1 word. If a student finishes before the time limit, collect the student's R-MAZE task and record the time on the student's test booklet.
At the end of 3 minutes say: "Stop. Put your pencils down. Please close your booklet." Collect the R-MAZE tasks.
103
Things to Do After Testing
Score immediately to ensure accurate results! Determine the number of words (items) correct. Use the answer key and put a slash (/) through incorrect words.
104
CBM R-MAZE Scoring
What is correct? The student circles the word that matches the correct word on the scoring template.
What is incorrect? An answer is considered an error if the student:
Circles an incorrect word
Omits word selections, other than those the student was unable to complete before the 3 minutes expired
105
How much data should be collected?
Making Data-Based Decisions With Progress Monitor: You typically need at least 7-10 data points (Shinn & Good, 1989) before making a programming decision, and you may need to collect more if uncertain. Christ & Silberglitt (2007) recommended 6-9 data points. As the number of data points increases, the effect of measurement error on the trend line decreases.
106
Four Criteria To Consider:
How much data should be collected? Four criteria to consider.
Criterion #1: The trend line meets (or is on target to meet) the AIM line for the ultimate goal. Success! Once the goal is met, consider transitioning to a less intensive program or a new goal as needed.
107
How much data should be collected?
Criterion #2: The trend line and AIM line will intersect in the relatively near future. Keep the current intervention until the goal is reached.
108
How much data should be collected?
Criterion #3a: The trend line exceeds the AIM line. (a) Consider increasing the goal or difficulty level. Example: a Grade 5 student reading Grade 4 passages; the goal was changed from 104 WRC/min to 125 WRC/min. NOTE: When changing a goal to require a different grade level of material, start a new schedule. Do not use the same schedule, as the data are not comparable (i.e., 50 WRC/min on a 5th-grade passage means something different than 50 WRC/min on a 3rd-grade passage).
109
Criterion #3b: The trend line exceeds the AIM line. (b) Or, retain the current intervention and close the gap even faster, if this goal is the final performance level the student is to reach while being progress monitored. Example: a Grade K student on Grade K PSF probes may reach the goal in mid-March, rather than the end of May, if progress continues at the same rate of improvement.
110
Criterion #4: The trend line will not likely intersect the AIM line, and/or moves in the opposite direction of the AIM line. Consider adding an additional intervention, changing a variable, and/or intensifying program changes. (In the example graph, note that four data points are already below the AIM line.)
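One way to picture applying these four criteria is the Python sketch below. It is illustrative only, not AIMSweb's implementation: the function, the (slope, current value) modeling of each line, and all numbers are invented.

```python
# Illustrative sketch of the four decision criteria above. The trend line
# and AIM line are each modeled as (slope per week, current value).

def decision(trend, aim, weeks_left, goal):
    t_slope, t_now = trend
    a_slope, a_now = aim
    projected = t_now + t_slope * weeks_left
    if t_now >= goal:
        return "Criterion 1: goal met; consider a less intensive program or a new goal"
    if t_now > a_now and t_slope >= a_slope:
        return "Criterion 3: trend exceeds AIM line; consider raising the goal"
    if projected >= goal:
        return "Criterion 2: on track to intersect; keep the current intervention"
    return "Criterion 4: unlikely to reach the goal; change or intensify the intervention"

# Invented example: trend at 58 WRC growing 1.2/week, AIM line at 62
# growing 1.5/week, 10 weeks left, goal of 80 WRC -> Criterion 4.
print(decision(trend=(1.2, 58), aim=(1.5, 62), weeks_left=10, goal=80))
```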
111
Progress Monitor (PM) Testing Frequency
General Guidelines Based on Best Practices & Research: Progress Monitor (PM) Testing Frequency
[The original slide is a table rating the probable strength of PM data's ability to reliably inform instruction (Poor, Fair, Good, Excellent) by testing frequency (2x/week; 1x/week; every ~10 days; every 2 weeks; every 3 weeks; every 4+ weeks) against the elapsed period (after 4, 6, 8, and 10+ weeks); more frequent testing over longer periods earns stronger ratings. The R-CBM recommendation pairs these frequencies with giving 1 probe or the median of 3 probes per session; other measures need only one probe per session.]
Consider all recommendations and guidelines presented within this AIMSweb® training module, as well as other local factors that may apply. Summer 2009. Copyright © 2009 Pearson Education, Inc. or its affiliate(s). All rights reserved.
112
Building Confidence in Decision-Making
Variability of the data: The more variable the data, the larger the error in the slope. The larger the error in the slope, the more data points are needed to gain confidence in the trend/actual progress made. The "tighter" the data, the fewer data points potentially needed to be confident in the developing trend. (A sketch of this relationship follows.)
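To see why variability demands more data, here is a standard least-squares sketch. It is not AIMSweb's computation; the two score series are invented to have similar growth but different noise, so the noisier one yields a larger standard error on the slope (ROI).

```python
# Standard least-squares sketch (not AIMSweb's computation): fit a trend
# line to weekly scores and report the slope (ROI) with its standard error.
import math

def trend(scores):
    n = len(scores)
    xs = range(n)                              # week numbers 0..n-1
    mx, my = (n - 1) / 2, sum(scores) / n      # means of x and y
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, scores)) / sxx
    resid = [y - (my + slope * (x - mx)) for x, y in zip(xs, scores)]
    se = math.sqrt(sum(r * r for r in resid) / (n - 2) / sxx)
    return slope, se

tight = [50, 52, 53, 55, 56, 58, 59, 61]   # low variability
noisy = [50, 58, 47, 60, 52, 63, 51, 65]   # high variability, similar growth
for label, data in (("tight", tight), ("noisy", noisy)):
    slope, se = trend(data)
    print(f"{label}: ROI = {slope:.2f} ± {se:.2f} WRC per week")
```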
113
Building Confidence in Decision-Making
Note that a student may remain in an ineffective program longer than necessary when data collection is not frequent enough: compare 5 data points over 15 weeks vs. 5 data points over 5 weeks.
114
[Graph: baseline data. Key: aimline, trend line, corrects, errors.]
115
116
AIMSweb® Progress Monitor provides the new ROI after the entry of three (3) data points.
117
118
119
120
121
The teacher referenced the research of Shinn (1989) and Christ & Silberglitt (2007) and has collected eight (8) data points thus far. Is this enough data to evaluate the efficacy of the instructional program?
122
Sample questions to ask when reviewing data:
Has the instructional program been provided with fidelity? (Has this been observed directly?)
Has student attendance been acceptable?
Is core instruction in reading also being provided, or is the student missing core instruction?
Does instruction address the student's skill deficits?
What other factors could be impacting the student's performance?
123
An “Intervention line” is added on the exact date the new intervention has begun.
124
125
126
127
128
Intervention 3 added and performance observed.
129
Note the skill regression and recoupment pattern during winter break, December 22 to January 5.
130
131
Note: Gradual decrease in error rates and increase in words read correct over time.
132
BIG IDEAS…
Take into account other data sources (triangulation of data).
Lots of variability = more data needed to make decisions.
133
Let's Practice! Identify which decision rule you are using (in your manual, pages 50-52).
134
Let’s Practice!
135
Let’s Practice!
136
Let’s Practice!
137
Connecting Interventions to PM
Websites:
Math: Illuminations
Reading: Florida Center for Reading Research
138
Wrap-Around Activity
Reflecting on learning is an important step in metacognition. Participants form a circle, and each individual takes a turn telling…
Something they will use from the information or activities learned today
Something they will remember from today
A significant AHA! from this session
"I have learned ________________"
"I hope to learn ________________"
139
Here’s What, So What, Now What?
Here's What: I need to collect 6-9 data points on the same measure at the same grade level.
So What? I can make better decisions for my students.
Now What? I will be more consistent with my data collection because I know about this research.
140
Feedback: Effect size = 0.73. Feedback is most powerful when teachers seek feedback from students about their instruction. I want to seek feedback about my instruction today: please complete the evaluation provided on the back of your handout. Thank you!