
1

2

3 Local AIMSweb® Manager: Taking the Role. An introduction to: Course Overview; the Role of a Local AIMSweb Manager (LAM); and the skills needed to become a successful LAM

4 Training Outcomes Learn the Big Picture: –Overview of AIMSweb System and using Curriculum-Based Measurement (CBM) –Overview of Progress Monitoring to affect student achievement and determine efficacy of interventions Learn to Administer and Score: –Oral Reading (R-CBM) –Reading Comprehension (MAZE) –Math Computation (M-CBM) –Early Numeracy (TEN) Learn to use AIMSweb Software: –Benchmark Assessments/Strategic Monitoring –Progress Monitoring –Management of the AIMSweb System Learn the fundamentals of organizing and implementing the AIMSweb Assessment System: –Training others in your organization –Student database upload –Expectations you should have for this course in assisting you with this process.

5 Training Intentions: For the Novice or Advanced AIMSweb User: –Materials: Receive training, materials and information that is useful for training others. –Administration & Scoring Practice: Learn how to conduct training sessions that will ensure inter-rater reliability, plus related materials. –Training Accounts: Learn how to use the AIMSweb software, and how to request training accounts for your own trainings. –Training Scripts: Follow along with pre-scripted scenarios that allow you exposure to the software. These materials may be duplicated and used in your own schools for subsequent training.

6 Realistic Expectations: –A comprehensive two-day workshop in a train-the-trainer approach. –DURING THE WORKSHOP: The two days spent here provide you with solid familiarity, practice, and materials. This workshop will provide you with sufficient tools and exposure to the assessments and software, which you may use as a springboard to further learning and successful adoption of AIMSweb. –AFTER THE WORKSHOP: Requires frequent practice using the materials provided and self-study to maintain and enhance the skills taught. –Give yourself permission and time to experiment with the software, practice presenting the PowerPoint slides to others, and review readings as needed. Success comes most readily to those who have spent time with the system until a comfort level is achieved. –Do not be frustrated if you leave here feeling less than fully confident with all that is presented. This training will provide you with tools to use for weeks and months to come until you gain fluency with the system. Then you will feel most confident when training others.

7 Helpful Hints: –Take thoughtful notes on the PowerPoint slide presentations. These will be the same slides that you are welcome to use in your own organizations as needed to train others. –Ask questions. If you’re thinking about it —someone else is likely doing the same! –Download and store these files in a safe location on your computer for future use. –Generate a support network: Get to know someone seated near you. Gather contact information from these individuals. In the future, it may be helpful for you to share notes and converse about this training as you begin to implement and/or use AIMSweb. –Not everyone here will be taking this course in order to learn to train others. Some may be simply improving their own knowledge or skills about AIMSweb. That is OK! Use the information presented in whichever ways it is most meaningful to you. Everyone is very welcome here!

8 Tips for Training Others: –You will be training others in your organization using this material. Again, take meaningful notes today that will prepare you for this task. –Download your materials from the website we provide to you. Practice presentations aloud to yourself several times in order to become comfortable with your delivery. –The handout copies for presentations you deliver should be printed in advance so that your participants may take notes about the presentation for later reference. –This material is copyrighted and is the property of Edformation, Inc., the creators of AIMSweb. Please contact Edformation, Inc. for written permission prior to using these materials for purposes other than those for which they were intended.

9 Continued Support: –AIMSonline Discussion Forum (located within your training accounts) –Documentation: Training scripts, user guides, Help button. –support@edformation.com: Use judiciously when you have technical questions that are not answered in the tools already provided to you. –Web link with downloadable materials for this training.

10 Let’s Begin: What are Local AIMSweb Managers (LAMs)? LAMs are individuals who take a leadership position in the use of the AIMSweb system within their organizations. As a LAM, you will likely be responsible for training colleagues to utilize various components of the AIMSweb system, according to the organization’s needs. Successful LAMs have a strong background in assessment and standardized test administration. Examples of LAMs include: School Psychologists, Speech Pathologists, other ancillary staff Teachers Administrators Reading Consultants Special Education Staff

11 Successful LAMs Successful LAMs are able to: Gain a solid understanding of the concepts surrounding the use of CBM and AIMSweb Understand the importance of training staff well so that trustworthy data are obtained Become fluent with the key factors that make AIMSweb training and implementation successful within your organization: –Data integrity –Not taking on too much at once –Listening to the needs of your staff –Knowing the hierarchy of support available to you and staff –Building logical support systems within your organization

12 New to AIMSweb? Welcome to the AIMSweb System AIMSweb is an assessment and reporting system using: –Curriculum-Based Measurement (CBM) assessments Cover all core academic areas Scientifically researched for stability and validity Designed for formative assessment—continually informing educators of the benefit instructional programs provide –Data management and reporting software AIMSweb is user-friendly, web-based software Used to manage and report student data Reports assist staff in making determinations about the efficacy of instructional programs over time Program evaluation tool: Assists educators in determining the value of instructional programs and interventions for populations such as: –Individual students –Small groups of students –Classrooms, grade-levels –Schools, districts/large organizations

13

14

15 OVERVIEW OF CURRICULUM-BASED MEASUREMENT AS A GENERAL OUTCOME MEASURE Mark R. Shinn, Ph.D. Michelle M. Shinn, Ph.D.

16 Formative Evaluation to Inform Teaching Summative Assessment: Culminating measure. Mastery assessment. Pass/fail type assessments which summarize the knowledge students learn. Typical summative assessments include: End-of-chapter tests High-stakes tests (e.g., State assessments) GRE, ACT, SAT, GMAT, etc. Driver’s license test Final exams. Formative Evaluation: Process of assessing student achievement during instruction to determine whether an instructional program is effective for individual students. Informs: When students are progressing, continue using your instructional programs. When tests show that students are not progressing, you can change your instructional programs in meaningful ways.

17 Big ideas of Benchmark Assessment It’s about using General Outcome Measures (GOMs) for formative assessment/evaluation to: Inform teaching AND ensure accountability. It’s different from, but related to, summative high-stakes testing/evaluation, which: Doesn’t inform teaching. Mostly used for accountability/motivation.

18 Today’s High Stakes Evaluation World High Stakes Tests are USUALLY appropriate only for summative evaluation. NOT useful for decisions teachers need to make every day (formative): For whom do I need to individualize instruction or find more intensive instructional programs? How do I organize my classrooms for instructional grouping? How do I know that my teaching is “working” for each student so that I can make changes in instruction when necessary? NOT very useful to administrators who must make decisions about allocating instructional resources, especially in a preventative or responsive model.

19 High Stakes Evaluation World (continued) Reliability/Validity issues on high-stakes tests: Guessing factor (bubble-in, skipping items, etc.) Cheating—at various levels Culture-sensitivity concerns Test may not match what is taught Fatigue Enabling behaviors may not be present for the test that is required Test anxiety Political pressures on student performance Testing often takes place after year(s) of instruction with long time frames between (annual). Information about success and failure rates is provided too late to make changes. The primary use of high stakes tests then may be to “assign the blame” to students, their parents, teachers, or schools.

20 High Stakes Evaluation World (continued) High Stakes = High cost in terms of: Loss of instructional time. Time for test taking. Paying for the tests. In summary, typical High Stakes Testing is: Too little! Too late! At too high a cost!

21 An Example: Weight High standard: All children will have a healthy weight by the end of third grade. High Stakes Assessment: Based on assessing body density. Weighing each student. Immersing each student in a large tub filled with water, and measuring the amount of water displaced. Divide weight by displacement and get density, a very accurate picture of physical status.

22 Weight (continued) After 8-9 YEARS of growth, we would: 1. Place students who are “unhealthy” in remedial programs. 2. Create new health programs. 3. Blame the effectiveness of old health programs. 4. Blame the students (or their families) for over- or under-eating.

23 Formative Evaluation: Same Standard - Different Assessment High Standard: All children will have a healthy weight by the end of third grade. Benchmark Assessment: Monitor weight directly, frequently, and continuously. From birth, measure weight frequently and continuously with a simple, albeit less precise, general outcome measure, weight in pounds, using a scale.

24 At ANY Point in Development The child could be weighed and a decision made about healthy weight. This process is: Efficient. Sufficiently accurate. Proactive. Cost effective. We would know their health status before they reached the high stakes point!

25 AIMSweb in a Picture and a Sentence AIMSweb is a 3-tier Progress Monitoring System based on direct, frequent and continuous student assessment which is reported to students, parents, teachers and administrators via a web based data management and reporting system for the purpose of determining response to instruction.

26 Common Characteristics of GOMs The same kind of evaluation technology as other professions: Powerful measures that are: Simple, Accurate, Efficient indicators of performance that guide and inform a variety of decisions. A generalizable thermometer that allows for reliable, valid, cross comparisons of data.

27 General Outcome Measures (GOMs) from Other Fields Medicine measures height, weight, temperature, and/or blood pressure. Federal Reserve Board measures the Consumer Price Index. Wall Street measures the Dow-Jones Industrial Average. Companies report earnings per share. McDonald’s measures how many hamburgers they sell.

28 CBM is a GOM Used for Scientific Reasons Based on Evidence Reliable and valid indicator of student achievement Simple, efficient, and of short duration to facilitate frequent administration by teachers Provides assessment information that helps teachers plan better instruction Sensitive to the improvement of students’ achievement over time Easily understood by teachers and parents Improves achievement when used to monitor progress

29 Things to Always Remember About CBM Designed to serve as “indicators” of general reading achievement: CBM probes don’t measure everything, but measure the important things. Standardized tests to be given, scored, and interpreted in a standard way. Researched with respect to psychometric properties to ensure accurate measures of learning.

30 Items to Remember (continued) Are sensitive to improvement in brief intervals of time Also tell us how students earned their scores (qualitative information) Designed to be as short as possible to ensure its “do ability” Are linked to decision making for promoting positive achievement and Problem-Solving

31 What is CBM? CBM is a form of Curriculum-Based Assessment (CBA). Curriculum-Based Measurement (CBM) is the method of monitoring student progress through direct, continuous assessment of basic skills. CBM is used to assess skills such as reading fluency, comprehension, spelling, mathematics, and written expression. Early literacy skills (phonics and phonological awareness) are similar measures and are downward extensions of CBM. CBM probes last from 1 to 4 minutes depending on the skill being measured and student performance is scored for speed and accuracy to determine proficiency. Because CBM probes are quick to administer and simple to score, they can be given frequently to provide continuous progress data. The results are charted and provide for timely evaluation based on hard data.

32 Origins of CBM as General Outcome Measures Curriculum-Based Measurement (CBM) was developed more than 20 years ago by Stanley Deno at the University of Minnesota through a federal contract to develop a reliable and valid measurement system for evaluating basic skills growth. CBM is supported by more than 25 years of school-based research by the US Department of Education. Starting in the area of reading, researchers have expanded to investigate additional academic areas over the years. With significant additional research, AIMSweb now offers CBM assessments in 7 areas and 2 languages, with new areas currently being tested and developed. Supporting documentation can be found in hundreds of articles, book chapters, and books in the professional literature describing the use of CBM to make a variety of important educational decisions.

33 Skill Areas Currently Assessable via AIMSweb: Early Literacy [K-1 benchmark, Progress Monitor (PM) any age] –Letter Naming Fluency –Letter Sound Fluency –Phonemic Segmentation Fluency –Nonsense Word Fluency Early Numeracy (K-1 benchmark, PM any age) –Oral Counting –Number Identification –Quantity Discrimination –Missing Number Oral Reading (K-8, PM any age) MAZE (Reading Comprehension) (1-8, PM any age) Math Computation (1-6, PM any age) Math Facts (PM any age) Spelling (1-8, PM any age) Written Expression (1-8, PM any age) Early Literacy and Oral Reading—Spanish (K-8)

34 Advantages of CBM Direct measure of student performance. Correlates strongly with “best practices” for instruction and assessment, and research-supported methods for assessment and intervention. Focus is on repeated measures of performance. (This cannot be done with most norm-referenced and standardized tests due to practice effect or limited forms.)

35 Advantages of Using CBM Quick to administer, simple, easy, and cost-efficient. Performance is graphed and analyzed over time. Sensitive to even small improvements in performance. This is KEY—as most standardized/norm-referenced tests do NOT show small, incremental gains. CBM allows teachers to do what they do better! Capable of having many forms—whereas most standardized tests only have a maximum of two forms. Monitoring frequently enables staff to see trends in individual and group performance—and compare those trends with targets set for their students.

36 Big Ideas of Benchmark (Tier 1) Assessment Benchmarking allows us to add systematic Formative Evaluation to current practice. For Teachers (and Students): Early Identification of At Risk Students, Instructional Planning, Progress Monitoring. For Parents: Opportunities for Communication/Involvement, Accountability. For Administrators: Resource Allocation/Planning and Support, Accountability.

37 Benchmark Testing: Useful for Communicating with Parents Designed for collaboration and communication with parents. Student achievement is enhanced by the teacher-parent communication about achievement growth on a continuous basis. A Parent Report is produced for each Benchmark Testing.

38 School Calendar Year (2006-2007): Benchmarking (Tier 1) takes place for 2 weeks during: September 1 to October 15; January 1 to February 1; May 1 to June 1.

39 Benchmark (Tier 1) for Oral Reading (R-CBM) Set of 3 probes (passages) at grade-level* (~1 through 8). Administer the same set, three times per year, to all students. Requires 3 minutes per student, 3 times per year.

40 How the AIMSweb System Works for Benchmark (Tier 1): Oral Reading (R-CBM) as an Example Research suggests there is no significant practice effect by repeating the set of three passages for benchmark assessment periods. Using same passage sets for each benchmark increases confidence in data obtained, reduces extraneous variables. Students read aloud for 1 minute from each of the three Edformation Standard Reading Assessment Passages. Passages contain meaningful, connected text. Number of words read correctly (WRC) and number of errors are counted per passage read. Scores reported as WRC/errors.

41 Benefits of Using Edformation’s Standard Reading Assessment Passages Passages are written to represent general curriculum or to be “curriculum independent” Allow decision making about reading growth, regardless of between-school, between-school-district, between-teacher differences in reading curriculum Are graded to be of equal difficulty Have numerous alternate forms for testing over time without practice effects For additional data on R-CBM passages, review: Standard Reading Assessment Passages for Use in General Outcome Measurement: A manual describing development and technical features. Kathryn B. Howe, Ph.D. & Michelle M. Shinn, Ph.D.

42 Sample R-CBM Assessment Passage — Student Copy Standard Reading Assessment Passage Student Copy: No numbers Between 250-300 words (exception: 1st grade) An informative first sentence Same font style and size Text without pictures

43 Sample R-CBM Assessment Passage — Examiner Copy Standard Reading Assessment Passage Examiner Copy: Pre-numbered so they can be scored quickly and immediately.

44 Data: Get the MEDIAN score for the student’s 3 passages (1 minute each): 67/2, 85/8, 74/9. Why use Median vs. Average? Averages are susceptible to outliers when dealing with small number sets. The Median score is a statistically more reliable number than the average for R-CBM.

45 The Data: Get the MEDIAN score for 3 passages (1 minute each): 67/2, 85/8, 74/9. 1. Throw out the HIGH and LOW scores for Words Read Correct.

46 The Data: Get the MEDIAN score for 3 passages: 67/2, 85/8, 74/9. 2. Throw out the HIGH and LOW scores for the Errors. The remaining scores = the MEDIAN (=74/8). 3. Report this score in your AIMSweb account.
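The median rule above is easy to automate for LAMs who collect benchmark scores in a spreadsheet or script before entering them into AIMSweb. A minimal Python sketch (the function name and data layout are illustrative, not part of the AIMSweb software):

```python
from statistics import median

def benchmark_median(probe_scores):
    """probe_scores: list of (wrc, errors) tuples from the three benchmark passages.

    With exactly three probes, taking the median of each column is the same as
    throwing out the high and low values, as described on the slides.
    """
    wrc_median = median(wrc for wrc, _ in probe_scores)
    err_median = median(err for _, err in probe_scores)
    return wrc_median, err_median

# Example from the slides: 67/2, 85/8, 74/9 is reported as 74/8
print(benchmark_median([(67, 2), (85, 8), (74, 9)]))  # (74, 8)
```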

47 Managing Data after Assessment: Take the median score for each student (e.g., 74/8) and report it in the AIMSweb System. AIMSweb instantly generates multiple reports for analysis and various decision-making purposes. A few of the many reports available appear here as samples:

48 For Teachers: Classroom Report

49 Box & Whiskers Graphs (box plots): A Brief Explanation AIMSweb commonly uses box plots to report data. This chart will help you become familiar with box plots. Consider a bell curve; box plots are somewhat similar in shape and representation. [Chart labels: 90th percentile, 75th percentile, Median (50th percentile), 25th percentile, 10th percentile, outlier; the box marks the average range of the population included in the sample, with the below-average and above-average ranges falling outside it.]

50 Report Beginning of Year Status

51 Individual Report: Student

52 Know When Things are Working

53 Have Data to Know When Things Need Changing

54 Data to Know that Changes Made a Difference

55 Data to Know that Things Went Well

56 Identifying At Risk Students

57 For Teachers: Classroom Report

58 At-a-Glance Views of Student Ranking & Growth

59 Follow student progress over time.

60 Compare Sub-group Trends:

61 Compare a School to a Composite

62 Many Reporting Options Available

63 Finally… Benchmark Testing, using simple, general, RESEARCHED outcome measures, provides an ONGOING database to teachers, administrators, and parents for making decisions about the growth and development of basic skills. Professionally managed by staff in a process that communicates that WE are in charge of student learning.

64 The End

65

66 Administration and Scoring of READING-CURRICULUM-BASED MEASUREMENT (R-CBM) for Use in General Outcome Measurement Mark R. Shinn, Ph.D. Michelle M. Shinn, Ph.D.

67 Overview of Reading-CBM Assessment Training Session Part of a training series developed to accompany the AIMSweb Improvement System. Purpose is to provide the background information and data collection procedures necessary to administer and score Reading Curriculum-Based Measurement (R-CBM). Designed to accompany: Administration and Scoring of Reading Curriculum-Based Measurement for Use in General Outcome Measurement (R-CBM) Training Workbook Standard Reading Assessment Passages AIMSweb Web-based Software Training Video

68 Training Session Goals Brief review of Curriculum-Based Measurement (CBM) and General Outcome Measurement (GOM): Its Purpose. Its Origins. Learn how to administer and score through applied practice.

69 General Outcome Measures from Other Fields Medicine measures height, weight, temperature, and/or blood pressure. Federal Reserve Board measures the Consumer Price Index. Wall Street measures the Dow-Jones Industrial Average. Companies report earnings per share. McDonald’s measures how many hamburgers they sell.

70 Common Characteristics of GOMs Simple, accurate, and reasonably inexpensive in terms of time and materials. Considered so important to doing business well that they are routine. Are collected on an ongoing and frequent basis. Shape or inform a variety of important decisions.

71 Origins of CBM as General Outcome Measures Curriculum-Based Measurement (CBM) was developed more than 20 years ago by Stanley Deno at the University of Minnesota through a federal contract to develop a reliable and valid measurement system for evaluating basic skills growth. CBM is supported by more than 25 years of school-based research by the US Department of Education. Supporting documentation can be found in more than 150 articles, book chapters, and books in the professional literature describing the use of CBM to make a variety of important educational decisions.

72 CBM was Designed to Provide Educators With… The same kind of evaluation technology as other professions: Powerful measures that are: Simple, Accurate, Efficient indicators of student achievement that guide and inform a variety of decisions

73 How the AIMSweb System Works for Benchmark (Tier 1): Oral Reading (R-CBM) Research suggests there is no significant practice effect by repeating the set of three passages for benchmark assessment periods. Using same passage sets for each benchmark increases confidence in data obtained, reduces extraneous variables. Students read aloud for 1 minute from each of the three Edformation Standard Reading Assessment Passages. Passages contain meaningful, connected text. Number of words read correctly (WRC) and number of errors are counted per passage read. Scores reported as WRC/errors.

74 Benefits of Using Edformation’s Standard Reading Assessment Passages Passages are written to represent general curriculum or to be “curriculum independent” Allow decision making about reading growth, regardless of between-school, between-school-district, between-teacher differences in reading curriculum Are graded to be of equal difficulty Have numerous alternate forms for testing over time without practice effects For additional data on R-CBM passages, review: Standard Reading Assessment Passages for Use in General Outcome Measurement: A manual describing development and technical features. Kathryn B. Howe, Ph.D. & Michelle M. Shinn, Ph.D.

75 An Example of R-CBM - Illustration 1 Practice Video

76 Observation Questions What did you observe about this child’s reading? Is she a good reader? Give your reason(s) for your answer to the second question. About how many words did she read correctly?

77 An Example of R-CBM - Illustration 2 Practice Video

78 Observation Questions What did you observe about this child’s reading? Is she a good reader? Give your reason(s) for your answer to the second question. About how many words did she read correctly?

79 Qualitative Features Worth Noting R-CBM Workbook: Page 19

80 R-CBM is Used for Scientific Reasons Based on Evidence: It is a reliable and valid indicator of student achievement. It is simple, efficient, and of short duration to facilitate frequent administration by teachers. It provides assessment information that helps teachers plan better instruction. It is sensitive to the improvement of students’ achievement over time. It is easily understood by teachers and parents. It improves achievement when used to monitor progress.

81 Things to Always Remember About R-CBM Are designed to serve as “indicators” of general reading achievement. R-CBM doesn’t measure everything, but measures the important things. Are standardized tests to be given, scored, and interpreted in a standard way. Are researched with respect to psychometric properties to ensure accurate measures of learning.

82 Items to Remember (continued) Are sensitive to improvement in short periods of time. Also tell us how students earned their scores (qualitative information). Designed to be as short as possible to ensure its “do ability.” Are linked to decision making for promoting positive achievement and Problem-Solving.

83 Benefits of Using Edformation’s Standard Reading Assessment Passages Are written to represent general curriculum or be “curriculum independent.” Allow decision making about reading growth, regardless of between-school, between-school-district, between-teacher differences in reading curriculum. Are graded to be of equal difficulty. Have numerous alternate forms for testing over time without practice effects.

84 Administration and Scoring of R-CBM What Examiners Need To Do… Before testing students While testing students After testing students

85 Things you Need Before Testing 1. Standard Reading Assessment Passage Student Copy: No numbers Between 250-300 words (exception: 1st grade) An informative first sentence Same font style and size Text without pictures Obtain from your LAM

86 Things you Need Before Testing 2. Standard Reading Assessment Passage Examiner Copy: Pre-numbered so they can be scored quickly and immediately. Obtain from your LAM.

87 3. Tier 1 (Benchmark) R-CBM Probes: 1.AIMSweb Manager provides staff with copies of three grade-level probes (teacher and student copies). 2.FALL: Staff administer three, grade-level probes to each student. Report median score. 3.WINTER: Repeat administration of same three probes to each student. Report median score. 4.SPRING: Repeat administration of same three probes to each student. Report median score.

88 4. Additional Assessment Aids Needed Before Testing A list of students to be assessed Stop Watch (required—digital preferred) Clipboard Pencil Transparencies or paper copies of examiner passages Dry Marker or Pencil Wipe Cloth (for transparencies only)

89 Setting up Assessment Environment Assessment environments are flexible and could include… A set-aside place in the classroom Reading station in the hallway Reading stations in the media center, cafeteria, gym, or empty classrooms

90 Things You Need to do While Testing Follow the standardized directions: R-CBM is a standardized test Administer the assessment with consistency Remember it’s about testing, not teaching Don’t teach or correct Don’t practice reading the passages Remember best, not fastest reading Sit across from, not beside student

91 R-CBM Standard Directions for 1 Minute Administration 1) Place the unnumbered copy in front of the student. 2) Place the numbered copy in front of you, but shielded so the student cannot see what you record. 3) Say: When I say ‘Begin,’ start reading aloud at the top of this page. Read across the page (DEMONSTRATE BY POINTING). Try to read each word. If you come to a word you don’t know, I will tell it to you. Be sure to do your best reading. Are there any questions? (PAUSE) 4) Say “Begin” and start your stopwatch when the student says the first word. If the student fails to say the first word of the passage after 3 seconds, tell them the word, mark it as incorrect, then start your stopwatch. 5) Follow along on your copy. Put a slash ( / ) through words read incorrectly. 6) At the end of 1 minute, place a bracket ( ] ) after the last word and say, “Stop.” 7) Score and summarize by writing WRC/Errors.

92 “Familiar” Shortened Directions When students are assessed frequently and know the directions. Say: When I say ‘Begin,’ start reading aloud at the top of this page.

93 Items to Remember Emphasize Words Read Correctly (WRC). Get an accurate count. 3-Second Rule. No Other Corrections. Discontinue Rule. Be Polite. Best, not fastest. Interruptions.

94 Accuracy of Implementation (AIRS)

95 Things to do After Testing Score immediately! Determine WRC. Put a slash (/) through incorrect words. If doing multiple samples, organize your impressions of qualitative features.

96 What is a Word Read Correctly? Correctly pronounced words within context. Self-corrected incorrect words within 3 seconds.

97 What is an Error? Mispronunciation of the word. Substitutions. Omissions. 3-second pauses or struggles (examiner provides the correct word).

98 What is not Incorrect? (Neither a WRC nor an Error) Repetitions. Dialect differences. Insertions (consider them qualitative errors).

99 Example Juan finished reading after 1 minute at the 145th word, so he read 145 words total. Juan also made 3 errors. Therefore his WRC was 142 with 3 errors. Reported as: 142/3.

100 Calculating R-CBM Scores Record the total number of words read. Subtract the number of errors. Report in the standard format of WRC/Errors (e.g., 72/3).
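The arithmetic behind the WRC/Errors score is simple enough to express in a couple of lines of Python; the sketch below (with an illustrative function name) just restates the rule from the slide:

```python
def wrc_score(total_words_read: int, errors: int) -> str:
    """Report an R-CBM score in the standard WRC/Errors format."""
    return f"{total_words_read - errors}/{errors}"

# Juan read 145 total words with 3 errors, so his score is reported as 142/3.
print(wrc_score(145, 3))  # "142/3"
```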

101 R-CBM Scoring Rules and Examples A complete list of scoring rules can be found in the Appendix of your workbook. Please review and become familiar with the more unusual errors.

102 Benchmark Data: Obtain the MEDIAN score for the student’s 3 passages (1 minute each): 67/2, 85/8, 74/9. Why use Median vs. Average? Averages are susceptible to outliers when dealing with small number sets. The Median score is a statistically more reliable number than the average for R-CBM.

103 The Data: Obtain the MEDIAN score for 3 passages (1 minute each): 67/2, 85/8, 74/9. 1. Throw out the HIGH and LOW scores for Words Read Correct (WRC).

104 The Data: Obtain the MEDIAN score for 3 passages: 67/2, 85/8, 74/9. 2. Throw out the HIGH and LOW scores for the Errors. The remaining scores = the MEDIAN (=74/8). 3. Report this score in your AIMSweb account.

105 Determining Inter-Rater Agreement Example: Dave. 2 examiners observed Dave’s reading: 1 scored Dave as 100 WRC, 1 scored Dave as 98 WRC. They agreed that Dave read 98 of the words correct. They disagreed on 2 words. Inter-rater Agreement Formula: Agreements / (Agreements + Disagreements) x 100 = IRA. 98 / (98 + 2) = 98/100 = .98; .98 x 100 = 98%. Inter-rater Agreement for Dave is 98%. (Goal is 95% or better.) See R-CBM Workbook: Page 14
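The inter-rater agreement computation can be checked the same way. A minimal Python sketch using Dave’s numbers from the slide (the function name is illustrative):

```python
def inter_rater_agreement(agreements: int, disagreements: int) -> float:
    """Agreements / (Agreements + Disagreements) x 100, expressed as a percentage."""
    return agreements / (agreements + disagreements) * 100

# Dave: raters agreed on 98 words and disagreed on 2, so IRA = 98% (goal: 95% or better).
print(inter_rater_agreement(98, 2))  # 98.0
```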

106 Prepare to Practice You now have the building blocks to begin R-CBM assessment to ensure reading growth. Practice to automaticity: you’ll become more efficient. Determine IRR with AIRS for accuracy and efficiency with a colleague. Stay in tune by periodically checking AIRS.

107 Prepare to Practice

108 Practice Exercise 1: Let’s Score

109 This student read 73 WRC/7 Errors

110 Practice Exercise 2: Let’s Score

111 This student read 97 WRC/3 Errors

112 Practice Exercise 3: Let’s Score

113 This student read 141 WRC/2 Errors

114 Practice Exercise 4: Let’s Score

115 This student read 86 WRC/5 Errors

116 Practice Exercise 5: Let’s Score

117 This student read 88 WRC/2 Errors

118 Practice Exercise 6: Let’s Score

119 This student read 99 WRC/1 Error

120 Practice Exercise 7: Let’s Score

121 This student read 96 WRC/6 Errors

122 Practice Exercise 8: Let’s Score

123 This student read 139 WRC/2 Errors

124 The End

125

126 Administration and Scoring of READING-MAZE (R-MAZE) for Use in General Outcome Measurement PowerPoint authored by Jillyan Kennedy Based on Administration and Scoring of Reading R-MAZE for Use with AIMSweb Training Workbook By Michelle M. Shinn, Ph.D. Mark R. Shinn, Ph.D.

127 Overview of the CBM R-MAZE Assessment Training Session The Purpose is to provide background information, data collection procedures, and practice opportunities to administer and score Reading MAZE. Designed to accompany: Administration and Scoring of Reading R-MAZE for Use in General Outcome Measurement Training Workbook Standard R-MAZE Reading Passages AIMSweb Web-based Software

128 Training Session Goals 1. Brief review of Curriculum-Based Measurement (CBM) and General Outcome Measurement (GOM): Its Purpose Its Origin 2. Learn how to administer and score CBM R-MAZE through applied practice.

129 R-MAZE is Used for Scientific Reasons Based on Evidence It is a reliable and valid indicator of student achievement. It is simple, efficient, and of short duration to facilitate frequent administration by teachers. It provides assessment information that helps teachers plan better instruction. It is sensitive to the improvement of students’ achievement over time. It is easily understood by teachers and parents. It improves achievement when used to monitor progress.

130 Curriculum Based Measurement Reading R-MAZE Area: Reading. Timing: 3 minutes. Test Arrangements: Individual, Small Group, or Classroom Group. What is Scored: # of Correct Answers. CBM R-MAZE is designed to provide educators a more complete picture of students’ reading skills, especially when comprehension problems are suspected. Best for: Students in Grades 3 through 8 Students in Grades 3 through 8, but who are performing below grade-level

131 Curriculum Based Measurement Reading R-MAZE (Continued) R-MAZE is a multiple-choice cloze task that students complete while reading silently. The students are presented with 150-400 word passages. The first sentence is left intact. After the first sentence, every 7th word is replaced with three word choices inside parentheses. The three choices consist of: 1) Near Distracter 2) Exact Match 3) Far Distracter

132 Sample Grade 4 R-MAZE Passage

133 Examples of R-MAZE R-MAZE Workbook: Page 9

134 Observation Questions 1.What did you observe about Emma’s and Abby’s R-MAZE performances? 2.What other conclusions can you draw?

135 Things to Always Remember About CBM R-MAZE Are designed to serve as “indicators” of general reading achievement. Are standardized tests to be given, scored, and interpreted in a standard way. Are researched with respect to psychometric properties to ensure accurate measures of learning.

136 Things to Always Remember About CBM R-MAZE (Continued) Are sensitive to improvement in short periods of time. Designed to be as short as possible to ensure its “do ability.” Are linked to decision making for promoting positive achievement and Problem-Solving.

137 Administration and Scoring of CBM R-MAZE What examiners need to do... Before testing students While testing students After testing students

138 Items Students Need Before Testing What the students need for testing: CBM R-MAZE practice test Appropriate CBM R-MAZE passages Pencils

139 Items Administrators Need Before Testing What the tester uses for testing: Stopwatch Appropriate CBM R-MAZE answer key Appropriate standardized directions List of students to be tested

140 Additional Assessment Aids A list of students to be tested Stopwatch (required—digital preferred)

141 Setting up Assessment Environment Assessment environments are flexible and could include… The classroom if assessing the entire class. A cluster of desks or small tables in the classroom for small group assessment. Individual desks or “stations” for individual assessment.

142 Things You Need to do While Testing Follow the standardized directions: Attach a cover sheet that includes the practice test so that students do not begin the test right away. Do a simple practice test with younger students. Monitor to ensure students are circling answers instead of writing them. Be prepared to “Prorate” for students who may finish early. Try to avoid answering student questions. Adhere to the end of timing.

143 CBM R-MAZE Standard Directions 1) Pass R-MAZE tasks out to students. Have students write their names on the cover sheet, so they do not start early. Make sure they do not turn the page until you tell them to. 2) Say this to the student(s): When I say ‘Begin’ I want you to silently read a story. You will have 3 minutes to read the story and complete the task. Listen carefully to the directions. Some of the words in the story are replaced with a group of 3 words. Your job is to circle the 1 word that makes the most sense in the story. Only 1 word is correct. 3) Decide if a practice test is needed. Say... Let’s practice one together. Look at your first page. Read the first sentence silently while I read it out loud: ‘The dog, apple, broke, ran after the cat.’ The three choices are apple, broke, ran. ‘The dog apple after the cat.’ That sentence does not make sense. ‘The dog broke after the cat.’ That sentence does not make sense. ‘The dog ran after the cat.’ That sentence does make sense, so circle the word ran. (Make sure the students circle the word ran.)

144 CBM R-MAZE Standard Directions (Continued) Let’s go to the next sentence. Read it silently while I read it out loud. ‘The cat ran fast, green, for up the hill.’ The three choices are fast, green, for. Which word is the correct word for the sentence? (The students answer fast.) Yes, ‘The cat ran fast up the hill’ is correct, so circle the correct word fast. (Make sure students circle fast.) Silently read the next sentence and raise your hand when you think you know the answer. (Make sure students know the correct word. Read the sentence with the correct answer.) That’s right. ‘The dog barked at the cat’ is correct. Now what do you do when you choose the correct word? (Students answer ‘Circle it.’ Make sure the students understand the task.) That’s correct, you circle it. I think you’re ready to work on a story on your own.

145 CBM R-MAZE Standard Directions (Continued) 4) Start the testing by saying... When I say ‘Begin’ turn to the first story and start reading silently. When you come to a group of three words, circle the 1 word that makes the most sense. Work as quickly as you can without making mistakes. If you finish a/the page/first side, turn the page and keep working until I say ‘Stop’ or you are all done. Do you have any questions? 5) Then say, ‘Begin.’ Start your stopwatch. 6) Monitor students to make sure they understand that they are to circle only 1 word. 7) If a student finishes before the time limit, collect the student’s R-MAZE task and record the time on the student’s test booklet. 8) At the end of 3 minutes say: Stop. Put your pencils down. Please close your booklet. 9) Collect the R-MAZE tasks.

146 CBM R-MAZE Familiar Directions 1) After the students have put their names on the cover sheet, start the testing by saying... When I say ‘Begin’ turn to the first story and start reading silently. When you come to a group of three words, circle the 1 word that makes the most sense. Work as quickly as you can without making mistakes. If you finish a/the page/first side, turn the page and keep working until I say ‘Stop’ or you are all done. Do you have any questions? 2) Then say, ‘Begin.’ Start your stopwatch. 3) Monitor students to make sure they understand that they are to circle only 1 word. 4) If a student finishes before the time limit, collect the student’s R-MAZE task and record the time on the student’s test booklet. 5) At the end of 3 minutes say: Stop. Put your pencils down. Please close your booklet. 6) Collect the R-MAZE tasks.

147 Things to Do After Testing Score immediately to ensure accurate results! Determine the number of words (items) correct. Use the answer key and put a slash (/) through incorrect words.

148 CBM R-MAZE Scoring What is correct? The student circles the word that matches the correct word on the scoring template. What is incorrect? An answer is considered an error if the student: 1) Circles an incorrect word 2) Omits word selections other than those the student was unable to complete before the 3 minutes expired

149 Making Scoring Efficient 1)Count the total number of items up to the last circled word. 2)Compare the student answers to the correct answers on the scoring template. Mark a slash [/] through incorrect responses. 3)Subtract the number of incorrect answers from the total number of items attempted. 4)Record the total number of correct answers on the cover sheet followed by the total number of errors (e.g., 35/2).

150 CBM R-MAZE Prorating If a student finishes all the items before 3 minutes, the score may be prorated. 1) When the student finishes, record the time and count the number of correct answers. For example, the student may have finished in 2 minutes and correctly answered 40 items. 2) Convert the time taken to seconds. (2 minutes = 120 seconds) 3) Divide the number of seconds by the number correct. (120/40 = 3) 4) Calculate the number of seconds in the full 3 minutes. (3 minutes = 180 seconds) 5) Divide the number of full seconds by the value calculated in step 3. (180/3 = 60)
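The prorating procedure amounts to scaling the student’s rate of correct answers up to the full 3 minutes. A minimal Python sketch of those steps (function and argument names are illustrative):

```python
def prorate_maze_score(correct: int, seconds_taken: float, full_seconds: float = 180) -> float:
    """Prorate an R-MAZE score for a student who finishes before the 3-minute limit."""
    seconds_per_correct = seconds_taken / correct   # e.g., 120 / 40 = 3 seconds per correct answer
    return full_seconds / seconds_per_correct       # e.g., 180 / 3 = 60 prorated correct answers

# Student finished in 2 minutes (120 seconds) with 40 correct answers: prorated score of 60.
print(prorate_maze_score(40, 120))  # 60.0
```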

151 Summary You now have the building blocks to begin CBM R-MAZE to ensure literacy growth. Practice to automaticity: you’ll become more efficient. Check IRR using the AIRS for accuracy/efficiency with a colleague. Stay in tune by periodically checking AIRS.

152 Practice Exercise 1: Let’s Score R-CBM Workbook: Page 17

153 Practice Exercise 2: Let’s Score

154 R-CBM Workbook: Page 19

155 THE END.

156

157 Math-Curriculum Based Measurement (M-CBM)

158 Training Session Overview of Math Computation CBM (M-CBM) and Math Facts Probes for use with AIMSweb Part of a training series developed to accompany the AIMSweb Improvement System. Purpose is to provide the background information and data collection procedures necessary to administer and score Math Computation Curriculum-Based Measurement (M-CBM) and Math Facts probes. Designed to accompany: Administration and Scoring of M-CBM Workbook Standard Math-CBM probes AIMSweb Web-based Software

159 Training Session Goals Brief review of M-CBM: Its Purpose. Its Origins. Learn how to administer and score through applied practice.

160 General Outcome Measures from Other Fields Medicine measures height, weight, temperature, and/or blood pressure. Federal Reserve Board measures the Consumer Price Index. Wall Street measures the Dow-Jones Industrial Average. Companies report earnings per share. McDonald’s measures how many hamburgers they sell.

161 Why Assess Math Computation via CBM: Computational skills are critical for math success. Using the algorithms for computation assists in building the math skills needed to master higher-level math proficiency. Broad-range achievement tests in math cover many skills, but few of each type.

162 Why Assess Math via M-CBM (Continued): Many broad-range math achievement tests have only one form. Difficult to monitor progress repeatedly over time. Hard to determine where a student’s skill levels start/stop. M-CBM provides many narrow-band tests (many items across a grade-level or problem type). Simple to administer, score, and chart progress.

163 M-CBM Methodology: Research has shown that having students write answers to computational problems for 2-4 minutes is a reliable, valid means of assessing student progress in math computation. Appropriate for average-performing students in Grades 1-6. Appropriate for students enrolled above Grade 6 who are performing below this level.

164 M-CBM Methodology: Based on expected math computation skills for Grades 1-6: Benchmarking: M-CBM (10 probes available per grade level) Strategic Monitoring: M-CBM (40 probes available per grade level) Progress Monitoring: M-CBM (40 probes available per grade level) or: Progress Monitoring: Math Facts: (40 probes available per grade level) Each probe contains 2 pages of computations, (front/back) Students complete probes under standardized conditions (See specific directions) Administer individually, small group, or class-wide: (Grade 1-3 = 2 minutes) (Grade 4-6 = 4 minutes)

165 M-CBM Grade-Level Probes All M-CBM grade-level probes are based on a Prototype Probe: all types of problems contained within the scope of a grade-level skill spectrum are consistent in order/sequence across all probes for each specified grade level. A-1: Same type of problem on each page (Grade 2: Probe 1; Grade 2: Probe 2)

166 M-CBM Sample 2nd Grade Probe (Student Copy)

167 Scoring based on number of DIGITS CORRECT (DC) M-CBM Sample 2nd Grade Probe (Teacher’s Answer Key)

168 M-CBM Math Facts: Addition Grades 1-3 (fewer problems) Grades 4-6 (more problems) Page 1 of 2

169 M-CBM Math Facts: Probe options Addition (+) facts Subtraction (-) facts Addition/Subtraction (+ / -) mixed facts Multiplication facts (x) Division facts (÷) Multiplication/Division mixed facts (x / ÷) Addition/Subtraction/Multiplication/Division mixed facts (+, -, x, ÷)

170 AIMSweb Math Fact Probes Used primarily for instructional planning and short-term progress monitoring (p.8 in M-CBM Workbook)

171 M-CBM Administration Setup Things you will need: 1.Appropriate M-CBM or Math Facts Probes 2.Students need pencils. 3.Stopwatch or timers Setting up the testing room: 1.Large group: Monitor carefully to ensure students are not skipping and X-ing out items 2.Small group/individual: Monitor similarly. 3.If students are off task, cue with statements such as: “Try to do EACH problem.” or “You can do this kind of problem so don’t skip.”

172 M-CBM Directions for Administration Different M-CBMs require different instructions (4 versions): Standard Math-Curriculum Based Measurement Probe instructions: Grades 1-3: Page 13 of workbook Grades 4-6: Page 14 of workbook Single-Skill Math Fact Probes—Standard Directions: Grades 1-6 probes: Page 15 of workbook Multiple Skill Math Fact Probes—Standard Directions: Grades 1-6 probes: Page 16 of workbook

173 M-CBM After Testing—Scoring What is Correct? Grades 1-6: Score Digits Correct (DC). Each digit correct in any answer = 1 point. If problem is “X”-ed out, ignore X and score anyway. Use Answer Key for quick scoring. See Page 18 of workbook for examples.

174 M-CBM: How to score Correct Digits (CD) Correct Digits - Each correct digit that a student writes is marked with an underline and counted.
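To picture what digits-correct scoring looks like, here is a minimal Python sketch that compares a student’s answer to the answer key digit by digit, right-aligned by place value. It is only an illustration of the basic idea and does not implement the workbook’s full rules (critical processes, reversals, X-ed out problems):

```python
def digits_correct(student_answer: str, correct_answer: str) -> int:
    """Count digits in the student's answer that match the key, aligned by place value."""
    # Reverse both strings so the ones place lines up, then compare position by position.
    pairs = zip(reversed(student_answer.strip()), reversed(correct_answer.strip()))
    return sum(1 for student_digit, key_digit in pairs if student_digit == key_digit)

# Key answer 407; a student who writes 47 earns 1 correct digit (the 7 in the ones place).
print(digits_correct("47", "407"))  # 1
```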

175 M-CBM: How to score Incomplete Problems Incomplete Problems - Sometimes students don’t finish a problem. Score for the number of correct digits that are written.

176 M-CBM: How to score “X”-ed out problems X-ed Out Problems - Sometimes students start a problem and then cross it out. Sometimes students go back and write answers for problems they have crossed out. Ignore the X and score what you see.

177 M-CBM: How to score reversals Legibility and Reversed or Rotated Numbers - Sometimes trying to figure out what number the student wrote can be challenging, especially with younger students or older students with mathematics achievement problems. To make scoring efficient and reliable, we recommend attention to three rules: 1. If it is difficult to determine what the number is at all, count it wrong. 2. If the reversed number is obvious, but correct, count it as a correct digit. 3. If the numbers 6 or 9 are potentially rotated and the digit is currently incorrect, count it as an incorrect digit.

178 How to score M-CBM: Critical Processes (CP) Scoring Scoring Rules for Answer and Critical Processes: When students’ Grade 5 or Grade 6 M-CBM probes are scored for the number of CDs in the answer only and critical processes, the examiner uses the answer key that details which digits are to be counted. Each problem has an “assigned CD value” based on what AIMSweb believes is the most conventional method of solving the computational problem. Compare how the same multi-step multiplication problem would be scored using the different methods. Answer Only: Answer & Critical Processes: Use error patterns diagnostically

179 How to score M-CBM: Critical Processes (CP) Scoring (Correct vs. Incorrect) Although you don’t need to count every digit written in a correct answer, it is important to write the number of CDs awarded to the problem next to the answer. Should the student solve the problem correctly, their score would be 13 CD. Should the student solve the problem and not write any of the CD, the score would be 0 CD. (p. 21 in M-CBM Workbook)

180 How to score M-CBM: Critical Processes (CP) Scoring (Work not shown) If the answer is correct as shown below-left, their score is the number of CDs possible shown in the answer key. If they do not show their work and the answer is incorrect, the examiner can only “score what they see” as shown below. (p. 21 in M-CBM Workbook)

181 How to score M-CBM: Answer & Critical Processes (CP) Scoring (Alignment) When students’ answers are not aligned correctly according to place value: If the answer is correct, ignore the alignment problem and count the digits as correct as shown below. If the answer is incorrect, count the digits as they appear in approximate place value as shown below, even if a place value error may seem obvious. (p. 22 in M-CBM Workbook)

182 The End

183

184 Administration and Scoring of Early Numeracy Curriculum-Based Measurement (CBM) for use in General Outcome Measurement

185 Overview of TEN-CBM Assessment Training Session Part of a training series developed to accompany the AIMSweb Improvement System. Purpose is to provide the background information and data collection procedures necessary to administer and score the Tests of Early Numeracy (TEN). Designed to accompany: Test of Early Numeracy (TEN): Administration and Scoring of AIMSweb Early Numeracy Measures for Use with AIMSweb Training Workbook Standard Early Numeracy Measures AIMSweb Web-based Software Training Video

186 Training Session Goals Learn how to administer and score through applied practice. Oral Counting Number Identification Quantity Discrimination Missing Number

187 Early Numeracy General Outcome Measures (EN-GOMs)

188 Listen and follow along—Oral Counting

189 Number Identification: Listen and follow along

190 Quantity Discrimination: Listen and follow along

191 Missing Number: Listen and follow along

192 Things You Need BEFORE Testing Three Major Tasks for efficient and accurate assessment: 1. Understanding the typical timeframe for administering specific tests; 2. Getting the necessary testing materials; and 3. Arranging the test environment.

193 Recommended AIMSweb Early Numeracy Assessment Schedule Page 13 in your manual!

194 Things to Do BEFORE Testing — Oral Counting Specific Materials Arranged: Examiner copy for scoring. Clipboard or other hard surface for recording student answers. Stopwatch. Tape recorder (optional) to aid in any scoring questions or for qualitative analysis. The student does not need any materials.

195 DURING Testing — Oral Counting Place examiner copy on clipboard so student cannot see. Say specific directions to the student: “When I say start I want you to start counting aloud from 1 like this 1, 2, 3 until I tell you to stop. If you come to a number you don’t know, I’ll tell it to you. Be sure to do your best counting. Are there any questions? Ready, Start.” Start your stopwatch. If the student fails to say “1” after 3 seconds, say “one” and continue. Follow along on the examiner copy. Score according to scoring rules. After one minute has expired, place a bracket after the last number said and say “Stop.” Reported as Correct Oral Counts/errors [COC/errors]

196 Things to do AFTER Testing — Oral Counting Score Immediately! Determine COC. Put a slash (/) through incorrectly counted numbers.

197 What is a Correct Oral Count? Any number named that comes next in the sequence. Repetitions and self-corrections are not counted as errors.

198 What is an Error? Hesitations of three seconds or more (provide the correct number — once only!). Omissions of a number in the sequence.
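These rules can be roughly translated into a scoring sketch. The snippet below is only an illustration under stated assumptions: the transcript lists the numbers the student actually said, in order; a number supplied by the examiner after a hesitation is left out of the transcript, so it scores as an omission error; and repetitions are simply skipped rather than scored, which is an assumption about how repetitions should be handled.

```python
def score_oral_counting(responses, start=1):
    """Score a transcribed Oral Counting sample (minimal sketch, not the AIMSweb tool)."""
    correct = 0
    errors = 0
    expected = start
    last_said = None
    for said in responses:
        if said == expected:
            correct += 1                 # correct oral count: the next number in the sequence
            expected = said + 1
        elif said > expected:
            errors += said - expected    # omitted numbers count as errors
            correct += 1                 # the number said still continues the sequence
            expected = said + 1
        elif said == last_said:
            pass                         # repetition: not scored (handling here is an assumption)
        else:
            errors += 1                  # out-of-sequence number
            expected = said + 1
        last_said = said
    return correct, errors

# Student said 1, 2, 3, 5, 6 (omitted 4): five correct oral counts, one error.
print(score_oral_counting([1, 2, 3, 5, 6]))  # (5, 1)
```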

199 Practice Exercise 1 — Oral Counting “When I say start I want you to start counting aloud from 1 like this 1, 2, 3 until I tell you to stop. If you come to a number you don’t know, I’ll tell it to you. Be sure to do your best counting. Are there any questions? Ready, start.”

200 Practice Exercise 1: Listen and score—Oral Counting

201 Practice Exercise 1 — Compare your results: Oral Counting

202 Practice Exercise 2 — Listen and Score: Oral Counting

203 Practice Exercise 2 — Compare your results: Oral Counting

204 Number Identification — Standard Directions Student copy in front of student Examiner copy on clipboard so student does not see Say these specific directions to the student: “Look at the paper in front of you. It has a number on it (demonstrate by pointing). What number is this?” CORRECT RESPONSE: “Good. The number is 8. Look at the number next to 8. (demonstrate by pointing). What number is this?” INCORRECT RESPONSE: “This number is 8 (point to 8). What number is this? Good. Let’s try another one. Look at the number next to 8. (demonstrate by pointing). What number is this?”

205 Number Identification — Standard Directions, continued Continue with these specific directions to the student: “The paper in front of you has numbers on it. When I say start, I want you to tell me what the numbers are. Start here and go across the page (demonstrate by pointing). If you come to a number you don’t know, I’ll tell you what to do. Are there any questions? Put your finger on the first one. Ready, start.” CORRECT RESPONSE: “Good. The number is 4.” (Turn the page). INCORRECT RESPONSE: “This number is 4 (point to 4). What number is this? Good.” (Turn the page).

206 Number Identification — Standard Directions, continued Start your stopwatch. If the student fails to answer the first problem after 3 seconds, tell the student to “try the next one.” If the student does not get any correct within the first 5 items, discontinue the task and record a score of zero. Follow along on the examiner copy. Put a slash (/) through any incorrect responses. The maximum time for each item is 3 seconds. If a student does not provide an answer within 3 seconds, tell the student to “try the next one.” At the end of 1 minute, place a bracket ( ] ) around the last item completed and say “Stop.” Reported as Correct Number Identifications/errors [CND/errors]

207 What is a Correct Number Identification? Any number the student correctly identifies.

208 What is an Error? — Number Identification Substitutions: the student states any number other than the item number. Hesitations of three seconds or more: tell the student to try the next one. Omissions or skips (if the student skips an entire row, count all items in that row as errors). If a student misses five items consecutively, discontinue testing.
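A minimal sketch of how these rules combine into a score is below. It assumes the examiner has already recorded, item by item, what the student said (a skipped or timed-out item recorded as None), and it applies the five-consecutive-misses discontinue rule. It is an illustration, not the AIMSweb scoring procedure itself.

```python
def score_number_identification(item_numbers, responses):
    """Score a Number Identification sample from recorded responses (sketch only)."""
    correct = 0
    errors = 0
    consecutive_misses = 0
    for item, said in zip(item_numbers, responses):
        if said == item:
            correct += 1
            consecutive_misses = 0
        else:
            errors += 1                  # substitution, hesitation, or omission
            consecutive_misses += 1
            if consecutive_misses == 5:  # discontinue rule
                break
    return correct, errors

# Hypothetical items and responses; None marks a 3-second hesitation or skip.
print(score_number_identification([8, 4, 12, 7, 3], [8, 4, None, 7, 3]))  # (4, 1)
```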

209 Practice Exercise 2 — Number Identification “The paper in front of you has numbers on it. When I say start, I want you to tell me what the numbers are. Start here and go across the page (demonstrate by pointing). If you come to a number you don’t know, I’ll tell you what to do. Are there any questions? Put your finger on the first one. Ready, start.”

210 Practice Exercise 3: Listen and Score--Number Identification

211 Practice Exercise 3 — Compare your results: Number Identification

212 Practice Exercise 4 — Listen and Score: Number Identification

213 Practice Exercise 4 — Compare your results: Number Identification

214 Quantity Discrimination — Standard Instructions Place the student copy in front of the student. Place the examiner copy on a clipboard and position it so the student cannot see what the examiner records. Say these specific directions to the student: “Look at the piece of paper in front of you. The box in front of you has two numbers in it (demonstrate by pointing). I want you to tell me the number that is bigger.” CORRECT RESPONSE: “Good. The bigger number is 7. Now look at this box (demonstrate by pointing). It has two numbers in it. Tell me the number that is bigger.” INCORRECT RESPONSE: “The bigger number is 7. You should have said 7 because 7 is bigger than 4. Now look at this box (demonstrate by pointing). It has two numbers in it. Tell me the number that is bigger.”

215 Quantity Discrimination — Standard Instructions, continued Continue with the following: “The paper in front of you has boxes on it. In the boxes are two numbers. When I say start, I want you to tell me the number in the box that is bigger. Start here and go across the page (demonstrate by pointing). If you come to a box and you don’t know which number is bigger, I’ll tell you what to do. Are there any questions? Put your finger on the first one. Ready, start.” CORRECT RESPONSE: “Good. The bigger number is 4.” (Turn the page.) INCORRECT RESPONSE: “The bigger number is 4. You should have said 4 because 4 is bigger than 2.” (Turn the page.) Start your stopwatch. If the student fails to answer the first problem after 3 seconds, tell the student to “Try the next one.”

216 Quantity Discrimination — Standard Instructions, continued If the student does not get any correct within the first 5 items, discontinue the task and record a score of zero. Follow along on the examiner copy. Put a slash (/) through any incorrect responses. The maximum time for each item is 3 seconds. If a student does not provide an answer within 3 seconds, tell the student to “try the next one.” At the end of 1 minute, place a bracket ( ] ) around the last item completed and say “Stop.” Reported as Quantity Discriminations/errors [QD/errors]

217 Scoring Rules Rule 1: If a student states the bigger number, score the item as correct. Rule 2: If the student states both numbers, score the item as incorrect. Rule 3: If the student states any number other than the bigger number, score the item as incorrect. Rule 4: If a student hesitates or struggles with an item for 3 seconds, tell the student to “try the next one.” Score the item as incorrect. Rule 5: If a student skips an item, score the item as incorrect. Rule 6: If a student skips an entire row, mark each item in the row as incorrect by drawing a line through the row on the examiner score sheet. Rule 7: If a student misses 5 items consecutively, discontinue testing.
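The decision logic in Rules 1-5 and 7 can be sketched as below; it assumes responses were recorded per item (None for a skip or a 3-second hesitation, and "said both numbers" recorded as something other than the bigger number), and that a skipped row (Rule 6) has already been marked as incorrect items in that record. It is an illustration, not the official scoring tool.

```python
def score_quantity_discrimination(item_pairs, responses):
    """Score Quantity Discrimination items against the 'say the bigger number' rule (sketch)."""
    correct = 0
    errors = 0
    consecutive_misses = 0
    for (a, b), said in zip(item_pairs, responses):
        if said == max(a, b):            # Rule 1: the bigger number is correct
            correct += 1
            consecutive_misses = 0
        else:                            # Rules 2-5: any other response is incorrect
            errors += 1
            consecutive_misses += 1
            if consecutive_misses == 5:  # Rule 7: discontinue after 5 misses in a row
                break
    return correct, errors

# Hypothetical item pairs and responses.
print(score_quantity_discrimination([(4, 7), (2, 4), (9, 6)], [7, 4, 6]))  # (2, 1)
```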

218 Practice Exercise 5 — Let’s Score!

219 Practice Exercise 5 — Compare your results

220 Practice Exercise 6 — Let’s Score!

221 Practice Exercise 6 — Compare your results

222 Missing Number — Standard Instructions Place the student copy in front of the student. Place the examiner copy on a clipboard and position it so the student cannot see what the examiner records. Say these specific directions to the student: “The box in front of you has two numbers in it (point to first box). I want you to tell me the number that goes in the blank. What number goes in the blank?” CORRECT RESPONSE: “Good. 1 is the number that goes in the blank. Let’s try another one (point to second box). What number goes in the blank?” INCORRECT RESPONSE: “The number that goes in the blank is 1. See 1, 2, 3 (demonstrate by pointing). 1 goes in the blank. Let’s try another one (point to second box). What number goes in the blank?”

223 Missing Number — Standard Instructions, continued Continue with the following: “The piece of paper in front of you has boxes with numbers in them. When I say start you are going to tell me the number that goes in the blank for each box. Start with the first box and go across the row (demonstrate by pointing). Then go to the next row. If you come to one you don’t know, I’ll tell you what to do. Are there any questions? Put your finger on the first one. Ready, start.” CORRECT RESPONSE: “Good. 7 is the number that goes in the blank.” (Turn the page.) INCORRECT RESPONSE: “The number that goes in the blank is 7. See 5, 6, 7 (demonstrate by pointing). 7 goes in the blank.” (Turn the page.) Start your stopwatch. If the student fails to answer the first problem after 3 seconds, tell the student to “try the next one.”

224 Missing Number — Standard Instructions, continued If the student does not get any correct within the first 5 items, discontinue the task and record a score of zero. Follow along on the examiner copy. Put a slash (/) through any incorrect responses. The maximum time for each item is 3 seconds. If a student does not provide an answer within 3 seconds, tell the student to “try the next one.” At the end of 1 minute, place a bracket ( ] ) around the last item completed and say “Stop.” Reported as Missing Numbers/errors [MN/errors]

225 Scoring Rules Rule 1: If a student correctly states the missing number, score the item as correct. Rule 2: If a student states the missing number incorrectly, score the item as incorrect by placing a slash through the number on the examiner score sheet. Rule 3: If a student hesitates or struggles with an item for 3 seconds, tell the student to “try the next one” (demonstrate by pointing). Score the item as incorrect. Rule 4: If a student skips an item, score the item as incorrect. Rule 5: If a student skips an entire row, mark each item in the row as incorrect by drawing a line through the row on the examiner score sheet. Rule 6: If a student misses 5 items consecutively, discontinue testing.
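Determining the correct answer for a Missing Number item can itself be sketched in code; the examiner still applies the scoring rules above to the student's response. The snippet below assumes each item is a constant-step counting sequence (for example, counting by 1s, 5s, or 10s) with exactly one blank; actual probe construction may differ, so treat this purely as an illustration.

```python
def expected_missing_number(sequence):
    """Work out the number that belongs in the blank of a Missing Number item.

    Assumption: the item is a constant-step counting sequence with one blank (None).
    """
    known = [(i, n) for i, n in enumerate(sequence) if n is not None]
    (i1, n1), (i2, n2) = known[0], known[1]
    step = (n2 - n1) // (i2 - i1)           # common difference between positions
    blank_index = sequence.index(None)
    return n1 + step * (blank_index - i1)

print(expected_missing_number([None, 2, 3]))    # 1
print(expected_missing_number([5, 6, None]))    # 7
print(expected_missing_number([10, None, 30]))  # 20 (counting by 10s)
```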

226 Practice Exercise 7 — Let’s Score!

227 Practice Exercise 7 — Compare your results

228 Practice Exercise 8 — Missing Number

229 Practice Exercise 8 — Compare your results: Missing Number

230 Quantitative Features Checklist After the student completes the EN-GOM measures, judge the degree to which the student exhibits the early numeracy skills listed on the checklist.

231 Accuracy of Implementation Rating Scales (AIRS) One for each type of early numeracy probe. Find them in the Appendix of the administration and scoring manual.

232 Summary You now have the building blocks to begin TEN assessment. Practice to Automaticity — You’ll become more proficient. Get Checked Out with AIRS for accuracy/efficiency by a colleague. Stay in Tune by periodically checking AIRS.

233 The End

235 Progress Monitoring Strategies for Writing Individual Goals in General Curriculum and More Frequent Formative Evaluation Mark Shinn, Ph.D. Lisa A. Langell, M.A., S.Psy.S.

236 Big Ideas About Frequent Formative Evaluation Using General Outcome Measures and the Progress Monitoring Program One of the most powerful interventions that schools can use is systematic and frequent formative evaluation. Benchmark Assessment is not enough for some students because they may remain in ineffective programs too long (3 months or more). The solution is to write individualized goals and determine a feasible progress monitoring schedule. The core of frequent progress monitoring is: 1. Survey-Level Assessment; 2. Goal setting using logical educational practices; 3. Analysis of student need and resources for determining progress monitoring frequency.

237 Formative Assessment Formative Assessment: Process of assessing student achievement during instruction to determine whether an instructional program is effective for individual students. When students are progressing, keep using your instructional programs. When tests show that students are not progressing, you can change your instructional programs in meaningful ways. Has been linked to important gains in student achievement (L. Fuchs, 1986), with effect sizes of .7 and greater.

238 Systematic formative evaluation requires the use of: Standard assessment tools… 1. That are of the same difficulty, and 2. That are given the same way each time.

239 More Severe Achievement Problems and/or More Resource Intensive Programs Require More Frequent Formative Evaluation Benchmark Testing (3-4 times per year) is not enough for some students.

240 With Very Low Performers, Not Satisfactory to Wait This Long!

241 Programs That Are More Resource Intensive… Title I, English Language Learning, and Special Education should monitor student outcomes more frequently than the Benchmark Testing schedule.

242 Formative Evaluation of Vital Signs Requires Quality Tools Technical adequacy (reliability and validity); Capacity to model growth (able to represent student achievement growth within and across academic years); Treatment sensitivity (scores should change when students are learning); Independence from specific instructional techniques (instructionally eclectic so the system can be used with any type of instruction or curriculum); Capacity to inform teaching (should provide information to help teachers improve instruction); Feasibility (must be doable).

243 Thinking About A Student’s Data Sample Student: Melissa Smart, 3rd grade student Progress Monitor

244 Melissa Smart: sample progress monitoring data (chart)

245 Formative Evaluation—Is simply data enough?

246 Formative Evaluation: Is data and a goal enough?

247 Formative Evaluation: Are data, goals & trends enough?

248 Formative Evaluation is Impossible without all data: Goals Make Progress Decisions Easier

249 Current Goal Setting Practices Are Unsatisfying! Do you like these IEPs? I do not like these IEPs I do not like them Jeeze Louise We test, we check We plan, we meet But nothing ever seems complete. Would you, could you Like the form? I do not like the form I see Not page 1, not 2, not 3 Another change A brand new box I think we all Have lost our rocks!

250 Need Shift to Few But Important Goals Often Ineffective Goal Smorgasbord! Student will perform spelling skills at a high 3rd grade level. Student will alphabetize words by the second letter with 80% accuracy. Student will read words from the Dolch Word List with 80% accuracy. Student will master basic multiplication facts with 80% accuracy. Student will increase reading skills by progressing through Scribner with 90% accuracy as determined by teacher-made fluency and comprehension probes by October 2006. To increase reading ability by 6 months to 1 year as measured by the Woodcock Johnson. Student will make one year's growth in reading by October 2006 as measured by the Brigance. Student will be a better reader. Student will read aloud with 80% accuracy and 80% comprehension. Student will make one year's gain in general reading from K-3. Students will read 1 story per week.

251 Improving the Process of Setting Goals for Formative Evaluation Set a few, but important goals. Ensure goals are measurable and linked to validated formative evaluation practices. Base goal setting on logical educational practices.

252 Reduce the Number of Goals to a Few Critical Indicators Reading: In (#) weeks, (Student name) will read (#) Words Correctly in 1 minute from randomly selected Grade (#) passages. Spelling: In (#) weeks, (Student name) will write (#) Correct Letter Sequences and (#) Correct Words in 2 minutes from randomly selected Grade (#) spelling lists. Math Computation: In (#) weeks, (Student name) will write (#) Correct Digits in 2 minutes from randomly selected Grade (#) math problems. Written Expression: In (#) weeks, (Student name) will write (#) Total Words and (#) Correct Writing Sequences when presented with randomly selected Grade (#) story starters.
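Each template is just a fill-in-the-blank sentence, so turning one into a measurable goal statement is mechanical. The sketch below fills the Reading template with hypothetical values; the student name, target, and grade here are illustrative, not taken from the deck.

```python
def reading_goal(weeks: int, student_name: str, wrc_target: int, grade: int) -> str:
    """Fill the Reading goal template with concrete, measurable values."""
    return (f"In {weeks} weeks {student_name} will read {wrc_target} Words Correctly "
            f"in 1 minute from randomly selected Grade {grade} passages.")

# Hypothetical example values.
print(reading_goal(32, "John", 94, 3))
```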

253 Ensure the Goals are Measurable and Linked to Validated Formative Evaluation Practices Goals should be based on quality tests like CBM. Based on validated practices such as how often, how many samples, etc.

254 Conducting a Survey Level Assessment Students are tested in successive levels of general curriculum, beginning with their current expected grade placement, until a level at which they are successful is determined.

255 Conducting a Survey Level Assessment John, 5th grader: Grade 5 passage 26/12; Grade 4 passage 49/7; Grade 3 passage 62/4 (words read correctly/errors).
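One way to think about the SLA logic is as a search for the highest grade level at which the student's score meets a success criterion. The sketch below runs that search on John's numbers from the slide; the criterion values are placeholders, not AIMSweb norms, so treat them as assumptions.

```python
def survey_level_success(sla_results, criteria):
    """Return the highest grade level at which the student reads successfully (sketch).

    `sla_results` maps grade level -> (words read correctly, errors);
    `criteria` maps grade level -> the WRC score treated as "successful" at that level.
    """
    successful = [
        grade for grade, (wrc, _errors) in sla_results.items()
        if wrc >= criteria[grade]
    ]
    return max(successful) if successful else None

john = {5: (26, 12), 4: (49, 7), 3: (62, 4)}             # John's SLA data from the slide
placeholder_criteria = {5: 100, 4: 90, 3: 60}            # hypothetical success thresholds
print(survey_level_success(john, placeholder_criteria))  # 3 -> Grade 3 is his success level
```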

256 Base Goal Setting on Logical Educational Practices Example of PLEP statement: John currently reads about 26 words correctly from Grade 5 Standard Reading Assessment Passages. He reads Grade 3 reading passages successfully: 62 correct words per minute with 4 errors, which is how well beginning 3rd grade students read this material.

257 Goal Setting Strategies Current Performance Information based on Survey-Level Assessment (SLA). Know the Time Frame for the Goal. Determine a Future Performance Level.

258 Setting the Time Frame, Goal Level Material, and Criterion Time Frame: End of Year (At Risk or Grade-Level Expectations): In 18 Weeks… Annual IEP Goals (Special Education): In 1 year… (or) In 32 Weeks…

259 Setting the Goal Material Logical Task: Matching or Not Matching Expected Grade Placement. Title I Fourth Grader: Grade 4 Material? Grade 4 Special Education Student: Grade 4 Material?

260 When Grade-Level Expectations Are Not Appropriate Consider the Severity of the Discrepancy Consider the Intensity of the Program

261 Determining the Criterion for Success: Options to use 1. Local Benchmark Standards. 2. Linkage to High Stakes Tests. 3. Normative Growth Rates. 4. Developing Your Own Sample of Standards.

262 1. Benchmark Standards Advantages and Disadvantages Advantages: Easily Understood; Can Indicate When a Student No Longer Needs Specialized Instruction; Determining Least Restrictive Environment (LRE). Disadvantages: Uncomfortable, Especially in Low Achievement Environments; Issues of “Equity.”

263 2. Linkage to High Stakes Standards Advantages and Disadvantages Advantages: Reduces Problems of Equity when Local Achievement is Low; Increases Usefulness of High Stakes Tests; Helps Specify Long-Term Targets (What a Grade 2 Student Needs to Read to Be Successful on the Grade 6 Test). Disadvantages: Need Linkage to a High Quality High Stakes Test; Linkage Must Be Established Empirically; Adoption of the Assumption that Attainment of the Target Guarantees Passing the High Stakes Test.

264 Normative Growth Rates Criterion for Success = Score on SLA + (Grade Growth Rate × # of Weeks). Example: Score on SLA (30) + (Ambitious Grade Growth Rate (2.0) × # of Weeks (32)) = 30 + (2.0 × 32) = 30 + 64 = Annual goal of 94 WRC.
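The arithmetic on this slide is straightforward to sketch; the example values (a starting SLA score of 30 WRC, ambitious growth of 2.0 WRC per week, 32 weeks) come from the slide itself.

```python
def annual_goal(sla_score: float, weekly_growth_rate: float, weeks: int) -> float:
    """Criterion for Success = Score on SLA + (Growth Rate x Number of Weeks)."""
    return sla_score + weekly_growth_rate * weeks

# The slide's example: SLA score of 30 WRC, ambitious growth of 2.0 WRC/week, 32 weeks.
print(annual_goal(30, 2.0, 32))  # 94.0 -> annual goal of 94 words read correctly
```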

265 3. Growth Rate Standards Advantages and Disadvantages Advantages: Easily Understood. Disadvantages: May Underestimate What Can Be Attained with High Quality Instruction.

266 Developing Your Own Sample of Standards Developing a Small Local Norm

267 Benchmark Standards Advantages and Disadvantages Advantages: Same Advantages as Benchmark Standards. Disadvantages: Same Disadvantages as Benchmark Standards; Small Sample Size.

268 How Frequently to Assess? Balancing IDEAL with FEASIBLE

269 Making Data-Based Decisions With Progress Monitor Need at LEAST 4-7 data points before making a programming decision, and you may sometimes want to collect more if you are uncertain. Err on the side of caution. Criteria To Consider: Trendline meets the aimline for the ultimate goal: consider a return to LRE. Trendline and aimline will intersect in the relatively near future? Keep the current intervention until the goal is reached. Trendline exceeds the aimline? Consider increasing the goal or the difficulty level. Trendline is not going to intersect the aimline, or moves in the opposite direction: consider adding an additional intervention, changing a variable, and/or intensifying program changes (LRE).
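One common way to operationalize the trendline-versus-aimline comparison is to fit an ordinary least-squares line through the student's scores and compare its slope with the aimline's slope. The sketch below illustrates that idea under assumptions (one score per week, a simple slope comparison); it is not the Progress Monitor's own decision algorithm, which also weighs the criteria listed above.

```python
def trend_slope(scores):
    """Ordinary least-squares slope of weekly progress-monitoring scores (sketch)."""
    n = len(scores)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, scores))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

def compare_to_aimline(scores, start_score, goal_score, total_weeks):
    """Compare the student's trend against the aimline (goal line) slope."""
    aimline_slope = (goal_score - start_score) / total_weeks
    student_slope = trend_slope(scores)
    if student_slope >= aimline_slope:
        return "on track or exceeding the aimline: keep or consider raising the goal"
    return "below the aimline: consider changing or intensifying the intervention"

# Hypothetical data: six weekly WRC scores toward a 32-week goal of 94 from a start of 30.
print(compare_to_aimline([30, 31, 33, 32, 35, 36], 30, 94, 32))
```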

285 The End

