Overview of Curriculum-Based Measurement (CBM) and AIMSweb®

Presentation transcript:

Overview of Curriculum-Based Measurement (CBM) and AIMSweb® Mark R. Shinn, Ph.D. Michelle M. Shinn, Ph.D. Lisa A. Langell, M.A., S.Psy.S.

Evaluations to Inform Teaching— Summative & Formative Assessment Summative Assessment: A culminating measure of mastery, given after instruction. Pass/fail-type assessments that summarize the knowledge students have learned. Typical summative assessments include: end-of-chapter tests; high-stakes tests (e.g., state assessments); the GRE, ACT, SAT, GMAT, etc.; driver's license tests; final exams. Formative Assessment: The process of assessing student achievement frequently during instruction to determine whether an instructional program is effective for individual students. It informs instruction: when students are progressing, continue using your instructional programs; when tests show that students are not progressing, you can change your instructional programs in meaningful ways.

Summative & Formative Assessment Summative Assessment: Characterized as assessment of learning. Formative Assessment: Characterized as assessment for learning. (Citation: http://en.wikipedia.org/wiki/Summative_assessment) Summative assessment tells you what happened. Formative assessment tells you what’s happening.

Evaluations to Inform Teaching— Diagnostic Assessment Diagnostic Assessments: Measures that indicate specific skill strengths and those areas needing improvement. Results may indicate skill areas needing intervention/instruction. Programming may then address students’ needs. Examples: Criterion-referenced assessments Cognitive assessments Rating scales Norm-referenced, standardized assessments Tests may be based on the assessment of cognitive skills, academic skills, behavior, health, social-emotional wellbeing, etc.

So, Where Does Curriculum-Based Measurement (CBM) fit? Summative? Formative? Diagnostic?

What is Curriculum-Based Measurement (CBM)? CBM is a form of Curriculum-Based Assessment (CBA). CBM is a method of monitoring student progress through direct, continuous assessment of basic skills. CBM is used to assess skills such as reading, spelling, mathematics, and written language. CBM probes require about 1 to 4 minutes to complete, depending on the skill being measured. Student performance is scored for speed and accuracy to determine proficiency. Because CBM probes are quick to administer and simple to score, they can be given frequently to provide continuous progress data. The results are charted to provide for timely evaluation based on hard data.
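The speed-and-accuracy scoring described above can be sketched in a few lines. This is a minimal illustration only: the ProbeResult helper and its values are hypothetical, not part of AIMSweb.

```python
from dataclasses import dataclass

@dataclass
class ProbeResult:
    """One timed CBM probe (hypothetical helper, not AIMSweb's API)."""
    words_attempted: int
    errors: int
    seconds: int  # probe length; 60 for a 1-minute oral reading probe

    @property
    def words_correct(self) -> int:
        return self.words_attempted - self.errors

    @property
    def wcpm(self) -> float:
        # Words correct per minute: the standard oral reading fluency score
        return self.words_correct * 60 / self.seconds

    @property
    def accuracy(self) -> float:
        return self.words_correct / self.words_attempted

probe = ProbeResult(words_attempted=87, errors=8, seconds=60)
print(probe.wcpm)                # -> 79.0
print(round(probe.accuracy, 2))  # -> 0.91
```

Because both speed (words correct per minute) and accuracy are kept, a teacher can distinguish a slow-but-accurate reader from a fast-but-careless one.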

Origins of CBM CBM was initially developed more than 20 years ago by Stanley Deno and others at the University of Minnesota Institute for Research on Learning Disabilities to develop a reliable and valid measurement system for evaluating basic skills growth. CBM is supported by 30 years of school-based research and is endorsed by the United States Department of Education as a method for assessing student progress. Starting in the area of reading, researchers have expanded to investigate additional academic areas over the years, including in-depth research and ultimately the publication of additional measures in literacy, mathematics, and written language. Supporting documentation can be found in hundreds of articles, chapters, and books within the professional literature describing the use of CBM to make a variety of important educational decisions.

Advantages of CBM Direct measure of student performance (Deno, S. L. (1985). Curriculum-based measurement: The emerging alternative. Exceptional Children, 52(3), 219–232.) Correlates strongly with “best practices” for instruction and assessment Correlates strongly with research-supported methods for assessment and intervention Focus is on repeated measures of performance (this cannot be done with most norm-referenced and standardized tests due to practice effects or a limited number of forms)

Common Characteristics of General Outcome Measures CBM involves the same kind of evaluation technology as other professions: powerful measures that are simple, accurate, and efficient indicators of performance that guide and inform a variety of decisions; a generalizable “thermometer” that allows for reliable, valid cross-comparisons of data.

General Outcome Measures (GOMs) from Other Fields Medicine measures height, weight, temperature, and/or blood pressure. Department of Labor measures the Consumer Price Index. Wall Street measures the Dow-Jones Industrial Average. Companies report earnings per share. McDonald’s® measures how many hamburgers they sell.

CBM is Used for Scientific Reasons Based on Evidence Reliable and valid indicator of student achievement Simple, efficient, and of short duration to facilitate frequent administration by teachers Provides assessment information that helps teachers plan better instruction Sensitive to improvement of students’ achievement over time Easily understood by teachers and parents Improves achievement when used to monitor progress

Things to Always Remember About CBM CBMs are designed to serve as “indicators” of general reading achievement: CBM probes don’t measure everything, but they measure the important things. They are standardized tests, to be given, scored, and interpreted in a standard way, and they have been researched with respect to psychometric properties to ensure accurate measures of learning.

Items to Remember (continued) Are sensitive to improvement in brief intervals of time. Also can tell us how students earned their scores (offers an opportunity to gather qualitative information). Designed to be as short as possible (2–4 minutes) to ensure “do-ability.” Are linked to decision making for promoting positive achievement and problem solving.

What is AIMSweb®? AIMSweb® uses CBM. AIMSweb® is a scientifically based, formative assessment system that 'informs' the teaching and learning process. Educators only have control over the quality of the instruction and the fidelity with which it is implemented. Take care not to automatically assume that a lack of progress is always due to a lack of effectiveness of the educational programming/intervention: there may be times when a student’s lack of progress is attributable to other causes, e.g., transient internal or familial stressors, physical illness, or poor attendance. Programming may then be adjusted to include such things as attendance plans, behavior plans, or supports for mental health / physical health concerns. AIMSweb® provides continuous student performance data, reports improvement to parents, teachers, and administrators, and enables evidence-based evaluation and data-driven instruction.

Three-Tiered Assessment Model Tier 1 (BENCHMARK): Universal screening for all students. Tier 2 (STRATEGIC MONITOR): Monthly monitoring for students at mild to moderate risk for failure. Tier 3 (PROGRESS MONITOR): Intensive monitoring toward specific goals for students at significant risk for failure.

Three-Tiered Assessment Model: Benchmark Tier 1 (BENCHMARK): Universal screening for all students. Tier 2 (STRATEGIC MONITOR): Monthly monitoring for students who are at mild to moderate risk for failure. Tier 3 (PROGRESS MONITOR): Intensive monitoring toward specific goals for students at risk for failure.

BENCHMARK: Universal Screening Identification of students at risk for failure. Program evaluation across all students. AIMSweb® assessments take 1–4 minutes to complete. Some AIMSweb® assessments are individually administered, while others may be done as a group.

Measures Currently Available via AIMSweb®:
Early Literacy (K–1 benchmark, Progress Monitor (PM) any age): Letter Naming Fluency, Letter Sound Fluency, Phonemic Segmentation Fluency, Nonsense Word Fluency
Early Numeracy (K–1 benchmark, PM any age): Oral Counting, Number Identification, Quantity Discrimination, Missing Number
Oral Reading (K–8, PM any age)
MAZE (reading comprehension) (1–8, PM any age)
Math Computation (1–6, PM any age)
Math Facts (PM any age)
Spelling (1–8, PM any age)
Written Expression (1–8, PM any age)
All students in an academic curriculum are “benchmarked” three times per year across any/all of these assessment areas.

Optional R-CBM Activity: Learning the Process of Benchmark Data Collection and Analysis: Short Case Studies

Data: Get the MEDIAN score for the student’s 3 passages (correct words / errors, 1 minute each): 67/2, 85/8, 74/9. Why use the median vs. the average? Averages are susceptible to outliers when dealing with small number sets; the median score is a statistically more reliable number than the average for R-CBM.

Data: Get the MEDIAN score for the student’s 3 passages (correct words / errors, 1 minute each): 67/2, 85/8, 74/9. MEDIAN SCORE: 74/8 (report in AIMSweb®). NOTE: R-CBM is the only measure for which the median score is calculated. Why use the median vs. the average? Averages are susceptible to outliers when dealing with small number sets; the median score is a statistically more reliable number than the average for R-CBM.
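The median calculation above can be checked directly. Note that the words-read-correct and error medians are taken independently of each other, which is why the reported pair (74/8) need not come from any single passage:

```python
from statistics import median

# Three 1-minute passage scores, recorded as (words read correct, errors)
passages = [(67, 2), (85, 8), (74, 9)]

# WRC and errors are medianed independently of one another
median_wrc = median(wrc for wrc, _ in passages)
median_errors = median(err for _, err in passages)

print(f"{median_wrc}/{median_errors}")  # -> 74/8
```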

BENCHMARK DATA ACTIVITY Your Certified AIMSweb® Trainer (CAT) will assign you a number from 1 to 4. Open up your workbook and calculate the median score for the student (number) you were assigned. WRC = Words Read Correct; ERRORS = Number of Errors. WRC/Errors example: 64/2. Wait for all trainees to finish the exercise; your trainer will confirm accuracy momentarily.

(Screenshots of the four sample score sheets for the activity; scores shown: 54/2, 98/9, 79/12, 48/12. All identifying information and scores are fictitious.)

Easy Score Entry Once data are collected, they are easily entered into AIMSweb®’s web-based software. Simply type in the scores! (pictured at left) Dozens of reports are then instantly available. (Estimated time: 3–5 minutes.) (All identifying information and scores are fictitious.)

AIMSweb as a Program Evaluation Tool: Benchmark 3x/year for Universal Screening—All Kids (All identifying information and scores are fictitious.)

Michael Martin: A student whose benchmark data indicate he is performing significantly behind peers at his school.

Fall Benchmark Data for Michael Martin Martin, Michael: Grade 5 (All identifying information and scores are fictitious.)

Box & Whiskers Graphs (box plots): A 3-Step Explanation AIMSweb commonly uses box plots to report data. AIMSweb’s box plots are somewhat similar in shape and representation to a vertical bell curve. Reading from top to bottom, the regions are: above the 90th percentile (above average range); 90th percentile; 75th percentile; median (50th percentile); 25th percentile; 10th percentile; below the 10th percentile (below average range). The box spans the average range of the population included in the sample (the middle 50%). A target line and the individual student’s score (here, Michael Martin’s) are plotted against these regions. All percentiles are in relation to a user-defined comparison group.
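The regions of the box plot can be read as a simple classification rule. A minimal sketch, assuming hypothetical Grade 5 cut points (the AIMSweb report derives these from the user-defined comparison group), with my own labels for the two in-between zones:

```python
def performance_band(score: float, p10: float, p25: float,
                     p75: float, p90: float) -> str:
    """Map a score onto the box plot's five regions.
    Cut points come from the user-defined comparison group."""
    if score > p90:
        return "above average"
    if score > p75:
        return "high average"   # between the box top and the 90th percentile
    if score >= p25:
        return "average"        # inside the box: the middle 50%
    if score >= p10:
        return "low average"    # between the 10th percentile and the box bottom
    return "below average"

# Hypothetical Grade 5 fall cut points and a struggling reader's score
print(performance_band(41, p10=62, p25=78, p75=118, p90=139))  # -> below average
```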

AIMSweb® “Box & Whiskers Graphs” / Comparison Groups: Michael Martin AIMSweb Comparison Group Choices/Options: All students in grade-level at student’s school (pictured left) All students in grade-level across student’s school district All students in grade-level across multiple districts within account All students in grade-level within student’s state* All students in grade-level nationally (Aggregate Norms)* (Comparison group includes all students for whom data are reported using AIMSweb® web-based software) (All identifying information and scores are fictitious.)

Discussion: Consider Michael’s R-CBM performance in relation to different AIMSweb® comparison groups: Grade 5 at Michael’s school; Grade 5 across Michael’s district; Grade 5 national aggregate norms. (All identifying information and scores are fictitious.)

Discussion: SAMPLE STUDENT Grade 5: Luis’s School Grade 5: Luis’s District Grade 5: National Aggregate Norms

AIMSweb National Aggregate Norm Table

AIMSweb Aggregate Norm Table

AIMSweb District vs. Aggregate Norm Table Comparison

Grade 5: Michael’s School; Grade 5: Michael’s District; Grade 5: National Aggregate Norms. Michael appears to be performing below expectations when compared to all three comparison groups. Consider modifying instructional program(s); consider increasing the frequency of assessment to assess the efficacy of alternate program(s); review data regularly to assess progress. (All identifying information and scores are fictitious.)

An Introductory Look at Additional Benchmark Data

Individual Report: 3rd Grade Student QUESTIONS: What does the report suggest about Viviana’s progress? What does the report suggest about the school’s progress for its 3rd grade students? What if you saw this pattern in only one school’s 3rd grade within your district? What if you saw this pattern across most or all 3rd grade groups in your district?

Data to Know When Things are Working QUESTIONS: What does the report suggest about Jamie Connor’s progress? What does the report suggest about the school’s progress for 3rd grade students? What if you saw this pattern in only one school’s 3rd grade within your district? What if you saw this pattern across most or all 3rd grade groups in your district?

Data to Know that Things Went Well QUESTIONS: What does the report suggest about Heather A’s progress? SPRING: Compared to Grade 3 peers at her school, is Heather performing in the: Well Above Average Range? Above Average Range? Average Range? Below Average Range? Well Below Average Range?

Have Data to Know When Things Need Changing QUESTIONS: What does the report suggest about U. Boardman’s progress? What are possible reasons why U. Boardman might not be making progress? What might happen if nothing is done to address U. Boardman’s needs? Without this type of visual data, collected at each benchmark period, do you believe U. Boardman’s stagnation would have been quickly noticed otherwise?

Data to Know that Changes Made a Difference QUESTIONS: What does the report suggest about U. Boardman’s progress by spring? What are possible reasons why U. Boardman might be making progress? What does this report suggest about the progress Grade 3 students made from winter to spring? Could program changes that impacted U. Boardman simultaneously have a positive impact on the whole class? How could this report be used for parent conferences? Grade-level team meetings? Other ideas?

Identifying Students At Risk for Failure QUESTIONS: What does the report suggest about Lindsey’s progress (spring)? What are possible reasons why Lindsey might not be making progress? Based on this report, is Lindsey’s instruction helping her close the performance discrepancy between her and her school’s Grade 5 peers? How would this report be helpful if Lindsey were not currently receiving Tier 2 support? Tier 3? Special Education / special programs? How would this report still be helpful if Lindsey were already receiving Special Education?

Three-Tiered Assessment Model: Strategic Monitor Tier 1 (BENCHMARK): Universal screening. Tier 2 (STRATEGIC MONITOR): Monthly monitoring for students who are questionable or of concern. Tier 3 (PROGRESS MONITOR): Intensive monitoring toward specific goals for at-risk students.

AIMSweb as a Program Evaluation Tool: Schools may strategically monitor monthly for students at mild to moderate risk. (All identifying information and scores are fictitious.)

Tier 2: Strategic Monitor (1x/month) Provides the option to increase assessment frequency from three times per year to once per month for select students. Example of a Strategic Monitor report containing monthly data collected over a full school year. (All identifying information and scores are fictitious.)

Strategic Monitoring: October (Lindsey Hunter) (All identifying information and scores are fictitious.)

Strategic Monitoring: November (Lindsey Hunter) (All identifying information and scores are fictitious.)

TIER 3: Progress Monitor Intensive monitoring of individualized goals for students at risk of failure.

Progress Monitor (Tier 3): Intensive assessment with adjustable frequency that matches need. (All identifying information and scores are fictitious.)

Benchmark (Tier 1): 3x per year. Strategic Monitoring (Tier 2): 1x per month for select students at risk for educational difficulties. Progress Monitor (Tier 3): Intensive assessment with adjustable frequency that matches need. (All identifying information and scores are fictitious.)

At-a-Glance Views of Student Ranking & Growth

At-a-Glance Views of Student Ranking & Growth Follow student progress over time. Sort by Service Code to enhance the ability to conduct differentiated instruction, track progress by group type, and assess improvement.

Compare Sub-group Trends: Measure growth of: General Education Title 1 Special Education ELL/ESL Meal Status groups Compare with your custom-set targets View weekly growth rates by group type
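Weekly growth rates like those in the group reports are typically computed as the slope of a least-squares line through a student's scores over time. A minimal sketch with hypothetical weekly words-read-correct scores (the function name and data are illustrative, not AIMSweb's API):

```python
def weekly_growth_rate(weeks: list[float], scores: list[float]) -> float:
    """Ordinary least-squares slope: score points gained per week."""
    n = len(weeks)
    mean_w = sum(weeks) / n
    mean_s = sum(scores) / n
    cov = sum((w - mean_w) * (s - mean_s) for w, s in zip(weeks, scores))
    var = sum((w - mean_w) ** 2 for w in weeks)
    return cov / var

# Hypothetical words-read-correct scores over six weekly probes
weeks = [1, 2, 3, 4, 5, 6]
scores = [42, 44, 47, 49, 52, 55]
print(round(weekly_growth_rate(weeks, scores), 2))  # -> 2.6
```

Fitting a line rather than subtracting the first score from the last makes the rate less sensitive to a single unusually good or bad probe, the same reasoning behind using the median across three R-CBM passages.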

Compare a School to a Composite

Many More Reporting Options Available

Finally… AIMSweb®’s progress monitoring and data reporting system involves testing with simple, RESEARCHED general outcome measures. It provides an ONGOING database that reports progress and feedback to teachers, administrators, and parents, enabling everyone to make decisions about the growth and development of students’ basic skills. Your data, via AIMSweb®, are professionally managed by staff in a process that communicates that YOU are in charge of student learning.

The End