Curriculum-based Measures: Math

Presentation transcript:

Curriculum-based Measures: Math. Kat Nelson, M.Ed., University of Utah

Objectives: You will be able to define CBM and articulate the big ideas of using math CBM with the CCSS and the MTSS model. You will be able to administer and score screening and progress monitoring probes. You will be able to use the problem-solving process to interpret the data produced by math CBM.

CBM: Big Ideas (Kelly, Hosp, & Howell, 2008). "CBM is a quick and reliable method for gathering information about student performance and progress." CBM is aligned with the curriculum; it is valid, reliable, and standardized; and it provides low-inference information.

CBM: Big Ideas (Kelly, Hosp, & Howell, 2008). CBM probes are repeated measures that are efficient and sensitive to growth; sensitivity to growth means the data can inform your instruction frequently. Information about performance and growth can be easily shared with stakeholders, and scores serve as an indicator of future reading and math achievement.

Curriculum-Based Measurement and the Common Core State Standards: Big Ideas

Common Core & CBM (Shinn, 2012). The Common Core State Standards (CCSS) provide sets of college- and career-focused outcomes and annual criterion-referenced tests to measure student learning as a summative evaluation. The assessment implications of the CCSS are therefore clearly tied to summative evaluation and accountability. No single test is sufficient for all of the data-based decisions schools make (screening, intervention planning/diagnosis, progress monitoring, and accountability/program evaluation) in their attempts to identify student learning needs.

Common Core & CBM (Shinn, 2012). Assessment of the CCSS need not rely on separate items or tests for each standard; it may include "rich tasks" that address a number of separate standards. AIMSweb's Curriculum-Based Measurement (CBM) tests are typically based on such rich tasks, validated as "vital signs" or "indicators" of general basic skill outcomes.

Common Core & CBM (Shinn, 2012). AIMSweb's CBM tests are consistent with the CCSS and are content valid. They complement the assessments required to demonstrate proficiency on the CCSS.

Curriculum-Based Measurement and Multi-Tiered System of Support: Big Ideas

Multi-Tiered System of Support. Schools identify students at risk for poor learning outcomes, monitor student progress, and provide evidence-based interventions, adjusting the intensity and nature of those interventions depending on a student's responsiveness (NCRtI, 2010).

Key Features of MTSS (Sugai, 2008)
• Universal Design
• Data-based decision making and problem solving
• Continuous progress monitoring
• Focus on successful student outcomes
• Continuum of evidence-based interventions
• A core curriculum is provided for all students
• A modification of this core is arranged for students who are identified as non-responsive
• A specialized and intensive curriculum for students with intensive needs
• Focus on fidelity of implementation

Problem Solving Process

Using CBM within MTSS
Tier 1: Universal screening. Benchmarks are established three times throughout the school year.
Tier 2: Progress monitoring. Students at risk are monitored by assessing monthly.
Tier 3: Intensive progress monitoring. Frequent assessment for students at risk or with significant needs.

Conducting a Math CBM: Directions and Scoring Procedures

Selecting the Measure
Kindergarten or Grade 1: Oral Counting, Quantity Array, Number Identification, Quantity Discrimination, Missing Number
Grades 1-8: Computation (Mixed and/or Facts), Concepts & Applications
As appropriate (Grade 9?): Algebra

Early Numeracy Measures: Let's Take a Look

Concepts and Applications (M-CAP): Let's Take a Look

Computation: Let's Take a Look

Administration of a Computation Probe
The score is the number of correctly written digits in 2 minutes, on problems drawn from the end-of-year curriculum. Correct digits are counted, not correct problems or answers. Why? Digit-level scoring permits partial credit and is more sensitive to growth. The 2-minute time limit depends on grade and publisher.

Computation
Students are given a sheet of math problems and a pencil. They complete as many math problems as they can in 2 minutes. At the end of 2 minutes, the number of correctly written digits is counted.

Directions for Computation
Give the child(ren) the math sheet(s) and a pencil. Say: "The sheet on your desk is math facts. There are several types of problems on the sheet. Some are (insert types of problems on sheet). Look at each problem carefully before you answer it. When I say 'please begin', start answering the problems. Begin with the first problem and work across the page. Then go to the next row. If you cannot answer the problem, mark an 'X' through it and go to the next one. If you finish a page, turn the page and continue working. Are there any questions?"

Directions – Your Turn
"The sheet on your desk is math facts. There are several types of problems on the sheet. Some are (insert types of problems on sheet). Look at each problem carefully before you answer it. When I say 'please begin', start answering the problems. Begin with the first problem and work across the page. Then go to the next row. If you cannot answer the problem, mark an 'X' through it and go to the next one. If you finish a page, turn the page and continue working. Are there any questions?"

Directions, Continued
Say "Please begin" and start your timer. Make sure students are not skipping problems within rows, skipping around, or answering only the easy problems. Say "Please stop" at the end of 2 minutes.

Scoring
If the answer is correct, the student earns a score equal to the number of correct digits written using the "longest method" taught to solve the problem, even if the work is not shown. If a problem has been crossed out, credit is still given for the correct digits written. If a problem has not been completed, credit is earned for any correct digits written.

Scoring, Continued
Reversed digits (e.g., 3 written as E) or rotated digits are counted as correct, with the exception of 6 and 9. Parts of the answer written above the line (carries or borrows) are not counted as correct digits. In multiplication problems, a "0", an "X", or a blank counts as a place holder and is scored as a correct digit (CD).

Scoring, Continued
A division basic fact is a problem in which both the divisor and the quotient are 9 or less; if the answer is correct, the total CD always equals 1. In division problems, remainders of zero (r 0) are not counted as correct digits, and place holders are not counted as correct digits.

Scoring
A correct digit is the correct numeral in the right place.
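To make the digit-counting rule concrete, here is a minimal sketch in Python; the function name and the right-aligned comparison against an answer key are assumptions for illustration, and the special division, remainder, and place-holder rules above are not implemented.

```python
def correct_digits(student_answer: str, answer_key: str) -> int:
    """Count correctly written digits: correct numerals in the right place.

    Illustrative sketch only: answers are compared right-aligned (ones place
    to ones place); the division-fact, remainder, and place-holder rules
    described above are not handled here.
    """
    score = 0
    for place in range(len(answer_key)):
        key_digit = answer_key[-(place + 1)]
        written = student_answer[-(place + 1)] if place < len(student_answer) else None
        if written == key_digit:
            score += 1
    return score


# Answer key 1,308; the student wrote 1,318 -> 3 correct digits (the 1, 3, and 8).
print(correct_digits("1318", "1308"))  # 3
```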

Computation Scoring – Your Turn

Putting It Into Practice: Benchmarking, Survey Level Assessment, and Progress Monitoring

Tier 1, Universal Screening: Big Ideas (Hosp, Hosp, & Howell, 2007)
Screening provides a reliable and valid way to identify students who are at risk for failure, students who are not making adequate progress, students who need additional diagnostic evaluation, and each student's instructional level. It is conducted 3 times a year for the entire school; 3 probes are given and you take the median score.
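Because each student's screening score is the median of three probes, that step can be sketched as follows (the dictionary layout, student names, and scores are illustrative, not part of any AIMSweb format):

```python
from statistics import median

# Three screening probe scores (correct digits) per student; names and scores
# below are made up for illustration.
fall_probes = {
    "Student A": [18, 22, 20],
    "Student B": [9, 14, 11],
}

# Each student's benchmark score is the median of their three probes.
benchmark_scores = {name: median(scores) for name, scores in fall_probes.items()}
print(benchmark_scores)  # {'Student A': 20, 'Student B': 11}
```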

What is Proficient? How Much Progress Can We Expect? (Hosp, Hosp, & Howell, 2007)
Benchmarks: use standards for a level of performance that are empirically validated by researchers.
Norms: compare a student's score to the performance of others in her grade or instructional level.

Proficiency Levels or Benchmarks for Math CBM (Burns, VanDerHeyden, & Jiban, 2006)
Grades 2-3 (correct digits): Frustration <14; Instructional 14-31; Mastery >31
Grades 4-5 (correct digits): Frustration <24; Instructional 24-49; Mastery >49
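A small sketch that turns these cut scores into a lookup; the function name is illustrative, and only the grade placements listed in the table are handled.

```python
def proficiency_level(grade: int, correct_digits: int) -> str:
    """Classify a correct-digits score using the Burns, VanDerHeyden, & Jiban (2006) cuts.

    Covers only grades 2-5, matching the table above.
    """
    if grade in (2, 3):
        low, high = 14, 31
    elif grade in (4, 5):
        low, high = 24, 49
    else:
        raise ValueError("No cut scores listed for this grade placement")

    if correct_digits < low:
        return "Frustration"
    if correct_digits <= high:
        return "Instructional"
    return "Mastery"


print(proficiency_level(2, 25))  # Instructional
print(proficiency_level(4, 52))  # Mastery
```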

Norms for Math CBM: Correct Digits (AIMSweb, 2006)
Grade 2 (Fall / Winter / Spring CD): 90th percentile 31 / 39 / 43; 75th percentile 20 / 30 / 42; 50th percentile 12 / 24 / ...; 25th percentile 8 / 16 / 17; 10th percentile 5 / 10 / ...
Next grade shown (Fall / Winter / Spring CD): 90th percentile 68 / 76 / 85; 75th percentile 53 / 59 / 69; 50th percentile 37 / 45 / 52; 25th percentile 25 / 33 / ...; 10th percentile 23 / 27 / ...

Making Informed Data-Based Decisions: Spring Benchmark Data for 2nd Grade (median score per student)
Student 1: 22; Student 2: 35; Student 3: 37; Student 4: 10; Student 5: 42; Student 6: 47; Student 7: 13; Student 8: 27; Student 9: 42

Making Informed Data-Based Decisions: Spring Benchmark Data for 2nd Grade, sorted by median score
Student 6: 47; Student 9: 42; Student 5: 42; Student 3: 37; Student 2: 35; Student 8: 27; Student 1: 22; Student 7: 13; Student 4: 10
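A sketch of the same rank-ordering step, plus a flag for students whose medians fall in the grades 2-3 frustration range (<14 correct digits) from the earlier benchmark table; the dictionary layout is an assumption.

```python
# Spring benchmark medians (correct digits) for the 2nd-grade class above.
medians = {1: 22, 2: 35, 3: 37, 4: 10, 5: 42, 6: 47, 7: 13, 8: 27, 9: 42}

# Rank students from highest to lowest median score.
ranked = sorted(medians.items(), key=lambda item: item[1], reverse=True)

# Flag students whose medians fall in the grades 2-3 frustration range (<14 CD).
flagged = [student for student, score in ranked if score < 14]
print(ranked)   # highest scorers first
print(flagged)  # [7, 4]
```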

Survey Level Assessment (Hosp, 2012): Purposes
To determine the appropriate instructional placement level for the student, that is, the highest level of materials in which the student can be expected to benefit from instruction.
To provide baseline data, a starting point for progress monitoring: in order to monitor progress toward a future goal, you need to know how the student is currently performing.

Survey Level Assessment (Hosp, 2012)
Start with grade-level passages/worksheets (probes). Administer 3 separate probes (at the same difficulty level) using standard CBM procedures, then calculate the median (i.e., find the middle score). Is the student's score within the instructional range?
Yes: this is the student's instructional level.
No, the score is above the range (too easy): administer 3 probes at the next level of difficulty.
No, the score is below the range (too hard): administer 3 probes at the previous level of difficulty.
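The loop described above can be sketched as follows; `administer_three_probes` and `instructional_range` are hypothetical helpers standing in for giving and scoring the probes and for level-specific cut scores (for example, the Burns et al. ranges), not a real AIMSweb API.

```python
from statistics import median

def survey_level_assessment(start_level, administer_three_probes, instructional_range,
                            lowest_level=1, highest_level=8):
    """Sketch of the survey-level assessment loop described above.

    administer_three_probes(level) -> three probe scores (hypothetical helper).
    instructional_range(level) -> (low, high) cut scores for that level
        (hypothetical helper, e.g., built from the Burns et al. table).
    Returns (instructional_level, median_score); the median also serves as the
    baseline for progress monitoring.
    """
    level, tried = start_level, set()
    while lowest_level <= level <= highest_level and level not in tried:
        tried.add(level)
        score = median(administer_three_probes(level))
        low, high = instructional_range(level)
        if low <= score <= high:
            return level, score                 # instructional level found
        level += 1 if score > high else -1      # too easy: move up; too hard: move down
    raise ValueError("No instructional level found in the levels tested")
```

Administering three probes at the same difficulty level and taking the median keeps a single unusually easy or hard probe from driving the placement decision.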

Survey Level Assessment: example recording form (grade 4 student, 6/13/13) showing one level where probe scores of 7, 12, and 10 yield a median of 10 in the frustration range (F), and a lower level where scores of 25, 23, and 27 yield a median of 25 in the instructional range (I).

Progress Monitoring Big Ideas: Tiers 2 & 3
Purpose (Hosp, Hosp, & Howell, 2007): to ensure that instruction is working, to signal when a change is needed, and to guide adjustments in the program.
Frequency: Tier 2, monthly, to show progress and to inform instruction; Tier 3, weekly to bi-weekly, to ensure that the most treatment-resistant students are making progress.

Progress Monitoring: Determine the Goal (Calculating the Aim Line)
Goal = median score from SLA or benchmark + (number of weeks x rate of improvement)
Student 4: 25 + (20 x 0.50) = 35, so the goal is 35 correct digits in 20 weeks.
Weekly growth rates for Math CBM, in correct digits (Fuchs, Fuchs, Hamlett, Walz, & Germann, 1993):
Grade 1: realistic 0.30, ambitious 0.50
Grade 2: realistic 0.30, ambitious 0.50
Grade 3: realistic 0.30, ambitious 0.50
Grade 4: realistic 0.70, ambitious 1.15
Grade 5: realistic 0.75, ambitious 1.20
Grade 6: realistic 0.45, ambitious 1.00
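The goal formula translates directly into a small helper; the function name is illustrative.

```python
def progress_monitoring_goal(baseline_median: float, weeks: int, weekly_growth_rate: float) -> float:
    """Goal = median score from SLA or benchmark + (number of weeks x rate of improvement)."""
    return baseline_median + weeks * weekly_growth_rate


# Student 4 from the slide: baseline of 25 CD, 20 weeks, ambitious grade-2 rate of 0.50 CD per week.
print(progress_monitoring_goal(25, 20, 0.50))  # 35.0 correct digits
```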

Your Turn: Calculate the Goal for Student 1
2nd-grade spring benchmark scores, sorted by median: Student 6: 47; Student 9: 42; Student 5: 42; Student 3: 37; Student 2: 35; Student 8: 27; Student 1: 22 (calculate this student's goal); Student 7: 13; Student 4: 10
Grade 2 growth rates: realistic 0.30 CD per week; ambitious 0.50 CD per week
Goal = median score from SLA or benchmark + (number of weeks x rate of improvement)

Making Informed Data-Based Decisions: Is our intervention working? What changes should we make?

Progress Monitoring: Another Look

CBM and Web-Based Data Management Resources: AIMSweb, EasyCBM, EdCheckup, Intervention Central, iSTEEP, Yearly Progress Pro, NCRtI, enumeracy, PM Focus