Curriculum-based Measures: Math


1 Curriculum-based Measures: Math
Kat Nelson, M.Ed., University of Utah

2 Objectives
You will be able to define a CBM and articulate the big ideas of using math CBM with the CCSS and the MTSS model.
You will be able to administer and score screening and progress monitoring probes.
You will be able to use the problem-solving process to interpret the data produced by the math CBM.

3 CBM: Big Ideas (Kelly, Hosp, Howell, 2008)
“CBM is a quick and reliable method for gathering information about student performance and progress.” CBM is…
Aligned with the curriculum
Valid and reliable
A set of standardized measures
A source of low-inference information

4 CBM: Big Ideas (Kelly, Hosp, Howell, 2008)
CBM probes are repeated measures that are efficient and sensitive to growth.
Sensitivity to growth means your instruction is informed frequently.
Information about performance and growth can be easily shared with stakeholders.
CBM scores are an indicator of future reading and math achievement.

5 Curriculum-Based Measurement and the Common Core State Standards
Big Ideas

6 Common Core & CBM (Shinn, 2012)
The Common Core State Standards (CCSS) provide sets of college- and career-focused outcomes and annual criterion-referenced tests to measure student learning as a summative evaluation.
The assessment implications of the CCSS are clearly related to summative evaluation and accountability.
No single test is sufficient for all of the data-based decisions that schools make in their attempts to identify student learning needs: screening, intervention planning/diagnosis, progress monitoring, and accountability/program evaluation.

7 Common Core & CBM (Shinn, 2012)
Assessment of the CCSS need not use separate items or tests for each standard; it may include “rich tasks” that address a number of separate standards. AIMSweb’s Curriculum-Based Measurement (CBM) tests are typically based on such rich tasks, which are validated as “vital signs” or “indicators” of general basic-skill outcomes.

8 Common Core & CBM (Shinn, 2012)
AIMSweb’s CBM tests are consistent with the CCSS; they are content valid. AIMSweb’s CBM tests complement the assessment requirements for attaining proficiency on the CCSS.

9 Curriculum-Based Measurement and Multi-Tiered System of Support
Big Ideas

10 Multi-Tiered System of Support
Schools identify students at risk for poor learning outcomes, monitor student progress, provide evidence-based interventions, and adjust the intensity and nature of those interventions depending on a student’s responsiveness. (NCRtI, 2010)

11 Key Features of MTSS (Sugai, 2008)
• Universal Design • Data-based decision making and problem solving • Continuous progress monitoring • Focus on successful student outcomes • Continuum of evidence-based interventions • A core curriculum is provided for all students • A modification of this core is arranged for students who are identified as non-responsive • A specialized and intensive curriculum for students with intensive needs • Focus on fidelity of implementation

12 Problem Solving Process

13 Using CBM within MTSS
Tier 1: Universal screening establishes benchmarks three times throughout the school year.
Tier 2: Progress monitoring tracks at-risk students by assessing monthly.
Tier 3: Intensive progress monitoring provides frequent assessment for students at risk or with significant needs.

14 Conducting A Math CBM Directions and Scoring Procedures

15 Selecting the Measure
At Kindergarten or Grade 1: Oral Counting, Quantity Array, Number Identification, Quantity Discrimination, Missing Number
At Grades 1-8: Computation (Mixed and/or Facts), Concepts & Applications
As appropriate (Grade 9?): Algebra

16 Early Numeracy Measures Let’s Take a Look

17 Concepts and Applications or M-Cap Let’s Take a Look

18 Computation Let’s Take a Look

19 Administration of Computation Probe
The score is the number of correctly written digits in 2 minutes on material drawn from the end-of-year curriculum.
Correct digits are counted, not correct problems or answers. Why?
The 2-minute limit depends on grade and publisher.

20 Computation
Students are given a sheet of math problems and a pencil.
Students complete as many problems as they can in 2 minutes.
At the end of 2 minutes, the number of correctly written digits is counted.

21 Directions for Computation
Give the children the math sheet(s) and a pencil. Say: “The sheet on your desk is math facts. There are several types of problems on the sheet. Some are (insert types of problems on sheet). Look at each problem carefully before you answer it. When I say ‘please begin’, start answering the problems. Begin with the first problem and work across the page. Then go to the next row. If you cannot answer the problem, mark an ‘X’ through it and go to the next one. If you finish a page, turn the page and continue working. Are there any questions?”

22 Directions – Your Turn “The sheet on your desk is math facts. There are several types of problems on the sheet. Some are (insert types of problems on sheet). Look at each problem carefully before you answer it. When I say ‘please begin’, start answering the problems. Begin with the first problem and work across the page. Then go to the next row. If you cannot answer the problem, mark an ‘X’ through it and go to the next one. If you finish a page, turn the page and continue working. Are there any questions?”

23 Directions Continued Say “Please begin” and start your timer
Make sure students are not skipping problems within rows, skipping around, or answering only the easy problems.
Say “Please stop” at the end of 2 minutes.

24 Scoring
If the answer is correct, the student earns a score equal to the number of correct digits written using the “longest method” taught to solve the problem, even if the work is not shown.
If a problem has been crossed out, credit is still given for the correct digits written.
If a problem has not been completed, credit is earned for any correct digits written.

25 Scoring Continued
Reversed digits (e.g., 3 written as E) or rotated digits, with the exception of 6 and 9, are counted as correct.
Parts of the answer above the line (carries or borrows) are not counted as correct digits.
In multiplication problems, a “0”, an “X”, or a blank counts as a place holder and is scored as a correct digit (CD).

26 Scoring Continued
A division BASIC FACT is a problem in which both the divisor and the quotient are 9 or less. If the answer is correct, the total CD always equals 1.
In division problems, remainders of zero (r 0) are not counted as correct digits.
In division problems, place holders are not counted as correct digits.

27 Scoring A correct digit is the correct numeral in the right place.
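To make the correct-digit count concrete, here is a minimal Python sketch; the function name correct_digits and the right-aligned, digit-by-digit comparison are illustrative assumptions, and the special cases above (longest-method work, carries, division remainders, place holders) are not handled.

```python
def correct_digits(student_answer: str, answer_key: str) -> int:
    """Count digits the student wrote in the correct place-value position.

    A minimal sketch: answers are compared digit by digit from the right
    (ones, tens, hundreds, ...), mirroring how a computation answer is scored.
    """
    student = student_answer.strip()
    key = answer_key.strip()
    score = 0
    # Walk both strings from the rightmost character inward.
    for offset in range(1, len(key) + 1):
        if offset <= len(student) and student[-offset] == key[-offset]:
            score += 1
    return score


if __name__ == "__main__":
    # Example: the key is 408; a student who wrote 428 has the 4 and the 8
    # in the right places, so the problem earns 2 correct digits.
    print(correct_digits("428", "408"))  # 2
    print(correct_digits("408", "408"))  # 3
    print(correct_digits("48", "408"))   # 1 (only the 8 lines up)
```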

28 Computation Scoring – Your Turn

29 Put It Into Practice: Benchmarking, Survey Level Assessment, and Progress Monitoring

30 Tier 1- Universal Screening Big Ideas (Hosp, Hosp, Howell, 2007)
Provides a reliable and valid way to identify:
Students who are at risk for failure
Students who are not making adequate progress
Students who need additional diagnostic evaluation
Students’ instructional level
Screening is done 3 times a year for the entire school; 3 probes are given and you take the median score.
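As a quick illustration of the median rule, here is a minimal sketch with made-up probe scores:

```python
from statistics import median

# Hypothetical benchmark probe scores (correct digits) for one student's three probes.
probe_scores = [18, 24, 21]

# The benchmark score is the median of the three probes, which dampens the
# effect of one unusually good or bad probe.
benchmark_score = median(probe_scores)
print(benchmark_score)  # 21
```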

31 What is Proficient? How Much Progress Can We Expect? (Hosp, Hosp, Howell, 2007)
Benchmarks: use standards for level of performance that are empirically validated by researchers.
Norms: compare a student’s score to the performance of others in her grade or at her instructional level.

32 Proficiency Levels or Benchmarks for Math CBM (Burns, VanDerHeyden, Jiban, 2006)
Grades 2-3: Frustration <14, Instructional 14-31, Mastery >31 correct digits
Grades 4-5: Frustration <24, Instructional 24-49, Mastery >49 correct digits
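A small sketch of how these cut scores could be applied in practice; the function name and grade handling are illustrative, not part of the published table.

```python
def placement_level(grade: int, correct_digits: int) -> str:
    """Classify a math CBM score using the Burns, VanDerHeyden, & Jiban (2006)
    proficiency levels shown above. Only grades 2-5 are covered."""
    if grade in (2, 3):
        frustration_max, mastery_min = 14, 31
    elif grade in (4, 5):
        frustration_max, mastery_min = 24, 49
    else:
        raise ValueError("The proficiency levels shown here cover grades 2-5 only.")

    if correct_digits < frustration_max:
        return "Frustration"
    if correct_digits > mastery_min:
        return "Mastery"
    return "Instructional"


print(placement_level(2, 12))  # Frustration
print(placement_level(4, 30))  # Instructional
print(placement_level(3, 35))  # Mastery
```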

33 Norms for Math CBM: Correct Digits (AIMSweb, 2006)
Grade 2
90th percentile: Fall 31, Winter 39, Spring 43
75th percentile: Fall 20, Winter 30, Spring 42
50th percentile: 12, 24 (one value missing)
25th percentile: Fall 8, Winter 16, Spring 17
10th percentile: 5, 10 (one value missing)
Grade (not labeled)
90th percentile: Fall 68, Winter 76, Spring 85
75th percentile: Fall 53, Winter 59, Spring 69
50th percentile: Fall 37, Winter 45, Spring 52
25th percentile: 25, 33 (one value missing)
10th percentile: 23, 27 (one value missing)
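To show what a norm comparison might look like, here is a brief sketch that uses only the grade 2 fall column above; the function and its output wording are illustrative, not an official AIMSweb calculation.

```python
# Grade 2 fall norms (correct digits) from the table above: (percentile, score).
GRADE2_FALL_NORMS = [(90, 31), (75, 20), (50, 12), (25, 8), (10, 5)]

def fall_percentile_band(correct_digits: int) -> str:
    """Return the highest listed percentile the score meets or exceeds."""
    for percentile, cut in GRADE2_FALL_NORMS:   # highest percentile first
        if correct_digits >= cut:
            return f"at or above the {percentile}th percentile"
    return "below the 10th percentile"


print(fall_percentile_band(22))  # at or above the 75th percentile
print(fall_percentile_band(4))   # below the 10th percentile
```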

34 Making Informed Data-Based Decisions
Spring Benchmark Data for 2nd Grade: student median scores
Student 1: 22
Student 2: 35
Student 3: 37
Student 4: 10
Student 5: 42
Student 6: 47
Student 7: 13
Student 8: 27
Student 9: 42

35 Making Informed Data-Based Decisions
Spring Benchmark Data for 2nd Grade, sorted by median score
Student 6: 47
Student 9: 42
Student 5: 42
Student 3: 37
Student 2: 35
Student 8: 27
Student 1: 22
Student 7: 13
Student 4: 10
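A minimal sketch of how a team might rank these scores and flag possible candidates for additional support, using the grades 2-3 frustration cutoff (<14 correct digits) from the proficiency table earlier; the cutoff choice and variable names are illustrative.

```python
# Spring benchmark medians (correct digits) for the 2nd-grade class above.
scores = {1: 22, 2: 35, 3: 37, 4: 10, 5: 42, 6: 47, 7: 13, 8: 27, 9: 42}

# Rank students from highest to lowest median score, as on the slide.
ranked = sorted(scores.items(), key=lambda item: item[1], reverse=True)
for student, score in ranked:
    print(f"Student {student}: {score}")

# Flag students below the grades 2-3 frustration cutoff (<14 correct digits).
at_risk = [student for student, score in scores.items() if score < 14]
print("Possible candidates for Tier 2/3 support:", at_risk)  # [4, 7]
```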

36 Survey Level Assessment (Hosp, 2012)
Purposes:
To determine the appropriate instructional placement level for the student, i.e., the highest level of material in which the student can be expected to benefit from instruction.
To provide baseline data, a starting point for progress monitoring: in order to monitor progress toward a future goal, you need to know how the student is currently performing.

37 Survey Level Assessment (Hosp, 2012)
Start with grade-level passages/worksheets (probes).
Administer 3 separate probes (at the same difficulty level) using standard CBM procedures.
Calculate the median (i.e., find the middle score).
Is the student’s score within the instructional range?
Yes: this is the student’s instructional level.
No, the score is above the range (material too easy): administer 3 probes at the next level of difficulty.
No, the score is below the range (material too hard): administer 3 probes at the previous level of difficulty.
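A rough sketch of this survey-level search as a loop; administer_probe and instructional_range are hypothetical callables standing in for giving one probe at a level and looking up that level's instructional range.

```python
from statistics import median

def find_instructional_level(start_level, administer_probe, instructional_range,
                             min_level=1, max_level=8):
    """Survey-level assessment sketch.

    administer_probe(level) -> one probe score (correct digits) at that level.
    instructional_range(level) -> (low, high) correct-digit cutoffs for that level.
    Both are hypothetical callables supplied by the caller.
    """
    level = start_level
    visited = set()
    while min_level <= level <= max_level and level not in visited:
        visited.add(level)
        probe_scores = [administer_probe(level) for _ in range(3)]  # three probes
        score = median(probe_scores)                                # take the median
        low, high = instructional_range(level)
        if low <= score <= high:
            return level, score                # instructional level found
        level += 1 if score > high else -1     # too easy -> go up, too hard -> go down
    return None, None                          # no level in range within the bounds tried
```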

38 Survey Level Assessment

39 Progress Monitoring Big Ideas: Tier 2 & 3
Purpose (Hosp, Hosp, Howell, 2007):
To ensure that instruction is working
To signal when a change is needed
To guide adjustments in the program
Frequency:
Tier 2: monthly, to show progress and to inform instruction
Tier 3: weekly to bi-weekly, to ensure that students who are the most treatment-resistant are making progress

40 Progress Monitoring: Determine the Goal
Calculating the aim line: Median score from SLA or benchmark + (number of weeks x rate of improvement) = goal
Example, Student 4: 25 + (20 x 0.50) = 35. Goal = 35 correct digits in 20 weeks.
Weekly growth rates for math CBM, correct digits (Fuchs, Fuchs, Hamlett, Walz, and Germann, 1993):
Grade 1: realistic 0.30 CD per week, ambitious 0.50 CD per week
Grade 2: realistic 0.30 CD per week, ambitious 0.50 CD per week
Grade 3: realistic 0.30 CD per week, ambitious 0.50 CD per week
Grade 4: realistic 0.70 CD per week, ambitious 1.15 CD per week
Grade 5: realistic 0.75 CD per week, ambitious 1.20 CD per week
Grade 6: realistic 0.45 CD per week, ambitious 1.00 CD per week
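A minimal sketch of the goal formula using the growth rates above; the function name and table encoding are illustrative.

```python
# Weekly growth rates (correct digits per week) from the Fuchs et al. (1993) table above.
GROWTH_RATES = {  # grade: (realistic, ambitious)
    1: (0.30, 0.50), 2: (0.30, 0.50), 3: (0.30, 0.50),
    4: (0.70, 1.15), 5: (0.75, 1.20), 6: (0.45, 1.00),
}

def aim_line_goal(baseline_median: float, weeks: int, grade: int,
                  ambitious: bool = False) -> float:
    """Goal = baseline median + (weeks x weekly rate of improvement)."""
    realistic_rate, ambitious_rate = GROWTH_RATES[grade]
    rate = ambitious_rate if ambitious else realistic_rate
    return baseline_median + weeks * rate


# The slide's example: a baseline of 25 CD over 20 weeks at the ambitious rate.
print(aim_line_goal(25, 20, grade=2, ambitious=True))  # 35.0
```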

41 Your Turn: Calculate Goal for Student 1
2nd grade Spring Benchmark Scores (calculate the goal for the starred student):
Student 6: 47
Student 9: 42
Student 5: 42
Student 3: 37
Student 2: 35
Student 8: 27
*Student 1: 22*
Student 7: 13
Student 4: 10
Grade 2 growth rates: realistic 0.30 CD per week, ambitious 0.50 CD per week
Median score from SLA or benchmark + (number of weeks x rate of improvement) = goal
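One possible worked check for Student 1, assuming a 20-week monitoring period as in the previous example (the slide itself does not state the number of weeks):

```python
# Worked check for Student 1 (median 22), assuming a 20-week period.
median_score = 22
weeks = 20
realistic_goal = median_score + weeks * 0.30   # 22 + 6  = 28 correct digits
ambitious_goal = median_score + weeks * 0.50   # 22 + 10 = 32 correct digits
print(realistic_goal, ambitious_goal)  # 28.0 32.0
```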

42 Making Informed Data-Based Decisions
Is our intervention working? What changes should we make?

43 Progress Monitoring: Another Look

44 CBM and Web-Based Data Management Resources
AIMSweb
EasyCBM
EDCheckup
Intervention Central
iSTEEP
Yearly Progress Pro
NCRtI
enumeracy
PM Focus

