
1
Data-Driven Decisions & Targeted Interventions in the Elementary Math Classroom
Cleveland County Schools
Giancarlo Anselmo, Brian Bettis, Carrie Knotts

2
Objectives
– Discuss the research behind Curriculum-Based Measures (CBMs): advantages, critical features, reliability and validity
– Development of CBMs
– Norms and growth rates
– Universal screening suggestions
– Team-Initiated Problem Solving: data collection and analysis
– Progress monitoring
– Appropriate targeted interventions

3
Nation's Report Card, 2012 (NAEP)

4
Hierarchy of CBM Research
1. CBM reading, elementary level
2. CBM reading, secondary level
3. CBM math, elementary level
4. CBM math, secondary level
5. CBM for other subjects (writing, spelling, science, etc.)

5
Curriculum-Based Measurement: Advantages
– Direct measure of student performance
– Helps target specific areas of instructional need for students
– Quick to administer
– Provides visual representation (reports) of individual student progress and of how classes are acquiring essential skills
– Sensitive to even small improvements in performance
– Capable of having many alternate forms
– Frequent monitoring enables staff to see trends in individual and group performance and compare those trends with targets set for their students
– Correlates strongly with "best practices" for instruction and assessment and with research-supported methods for assessment and intervention

6
Critical Features of CBM
– Technical adequacy
– Evaluation of general outcomes
– Assessment of student progress
(Stecker et al., 2005)

7
Technical Adequacy
Technical-adequacy data exist for two distinct types of M-CBM:
– Computation
– Concepts and Applications

8

9
Reliability of M-CBM (Basic Facts)
Foegen (2000)
– Alternate form =
Foegen & Deno (2001)
– Internal consistency =
– Alternate form =
– Test-retest =
Foegen (2008)
– Alternate form =
– Test-retest =

10
Reliability of M-CBM (Concepts and Applications)
Helwig & Tindal (2002)
– Alternate form =
Foegen (2008)
– Alternate form =
– Test-retest =

11
Validity of M-CBM (Concepts and Applications)
Helwig, Anderson, & Tindal (2002)
– Criterion validity = .80
Foegen (2008)
– Concurrent validity =

12
Validity of M-CBM (Basic Facts)
Foegen (2000)
– Criterion validity =
Foegen & Deno (2001)
– Criterion validity = .63
Foegen (2008)
– Concurrent validity =
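The reliability and validity figures on these slides are correlation coefficients between two sets of scores (e.g., two alternate probe forms, or a probe and a criterion test). A minimal sketch of how such a coefficient could be computed; the student scores below are invented for illustration, not actual study data:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between two paired lists of scores."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical digits-correct scores for ten students on two alternate forms.
form_a = [12, 18, 25, 9, 30, 22, 15, 27, 11, 20]
form_b = [14, 17, 24, 11, 28, 20, 17, 25, 10, 22]
print(f"alternate-form reliability: {pearson_r(form_a, form_b):.2f}")
```

A coefficient near 1.0 indicates the two forms rank students almost identically, which is what the alternate-form reliability entries above are reporting.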

13
Different Approaches to Developing M-CBM
– Curriculum sampling approach
– Robust indicators approach

14
Curriculum Sampling
– Measures are developed by constructing representative samples of the year's mathematics curriculum
– The method is used with both math computation and math applications

15
Fourth Grade Math Computation Curriculum
1. Multidigit addition with regrouping
2. Multidigit subtraction with regrouping
3. Multiplication facts, factors to 9
4. Multiply 2-digit numbers by a 1-digit number
5. Multiply 2-digit numbers by a 2-digit number
6. Division facts, divisors to 9
7. Divide 2-digit numbers by a 1-digit number
8. Divide 3-digit numbers by a 1-digit number
9. Add/subtract simple fractions, like denominators
10. Add/subtract whole numbers and mixed numbers

16
3rd Grade Common Core Standards

17
Robust Indicators
Measures that are not necessarily representative of a particular curriculum, but are instead characterized by the relative strength of their correlations with various overall mathematics proficiency criteria (Foegen et al., 2007)

18
Robust Indicators
Little research exists on this method, but what has been done shows promise.
Helwig & Tindal (2002)
– Took 11 concept-grounded math problems and correlated the results with the Computer Adaptive Test (the state test given in Oregon)
– Results suggested a correlation of .80 for general education students

19
Cleveland County Math Probes
– Used the curriculum sampling approach
– Designed our own universal screening probes using Math-aids.com
– Math Concepts and Applications probes were adapted from Monitoring Basic Skills Progress: Basic Math Concepts and Applications (Fuchs, Hamlett, & Fuchs, 1999)
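A curriculum-sampling probe is assembled by drawing items from each skill in the grade-level computation curriculum, so every form samples the same content. A hypothetical sketch of that idea; the three skill generators and the item count are illustrative, not Cleveland County's actual probe specification:

```python
import random

# Illustrative skill generators modeled on a fourth-grade computation curriculum.
def multidigit_addition(rng):
    while True:
        a, b = rng.randint(10, 999), rng.randint(10, 999)
        if a % 10 + b % 10 >= 10:   # keep only items that require regrouping
            return f"{a} + {b} ="

def multiplication_fact(rng):
    return f"{rng.randint(2, 9)} x {rng.randint(2, 9)} ="

def division_fact(rng):
    d, q = rng.randint(2, 9), rng.randint(2, 9)
    return f"{d * q} / {d} ="       # dividend built so it divides evenly

SKILLS = [multidigit_addition, multiplication_fact, division_fact]

def build_probe(n_items=24, seed=None):
    """Draw items evenly across skills, then shuffle, so each alternate
    form samples the curriculum in the same proportions."""
    rng = random.Random(seed)
    items = [SKILLS[i % len(SKILLS)](rng) for i in range(n_items)]
    rng.shuffle(items)
    return items

for item in build_probe(6, seed=1):
    print(item)
```

Because the skill mix is fixed, regenerating with a new seed yields the many alternate forms that frequent progress monitoring requires.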

20
Local Norms and Growth Rates
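Local norms are typically percentile points computed from a district's own screening scores, and a typical growth rate is the median gain divided by the instructional weeks between screenings. A minimal sketch under those assumptions; the scores, the 32-week window, and the nearest-rank percentile method are all illustrative choices:

```python
def percentile(scores, p):
    """Score at the p-th percentile (nearest-rank method)."""
    ranked = sorted(scores)
    k = max(0, min(len(ranked) - 1, round(p / 100 * len(ranked)) - 1))
    return ranked[k]

# Hypothetical fall and spring screening scores (digits correct) for one grade.
fall   = [8, 12, 15, 9, 20, 14, 11, 18, 7, 16, 13, 10]
spring = [15, 20, 26, 14, 31, 22, 19, 29, 12, 25, 21, 17]

fall_median   = percentile(fall, 50)
spring_median = percentile(spring, 50)

# Typical weekly growth: median gain over the instructional weeks between screenings.
weeks = 32
growth_per_week = (spring_median - fall_median) / weeks

print(f"local fall norms (25th/50th/75th): "
      f"{percentile(fall, 25)}/{fall_median}/{percentile(fall, 75)}")
print(f"typical growth: {growth_per_week:.2f} digits correct per week")
```

The same percentile table doubles as the cut points used in screening, and the weekly rate becomes the slope of the aimline used in progress monitoring.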

21
M-CBM as Part of a Three-Tiered Model
– Tier I: Universal screening
– Tier II: Progress monitoring
– Tier III: Further assessment as part of a problem-solving process

22
Universal Screening
Math assessments are generally done using one probe during universal screening.
Hintze et al. (2002)
– The study showed that one can expect extremely high dependability with as little as one 2-minute multiple-skill math probe

23
Universal Screening
Which probes to use?
– Option 1: Design your own probes
– Option 2: Choose a standardized set of published probes

24
Companies that Provide Standardized M-CBM Probes
– AIMSweb: http://www.aimsweb.com
– Easy CBM: http://www.easycbm.com
– Yearly Progress Pro: http://www.mhdigitallearning.com

25
AIMSweb
Math measures for Computation
– Mixed computation, grades 1-6
– + facts, - facts, x facts, / facts, +/- mix, mult./div. mix, all mix
Math measures for Concepts and Applications
– See table for areas covered

26

27
AimsWeb

28
Easy CBM
– Math and reading probes for grades 1-8
– Probes covering Number and Operations, Algebra, Measurement, Geometry, and Data Analysis
– Math probes can be taken as a paper-and-pencil test or taken online

29

30
Easy CBM Sample

31
Easy CBM and AIMSweb
– Have established norms
– Have math probes for grades 6-8
– Have alternate forms for progress monitoring
– Have the capability of storing data online for distribution and analysis

32
Overall Suggestions
– Have a district-level team select measures based on critical criteria such as reliability, validity, and efficiency
– Select screening measures based on the content they cover, with an emphasis on critical instructional objectives for each grade level
– In grades 4-8, use screening measures in combination with state testing data
– Use the same screening tool across a district to enable analyzing results across schools
(Clarke & Baker)

33
Now What? A Curriculum-Based Measure Has Been Selected
1. Complete universal screening three times a year: beginning, middle, and end of year (BOY, MOY, EOY)
2. Teachers analyze the data, looking to answer these questions:
– How are all students performing?
– Why are there deficits/strengths?
– Are students growing?
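One common way teams act on a screening window is to flag students scoring below a local percentile cut for Tier II support. A sketch of that step, assuming a 25th-percentile cut (a common convention, not a fixed rule); the student names and scores are hypothetical:

```python
# Hypothetical BOY screening scores (digits correct per 2 minutes).
boy_scores = {
    "Student A": 8, "Student B": 21, "Student C": 12, "Student D": 15,
    "Student E": 6, "Student F": 18, "Student G": 11, "Student H": 24,
}

# Local 25th-percentile cut score (nearest-rank method).
ranked = sorted(boy_scores.values())
cut = ranked[max(0, round(0.25 * len(ranked)) - 1)]

# Students below the cut are candidates for Tier II progress monitoring.
flagged = sorted(name for name, s in boy_scores.items() if s < cut)
print(f"25th-percentile cut score: {cut}")
print("flagged for Tier II:", flagged)
```

In practice the flag is only a starting point; per the next slide, it is weighed against classroom performance, behavior, and teacher judgment before an action plan is set.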

34
"Prismation" of Data
– Multiple data sources: classroom performance, CFA, CBM, behavior, teacher judgment
– No one data source trumps another; they work in conjunction with each other to tailor an action plan for the student
– The result is a targeted action plan customized to meet the student's individual needs

35
Why?
– Determine how well your foundational core instructional programs are working for all students, in both proficiency and growth
– Identify specific skill deficits/strengths of all students
– Use as part of an early warning system

36
Universal Screening Allows Us To…
Problem solve for:
– The whole school
– A grade level
– A class
– Subgroups

37
Team-Initiated Problem Solving
Grade-level teams:
– Analyze grade-level data in conjunction with curriculum coaches
– Define the problem
– Answer the "why" questions
– Design an action plan covering core instruction and interventions

38
Identifying Areas of Need
– What are the students' strengths? Why?
– What are their deficits? Why?
– Do we need to address this in core instruction?
– Do we need to address this with interventions?

39
Example 4 th Grade Math CBM data analysis What would you do?

40
Progress Monitoring
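Progress monitoring decisions commonly rest on a trend line fitted through a student's weekly probe scores, compared against the aimline from the student's baseline to the goal. A minimal sketch of that comparison using an ordinary least-squares slope; the scores, baseline, goal, and timeline are all hypothetical:

```python
def ols_slope(weeks, scores):
    """Ordinary least-squares slope of scores regressed on week number."""
    n = len(weeks)
    mw, ms = sum(weeks) / n, sum(scores) / n
    num = sum((w - mw) * (s - ms) for w, s in zip(weeks, scores))
    den = sum((w - mw) ** 2 for w in weeks)
    return num / den

# Hypothetical weekly probe scores (digits correct per 2 minutes).
weeks  = [1, 2, 3, 4, 5, 6, 7, 8]
scores = [10, 11, 13, 12, 15, 16, 16, 18]

# Aimline slope implied by the student's goal.
baseline, goal, weeks_to_goal = 10, 26, 16
aimline_slope = (goal - baseline) / weeks_to_goal

trend = ols_slope(weeks, scores)
print(f"trend: {trend:.2f} digits/week (aimline: {aimline_slope:.2f})")
if trend < aimline_slope:
    print("trend below aimline: consider changing the intervention")
else:
    print("trend at or above aimline: continue the current plan")
```

Comparing the two slopes rather than single data points keeps decisions from being driven by week-to-week noise in the probe scores.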

41
Appropriate Targeted Interventions

42
Questions?
