Curriculum-Based Measurement and General Outcome Measurement (GOM) and Mastery Monitoring (MM)


1

2 Curriculum-Based Measurement and General Outcome Measurement (GOM) and Mastery Monitoring (MM)
Mark R. Shinn, Ph.D., Professor, School Psychology Program, National Louis University, Skokie, IL. November 29, 2012

3 My Area of Expertise
Editor and Contributor to 2 Major Texts on CBM
1 of 6 Members of the Technical Review Panel, National Center for Student Progress Monitoring, USDE/OSEP
Author of More Than 75 Refereed Journal Articles and Book Chapters on CBM, Progress Monitoring, and Screening

4 Disclosure
Mark R. Shinn, Ph.D., serves as a paid consultant for Pearson Assessment for their AIMSweb product, which provides CBM assessment materials and organizes and reports the information from 3 tiers, including RTI. He provides technical support and training.
Mark R. Shinn, Ph.D., serves as a consultant for Cambium/Voyager/Sopris for their Vmath product, a remedial mathematics intervention, but has no financial interests. He helped them develop their progress monitoring system.
Mark R. Shinn, Ph.D., serves as a consultant for McGraw-Hill Publishing for their Jamestown Reading Navigator (JRN) product and receives royalties. He helped them develop their progress monitoring system.
Mark R. Shinn, Ph.D., serves as a member of the National Advisory Board for CORE (Consortium on Reaching Excellence) and receives a stipend for participation. He provides training and product development advice.

5 Background Reading on CBM and Decision Making In Multi-Tiered Model/RtI
Espin, C.A., McMaster, K., Rose, S., & Wayman, M. (Eds.). (2012). A measure of success: The influence of Curriculum-Based Measurement on education. Minneapolis, MN: University of Minnesota Press.

6 Presentation is Based on the Following White Paper
Available in PDF and iBook formats. A "glossy," official Pearson version will be finished soon and sent to you.
Shinn, M.R. (2012). Measuring general outcomes: A critical component in scientific and practical progress monitoring practices. Minneapolis, MN: Pearson Assessment.

7 References on CBM, GOM, and MM
Deno, S.L. (1986). Formative evaluation of individual student programs: A new role for school psychologists. School Psychology Review, 15.
Espin, C.A., McMaster, K., Rose, S., & Wayman, M. (Eds.). (2012). A measure of success: The influence of Curriculum-Based Measurement on education. Minneapolis, MN: University of Minnesota Press.
Fuchs, L.S., & Deno, S.L. (1991). Paradigmatic distinctions between instructionally relevant measurement models. Exceptional Children, 57.
Fuchs, L.S., & Fuchs, D. (1999). Monitoring student progress toward the development of reading competence: A review of three forms of classroom-based assessment. School Psychology Review, 28.
Jenkins, J.R., & Fuchs, L.S. (2012). Curriculum-Based Measurement: The paradigm, history, and legacy. In C.A. Espin, K. McMaster, S. Rose, & M. Wayman (Eds.), A measure of success: The influence of Curriculum-Based Measurement on education (pp. 7-23). Minneapolis, MN: University of Minnesota Press.
Shinn, M.R. (2012). Measuring general outcomes: A critical component in scientific and practical progress monitoring practices. Minneapolis, MN: Pearson Assessment.

8 Accessing Reading Materials
markshinn.org
1. Click on the Downloads for Professionals icon.
2. Click on the Presentations and Handouts folder.
3. Click on the AIMSweb GOM and MM Webinar (Sponsored by Pearson) folder.

9 A Personal Story: Approaching 60, I Needed to Get Healthier
What Could I Measure to Gauge the Effects of My Efforts? I wanted to measure something important. I wanted it to be easy to do and not take a lot of time and $$. I wanted it to be easy for me to understand, as well as for my wife and kids.

10 This is General Outcome Measurement
The Answer Was Obvious: Testing something "small" to make statements about something "big" (important)!

11 There Were Other Things I Could Measure
Daily Calorie Targets
Calories per Item Consumed
Minutes of Daily Exercise
Estimated Calories Burned from Exercise
Inches Around Waist
Miles per Day and per Week of Bike Riding
Average Biking MPH
Average Cadence While Riding
Energy Watts Generated
These Things ALSO Were Important, But More Difficult to Measure, to Compare, and "Put Together" for a Picture of Progress. This is Mastery Monitoring.

12 Big Ideas
Educators typically have lots of opinions about assessment, and progress monitoring is no exception. However, few of us have sufficient training in assessment in general and progress monitoring in particular.
Yet frequent progress monitoring is one of the most powerful tools in educators' intervention toolbox and the single most powerful teaching variable that they can control!
There are two "families" of progress monitoring tools: General Outcome Measurement (GOM) and Mastery Monitoring (MM).
GOM assesses progress on a standard and equivalent measure the same way over time. It answers the question "Is the student becoming a better reader?" or "Is the student better at mathematics computation?" It is associated with gains in "important" outcomes or "big things."
MM assesses progress on ever-changing and different tests aligned with short-term instructional objectives or units. It answers the question "Did the student learn what I taught today (or this week)?" It is associated with instructional validity.
Most Curriculum-Based Measurement (CBM) tests are associated with GOM.
The ideal progress monitoring system is a combination of GOM and MM (illustrated in the sketch below).
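To make the GOM/MM contrast above concrete, here is a minimal sketch in Python. The probe names, scores, and the 80% criterion are invented for illustration; they are not data from this presentation or from AIMSweb.

```python
# Hypothetical sketch: how GOM and MM progress records differ in shape.

# GOM: the same kind of equivalent measure (here, an R-CBM oral reading probe scored as
# words read correctly per minute) is given repeatedly, so scores sit on one common scale.
gom_record = [
    {"week": 1, "measure": "R-CBM grade-level probe", "wcpm": 42},
    {"week": 2, "measure": "R-CBM grade-level probe", "wcpm": 45},
    {"week": 3, "measure": "R-CBM grade-level probe", "wcpm": 47},
]

# MM: the test changes with each unit of instruction, so each score answers
# "Did the student learn what was just taught?" against a mastery criterion.
mm_record = [
    {"unit": "Basic addition facts 0-12", "percent_correct": 85, "criterion": 80},
    {"unit": "2-digit addition w/o regrouping", "percent_correct": 90, "criterion": 80},
    {"unit": "2-digit subtraction w/o regrouping", "percent_correct": 70, "criterion": 80},
]

# GOM growth is read directly off the common scale; MM yields pass/fail judgments
# on ever-changing content that are harder to chain into one picture of progress.
print([entry["wcpm"] for entry in gom_record])                                   # [42, 45, 47]
print([entry["percent_correct"] >= entry["criterion"] for entry in mm_record])   # [True, True, False]
```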

13 My Assessment Training
[An interactive Poll Everywhere audience poll, "My Assessment Training," was displayed on this slide.]

14 Schools Are Looking for Swiss Army Knife of Tests
Tests That Can... Do EVERYTHING, With Little to No Teacher Time and Little Hassle. The Emphasis Is on Program Evaluation, Accountability, and Perhaps Screening, But Quality PM Is Not Their Strength!

15 Frequent Progress Monitoring (of a Particular Type) is One of Our Most Powerful Intervention Tools
"...effective across student age, treatment duration, frequency of measurement, and special needs status."
"Major message is for teachers to pay attention to the formative effects of their teaching as it is this attribute of seeking (my emphasis) formative evaluation...that makes for excellence in teaching" (p. 181).
Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses relating to achievement. New York, NY: Routledge.

16 Frequent (Formative) Progress Monitoring
And the Number 1 Most Powerful TEACHING Variable Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses relating to achievement. New York, NY: Routledge.

17 Some Basic Vocabulary to Support Understanding

18 General Outcome Measurement
GOM assesses progress on a standard and equivalent measure the same way over time. Think: Testing "small" to make statements about something "big" (i.e., very important)! Also often referred to as Long-Term Measurement (LTM).

19 Other Professions Are Highly Dependent on GOM
Medicine: Blood Pressure, Blood Glucose Levels
Business: Earnings per Share
Economy: Consumer Price Index, Unemployment Rate
The Key Concept Is an Empirically Validated "Indicator"

20 Curriculum-Based Measurement
Short, standardized basic skills measures validated as general outcome measures (GOM).
General reading skill or ability: R-CBM (oral reading); Maze (silent reading)
General mathematics skill or ability: M-COMP (general mathematics computation skills); M-CAP (general math concepts and applications skills)
General writing skill or ability: WE-CBM (general written expression skills)
General spelling skill or ability: S-CBM (general spelling skills)

21 A Reading General Outcome: A “Rich Task” Consistent with CCSS
It was a pretty good composition. I felt proud knowing it was the best one at my school. After I'd read it five times, I was impatient to start reading it out loud. I followed the book's directions again. First I read the composition out loud without trying to sound impressive, just to hear what the words sounded like.
(Speaker notes: How many words did Billy read correctly? Ask the audience. What is Billy's accuracy? Have the audience estimate. Actual accuracy is 73%, which is poor reading. Our judgments can get us in trouble; that's why we use objective tests.)
Billy, 4th Grader
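As a worked version of the accuracy arithmetic behind Billy's example, here is a minimal sketch. The word counts below are invented placeholders chosen to reproduce the 73% figure; they are not Billy's actual counts from the slide.

```python
# Hypothetical R-CBM scoring sketch: counts are placeholders, not Billy's actual numbers.
words_attempted = 100   # assumed: total words Billy attempted in the one-minute read
errors = 27             # assumed: words read incorrectly

words_read_correctly = words_attempted - errors        # 73
accuracy = words_read_correctly / words_attempted      # 0.73 -> 73%, poor accuracy

# With a standard one-minute probe, the R-CBM score is simply words read correctly.
print(f"Words read correctly: {words_read_correctly}, accuracy: {accuracy:.0%}")
```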

22 Questions I Can Answer At a Single Point in Time:
Is This Student a Good or Poor Reader, gauged normatively or with standards?

23 Questions I Can Answer Over Time:
Is This Student Improving in His General Reading Skill?
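One common way to answer this question under GOM is to chart scores from equivalent probes and look at the trend. A minimal sketch follows, with made-up weekly R-CBM scores and an ordinary least-squares slope standing in for whatever trend rule a particular system (e.g., AIMSweb) actually applies.

```python
# Hypothetical weekly R-CBM scores (words read correctly per minute) from equivalent probes.
weeks = [1, 2, 3, 4, 5, 6]
wcpm  = [38, 41, 40, 45, 47, 50]

# Ordinary least-squares slope: an estimate of the rate of improvement per week.
n = len(weeks)
mean_x = sum(weeks) / n
mean_y = sum(wcpm) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(weeks, wcpm)) / \
        sum((x - mean_x) ** 2 for x in weeks)

print(f"Estimated rate of improvement: {slope:.1f} words correct per minute per week")
```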

24 The Judgment is Empirical
Shinn, M.R., Good, R.H., Knutson, N., Tilly, W.D., & Collins, V. (1992). Curriculum-based measurement of oral reading fluency: A confirmatory analysis of its relation to reading. School Psychology Review, 21(3).

25 Points of Confusion with GOM
1. Short Tests can’t tell you anything. Tell that to your physician. They are short to reduce the amount of instructional time lost to testing. 2. Because something “little” is tested, this “thing” becomes the specific instructional target. It is not “oral reading fluency.” To Move the R-CBM “Dial,” We Need to Target a Variety of Reading Skills, Not Just Reading Speed CBM doesn’t include all the things we teach. It does not measure everything in reading, math, writing, etc. Less important for instructional planning and program evaluation and accountability

26 A Mathematics Computation General Outcome: A "Rich Task" Consistent with CCSS
All Problems of DIFFERENT TYPES: Row Addition, Column Addition, 2-Digit Addition w/o Regrouping, 2-Digit Subtraction w/o Regrouping

27 Mastery Monitoring
MM assesses progress on constantly changing and different tests that are closely tied to specific instructional content. Think: Testing "small" to make statements about something "small"! Also often referred to as Short-Term Measurement (STM). Examples: End-of-Unit Tests, Specific Skills Tests, Quizzes.

28 Mathematics Computation Mastery Monitoring:
Single-Skill Mathematics Computation Probe: Basic Addition Facts 0-12. All Problems of the SAME TYPE.

29 Questions I Can Answer in the Short Term: Is This Student Learning Multi-Digit Addition Skills?

30 Questions That Are More Difficult
In the Long Term: Is This Student Improving in Mathematics Computation?

31 Why Is This Question More Difficult?
1. It presumes the student has retained addition skills.
2. It assumes that addition skills must be taught before subtraction skills.
3. It assumes that the addition and subtraction skills tests are reliable and valid.
4. It assumes that the criterion for mastery (in this case 80%) has been validated.
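Here is a minimal sketch of the mastery decision this slide assumes, using a hypothetical 80% criterion and invented unit-test scores; it also hints at why chaining these short-term decisions does not by itself answer the long-term question.

```python
# Hypothetical Mastery Monitoring decisions against an assumed 80% criterion.
MASTERY_CRITERION = 0.80

unit_scores = {
    "basic addition facts 0-12": 0.90,
    "2-digit addition w/o regrouping": 0.85,
    "2-digit subtraction w/o regrouping": 0.75,
}

for skill, score in unit_scores.items():
    status = "mastered" if score >= MASTERY_CRITERION else "not yet mastered"
    print(f"{skill}: {score:.0%} -> {status}")

# A string of "mastered" flags only shows each unit was passed when it was taught.
# Retention of earlier skills and growth in general computation still have to be
# checked separately, e.g., by re-testing old skills or with a GOM measure such as M-COMP.
```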

32 GOM Assumptions, Advantages, and Disadvantages
Assumptions: An "Indicator" Has Been Established Empirically; Reliable and Valid Tests Have Been Created
Advantages: Curriculum Independent; Progress Monitoring Is Relatively Easy to Do (Logistically Feasible); Assessment for Retention and Generalization Built In; Confident Decisions About Progress
Disadvantages: Not Everything Students Need to Know Has a Validated Indicator (Currently Constrained to the Basic Skills); NOT Consistent with How Teachers "Think" About PM; Lacks Exhaustive Information for Diagnosis and Instructional Planning

33 MM Assumptions, Advantages, and Disadvantages
Assumptions: Validated Instructional Hierarchy; Reliable and Valid Tests Are Available for Each Unit, Objective, and Skill; Mastery Criteria Are Empirically Established
Advantages: High Instructional Validity; Lets Teachers Know if What They've Been Teaching Has Been Learned (at Least Initially); Consistent with How Teachers "Think" About PM; Tests Can Often Be Used Diagnostically
Disadvantages: Curriculum Dependent (Different Curricula Value Different Things, Teach Them in Different Orders, Etc.); Comparing Progress Within and Across Different Curricula Is Difficult; Doesn't Routinely Test for Retention and Generalization, So Students May Not Be Taught to Mastery; Logistically Complex, Even if Reliable and Valid Tests Have Been Created (Testing Is Always Changing, and If Students Are Taught to Criterion, It Can Be Overwhelming); Reliable Decisions About Progress Are Iffy

34 Standards for Evaluating General Outcome Measures

35 Standards for Evaluating Mastery Monitoring Measures

36 Comparison of Progress Monitoring Standards
GOM Standards: Alternate Forms; Sensitive to Student Improvement; Reliability of the Performance Level Score; Reliability of the Slope; Validity of the Performance Level Score; Predictive Validity of the Slope of Improvement; End-of-Year Benchmarks; Rates of Improvement Specified; Disaggregated Reliability and Validity Data
MM Standards: Skill Sequence Specified; Sensitive to Improvement; Reliability; Validity; Pass/Fail Criterion; Norms Disaggregated for Diverse Populations

37 Mark’s Bottom Line Suggestions
THINK PROGRESS: Progress Monitoring Is Vital, and We Have the Capacity to Do This Efficiently and Effectively in the Basic Skills. Frequent GOM Using CBM Is the "Best" Way to Do This. Let's Get It Done, Especially for At-Risk Students and Those with Severe Achievement Discrepancies.
THINK PERFORMANCE: MM Is Important, But Less So for Progress. Performance Is About What I Am Teaching, and If Students Don't Perform What I'm Teaching, Then No Learning Occurred.

38 Bottom Line
So...
Build Basic Skills PM Using CBM at Tier 1 As Long As You Need To.
Use More Frequent PM Using CBM at Tiers 2 and 3 As Long As You Have Students with Basic Skills Discrepancies--And In Most Schools, That's Through Grade 12.
Use Your Existing Assessments WITHIN THE CURRICULUM as Performance Assessment, Instructional Planning, and Supporting Evidence (Not Primary) of Progress.

39 Questions?

