Curriculum-Based Measurement and General Outcome Measurement (GOM) and Mastery Monitoring (MM)
Mark R. Shinn, Ph.D.
Professor, School Psychology Program
National Louis University, Skokie, IL
November 29, 2012
My Area of Expertise
- Editor and Contributor to 2 Major Texts on CBM
- 1 of 6 Members of the Technical Review Panel, National Center for Student Progress Monitoring, USDE/OSEP
- Author of More than 75 Refereed Journal Articles and Book Chapters on the Topics of CBM, Progress Monitoring, and Screening
Disclosure
- Mark R. Shinn, Ph.D. serves as a paid consultant for Pearson Assessment for their AIMSweb product, which provides CBM assessment materials and organizes and reports the information from 3 tiers, including RTI. He provides technical support and training.
- Mark R. Shinn, Ph.D. serves as a consultant for Cambium/Voyager/Sopris for their Vmath product, a remedial mathematics intervention, but has no financial interests. He helped them develop their progress monitoring system.
- Mark R. Shinn, Ph.D. serves as a consultant for McGraw-Hill Publishing for their Jamestown Reading Navigator (JRN) product and receives royalties. He helped them develop their progress monitoring system.
- Mark R. Shinn, Ph.D. serves as a member of the National Advisory Board for CORE (Consortium on Reaching Excellence) and receives a stipend for participation. He provides training and product development advice.
Background Reading on CBM and Decision Making in a Multi-Tiered Model/RtI
Espin, C.A., McMaster, K., Rose, S., & Wayman, M. (Eds.). (2012). A measure of success: The influence of Curriculum-Based Measurement on education. Minneapolis, MN: University of Minnesota Press.
Presentation Is Based on the Following White Paper
Available in PDF format and iBook format. A "glossy" and official Pearson version will be finished soon and sent to you.
Shinn, M.R. (2012). Measuring general outcomes: A critical component in scientific and practical progress monitoring practices. Minneapolis, MN: Pearson Assessment.
References on CBM, GOM, and MM
Deno, S.L. (1986). Formative evaluation of individual student programs: A new role for school psychologists. School Psychology Review, 15.
Espin, C.A., McMaster, K., Rose, S., & Wayman, M. (Eds.). (2012). A measure of success: The influence of Curriculum-Based Measurement on education. Minneapolis, MN: University of Minnesota Press.
Fuchs, L.S., & Deno, S.L. (1991). Paradigmatic distinctions between instructionally relevant measurement models. Exceptional Children, 57.
Fuchs, L.S., & Fuchs, D. (1999). Monitoring student progress toward the development of reading competence: A review of three forms of classroom-based assessment. School Psychology Review, 28.
Jenkins, J.R., & Fuchs, L.S. (2012). Curriculum-Based Measurement: The paradigm, history, and legacy. In C.A. Espin, K. McMaster, S. Rose & M. Wayman (Eds.), A measure of success: The influence of Curriculum-Based Measurement on education (pp. 7-23). Minneapolis, MN: University of Minnesota Press.
Shinn, M.R. (2012). Measuring general outcomes: A critical component in scientific and practical progress monitoring practices. Minneapolis, MN: Pearson Assessment.
Accessing Reading Materials
markshinn.org
- Click on the Downloads for Professionals icon
- Click on the Presentations and Handouts folder
- Click on the AIMSweb GOM and MM Webinar (Sponsored by Pearson) folder
A Personal Story: Approaching 60, I Needed to Get Healthier
What could I measure to gauge the effects of my efforts?
- I wanted to measure something important.
- I wanted it to be easy to do and not take a lot of time and $$.
- I wanted it to be easy for me to understand, as well as for my wife and kids.
The Answer Was Obvious: This Is General Outcome Measurement
Testing something "small" to make statements about something "big" (important)!
There Were Other Things I Could Measure
- Daily Calorie Targets
- Calories per Item Consumed
- Minutes of Daily Exercise
- Estimated Calories Burned from Exercise
- Inches Around Waist
- Miles per Day and per Week of Bike Riding
- Average Biking MPH
- Average Cadence While Riding
- Energy Watts Generated
These Things ALSO Were Important, But More Difficult to Measure, to Compare, and "Put Together" for a Picture of Progress. This Is Mastery Monitoring.
Big Ideas
- Educators typically have lots of opinions about assessment, and progress monitoring is no exception. However, few of us have sufficient training in assessment in general and progress monitoring in particular.
- (Yet) Frequent progress monitoring is one of the most powerful tools in educators' intervention toolbox and the single most powerful teaching variable that they can control!
- There are two "families" of progress monitoring tools: General Outcome Measurement (GOM) and Mastery Monitoring (MM).
- GOM assesses progress on a standard and equivalent measure the same way over time. It answers the question "Is the student becoming a better reader?" or "Is the student better at mathematics computation?" It is associated with gains in "important" outcomes or "big things."
- MM assesses progress on ever-changing and different tests aligned with short-term instructional objectives or units. It answers the question "Did the student learn what I taught today (or this week)?" It is associated with instructional validity.
- Most Curriculum-Based Measurement (CBM) tests are associated with GOM.
- The ideal progress monitoring system is a combination of GOM and MM.
My Assessment Training (audience poll)
Schools Are Looking for a Swiss Army Knife of Tests
Tests that can do EVERYTHING, with little to no teacher time and little hassle.
The emphasis is on program evaluation, accountability, and perhaps screening, but quality PM is not their strength!
Frequent Progress Monitoring (of a Particular Type) Is One of Our Most Powerful Intervention Tools
"...effective across student age, treatment duration, frequency of measurement, and special needs status"
"Major message is for teachers to pay attention to the formative effects of their teaching as it is this attribute of seeking (my emphasis) formative evaluation...that makes for excellence in teaching" (p. 181)
Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses relating to achievement. New York, NY: Routledge.
Frequent (Formative) Progress Monitoring Is the Number 1 Most Powerful TEACHING Variable
Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses relating to achievement. New York, NY: Routledge.
General Outcome Measurement
GOM assesses progress on a standard and equivalent measure the same way over time.
Think: Testing "small" to make statements about something "big" (i.e., very important)!
Also often referred to as Long-Term Measurement (LTM)
Other Professions Are Highly Dependent on GOM
- Medicine: Blood Pressure, Blood Glucose Levels
- Business: Earnings per Share
- Economy: Consumer Price Index, Unemployment Rate
The Key Concept is an Empirically Validated "Indicator"
Curriculum-Based Measurement
Short, standardized basic skills measures validated as general outcome measures (GOM).
General reading skill or ability:
- R-CBM: Oral reading
- Maze: Silent reading
General mathematics skill or ability:
- M-COMP: General mathematics computation skills
- M-CAP: General math concepts and application skills
General writing skill or ability:
- WE-CBM: General written expression skills
General spelling skill or ability:
- S-CBM: General spelling skills
A Reading General Outcome: A "Rich Task" Consistent with CCSS
Billy, 4th Grader:
"It was a pretty good composition. I felt proud knowing it was the best one at my school. After I'd read it five times, I was impatient to start reading it out loud. I followed the book's directions again. First I read the composition out loud without trying to sound impressive, just to hear what the words sounded like."
[Presenter notes: How many words did Billy read correctly? Ask the audience; count to be exact. What is Billy's accuracy? Have the audience estimate. Actual accuracy is 73%, which is poor reading. Our judgments can get us in trouble. That's why we use objective tests.]
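The accuracy figure quoted for Billy can be reproduced with simple arithmetic: accuracy is words read correctly divided by total words attempted (correct plus errors). A minimal sketch, using hypothetical counts chosen to illustrate the formula (the slide does not report Billy's actual word counts):

```python
def reading_accuracy(words_correct, errors):
    """Oral reading accuracy = words read correctly / total words attempted."""
    total_attempted = words_correct + errors
    return words_correct / total_attempted

# Hypothetical counts, not Billy's actual data: 44 words correct, 16 errors
accuracy = reading_accuracy(44, 16)
print(round(accuracy * 100))  # prints 73
```

A reader at roughly 73% accuracy misses about 1 word in 4, which is why the slide characterizes it as poor reading despite how the passage may sound to a listener.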
Questions I Can Answer
At a Single Point in Time: Is This Student a Good or Poor Reader, Gauged Normatively or with Standards?
Questions I Can Answer
Over Time: Is This Student Improving in His General Reading Skill?
The Judgment Is Empirical
Shinn, M.R., Good, R.H., Knutson, N., Tilly, W.D., & Collins, V. (1992). Curriculum-based reading fluency: A confirmatory analysis of its relation to reading. School Psychology Review, 21(3).
Points of Confusion with GOM
1. Short tests can't tell you anything.
Tell that to your physician. They are short to reduce the amount of instructional time lost to testing.
2. Because something "little" is tested, this "thing" becomes the specific instructional target.
It is not "oral reading fluency." To move the R-CBM "dial," we need to target a variety of reading skills, not just reading speed.
3. CBM doesn't include all the things we teach.
It does not measure everything in reading, math, writing, etc. This matters less for instructional planning and program evaluation and accountability.
A Mathematics Computation General Outcome: A "Rich Task" Consistent with CCSS
All Problems of DIFFERENT TYPES:
- Row Addition
- Column Addition
- 2-Digit Addition w/o Regrouping
- 2-Digit Subtraction w/o Regrouping
Mastery Monitoring
MM assesses progress on constantly changing tests that are closely tied to specific instructional content.
Think: Testing "small" to make statements about something "small"!
Also often referred to as Short-Term Measurement (STM)
Examples: End-of-Unit Tests, Specific Skills Tests, Quizzes
Mathematics Computation Mastery Monitoring: Single-Skill Mathematics Computation Probe
Basic Addition Facts 0-12
All Problems of the SAME TYPE
Questions I Can Answer
In the Short Term: Is This Student Learning Multi-Digit Addition Skills?
Questions That Are More Difficult
In the Long Term: Is This Student Improving in Mathematics Computation?
Why Is This Question More Difficult?
1. It presumes the student has retained addition skills.
2. It assumes that addition skills must be taught before subtraction skills.
3. It assumes that the addition and subtraction skills tests are reliable and valid.
4. It assumes that the criterion for mastery (in this case, 80%) has been validated.
GOM Assumptions, Advantages, and Disadvantages
- An "Indicator" Has Been Established Empirically
- Curriculum Independent
- Not Everything Students Need to Know Has a Validated Indicator; Currently Constrained to the Basic Skills
- Progress Monitoring is Relatively Easy to Do--Logistically Feasible
- NOT Consistent with How Teachers "Think" about PM
- Reliable and Valid Tests Have Been Created
- Lacks Exhaustive Information for Diagnosis and Instructional Planning
- Assessment for Retention and Generalization Built In
- Confident Decisions About Progress
MM Assumptions, Advantages, and Disadvantages
- Validated Instructional Hierarchy
- High Instructional Validity
- Lets Teachers Know if What They've Been Teaching Has Been Learned (at Least Initially)
- Curriculum Dependent--Different Curricula Value Different Things, Teach Them in Different Orders, Etc.; Comparing Progress Within and Across Different Curricula is Difficult
- Reliable and Valid Tests are Available for Each Unit, Objective, and Skill
- Consistent with How Teachers "Think" about PM
- Doesn't Routinely Test for Retention and Generalization--Therefore Students May Not Be Taught to Mastery
- Mastery Criteria are Empirically Established
- Tests Can Often Be Used Diagnostically
- Logistically Complex, Even if Reliable and Valid Tests Have Been Created; Testing is Always Changing and, If Students are Taught to Criterion, Can Be Overwhelming
- Reliable Decisions About Progress Are Sorta Iffy
Standards for Evaluating General Outcome Measures
Standards for Evaluating Mastery Monitoring Measures
Comparison of Progress Monitoring Standards
GOM Standards:
- Alternate Forms
- Sensitive to Student Improvement
- Reliability of the Performance Level Score
- Reliability of the Slope
- Validity of the Performance Level Score
- Predictive Validity of the Slope of Improvement
- End-of-Year Benchmarks
- Rates of Improvement Specified
- Norms Disaggregated for Diverse Populations
MM Standards:
- Skill Sequence Specified
- Sensitive to Improvement
- Reliability
- Validity
- Pass/Fail Criterion
- Disaggregated Reliability and Validity Data
Mark's Bottom Line Suggestions
THINK PROGRESS
- Progress Monitoring is Vital and We Have the Capacity to Do This Efficiently and Effectively--In the Basic Skills
- Frequent GOM Using CBM is the "Best" Way to Do This--Let's Get It Done, Especially for At-Risk Students and Those with Severe Achievement Discrepancies
THINK PERFORMANCE
- MM is Important--But Less So for Progress
- Performance is About What I am Teaching, and If Students Don't Perform What I'm Teaching, Then No Learning Occurred
Bottom Line
So...
- Build Basic Skills PM Using CBM at Tier 1 As Long As You Need To
- Use More Frequent PM Using CBM at Tiers 2 and 3 As Long As You Have Students with Basic Skills Discrepancies--And In Most Schools, That's Through Grade 12
- Use Your Existing Assessments WITHIN THE CURRICULUM as Performance Assessment, Instructional Planning, and Supporting Evidence (Not Primary) of Progress