Making MAP More Meaningful (Winter 2008)
David Dreher, Project Coordinator, Office of Accountability, Highline Public Schools


Overview
– Recap of predicting the 2008 WASL
– Examining the 2008 WASL predictions
– Moving forward to 2009

Recap: WERA Spring 2008 Presentation
Predictions released November 2007
– A “best guess” about each student’s performance on the upcoming WASL, based on prior MAP and/or WASL performance
– Intended uses: provide building staff with a level of risk for not meeting the WASL standard; school- and district-level 2008 WASL “forecasts”
– Theory: putting MAP scores in context with WASL scores will make MAP more meaningful.

Example of Projection and Prediction: 7th-Grade Student in Reading
[Table: MAP scores (Winter 2006, Spring 2007, Fall 2007), Highest MAP, Expected MAP Growth, WASL 2007 score, Projected Spring MAP, and the predicted 2008 WASL score and range under each prediction model: MAP and WASL, MAP only, WASL only]

WASL Prediction Range
Constructed using the SEM values reported in the 2001 WASL Technical Reports.
Predicted Range = Predicted WASL Score +/- SEM
[Table: SEM for Reading and SEM for Math by grade level]

Interpreting Predictions
If the prediction range is:
– Entirely below 400: the student has less than a 20% chance of meeting standard on the WASL this spring unless we accelerate their learning.
– Straddling 400: the student has basically a coin-flip chance on the WASL, even if their predicted score is above 400.
– Entirely above 400: the student has more than an 80% chance of meeting standard on the WASL in the spring, IF they continue to progress.
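The interpretation rule above can be sketched as a small classifier. This is a minimal sketch in Python, not the district's actual tooling: the function name and example values are illustrative, and the category labels reuse the Benchmark/Strategic/Intensive (BSI) groups that appear later in the deck.

```python
# A minimal sketch of the slide's interpretation rule. The +/- SEM range
# construction follows the "WASL Prediction Range" slide; function name,
# labels, and example values are illustrative assumptions.
CUT_SCORE = 400  # WASL "meets standard" cut score

def classify_prediction(predicted_score: float, sem: float) -> str:
    """Place a student's predicted WASL range relative to the cut score."""
    low, high = predicted_score - sem, predicted_score + sem
    if high < CUT_SCORE:
        return "intensive"   # range entirely below 400: <20% chance
    if low >= CUT_SCORE:
        return "benchmark"   # range entirely above 400: >80% chance
    return "strategic"       # range straddles 400: roughly a coin flip

print(classify_prediction(392, 5))  # -> intensive
print(classify_prediction(401, 4))  # -> strategic
print(classify_prediction(412, 6))  # -> benchmark
```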

NWEA’s MAP/WASL Alignment Study (Released January 2008)
Reading, Fall Testing Window
[Table: NWEA “Meets Standard” cut score and HPS Accountability “Strategic” RIT range by grade]

NWEA’s MAP/WASL Alignment Study (Released January 2008)
Mathematics, Fall Testing Window
[Table: NWEA “Meets Standard” cut score and “Strategic” RIT range by grade]

Spring 2008: WASL happened... Late Summer 2008: WASL results arrive!

Examining the Predictions
Reliability analysis
– Repeated the “Backward Look” analysis
– “Within Group Look”
Analysis of “exceptional” performances
– Predicted-level analysis

“Backward Look”: Math
% = Actual Met / Predicted to Meet
Predicted to Meet = predicted WASL score of 400 or better.
[Table: WASL 2008 and WASL 2007 “Backward Look” percentages by grade]

“Backward Look”: Reading
% = Actual Met / Predicted to Meet
Predicted to Meet = predicted WASL score of 400 or better.
[Table: WASL 2008 and WASL 2007 “Backward Look” percentages by grade]

“Within Group”: Math
% = Actual Met Within Group / Total Number In Group
[Table by grade: Benchmark (estimated >80%), Strategic (estimated ~50%), and Intensive (estimated <20%) percentages]

“Within Group”: Reading
% = Actual Met Within Group / Total Number In Group
[Table by grade: Benchmark (estimated >80%), Strategic (estimated ~50%), and Intensive (estimated <20%) percentages]
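The two reliability percentages defined on these slides can be illustrated with a short sketch. The student records below are made up, the variable names are hypothetical, and "predicted to meet" is simplified here to the benchmark group (the slides define it as a predicted score of 400 or better).

```python
from collections import defaultdict

# Illustrative computation of the two percentages from the slides, using
# made-up records of (BSI group, actually met standard).
students = [
    ("benchmark", True), ("benchmark", True), ("benchmark", False),
    ("strategic", True), ("strategic", False),
    ("intensive", False), ("intensive", False),
]

# "Within Group": % = Actual Met Within Group / Total Number In Group
by_group = defaultdict(list)
for group, met in students:
    by_group[group].append(met)
within_group = {g: sum(flags) / len(flags) for g, flags in by_group.items()}

# "Backward Look": % = Actual Met / Predicted to Meet. "Predicted to meet"
# is simplified here to membership in the benchmark group.
predicted_to_meet = [met for group, met in students if group == "benchmark"]
backward_look = sum(predicted_to_meet) / len(predicted_to_meet)

print(within_group)    # per-group met rates
print(backward_look)   # share of predicted-to-meet students who met
```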

Questions/Comments
– Procedures for making predictions
– Results of reliability analyses
– What about our theory behind doing this? (“Putting MAP scores in context with WASL scores will make MAP more meaningful.”)

“Meaningful”: Depends on Who You Ask
My experience talking with the people who work directly with the kids suggests that our ability to assess the risk level of their students does not impress them.
– “You don’t need a weatherman to know which way the wind blows.” – Bob Dylan

What Would Be More “Meaningful”?
Information that would help determine whether the things done to help kids pass the WASL actually worked.
Data users: principals, coaches, teachers
Data creators: Office of Accountability

But... I don’t really know what schools are doing to try to help kids pass the WASL. So... how can I find out?

“Exceptional” Performance Analysis
Please see handout.
Objective: start conversations that would increase the flow of information from data users back to us in Accountability.
What are your observations of the data?

Expectations for “Above Level”
– Students were receiving interventions designed to address skills/knowledge deficits
– Students were receiving interventions designed to familiarize them with the WASL format
– Students benefited from actions taken by the school to improve the WASL testing environment
– Your ideas?

Expectations for “Below Level”
– Students were ELL or SPED
– Students were chronically absent or highly mobile
– Students did not take the WASL seriously
– Your ideas?

Moving Forward
– Predictions simplified: use Benchmark/Strategic/Intensive (BSI) designations only
– Raise awareness and understanding of NWEA’s Alignment Study
– Increase understanding of NWEA goals and how to interpret goal-level results
– Investigate the possible use of MAP data in evaluating interventions, initiatives, and programs

Continue to Solicit Input
Data users: principals, coaches, teachers
Data creators: Office of Accountability

Contact Information
David Dreher, Project Coordinator, Office of Accountability, Highline Public Schools

What They Said...
[Table of school responses, grouped as Expected Responses (Supplement/Content, Supplement/Format, Test Environment, School/Class Environment) and Unexpected Responses (Perception/Understanding of MAP, Did Not Answer the Question), with marks for schools BEV, CED, DES, HAZ, MID, MOU, MCM, SHO, WCH, CAS, CHI, and GLO]

What is MAP?
Measures of Academic Progress
– Developed by the Northwest Evaluation Association
– Norm-referenced assessment
– Computerized and adaptive
– Performance is reported as a RIT score
The RIT scale
– Uses individual item difficulty values to estimate student achievement
– A RIT score has the same meaning regardless of grade level
– Equal-interval scale
Highline Public Schools
– Three testing windows per year (Fall, Winter, Spring)
– Tests students in math and reading
– Tests students in grades 3-10

The Needs of the Data User
Building staff were saying things like...
– “How can we use MAP data to help us make decisions?”
– “How do MAP and WASL performance compare?”
– “I want to know what a student’s history is with MAP.”
– “What is a RIT score?”
– “Giving me a RIT score is like telling me the temperature in Celsius!”

Making the Predictions
– Snooped the data to find the best indicators of WASL success
– Applied linear regression models to generate a predicted WASL score for each student
– Examined the predicted WASL scores

Projecting MAP to Spring
For the models with “Projected MAP” as one of the factors, each student’s Spring 2008 MAP performance was projected.
– The amount of expected growth added to a student’s highest MAP score came from NWEA’s Growth Study.
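The projection step is simple arithmetic: projected Spring MAP = highest observed MAP + expected growth. A one-line sketch (the function name is hypothetical, and the growth value in the example is a placeholder, not a figure from NWEA's Growth Study):

```python
# Sketch of the projection step described above. The expected-growth value
# used in the example is a placeholder, not a figure from NWEA's study.
def project_spring_map(highest_map: float, expected_growth: float) -> float:
    """Projected Spring MAP = highest observed MAP score + expected growth."""
    return highest_map + expected_growth

print(project_spring_map(215, 3))  # -> 218
```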

Snooping (Reading): R-Values vs. WASL 2007 Reading Scale Score
[Table of correlations by grade for: prior-year WASL Reading scores; MAP-R Spring, Winter, and Fall; High MAP-Read (F06, W07, S07); High MAP-Read + High MAP-Math; High MAP-Read + WASL 2006 Reading scale; High MAP-Read + WASL 2004 Reading scale (0.787)]

Snooping (Math): R-Values vs. WASL 2007 Math Scale Score
[Table of correlations by grade for: prior-year WASL Math scores; MAP-Math Spring, Winter, and Fall; High MAP-Math (F06, W07, S07); High MAP-Math + High MAP-Read; High MAP-Math + WASL 2006 Math scale; High MAP-Math + WASL 2004 Math scale (0.908)]

What We Learned by Snooping...
– Correlations were generally good.
  – Reading R-value range:
  – Math R-value range:
– Correlations in math were stronger than in reading.
– “Highest MAP” consistently correlated better than any single MAP score.
– Correlations were generally strongest when the Highest MAP and WASL 2006 factors were combined.

Regression Models
For students with both MAP and 2006 WASL scores (~95%):
  WASL 2007 = b0 + b1*Highest MAP + b2*WASL 2006
For students with only MAP score(s) (~3%):
  WASL 2007 = b0 + b1*Highest MAP
For students with only a 2006 WASL score (~2%):
  WASL 2007 = b0 + b1*WASL 2006
Where:
  Highest MAP = the student’s highest MAP score from the Fall 2006, Winter 2007, or Spring 2007 windows (typically Spring).
  WASL 2006 = the student’s raw score from the Spring 2006 WASL testing.
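The slides do not say how the b-coefficients were estimated; a plausible sketch is ordinary least squares. The example below assumes NumPy is available and uses five synthetic student records generated from known coefficients (b0 = 50, b1 = 1.0, b2 = 0.4), so the fit recovers them; it is an illustration of the model form, not the district's actual data or code.

```python
import numpy as np

# Hedged sketch of fitting WASL 2007 = b0 + b1*Highest MAP + b2*WASL 2006
# by ordinary least squares. All scores below are synthetic, generated
# from known coefficients so the fit recovers them exactly.
highest_map = np.array([205.0, 212.0, 198.0, 220.0, 215.0])
wasl_2006 = np.array([390.0, 405.0, 372.0, 420.0, 410.0])
wasl_2007 = 50.0 + 1.0 * highest_map + 0.4 * wasl_2006

# Design matrix: intercept column, Highest MAP, WASL 2006.
X = np.column_stack([np.ones_like(highest_map), highest_map, wasl_2006])
(b0, b1, b2), *_ = np.linalg.lstsq(X, wasl_2007, rcond=None)

print(round(b0, 3), round(b1, 3), round(b2, 3))  # recovers 50.0, 1.0, 0.4
```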

Prediction Models
For students with both MAP and 2007 WASL scores:
  WASL 2008 = b0 + b1*Projected MAP + b2*WASL 2007
For students with only MAP score(s):
  WASL 2008 = b0 + b1*Projected MAP
For students with only a 2007 WASL score:
  WASL 2008 = b0 + b1*WASL 2007
Where:
  Projected MAP = the projected Spring 2008 MAP score, based on the student’s highest MAP score from the Winter 2007, Spring 2007, or Fall 2007 windows.
  WASL 2007 = the student’s raw score from the Spring 2007 WASL testing.
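The three-way model choice (which equation applies depends on which scores a student has) can be sketched as a simple cascade. The coefficient values below are placeholders for illustration only; the real b-values came from the fitted regressions.

```python
# Sketch of the three-way model choice: pick the prediction equation based
# on which scores are available. Coefficients are illustrative placeholders.
def predict_wasl_2008(projected_map=None, wasl_2007=None):
    if projected_map is not None and wasl_2007 is not None:
        return 10.0 + 1.2 * projected_map + 0.5 * wasl_2007  # both scores
    if projected_map is not None:
        return 25.0 + 1.8 * projected_map                    # MAP only
    if wasl_2007 is not None:
        return 80.0 + 0.8 * wasl_2007                        # WASL only
    return None  # no prior data: no prediction made

print(predict_wasl_2008(projected_map=210, wasl_2007=400))
```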