Measuring growth in student performance on MCAS: The growth model

Overview
What is growth?
Why are we doing this?
How do we measure growth for students and groups?
What have we learned so far?
What will be available this fall?

What is growth?
MCAS shows how each student is achieving relative to state standards: Is John proficient in 6th grade mathematics? But John's scaled scores cannot be compared from year to year. Growth measures change in an individual student's performance over time: How much did John improve in mathematics from 5th grade to 6th grade? Did John improve more or less than his academic peers?
Key points: Because standards differ from grade to grade, so too do the relative difficulty of the MCAS and the way the test results are scaled. Scaled scores therefore cannot be meaningfully compared from year to year. The growth model allows for meaningful and valid comparisons over time. It works by comparing students to their "academic peers" (students with similar MCAS score histories).

Why measure growth?
A way to measure progress for students at all performance levels: a student can achieve at a low level but still improve relative to his academic peers, while another could achieve well but not improve much from year to year. Growth provides evidence of improvement even among those with low achievement, and gives high achieving students and schools something to strive for beyond proficiency.
Key points: Because the growth model compares students to others with similar MCAS score histories, students at all levels of achievement can show growth. Students with high achievement can show low growth; students with low achievement can show high growth.

Uses of growth data
Reconceptualizing performance: performance = achievement + growth
Identifying strengths and weaknesses in student performance beyond traditional achievement data
Targeting assistance
Conducting program evaluations
Eventually, making accountability decisions
Key points: Growth adds another dimension to student performance. Now we can determine both how students are achieving compared to grade-level standards and how students are changing from one year to the next compared to their "academic peers." This allows educators to identify strengths and weaknesses they couldn't previously detect, and makes it easier to target assistance to schools, programs, etc. Growth may eventually be factored into accountability decisions (this year it will be used in only one kind of accountability decision: determining Level 4 schools).

Student growth percentiles
Each student's rate of change is compared to that of other students with a similar test score history ("academic peers"), and the rate of change is expressed as a percentile. How much did John improve in mathematics from 5th grade to 6th grade, relative to his academic peers? If John improved more than 65 percent of his academic peers, his student growth percentile would be 65.
Key points: This is a verbal explanation of the student growth percentile measure; visual and numerical examples follow. Note that a "similar" test score history does not mean the exact same test score history. If asked: growth is calculated from the raw score, not the scaled score. Students with the same scaled scores may have had different raw scores.
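The basic idea can be sketched in a few lines of Python. This is an illustrative simplification with hypothetical peer scores: the operational model estimates conditional percentiles with quantile regression across the full score history, not a direct peer lookup.

```python
def student_growth_percentile(student_score, peer_scores):
    """Percentage of academic peers the student outscored.

    Illustrative simplification: the real model estimates conditional
    percentiles via quantile regression, not a direct peer comparison.
    """
    outscored = sum(1 for s in peer_scores if s < student_score)
    return round(100 * outscored / len(peer_scores))

# Hypothetical grade 7 scores for students whose grade 5-6 score
# history resembled John's:
peers = [228, 231, 235, 240, 242, 244, 246, 250, 252, 255]
print(student_growth_percentile(246, peers))  # outscored 6 of 10 peers -> 60
```

In practice the peer group is defined statewide for each subject, grade, and year, so the same raw gain can map to different percentiles in different years.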

Growth to grade 7: Three students (Gina)
[Chart: distribution of 2008 grade 7 ELA scaled scores for Gina's academic peers, banded by SGP range (1-19, 20-39, 40-59, 60-79, 80-99) and spanning the Warning/Failing through Advanced performance levels, 2006-2008. Gina's grade 7 score of 230 sits below 65% of her peers and above 35%.]
Key points: Gina scored 230 on the ELA MCAS in grade 5 in 2006, and 230 in grade 6 in 2007. We looked at all the students who scored similarly to her to see how they did in ELA in grade 7. Some of those students' scores dropped substantially, to Warning/Failing; some got all the way up to Advanced. The typical student (in the middle band of the distribution) improves a little. Note that this is real data, not projections: this is how students with this test score history actually performed in 2008. So what happened to Gina? In grade 7, Gina again scored 230. But look at the distribution of her academic peers: 65% of them scored higher than she did, and she outscored 35% of them. Therefore, Gina has a student growth percentile of 35. Without the growth model, a student scoring 230, 230, 230 would seem to be staying put, neither improving nor declining. With it, we can see that this student improved less than 65% of her peers.
If asked: The SGP ranges define the bands in the chart. The top band corresponds to the scaled scores of students who achieved 80th to 99th percentile growth, the next to those with 60th to 79th percentile growth, and so on. SGPs between 40 and 59 are typical.

Growth to grade 7: Three students (Harry)
[Chart: distribution of 2008 grade 7 ELA scores for Harry's academic peers, 2006-2008, spanning the Warning/Failing through Advanced performance levels. Harry's grade 7 score of 244 sits below 75% of his peers and above 25%.]
Key points: Harry scored 248 in grade 5 and 248 in grade 6. He is compared with all the students who demonstrated a similar pattern of scoring. In grade 7, he scores 244, which doesn't seem like a big dip in achievement. However, look at the distribution of his academic peers: 75% of them scored higher than he did, and he outscored only 25%. Therefore, Harry has a student growth percentile of 25.

Growth to grade 7: Three students (Ivy)
[Chart: distribution of 2008 grade 7 ELA scores for Ivy's academic peers, 2006-2008, spanning the Warning/Failing through Advanced performance levels. Ivy's grade 7 score of 226 sits above 92% of her peers; only 8% scored higher.]
Key points: Ivy scored 214 in grade 5 and 214 in grade 6. She is compared with all the students who demonstrated a similar pattern of scoring. In grade 7, she scores 226. While she is still in the Needs Improvement performance level, look at the distribution of her academic peers: she outscored 92% of them, and only 8% scored higher than she did. Therefore, Ivy has a student growth percentile of 92. Going from 214 to 226 seems like a modest improvement, but a growth percentile of 92 indicates that this student grew an unusually large amount between grades 6 and 7. Without the growth model, we would have known Ivy made a big improvement, but we could not have quantified how big it was.

Growth to grade 7: Three students (Gina, Harry, and Ivy)
[Chart: the three students' score trajectories, 2006-2008, plotted together against their peer distributions across the Warning/Failing through Advanced performance levels.]
Key points: Every student has the same opportunity to grow at the 99th percentile or the 1st. Notice that the trend in the 40-60 range points up slightly: the median student in grade 6 ELA tends to score a bit better in grade 7 ELA, whether they are in the Warning/Failing, Needs Improvement, Proficient, or Advanced category in grade 6.

Growth to grade 7: Three students (English language arts)

Student   Grade 5 (2006)   Grade 6 (2007)   Grade 7 (2008)   SGP (2008)
Gina      230              230              230              35
Harry     248              248              244              25
Ivy       214              214              226              92

Key points: This is a numerical representation of the graphical data we just showed. The student with the lowest absolute score actually demonstrated the most growth, and vice versa.

Interpreting student growth percentiles
Gina's SGP was 35. This means her growth in grade 7 was higher than that of 35 percent of her academic peers (and lower than that of 65 percent). Is that amount of growth typical?
[Chart: distribution of the percent of students by SGP, labeled lower growth, typical growth, and higher growth, with Gina's 35 marked.]
Key points: The growth measure is really just a number from 1 to 99, with higher being better. There are no definitions or criteria that tell us definitively how much growth is high or low; it is a matter of professional judgment. (This differs from MCAS performance levels, where professional educators have helped define what it means to be proficient on each test.) That said, our guidance is that growth percentiles between about 40 and 60 are "typical." Percentiles above 60 suggest high growth; percentiles below 40 suggest low growth.

Key concepts
Growth is distinct from achievement: a student can achieve at a low level but grow quickly, and vice versa.
Each student is compared only to his or her academic peers statewide (others with a similar test score history), not to all students statewide.
All students can potentially grow at the 1st or 99th percentile.
Growth is subject-, grade-, and year-specific: there are different academic peer groups for each subject, grade, and year, so the same change in scaled scores can yield different student growth percentiles.
The percentile is calculated on the change in achievement, not the absolute level, which differs from more familiar norm-referenced measures.
Additional points: This is another way of saying performance = achievement + growth. People often think that students scoring in the high Advanced category don't have as much "room to grow" and therefore can't show as much growth as lower performing students. That isn't true for this growth model; in fact, it's one of the model's advantages. Because growth is compared only among academic peers, all students can grow at the 1st or 99th percentile, which makes for a fairer comparison of growth up and down the achievement spectrum. Because this is a relative, normative comparison, the amount of growth a student shows depends on the overall state pattern on the test: if 2008's grade 6 class as a whole does better on the MCAS math test than 2007's grade 6 class did, then 2008's 6th graders will have to improve even faster to earn the same SGP as last year's 6th graders. Most norm-referenced measures describe the level of a variable: a baby born at 8 pounds is at the 69th percentile for weight. That is not what this measure shows. The analogy here would be: if a baby born at 8 pounds gains 1 pound in its first month, what percentile is that 1-pound gain among babies who started at 8 pounds?

Growth for groups
How do we report growth for groups of students (districts, schools, grades, subgroups, classrooms)? Two ways:
Median student growth percentile: the point at which half of the students in the group have a higher growth percentile and half have a lower one.
Growth distribution charts: the percentage of students in the group growing less than, similar to, or more than their academic peers.
Key points: Up to this point we've been talking about growth for individual students, but often we're more interested in growth for groups of students, like schools or grades. We have two measures of growth for groups: median SGP, a measure of central tendency, and growth distribution charts, which show the range of growth.

Median student growth percentile

Last name    SGP
Lennon       6
McCartney    12
Starr        21
Harrison     32
Jagger       34
Richards     47
Crosby       55   (median SGP for the 6th grade class)
Stills       61
Nash         63
Young        74
Joplin       81
Hendrix      88
Jones        95

Imagine that the students listed above are all the students in your 6th grade class, sorted from lowest to highest SGP. The point where 50% of students have a higher SGP and 50% have a lower SGP is the median.
Key points: A reminder on how to calculate medians: sort student SGPs from lowest to highest and find the middle student. More info: the class is all 1960s-era rockers; "Jones" is Davy Jones.
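The median for this hypothetical class can be computed directly; a minimal sketch using the thirteen SGP values from the slide:

```python
import statistics

# SGPs for the hypothetical 6th grade class from the slide,
# sorted from lowest to highest
sgps = [6, 12, 21, 32, 34, 47, 55, 61, 63, 74, 81, 88, 95]

# With 13 students, the median is the 7th student's SGP (Crosby, 55)
median_sgp = statistics.median(sgps)
print(median_sgp)  # 55
```

With an even number of students, `statistics.median` averages the two middle values, which is the usual convention for group medians.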

Using median student growth percentiles: growth by achievement for schools
[Scatter plot: achievement (percent proficient or advanced in ELA, 2009) plotted against growth (ELA median student growth percentile, 2009) for a random selection of about 200 schools. Quadrants: higher achieving/lower growing, higher achieving/higher growing, lower achieving/lower growing, lower achieving/higher growing.]
Key points: It's helpful to look at growth and achievement together using a scatter plot. There is a huge range of growth among schools at similar performance levels. For instance, among schools with about 30 to 35% proficient/advanced, one school's median student grew at the 23rd percentile, while another's grew at about the 70th. Note that about two-thirds of schools have median SGPs between 40 and 60 (the "typical" range).

Growth distribution charts
[Chart: growth distributions for the Madison and Monroe schools, bucketed into lower, typical, and higher growth, with each school's median marked and bars showing the percent of students in each bucket.]
Key points: Distribution charts tell you more than where the middle of the distribution is; they show how many students are growing at very high and very low rates. In a school growing typically, we'd expect about 20% of students in each growth quintile. With that in mind, compare the Madison and Monroe schools, two schools in the same district with the same grade configuration. The Madison school shows about typical growth: roughly 20% of students in each group. But the Monroe school shows much higher growth: 62% of Monroe students are in the top quintile for growth statewide. You'd expect only 20% of students in that category, so Monroe has 42 percentage points more students in the highest growth quintile than expected.
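Bucketing a group's SGPs into quintiles is straightforward; a sketch with made-up SGPs (the school data in the slide's chart are not fully reproduced here):

```python
def growth_distribution(sgps):
    """Percent of students in each SGP quintile:
    1-20, 21-40, 41-60, 61-80, 81-99.

    In a school growing typically, each bucket holds about 20% of students.
    """
    buckets = [0] * 5
    for sgp in sgps:
        # SGPs run 1-99; map 1-20 -> bucket 0, 21-40 -> bucket 1, etc.
        buckets[min((sgp - 1) // 20, 4)] += 1
    return [round(100 * b / len(sgps)) for b in buckets]

# Hypothetical school where most students show higher-than-typical growth
sgps = [15, 35, 45, 58, 62, 70, 75, 82, 85, 91]
print(growth_distribution(sgps))  # [10, 10, 20, 30, 30]
```

A distribution like this, with far more than 20% of students in the top buckets, is the pattern the slide attributes to the Monroe school.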

Rules of thumb
Typical student growth percentiles are between about 40 and 60 on most tests. Students or groups outside this range have higher or lower than typical growth. Differences of fewer than 10 SGP points are likely not educationally meaningful.
Key points: Don't make too much of small differences.
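These rules of thumb translate directly into code; a small sketch of both (the thresholds are the presentation's guidance, not statutory cutoffs):

```python
def interpret_sgp(sgp):
    """Rule-of-thumb interpretation from the presentation:
    roughly 40-60 is typical; below suggests lower, above suggests
    higher than typical growth."""
    if sgp < 40:
        return "lower than typical growth"
    if sgp <= 60:
        return "typical growth"
    return "higher than typical growth"

def meaningful_difference(sgp_a, sgp_b):
    """Differences under 10 SGP points are likely not
    educationally meaningful."""
    return abs(sgp_a - sgp_b) >= 10

print(interpret_sgp(35))              # Gina's SGP: lower than typical growth
print(meaningful_difference(47, 53))  # 6-point gap: likely noise
```

So Gina (35) and Harry (25) both show lower than typical growth, while Ivy (92) shows much higher than typical growth.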

Growth model pilot Tested data, reports, and materials with nine districts, April to July 2009 Community Day Charter School, Franklin, Lowell, Malden, Newton, Northampton, Sharon, Springfield, Winchendon Suggestions were incorporated into this fall’s statewide rollout Key points: From the pilot process we got a lot of insights into what growth looks like in districts, and we’ll share a few of those with you.

New insights: Growth vs. achievement
[Scatter plot: grades 4, 5, and 6 mathematics achievement vs. growth for all elementary schools in one district, with the same four achieving/growing quadrants as before.]
Key points: This graph shows all the elementary schools in a real Massachusetts school district. Looking only at the Y axis (MCAS performance), we'd think they were fairly low performing: their schools average about 40% proficient. But looking at growth, not a single one of their elementary schools had lower than typical growth (none with a median SGP below 40), and about half are growing at above-typical rates.

New insights: Impact of a new K-5 curriculum
Key points: These charts show growth distributions in three grades within one district. Grade 4's growth looks about typical, and grade 6's is higher than average. But what's going on in grade 5? Only 12 percent of its students are in the top quintile of growth. At the state level we could calculate the numbers, but we didn't know the story behind them. When we showed the data to the superintendent, he immediately knew what had happened: the district had implemented a new math curriculum that year, but rather than phasing it in, implemented it in all grades at once. The 5th graders therefore had only one year of the new curriculum, and it showed on the MCAS: they hadn't gained much over their fourth grade performance relative to others statewide, and you could see it in the growth data.

New insights: Changes in pilot districts
One district discovered that its median student grew at only the 15th percentile from grade 3 to grade 4; it is reconfiguring schools to avoid a building transition in grade 4.
One found that buildings with full-time math coaches had stronger growth than buildings with part-time coaches; it revised coaching jobs to ensure full-time coverage.
One implemented training on growth for all principals district-wide.
Key points: These are more examples of things pilot districts took away from their exploration of their growth data.

What data are available?
Grades 4 through 8, ELA and mathematics: 2008 and 2009.
Grade 10, ELA and mathematics (measuring the change from grade 8 to grade 10): 2009 only.
Key points: Aggregate data on all these measures for schools and districts are available on the state's Profiles website: http://profiles.doe.mass.edu/. School and district personnel have access to more detailed reports through the Data Warehouse. Grade 10 growth is only available for 2009 because we didn't have enough prior years of test data until then.

Next steps
Data were released to districts in the Data Warehouse on Oct. 2, with public release of aggregate data on Oct. 27. A web site, written materials, workshops, and other communications and professional development will help district staff understand and use the measure.

For more information
Technical questions (accessing data): ESEdatacollect@doe.mass.edu or 781-338-3282
Growth data interpretation questions: growth@doe.mass.edu
http://www.doe.mass.edu/mcas/growth/