CORE Academic Growth Model: Step-By-Step

CORE Academic Growth Model: Step-By-Step (last updated 9.13.16)

CORE Academic Growth Model: Steps 1 and 2

Step 1: After Spring testing is complete, EA collects student data from the CORE Districts, and EA determines demographic and other adjustments.

Step 2: Each student gets a customized statistical prediction based on his or her characteristics. Illustrative example, starting from a student's Spring 2015 test score:

  +35  average growth
   -3  for Econ. Disadv.
   -4  for Disability
   +2  for EL Status
   +1  for Foster Status
   -1  for Homeless Status
   +2  for School Averages
  ___________
  +32  points of predicted growth during the year, giving the Spring 2016 predicted test score

Note: The specific adjustment numbers on this slide are for illustrative purposes; the actual adjustment amounts are calculated each year and for each grade/subject independently, and reflect the actual observed trends across the CORE districts.
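The Step 1-2 arithmetic can be sketched in code. This is an illustrative sketch only: the function name and the adjustment values simply mirror the example numbers on the slide, not EA's actual estimation procedure.

```python
# Illustrative sketch of the Step 1-2 prediction: a baseline growth amount
# plus per-characteristic adjustments. These values mirror the slide's
# example numbers; the real model re-estimates them each year, per
# grade/subject, from observed trends across the CORE districts.
AVERAGE_GROWTH = 35
ADJUSTMENTS = {
    "econ_disadvantaged": -3,
    "disability": -4,
    "el_status": +2,
    "foster_status": +1,
    "homeless_status": -1,
    "school_averages": +2,
}

def predicted_score(spring_2015_score, characteristics):
    """Return the predicted Spring 2016 score for one student.

    characteristics: the set of adjustment names that apply to the student.
    """
    growth = AVERAGE_GROWTH + sum(ADJUSTMENTS[c] for c in characteristics)
    return spring_2015_score + growth

# A student with all six illustrative characteristics grows +32 points,
# matching the slide's worked example.
print(predicted_score(2400, set(ADJUSTMENTS)))  # 2432
```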

CORE Academic Growth Model: Step 3

Determine whether each student exceeded or did not meet the prediction, and by how much, by comparing each student's actual Spring 2016 test score against the predicted score.

[main points on slide]
- On the left is a student who exceeded their prediction by 5 points (actual score above predicted score).
- On the right is a student who did not meet their prediction by 4 points (actual score below predicted score).
- The CORE Academic Growth Model takes into account the degree to which a student met or did not meet their prediction, rather than just a "yes" or "no" on whether it was met.

Note: "Prediction" is being used in the statistical sense of the word. These predictions are made after all outcome data is already received and processed, rather than as a forward-looking forecast about how a particular student might do on the next test. In reality, a "prediction" should be thought of as typical growth for students with a similar starting point and characteristics.

[Both charts plot the Spring 2015 test score against the Spring 2016 test score, showing predicted vs. actual points.]
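Step 3 reduces to a signed difference between each student's actual and predicted scores. A minimal sketch, with hypothetical score values chosen only to reproduce the slide's +5 and -4 examples:

```python
def prediction_difference(actual, predicted):
    """Signed points by which a student exceeded (+) or fell short of (-)
    the prediction; the model keeps this magnitude, not just a yes/no."""
    return actual - predicted

# The slide's two students, with hypothetical score values:
print(prediction_difference(2437, 2432))  # 5: exceeded prediction by 5 points
print(prediction_difference(2428, 2432))  # -4: did not meet prediction by 4 points
```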

CORE Academic Growth Model: Step 4

On average, did a school's students tend to exceed or not meet their predictions, and by how much?

  School A (average +3.25 points): above-average impact
  School B (average -1.25 points): below-average impact

[Charts show each student's difference from prediction, e.g. -7, -4, +4, +8, +7, -3, +2, +4, -6, +5, -6, -2, +3, +7, +6, -3.]

[main points on slide]
- On the left is a school where students on average tended to exceed their predictions by a wide margin: on average, by a little over 3 points on the test. This is interpreted as the school having an above-average impact on students' growth, meaning this school will have an SGP well above 50.
- On the right is a school where students on average tended not to meet their predictions, by a small amount: on average, they grew about one point slower on the test. This is interpreted as the school having a below-average impact on students' growth, meaning this school will have an SGP slightly below 50.
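Step 4 simply averages the student-level differences within each school. In this sketch the two difference lists are hypothetical, chosen only so they reproduce the slide's +3.25 and -1.25 school averages:

```python
def school_average_difference(differences):
    """Average of a school's student-level prediction differences.
    Positive: above-average impact; negative: below-average impact."""
    return sum(differences) / len(differences)

# Hypothetical student-level differences (not the slide's actual layout):
school_a = [+7, -3, +2, +4, +8, +5, +6, -3]
school_b = [-6, +5, -6, -2, +3, -7, -4, +7]

print(school_average_difference(school_a))  # 3.25  (above-average impact)
print(school_average_difference(school_b))  # -1.25 (below-average impact)
```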

CORE Academic Growth Model: Step 5

The growth result is converted to a 0-100 Student Growth Percentile (SGP), on a scale running from Slower Growth through Average Growth (50) to Faster Growth.

[Chart shows four hypothetical schools and each student's difference from prediction, e.g. -7, -3, -2, -3, +4, +4, +2, +2, -4, -4, -2, -2, -1, +3, +8, +2, along a 0-100 percentile axis.]

[main points on slide] These are four hypothetical example schools:
- In School 1, most students are growing quite a bit slower than similar students across CORE, which results in this school's SGP being low on the scale.
- In School 2, students are growing on average just under typical growth for similar students, which results in this school's SGP being slightly below 50.
- In School 3, students are growing on average a little faster than similar students across CORE, which results in this school's SGP being slightly above 50.
- In School 4, most students are growing far faster than similar students across CORE, which results in this school's SGP being high on the scale.

[NOTE: The next few slides overlap with content from the "Results Interpretation" deck; skip them if you plan to use that presentation too.]
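Step 5's conversion can be sketched as a percentile rank of each school's average difference among all schools in the comparison. The actual SGP calculation is a statistical model, so this is only a rough illustration, and the school averages below are hypothetical:

```python
def growth_percentile(school_avg, all_school_avgs):
    """Rough sketch: percentile rank (0-100) of one school's average
    growth difference among all schools being compared."""
    below = sum(1 for avg in all_school_avgs if avg < school_avg)
    return round(100 * below / len(all_school_avgs))

# Hypothetical average differences for four schools, slowest to fastest:
averages = [-3.75, -1.25, +0.75, +3.25]
print(growth_percentile(+3.25, averages))  # 75: faster growth than most schools
print(growth_percentile(-3.75, averages))  # 0: slower growth than all others
```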

Converting SGP to SQII Level

SGP on the 0-100 scale is converted to CORE's School Quality Improvement Index (SQII) level so that 10 percentile points fall within each level:

  Level 1:  0-9      Level 6:  50-59
  Level 2:  10-19    Level 7:  60-69
  Level 3:  20-29    Level 8:  70-79
  Level 4:  30-39    Level 9:  80-89
  Level 5:  40-49    Level 10: 90-100

For example, if a school's SGP result is 35, it falls within the 30th to 39th percentile, in Level 4. Color-coding of results is red for Levels 1-3, orange for Levels 4-7, and green for Levels 8-10.
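The SGP-to-level-and-color mapping described above is a straightforward bucketing, sketched here (the function names are illustrative, not CORE's):

```python
def sqii_level(sgp):
    """Map a 0-100 SGP to an SQII level 1-10, with 10 percentile points
    per level; an SGP of exactly 100 stays in Level 10."""
    return min(sgp // 10 + 1, 10)

def level_color(level):
    """Red for Levels 1-3, orange for Levels 4-7, green for Levels 8-10."""
    if level <= 3:
        return "red"
    if level <= 7:
        return "orange"
    return "green"

# The slide's example: an SGP of 35 falls in Level 4, color-coded orange.
print(sqii_level(35), level_color(sqii_level(35)))  # 4 orange
```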

Converting SGP to SQII Level: Examples

Here are three example results with their associated levels and color coding:
- SGP 07: Level 1 (red)
- SGP 57: Level 6 (orange)
- SGP 88: Level 9 (green)

[The slide repeats the Level 1-10 scale from the previous slide.]

Basic Results Interpretation

The SGP scale ranges from 0 to 100 (Slower Growth, Average Growth at 50, Faster Growth). Results can be provided overall across grade levels or for specific grade-level teams.

Example slide: "This grade-level team is producing..."
  Overall: 45
  6th grade: 55
  7th grade: 77
  8th grade: 30*

- Orange results mean the grade-level team is producing typical growth for their students.
- Green results mean the grade-level team is producing faster-than-average growth for their students.
- Red results mean the grade-level team is producing slower-than-average growth for their students; it DOES NOT mean that these students lost knowledge.
- These results already have context adjustments taken into account; they are "apples-to-apples" comparisons that account for students' starting points, English Learner status, etc.

*Does not mean these students lost knowledge.