Using Growth Models to Improve the Quality of School Accountability Systems, October 22, 2010.

Presentation transcript:

Using Growth Models to Improve the Quality of School Accountability Systems, October 22, 2010

A state agency should state its reasons for incorporating growth into the accountability system:
- A school's ability to facilitate academic progress is a better indicator of performance than comparing students' performance to an established target
- Growth can account for a potentially negative relationship between status and growth
- Growth can account for the effects of student inputs on growth

Rationale for incorporating growth models into accountability systems
- Growth models assess the progress of every student along a performance continuum
- Growth models better assess the achievement gap: closing the gap implies differential growth rates, which in turn implies that instruction and levels of support would differ
- Growth models may provide teachers and schools with evidence of achievement gains that are not evident in status models
- Growth information can be used to evaluate programs
- Growth information can support classroom decision making
- Growth models can inform instruction (though state tests may not be sensitive enough for this purpose)

Status vs. Improvement
Status
- A snapshot of a subgroup's or school's level of proficiency compared with a target
- Cross-sectional (e.g., AYP)
Improvement
- Change over time between different groups of students (cohort growth)
- Less strict data requirements
- Safe harbor: did this year's students do better than last year's students?

Growth Models
- Measure progress by tracking the same group of students from one year to the next to see whether, on average, students made progress
- Students act as their own controls, in contrast to models that condition on student input information
- Account for the cumulative process of learning
- School-level growth is an aggregate of individual growth
Key issue: how much growth is enough?
- Targets can be set statewide or locally by stakeholders, or be implied by the model chosen
- Should the target be the same for every student?
- Typical/normal growth vs. expected/desired growth

Value Added
- Uses background characteristics, prior achievement, and other data as statistical controls to isolate particular teacher, program, or school effects
- VAM = the difference between actual and expected growth
- A school can show growth yet still have negative value added
- There is little evidence that VAM is valid at the teacher level

Transition Matrix
- Doesn't require a vertical scale
- Makes use of performance categories
- A student who scores proficient is making expected progress if he or she scores proficient again the next year
- Points are added or taken away for maintaining or changing categories (Delaware example)

Transition Matrix (Delaware example)

Year 1      | Year 2: Basic | Proficient | Advanced
Basic       |       2       |     6      |    4
Proficient  |       4       |     7      |    4
Advanced    |       3       |     1      |    2
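The scoring idea behind a value table like the one above can be sketched in a few lines; the categories match the slide, but the point values below are hypothetical illustrations, not Delaware's actual table:

```python
# Sketch of transition-matrix scoring: each (year 1, year 2) category pair
# maps to a point value. These point values are hypothetical illustrations,
# not Delaware's actual value table.
POINTS = {
    ("Basic", "Basic"): 0,            # stayed Basic: no credit
    ("Basic", "Proficient"): 2,       # moved up one level
    ("Basic", "Advanced"): 3,         # moved up two levels
    ("Proficient", "Basic"): -1,      # moved down: points taken away
    ("Proficient", "Proficient"): 1,  # maintained expected progress
    ("Proficient", "Advanced"): 2,
    ("Advanced", "Basic"): -2,
    ("Advanced", "Proficient"): -1,
    ("Advanced", "Advanced"): 1,
}

def school_transition_score(transitions):
    """Average points per student for a list of (year 1, year 2) pairs."""
    return sum(POINTS[t] for t in transitions) / len(transitions)
```

Note that no vertical scale is needed: only the category labels in the two years enter the computation.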

State systems can employ multiple models
- Growth models can provide substantially more information than status models, but systems benefit from considering multiple analyses of the same data and multiple sources of information
- Systems can simultaneously model improvement (changes in the performance of subsequent cohorts) and individual student growth

Types of Growth Measures: Difference (Gain) Score
- The student's score at the start is subtracted from the student's score at the end; differences are aggregated to obtain a group measure
- Requires the ability to link student records over time
- Requires a common scale
- Is the gain for the group higher or lower than average?
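A minimal sketch of the gain-score aggregation described above, assuming records have already been linked over time and both scores sit on a common scale:

```python
def mean_gain(records):
    """Aggregate gain score: mean of (end - start) over linked records.

    records: dict mapping student ID -> (start_score, end_score). Using the
    student ID as the key presumes each student's records have been matched
    across years; the subtraction presumes a common scale.
    """
    gains = [end - start for start, end in records.values()]
    return sum(gains) / len(gains)
```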

Types of Growth Measures: Growth in Relation to Performance Standards
- Take the difference between a student's current score and the score needed to meet the standard in a set number of years
- Divide by the number of years to get the necessary annual gain
- The student's annual gain can be compared to this target growth to see whether the student is on track to meet the standard (Colorado model)
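The on-track check above reduces to simple arithmetic; a sketch, with hypothetical score values:

```python
def annual_gain_needed(current, standard, years):
    """Annual gain required to reach the standard score in `years` years."""
    return (standard - current) / years

def on_track(current, standard, years, observed_annual_gain):
    """Is the student's observed annual gain at least the needed gain?"""
    return observed_annual_gain >= annual_gain_needed(current, standard, years)
```

For example, a student at 300 who must reach 360 in 3 years needs an annual gain of 20; an observed gain of 25 per year is on track, 15 is not.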

Residual Gain Scores
- Students' current scores are adjusted for their prior scores using regression
- Each student has a predicted score based on prior scores
- The difference between the actual and predicted score is the residual gain score
- Residual gain scores near zero indicate average growth
- Residual gain scores can be averaged to obtain a group growth measure
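A small sketch of residual gain scores using a single prior score and ordinary least squares; operational models typically use several prior scores and richer regressions:

```python
def residual_gains(prior, current):
    """Residual gain scores from a simple regression of current on prior.

    Fits current = a + b * prior by least squares, then returns
    actual - predicted for each student. Residuals near zero indicate
    roughly average growth given the prior score.
    """
    n = len(prior)
    mx, my = sum(prior) / n, sum(current) / n
    sxx = sum((x - mx) ** 2 for x in prior)
    sxy = sum((x - mx) * (y - my) for x, y in zip(prior, current))
    b = sxy / sxx          # regression slope
    a = my - b * mx        # intercept
    return [y - (a + b * x) for x, y in zip(prior, current)]
```

Averaging the returned residuals over a school's students gives the group growth measure; by construction the statewide average residual is zero.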

Linear Equating
- Student growth is defined as the student's actual score in year 2 minus the predicted score for year 2
- The predicted score for year 2 is the score in the year 2 distribution that corresponds to the student's location in the year 1 distribution
- Expected growth is defined as maintaining one's location in the distribution
- Heavily sample dependent
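Linear equating can be sketched by matching locations across the two years' distributions; this assumes "location" is measured in standard-deviation (z-score) units, which is what makes the method heavily sample dependent:

```python
import statistics

def linear_equating_growth(year1_scores, year2_scores, student):
    """Growth = actual year 2 score minus the year 2 score at the same
    z-score location the student held in the year 1 distribution.

    student: (year1_score, year2_score) for one student whose records link
    across the two score samples.
    """
    m1, s1 = statistics.mean(year1_scores), statistics.pstdev(year1_scores)
    m2, s2 = statistics.mean(year2_scores), statistics.pstdev(year2_scores)
    y1, y2 = student
    z = (y1 - m1) / s1            # location in the year 1 distribution
    predicted_y2 = m2 + z * s2    # same location in the year 2 distribution
    return y2 - predicted_y2      # positive = moved up in the distribution
```

A student who merely keeps pace with the sample gets growth of zero, even if the raw score rose; because the means and standard deviations come from the sample, the result changes whenever the sample does.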

Data Requirements
- A database of matched student records over time at the statewide level
- A common scale: differences in scores across grades are consistent and meaningful
- Tests are created with growth measurement in mind
- Content standards are aligned across grades

Confidence Intervals
- Take into account uncertainty in measurement: normal measurement error and sampling error
- Narrow: 1 standard error, about 68% certainty
- Wider intervals decrease the chance of incorrectly identifying a student or school as failing to meet a target, but increase the chance of saying growth has been achieved when it hasn't
- Multiple measures can reduce error
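The trade-off above can be made concrete with a simple band around a mean gain; the multiplier and the decision rule here are illustrative choices, not a prescribed policy:

```python
def growth_interval(mean_gain, standard_error, multiplier=1.0):
    """Band around a mean gain. multiplier=1.0 gives roughly a 68% band;
    ~1.96 gives roughly 95%. Widening the band makes it harder to flag a
    school as missing its target, but easier to credit growth that may not
    be real."""
    half = multiplier * standard_error
    return (mean_gain - half, mean_gain + half)

def met_target(mean_gain, standard_error, target, multiplier=1.0):
    """Credit growth if the target falls at or below the top of the band."""
    low, high = growth_interval(mean_gain, standard_error, multiplier)
    return target <= high
```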

Missing Data
Student mobility: movement across district boundaries
Options:
- Adjust the growth target
- Adjust the years to growth
- Exclude students: a random distribution of exclusions is fine, but lack of representation is a big problem, and precision suffers for subgroups
- Imputation
- Report the percentage of data missing

Including Alternate Tests
- The alignment of performance levels must be clear
- Tests should be on a common scale
- A transition matrix approach works well

Definitions of Expected Growth
- Transition matrix: maintaining performance level from year to year
- Linear equating: maintaining location in the distribution from year to year
- Difference in gain scores: set by stakeholders
- Residual gain scores: zero is the average gain

Timeframe for Growth
- Number of years to reach target proficiency
- Spacing of intermediate growth targets
- "Descriptions of normal growth that extend beyond the period of observed data are based upon assumptions that have a significant impact on the rate of growth" (p. 30)
- Project future growth by fewer years than the span of observed data
- Have we done projections?
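The projection guidance above (project fewer years ahead than the span of observed data) can be sketched as a simple linear extrapolation; the linear-growth assumption is doing real work here, exactly as the quoted caution warns:

```python
def project_score(observed, years_ahead):
    """Project a future score by extending the average observed annual gain.

    observed: one score per year, in order. Per the guidance above,
    years_ahead should be fewer than the number of observed years; beyond
    that, the linearity assumption dominates the result.
    """
    if years_ahead >= len(observed):
        raise ValueError("project fewer years than the observed data span")
    gains = [b - a for a, b in zip(observed, observed[1:])]
    annual = sum(gains) / len(gains)
    return observed[-1] + annual * years_ahead
```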

Including Students Above Proficient
- Extra credit for moving students above proficient

Reporting
At what levels?
- Student
- Subgroups
- School
Qualities to aim for:
- Accuracy (third-party auditing)
- Clarity
- Transparency/credibility
- Completeness

How to use growth data in accountability decisions
- As a replacement for status
- In addition to status
- As a replacement for safe harbor
- In addition to safe harbor
- In conjunction with status and safe harbor: if status is X and growth is Y, then the overall rating is Z

Examples
- Projection (p. 22)
- Labels (p. 23)
- Achievement gap (p. 25)
- Multilevel (p. 28)

Exploratory data analysis
- Plot average scores by time and by grade to assess the linearity of growth
- Consider the consequences of false positives and false negatives
- Watch for year-to-year fluctuations that may limit the accuracy of growth indicators

Short-term or long-term vision
To what extent are we able to produce models that support longitudinal, multivariate, nested analyses of group and individual performance, and that attempt to use all available data at once?