Sensitivity of Teacher Value-Added Estimates to Student and Peer Control Variables. March 2012 Presentation to the Association of Education Finance and Policy Conference.

Presentation transcript:

Sensitivity of Teacher Value-Added Estimates to Student and Peer Control Variables
March 2012 Presentation to the Association of Education Finance and Policy Conference
Matt Johnson, Stephen Lipscomb, Brian Gill

VAMs Used Today Differ in Their Specifications (slide 2)

Value-Added Model      Student Characteristics   Classroom Characteristics   Multiple Years of Prior Scores
Colorado Growth Model  No                        No                          Yes
DC IMPACT              Yes                       No                          No
Florida                Yes                       Yes                         Yes
New York City          Yes                       Yes                         No
SAS EVAAS              No                        No                          Yes

Research Questions (slide 3)
• How sensitive are teacher value-added model (VAM) estimates to changes in the model specification?
  – Student characteristics
  – Classroom characteristics
  – Multiple years of prior scores
• How sensitive are estimates to the loss of students from the sample due to missing prior scores?

Preview of Main Results (slide 4)
• Teacher value-added estimates are not highly sensitive to the inclusion of:
  – Student characteristics (correlation ≥ 0.990)
  – Multiple years of prior scores (correlation ≥ 0.987)
• Estimates are more sensitive to the inclusion of classroom characteristics (correlations up to 0.955)
• Estimates are not very sensitive to dropping students with missing prior test scores from the sample (correlation = 0.992)
  – Precision increases when two prior scores are used, but fewer teacher VAM estimates are produced

Baseline Model (slide 5)
• Explore sensitivity to several specifications:
  – Exclude the score from two years prior (Y_{i,t-2})
  – Exclude student characteristics (X_{i,t})
  – Include class average characteristics
• Student data come from a medium-sized urban district for the 2008–2009 to 2010–2011 school years
• All models are run on the same set of student observations
• The prior score is instrumented with the opposite-subject prior score to control for measurement error
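To make the structure of the baseline model concrete, here is a minimal schematic of a covariate-adjustment value-added specification of the kind described above; the functional form and symbols are illustrative assumptions, not the authors' exact equation.

```latex
% Illustrative baseline specification (assumed form, not taken from the presentation):
% achievement in year t regressed on one- and two-year prior scores, student
% characteristics, and teacher indicators whose coefficients are the value-added estimates.
\begin{equation*}
  A_{i,t} = \lambda_{1} A_{i,t-1} + \lambda_{2} A_{i,t-2}
          + X_{i,t}\beta + \sum_{j} \tau_{j} T_{ij,t} + \varepsilon_{i,t}
\end{equation*}
% A_{i,t}: test score of student i in year t;  X_{i,t}: student characteristics;
% T_{ij,t}: indicator that student i is taught by teacher j in year t;  \tau_{j}: teacher j's value added.
% Because prior scores are measured with error, the prior score in the opposite subject
% serves as an instrument for the same-subject prior score, as the slide notes.
```

In the class-characteristics variant, class averages of the student covariates and of prior achievement would be appended to the right-hand side; in the single-lag variant, the A_{i,t-2} term is dropped.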

Student and Class Characteristics (slide 6)
Student level: Free or Reduced-Price Meals; Disability; Gifted Program Participation; Lagged Rate of Attendance; Lagged Fraction of Year Suspended; Race/Ethnicity; Gender; Age/Behind Grade Level
Class level: Average Prior Achievement in Same Subject; Standard Deviation of Lagged Achievement; Number of Students in Classroom
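As an illustration of how the class-level covariates listed above could be constructed from a student-level file, here is a brief pandas sketch; the DataFrame, column names, and values are hypothetical, not taken from the district data used in the presentation.

```python
import pandas as pd

# Hypothetical student-level records; column names are illustrative only.
students = pd.DataFrame({
    "student_id":  [1, 2, 3, 4, 5, 6],
    "class_id":    ["A", "A", "A", "B", "B", "B"],
    "prior_score": [0.2, -0.5, 1.1, 0.0, 0.3, -1.2],  # standardized prior-year score
})

# Class-level aggregates analogous to the slide's class characteristics:
# mean prior achievement, spread of prior achievement, and class size.
class_vars = (
    students.groupby("class_id")["prior_score"]
    .agg(class_mean_prior="mean", class_sd_prior="std", class_size="count")
    .reset_index()
)

# Merge back so each student row carries its class-level covariates.
analysis = students.merge(class_vars, on="class_id")
print(analysis)
```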

Correlation of 6th-Grade Teacher Estimates Relative to Baseline VAM Specification (slide 7)
Subjects: Math (n = 87) and Reading (n = 99)
Alternative specifications compared with the baseline (student characteristics and prior scores from t-1 and t-2):
• Exclude student characteristics
• Exclude prior score from t-2
• Exclude student characteristics and prior score from t-2
• Add class average variables
Findings are based on VAM estimates from 2008–2009 to 2010–2011 on the same sample of students.
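The statistic reported in this slide, and in the preview on slide 4, is simply the correlation between two vectors of teacher estimates. A short sketch of that calculation is shown below, using randomly generated stand-in estimates rather than the actual VAM output.

```python
import numpy as np

# Hypothetical value-added estimates for the same 99 reading teachers under two
# specifications; in practice these would come from the fitted VAM regressions.
rng = np.random.default_rng(0)
baseline = rng.normal(size=99)
alternative = baseline + rng.normal(scale=0.1, size=99)  # stand-in for an alternative specification

# Pearson correlation between the two sets of estimates, the statistic the slide reports.
corr = np.corrcoef(baseline, alternative)[0, 1]
print(f"correlation with baseline: {corr:.3f}")
```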

Percentage of 6th-Grade Reading Teachers in Effectiveness Quintiles, by VAM Specification (slide 8)
Rows: baseline-model quintiles, 1st (lowest) through 5th (highest). Columns: quintiles when student characteristics are excluded, 1st (lowest) through 5th (highest).
Findings are based on VAM estimates for 99 reading teachers in grade 6 from 2008–2009 to 2010–2011 for a medium-sized, urban district. Correlation with baseline =

Percentage of 6th-Grade Reading Teachers in Effectiveness Quintiles, by VAM Specification (slide 9)
Rows: baseline-model quintiles, 1st (lowest) through 5th (highest). Columns: quintiles under the baseline plus class average characteristics, 1st (lowest) through 5th (highest).
Findings are based on VAM estimates for 99 reading teachers in grade 6 from 2008–2009 to 2010–2011 for a medium-sized, urban district. Correlation with baseline =
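The quintile tables on slides 8 and 9 can be reproduced mechanically once two vectors of teacher estimates are in hand: assign each teacher to an effectiveness quintile under each specification and cross-tabulate. The sketch below illustrates this with randomly generated stand-in estimates; nothing here corresponds to the district's actual figures.

```python
import numpy as np
import pandas as pd

# Hypothetical estimates for 99 teachers under the baseline and one alternative
# specification, standing in for the comparisons on slides 8 and 9.
rng = np.random.default_rng(1)
baseline = rng.normal(size=99)
alternative = baseline + rng.normal(scale=0.2, size=99)

labels = ["1st (lowest)", "2nd", "3rd", "4th", "5th (highest)"]
q_base = pd.qcut(baseline, 5, labels=labels)
q_alt = pd.qcut(alternative, 5, labels=labels)

# Cross-tabulation: percentage of teachers in each (baseline quintile, alternative
# quintile) cell, normalized within baseline rows as in the slides.
table = pd.crosstab(q_base, q_alt, normalize="index") * 100
print(table.round(1))
```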

One or Two Years of Prior Scores? (slide 10)
• Benefits of including two prior years:
  – More accurate measure of student ability
  – Increase in the precision of estimates
• Costs of using two prior years:
  – Students with missing prior scores are dropped
  – Some teachers are dropped from the sample
• What is the relative magnitude of the costs and benefits?

One or Two Years of Prior Scores? (slide 11)
• Estimate two VAMs using one year of prior scores:
  – The first VAM includes all students
  – The second VAM restricts the sample to students with a nonmissing second prior year of scores
• Correlation between teacher estimates: 0.992
• Percentage of students dropped: 6.2
• Percentage of teachers dropped: 3.9
• Net increase in precision from using two prior years:
  – Average standard error of estimates increases by 2.3% when students with missing scores are dropped
  – Average standard error of estimates decreases by 7.6% when the second year of prior scores is added
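The comparison described on this slide can be sketched as follows: fit the one-prior-score model on all students and on the restricted sample, fit the two-prior-score model on the restricted sample, then compare the teacher estimates and their average standard errors. The code below is a simplified illustration with simulated data and ordinary least squares; it omits the student characteristics and the instrumental-variables correction used in the actual models, and none of its numbers correspond to the district results.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated student-level data; all values are illustrative.
rng = np.random.default_rng(2)
n = 3000
df = pd.DataFrame({
    "teacher": rng.integers(0, 60, size=n).astype(str),
    "prior1": rng.normal(size=n),
})
df["prior2"] = 0.7 * df["prior1"] + rng.normal(scale=0.7, size=n)
df.loc[rng.random(n) < 0.06, "prior2"] = np.nan  # roughly 6% missing a second prior score
df["score"] = 0.6 * df["prior1"] + 0.2 * df["prior2"].fillna(0) + rng.normal(size=n)

def teacher_effects(model):
    """Return point estimates and standard errors of the teacher indicator coefficients."""
    keep = model.params.index.str.startswith("C(teacher)")
    return model.params[keep], model.bse[keep]

# VAM A: one prior score, all students.
m_all = smf.ols("score ~ prior1 + C(teacher)", data=df).fit()
# VAM B: one prior score, restricted to students with a nonmissing second prior score.
restricted = df.dropna(subset=["prior2"])
m_restr = smf.ols("score ~ prior1 + C(teacher)", data=restricted).fit()
# VAM C: two prior scores on the restricted sample.
m_two = smf.ols("score ~ prior1 + prior2 + C(teacher)", data=restricted).fit()

est_all, se_all = teacher_effects(m_all)
est_restr, se_restr = teacher_effects(m_restr)
_, se_two = teacher_effects(m_two)

print("correlation, all vs. restricted sample:", np.corrcoef(est_all, est_restr)[0, 1].round(3))
print("change in avg SE from dropping students: "
      f"{(se_restr.mean() / se_all.mean() - 1) * 100:.1f}%")
print("change in avg SE from adding second prior score: "
      f"{(se_two.mean() / se_restr.mean() - 1) * 100:.1f}%")
```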

For More Information (slide 12)
• Please contact:
  – Matt Johnson
  – Stephen Lipscomb
  – Brian Gill
Mathematica® is a registered trademark of Mathematica Policy Research.