Sensitivity of Teacher Value-Added Estimates to Student and Peer Control Variables
October 2013
Matthew Johnson, Stephen Lipscomb, Brian Gill


Value-Added Models (VAMs) Used Today Differ in Their Specifications (Slide 2)

Value-Added Model         | Student Characteristics | Peer Characteristics | Multiple Years of Prior Scores
Chicago Public Schools    | Yes                     | No                   | No
DC IMPACT                 | Yes                     | No                   | No
Florida                   | Yes                     | Yes                  | Yes
Pittsburgh Public Schools | Yes                     | Yes                  | No
SAS EVAAS                 | No                      | No                   | Yes

Research Questions (Slide 3)
• How sensitive are teacher VAM estimates to the choice of control variables?
  – Are estimates for teachers with more students from disadvantaged backgrounds affected by this choice?
• Does substituting teacher-year-level average student characteristics for classroom averages change teacher VAM estimates?
• Does allowing the relationship between current and lagged achievement to vary with student demographic characteristics matter for teacher VAM estimates?

Data (Slide 4)
• Data come from a northern state and a medium-sized urban district in that state
  – The district has more minority and low-income students than the state average
• Separate VAMs are estimated using state data and district data
  – More control variables are available in the district VAMs
  – For peer characteristics, the state VAMs use teacher-year-level averages and the district VAMs use classroom averages
• Each VAM uses three years of teaching data, from 2008–2009 through 2010–2011

Baseline Model (Slide 5)
• Explore sensitivity to several specifications:
  – Exclude peer average characteristics (X̄_it)
  – Exclude student characteristics (X_it) and peer characteristics (X̄_it)
  – Add scores from two years prior (Y_i,t-2)
  – Interact free/reduced-price lunch status with baseline scores
• Estimate all models on the same set of student observations
• Control for measurement error in prior test scores using an errors-in-variables approach
• Report empirical Bayes (shrinkage) adjusted estimates
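The core of such a baseline model is a lagged-score regression with teacher indicators, followed by empirical Bayes shrinkage of the teacher coefficients. The sketch below illustrates that idea on simulated data; the sample sizes, variances, and variable names are assumptions for illustration, not the authors' specification, and the errors-in-variables correction is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated roster: 50 teachers x 30 students (illustrative assumption).
n_teachers, n_per = 50, 30
teacher = np.repeat(np.arange(n_teachers), n_per)
true_effect = rng.normal(0.0, 0.15, n_teachers)
lagged = rng.normal(0.0, 1.0, n_teachers * n_per)   # prior-year score (Y_{i,t-1})
frl = rng.binomial(1, 0.4, n_teachers * n_per)      # student characteristic (X_{i,t})
score = (0.7 * lagged - 0.1 * frl + true_effect[teacher]
         + rng.normal(0.0, 0.5, n_teachers * n_per))

# Regress current score on the lagged score, FRL status, and one dummy per
# teacher; the dummy coefficients are the raw teacher effects.
dummies = (teacher[:, None] == np.arange(n_teachers)).astype(float)
X = np.column_stack([lagged, frl, dummies])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
raw_vam = beta[2:] - beta[2:].mean()                # effects relative to the mean

# Empirical Bayes shrinkage: scale each raw estimate by the ratio of signal
# variance to total variance, pulling noisy estimates toward zero.
resid = score - X @ beta
noise_var = resid.var() / n_per                     # sampling variance per teacher
signal_var = max(raw_vam.var() - noise_var, 0.0)
shrunk_vam = raw_vam * signal_var / (signal_var + noise_var)
```

The shrinkage factor is the same for every teacher here only because the simulated class sizes are equal; with unequal classes it would vary by teacher.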

Student and Peer Characteristics (Slide 6)

Variable                          | Student Controls (State) | Student Controls (District) | Peer Averages (State) | Peer Averages (District)
Free or Reduced-Price Meals       | x                        | x                           | x                     | x
Disability                        | x                        | x                           | x                     | x
Race/Ethnicity                    | x                        | x                           | x                     | x
Gender                            | x                        | x                           |                       |
English Language Learner          | x                        | x                           | x                     | x
Age/Behind Grade Level            |                          | x                           |                       | x
Gifted Program Participation      |                          | x                           |                       | x
Lagged Rate of Attendance         |                          | x                           |                       | x
Lagged Fraction of Year Suspended |                          | x                           |                       | x
Average Lagged Achievement        |                          |                             | x                     | x
SD of Lagged Achievement          |                          |                             | x                     | x
Number of Students in Class       |                          |                             |                       | x

Correlation of 8th-Grade State Teacher VAM Estimates Relative to Baseline Specification (Slide 7)

Specifications compared (Math N = 2,778; Reading N = 3,347):
• Exclude peer characteristics
• Exclude student and peer characteristics
• Add scores from t-2
• Add scores from t-2 and exclude student/peer characteristics
Baseline: student characteristics, peer characteristics, and prior scores from t-1. [Correlation values not shown]
Findings are based on VAM estimates from 2008–2009 to 2010–2011 on the same sample of students.

Percentage of 8th-Grade Reading Teachers in Effectiveness Quintiles, by VAM Specification (Slide 8)
[5×5 cross-tabulation: rows are baseline-model quintiles, 1st (lowest) through 5th (highest); columns are quintiles under the specification that removes student/peer controls and adds t-2 scores; cell percentages not shown]
Correlation with baseline = [not shown]
Findings are based on VAM estimates for 3,347 reading teachers in grade 8 from 2008–2009 to 2010–2011.
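A quintile cross-tabulation of this kind can be built by ranking teachers under each specification and counting how many land in each cell. The sketch below uses simulated estimates with an assumed correlation near 0.8; these are illustrative values, not the study's results.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical VAM estimates under two specifications (assumed correlation ~0.8).
n = 3000
baseline = rng.normal(size=n)
alternative = 0.8 * baseline + 0.6 * rng.normal(size=n)

def quintile(x):
    # 0 = lowest fifth, 4 = highest fifth.
    return np.searchsorted(np.quantile(x, [0.2, 0.4, 0.6, 0.8]), x)

q_base, q_alt = quintile(baseline), quintile(alternative)

# 5x5 table: percentage of teachers in each (baseline, alternative) cell.
table = np.zeros((5, 5))
for b, a in zip(q_base, q_alt):
    table[b, a] += 1
table = 100.0 * table / n

same_quintile = np.trace(table)   # percent of teachers on the diagonal
```

With independent rankings the diagonal would sum to about 20 percent, so the diagonal share is a simple measure of how much the two specifications agree.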

How Are Teachers in One District Affected? (Slide 9)
• The district has a relatively large fraction of poor and minority students

Percentile Rank of District Teachers in State Distribution
[Table, Math Grade 8 and Reading Grade 8 columns: district percentile rank, and state percentile rank under the baseline, exclude-peer-characteristics, and exclude-student-and-peer-characteristics specifications; values not shown]

Using Additional Controls in District Data (Slide 10)

Specifications compared (Math Grades 6-8, N = 164; Reading Grades 6-8, N = 215):
• Exclude peer characteristics
• Exclude student and peer characteristics
• Add scores from t-2
• Add scores from t-2 and exclude student/peer characteristics
Baseline: student characteristics, peer characteristics, and prior scores from t-1. [Correlation values not shown]
Findings are based on VAM estimates from 2008–2009 to 2010–2011 on the same sample of students.

Teacher-Year Average Student Characteristics vs. Classroom Average (Slide 11)
[Table, Math Grades 6-8 (N = 164) and Reading Grades 6-8 (N = 215) columns: correlation between effect estimates, average standard error (classroom), average standard error (teacher); values not shown]
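The distinction between the two averaging levels can be illustrated with a hypothetical roster: two teachers, each with two classrooms. All values here are simulated assumptions, and a true peer average would typically exclude the student's own value, whereas this sketch uses the simple group mean.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical roster: 2 teachers, 2 classrooms each, 25 students per class.
teacher = np.repeat([0, 0, 1, 1], 25)
classroom = np.repeat([0, 1, 2, 3], 25)
lagged = rng.normal(size=100)     # prior-year scores

def group_mean(values, groups):
    # Assign every student the mean of their group (a peer-average control).
    out = np.empty_like(values)
    for g in np.unique(groups):
        mask = groups == g
        out[mask] = values[mask].mean()
    return out

class_avg = group_mean(lagged, classroom)   # classroom-level control (district VAMs)
teacher_avg = group_mean(lagged, teacher)   # teacher-year-level control (state VAMs)
```

The teacher-year average pools all of a teacher's classrooms, so it varies less across students than the classroom average and cannot distinguish a teacher's easier and harder classes.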

Different Relationship Between Current and Prior Test Scores for FRL Students (Slide 12)
• Correlation of teacher effect estimates with the baseline model is above 0.99 for both subjects

Coefficient (SE)                          | Math Grade 8 (N = 2,778) | Reading Grade 8 (N = 3,347)
Non-FRL student, prior-year math score    | (0.002)                  | (0.003)
FRL student, prior-year math score        | (0.003)                  | (0.004)
Non-FRL student, prior-year reading score | (0.002)                  | (0.003)
FRL student, prior-year reading score     | (0.003)                  | (0.004)
[Point estimates not shown; standard errors in parentheses]
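Letting the lagged-score coefficient differ by FRL status amounts to adding an interaction term to the regression. The sketch below shows this on simulated data; the slopes of 0.75 and 0.70 are illustrative assumptions, not the study's estimates.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated students: FRL status, prior-year score, current score with a
# slightly weaker score persistence assumed for FRL students.
n = 5000
frl = rng.binomial(1, 0.5, n)
lagged = rng.normal(size=n)
score = (0.75 - 0.05 * frl) * lagged + rng.normal(0.0, 0.5, n)

# Interacted model: score ~ 1 + lagged + FRL + lagged:FRL.
X = np.column_stack([np.ones(n), lagged, frl, lagged * frl])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)

slope_nonfrl = beta[1]             # lagged-score slope for non-FRL students
slope_frl = beta[1] + beta[3]      # lagged-score slope for FRL students
```

If the interaction coefficient (beta[3]) is near zero, the pooled model is adequate, which is consistent with the above-0.99 correlation the slide reports.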

Conclusions (Slide 13)
• Teacher VAM estimates are highly correlated across specifications, including:
  – the choice of control variables
  – the use of teacher-year-level averages in place of classroom averages
  – the interaction between FRL status and prior scores
• The choice of control variables can nonetheless affect estimates for teachers of disadvantaged students

Context for Results (Slide 14)
• Other researchers have examined correlations in teacher effect estimates when different same-subject assessments are used as outcomes for teacher VAMs
• The highest correlations these authors found are:
  – Lockwood et al. (2007): 0.46
  – Sass (2008): 0.48
  – Corcoran et al. (2011): 0.62
  – Lipscomb et al. (2010): 0.61
  – Papay (2011): 0.54

For More Information (Slide 15)
Please contact:
– Matthew Johnson
– Stephen Lipscomb
– Brian Gill
Mathematica® is a registered trademark of Mathematica Policy Research.

Percentage of Grade 6-8 Reading Teachers in Effectiveness Quintiles, by VAM Specification (Slide 16)
[5×5 cross-tabulation: rows are baseline-model quintiles, 1st (lowest) through 5th (highest); columns are quintiles under the specification that removes student/peer controls and adds t-2 scores; cell percentages not shown]
Correlation with baseline = [not shown]
Findings are based on VAM estimates for 215 reading teachers in grades 6-8 from 2008–2009 to 2010–2011.