Educational Analytics

RealVAMS: Getting Real-World Value from Value-Added Models
JSM 2016
Jennifer Broatch, Ph.D. (jennifer.broatch@asu.edu), Assistant Professor of Statistics, Arizona State University at the West Campus
Jennifer Green, Ph.D., Assistant Professor of Statistics, Montana State University

Overview
- Introduce the RealVAMS model for educational analytics
- Interpret the results of an example and discuss implications for the evaluation of other programs
- Compare/contrast RealVAMS estimates with a standard VAM

What is a Value-Added Model (VAM)?
- A VAM typically estimates the effects of educational factors (professional development programs, teachers, schools, districts, etc.) on student learning while controlling for prior achievement.
- VAM extensions:
  - Estimate the effect of offensive and defensive ability on team performance (winning)
  - Estimate the relative contribution of an employee to company performance measures

VAM Estimates
- Estimates are “indirect” or “latent” effects.
- “Teacher effects”: the estimated effects of educational factors (professional development programs, teachers, schools, districts, etc.) on student learning while controlling for prior achievement
- Differences at the classroom level not explained by other terms in the VAM (unexplained classroom-level heterogeneity)
- Typically estimated relative to other subjects (teachers) in the model

Estimating “Teacher” Effects
- Non-parametric Standardized Gain Method (Reback, 2008): estimates the teacher effect as the mean gain of a teacher’s students
- Student Growth Percentiles (Betebenner, 2009): estimates the teacher effect as the median quantile growth of a teacher’s students
- Education Value-Added Assessment System (EVAAS; Sanders et al., 1997): a mixed model in which teacher effects persist undiminished over time
- Generalized Persistence (Mariano, McCaffrey, & Lockwood, 2010): a mixed model that allows for non-equated responses and a different teacher effect for each year
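The simplest of the approaches above, the standardized gain method, can be sketched in a few lines. This is a minimal illustration, not code from any of the cited papers or packages; the record fields (`pre`, `post`, `teacher`) are hypothetical names.

```python
from statistics import mean, stdev

def standardized_gain_effects(records):
    """Sketch of a standardized gain estimate: each teacher's effect
    is the mean standardized (z-scored) gain of that teacher's students.
    Field names are illustrative, not from any cited implementation."""
    gains = [r["post"] - r["pre"] for r in records]
    mu, sd = mean(gains), stdev(gains)
    by_teacher = {}
    for r, g in zip(records, gains):
        by_teacher.setdefault(r["teacher"], []).append((g - mu) / sd)
    return {t: mean(z) for t, z in by_teacher.items()}
```

A teacher whose students gain more than the overall average gets a positive effect, and one whose students gain less gets a negative effect; effects are relative to the other teachers in the data, as noted above.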

Potential Limitations
- VAMs typically rely on:
  - continuous, interval-scale data
  - vertically scaled/equated standardized tests, comparable over different forms, ability levels, and time
- VAMs typically measure one effect per teacher:
  - a single grade or time point
  - an individual subject (e.g., Science, Math, Reading)
  - ignoring potential relationship(s) between a teacher’s effects on different outcomes

Motivation for an Improved VAM
- High-quality data are often hard to obtain: many projects rely on data that may not meet the requirements of a typical VAM.
- Educational programs often define “program impact” broadly: they desire a “holistic” picture of student success, which cannot be captured by achievement scores alone.

What is the RealVAMS model? A multidimensional value-added model that…
- Accommodates multiple types of student outcomes:
  - continuous, non-equated test scores from multiple subjects and testing instruments (e.g., ACT, SAT)
  - dichotomous (yes/no) categorical responses (e.g., college entry)
- Simultaneously estimates multiple effects for a single teacher:
  - a separate teacher effect for each response
  - the relationship(s) between a teacher’s effects on different responses

RealVAMS: What we want…
- The teacher “effect” on student achievement as measured by a “typical” standardized test
- The teacher “effect” on a real-world outcome, specifically whether a student attended college

RealVAMS Model Illustration…

RealVAMS Model
The RealVAMS model for the ith student is

    y_i = X_i β + S_i γ + ε_i

where
- y_i is the vector of the t responses for student i
- β is the vector of fixed-effect coefficients (teacher and/or student covariates)
- S_i indicates which teachers instruct student i
- γ is the multivariate latent “teacher effect”
- ε_i is the random error term
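The structure of this mixed model can be illustrated with a small simulation. This is a sketch of the data-generating process y_i = X_i β + S_i γ + ε_i, not the RealVAMS estimator itself; all names and parameter values below are made up for illustration.

```python
import random

random.seed(0)

# Illustrative simulation of the mixed-model structure
# y_i = X_i*beta + S_i*gamma + eps_i (not the RealVAMS estimator).
teachers = ["t1", "t2", "t3"]
gamma = {"t1": 2.0, "t2": 0.0, "t3": -2.0}  # latent teacher effects
beta = 0.8                                   # coefficient on prior score

students = []
for _ in range(300):
    prior = random.gauss(50, 10)             # X_i: prior achievement
    t = random.choice(teachers)              # S_i: which teacher instructs i
    y = beta * prior + gamma[t] + random.gauss(0, 1)  # eps_i ~ N(0, 1)
    students.append((prior, t, y))

# Naive check: after removing the prior-achievement component,
# classroom means should roughly recover the latent teacher effects.
resid = {t: [] for t in teachers}
for prior, t, y in students:
    resid[t].append(y - beta * prior)
est = {t: sum(v) / len(v) for t, v in resid.items()}
```

In the simulation the classroom-level residual means recover the ordering of the true γ values; in practice β and γ are of course estimated jointly rather than assumed known.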

RealVAMS R Package
- Available on CRAN: https://cran.r-project.org/web/packages/RealVAMS/index.html
- Developed to meet the needs of the RealVAMS model:
  - handles the computational issues associated with including the relationship between the teacher effects
  - handles the computational issues associated with including a dichotomous response
- Open source and accessible!

Example: Data from a Large Public School
- 11th-grade students who took the required state assessment
- 912 students, 86 teachers
- Responses:
  - scale score on the state assessment
  - college entry (Y/N)
- Covariate information:
  - student gender
  - student ethnicity
  - Math/Reading PLAN scale scores
  - gifted status (yes=1/no=0)
  - special education status (yes=1/no=0)
  - ELL status (yes=1/no=0)

Example RealVAMS Estimates

Example: Estimated Covariance

Student covariance:

    R_i = [ 602.00   3.62 ]
          [   3.62   1.00 ]

Teacher covariance:

    G_i = [ 64.75   3.38* ]
          [  3.38*  0.07  ]

* The off-diagonal is significantly different from 0 (p = .0007).

Example: Finding the Estimated Correlation

Teacher covariance:

    G_i = [ 64.75  3.38 ]
          [  3.38  0.07 ]

Teacher effect correlation:

    r_G = 3.38 / √(64.75 × 0.07) = 0.79

Example: Teacher Effects
r_G = 0.79: the correlation between teacher effects on the score response (student achievement on a standardized test) and the outcome response (college entry)

Example: Teacher Effects

                       All Teachers   Highlighted Teacher
  % of College Entry       70.9%             76.0%
  State Assessment        121.90             55.20

Evaluation Comparison: Why not just use a simpler model?
- A simpler model completely ignores/excludes the other “real-world” responses.
- How do the RealVAMS estimates for the standard response compare with a common model such as EVAAS?

Comparison: Single Response (EVAAS) vs. Multidimensional (RealVAMS)
* If independence were assumed in the RealVAMS model, r = 1.

Interpretation and Discussion
A comparison of the teacher rank differences shows a moderate positive relationship (r = 0.50).
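A rank comparison of two models' teacher estimates can be computed with a Spearman rank correlation. This is a generic stdlib sketch (assuming no tied estimates, for brevity), not the analysis code behind the r = 0.50 result:

```python
def spearman(xs, ys):
    """Spearman rank correlation between two sets of teacher
    estimates (assumes no ties, for brevity)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))
```

A value near 1 means the two models rank teachers almost identically; a moderate value like 0.50 means the single-response and multidimensional models can place the same teacher quite differently.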

Interpretation and Discussion: Advantages of the RealVAMS Model
- Multidimensional estimates of “teacher effectiveness”
- Assessment of outcomes that align with project and/or district goals but are not compatible with other models
- Use of free, open-source software for analysis

Interpretation and Discussion: Evaluation Caution
As with all evaluation:
- Long-term student trajectories require
  - quality baseline data characterizing the starting point
  - data over multiple years to see how students are progressing
- Assessments and outcomes need to
  - adequately score all levels of achievement (i.e., be free of floor or ceiling effects)
  - reflect learning objectives
  - be valid and reliable measures of student progress with respect to these objectives

Interpretation and Discussion: General Caution
VAMs in general:
- have standard errors of estimated latent effects that tend to be large
- may provide information on students’ performance, but not on how to improve teaching
- estimate correlation, not causation, in non-randomized trials

Concluding Remarks
“The combination of some data and an aching desire for an answer does not ensure that a reasonable answer can be extracted from a given body of data.” -- John Tukey

This work is supported by the National Science Foundation under grants DRL-1336027 and DRL-1336265.