Cross-Cutting Issues in Recent State Growth Modeling Efforts
Andrew Ho, Discussant, Harvard Graduate School of Education
Measuring Growth: A Key Feature of the Next Generation of Assessments

Cross-Cutting Issues in Recent State Growth Modeling Efforts Andrew Ho, Discussant Harvard Graduate School of Education Measuring Growth: A Key Feature of the Next Generation of Assessments National Conference on Student Assessment Friday, June 29, 2012

Four Presentations
1) Lisa Keller’s presentation, delivered by Stephen Murphy – I’ll reflect on Lisa’s excellent overview of growth modeling issues and make some explicit recommendations.
2) Richard Vineyard (and Carol Crothers), Nevada – SGPs with an eye toward EVAAS
3) Juan Copa, Florida – Categorical + Gain … + Covariate-Adjustment VAM
4) Maridyth McBee, Oklahoma – Categorical … + VAM

Lisa Keller, by Stephen Murphy (1)
Lisa and Stephen raise a number of critical issues:
– Defining “growth,” aligning models to purposes, vertical scaling, missing data, background variables, error bands, transparency.
I might push Lisa (and many academics, including myself) to be more explicit about what to do in particular situations.
– We do a lot of “sensitivity studies” that show that results differ across models. We don’t as often address why, or when Model A is better than Model B.
– Not only “here are some important issues to consider,” but also “and if you find X or care about Y, then you must do Z.”

Castellano and Ho’s Practitioner’s Guide to Growth (1)
Shameless self-citation: Katherine Castellano (UC Berkeley) and I have a practitioner-oriented “Guide to Growth” due out this summer under CCSSO (contact me if you’re interested in a draft).
– We cross-classify seven growth models by seven critical questions and criteria.
– I’ll be extending our framework to discuss the growth models in Nevada, Florida, and Oklahoma.
Lisa and Stephen’s presentation does an excellent job of listing some of the issues we consider, and others that we should have discussed more than we did (particularly missing data and error bands).

Practical definitions of “growth” and “growth model”
Status describes the academic performance of a student or group at a single point in time.
Growth describes the academic performance of a student or group over two or more time points.
A growth model is a collection of definitions, calculations, and rules that quantifies student performance over two or more time points and supports interpretations about students, their classrooms, their educators, and their schools.
– With apologies (and sympathy) to the “growth purists,” our call is to describe models as they are.
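These definitions can be made concrete with a toy calculation; the scale scores below are invented:

```python
# Invented scale scores for one student across two years
scores = {2011: 420, 2012: 455}

status_2012 = scores[2012]            # status: performance at one time point
gain = scores[2012] - scores[2011]    # simplest growth quantity: two time points

# A growth *model* adds the definitions, calculations, and rules that turn
# quantities like these into interpretations about students, teachers,
# and schools.
```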

Richard Vineyard (and Carol Crothers), Nevada (2)
Nevada uses SGPs but is considering “value added” alternatives such as EVAAS. Comments and suggestions:
– Try mean SGPs instead of median SGPs. More shameless self-citation: Katherine Castellano and I have a paper (scholar.harvard.edu/andrewho) that shows considerable advantages to means, including:
  - much smaller theoretical standard errors
  - less variability in year-to-year comparisons
  - less scale-dependency
  - stronger correlations with familiar metrics.
Measurement purists will tell you that you can’t average percentile ranks. We show that a simple tweak will satisfy many of their concerns while retaining the considerable advantages of averages.
– In my opinion, a move to EVAAS would improve the argument for “value-added” inferences by a small degree but sacrifice transparency and interpretability.
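A minimal sketch of the mean-versus-median comparison, with invented SGPs. The probit averaging shown is one plausible form of the “simple tweak” (map percentile ranks to a normal-quantile scale, average, map back); the actual tweak is specified in the Castellano and Ho paper:

```python
import numpy as np
from statistics import NormalDist

# Hypothetical SGPs (percentile ranks) for one teacher's students
sgps = np.array([12, 35, 48, 55, 61, 72, 88])

median_sgp = float(np.median(sgps))   # the familiar summary
mean_sgp = float(np.mean(sgps))       # simple average of percentile ranks

# One answer to "you can't average percentile ranks": average on a
# normal-quantile (probit) scale, then map back to the percentile metric.
nd = NormalDist()
z_mean = sum(nd.inv_cdf(float(p) / 100.0) for p in sgps) / len(sgps)
tweaked_mean = 100 * nd.cdf(z_mean)
```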

Juan Copa, Florida (3)
Florida uses a hybrid categorical model and gain-based model and is adding a value-added model.
– The covariate-adjustment value-added model, like the SGP approach, is a “conditional status” model that does not align with “intuitive” descriptions of growth.
– It adjusts for student-level and classroom-level characteristics in an attempt to compare teachers to “similar teachers” as defined by the model.
– It could create some interesting incentives, where a teacher could “acquire” certain types of students to change their comparison group.
– It adds to existing incentives to “acquire” students who might have artificially low prior scores.

Sidebar: Conditional Status (SGPs, some VAMs)
Ratings of restaurants in Harvard and Central Squares, plotted against the estimated cost of dinner. The location of a restaurant with respect to the regression line is its rating compared with expectations given price. Flour Bakery is more accurately described as having a higher rating than expected given cost than as showing “high growth.”
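The analogy maps directly onto how conditional status is computed: fit a regression and read each residual as performance beyond or below expectation given the conditioning variable. A minimal sketch with invented data (substitute prior and current test scores for the SGP/VAM case):

```python
import numpy as np

# Invented data in the spirit of the restaurant plot:
# estimated dinner cost (x) and diner rating (y)
cost   = np.array([12.0, 18.0, 25.0, 30.0, 45.0, 60.0, 80.0])
rating = np.array([3.8, 4.0, 3.9, 4.2, 4.3, 4.6, 4.7])

# Fit the regression line rating ~ cost
slope, intercept = np.polyfit(cost, rating, 1)
expected = intercept + slope * cost

# Conditional status: distance above/below the line, i.e., performance
# beyond expectation given the conditioning variable -- not "growth."
residuals = rating - expected
```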

Maridyth McBee, Oklahoma (4)
Oklahoma uses a categorical model. From our guide…
Appealing features: transparency, communicability, simplicity.
Things to consider: loss of information due to coarse categories; the selection of articulated cut scores represents an implicit vertical scale.
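One way to see both the appeal and the coarseness of a categorical (value-table) model is a small lookup table; the level names and point values below are invented for illustration, not Oklahoma’s:

```python
# Hypothetical performance levels and an invented value table:
# rows = prior-year level, columns = current-year level,
# entries = growth points awarded.
LEVELS = ["Below Basic", "Basic", "Proficient", "Advanced"]
VALUE_TABLE = [
    [0, 100, 150, 200],   # from Below Basic
    [0,  75, 125, 175],   # from Basic
    [0,   0, 100, 150],   # from Proficient
    [0,   0,  50, 100],   # from Advanced
]

def growth_points(prior: str, current: str) -> int:
    """All score changes within the same level pair look identical --
    the loss of information from coarse categories."""
    return VALUE_TABLE[LEVELS.index(prior)][LEVELS.index(current)]
```

The ordered cut scores separating the levels across years are exactly where the implicit vertical scale hides.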

Value Added
Following the philosophies of many others (Braun, 2005; Reardon & Raudenbush, 2009; Rubin, Stuart, & Zanutto, 2004), I consider “value added” to describe a hypothesis, not a model.
– School and teacher “effects” are akin to an average student status beyond or below expectations.
– We must remember that our expectations can and should change as we base them on different variables. So, too, will the “effects.”
– Extremes are worthy of further investigation and require convergent evidence for causal attribution.

Value Added

Value Added or Cheating Added? (Atlanta Journal-Constitution)

Value Added
Use and interpretation of “value-added” scores are far too often devoid of any theory of action for the improvement of teaching or school leadership.
– In medicine, one could:
  - analyze data
  - use results to identify symptoms
  - consider symptoms to arrive at a diagnosis
  - use the diagnosis to prescribe a treatment.
– With teacher value-added models, we:
  - analyze data
  - use results to rank teachers on a single scale
  - … and then?
Many states have thoughtful teacher evaluation systems for the improvement of teaching.
– We might remember that VAMs are just one element in this system.
Thank you! scholar.harvard.edu/andrewho