Supporting Growth Interpretations Using Through-Course Assessments. Andrew Ho, Harvard Graduate School of Education. Innovative Opportunities and Measurement Challenges in Through-Course Summative Assessments.


Supporting Growth Interpretations Using Through-Course Assessments Andrew Ho Harvard Graduate School of Education Innovative Opportunities and Measurement Challenges in Through-Course Summative Assessments National Conference on Student Assessment Monday, June 20, 2011

Through-Course Assessments and Growth

Multiple assessments through the school year to support (a) instruction and learning and (b) inferences about future performance, particularly end-of-year tests and, by extension, career and college readiness.

Paper: Google [Andrew Supporting Growth] (_Paper_Ho.pdf)
Webinar: Google [Ho Growth Webinar]

In this presentation: two important distinctions for any multiple-timepoint growth application, then additional challenges raised by through-course assessments.

Some important contrasts in growth use and interpretation

Growth Description:
- Gain Scores, Trajectories: where a student was, where a student is, and what has been learned in between.
- Status Beyond Prediction: where a student is, above and beyond where we would have predicted she would be, given past scores.

Growth Projection:
- Trajectory, Gain-Score Model: extend past gains in systematic fashion into the future; consider whether future performance is adequate.
- Regression/Prediction Model: use a regression model to predict future scores from past scores, statistically and empirically.

Two Approaches to Growth Description

- Gain Scores, Trajectories: the gain over a previous score. (Figure: two students with equal gains.)
- Status Beyond Prediction: status beyond a prediction from a previous score (or scores, or scores and demographics). (Figure: two different students with equal statuses beyond predictions.)

Two Accountability Models Therefrom

An "Equal Gain" requirement (three students with equal gains):
- Headline: New Growth Scores Set Unrealistic Expectations on Highest Scoring Students.
- Headline: Low Standards for Low-Scoring Students.

An "Equal Status-Beyond-Prediction" requirement (three students with equal statuses beyond predictions):
- Headline: New "Growth" Scores Expect Less Learning from Highest Scoring Students.
- Scores depend on arbitrary decisions.

Pros and Cons

Gain scores, trajectories (three students with equal gains):
- Pros: straightforward; aligns with user intuition about growth; describes growth and progress along an (ideally) meaningful, relevant scale.
- Cons: defensible vertical scales are difficult and costly to support, and can be poorly aligned with statistical predictions.

Status beyond prediction (three students with equal statuses beyond predictions):
- Pros: incorporates statistical predictions; does not require vertical scales.
- Cons: poorly aligned with user intuition about growth; variables supporting predictions can be atheoretical, and variable inclusion/exclusion changes predictions.

Two Approaches to Growth Projection

Trajectory, Gain-Score Model:
- Extends gains over time in straightforward fashion.
- With more prior years, a best-fit line or curve can be extended similarly; extended trajectories do not have to be linear.

Regression/Prediction Model:
- Estimates a prediction equation for the "future" score.
- Because current students have unknown future scores, estimate the prediction equation from a previous cohort that does have its "future" year's score, then input current cohort data into that past prediction equation.

Stark Contrasts in Projections

(Figures: three students with equal projections from a trajectory model; the same three students' predictions with a regression model; three students with equal projections from a regression model.)

Stark Contrasts in Incentives

Trajectory model (three students with equal projections from a trajectory model):
- Lower initial scores can inflate trajectories. Headline: New Model Rewards Low Scores, Encourages "Fail-First Strategy."
- Very intuitive; requires vertical scales; less accurate in terms of future classifications.

Regression model (three students with equal projections from a regression model):
- Low scorers require huge gains; high scorers can fall comfortably. Headline: New Model Labels High and Low Achievers Early, Permanently.
- Counterintuitive; does not require vertical scales; more accurate classifications.
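The "fail-first" incentive under a trajectory model can be shown with two invented students who reach the same current score from different starting points:

```python
# Sketch of the incentive contrast (all scores invented). Under a
# trajectory model, a lower initial score inflates the projected future
# score; a regression projection from the current score alone would not.
def project_trajectory(t1, t2):
    """Extend the observed gain from time 1 to time 2 one more year."""
    return t2 + (t2 - t1)

honest_start, sandbagged_start, current_score = 450.0, 400.0, 460.0

print(project_trajectory(honest_start, current_score))      # 470.0
print(project_trajectory(sandbagged_start, current_score))  # 520.0
```

Both students stand at 460 now, but the student who scored 50 points lower initially is projected 50 points higher, which is the gaming behavior the slide's "low but not negative weights on earlier tests" advice is meant to discourage.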

Growth and Through-Course Assessments

- For low-stakes, through-course assessments, either incorporate vertical scales with gain-score growth information or don't overpromise on the usefulness of through-course data to describe growth through a curricular domain.
- As stakes increase, anticipate and communicate incentives, and adjust weights to discourage both "fail-first" and "inertial status" gaming behaviors, ideally with low but not negative weights on earlier tests.
- Do not forget that "career and college readiness" does not solve the problem of setting a meaningful and defensible cut score, nor does it predetermine the type of growth model that projects to that target.

Paper: Google [Andrew Supporting Growth]
Webinar: Google [Ho Growth Webinar]