The Impact of Including Predictors and Using Various Hierarchical Linear Models on Evaluating School Effectiveness in Mathematics — Nicole Traxel & Cindy Walker

The Impact of Including Predictors and Using Various Hierarchical Linear Models on Evaluating School Effectiveness in Mathematics
Nicole Traxel & Cindy Walker
University of Wisconsin - Milwaukee
April 14, 2009
The Milwaukee Mathematics Partnership (MMP) is supported by the National Science Foundation under Grant No.

Introduction
Value added models
- A fair and accurate way to assess the effectiveness of schools
- Determine how much value a school adds to student learning by examining student progress over time
- Operational definition: effectiveness = growth
Hierarchical linear models can be used to implement a value added accountability system
- Hierarchical because students are nested within schools
- Can determine how much of growth can be attributed to the student and to the school
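One standard way to quantify how much of the variation lies with students versus schools (not stated on the slide, but conventional in HLM work) is the intraclass correlation from an unconditional model:

```latex
% Intraclass correlation: share of outcome variance that lies between schools
\rho = \frac{\tau_{00}}{\tau_{00} + \sigma^{2}}
% \tau_{00}: between-school variance of the school random intercepts
% \sigma^{2}: within-school (student-level) residual variance
```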

Types of Hierarchical Linear Models
Several different hierarchical linear models can be used to assess school effectiveness, so which is best?
- 2-level hierarchical model: predicts final achievement from initial achievement. Can include student level and school level predictors of achievement, not growth.
- 2-level growth model: predicts change in test scores from one year to the next. Can include student level and school level predictors of growth, not achievement.
- 3-level individual growth model: predicts achievement and change over time. Can include student level and school level predictors of growth and achievement.
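To make the distinction between these three specifications concrete, the schematic below writes each one in conventional HLM notation; it is an illustrative sketch, and the exact parameterizations and covariates used in the study may differ.

```latex
\begin{align*}
&\textbf{2-level model (final score regressed on initial score)}\\
&\quad \text{Level 1 (student } i \text{ in school } j\text{): } Y_{ij} = \beta_{0j} + \beta_{1j} X_{ij} + r_{ij}\\
&\quad \text{Level 2 (school } j\text{): } \beta_{0j} = \gamma_{00} + u_{0j}\\[4pt]
&\textbf{2-level gain model (year-to-year gain as the outcome)}\\
&\quad \text{Level 1: } D_{ij} = Y^{(2)}_{ij} - Y^{(1)}_{ij} = \beta_{0j} + r_{ij}\\
&\quad \text{Level 2: } \beta_{0j} = \gamma_{00} + u_{0j}\\[4pt]
&\textbf{3-level growth model (scores nested in students nested in schools)}\\
&\quad \text{Level 1 (occasion } t\text{): } Y_{tij} = \pi_{0ij} + \pi_{1ij}\,\mathrm{Time}_{tij} + e_{tij}\\
&\quad \text{Level 2 (student): } \pi_{0ij} = \beta_{00j} + r_{0ij}, \qquad \pi_{1ij} = \beta_{10j} + r_{1ij}\\
&\quad \text{Level 3 (school): } \beta_{00j} = \gamma_{000} + u_{00j}, \qquad \beta_{10j} = \gamma_{100} + u_{10j}
\end{align*}
```

In each case the school-level random effects (the u terms) are the quantities summarized into school effectiveness estimates; student and school predictors enter as additional fixed effects at the corresponding level.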

Research Questions
- Do effectiveness rankings differ depending on which type of model is used?
- Does predictor significance remain constant across model types?
- Does including predictors change effectiveness rankings of schools?

Sample & Measures
- 7,232 students from 128 schools in a large urban school district in the Midwest
- 3rd to 4th grade
- 87% minority, 79% receive free/reduced lunch
- Mathematics scores on a state mandated standardized test
- Math Focus score for each school: "There is a strong focus on increasing student achievement in mathematics at my school."
- Gain in Math Focus calculated by subtracting the 1st-year Math Focus score from the 2nd-year Math Focus score

The Models That Were Fit
Each model type (2-Level, 2-Level Gain, 3-Level) was fit with each of the following predictor sets:
- None
- Student Race
- Student SES
- Student Race & SES
- Student & School Race
- Student & School SES
- Student & School Race & SES
Note: Initial Score was included as a student level covariate in all 2-level and 3-level models, but not in the 2-level gain models.
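As a hedged illustration of how two of these specifications might be fit, the sketch below uses Python's statsmodels mixed-effects interface. The column names (school, score_y1, score_y2, race, ses) and the file students.csv are hypothetical placeholders, and the original analysis was not necessarily carried out with this software; a full 3-level growth model generally needs software with explicit multilevel support and is omitted here.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical student-level data: one row per student, with the school identifier,
# 3rd- and 4th-grade math scores, and student race and SES indicators.
df = pd.read_csv("students.csv")
df["gain"] = df["score_y2"] - df["score_y1"]

# 2-level model: final achievement conditioned on initial achievement,
# with a random intercept for each school (the school effectiveness term).
m_2level = smf.mixedlm("score_y2 ~ score_y1 + race + ses",
                       data=df, groups=df["school"]).fit()

# 2-level gain model: the year-to-year gain is the outcome; no initial-score covariate.
m_gain = smf.mixedlm("gain ~ race + ses",
                     data=df, groups=df["school"]).fit()

# School effectiveness estimates can be read off the estimated school random intercepts.
school_effects = pd.Series({school: re.iloc[0]
                            for school, re in m_2level.random_effects.items()})
print(school_effects.sort_values(ascending=False).head())
```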

Comparisons
- Predictor significance across models
- Effectiveness rankings across predictors being included, within each model type
- Effectiveness rankings across model types, for models including only initial score or no predictors
- Effectiveness rankings across model types, for models including only initial score or no predictors, validated using gain in Math Focus score

Predictor Significance
- Student-level SES and student-level Race were significant predictors of the average achievement of students within schools, but not of the average growth of students within schools
- School-level Race and school-level SES were significant predictors of the average achievement among schools, but not of the average growth among schools
- But not when both were included, due to collinearity

Predictors Do Not Change Effectiveness
Spearman correlations between effectiveness rankings from the null model and from each model including predictors, computed within each model type (2-Level Model, 2-Level Gain Model, 3-Level Model), for the following predictor sets:
- Student Level Race
- Student Level SES
- Student Level Race & SES
- Student & School Level Race
- Student & School Level SES
- Student & School Level Race & SES
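A minimal sketch of this ranking comparison, with simulated placeholder values standing in for the estimated school effects (in practice these would come from the fitted models' school random effects):

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Placeholder effectiveness estimates for 128 schools from a null model and from
# the same model type refit with student/school predictors (simulated for illustration).
effects_null = rng.normal(size=128)
effects_with_predictors = effects_null + rng.normal(scale=0.1, size=128)

rho, p = spearmanr(effects_null, effects_with_predictors)
print(f"Spearman rank correlation between the two sets of school rankings: {rho:.3f}")
```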

Correlations Among Model Types (Null Models)
- 2-level and 2-level gain models: r = .090
- 2-level and 3-level models: r = .101
- 2-level gain and 3-level models: r = .993
Therefore, rankings from the 2-level gain and 3-level models are very similar to one another

Validating Effectiveness Rankings
Pearson correlations between gain in Math Focus and effectiveness estimates from the null model of each model type:
- 2-level: r = .034
- 2-level gain: r = .200
- 3-level: r = .177, p = .089
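The validation step is a plain Pearson correlation at the school level; a minimal sketch with simulated placeholder values is given below (the actual gain-in-Math-Focus and effectiveness values come from the survey and the fitted null models):

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)

# Placeholder per-school values: change in the Math Focus survey score and the
# effectiveness estimate from a null model (simulated here for illustration).
gain_in_math_focus = rng.normal(size=128)
effectiveness_estimate = 0.2 * gain_in_math_focus + rng.normal(size=128)

r, p = pearsonr(gain_in_math_focus, effectiveness_estimate)
print(f"Pearson r = {r:.3f}, p = {p:.3f}")
```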

Conclusions, Part One
- Including predictors, even if they are significant, does not change effectiveness estimates
- Effectiveness estimates from the null 2-level model were different from effectiveness estimates from the null 2-level gain and 3-level models
- Effectiveness estimates from the 2-level gain and 3-level models were very similar

Conclusions, Part Two
- The correlations between gain in Math Focus and effectiveness estimates from the 2-level gain and 3-level models were larger in magnitude than the correlation between gain in Math Focus and estimates from the 2-level model
- Effectiveness estimates from the 2-level gain and 3-level models are therefore more valid than those from the 2-level model

So which model type should I use?
The 3-level model has several advantages over the 2-level gain model:
- Includes ALL available data: all participants with at least one observation are included
- Can include many years of data
- Provides more information (both growth and achievement estimates)