Children Left Behind in AYP and Non-AYP Schools: Using Student Progress and the Distribution of Student Gains to Validate AYP. Kilchan Choi, Michael Seltzer, Joan Herman, Kyo Yamashiro.

Presentation transcript:

1 Children Left Behind in AYP and Non-AYP Schools: Using Student Progress and the Distribution of Student Gains to Validate AYP
Kilchan Choi, Michael Seltzer, Joan Herman, Kyo Yamashiro
UCLA Graduate School of Education & Information Studies
National Center for Research on Evaluation, Standards, and Student Testing (CRESST)

2 Research Questions
• Are there schools that meet AYP yet still have children who are not making substantial progress, i.e., leaving some children behind?
• Are there schools that do not meet AYP yet still enable students to make substantial progress?
• Do AYP schools achieve a more equitable distribution of student growth? Are students at all ability levels making progress in AYP schools?
• Are there non-AYP schools that are reducing the achievement gap?

3 Sample
• Large, urban district in WA
• 2,524 students
• Two time-point ITBS reading scores (Grade 3 in 2001 & Grade 5 in 2003)
• Standard errors of measurement (SEM) on ITBS reading scores (Bryk et al., 1998)
• 72 schools
  - Average # students/school: 35
  - Average % qualifying for FRPL: 36.4%
  - Average % minority (African American, Native American, or Latino): 68.6%
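To make the design concrete, a long-format layout for this kind of two-wave data might look like the sketch below; the column names and every value are invented for illustration and are not the study's actual data.

```python
import pandas as pd

# Hypothetical long-format layout for two-wave ITBS reading data
# (one row per student per time point; every value here is invented).
df = pd.DataFrame({
    "school_id":    [1, 1, 1, 1],
    "student_id":   [101, 101, 102, 102],
    "time":         [0, 1, 0, 1],           # 0 = Grade 3 (2001), 1 = Grade 5 (2003)
    "itbs_reading": [172.0, 205.0, 188.0, 214.0],
    "sem":          [4.2, 4.5, 4.2, 4.5],   # known standard errors of measurement
})
print(df)
```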

4 AYP vs. Non-AYP Schools in WA
• School AYP decisions are made based on 4th grade performance on the WA Assessment of Student Learning (WASL)
• 51 schools made AYP; 21 did not make AYP in the baseline year (2002), according to the WA State Dept of Ed
• Our study re-evaluates AYP and non-AYP schools with a new value-added model (an advanced hierarchical modeling technique)

5 A New Methodology for School Effects / Accountability: Latent Variable Regression in a Hierarchical Model
• Additional questions of interest using LVR-HM:
  - Move beyond school mean growth rates and examine hidden/underlying processes
  - How equitably is student achievement distributed? (The distribution of student growth: children left behind or no child left behind)
  - Why is student achievement distributed more equitably in some schools than in others?

6 Distribution of Student Growth (Relationship between initial status and rate of change)

7 Why a New Value-Added Model (LVR-HM)?
• Gains or growth may depend heavily on status at a certain point in time (i.e., initial status)
• Initial status can be a strong and important factor in "value-added" gain or growth
• New value-added gain or growth:
  - Adjusts for student intake characteristics PLUS initial differences between students
  - Adjusts for school intake characteristics, policies, and practices PLUS initial differences between schools
• Thus, it provides value-added gain or growth PLUS reveals the distribution of student achievement

8 Latent Variable Regression Hierarchical Model (LVR-HM)
• Level 1: time series within student
  $Y_{ti} = \pi_{0i} + \pi_{1i}\,\mathrm{Time}_{ti} + \epsilon_{ti}, \quad \epsilon_{ti} \sim N(0, 1)$
  Estimating initial status and gain for each student i, with standard errors
• Level 2: student level
  $\pi_{0i} = \beta_{00} + r_{0i}, \quad r_{0i} \sim N(0, \tau_{00})$
  $\pi_{1i} = \beta_{10} + b(\pi_{0i} - \beta_{00}) + r_{1i}, \quad r_{1i} \sim N(0, \tau_{11}), \quad \mathrm{Cov}(r_{0i}, r_{1i}) = 0$
• The gain for student i is modeled as a function of his or her initial status
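The authors fit this model with full hierarchical estimation machinery (incorporating the known SEMs noted on slide 3). As a minimal sketch of the latent-variable-regression idea only, the simulation below generates two-wave data under the structure above with invented parameter values, then recovers a naive analogue of b by regressing observed gains on centered observed initial status; it is not the paper's estimation procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate two waves (Grade 3, Grade 5) for n students under the LVR-HM
# structure above; all parameter values are invented for illustration.
n = 2524
beta_00, beta_10, true_b = 180.0, 35.0, -0.3
pi_0 = rng.normal(beta_00, 15.0, n)                                  # latent initial status
pi_1 = beta_10 + true_b * (pi_0 - beta_00) + rng.normal(0, 5.0, n)   # latent gain
y_g3 = pi_0 + rng.normal(0, 4.0, n)                                  # observed score, time 0
y_g5 = pi_0 + pi_1 + rng.normal(0, 4.0, n)                           # observed score, time 1

# Naive two-stage analogue: with only two waves, the observed initial status
# is the Grade 3 score and the observed gain is the difference; then regress
# gains on centered initial status to estimate an analogue of b.
status = y_g3
gain = y_g5 - y_g3
b_hat, intercept = np.polyfit(status - status.mean(), gain, 1)
print(f"b_hat = {b_hat:.2f}  (true b = {true_b})")
```

Because the observed scores carry measurement error, this naive slope is biased toward more negative values (regression to the mean); treating initial status as a latent variable, as the LVR-HM does, is what guards against that bias.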

9 Different Levels of Initial Status
• There are many ways to define performance subgroups based on initial status
• We examined gains for 3 performance subgroups within each school, defined by initial status:
  - High performers: 15 pts above the school mean initial status
  - Mean: at the school mean initial status
  - Low performers: 15 pts below the school mean initial status

10 Estimating Expected Gains for Different Levels of Initial Status
• We estimate the expected (predicted) gain for each of the performance subgroups using the LVR-HM
• Model-based estimation, not separate group analyses
• Point estimate of gain & its 95% confidence interval (statistical inferences)
• Possible to estimate expected gains after controlling for factors that lie beyond a school's control (e.g., student SES, school compositional factors)
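Under the level-2 model on slide 8, the expected gain for a subgroup whose initial status lies a given offset from the school mean is simply $\beta_{10} + b \cdot \mathrm{offset}$. The sketch below evaluates this for the three subgroups on slide 9, using invented estimates and a simplified interval that treats the two estimates as independent; the paper's intervals come from the full model fit.

```python
import numpy as np

# Invented estimates and standard errors, for illustration only.
beta_10, se_beta_10 = 32.0, 1.5   # expected gain at the school mean
b, se_b = -0.30, 0.08             # initial-status / gain slope

for label, offset in [("High (+15)", 15.0), ("Mean (0)", 0.0), ("Low (-15)", -15.0)]:
    est = beta_10 + b * offset
    # Simplified 95% interval: treating the two estimates as independent,
    # Var(est) = Var(beta_10) + offset^2 * Var(b).
    se = np.sqrt(se_beta_10**2 + (offset * se_b)**2)
    print(f"{label:>10}: gain = {est:5.1f}, 95% CI ({est - 1.96*se:.1f}, {est + 1.96*se:.1f})")
```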

11 Expected mean gain in ITBS reading scores for AYP schools
• Only 12 of 52 AYP schools have a 95% interval above the district average
• 1 AYP school's 95% interval includes 0

12 Expected mean gain in ITBS reading scores for non-AYP schools
• 2 non-AYP schools have a 95% interval above the district average

13 Expected gain for low-performing students (AYP schools)
• 7 AYP schools' 95% intervals ≥ 30
• 3 AYP schools' 95% intervals include 0 (low performers make no gains)

14 Expected gain for low-performing students (non-AYP schools)
• 5 non-AYP schools have gains for low performers > 20

15 Expected gain for high-performing students (AYP schools)
• 9 AYP schools' 95% intervals ≥ 30
• 3 AYP schools' 95% intervals < 10 (high performers make little or no gains)

16 Expected gain for high-performing students (non-AYP schools)
• 5 non-AYP schools' 95% intervals ≥ 30
• 3 non-AYP schools' 95% intervals < 10 (high performers make little or no gains)

17 Distribution of Gains Within a School
• Type I: substantial gains across all performance subgroups (i.e., no child left behind; ex: AYP school #8, non-AYP school #26)
• Type II: no adequate gain for high performers; substantial gains for low performers (ex: AYP school #19, non-AYP school #27)
• Type III: no adequate gain for low performers; substantial gains for high performers (ex: AYP school #28, non-AYP school #6)
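One hypothetical way to operationalize these labels is to check whether each subgroup's 95% interval lies entirely above an "adequate gain" threshold; the rule and threshold below are our own illustration, not the authors' published criterion.

```python
def classify_school(ci_low_perf, ci_high_perf, adequate=20.0):
    """Label a school's gain distribution (Type I/II/III from slide 17)
    using the 95% intervals for its low and high performers.
    `adequate` is an illustrative threshold, not the paper's."""
    low_ok = ci_low_perf[0] >= adequate    # whole interval above threshold
    high_ok = ci_high_perf[0] >= adequate
    if low_ok and high_ok:
        return "Type I: substantial gains across subgroups"
    if low_ok:
        return "Type II: high performers not gaining adequately"
    if high_ok:
        return "Type III: low performers not gaining adequately"
    return "Neither subgroup shows adequate gains"

# Example: AYP school #28, using the intervals in the slide 18 table below.
print(classify_school(ci_low_perf=(12.1, 28.7), ci_high_perf=(33.0, 51.2)))
# -> Type III: low performers not gaining adequately
```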

18 Expected gain estimates and 95% intervals by performance subgroup

                        15 pts above school mean   Equal to school mean   15 pts below school mean
AYP schools
  Type I    Sch. #8     (30.7, 43.8)               (35.2, 42.2)           (35.0, 45.7)
            Sch. #22    (30.0, 43.2)               (31.7, 42.0)           (31.1, 43.9)
            Sch. #      (30.8, 42.8)               (32.2, 40.5)           (31.1, 41.1)
  Type II   Sch. #19    (6.9, 35.9)                (22.8, 44.1)           (30.7, 60.3)
            Sch. #      (18.2, 39.0)               (27.8, 40.3)           (30.5, 49.6)
  Type III  Sch. #28    (33.0, 51.2)               (27.4, 35.1)           (12.1, 28.7)
            Sch. #      (31.4, 39.5)               (28.5, 34.0)           (23.7, 40.4)
Non-AYP schools
  Type I    Sch. #26    (24.1, 41.7)               32.2 (27.6, 36.9)      31.6 (22.5, 40.4)
  Type II   Sch. #27    (9.0, 27.2)                24.6 (17.6, 31.5)      30.5 (21.5, 40.1)
  Type III  Sch. #6     (31.5, 50.8)               (26.2, 38.5)           (13.6, 33.7)
            Sch. #38    (30.0, 49.9)               (25.5, 34.2)           (9.3, 30.2)
            Sch. #      (31.2, 44.2)               (26.9, 34.4)           (17.0, 30.2)

19 Distribution of student gain for 3 AYP schools

20 Distribution of student gain for 3 non-AYP schools

21 Comparing Features: AYP & the CRESST Approach

Data structure
  AYP: cross-sectional (follows grade levels, e.g., 4th graders in a school, over time)
  CRESST: longitudinal (follows individual students over time)
Performance measure (outcome)
  AYP: proficiency levels (using cut scores)
  CRESST: individual gains or growth
Subgroups
  AYP: demographic characteristics
  CRESST: performance-level groups plus demographic characteristics
Adjustments / controls for student or school characteristics
  AYP: no controls or adjustments, just disaggregation, which costs schools the advantage of fair comparison against other schools
  CRESST: can adjust for differences between schools and students in the model
Type of growth examined
  AYP: percent proficient may mask different underlying growth patterns; even the flexibility given to schools through the Safe Harbor option applies only to movement around the proficiency cut score
  CRESST: more complete picture of growth PLUS the distribution of growth

22 Different Growth By Performance Subgroups & Demographic Subgroups

23 Conclusions
• Analyses using our alternative approach yield:
  - A more informative picture of growth, based on individual, longitudinal student gains
  - A more complete picture of how student growth is distributed within a school
• They can stimulate discussion among teachers and administrators and help identify students in need earlier (Seltzer, Choi & Thum, 2003)
• They encourage educators to think about achievement levels rather than (or in addition to) current subgroup categories, which may be more productive and actionable