Value Added in CPS

Presentation transcript:

Value Added in CPS

What is value added?
A measure of the contribution of schooling to student performance
Uses statistical techniques to isolate the impact of schooling from other factors
Focuses on how much students improve from one year to the next

Demographic adjustments
Value added makes adjustments for the demographics of schools and classrooms
Adjustments are determined by the relationships between growth and student characteristics
Adjustments measure partial differences in growth across groups district-wide
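As a rough illustration of what an adjustment is (the notation here is ours, not VARC's official specification), the adjustment for a student group g can be read as the partial difference in expected growth between that group and a reference group, holding the other factors in the model constant:

\[
\text{adjustment}_g = \mathbb{E}[\,\text{growth} \mid \text{group } g,\ \text{other factors}\,] - \mathbb{E}[\,\text{growth} \mid \text{reference group},\ \text{other factors}\,]
\]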

Some schools with a low percentage of students meeting or exceeding standards are nevertheless high value-added schools in which students grow

Value added in many domains
Annual state assessments
– Focus on year-to-year student improvement
Short-term assessments
– Focus on short-term student improvement
– Potentially faster turnaround
High school assessments
– Explore/PLAN/ACT, for example
– Focus on improvement in high school

Value added in CPS
Based on the ISAT for grades 3 through 8
Analyzes students' ISAT scores, demographics, and schools attended
Schools and classrooms where students improve more (relative to similar students) are identified as high value-added
Value added is the extra ISAT points gained by students at a school/classroom, on average, relative to observably similar students across the district
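One way to write that last point as a formula (an illustrative sketch; the symbols are ours, not official CPS/VARC notation): the value added of a school or classroom s is the average gap between its students' actual posttest scores and the scores predicted for observably similar students across the district,

\[
\mathrm{VA}_s = \frac{1}{n_s} \sum_{i \in s} \left( y_i^{\text{post}} - \hat{y}_i^{\text{post}} \right),
\]

where \(\hat{y}_i^{\text{post}}\) is the score predicted from student i's prior ISAT score and characteristics, and \(n_s\) is the number of students attributed to s.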

Alternative understanding
Average student gain on the ISAT relative to the district average, with adjustments for:
– Shape of the test scale (prior ISAT score)
– Grade level
– Gender, mobility, free/reduced-price lunch, race/ethnicity, disability, language proficiency, homelessness, other-subject pretest
– Enrollment in multiple schools or classrooms

Regression model (in English)
Posttest = (Post-on-Pre Link × Pretest) + Student Characteristics + School and Classroom Effects (Value Added) + Unobserved Factors
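Written as an equation (an illustrative form of the model described above; the symbols are ours, not VARC's notation):

\[
y_i^{\text{post}} = \beta_0 + \beta_1\, y_i^{\text{pre}} + \gamma' X_i + \theta_{s(i)} + \varepsilon_i
\]

Here \(y_i^{\text{pre}}\) and \(y_i^{\text{post}}\) are student i's prior- and current-year scores, \(\beta_1\) is the post-on-pre link, \(X_i\) holds the student characteristics, \(\theta_{s(i)}\) is the effect (value added) of the school or classroom student i attended, and \(\varepsilon_i\) captures unobserved factors.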

Student characteristics
Gender
Race/ethnicity
Free or reduced-price lunch
Language proficiency (by Access score)
Disability (by disability type)
Mobility
Homelessness
Other-subject pretest

Why include student characteristics?
One goal of value-added analysis is to be as fair as possible
We want to remove the effect of factors that were not caused by the school during the specific period we are evaluating

What do we want to evaluate?
Related to the school (value added reflects the impact of these factors):
– Curriculum
– Classroom teacher
– School culture
– Math pull-out program at the school
– Structure of lessons in the school
– Safety at the school
Not related to the school (these factors need to be measured and isolated):
– Student motivation
– English Language Learner status
– At-home support
– Household financial resources
– Learning disability
– Prior knowledge

Controlling for other factors
Students bring different resources to the classroom. These factors can affect growth, so we want to remove the effects of these non-school factors.
Examples (not related to the school; these factors need to be measured and isolated):
– Student motivation
– English Language Learner status
– At-home support
– Household financial resources
– Learning disability
– Prior knowledge

Controlling for other factors
In order to include a characteristic in the model, we must have data on that characteristic for all students. Some characteristics are harder to measure and collect than others. The data we do have can tell us something about the effect of the data we would like to have.

Controlling for other factors
For example, we can use free or reduced-price lunch as a substitute for our ideal data about household finances in our calculations.
What we want: household financial resources
What we have: free or reduced-price lunch (related data)

Adjustments are based on real data
Why is it important that VARC uses student test scores to calculate adjustment factors?
– We do not have a preconceived notion of which student subgroups will grow faster than others
– We want to be as fair as possible when evaluating school performance
– Student subgroups perform differently in each subject area from year to year
– We want our adjustments to apply specifically to the situation we are evaluating

Multiple regression
Measures the effect of each variable on the posttest, controlling for all other variables
– The effect of the pretest on the posttest controls for student characteristics and schools
– The effects of student characteristics on the posttest control for the pretest and schools
– The effects of schools (value added) on the posttest control for the pretest and student characteristics
All effects are measured simultaneously
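A minimal sketch of this idea in Python (the column names and input file are hypothetical, and this is an illustration of multiple regression with school indicators, not VARC's production model):

```python
# Illustrative only: estimate school effects while controlling for the
# pretest and student characteristics at the same time.
# Column names (posttest, pretest, gender, frl, ell, disability, school)
# and the input file are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("student_year_records.csv")

model = smf.ols(
    "posttest ~ pretest + gender + frl + ell + disability + C(school)",
    data=df,
).fit()

# The coefficients on C(school)[...] are the school effects (value added),
# measured relative to an omitted reference school; every coefficient is
# estimated simultaneously with all the others.
school_effects = model.params.filter(like="C(school)")
print(school_effects.sort_values(ascending=False).head())
```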

Dosage
Accounts for students changing schools and classrooms
Students enrolled in a school or classroom for a fraction of the year get a fractional dose of that school's or classroom's effect
Apportions student growth among the schools and classrooms attended in the same year
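A small sketch of how fractional dosage can be computed from enrollment records (the data layout is hypothetical, not VARC's implementation): each student-school pair gets a weight equal to the share of the year the student spent there, and those weights apportion the student's growth among the schools.

```python
# Illustrative only: convert one student's enrollment spells into
# fractional dosage weights. A student enrolled 60 of 180 school days
# at School A and 120 days at School B contributes a 1/3 "dose" to A
# and a 2/3 dose to B.

def dosage_weights(spells, days_in_year=180):
    """spells: list of (school_id, days_enrolled) pairs for one student."""
    return {school: days / days_in_year for school, days in spells}

print(dosage_weights([("School A", 60), ("School B", 120)]))
# {'School A': 0.333..., 'School B': 0.666...}
```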

Pretest measurement error
The pretest measures student attainment in the previous year with measurement error
– Models that ignore this are biased in favor of high-attainment schools and classrooms
Measurement error is accounted for in the value-added model so that the pretest is handled correctly
– Using approaches in Fuller (1987)
– Protects against this bias
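The intuition behind the correction (a standard errors-in-variables result, shown here for a single predictor; not the exact CPS formulas): if the observed pretest is the true pretest plus random error, the estimated post-on-pre link is attenuated toward zero by the test's reliability,

\[
\operatorname{plim}\hat{\beta}_1 = \lambda\, \beta_1, \qquad \lambda = \frac{\operatorname{Var}(y^{\text{pre,true}})}{\operatorname{Var}(y^{\text{pre,true}}) + \operatorname{Var}(\text{error})},
\]

so an error-corrected estimate rescales the naive slope using the known reliability \(\lambda\). Without the correction, predicted posttest scores for high-pretest students are too low, which flatters schools and classrooms serving high-attainment students.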

Models that correct for measurement error avoid bias in favor of schools and classrooms with high initial scores

Value-added model
All of these features help ensure that value added reflects the effects of schooling on student achievement
Value added uses the available data to measure the impact of schools and classrooms as accurately, fairly, and realistically as possible

Work in progress
Classroom-level value added
– Measures student growth within classrooms
Differential-effects value added
– Measures growth among students of a particular group (ELL, disability, etc.) in a school or classroom
Value added from other assessments
– Scantron (short-term)
– Explore/PLAN/ACT (high school)