TEACHER EFFECTIVENESS INITIATIVE VALUE-ADDED TRAINING Value-Added Research Center (VARC)


Oak Tree Analogy Expansion
1. What about tall or short trees? (high or low achieving students)
2. How does VARC choose what to control for? (proxy measurements for causal factors)
3. What if a gardener just gets lucky or unlucky? (groups of students and confidence intervals)

1. What about tall or short trees? (high or low achieving students)

If we were using an Achievement Model, which gardener would you rather be? At age 4, Gardener C's Oak C stands 28 in. tall, while Gardener D's Oak D stands 93 in. tall. How can we be fair to these gardeners in our Value-Added Model?

Why might short trees grow faster?
 More "room to grow"
 Easier to have a "big impact"
Why might tall trees grow faster?
 Past pattern of growth will continue
 Unmeasured environmental factors
How can we determine what is really happening?

In the same way we measured the effect of rainfall, soil richness, and temperature, we can determine the effect of prior tree height on growth. [Chart: The Effect of Prior Tree Height on Growth, plotting growth from year 4 to 5 (inches) against prior tree height (year 4 height in inches). Based on the trend, Oak C (28 in.) is predicted to grow 9 in., and Oak D (93 in.) is predicted to grow 30 in.]
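The trend described above can be estimated with a simple least-squares fit. This is a minimal sketch: the prior heights and growth values below are hypothetical illustrations, not data from the slides.

```python
import numpy as np

# Hypothetical prior heights (year 4, inches) and observed growth to year 5
prior_height = np.array([20.0, 28.0, 45.0, 60.0, 75.0, 93.0])
growth = np.array([7.0, 9.0, 14.5, 20.0, 24.5, 30.0])

# Fit growth = slope * prior_height + intercept
slope, intercept = np.polyfit(prior_height, growth, 1)

def predicted_growth(height):
    """Predicted year-4-to-5 growth for a tree of a given prior height."""
    return slope * height + intercept

# Predictions for a short tree like Oak C (28 in.) and a tall tree like
# Oak D (93 in.) now reflect the trend, not a single average growth figure.
pred_c = predicted_growth(28.0)
pred_d = predicted_growth(93.0)
```

Each tree is compared against trees that started at a similar height, which is the "fairness factor" the next slide carries into the education context.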

Our initial predictions now account for this trend in growth based on prior height: +9 in. for Oak C and +30 in. for Oak D from age 4 to age 5. The final predictions would also account for rainfall, soil richness, and temperature. How can we accomplish this fairness factor in the education context?

Analyzing test score gain to be fair to teachers. [Table: 3rd grade and 4th grade scores for twelve students, Abbot through Atkinson, spanning the test score range from high achievers to low achievers.]

If we sort 3rd grade scores high to low, what do we notice about the students' gain from test to test? [Table: the same twelve students sorted by 3rd grade score, with an added column showing each student's gain in score from 3rd to 4th grade.]

If we find a trend in score gain based on starting point, we control for it in the Value-Added model. [Table: the sorted scores show a trend in gain from high to low across the test score range.]

What do we usually find in reality?  Looking purely at a simple growth model, high achieving students tend to gain about 10% fewer points on the test than low achieving students.  In a Value-Added model we can take this into account in our predictions for your students, so their growth will be compared to similarly achieving students.
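The adjustment described above can be sketched as a regression of 4th grade scores on 3rd grade scores, so that each student's gain is compared to the gain predicted from their starting point. The scores below are hypothetical:

```python
import numpy as np

# Hypothetical 3rd and 4th grade scale scores; note the raw gains shrink
# as the starting score rises (high achievers gain fewer points).
grade3 = np.array([420.0, 450.0, 480.0, 510.0, 540.0, 570.0])
grade4 = np.array([452.0, 478.0, 503.0, 528.0, 551.0, 574.0])

# Expected 4th grade score given each 3rd grade starting point
slope, intercept = np.polyfit(grade3, grade4, 1)
expected = slope * grade3 + intercept

# Growth relative to similarly achieving students: actual minus expected.
# A high achiever is no longer penalized for having less room to grow.
relative_growth = grade4 - expected
```

A slope below 1 is exactly the "high achievers gain fewer points" pattern: each extra point of starting score predicts less than one extra point of ending score.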

Why isn't this fair? Comparisons of gain at different schools before controlling for prior performance. [Chart: Schools A, B, and C have different student populations across the Advanced, Proficient, Basic, and Minimal levels. Under a simple gain model, the school serving mostly low-achieving students shows artificially inflated gain, while the school serving mostly high-achieving students shows artificially lower gain.]

Comparisons of Value-Added at different schools after controlling for prior performance. [Chart: Schools A, B, and C, with the same varied student populations across the Advanced, Proficient, Basic, and Minimal levels, can now be compared fairly.]

Checking for Understanding  What would you tell a teacher or principal who said Value-Added was not fair to schools with:  High-achieving students?  Low-achieving students?  Is Value-Added incompatible with the notion of high expectations for all students?

2. How does VARC choose what to control for? (proxy measures for causal factors)

Imagine we want to evaluate another pair of gardeners, and we notice that there is something else different about their trees that we have not controlled for in the model. In this example, Gardener F's Oak F has many more leaves than Gardener E's Oak E at age 5. Is this something we could account for in our predictions?

In order to be considered for inclusion in the Value-Added model, a characteristic must meet several requirements:
Check 1: Is this factor outside the gardener's influence?
Check 2: Do we have reliable data?
Check 3: If not, can we pick up the effect by proxy?
Check 4: Does it increase the predictive power of the model?

Check 1: Is this factor outside the gardener's influence?

Outside the gardener's influence:
 Starting tree height
 Rainfall
 Soil richness
 Temperature
 Starting leaf number

Gardener can influence:
 Nitrogen fertilizer
 Pruning
 Insecticide
 Watering
 Mulching

Check 2: Do we have reliable data?

Category | Measurement | Coverage
Yearly record of tree height | Height (inches) | 100%
Rainfall | Rainfall (inches) | 98%
Soil richness | Plant nutrients (PPM) | 96%
Temperature | Average temperature (degrees Celsius) | 100%
Starting leaf number | Individual leaf count | 7%
Canopy diameter | Diameter (inches) | 97%

Check 3: Can we approximate it with other data?

Category | Measurement | Coverage
Yearly record of tree height | Height (inches) | 100%
Rainfall | Rainfall (inches) | 98%
Soil richness | Plant nutrients (PPM) | 96%
Temperature | Average temperature (degrees Celsius) | 100%
Starting leaf number | Individual leaf count | 7%
Canopy diameter | Diameter (inches) | 97% (a possible proxy?)

Canopy diameter as a proxy for leaf count: the data we do have available about canopy diameter might help us measure the effect of leaf number. The canopy diameter might also be picking up other factors that may influence tree growth. We will check its relationship to growth to determine if it is a candidate for inclusion in the model. [Oak E has a canopy diameter of 33 in.; Oak F, 55 in.]

If we find a relationship between starting tree diameter and growth, we would want to control for starting diameter in the Value-Added model. [Chart: The Effect of Tree Diameter on Growth, plotting growth from year 5 to 6 (inches) against tree diameter (year 5 diameter in inches), first posed as an open question and then shown with a fitted trend.]
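Checking whether diameter is "a candidate for inclusion" can be as simple as measuring its correlation with growth. A minimal sketch with hypothetical measurements:

```python
import numpy as np

# Hypothetical year-5 canopy diameters (inches) and growth to year 6 (inches)
diameter = np.array([20.0, 33.0, 40.0, 48.0, 55.0, 60.0])
growth = np.array([6.0, 9.5, 12.0, 14.0, 16.5, 18.0])

# Pearson correlation between starting diameter and subsequent growth
r = np.corrcoef(diameter, growth)[0, 1]

# A strong relationship (|r| well above 0) suggests diameter carries
# information worth controlling for; a near-zero r suggests it does not.
```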

What happens in the education context?
Check 1: Is this factor outside the school or teacher's influence?
Check 2: Do we have reliable data?
Check 3: If not, can we pick up the effect by proxy?
Check 4: Does it increase the predictive power of the model?

Outside the school’s influence At home support English language learner status Gender Household financial resources Learning disability Prior knowledge School can influence Curriculum Classroom teacher School culture Math pull-out program at school Structure of lessons in school Safety at the school Check 1: Is this factor outside the school or teacher’s influence? Let’s use “Household financial resources” as an example

Check 2: Do we have reliable data? What we want: household financial resources.

Check 3: Can we approximate it with other data?
Using your knowledge of student learning, why might "household financial resources" have an effect on student growth?
What we want: household financial resources. What we have: free/reduced lunch status (related data).
Check 4, "Does it increase the predictive power of the model?", is answered by a multivariate linear regression model based on real data from your district or state, which determines whether FRL status has an effect on student growth.
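Check 4 can be illustrated by comparing a model's fit with and without the candidate variable. This is a minimal sketch using synthetic data and ordinary least squares; the variable names and effect sizes are invented for illustration, and VARC's actual model is more elaborate.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
prior_score = rng.normal(500.0, 50.0, n)    # synthetic prior-year scores
frl = rng.integers(0, 2, n).astype(float)   # synthetic FRL indicator (0/1)

# Synthetic growth that truly depends on prior score and FRL, plus noise
growth = (20.0 - 0.02 * (prior_score - 500.0)
          - 4.0 * frl + rng.normal(0.0, 3.0, n))

def r_squared(X, y):
    """R^2 of an ordinary least-squares fit of y on X (with an intercept)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid.var() / y.var()

r2_without = r_squared(prior_score[:, None], growth)
r2_with = r_squared(np.column_stack([prior_score, frl]), growth)

# If r2_with is meaningfully larger, the proxy passes Check 4.
```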

What about race/ethnicity? What we have: race/ethnicity. What we want: general socio-economic status, family structure, family education, social capital, environmental stress. The point is that race/ethnicity may correlate with these underlying factors (a correlation, not a causal relationship), not that race/ethnicity itself causes higher or lower performance. Check 4 will use real data from your district or state to determine if race/ethnicity has an effect on student growth. If there is no effect, we decide with our partners whether to leave it in.

What about race/ethnicity? If there is a detectable difference in growth rates  We attribute this to a district or state challenge to be addressed  Not as something an individual teacher or school should be expected to overcome on their own

Checking for Understanding
 What would you tell a 5th grade teacher who said they wanted to include the following in the Value-Added model for their results?
A. 5th grade reading curriculum
B. Their students' attendance during 5th grade
C. Their students' prior attendance during 4th grade
D. Student motivation
Remember the checks:
Check 1: Is this factor outside the school or teacher's influence?
Check 2: Do we have reliable data?
Check 3: If not, can we pick up the effect by proxy?
Check 4: Does it increase the predictive power of the model?

3. What if a gardener just gets lucky or unlucky? (groups of students and confidence intervals)

One year ago, Gardener G's Oak A was age 3. Oak A's predicted growth: 10 inches.

Oak A's actual growth: 2 inches. For an individual tree, our predictions do not account for random events.

Gardeners are assigned to many trees. Each tree has an independent prediction based on its circumstances (starting height, rainfall, soil richness, temperature).

How confident would you be about the effect of these gardeners?

Gardener G:
 Total trees assigned: 5
 Trees that missed predicted growth: 3 (2 due to an unlucky year, 1 due to the gardener)
 Trees that beat predicted growth: 2 (2 due to the gardener, 0 due to a lucky year)

Gardener H:
 Total trees assigned: 50
 Trees that missed predicted growth: 30 (7 due to an unlucky year, 23 due to the gardener)
 Trees that beat predicted growth: 20 (15 due to the gardener, 5 due to a lucky year)
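The intuition that 50 trees support a more confident judgment than 5 can be sketched with a small simulation (the effect size and noise level below are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)
true_effect = 2.0   # hypothetical gardener effect, in inches of extra growth
noise_sd = 5.0      # per-tree random luck (weather, pests, chance)

def estimate_with_ci(n_trees):
    """Mean observed effect and an approximate 95% confidence interval."""
    observed = true_effect + rng.normal(0.0, noise_sd, n_trees)
    mean = observed.mean()
    half_width = 1.96 * noise_sd / np.sqrt(n_trees)
    return mean, (mean - half_width, mean + half_width)

mean_g, ci_g = estimate_with_ci(5)    # like Gardener G: wide interval
mean_h, ci_h = estimate_with_ci(50)   # like Gardener H: much narrower
```

With 5 trees the interval is sqrt(50/5) ≈ 3.2 times wider than with 50, which is why a lucky or unlucky year can dominate a small group but tends to average out in a large one.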

Reporting Value-Added In the latest generation of Value-Added reports, estimates are color coded based on statistical significance. This represents how confident we are about the effect of schools and teachers on student academic growth. Green and Blue results are areas of relative strength. Student growth is above average. Gray results are on track. In these areas, there was not enough data available to differentiate this result from average. Yellow and Red results are areas of relative weakness. Student growth is below average.

On the report scale, 3.0 represents meeting predicted growth for your students. Since predictions are based on the actual performance of students in your district or state, 3.0 also represents the district or state average growth for students similar to yours. Numbers lower than 3.0 represent growth that did not meet prediction: students are still learning, but at a rate slower than predicted. Numbers higher than 3.0 represent growth that beat prediction: students are learning at a rate faster than predicted. Value-Added is displayed on a 1-5 scale for reporting purposes, and about 95% of estimates will fall between 1 and 5 on the scale.
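One way such a scale could be constructed (an assumption for illustration, not VARC's documented formula) is to standardize the raw estimates and center them at 3.0, so that roughly 95% of values land between 1 and 5:

```python
import numpy as np

rng = np.random.default_rng(7)
raw = rng.normal(0.0, 4.0, 10_000)   # hypothetical raw value-added effects

# Standardize to mean 0 and SD 1, then center the reporting scale at 3.0
z = (raw - raw.mean()) / raw.std()
scaled = 3.0 + z

share_in_1_to_5 = np.mean((scaled >= 1.0) & (scaled <= 5.0))
# Since about 95% of z-scores fall within +/-2, share_in_1_to_5
# should be close to 0.95.
```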

Value-Added estimates are provided with a confidence interval. [Report example: Grade 4 Reading, 30 students, 95% confidence interval.] Based on the data available for these thirty 4th Grade Reading students, we are 95% confident that the true Value-Added lies between the endpoints of this confidence interval (between 3.2 and 4.4 in this example), with the most likely estimate at the center of the interval.

Confidence Intervals. [Report example: Grade 4 Reading and Math results with confidence intervals for several groups of students.] Color coding is based on the location of the confidence interval. The more student data available for analysis, the more confident we can be that growth trends were caused by the teacher or school (rather than random events).

Checking for Understanding  A teacher comes to you with their Value-Added report wondering why it’s not a green or blue result.  She wants to know why there’s a confidence interval at all when VARC had data from each and every one of her students. (Don’t we know exactly how much they grew?) Grade READING 3