Human Capital Policies in Education: Further Research on Teachers and Principals, 5th Annual CALDER Conference, January 27th, 2012.


1 Human Capital Policies in Education: Further Research on Teachers and Principals, 5th Annual CALDER Conference, January 27th, 2012

2 Where You Come From Or Where You Go? Distinguishing Between School Quality And The Effectiveness Of Teacher Preparation Programs
January 27, 2012
Kata Mihaly, RAND
J. R. Lockwood, RAND
Daniel McCaffrey, RAND
Tim Sass, Georgia State University

3 Introduction
- Improving teacher effectiveness is one of four pillars of education reform under the Obama Administration
- States are using evidence-based techniques to evaluate teacher preparation program effectiveness
- One technique links student achievement to the preparation program where the students' teacher was trained and certified
- Among the many concerns is that school context could affect preparation program estimates

4 Table 1 - Characteristics of Schools Where Graduates from a Sample Program in Florida were Hired (N=22)

                           Mean      Std. Dev.   Min       Max
Black                      0.3243    0.3123      0         1
Hispanic                   0.2490    0.2158      0         0.8000
Female                     0.4791    0.1296      0         0.6000
Parent no English          0.3297    0.2323      0         0.8000
LEP                        0.1249    0.1314      0         0.4093
Free Lunch                 0.5000    0.2723      0         0.9465
Math gain score (norm)    -0.0276    0.3364     -0.6662    0.6895
New Teachers               0.4065    0.2566      0         1
Number of Prep Programs    1.6818    0.8937      1         4

5 Introduction
- There are many potential problems in linking preparation programs to student achievement:
  - selection of teachers into and out of programs
  - selection of program graduates into teaching positions
  - how teacher performance is measured
- Here we consider the problem of distinguishing preparation program effects from the environment of the schools where graduates teach
- We estimate preparation program effectiveness using a value-added model of student achievement with data from Florida

6 Overview of Research Questions
1. Can school fixed effects be included in the value-added model?
2. If yes, does the inclusion of school fixed effects change preparation program estimates?
3. What are the implications of including school fixed effects for the precision of estimates?
4. Are fixed effects suitable in this setting?
5. What is the impact of the sample restrictions:
   - Years of data
   - Inexperienced teachers

7 Prior Research Comparing Value-Added of Teachers Across Preparation Programs
- Models with Program and School Fixed Effects
  - New York City -- Boyd, et al. (2008)
  - Florida -- Sass (2008)
  - Kentucky -- Kukla-Acevedo, et al. (2009)
- HLM Models
  - Texas -- Mellor, et al. (2010)
  - Louisiana -- Noell, et al. (2009)

8 Data
- Analyze recent graduates (<3 years of experience) from Florida elementary education teacher preparation programs
  - teaching in grades 4 and 5 during the 2000/01-2004/05 period
- 33 preparation programs
  - 1 to 496 teacher graduates in tested grades/subjects
  - Graduates from a single program working in 1 to 271 schools
- Over 550,000 students
- Sample also includes experienced teachers and those certified out of state or in Florida through alternative pathways

9 Fixed Effects Identification
- School fixed effect estimation is only possible if all preparation programs are linked to one another through the schools where their graduates teach
- Preparation programs do not need to be linked directly
  - as long as there are some new teachers in the school who graduated from other programs
- Regional clustering could lead to stratification
  - Work of Boyd et al. (2005) on the "draw of home" suggests graduates tend to teach in schools close to where they grew up or where they went to college
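The linkage condition on this slide can be checked mechanically: treat programs as nodes and merge any two programs that place new graduates in the same school; school fixed effects are identified only if a single connected component remains. A minimal union-find sketch with made-up placement pairs (the program and school names are illustrative, not the Florida sample):

```python
from collections import defaultdict

def programs_connected(placements):
    """placements: (program, school) pairs for new teachers.
    True if every program is reachable from every other program
    through shared schools (union-find over programs)."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    by_school = defaultdict(list)
    for prog, school in placements:
        by_school[school].append(prog)
        find(prog)  # register every program as a node
    for progs in by_school.values():
        for p in progs[1:]:
            union(progs[0], p)
    return len({find(p) for p in parent}) == 1

# Programs A and B share school s1, but C only appears alone in s3:
print(programs_connected([("A", "s1"), ("B", "s1"), ("C", "s3")]))               # False
# A B graduate in s3 links C into the network, even with no direct A-C tie:
print(programs_connected([("A", "s1"), ("B", "s1"), ("B", "s3"), ("C", "s3")]))  # True
```

The second call illustrates the slide's point that programs need not be linked directly: C never shares a school with A, yet the chain through B connects them.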

10 Figure 1 - Estimated Probability of Preparation Program Graduate Teaching at School with at Least one Graduate from another Program as a Function of Distance from Program to School Negative relationship indicates graduates are more likely to teach in schools closer to where they graduated

11 Figure 2 – Preparation Program and School Connections Shade of line represents strength of connection - the number of graduates from a program going to that school

12 Figure 3 – Preparation Program Network Using a 5-Year Data Window All preparation programs are connected, so school fixed effect estimation is possible

13 Model - Preparation Program Effectiveness
- We estimate a model of student achievement gains as a function of student characteristics, teacher experience, grade and year indicators, and program fixed effects
- Program effects are estimated relative to the average preparation program in Florida using the Stata felsdvregdm command
- We rank programs on effectiveness and divide the rankings into quartiles
- We compare the rankings and ranking quartiles with and without school fixed effects
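Estimating program effects "relative to the average program" corresponds to sum-to-zero (deviation) coding of the program dummies. A hedged sketch of that coding step on simulated data; the sample size, covariate, and effect values are invented for illustration, and the actual model also includes the other controls listed above:

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_prog = 600, 4
prog = rng.integers(0, n_prog, size=n)       # each observation's preparation program
x = rng.normal(size=n)                       # stand-in for student covariates
true_eff = np.array([0.3, -0.1, 0.1, -0.3])  # invented program effects, sum to zero
gain = 0.5 * x + true_eff[prog] + rng.normal(scale=0.5, size=n)

# Deviation ("sum-to-zero") coding: the last program is the omitted category
# and is coded -1 on every dummy, so each coefficient is the program's effect
# relative to the average program rather than relative to a base program.
D = np.zeros((n, n_prog - 1))
for j in range(n_prog - 1):
    D[:, j] = (prog == j).astype(float) - (prog == n_prog - 1).astype(float)
X = np.column_stack([np.ones(n), x, D])
beta, *_ = np.linalg.lstsq(X, gain, rcond=None)

eff = np.append(beta[2:], -beta[2:].sum())   # all program effects; sum to zero
print(np.round(eff, 2))
```

The recovered effects sum to zero by construction, which is the same normalization the paper's relative-to-average estimates use.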

14 Table 2 - Top Tier Preparation Programs

              No School FE           With School FE
Program ID    Rank   Rank Quartile   Rank   Rank Quartile
20             1          1           6          1
32             2          1          32          4
17             3          1           3          1
4              4          1           9          2
7              5          1          13          2
28             6          1           2          1
13             7          1           7          1
12             8          1          14          2
19            10          2           4          1
31            18          3           1          1
24            20          3           5          1
26            27          4           8          1

15 Table 3 - Bottom Tier Preparation Programs

              No School FE           With School FE
Program ID    Rank   Rank Quartile   Rank   Rank Quartile
32             2          1          32          4
14            14          2          29          4
11            26          4          10          2
26            27          4           8          1
22            28          4          30          4
15            29          4          28          4
23            30          4          26          4
27            31          4          31          4
21            32          4          27          4
33            33          4          33          4

16 Results - Preparation Program Rankings
- Rankings are significantly affected by the inclusion of school fixed effects in the value-added model
- Of the 12 programs in the top quartile of rankings in either specification, 8 are in a different quartile under the other specification
- The bottom quartile of rankings is more stable, with 6 programs in this quartile under both specifications
- 2 programs switch from the bottom quartile of rankings in one specification to the top in the other
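The quartile comparisons above can be reproduced from the two rank vectors alone. A small sketch with toy ranks; the quartile rule (ceiling of 4·rank/n) is an assumption, chosen because it matches the cutpoints visible in Tables 2 and 3 (with 33 programs, rank 8 falls in quartile 1 and rank 9 in quartile 2):

```python
import math

def quartile(rank, n):
    """Map a 1-based rank among n programs to a quartile 1-4."""
    return math.ceil(4 * rank / n)

def quartile_moves(ranks_a, ranks_b):
    """ranks_a, ranks_b: dicts of program -> rank under two specifications.
    Returns the programs whose ranking quartile differs between them."""
    n = len(ranks_a)
    return {p for p in ranks_a
            if quartile(ranks_a[p], n) != quartile(ranks_b[p], n)}

# Toy example: P1 and P4 trade places across specifications
no_fe = {"P1": 1, "P2": 2, "P3": 3, "P4": 4}
with_fe = {"P1": 4, "P2": 2, "P3": 3, "P4": 1}
print(sorted(quartile_moves(no_fe, with_fe)))  # ['P1', 'P4']
```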

17 Results - Variance Inflation
- Schools where all new teachers came from a single program do not help identify preparation program effects in the school fixed effect model: ~32% of our sample of teachers
- Loss of these teachers can greatly inflate the standard errors of the estimated program effects for some programs
- The standard errors of the preparation program estimates increase by an average of 16.5% after including school fixed effects
- The variance inflation is severe for 10 of the 33 programs, with standard errors increasing over 20%
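The reported inflation figures are percentage changes in program-level standard errors between the two specifications. A sketch of that bookkeeping with invented SE values (not the paper's estimates):

```python
def se_inflation(se_base, se_fe):
    """Percent increase in each program's standard error after adding
    school fixed effects; returns the average and the 'severe' (>20%) cases."""
    pct = [(b / a - 1) * 100 for a, b in zip(se_base, se_fe)]
    avg = sum(pct) / len(pct)
    severe = [i for i, p in enumerate(pct) if p > 20]
    return avg, severe

avg, severe = se_inflation([0.10, 0.08, 0.05], [0.11, 0.10, 0.055])
print(round(avg, 1), severe)  # 15.0 [1]
```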

18 Homogeneity Assumption
- School fixed effects yield consistent estimates of program effectiveness only if the teachers and schools that create the connections among programs do not differ systematically from other teachers and schools in the state
- Three tests of the homogeneity assumption:
  1. Are schools with graduates from 1 program different from schools with graduates from more than 4 programs?
  2. Are teachers who teach in schools with graduates from 1 program different from teachers who teach in schools with graduates from more than 4 programs?
  3. Are central schools, the ones that help the most in connecting preparation programs, different from other schools in the state?

19 Figure 4 - Central School Locations Central schools have a disproportionately high influence in identifying program effects

20 Results - Homogeneity Assumption
Three tests of the homogeneity assumption:
1. Schools different? YES: schools with new teachers from 4+ preparation programs are larger, with higher % black and Hispanic students, lower test scores, and higher % free lunch
2. Teachers different? YES: teachers in schools with graduates from 4+ preparation programs are on average more likely to be black or Hispanic, with lower test scores and lower SAT scores
3. Central schools different? YES: larger schools, with higher % Hispanic, immigrant, and LEP students

Homogeneity assumption likely violated

21 Years of Data
- The data window length affects program connections
  - programs will have a tie through a school if they both have a graduate teaching in the school sometime during the window
- As we lengthen the window:
  - more programs will have ties, making estimation possible
  - however, this requires the assumption that both school and program effects are constant over the entire window
- Other implications:
  - Variance inflation
  - Homogeneity assumption
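The window's effect on connectivity can be checked by counting connected components among programs after filtering placements to the window. A self-contained sketch with invented (program, school, year) triples:

```python
from collections import defaultdict

def n_components(placements, start, end):
    """placements: (program, school, year) triples for new teachers.
    Counts connected components among the programs active in the window,
    linking two programs whenever a school employs graduates of both
    at some point during [start, end]."""
    adj = defaultdict(set)
    by_school = defaultdict(set)
    for prog, school, year in placements:
        if start <= year <= end:
            by_school[school].add(prog)
            adj[prog]  # register the program even if it gets no edges
    for progs in by_school.values():
        progs = list(progs)
        for p in progs[1:]:
            adj[progs[0]].add(p)
            adj[p].add(progs[0])
    seen, comps = set(), 0
    for node in list(adj):
        if node in seen:
            continue
        comps += 1
        stack = [node]
        while stack:
            v = stack.pop()
            if v not in seen:
                seen.add(v)
                stack.extend(adj[v] - seen)
    return comps

data = [("A", "s1", 2001), ("B", "s1", 2003), ("B", "s2", 2001), ("C", "s2", 2004)]
print(n_components(data, 2001, 2004))  # 1: A-B tie via s1, B-C tie via s2
print(n_components(data, 2001, 2002))  # 2: only A (s1) and B (s2) remain, unlinked
```

Shrinking the window from four years to two disconnects the toy network, mirroring the result on the next slide that a 2-year Florida window leaves some programs disconnected.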

22 Results - Years of Data
- Identification is robust to a shorter window length
  - Even with 3 years of data the school fixed effects can be identified
  - Restricting to 2 years of data leaves 3 preparation programs disconnected
- Variance inflation is worse
  - Due to an increase in the proportion of teachers in schools with graduates from a single program, who are not used to estimate program effects
- Characteristics of schools that contribute to program estimates with school fixed effects are very similar
  - Somewhat larger schools and higher % Hispanic

23 Sample of Teachers
- In the results reported so far, only inexperienced teachers (<3 years of experience) were included in the analysis
- This restriction is warranted if the impact of the preparation program dissipates as the teacher gains experience on the job
- We can include experienced teachers in the sample while restricting program effects to exclude these teachers
  - but experienced teachers change the connections between schools and preparation programs
- Experienced teachers will help identify school fixed effects
  - This can reduce the variance of program effects
  - But our ability to compare programs could then rely on differences between schools in their experienced teachers

24 Results - Sample of Teachers
- Preparation program ranking quartiles are unaffected by the inclusion of experienced teachers in models without school fixed effects
- Ranking quartiles in models with school fixed effects change for 8 of 33 programs
  - 3 of these 8 changes are of more than 2 quartiles
- Including experienced teachers lowers program effect variances in models with school fixed effects by 13% on average

25 Summary and Conclusions
- Good news for school fixed effects models:
  - Despite regional clustering, Florida preparation programs are connected, so use of school fixed effects is feasible
  - There is significant variation in school characteristics among graduates of any preparation program, so use of school fixed effects is desirable
- Bad news for models with school fixed effects:
  - Including school fixed effects inflates the variance of estimated program effects
  - The homogeneity assumption is likely violated
  - Preparation program effectiveness rankings differ significantly with and without school fixed effects

26 Summary and Conclusions
- A 3-year data window with school fixed effects may provide a reasonable compromise between bias and variance inflation
- However, there is no clean empirical method to identify a model with no bias, or a model that yields program effect estimates with the smallest MSE
- States will need to make a choice knowing that the choice may affect preparation program rankings and might yield biased estimates unless untestable assumptions hold
- States may need to consider whether value-added modeling alone can provide useful information about preparation programs

