
Pupil-Teacher Ratios, Locally-Hired Contract Teachers, and School-Based Management: Evidence from Kenya. Pascaline Dupas, UCLA. Making Schools Accountable: What Works? World Bank, June 22, 2009

Kenya Extra-Teacher Study  Collaborative effort:  Academics: Esther Duflo, Michael Kremer and myself  Implementing NGO: ICS Africa  Funding from World Bank (BNPP)

Kenyan Context: Free Primary Education  Free Primary Education started in 2003  Enrollment in primary school increased from 5.9 to 7.6 million, particularly in the lower grades  The reform reduced income for school committees, leaving fewer locally-hired teachers  Average pupil-teacher ratio (PTR) in Grade 1: 80 in the study area  Greater heterogeneity of student preparation

ICS Extra Teacher Program (ETP)
 Ran for two academic (= calendar) years: 2005 and 2006
 Involved 210 schools: 70 control schools, 140 ETP treatment schools
 ETP treatment: provided funds to the school committee to hire an extra teacher locally
   Extra teacher required to have the same qualifications as civil service teachers
   Salary: 2,500 Ksh (~$35) a month, compared to ~7,000 Ksh a month plus benefits for civil service teachers
   Short-term contract, renewable after a year; the school could dismiss the extra teacher if performance was unsatisfactory

Mechanics of the ETP program
 Extra teacher assigned to Grade 1
 Added one section in Grade 1: from 1 to 2 sections in most schools, from 2 to 3 in a few schools
 Extra teacher randomly assigned to one section; followed the students in that section through Grade 2 (rather than rotating across sections)
 Division of students between sections was done either at random (70 schools) or based on initial preparation level (70 schools)
 Schools were supposed to treat the two teachers equally
 Resources were supposed to be shared equally
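
To make the assignment mechanics concrete, here is a minimal Python sketch of the within-school procedure described above: splitting the Grade 1 cohort into two sections either at random or by initial preparation, and then randomly assigning the contract teacher to one section. The function and variable names, the 50/50 split, and the example data are illustrative assumptions, not the study's actual implementation.

    import random

    def assign_sections(pupils, track_by_preparation, baseline_scores=None, seed=0):
        """Split a school's Grade 1 pupils into two sections and randomly
        assign the locally hired contract teacher to one of them.

        pupils: list of pupil IDs
        track_by_preparation: True in tracking schools (split by baseline score),
                              False where the split is random
        baseline_scores: dict pupil_id -> baseline score (needed when tracking)
        """
        rng = random.Random(seed)
        if track_by_preparation:
            # Tracking schools: order pupils by initial preparation, then cut in half
            ordered = sorted(pupils, key=lambda p: baseline_scores[p])
        else:
            # Non-tracking schools: a random 50/50 split
            ordered = pupils[:]
            rng.shuffle(ordered)
        mid = len(ordered) // 2
        section_a, section_b = ordered[:mid], ordered[mid:]

        # The contract teacher is randomly assigned to one section and stays
        # with those pupils through Grade 2 (no rotation).
        contract_section = rng.choice(["A", "B"])
        return {"A": section_a, "B": section_b, "contract_teacher_section": contract_section}

    # Example: a hypothetical school with 80 Grade 1 pupils, split at random
    example = assign_sections([f"pupil_{i}" for i in range(80)], track_by_preparation=False)
    print(len(example["A"]), len(example["B"]), example["contract_teacher_section"])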

School-Based Management (SBM)
 Add-on implemented in half of the ETP schools
 Designed to enhance the role of parents in monitoring the ETP teacher
 Training of the school committee on how to monitor the contract teacher's performance:
   Soliciting input from parents
   Checking teacher attendance
 Formal subcommittee to evaluate the contract teacher's performance; review meeting at the end of the first year of the contract to decide whether to renew

Questions this design can answer:  Can hiring contract teachers locally at low pay help increase students' learning? Can contract teachers perform well despite their lack of experience and low pay?  Can empowering the community to monitor teachers' performance increase teachers' effort and students' learning?  Does lowering the pupil-teacher ratio improve learning?  Do more homogeneous classes increase average learning? Do they hurt the students who are "tracked" into the lower-performing class?

Outcomes of Interest  Final outcome: Test scores  Intermediate outcomes:  Teacher Effort  Student Attendance

Effects on Test Scores
 Overall: test score gain of 0.16 standard deviations in ETP schools relative to comparison schools
 But not every student benefited equally:
   Students of civil service teachers: no significant gain relative to comparison schools, despite the reduction in class size from ~80 to ~40
   Students of contract teachers: scored 0.23 SD more than students of civil service teachers in the same schools
   Students in SBM schools: not affected if assigned to the contract teacher; if assigned to a civil service teacher, no significant gain in literacy, but scored 0.18 SD more in math than comparable students in non-SBM ETP schools
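
These effects are intent-to-treat comparisons of standardized endline scores between randomly assigned groups. The sketch below shows one standard way to compute such an estimate, with standard errors clustered at the school (the unit of randomization); the dataset is synthetic, the column names are assumptions, and the 0.16 SD effect planted in the data simply mirrors the figure reported above. The study's actual specifications include controls and interactions not shown here.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Synthetic pupil-level data standing in for the real endline file.
    # score_sd : endline score standardized on the comparison-group distribution
    # etp      : 1 if the pupil's school was randomly assigned to the Extra Teacher Program
    # school_id: unit of randomization, used to cluster standard errors
    rng = np.random.default_rng(0)
    n_schools, pupils_per_school = 210, 40
    school_id = np.repeat(np.arange(n_schools), pupils_per_school)
    etp = (school_id < 140).astype(int)                      # 140 ETP schools, 70 comparison
    score_sd = 0.16 * etp + rng.normal(size=school_id.size)  # assumed effect of 0.16 SD
    df = pd.DataFrame({"score_sd": score_sd, "etp": etp, "school_id": school_id})

    # Intent-to-treat effect of ETP assignment, clustered at the school level
    itt = smf.ols("score_sd ~ etp", data=df).fit(
        cov_type="cluster", cov_kwds={"groups": df["school_id"]}
    )
    print(itt.params["etp"], itt.bse["etp"])  # effect in SD units and its clustered SE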

Test Scores at End of Program

Possible explanations for the test score results
 Why such a large contract teacher effect?
   Incentives: short-term renewable contract, possibility to become permanent
   More likely to be local
   Less rotation: continuity could be good for students, and could also increase accountability
 Why didn't the reduction in the pupil-teacher ratio increase scores?
   Civil service teachers did not change their teaching technique?
   Increased absence?
 Why an SBM effect on civil service teachers?
   SBM reinforces the mission of the contract teacher
   Civil service teachers cannot expect contract teachers to take over their classes

Teacher Effort
 Contract teachers
   In class and teaching 29.1 percentage points more often than their civil service counterparts in the same school
   No significant effect of SBM (not surprising, since contract teachers are essentially teaching all the time)
 Civil service teachers
   Probability of being in class and teaching 12.9 percentage points lower in ETP schools than in comparison schools (from a base of 58.2%)
   Displacement effect weaker in SBM schools (7.3 percentage points more likely to be in class and teaching, not quite significant) and in schools grouping students by initial preparation (9.2 percentage points more likely to be in class and teaching)
   Classes were sometimes combined when a teacher was absent
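
The effort measures above are shares of unannounced spot checks in which a teacher was found in class and teaching. As a hedged illustration of how such percentage-point differences can be estimated, the sketch below fits simple linear probability models on a synthetic spot-check dataset; the column names (teaching, contract, etp, sbm, school_id) are assumptions, and the planted effort levels roughly mirror the shares reported above.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Synthetic spot-check data: one row per unannounced teacher observation.
    # teaching : 1 if the teacher was in class and teaching at the time of the visit
    # contract : 1 for the locally hired contract teacher, 0 for a civil service teacher
    # etp, sbm : school-level treatment indicators; school_id clusters the standard errors
    rng = np.random.default_rng(1)
    n = 4000
    school_id = rng.integers(0, 210, size=n)
    etp = (school_id < 140).astype(int)
    sbm = ((school_id < 70) & (etp == 1)).astype(int)          # assumed: SBM in half of ETP schools
    contract = ((rng.random(n) < 0.5) & (etp == 1)).astype(int)
    p = 0.58 - 0.13 * etp * (1 - contract) + 0.29 * contract   # assumed effort levels
    teaching = (rng.random(n) < p).astype(int)
    checks = pd.DataFrame({"teaching": teaching, "contract": contract,
                           "etp": etp, "sbm": sbm, "school_id": school_id})

    # Within ETP schools: contract vs. civil service teachers (linear probability model)
    within_etp = checks[checks["etp"] == 1]
    m1 = smf.ols("teaching ~ contract", data=within_etp).fit(
        cov_type="cluster", cov_kwds={"groups": within_etp["school_id"]})

    # Civil service teachers only: ETP displacement effect and whether SBM dampens it
    civil = checks[checks["contract"] == 0]
    m2 = smf.ols("teaching ~ etp + etp:sbm", data=civil).fit(
        cov_type="cluster", cov_kwds={"groups": civil["school_id"]})
    print(m1.params["contract"], m2.params["etp"], m2.params["etp:sbm"])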

Student Attendance  Contract teacher: increases student attendance by 1.7 percentage points (from a base of 86.1%)  SBM: increases student attendance by 2.8 percentage points

Grouping Students by Initial Preparation  Tracking appears to be effective  Raises average test scores by approximately 0.13 SD  Gains throughout the distribution of initial scores  Consistent with a focus model of peer effects  Highlights the importance of the teachers' behavioral response
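
One simple way to check whether gains occur throughout the distribution of initial scores is to estimate the tracking effect separately by baseline quartile. The sketch below does this on a synthetic pupil-level dataset restricted to ETP schools; the column names are assumptions and the planted 0.13 SD effect mirrors the average reported above, so this is illustrative rather than the paper's estimation strategy.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Synthetic pupil-level data for the 140 ETP schools.
    # tracking: 1 if the school grouped pupils by initial preparation, 0 if sections were split at random
    rng = np.random.default_rng(2)
    n_schools, pupils_per_school = 140, 40
    school_id = np.repeat(np.arange(n_schools), pupils_per_school)
    tracking = (school_id < 70).astype(int)                    # 70 tracking, 70 non-tracking schools
    baseline = rng.normal(size=school_id.size)
    score_sd = 0.5 * baseline + 0.13 * tracking + rng.normal(size=school_id.size)  # assumed 0.13 SD effect
    df = pd.DataFrame({"score_sd": score_sd, "tracking": tracking,
                       "baseline": baseline, "school_id": school_id})
    df["baseline_quartile"] = pd.qcut(df["baseline"], 4, labels=[1, 2, 3, 4])

    # Tracking effect on standardized endline scores, estimated within each baseline quartile
    for q, sub in df.groupby("baseline_quartile", observed=True):
        fit = smf.ols("score_sd ~ tracking", data=sub).fit(
            cov_type="cluster", cov_kwds={"groups": sub["school_id"]})
        print(q, round(fit.params["tracking"], 3), round(fit.bse["tracking"], 3))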

Long-term results only persist with tracking: test scores at the end of 2007 (one year after the ETP program had ended)

Conclusions and Caveats
 Scaling up the ETP/SBM/dedicated-teacher-assignment combination is attractive in this context
   Raises test scores for students
   Costs are modest
 Caveats in generalizing
   Contract teachers were trained
   ETP teachers may have been motivated by the prospect of obtaining civil service positions
   Hard to isolate the impact of dedicated teacher assignment (no-rotation effect)
   Adding civil service teachers might have different effects