Evaluation design for Achieve Together
Ellen Greaves and Luke Sibieta
© Institute for Fiscal Studies
Achieve Together
Brings together three programmes in a school:
– Teach First
– Teaching Leaders
– Future Leaders
An intensive human capital investment. The original motivation was also to encourage schools to work together and to engage the local community and local organisations in school improvement.
Cluster design: difficult to evaluate quantitatively.
Evaluation and pilot funded by the Education Endowment Foundation (EEF).
Outline
– The original design of the evaluation
– What went wrong: design of the pilot; recruitment (round 1); recruitment (round 2)
– Final design of the evaluation
– Lessons for evaluators
Achieve Together: two pilots
1. Area-based design
– One cluster in Bournemouth: 4 primary schools and 6 secondary schools
– Involvement of the local community and local organisations
– Process evaluation
2. School-level human capital investment
– School-level intervention
– No co-ordination within clusters or involvement of external organisations
– Quantitative evaluation and process evaluation
Original evaluation design
– Randomised controlled trial
– Number of schools fixed by the EEF: 24 treatment and 24 control
– Primary outcomes: attainment at KS4; attainment at Year 7 (the focus of the Achieve Together impact project)
– Secondary outcomes: number of persistent absentees; overall absence rate
– Subgroups: pupils eligible for free school meals; pupils with low prior attainment
– “Business as usual” in control schools: able to access one programme element of Achieve Together
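The slides do not describe the assignment mechanism beyond "24 treatment and 24 control". As a minimal sketch, assuming hypothetical school identifiers and a simple unstratified draw (a real trial would typically stratify on school characteristics), the allocation could be generated like this:

```python
import numpy as np

def assign_schools(school_ids, n_treated=24, seed=2013):
    """Draw a simple unstratified allocation: exactly n_treated schools treated,
    the rest control."""
    rng = np.random.default_rng(seed)
    shuffled = rng.permutation(np.array(school_ids))
    treated = set(shuffled[:n_treated])
    return {s: ("treatment" if s in treated else "control") for s in school_ids}

# Hypothetical identifiers for the 48 schools the design called for.
allocation = assign_schools([f"school_{i:02d}" for i in range(1, 49)])
```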
Power calculations
[Table: minimum detectable effect sizes under Models 1–3]
Note: These calculations represent the effect size that it will be possible to detect using a two-sided hypothesis test with a significance level of 5% and power of 80% against the alternative hypothesis. Model 1 reports the minimum detectable effect size when the variance of the outcome unexplained by attributes of the pupils (including prior attainment) is 60%. Model 2 reports a less optimistic scenario (70% unexplained), whilst Model 3 is more optimistic (50% unexplained).
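The table's figures are not reproduced here, but the calculation the note describes can be sketched. The snippet below computes an approximate minimum detectable effect size for a two-arm school-randomised trial under each model's unexplained-variance share; the number of schools, pupils per school and intra-cluster correlation are placeholder values, not figures from the evaluation protocol.

```python
from scipy.stats import norm

def mdes_cluster_rct(n_clusters, cluster_size, icc, unexplained_share,
                     prop_treated=0.5, alpha=0.05, power=0.80):
    """Approximate minimum detectable effect size (in standard-deviation units)
    for a two-arm trial randomised at the school level, normal approximation."""
    multiplier = norm.ppf(1 - alpha / 2) + norm.ppf(power)   # ~2.80 for 5% / 80%
    p = prop_treated
    # Assumes covariates remove the same share of variance at the school and
    # pupil level -- an assumption made for illustration, not from the protocol.
    var = (icc * unexplained_share / (p * (1 - p) * n_clusters)
           + (1 - icc) * unexplained_share / (p * (1 - p) * n_clusters * cluster_size))
    return multiplier * var ** 0.5

# Placeholder inputs: 48 schools (24 + 24), 150 pupils per school, ICC of 0.15.
for label, share in [("Model 1", 0.60), ("Model 2", 0.70), ("Model 3", 0.50)]:
    print(label, round(mdes_cluster_rct(48, 150, 0.15, share), 2))
```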
What went wrong: design of the pilot
The school-level RCT began to look clustered: recruitment was cluster-based and schools were expected to co-ordinate with one another. This complicates and creates risks for the evaluation:
1. What can we learn from the evaluation? Would a positive impact be due to the human capital approach, or to better co-ordination between schools? Our findings would be inconclusive.
2. How will the power calculations be affected? At the extreme, we can think of the unit of treatment as the cluster, which creates an uncertain risk for the minimum detectable effect size. The required treatment effect from power calculations with clustering at the school level already looked ambitious, and co-ordination between schools may increase the intra-cluster correlation, making a significant effect even harder to detect. A rough illustration follows below.
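To see why treating the cluster, rather than the school, as the effective unit of treatment threatens power, the standard design-effect formula is a useful back-of-the-envelope check. The intra-cluster correlation and pupil numbers below are placeholders, not values from the study.

```python
def design_effect(unit_size, icc):
    """Kish design effect: variance inflation from randomising groups of
    unit_size correlated pupils rather than independent pupils."""
    return 1 + (unit_size - 1) * icc

# Placeholder ICC; larger effective units mean a larger minimum detectable effect.
for unit_size, label in [(150, "one school"), (600, "a cluster of several schools")]:
    deff = design_effect(unit_size, icc=0.15)
    print(f"{label}: design effect {deff:.1f}, standard error inflated {deff ** 0.5:.1f}x")
```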
What went wrong: recruitment (round 1)
Target: 48 schools. Recruited: 13.
Problems for recruitment:
– Time available
– Uncertainty about staff availability
– Uncertainty about school budgets (for a costly programme)
– Risk of being allocated to the control group
– Clarity about the pilot
The recruited schools began Achieve Together in September 2013.
What went wrong: recruitment (round 2)
Target: 48 schools. Recruited: 15.
Problems for recruitment:
– Time available
– Uncertainty about staff availability
– Uncertainty about school budgets (for a costly programme)
– Risk of being allocated to the control group
– Clarity about the pilot
The recruited schools will begin Achieve Together in September 2014.
Final evaluation design
Non-experimental: matching, using a “well-matched comparison group” of schools that
1. are similar in terms of observable characteristics, and
2. expressed a strong interest in Achieve Together.
How credible are the non-experimental estimates? That depends on whether the factors that determine take-up and growth in pupil attainment are observable or unobservable.
To assess the credibility of the non-experimental matching estimates, the matching estimates for Achieve Together round 1 schools are compared against a “gold standard” comparison group of schools that are similar in both observable and unobservable characteristics: the Achieve Together round 2 schools.
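The slides do not list the matching variables or the matching algorithm. As a minimal sketch, nearest-neighbour matching on standardised observable school characteristics might look like the following; the covariate names and file names are hypothetical.

```python
import pandas as pd
from sklearn.neighbors import NearestNeighbors
from sklearn.preprocessing import StandardScaler

def match_comparison_schools(treated: pd.DataFrame, pool: pd.DataFrame,
                             covariates: list, k: int = 1) -> pd.DataFrame:
    """Return, for each treated school, its k nearest neighbours in the pool
    of non-participating schools, based on observable characteristics only."""
    scaler = StandardScaler().fit(pool[covariates])
    nn = NearestNeighbors(n_neighbors=k).fit(scaler.transform(pool[covariates]))
    _, idx = nn.kneighbors(scaler.transform(treated[covariates]))
    return pool.iloc[idx.ravel()]

# Hypothetical covariates and file names; the actual matching variables are not on the slide.
covariates = ["prior_attainment", "share_fsm", "school_size", "absence_rate"]
# treated = pd.read_csv("achieve_together_round1_schools.csv")
# pool = pd.read_csv("candidate_comparison_schools.csv")
# comparison_group = match_comparison_schools(treated, pool, covariates)
```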
Final evaluation design
[Diagram: circumstances in which matching is likely to be credible and in which it is unlikely to be credible]
Lessons for evaluators (1)
Evaluators must have good communication with the project team:
– How are plans for the pilot developing?
– What are the implications for the evaluation design?
– Why is the evaluation important?
Evaluators should be clear about the requirements of the evaluation:
– What is expected of control schools? Restrictions on “business as usual”.
– What is expected of treatment schools? Additional testing; involvement with the process evaluation.
– What are the non-negotiable elements of the evaluation?
Lessons for evaluators (2)
Recruitment can be difficult! What barriers does the evaluation impose, and can these be reduced?
Be creative: what evaluation design is feasible as circumstances change?
Be selective: what is the potential for a robust and informative evaluation? What are the risks to the evaluation?
Building Evidence in Education: Conference for EEF Evaluators 11th July: Theory 12th July: Practice
Designing an impact evaluation: Randomization, statistical power, and some more fun…
Donald T. Simeon Caribbean Health Research Council.
Challenges in evaluating social interventions: would an RCT design have been the answer to all our problems? Lyndal Bond, Kathryn Skivington, Gerry McCartney,
Pupil Premium Review Format of the day Additional information Questions asked Report findings Future actions.
Pilot studies Karla Hemming The University of Birmingham.
Children and Young People's Department Secondary and Primary Schools working together to support Primary Physical Education and Sport Gaby Crolla 3rd December.
Moving from Development to Efficacy & Intervention Fidelity Topics National Center for Special Education Research Grantee Meeting: June 28, 2010.
Impact of two teacher training programmes on pupils’ development of literacy and numeracy ability: a randomised trial Jack Worth National Foundation for.
1 Science Subject Leader Training Preparing new and aspiring Subject Leaders for the role. East Riding/North Yorkshire LAs.
Improving the Economic Efficiency of Clinical Trials.
EAST SUSSEX PRIMARY HEADS MEETING Development of Area Groups, Alliances and School to School support Margaret Coleman and Anne Radford 20th June 2014.
Designing Influential Evaluations Session 5 Quality of Evidence Uganda Evaluation Week - Pre-Conference Workshop 19th and 20th May 2014.
© Nuffield Trust 22 June 2015 Matched Control Studies: Methods and case studies Cono Ariti
Approaches to quantitative data analysis Lara Traeger, PhD Methods in Supportive Oncology Research.
SEED – CT’s System for Educator Evaluation and Development April 2013 Wethersfield Public Schools CONNECTICUT ADMINISTRATOR EVALUATION Overview of.
Success for All - Living up to its name? Dr Louise Tracey & Professor Bette Chambers 21 March 2013.
Narrowing the gap and the effective use of the Pupil and Service Premium with SEN young people Glyn Wright Autumn Term 2013.
© Crown copyright 2007 Study Plus training. © Crown copyright 2007 Aims of Study plus To accelerate the progress of pupils who are not on track to attain.
Compass: Module 3 Compass Requirements: Teachers’ Overall Evaluation Rating Student Growth Student Learning Targets (SLTs) Value-added Score (VAM) where.
Creating a lifelong sporting habit Embedding Commissioning for Sport & Leisure: the Community Sport Activation Fund Joel Brookfield London CLOA Conference.
© Nuffield Trust Inner North West London Integrated Care Pilot – year one evaluation 8 July 2013 Holly Holder Fellow in health policy Ian Blunt Senior.
Webinar: June 6, :00am – 11:30am EDT The Community Eligibility Option.
Healthy Child Development Suggestions for Submitting a Strong Proposal.
Western Hemisphere Membership Development Strategy.
Communication Leaders A project all about communication led by and for children and young people.
Povertyactionlab.org Planning Sample Size for Randomized Evaluations Esther Duflo MIT and Poverty Action Lab.
An Inspector Called: Key findings from Ofsted English Review 2009 “English at the Crossroads”: Ofsted 2009.
Australian Teacher Performance and Development Framework Consultation proposal.
1 Briefing for schools on government proposals for School Funding Reform Agenda Welcome Government proposals for School Funding Reform Government.
Standards report Standards Report CT Board 18 th March 2016.
The Middle Leadership Development Pilot Further information:
1 Budgeting & Control Week 8. 2 The nature of budgeting Budget is a detailed plan, expressed in quantitative terms, that specifies how resources will.
Understanding p-values Annie Herbert Medical Statistician Research and Development Support Unit
Raising standards, improving lives The use of assessment to improve learning: the evidence 15 September Jacqueline White HMI National Adviser for Assessment.
Background to Adaptive Design Nigel Stallard Professor of Medical Statistics Director of Health Sciences Research Institute Warwick Medical School
6th European University-Business Forum PARTNERSHIPS FOR JOBS AND GROWTH Social Innovation and Social Entrepreneurship Ellen Shipley, Partnership & Support.
Impact of School Grants in Primary Schools In The Gambia Impact Evaluation Team The Gambian Team AFRICA IMPACT EVALUATION INITIATIVE, AFTRL Africa Program.
Raising standards, improving lives The inspection arrangements for maintained schools and academies from September 2013.
Quasi Experimental Methods I Nethra Palaniswamy Development Strategy and Governance International Food Policy Research Institute.
Study Size Planning for Observational Comparative Effectiveness Research Prepared for: Agency for Healthcare Research and Quality (AHRQ)
DISTRICT MANAGEMENT COUNCIL ACADEMIC RETURN ON INVESTMENT (A-ROI)
Primary. There was a greater level of improvement in Literacy than Numeracy for both FSME and Non-FSME pupils. Boys showed a greater level of.
“Working to ensure children, birth to 5, are prepared for success in school and life.” Wake County SmartStart Logic Model Training May 2013.
©PPRNet 2014 Impact of Patient Engagement on Treatment Decisions and Patient-Centered Outcomes in the Implementation of New Guidelines for the Treatment.
Rigorous Quasi-Experimental Evaluations: Design Considerations Sung-Woo Cho, Ph.D. June 11, 2015 Success from the Start: Round 4 Convening US Department.
Addressing educational disadvantage, sharing evidence, finding out what works Camilla Nevill Evaluation Manager.
Fill in missing numbers or operations 1)10 X = 70 1)40 - = 30 1) + 10 = 30 1)20 ÷ = 2 1)2 X = 10 1) - 10 = 70 1)30 ÷ = 3 1)20 + = 60 1) + = 10 1) - = 50.