Building Evidence in Education: Conference for EEF Evaluators. 11th July: Theory; 12th July: Practice. www.educationendowmentfoundation.org.uk


1 Building Evidence in Education: Conference for EEF Evaluators
11th July: Theory; 12th July: Practice

2 Panel session 3: Working with schools
Creative solutions: lessons learnt from evaluating the LIT programme
Sarah Haywood, NatCen

3 "Mind the Gap"
Richard Dorsett, NIESR
EEF Evaluators Conference, 12 July 2013

4 What is being tested?
A parental engagement intervention:
– parents work with their children to create a short animated film
– a series of facilitated sessions
A whole-school intervention:
– metacognition: training teachers in the principles of "learning to learn"
Targeted at Year 4 pupils in the 2012/13 academic year.
Predicted effect size of

5 Randomisation design
School randomisation (all schools: NS = 50):
– Treatment schools: NS1 = 25
– Control schools: NS0 = 25
Class randomisation within treatment schools:
– CPD & PE classes: NC = 25
– CPD-only classes: NC = 25
Control schools contribute control classes: NC = 25
[Design diagram, with comparisons labelled A, B and C]
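The two-stage design described on this slide can be sketched in code. This is an illustrative sketch only, not the evaluators' actual procedure: the school and class names are invented, and it assumes every school contributes exactly two Year 4 classes (the intended two-form entry, which slide 9 notes was not always achieved).

```python
import random

def randomise(schools, seed=0):
    """Two-stage randomisation: schools first, then classes within
    treatment schools. Illustrative only."""
    rng = random.Random(seed)
    schools = list(schools)

    # Stage 1: school randomisation into treatment and control halves.
    rng.shuffle(schools)
    half = len(schools) // 2
    treatment, control = schools[:half], schools[half:]

    allocation = {}
    # Stage 2: within each treatment school, randomise one class to
    # CPD & PE and the other to CPD only.
    for school in treatment:
        classes = [f"{school}-class1", f"{school}-class2"]
        rng.shuffle(classes)
        allocation[classes[0]] = "CPD & PE"
        allocation[classes[1]] = "CPD"
    # Each control school contributes one control class to the evaluation.
    for school in control:
        allocation[f"{school}-class1"] = "control"
    return treatment, control, allocation

schools = [f"school{i:02d}" for i in range(50)]  # NS = 50
treatment, control, allocation = randomise(schools)
```

With 50 schools this reproduces the intended arm sizes: 25 treatment and 25 control schools, and 25 classes in each of the CPD & PE, CPD-only and control arms.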

6 Treatment effect 1: CPD & PE
[Same randomisation diagram as slide 5, highlighting the comparison of CPD & PE classes with control classes]

7 Treatment effect 2: CPD
[Same diagram, highlighting the comparison of CPD-only classes with control classes]

8 Treatment effect 3: CPD & PE vs. CPD
[Same diagram, highlighting the comparison between CPD & PE classes and CPD-only classes within treatment schools]
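The three treatment effects on slides 6–8 can be illustrated as simple differences in mean class-level outcomes. This is a minimal sketch with simulated data: the outcome values and names are invented, and a real analysis would adjust for the clustered design and baseline attainment rather than take raw mean differences.

```python
import random
import statistics

rng = random.Random(1)

# Simulated class-level outcomes for the three arms (25 classes each,
# per the design on slide 5). The effect sizes here are made up.
outcomes = {
    "cpd_pe":  [rng.gauss(0.30, 1.0) for _ in range(25)],  # CPD & PE classes
    "cpd":     [rng.gauss(0.15, 1.0) for _ in range(25)],  # CPD-only classes
    "control": [rng.gauss(0.00, 1.0) for _ in range(25)],  # control classes
}

def effect(treated, comparison):
    """Unadjusted treatment effect: difference in mean outcomes."""
    return statistics.mean(outcomes[treated]) - statistics.mean(outcomes[comparison])

effect_1 = effect("cpd_pe", "control")  # slide 6: CPD & PE vs. control
effect_2 = effect("cpd", "control")     # slide 7: CPD vs. control
effect_3 = effect("cpd_pe", "cpd")      # slide 8: CPD & PE vs. CPD
```

One consequence of defining the contrasts this way is that they are linked algebraically: the CPD & PE vs. control effect equals the CPD vs. control effect plus the CPD & PE vs. CPD effect.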

9 Recruitment and randomisation
Areas: Birmingham, Devon, Haringey, Manchester
Drop-out is a worry:
– 2 control schools dropped out before knowing their treatment status
– 3 control schools and 1 treatment school dropped out and were substituted
– 2 control schools dropped out and were not substituted
Substitute schools:
– take the treatment status of the dropouts they replace
– are excluded from the impact estimates
– provide potentially useful supplementary data
Two-form entry was wanted but not always achieved.

10 Achieved sample (NS = 43)
Treatment schools: NS1 = 24
– CPD/PE classes: NC = 24
– CPD classes: NC = 15
Control schools: NS0 = 19
– Control classes: NC = 19

11 Some lessons
RCT design is relatively easy – the practical issues are more complicated.
The process of inducting schools pre-randomisation is important to secure full engagement.
Having something to offer schools – control schools in particular – may help with drop-out.
Minimising drop-out is best, but some drop-out is inevitable – need for a protocol?
Some implications for analysis:
– helpful to understand the reasons behind drop-out
– can consider non-experimental techniques
– NPD analysis may be unaffected by drop-out of controls

12 EEF Conference 2013
Towards a Protocol for Effective Recruitment
Mary Sheard, July 12, 2013

13 Recruitment as a problematic and complex relationship
"Recruit schools to the evaluation, not the project."

14 Contexts
EEF projects: project and design; challenges and solutions
EEF protocol and survey outcomes: effective recruitment; what has worked well and what have been the challenges
Non-EEF projects: experience across a wide range of research studies and evaluations

15 What do we mean by ‘effective recruitment’?

16 Terminology: what is meant and understood?
Programme, intervention, initiative, project, evaluation …

17 What has worked well
Relationships: schools; LA/parent organisations; programme developers; evaluators; trainers; test providers [Ethics]
Partnerships with schools: key personnel in school; lead project contact; teacher implementers; technical support
Roles: clarity, responsibilities, expectations, inclusiveness [Ethics]

18 What has also worked well
Information: quality, clarity, conciseness, sufficiency, inclusivity, suitability, accessibility [audiences, ethics]
Examples of documentation: inviting initial expressions of interest; school agreement form/contract; pupil data; data protection

19 Challenges and resolutions
– Identifying and linking with key personnel
– Senior leadership involvement
– Lines of communication
– Information overload
– Saturation of constituency/schools as participant partners

20 More challenges and resolutions
– Defining/explaining the relationship between school, programme developer (trainer) and evaluation team
– Timing
– The concept of random assignment
– Participation as a control school
– Testing preparation and procedures

21 What we have learned
– Need to systematise a comprehensive recruitment strategy: establish a recruitment protocol or checklist as the prequel to a project data management plan
– Need to create a recruitment database
– Need to consider equity/equal opportunity and fairness in recruitment approaches: hard-to-reach schools and schools that are missed out

22 Developing a consistent recruitment strategy

23 Creating a protocol/checklist for effective recruitment in future large-scale evaluations

24 Contact:

