
1 A Better Start Evaluation Jane Barlow University of Warwick

2 Who are we?

3 What are we aiming to do? Conduct a robust evaluation of A Better Start that will run throughout the 11-year period of the programme:
–Focus on the setup, implementation and impact of the programme within and across the areas
–Disseminate learning across the areas involved and more widely

4 How do we plan to do it?
Workstream 1: Implementation evaluation of the setup and delivery of the programme
Workstream 2: Impact and economic evaluation of the area programmes
Workstream 3: A programme of learning and dissemination that will extend across the five areas, and beyond

5 Workstream 1: Implementation Evaluation Mairi Ann Cullen CEDAR, University of Warwick

6 Outline
The team
Aims and design
Phase 1
–First steps
–Rest of Phase 1
Outline of Phase 2

7 The Team CEDAR (Centre for Educational Development, Appraisal and Research)
Research – Geoff Lindsay, Mairi Ann Cullen, Stephen Cullen
Admin – Diana Smith, Alison Baker, Shauna Yardley

8 Aims To evaluate the setup and delivery of A Better Start
–Each area as a case study
–The programme as a whole
And developments outward, across other areas of England, and beyond

9 Our approach Participatory and collaborative research
–The research team, the areas and BIG
–Participatory action approach
Working with the areas to understand each area's programme
Examining the setup
Creating the evaluation for Phase 2
Independent evaluation
Responsibilities: CEDAR, Areas, BIG

10 Participatory action approach [Diagram: participatory action approach and independent evaluation, with shared responsibilities to optimise the evaluation]

11 Methods Ethical approval – University of Warwick
Data collected by us:
–Interviews, surveys, document analysis
Data collected by you:
–Reviewing your monitoring data, reports, etc.

12 Phase 1 setup (now – end 2015) Working with areas to understand your programmes and evaluate the implementation of the setup phase
–Feeding back to enable learning, optimising each area's programme
Common and area-specific objectives
–Co-production
Context/Inputs, activities, outputs

13 Where are we now? Context and Inputs, e.g. identification and mapping of current services, interventions, delivery mechanisms, data monitoring, and reporting to create baseline scenarios, to include e.g.:
–Needs analysis
–Infrastructure, including staff, IT systems, management systems
–Dartington's support

14 (cont.) Development and agreement of policies and procedures for A Better Start (e.g. examination of evidence for possible interventions and decision-making regarding the selection):
–Processes for agreeing the interventions
–Infrastructure to implement and manage the interventions
–Data monitoring
–Budget creation
–Pressures

15 Activities Process implementation – putting the agreed policies and procedures into place, e.g.:
–Staff recruitment
–Training
–Supervision
–Data collection and management to track progress
–Financial system

16 Outputs What are the results? [NB this will largely occur at Stage 2 when the interventions are underway]
E.g. locally collected pre- and post-intervention measures
–E.g. improvements in children's language and communication
Common measures where possible

17 Examples
–Priorities within the ABS framework
–Selection of interventions
–Service configuration
–Staffing and training
–Governance system
–Management system
–Data collection, analysis and reporting system
–Parents' (and others') engagement

18 First steps Initial discussions here
Where are you at?
–Your timetable of activities, e.g. recruitment of key staff, governance setup and meetings, selection of interventions
–Arrange visits for August – September?
Interviews with strategic and operational leads
Identify other interviewees, e.g. heads of services, specific programmes

19 October – end 2015 Fieldwork in each site
Common data for each area
–To enable aggregation across the 5 areas, e.g. largely standard interview schedules with key personnel
–To enable comparison, e.g. different approaches to the same objective
Specific data for each area

20 October – end 2015 Timetable
–Constructed with each area to meet the final end-of-Phase-1 deadline agreed with BIG
–Activity moving from examining the setting up of the systems to evaluating the systems in action, e.g.:
What supports optimal implementation?
What are the barriers?
How can this learning be used to revise systems, and to inform others?

21 Phase 2 Starts 2016 – subject to progress
5-year study of the areas' systems in operation, including:
–Interviews and surveys of personnel, stakeholders, users (Process)
–Examination of locally collected data (Outputs and Outcomes), e.g. comparisons of pre-post data from interventions to examine change
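The logic behind comparing pre-post change in the programme areas against the matched comparison areas can be sketched as a simple difference-in-differences calculation. This is an illustrative sketch only: the function name and all scores below are invented for the example, and the actual analysis methods for Phase 2 have not yet been specified.

```python
# Illustrative sketch: combining pre/post outcome scores from programme
# and comparison areas into a difference-in-differences estimate.
# All names and numbers here are hypothetical, not from the evaluation.

def mean(xs):
    return sum(xs) / len(xs)

def diff_in_diff(prog_pre, prog_post, comp_pre, comp_post):
    """Change in the programme areas minus change in the comparison
    areas; a positive value suggests improvement beyond the background
    trend captured by the comparison areas."""
    return (mean(prog_post) - mean(prog_pre)) - (mean(comp_post) - mean(comp_pre))

# Hypothetical language-development scores (higher = better)
prog_pre, prog_post = [48, 50, 47, 51], [55, 58, 54, 57]
comp_pre, comp_post = [49, 48, 50, 47], [51, 50, 52, 49]

print(diff_in_diff(prog_pre, prog_post, comp_pre, comp_post))  # prints 5.0
```

Here the programme areas improved by 7 points and the comparison areas by 2, so the estimated effect beyond the shared trend is 5 points.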

22 Workstream 2: Impact & Economic Evaluation Impact study – overview of surveys Sally Panayiotou, Research Director, Ipsos MORI

23 How will this be carried out? We will track two cohorts (an early and a late cohort) of families in the service areas, in addition to matched cohorts of similar families living in carefully selected comparison areas
Collect (i) individual and (ii) population-level data for 3 key outcomes (nutrition; language; socio-emotional development)
Following families from pregnancy to 7 years

24 What is the purpose of the surveys? Identify short-, medium- and long-term changes in:
–Parental functioning
–Children's socio-emotional health
–Nutrition
–Speech, language and learning

25 How will we measure these changes? Warwick consortium composed of specialists in each of these fields
Series of surveys starting in pregnancy
Range of validated measures and survey questions, for example:
–Depression: Edinburgh Post-Natal Depression Scale (EPDS)
–Bonding with baby: Mother-to-Infant Bonding Scale (MIBS)
–Anxiety: State-Trait Anxiety Inventory
–Social and emotional development: Brief Infant-Toddler Social and Emotional Assessment (BITSEA)
–Patterns of food intake: Children's Dietary Questionnaire (CDQ)
–Child's cognitive ability: British Ability Scales (BAS)

26 Additional biometric measures Some participants in the study will be asked if they would be happy for their child to also take part in additional biometric measures:
–Hair sample (age 2): measures the stress hormone cortisol
–Buccal (cheek) swab (age 2): measures epigenetic changes (i.e. which inherited genes are 'switched on or off' due to environment)
These measures will only be requested from approx. 10% of participants, and only IF:
–National research ethics approval is granted, and
–Intervention areas approve, and
–After consultation with local 'user groups' to seek their opinions and address any concerns

27 Who will we be speaking to? Mothers recruited to the survey at 16 weeks of their pregnancy – a series of face-to-face and postal interviews
Paper questionnaires for partners, where applicable
During the course of the evaluation we follow the child

28 Where will we be conducting interviews? Interviews in:
 5 selected areas = programme sample
 15 matched comparison areas = comparison sample

29 Baseline study Gain a pre-intervention measure of outcomes in 2015
Initial pilot of 90 interviews to check survey materials
1620 face-to-face interviews with families across the programme and comparison areas
Interviews with:
 Mothers of 1-year-olds
 Mothers of 2-year-olds
 Mothers of 3-year-olds, plus survey tasks with the child

30 Two cohort studies
Cohort 1 – starts in the second year of the evaluation (2016)
–Programme n = 775 (150 per area); matched comparison n = 550 (35 per area)
Cohort 2 – starts in the fourth year of the evaluation (2018)
–Programme n = 1710 (340 per area); matched comparison n = 1170 (75 per area)
(Numbers are approximate)

31 What will recruitment involve? Led by Debra Bick and Sarah Beake, Florence Nightingale School of Nursing and Midwifery
Kings Health Partners will identify tertiary maternity units in the 5 intervention and 15 comparison areas
Contacts/meetings with the Heads of Midwifery in each unit
Letter of support from units which agree to take part

32 Recruitment Process All pregnant women will be sent a study information leaflet with their pregnancy booking information, before their first antenatal appointment at 12 weeks
At their 16-week routine antenatal appointment, the midwife will:
–Check women received a study information leaflet and offer another leaflet if appropriate
–Ask women if they do not want their contact details forwarded to the research team
The Ipsos MORI team will contact the women to arrange a date to meet at the appropriate stage of gestation
At this first interview, women will be asked for written consent to participate in the study
Midwives will update the team about women no longer eligible for inclusion (e.g. pregnancy loss)

33 Survey points
Survey point | Stage | Survey with mother (/ main carer in later stages if the child does not live with the mother)
1 | … weeks pregnant | Face-to-face in-home
2 | Baby aged 2 months | Postal
3 | Baby aged 4 months | Telephone
4 | Child aged 1 | Face-to-face in-home
5 | Child aged 2 | Face-to-face in-home
6 | Child aged 3 | Face-to-face in-home
7 | Child aged 5 | Postal / online
8 | Child aged 7 (Cohort 1 only) | Postal / online

34 Survey points
Survey point | Stage | Mother / main carer | Partner paper questionnaire
1 | … weeks pregnant | Face-to-face in-home |
2 | Baby aged 2 months | Postal |
3 | Baby aged 4 months | Telephone |
4 | Child aged 1 | Face-to-face in-home |
5 | Child aged 2 | Face-to-face in-home |
6 | Child aged 3 | Face-to-face in-home |
7 | Child aged 5 | Postal / online |
8 | Child aged 7 (Cohort 1 only) | Postal / online |

35 Additional measures with a sub-sample of participants
Survey point | Stage | Mother / main carer | Additional assessments
1 | … weeks pregnant | Face-to-face in-home |
2 | Baby aged 2 months | Postal |
3 | Baby aged 4 months | Telephone |
4 | Child aged 1 | Face-to-face in-home | CARE Index (video-coding of free play)
5 | Child aged 2 | Face-to-face in-home | Cheek swab (epigenetic changes); hair sample (cortisol)
6 | Child aged 3 | Face-to-face in-home | Attachment story stem
7 | Child aged 5 | Postal / online |
8 | Child aged 7 (Cohort 1 only) | Postal / online |

36 Looking after participants during the study
 Interviews will be conducted by experienced Ipsos MORI interviewers in line with the MRS code of conduct – they will receive full training on this study
 The study will gain ethics approval, and local R&D approvals from all study sites, prior to sending women any information
 We will gain informed consent at each survey stage
 Participants will be free to opt out of taking part at any stage
 All survey data will be anonymised
 We will provide a supporting participant website, summaries during the research, and information leaflets with details on where to seek advice on the issues covered in the survey
 We will provide respondents with a free-phone telephone number for them to contact the survey team at any stage
 If survey responses indicate a participant is at serious risk of harm we will seek their permission to contact an appropriate service provider on their behalf; if permission is not given we will still advise them to seek support

37 How could you help us with recruitment? What contacts do you already have with your local maternity units? Are you aware of any ongoing recruitment of women in early pregnancy to studies in your area? Could you advise on maternity units in your area that we should contact in the first instance?

38 What do you need to do during the surveys? All survey work will be carried out by Ipsos MORI, so there is nothing you need to do during the surveys
But we would really appreciate your support!
If you receive any queries during the surveys, please contact us and we will respond quickly to resolve any issues

39 Will you receive the anonymised survey data? Yes – we would like to discuss the best format for you in the context of the Management Information System

40 Workstream 3: Dissemination and Learning Jane Barlow University of Warwick

41 What does this mean? Essential that ongoing and final findings from the evaluation be shared with:
- the wider community
- local authorities
- CCGs
- third sector providers
- central government
- policy interest organisations

42 Why is this important? Sharing learning from the evaluation will help to:
- improve practice
- influence local and national decision-makers to make a fundamental shift in policy on early years prevention

43 How will this be done? Exact methodology not yet completely defined, but likely to involve:
- supporting peer learning within and across intervention areas
- a 'Learning & Dissemination' website/online evaluation resource and forum
- delivering effective learning amongst the 5 intervention areas to ensure ongoing improvement in interventions, approaches and systems change

44 Conclusions ABS is a tremendous opportunity to improve children's lives
And to improve our knowledge of the effectiveness of interventions
–What works? For whom? Under what circumstances? And what aids or undermines success?
–We look forward to working with you!
