Analysing regional data and using evidence to manage demand
Susannah Bowyer, Research & Development Manager
Kath Wilkinson, Research & Evaluation Officer
Session overview
- Review national and local data on demand
- Consider ways of monitoring and evaluating your services
- Research in Practice support
Increasing demand: March 2014 figures (DfE release)
- 5% increase in CiN; episodes of need are getting longer
- 47% of CiN have abuse or neglect as their primary need
- 12% increase in S.47 enquiries
- Sharp 12% increase in children subject to a CP plan; in comparison, the increase over the preceding period was 0.5%
- 11% increase in referrals to CSC
- Re-referrals also up on the previous year
Increasing demand: March 2014 figures (DfE release)
- LAC: steady increase over five years; up 1% on the previous year and up 7% over five years
- Increase in children starting to be looked after
- 6% increase in children ceasing to be looked after
- 26% increase in adoptions – now falling back
- Rates vary significantly between LAs
The regional picture
[Chart: rate of LAC in the West Midlands]
Children who started to be looked after and the percentage taken into care (year ending 31 March)

Area          | Started to be looked after | …of which taken into care | % taken into care
England       | 30,430                     | 10,920                    | 36
West Midlands | 3,600                      | 1,430                     | 40

(LA-level rows for Coventry, Dudley, Sandwell, Walsall, Staffordshire, Wolverhampton, Telford and Wrekin, Birmingham, Solihull, Stoke-on-Trent, Warwickshire, Shropshire, Herefordshire and Worcestershire were shown on the slide but the figures are not recoverable here.)
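The percentage column is simply (children taken into care ÷ children who started to be looked after) × 100, rounded to the nearest whole number. A minimal Python sketch reproducing the England and West Midlands figures from the table (the dictionary names are illustrative, not from the slide):

```python
# Reproduce the "percentage taken into care" column from the table above.
# Totals are the year-ending-31-March figures for England and the West
# Midlands; LA-level figures were not recoverable from the slide.
started = {"England": 30430, "West Midlands": 3600}
taken_into_care = {"England": 10920, "West Midlands": 1430}

for area, total in started.items():
    pct = round(100 * taken_into_care[area] / total)
    print(f"{area}: {pct}% taken into care")
```

Running this prints 36% for England and 40% for the West Midlands, matching the table.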
[Chart: percentage of children taken into care for England, the West Midlands and each West Midlands LA (Birmingham, Coventry, Dudley, Herefordshire, Sandwell, Shropshire, Solihull, Staffordshire, Stoke-on-Trent, Telford and Wrekin, Walsall, Warwickshire, Wolverhampton, Worcestershire), flagging areas with a decrease of at least 10% and an increase of at least 10%]
DfE Children’s Social Care Innovation Fund
Focus on:
- Rethinking children’s social work
- Rethinking support for adolescents in or on the edge of care
Supporting on three levels (develop ideas → test & improve → scale and spread):
- Taking innovation with an evidence base to new organisations, places and/or contexts and supporting others to copy it
- Supporting pilots and change programmes which spread more effective ways of supporting vulnerable children
- Supporting pilots and change programmes which test/‘prototype’ more effective ways of supporting vulnerable children to build evidence
Evaluation and monitoring: measuring outcomes
Measuring outcomes is required to:
- Make a case for resources
- Identify areas for improvement
- Demonstrate effectiveness
- Evaluate impact and value
- Inform future planning
There are a number of challenges:
- Deciding which outcomes to measure
- Deciding how to measure outcomes
- Deciding whether outcomes have been achieved
- Deciding what to do with the outcome measures
How do you measure outcomes?
- What are the outcomes for your edge-of-care services?
- How do you currently measure these outcomes?
- What does this information tell you about:
  - Impact on demand across other services
  - Impact on family outcomes
  - Cost savings
- What other data is collected that you could utilise?
Joined-up delivery: webinars
Joined-up thinking: webinars
Example: RiP evaluation project
Context: development of a new Specialist Adolescence Service to reduce the number of unplanned care placements for adolescents
Evaluation aims:
- Does the service work in an effective and efficient way? (process evaluation)
- Is the service achieving its aims? (impact evaluation)
- Are there any other associated impacts?
Key stakeholders: adolescents and their families, service staff, and staff in other services and voluntary organisations connected to the service
Methods (over time):
- Collaborative planning and development of instruments
- Quantitative (using data already being collected, for comparison)
- Qualitative (a combination of surveys, interviews and focus groups)
Contact us
Website: