What did we do and how well did we do it? A consistent approach to identifying and measuring outcomes.



RBA – what it means to Family Works HB
- A systematic approach to planning and measuring outcomes
- A framework for monitoring and evaluating what we do
- A framework for reflection
- A framework for reporting
- A framework for planning
- A way to know that what we are doing is making a difference

How did we get started? The evaluation cycle: monitoring, evaluation and learning
- Define and plan service delivery
- Do it: implement the service
- Analyse data and review service outcomes
- Reflect on outcomes and refine the service
Accountability: clients, governance, funders, communications (internal & external), fundraising
Evaluation in action: an action-research model

Process used to establish the evaluation framework
- Map the programme: 'focus group' discussions with staff describing "what we do, how we do it and how we know when it's working"
- Build a programme logic derived from the programme map
- Identify the most useful performance measures, based on an RBA approach

The Journey
- In 2008 we developed our programme logic and embarked on RBA.
- In July 2009 we merged another organisation into Family Works HB: 10 new staff, 2 new services and a number of new programmes.
- In 2010 we purchased a database and have spent time ensuring it fits with our tools, processes etc. designed around an RBA framework. This work is largely complete.

PROGRAMME LOGIC: a way of describing a programme for planning and evaluation
Resources + What you do = Results (all within your control)
- INPUTS (resources) and ACTIVITIES (what you do): the planned work
- OUTPUTS (direct products of the activities): the intended results
- OUTCOMES (the impacts): short-term, results we expect to see; medium-term, results we want to see; longer-term, results we hope to see

PROGRAMME LOGIC with performance measures: a way of describing a programme for planning and evaluation
Resources + What you do = Results
- INPUTS (resources) and ACTIVITIES (what you do): the planned work
- OUTPUTS (direct products of the activities): the intended results
- OUTCOMES (the impacts): short-term (results we expect to see), medium-term (results we want to see), longer-term (results we hope to see)
- Performance measures / indicators sit under every stage: how do you know this has happened?
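
The inputs-to-outcomes chain above can be sketched as a simple data structure. This is a minimal illustration only; the class name and example field values are hypothetical, not taken from the Family Works programme.

```python
from dataclasses import dataclass, field

@dataclass
class ProgrammeLogic:
    """A minimal programme-logic model: resources + what you do = results."""
    inputs: list[str]               # resources (planned work)
    activities: list[str]           # what you do (planned work)
    outputs: list[str]              # direct products of the activities
    outcomes: dict[str, list[str]]  # short/medium/longer-term results
    measures: list[str] = field(default_factory=list)  # how we know it happened

# Illustrative values only.
logic = ProgrammeLogic(
    inputs=["staff time", "funding"],
    activities=["family engagement sessions"],
    outputs=["assessments completed", "goal plans agreed"],
    outcomes={
        "short-term": ["results we expect to see"],
        "medium-term": ["results we want to see"],
        "longer-term": ["results we hope to see"],
    },
    measures=["client participation rate"],
)
print(list(logic.outcomes))  # ['short-term', 'medium-term', 'longer-term']
```

Keeping the measures alongside each stage mirrors the slide's point: every column of the logic model needs an answer to "how do you know this has happened?".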

How did we start? Programme mapping
Performance measures / indicators across the programme logic:
- INPUTS (tracking inputs): costs; staff time; # clients; resources
- ACTIVITIES (quantity & quality of activities): level & quality of engagement with the family; quality of the services; monitor & review intra/inter-agency processes
- OUTPUTS (tracking outputs): assessment completed; contract agreed; goal plan in progress; # sessions; # reviews; case status on closure; safety assessment completed; community participation assessment completed
- OUTCOMES, short-term (working with FW): client participation rate; change achieved in issues in the goal plan (client reviews); changes in safety, care & stability (SW reviews); community participation reviewed
- OUTCOMES, medium-term (on closure with FW): client participation rate; change achieved in issues in the goal plan (intake vs closure); changes in safety, care & stability (intake vs closure); client service evaluation, incl. PSNZ measure; community participation
- OUTCOMES, longer-term (6–12 months post?): educational status of children; family PHO registration (going to GP as required); number of adverse contacts with agencies for client families (CYF, WINZ, Police: s.15 notifications, s.19 referrals); families represent through PR

Performance measures: crossing effort (inputs and outputs) against effect (outcomes), and quantity against quality, gives four questions:
- Quantity of effort: how much service did we deliver?
- Quality of effort: how well did we deliver it?
- Quantity of effect: how much change / effect did we produce?
- Quality of effect: what quality of change / effect did we produce?

How much did we do? (Inputs: effort)
- # clients; # sessions; # reviews; funding
- Case completions vs non-completions
- Referral sources; agencies involved
- Client demographics; presenting issues; match with priority populations
- Background info: community profiles; events / issues in the community
How well did we do it?
- Client service evaluation: satisfaction survey
- Social worker satisfaction survey (workloads, resources, support, supervision & direction etc.)
- Social work practice: progress with the goal plan etc.; % of reviews at 6–8 sessions
- Closures / completion rates / reasons for non-completion
- Client participation rates / level and quality of engagement with clients
Is anyone better off? (Outcomes: effect)
- Ratings show changes in issues listed in the client contract (client assessed)
- Ratings show changes in child safety dimensions (SW assessed)
- Ratings show changes in community participation rates (SW assessed)
- Client service evaluation: satisfaction survey including the PSNZ measure
- Client engagement & participation rate
After 6 & 12 months:
- Educational status of children: children enrolled and attending school
- Family PHO registration: going to the GP as required
- Number of adverse contacts with agencies for client families (CYF, WINZ, Police: s.15 notifications, s.19 referrals)
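
The grouping above follows from two axes (quantity vs quality, effort vs effect); a small hypothetical lookup makes the mapping explicit. The function name and question wording for the effect cells paraphrase the slides.

```python
def rba_question(measure_type: str, focus: str) -> str:
    """Map a (quantity|quality, effort|effect) quadrant to its RBA question."""
    table = {
        ("quantity", "effort"): "How much did we do?",
        ("quality", "effort"): "How well did we do it?",
        ("quantity", "effect"): "How much change did we produce?",
        ("quality", "effect"): "What quality of change did we produce?",
    }
    return table[(measure_type, focus)]

print(rba_question("quantity", "effort"))  # How much did we do?
```

In RBA practice the two effect cells are usually rolled together into the single question "Is anyone better off?", as the slides do.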

What have we achieved to date?
Client plans completed: 24% (2008); 57% (2008/09); 45% (2009/10); 96% (2010/11); 95% (2011/12)
Client outcomes met: 29% (2008/09); 55% (2009/10); 75% (2010/11); 85% (2011/12)
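
Rates like these come straight from case records; a minimal sketch of the calculation, using made-up records and a hypothetical `plan_completed` flag (not the organisation's actual CMS fields):

```python
def completion_rate(cases: list[dict], key: str) -> float:
    """Percentage of closed cases where the given flag (e.g. 'plan_completed') is True."""
    closed = [c for c in cases if c["status"] == "closed"]
    if not closed:
        return 0.0
    return 100 * sum(c[key] for c in closed) / len(closed)

# Illustrative data only: two closed cases, one with a completed plan.
cases = [
    {"status": "closed", "plan_completed": True},
    {"status": "closed", "plan_completed": False},
    {"status": "open", "plan_completed": False},
]
print(f"{completion_rate(cases, 'plan_completed'):.0f}%")  # 50%
```

Note that open cases are excluded from the denominator, which matters when comparing year-on-year figures.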

Social Work results

Our results – what happened
Review findings:
- Tools were okay: some minor tweaks
- Staff inconsistent in use of tools and reporting
- Inadequate QA
Action plan:
- Improve QA and monitor practice closely
- Policy changes and training around "forming a belief" and reporting accurately
- Infrastructure changes

Key enablers
- Outcomes reporting and RBA were supported by MSD
- My CEO and Executive wanted a tool to evaluate what we did
- Staff understand our need to report outcomes
- The CMS aided the process of implementation

Barriers / challenges
- New manager, new staff, new systems, new CMS
- Existing capability / capacity versus increasing demand and complexity
- Greater levels of quality assurance needed
- The CMS added an additional challenge
- Fear: a tool to measure individual performance?
- Fear of change
- Inconsistent practice

Lessons learnt
- Bring all staff on the journey; don't leave any behind
- Be clear with staff about our legal obligations for reporting to funders, our Board and clients
- One step at a time
- Don't be afraid of change
- This is about the service, not individuals
- Look for patterns: these tell a story on their own

Lessons learnt
- Be prepared to delve into the files for solutions and to get the story behind the data (e.g. 25% disengagement: find out when in the process it happens and why, and take action)
- Review data management tools and rating scales: are they doing what you want them to do?
- Don't be afraid of change

What surprised me?
- Our results. Not all our results are good, but we had a starting place from which to launch practice improvements
- We have to have a framework on which to hang our results
- Client demographics remain stable: no surprises

Issues and challenges
- Consistency, completeness, case reviews, quality assurance: protect against garbage in, garbage out
- All our work is on the CMS, and reporting is easier
- A recent restructure and subsequent staff changes challenge our ability to meet demand
- Reviewing and updating our practice model: change raises levels of anxiety

Recap
- Programme mapping
- Vigilance around reporting
- A good QA system
- Integrity of data

Would I recommend RBA? Most definitely.
- It provides structure; it asks, and allows us to answer, the key questions: do we make a difference in the lives of those we work with? And if not, why not?
- It helps to justify our existence by providing an evidence-based framework.

The RBA framework
- RESULT or OUTCOME (whole population): a condition of well-being for children, adults, families or communities. Examples: children born healthy; children succeeding in school; safe communities; a clean environment; a prosperous economy.
- INDICATOR or BENCHMARK (whole population): a measure which helps quantify the achievement of a result. Examples: rate of low-birthweight babies; rate of high-school graduation; crime rate; air quality index.
- PERFORMANCE MEASURE (client population; customer results): a measure of how well a programme, agency or service system is working. Three types: 1. How much did we do? 2. How well did we do it? 3. Is anyone better off?