Evaluating Complex Health Programmes Fraser Battye.

A word from our gurus(!)

"Programmes chart out a perceived course whereby wrongs might be put to rights, deficiencies of behaviour corrected, inequalities of condition alleviated. Programmes are thus shaped by a vision of change and they succeed or fail according to the veracity of that vision. Evaluation…has the task of testing out the underlying programme theories… is that basic plan sound, plausible, durable, practical and, above all, valid?"
— Pawson & Tilley, Realistic Evaluation

Points & structure

Main points:
1. Of course it's complex!
2. It's our job to respond
3. A mix of approaches is needed: but always underpinned by programme theory

Structure:
- What is GHK?
- Why are these evaluations difficult?
- How have we addressed this?
- Conclusions

What is GHK?
- Multi-disciplinary
- Independent and employee-owned
- Various policy areas
- Specialism in evaluation
- (Working with ETHNOS)

Why might programmes like COFSS be hard to evaluate?
- Complexity: neighbourhoods, not labs
- Interactions with context:
  - Other interventions
  - The real world (e.g. residential 'churn' / global catastrophe & war!)
- Timescales, effects and attribution:
  - Intervention → output → outcome → impact
- Determinants of health:
  - Lack of / debated evidence
  - Standards of evidence

How have we addressed this?
- Approach based on programme theory:
1. Define the theory behind the programme:
   - What is COFSS?
   - What does it do?
   - Desired effects?
   - How does it expect to work?
2. Design research to test it:
   - Quantitatively (neighbourhood; individual)
   - Qualitatively (lit reviews; interviews; case studies; tracking beneficiaries)
   - Mixed methods
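The two-step approach above — state the programme's assumptions explicitly, then pair each with research designed to test it — can be sketched as a minimal data structure. This is an illustrative sketch only; the class and method names are assumptions, not part of the COFSS evaluation:

```python
from dataclasses import dataclass, field

@dataclass
class Hypothesis:
    """One testable claim drawn from the programme theory."""
    claim: str
    tested_by: list = field(default_factory=list)  # research methods assigned to it

@dataclass
class ProgrammeTheory:
    """A programme plus the testable claims that make up its theory."""
    programme: str
    hypotheses: list = field(default_factory=list)

    def untested(self):
        # Hypotheses with no method yet assigned: gaps in the evaluation design
        return [h for h in self.hypotheses if not h.tested_by]

theory = ProgrammeTheory("COFSS")
theory.hypotheses.append(
    Hypothesis("Coordinated services improve beneficiary outcomes",
               tested_by=["interviews with providers", "interviews with beneficiaries"])
)
theory.hypotheses.append(
    Hypothesis("Outreach reaches residents whom mainstream services miss")
)

print(len(theory.untested()))  # prints 1: the second claim has no test yet
```

The point of the structure is the `untested()` check: a theory-driven design makes it visible when a claim in the programme logic has no evidence planned against it.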

Defining the theory
1. Aims to reduce health inequalities
2. Significant resources at its disposal
3. Multi-agency and multi-disciplinary
4. Significant investment in management and 'true' partnership working
5. Community-based, using an assertive outreach approach
6. Aims to change mainstream service provision
7. Resident input is central

Testing the theory (example)

Feature:
4) Significant investment in management & 'true' partnership working

Hypotheses:
- Outcomes achieved by beneficiaries are better because services are coordinated
- Service providers spend less time managing cases, doing admin etc., and more time delivering services

Tested by:
- Interviews with providers & management
- Interviews with beneficiaries
- Lit review: costs & benefits of partnership
- VFM assessment
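The example above is one row of an evaluation matrix: each programme feature maps to the hypotheses it implies and the methods that test them. As a quick sketch, the whole matrix could be held in a plain mapping (illustrative Python only; the content is from the example slide, the structure is an assumption):

```python
# Evaluation matrix: programme feature -> the hypotheses it implies
# and the research methods assigned to test them.
evaluation_matrix = {
    "Significant investment in management & 'true' partnership working": {
        "hypotheses": [
            "Outcomes achieved by beneficiaries are better "
            "because services are coordinated",
            "Service providers spend less time on case management and admin, "
            "and more time delivering services",
        ],
        "tested_by": [
            "Interviews with providers & management",
            "Interviews with beneficiaries",
            "Lit review: costs & benefits of partnership",
            "VFM assessment",
        ],
    },
}

# A simple coverage summary across the matrix
for feature, cell in evaluation_matrix.items():
    print(f"{feature}: {len(cell['hypotheses'])} hypotheses, "
          f"{len(cell['tested_by'])} methods")
```

Laid out this way, the matrix doubles as a planning tool: a feature with hypotheses but an empty "tested_by" list is a gap in the design.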

Concluding points
- It's always going to be complex (good!)
- Evaluation must respond
- Theory-driven approaches are promising
  - Best mixed with 'traditional' methods
- A challenge to accepted public health understandings of 'evidence'