Planning for Evaluation: An Introduction to Logic Models


Planning for Evaluation: An Introduction to Logic Models
Public Health Institute, LJMU
4th July 2019

Learning outcomes for the event:
- Understand evaluation and the role of logic models in the first step of the planning process.
- Understand how to create a logic model.
- Understand how to use a logic model to inform the evaluation planning process, including:
  - confirming key stakeholders;
  - identifying the scale and scope of the evaluation;
  - identifying what data are already available or being collected;
  - identifying what types of data to collect, and when.
This session follows a CPD session we delivered in 2018 on the role of evidence in commissioning. Delegates at that event suggested they would welcome a session providing advice on planning and carrying out evaluation. The focus of today's session is therefore planning for effective evaluation; a further session could explore how to carry out an evaluation, should delegates consider this useful.

Evidence Ecosystem
[Diagram: a cycle of stages. Produce evidence (primary research, real-world evidence, big data); synthesise evidence (systematic reviews); disseminate evidence to policy makers and practitioners, and to the public (guidance, recommendations, professional standards); implement evidence; evaluate and improve practice.]
The evidence ecosystem outlines the various stages in evidence-based practice. It shows the flow of evidence through the stages of production, synthesis, dissemination and adoption. The process is represented as a cycle, and as a system that learns through the cycle. Examples: (1) The Digital and Trustworthy Evidence Ecosystem; (2) How to achieve more effective services: the evidence ecosystem; (3) Evidence for the Frontline.

Actors in the Ecosystem
[Diagram: the same cycle, annotated with the actors at each stage. Evidence producers: universities, government departments, research councils, private sector (primary research, real-world evidence, big data). Evidence synthesisers: universities, government departments, NGOs/charities, private sector, NICE/'What Works' Centres. Evidence processors and disseminators: professional bodies/networks, policy organisations, NGOs, media, private sector, local government. Evidence implementers: practitioners, professional bodies/networks, local commissioners.]
Various actors contribute to the evidence ecosystem. A well-functioning ecosystem should produce rigorous, trustworthy evidence about what works and why, and all of the actors have a part to play in this.

Evaluation in the evidence ecosystem
“Choosing an evidence-based intervention is the foundation, but there are additional necessary tools that adept agencies/organisations must wield to successfully construct an intervention program.” Dr Carolyn Webster-Stratton
When commissioning an evidence-based intervention or service, there might be a range of well-evidenced options to select from, or there may be options that are not yet proven but which have potential. Evaluation can be used in both scenarios: to monitor the ‘real-world’ impact of evidence-based interventions and services, and to generate the evidence needed to move innovations (and other new ways of doing things) up the ‘evidence pipeline’.

Commissioning Cycle
Evaluation is a key part of the commissioning process.
https://www.england.nhs.uk/participation/resources/commissioning-engagement-cycle/

What is evaluation?
- Conducted to define or judge current care.
- Explores current standards.
- Measures a service without reference to another.
- Involves an intervention which is designed and delivered in accordance with guidance and professional standards.
- Involves existing data, but may include new data.
- No allocation to an intervention; no randomisation.
Although we understand the term ‘evaluation’, it is important to distinguish it from ‘audit’ and ‘research’. These are some key principles of what evaluation is.

Why evaluate?
- Assess how the objectives of the service or project are being met, and any areas where they are not.
- Assess value for money.
- Assess whether a service is progressing according to plan.
- Identify opportunities for improvement.
- Assess service users' and/or service providers' actual experience of a service.
- Document lessons to be learned, for others and for the future.
- Establish a baseline of performance against which the impact of future initiatives can be compared.
Evaluation is an integral part of understanding the implementation and impact of a project.

What are the questions you need to answer?
- Are things going according to plan? Why or why not?
- Are there things we could do to improve what we are doing?
- Is what we are doing making any difference? How do you know?
- Is the difference we are making worth the time/effort/money? Can you show this?

Why think about this from the start?
- Are things going according to plan? Why or why not? What is the plan? Are you collecting the right information from the beginning that will help you understand why it is working (or not)?
- Are there things we could do to improve what we are doing? Are you setting up the necessary processes from the beginning to identify areas for improvement as you go along?
- Is what we are doing making any difference? How do you know? Have you thought about what success will look like before you start? What information do you need to gather to demonstrate this when the time comes?
- Is the difference we are making worth the time/effort/money? Can you show this? Will you keep a detailed record of the resource invested so that you can make this judgement further down the line?

Evaluation needs to…
- Be planned from the start.
- Collect data, reflect and refine throughout the life cycle of the programme.

Identify scope: what to include, what to exclude.

Evaluation needs to…
- Have specific aims and objectives.
- Have a clear purpose and focus.
- Have a clear time-frame.
- Use stakeholder involvement (PPI).
- Ideally use mixed methods.
- Have clear reporting deadlines.
[Presenter note: provide ‘defining research’ handout.]

Identify stakeholders (the individual, significant others, wider stakeholders).
- Have clearly defined roles, responsibilities and resources.
- Consider different perspectives.

Process and outcome evaluation should be carried out throughout to ensure ongoing programme development, e.g. Are the right people attending? Is the targeting appropriate? Is the delivery right for your population?

Logic models help with this…
- Identify gaps in programme activity.
- Identify if the right data are being collected.

This process (a logic model) helps identify gaps in programme activity and whether the right data are being collected to evidence outcomes. It can be desk-based, or captured with stakeholders during a meeting. This is your theory of change, i.e. the delivery of these activities will achieve these outcomes in the short, medium and longer term. Evaluation will then test whether this happens, and explore how and why.

What are logic models?
- A convincing picture of what you are trying to achieve, showing the links between your intended activities, outputs and outcomes.
- A framework for integrating planning, delivery and evaluation.
- Not reality, but your best prediction of what needs to happen to get to your outcomes.
- Part of a wider planning and performance cycle.
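To make this concrete, the sketch below shows one way the components of a logic model could be captured as a simple data structure. It is a minimal illustration only: the programme, activities, outputs and outcomes are hypothetical examples, not content from the session.

```python
# A minimal, hypothetical logic model for an illustrative community
# smoking-cessation programme. All entries are invented examples;
# a real model would be agreed with stakeholders.
logic_model = {
    "inputs": ["funding", "trained advisers", "community venue"],
    "activities": ["weekly group support sessions", "one-to-one adviser appointments"],
    "outputs": ["number of sessions delivered", "number of attendees"],
    "outcomes": {
        "short_term": ["more quit attempts"],
        "medium_term": ["improved 4-week quit rates"],
        "long_term": ["reduced smoking prevalence"],
    },
}

# Print each component of the model in turn.
for component, items in logic_model.items():
    print(component, "->", items)
```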

What does a logic model look like?
- A display of boxes and arrows, vertical or horizontal.
- Any shape possible.
- Any level of detail, from simple to complex.

Input: the stuff that is done.
Output: the results that are seen.
Outcome: the impact you are looking for.

Why use them?
- Evidence-based storytelling (a road map).
- Communicate your (agreed) vision and plans.
- Provide clarity regarding activities and outcomes.
- Engage and sustain stakeholders (inspiration).
- Aid planning and management.
- Focus and improve implementation.
- Help you know what resources are needed, and when.
- Highlight assumptions and risks.
- Show similarities and differences to other programme activities.
- Link with the bigger picture.

Developing a logic model… Define the outcomes: the changes achieved as a result of the activities.

Developing a logic model… Define the activities: what does the programme actually do?

Developing a logic model… Define the outputs: the countable products.

Input: the stuff that is done.
Output: the results that are seen.
Outcome: the impact you are looking for.

Now, let's consider whether we are collecting the right data to evidence whether the outcomes are achieved. Use the arrows to connect the activities to outputs, and the outputs to outcomes.
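As a rough sketch of that checking step, the fragment below links hypothetical activities to outputs and outputs to outcomes, then flags any outcome that has no data source recorded against it. All of the names and data sources are invented for illustration.

```python
# Hypothetical activity -> output and output -> outcome links,
# plus the data source (if any) currently collected for each outcome.
activity_to_outputs = {
    "group support sessions": ["sessions delivered", "attendance"],
}
output_to_outcomes = {
    "sessions delivered": ["more quit attempts"],
    "attendance": ["improved 4-week quit rate"],
}
data_sources = {
    "more quit attempts": "follow-up survey",
    "improved 4-week quit rate": None,  # nothing collected yet: a gap
}

# Walk the arrows and flag outcomes the model claims
# but that no data would evidence.
for outputs in activity_to_outputs.values():
    for output in outputs:
        for outcome in output_to_outcomes.get(output, []):
            if not data_sources.get(outcome):
                print(f"Gap: no data source to evidence '{outcome}'")
```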

Now consider the gaps… How do you prioritise what to evaluate?
RE-AIM (Glasgow, Vogt & Boles, 1999):
- Reach
- Effectiveness
- Adoption
- Implementation
- Maintenance
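One possible way to apply RE-AIM when deciding which gaps to evaluate first is to score each candidate evaluation question against the five dimensions and rank the totals. The scoring below is a hypothetical illustration of that idea, not part of the RE-AIM framework itself.

```python
# Hypothetical 0-2 scores for two candidate evaluation questions against
# the five RE-AIM dimensions (reach, effectiveness, adoption,
# implementation, maintenance); higher totals suggest higher priority.
candidates = {
    "Are the right people attending?": [2, 1, 1, 1, 0],
    "Is the delivery right for your population?": [1, 2, 1, 2, 1],
}

# Rank the questions by total score, highest first.
for question, scores in sorted(candidates.items(), key=lambda kv: -sum(kv[1])):
    print(sum(scores), question)
```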

Key points:
- Develop a shared sense of purpose amongst key stakeholders: identify and acknowledge roles and responsibilities within the delivery of a programme. This includes commissioners, but also others who would benefit from and/or be affected by the delivery of a programme.
- Who will collect, analyse and report on the data?
- Up to 10% of a programme budget should be set aside for evaluation.
[Presenter note: provide ‘defining research’ handout.]

Next steps:
- Designing data collection tools.
- How and when to collect different types of data.
- How to analyse and interpret different types of data.

Types of process evaluation data to collect may include (qualitative: interviews, focus groups, surveys, monitoring data):
- Service user: How did they find out about the service? Why did they attend? How easy was it to attend? What was their experience of the service? Were their needs met? (Reach)
- Service provider: How easy was it to implement the service?
- Non-service users: awareness of, and barriers to, use.

Types of outcome evaluation data to collect may include (quantitative and qualitative: interviews, focus groups, surveys, monitoring data):
- Service user: achievement of intended outcomes; unintended outcomes; impact on quality of life.
- Service provider: intended and unintended outcomes.
- Wider system-level outcomes: impact on partnerships and pathways; do other organisations benefit from the intervention?
- Significant others: impact on quality of life.

More information:
Public Health Institute
Faculty of Health, Education and Community
Liverpool John Moores University
0151 231 4382
H.Timpson@ljmu.ac.uk