Planning for Evaluation: An Introduction to Logic Models

1 Planning for Evaluation: An Introduction to Logic Models
Public Health Institute, LJMU 4th July 2019

2 Learning outcomes for the event:
Understand evaluation and the role of logic models in the first step of the planning process.
Understand how to create a logic model.
Understand how to use a logic model to inform the evaluation planning process, including: confirming key stakeholders; identifying the scale and scope of the evaluation; identifying what data are already available or being collected; and identifying what types of data to collect and when.
This session follows a CPD session delivered in 2018 on the role of evidence in commissioning. After that event, delegates suggested they would welcome a session offering advice on planning and carrying out evaluation. The focus of today's session is on planning for effective evaluation; a further session could explore how to carry out an evaluation, should delegates consider this useful.

3 Evidence Ecosystem
[Diagram: the evidence ecosystem cycle: produce evidence (primary research, real-world evidence, big data); synthesise evidence (systematic reviews); disseminate evidence to policy makers and practitioners, and to the public (guidance, recommendations, professional standards); implement evidence; evaluate and improve practice.]
The evidence ecosystem outlines the various stages in evidence-based practice. It shows the flow of evidence through the stages of production, synthesis, dissemination and adoption. The process is represented as a cycle, and as a system that learns through the cycle. Examples: (1) The Digital and Trustworthy Evidence Ecosystem; (2) How to achieve more effective services: the evidence ecosystem; (3) Evidence for the Frontline.

4 Actors in the Ecosystem
[Diagram: actors mapped onto the evidence ecosystem cycle. Evidence producers: universities, government departments, research councils, private sector (primary research, real-world evidence, big data). Evidence synthesisers: universities, government departments, NGOs/charities, private sector, NICE/'What Works' Centres. Evidence processors and disseminators: professional bodies/networks, policy organisations, NGOs, media, private sector, local government. Evidence implementers: practitioners, professional bodies/networks, local commissioners.]
Various actors contribute to the evidence ecosystem. A well-functioning ecosystem should produce rigorous, trustworthy evidence about what works and why; all of the actors have a part to play in this.

5 Evaluation in the evidence ecosystem
“Choosing an evidence-based intervention is the foundation, but there are additional necessary tools that adept agencies/organisations must wield to successfully construct an intervention program.” Dr Carolyn Webster-Stratton
When commissioning an evidence-based intervention or service, there might be a range of well-evidenced options to select from, or there may be options that are not yet proven but which have potential. In both scenarios evaluation can be used: to monitor the 'real-world' impact of evidence-based interventions and services, and to generate the evidence needed to move innovations (and other new ways of doing things) up the 'evidence pipeline'.

6 Commissioning Cycle
Evaluation is a key part of the commissioning process. https://www.england.nhs.uk/participation/resources/commissioning-engagement-cycle/

7 What is evaluation?
Conducted to define or judge current care.
Explores current standards.
Measures a service without reference to another.
Involves an intervention which is designed and delivered in accordance with guidance and professional standards.
Involves existing data, but may include new data.
No allocation to an intervention; no randomisation.
Although the term evaluation is familiar, it is important to distinguish it from 'audit' or 'research'. These are some key principles of what evaluation is.

8 Why evaluate?
Assess how the objectives of the service or project are being met, and any areas where they are not.
Assess value for money.
Assess whether a service is progressing according to plan.
Identify opportunities for improvement.
Assess service users' and/or service providers' actual experience of a service.
Document lessons to be learned for others and for the future.
Establish a baseline of performance against which the impact of future initiatives can be compared.
Evaluation is an integral part of understanding the implementation and impact of a project.

9 What are the questions you need to answer?
Are things going according to plan? Why or why not? Are there things we could do to improve what we are doing? Is what we are doing making any difference? How do you know? Is the difference we are making worth the time/effort/money? Can you show this?

10 Why think about this from the start?
Are things going according to plan? Why or why not? What is the plan? Are you collecting the right information from the beginning that will help you understand why it is working (or not)?
Are there things we could do to improve what we are doing? Are you setting up the necessary processes from the beginning to identify areas for improvement as you go along?
Is what we are doing making any difference? How do you know? Have you thought about what success will look like before you start? What information do you need to gather to demonstrate this when the time comes?
Is the difference we are making worth the time/effort/money? Can you show this? Will you keep a detailed record of the resource invested so that you can make this judgement further down the line?

11 Evaluation needs to… Be planned from the start
Collect data, reflect and refine throughout the life cycle of the programme

12 Identify scope: what to include, what to exclude.

13 Evaluation needs to…
Have specific aims and objectives.
Have a clear purpose and focus.
Have a clear time-frame.
Use stakeholder involvement (PPI).
Ideally use mixed methods.
Have clear reporting deadlines.
Provide 'defining research' handout.

14 Identify stakeholders (individuals, significant others, wider stakeholders).
Have clearly defined roles, responsibilities and resources.
Consider different perspectives.

15 Process and outcome evaluation should be carried out throughout to ensure ongoing programme development, e.g.: Are the right people attending? Is the targeting appropriate? Is the delivery right for your population?

16 Logic models help with this…
Identify gaps in programme activity.
Identify whether the right data are being collected.

17 This process (a logic model) helps identify gaps in programme activity, and whether the right data are being collected to evidence outcomes. It can be desk-based, or captured with stakeholders during a meeting. This is your theory of change, i.e. the delivery of these activities will achieve these outcomes in the short, medium and longer term. Evaluation will then test whether this happens, and explore how and why.

18 What are logic models?
A convincing picture of what you are trying to achieve, showing the links between your intended activities, outputs and outcomes.
A framework for integrating planning, delivery and evaluation.
Not reality, but your best prediction of what needs to happen to get to your outcomes.
Part of a wider planning and performance cycle.

19

20 What does a logic model look like?
A display of boxes and arrows, vertical or horizontal.
Any shape is possible.
The level of detail can be simple or complex.

21 Input → Output → Outcome
Input: the stuff that is done. Output: the results that are seen. Outcome: the impact you are looking for.

22 Why use them?
Evidence-based storytelling (a road map).
Communicate your (agreed) vision and plans.
Provide clarity on activities and outcomes.
Engage and sustain stakeholders (inspiration).
Aid planning and management.
Focus and improve implementation.
Help you know what resources are needed, and when.
Highlight assumptions and risks.
Show similarities and differences to other programme activities.
Link with the bigger picture.

23 Developing a logic model… Define the outcomes: the changes achieved as a result of the activities.

24

25 Developing a logic model… Define the activities: what does the programme actually do?

26

27 Developing a logic model… Define the outputs: the countable products.

28 Input → Output → Outcome
Input: the stuff that is done. Output: the results that are seen. Outcome: the impact you are looking for.

29

30 Now, let's consider whether we are collecting the right data to evidence whether the outcomes are achieved. Use the arrows to connect the activities to outputs, and the outputs to outcomes.
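To make the linking step concrete, here is a minimal sketch, assuming a Python representation that is not part of the original slides: the logic model is held as plain data, the "arrows" are recorded as mappings, and gaps (activities with no outputs, outcomes with no evidencing outputs) can then be listed automatically. All the example activities, outputs and outcomes are hypothetical.

```python
# Hypothetical logic model held as plain data; the two mapping entries are the "arrows".
logic_model = {
    "activities": ["weekly exercise classes", "one-to-one advice sessions"],
    "outputs": ["number of classes delivered", "number of attendees"],
    "outcomes": ["increased physical activity", "improved wellbeing"],
    # Which activity feeds which output, and which output evidences which outcome.
    "activity_to_output": {
        "weekly exercise classes": ["number of classes delivered", "number of attendees"],
    },
    "output_to_outcome": {
        "number of attendees": ["increased physical activity"],
    },
}

def find_gaps(model):
    """List activities with no outputs, outcomes with no evidencing output, and orphan outputs."""
    linked_outputs = {o for outs in model["activity_to_output"].values() for o in outs}
    evidenced_outcomes = {c for cs in model["output_to_outcome"].values() for c in cs}
    return {
        "activities_without_outputs": [a for a in model["activities"]
                                       if a not in model["activity_to_output"]],
        "outcomes_without_evidence": [c for c in model["outcomes"]
                                      if c not in evidenced_outcomes],
        "outputs_not_linked_to_any_activity": [o for o in model["outputs"]
                                               if o not in linked_outputs],
    }

print(find_gaps(logic_model))
```

Run on this hypothetical model, the sketch flags 'one-to-one advice sessions' (no output recorded for it) and 'improved wellbeing' (no output currently evidencing it), which is exactly the kind of gap the following slides ask you to prioritise.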

31

32 How to prioritise what to evaluate?
Now consider the gaps… RE:AIM (Glasgow, Boles & Vogt, 1999): Reach, Effectiveness, Adoption, Implementation, Maintenance.
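As a rough illustration only (not from the slides), the RE:AIM dimensions can be treated as a checklist against which identified gaps are tagged, so that dimensions with no planned data collection stand out; the guiding questions and example gaps below are assumptions.

```python
# Hypothetical guiding questions for each RE:AIM dimension.
REAIM = {
    "Reach": "Who is the programme reaching, and who is it missing?",
    "Effectiveness": "Is the programme achieving its intended outcomes?",
    "Adoption": "Which settings and staff have taken the programme up?",
    "Implementation": "Is the programme being delivered as planned?",
    "Maintenance": "Are effects and delivery sustained over time?",
}

# Example gaps identified from the logic model, tagged with the dimension they speak to.
gaps = [
    ("attendance not broken down by target group", "Reach"),
    ("no follow-up data after the programme ends", "Maintenance"),
]

covered = {dimension for _, dimension in gaps}
for dimension, question in REAIM.items():
    status = "gap flagged" if dimension in covered else "no gap flagged"
    print(f"{dimension}: {question} [{status}]")
```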

33 Key points:
Develop a shared sense of purpose amongst key stakeholders: identify and acknowledge roles and responsibilities within the delivery of a programme. This includes commissioners, but also others who would benefit from and/or be affected by the delivery of a programme.
Who will collect, analyse and report on the data?
Up to 10% of a programme budget should be set aside for evaluation.
Provide 'defining research' handout.

34 Next steps:
Designing data collection tools.
How and when to collect different types of data.
How to analyse and interpret different types of data.

35 Types of process evaluation data to collect may include:
(Qualitative: interviews, focus groups, surveys, monitoring data.)
Service user: How did they find out about the service? Why did they attend? How easy was it to attend? What was their experience of the service? Were their needs met?
Reach.
Service provider: How easy was it to implement the service?
Non-service users: awareness of the service and barriers to use.

36 Types of outcome evaluation data to collect may include:
(Quantitative and qualitative: interviews, focus groups, surveys, monitoring data.)
Service user: achievement of intended outcomes; unintended outcomes; impact on quality of life.
Service provider: intended and unintended outcomes.
Wider system-level outcomes: impact on partnerships and pathways; do other organisations benefit from the intervention?
Significant others: impact on quality of life.
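One way to bring the process and outcome strands together is a simple data collection plan recording what will be asked, of whom, how and when. The sketch below is a hypothetical illustration (not from the slides) of such a plan; every entry is an assumption.

```python
from dataclasses import dataclass

@dataclass
class DataItem:
    question: str   # what we want to know
    source: str     # service user, service provider, non-service user, significant other, wider system
    method: str     # interview, focus group, survey, monitoring data
    timing: str     # baseline, during delivery, end of programme, follow-up
    strand: str     # "process" or "outcome"

# Hypothetical plan entries drawn from the kinds of questions listed above.
plan = [
    DataItem("How did they find out about the service?", "service user",
             "survey", "during delivery", "process"),
    DataItem("Were their needs met?", "service user",
             "interview", "end of programme", "process"),
    DataItem("Achievement of intended outcomes", "service user",
             "monitoring data", "baseline and follow-up", "outcome"),
    DataItem("Impact on partnerships and pathways", "wider system",
             "interview", "follow-up", "outcome"),
]

for item in plan:
    print(f"[{item.strand}] {item.question} ({item.source}; {item.method}; {item.timing})")
```

Held in this form, the plan makes it easy to check that each outcome in the logic model has at least one planned data item behind it.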

37 More information: Public Health Institute
Faculty of Health, Education and Community, Liverpool John Moores University

