
1 Applied Methodologies in PHI Session 5: Evaluation Kath Roberts (NOO/EMPHO)

2 Learning Objectives
1. To understand what is meant by evaluation
2. To increase awareness, through examples, of the use of evaluation in public health
3. To increase awareness of the process of evaluation

3 Evaluation What is it? Why do it? Who is it for? How is it done?

4 What it is “Evaluation is concerned with assessing an activity against values and goals in such a way that results can contribute to future decision making and/or policy.” (Tones & Tilford, 1994)

5 What it is “Evaluation is attributing value to an intervention by gathering reliable and valid information about it in a systematic way, and by making comparisons, for the purpose of making more informed decisions or understanding causal mechanisms or general principles.” (Ovretveit, J., 1998)

6 What it is “Evaluation is the formal process of judging the ‘value’ of something.” (Nutbeam & Bauman, 2006)

7 Myths about evaluation. Evaluation:
- does not always involve extensive questionnaires;
- does not have to mean employing expensive consultants;
- is not simply about counting everything;
- does not necessarily have to involve a control group;
- is not something that should be tagged onto the end of a project.

8 Why evaluate?
- Finding out whether a project’s aims and objectives have been achieved
- Assessing what else has been achieved
- Finding out what went well and what could be improved
- Influencing a project’s development

9 Why evaluate?
- Feeding back progress to everyone, including funding bodies and supporters
- Monitoring progress
- Demonstrating that resources are well allocated (or not)
- Sharing experiences with others, including potential funders and decision makers

10 What can be evaluated?
- services, new or well-established
- one-off interventions
- policy changes
- screening programmes
- media campaigns
- surveillance systems
- outbreak investigations
- communication methods: newsletters, websites, etc.
- IT systems and other tools
- a training course!
... pretty much anything

11 Questions that evaluations can answer
- What impact is the intervention/service having on the community?
- Have service users/the local community benefited? Are we meeting their needs?
- Are we reaching the right people? Who are we reaching? Who are we not reaching? Why?
- Are our services of a high standard? Do they provide good value for money?
- Which services are well regarded/not so well regarded?
- How well are we working with partners?

12 When to evaluate?
- Plan evaluation from the beginning; it is not an ‘add-on’
- Thinking about evaluation can help focus an intervention’s aims and objectives
- Allocate resources to evaluation: WHO suggests 10% of a project’s budget

13 Who wants to know? There are many potential stakeholders, so focus on the needs of the audience:
- health practitioners and those involved in delivering and managing the service/intervention
- the community served or affected
- policy makers/commissioners/funders

14 Who wants to know? Different motivators and ‘drivers’ for an evaluation can prompt different questions about a project that need answering:
- What defines ‘success’?
- What are seen as the project’s strengths and weaknesses? How can these be measured?
- Who pays for the evaluation? Who analyses the information?
- How are results to be shared?

15 Who wants to know? Different perspectives on ‘success’:
- number of participants receiving the intervention
- % with a measured change in a key indicator, perhaps compared with a control group
- % reduction in body weight
OR:
- What actually happened?
- How did the participants feel about it?
- How was it delivered?

16 How to evaluate? Key questions to consider:
1. Why might we expect the project to work?
2. Does the project work?
3. How does the project work?

17 How to evaluate: evaluation involves comparing, e.g.:
- a group which gets the intervention vs. one which doesn’t (controlled trial)
- the state of people, populations, organisations or services before and after an intervention
- the achievements of the intervention against the objectives set at the outset
Note that audit is not the same as evaluation: it focuses on processes against standards, NOT on impacts. A minimal sketch of the first two comparisons follows.
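To make the two quantitative comparisons above concrete, here is a minimal sketch in Python. All data, names and numbers are invented for illustration only; a real evaluation would use the indicators defined in the project’s own objectives.

# Hypothetical weight (kg) measurements for the same five participants.
import numpy as np

before = np.array([92.0, 88.5, 101.2, 79.4, 95.8])
after = np.array([89.1, 87.0, 97.5, 78.9, 92.3])

# Before-and-after comparison: mean change in the intervention group.
intervention_change = after - before
print(f"Mean change after intervention: {intervention_change.mean():.1f} kg")

# Controlled comparison: change over the same period in a group
# that did not receive the intervention.
control_change = np.array([-0.4, 0.8, -1.1, 0.2, -0.5])
effect = intervention_change.mean() - control_change.mean()
print(f"Difference vs. control group: {effect:.1f} kg")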

18 How to evaluate: steps towards a robust evaluation design. Consider the type/stage of evaluation:
- Formative
- Process
- Impact/Outcome

19 How to evaluate: steps towards a robust evaluation design. Consider the evaluation design:
- Experimental
- Quasi-experimental
- Pre-experimental

20 Towards an evaluation design: clarifying the customer’s needs
1. What is the purpose of the evaluation?
2. Who are you hoping to influence?
3. What outputs and outcomes will they value?
4. What activities are currently underway and what are you planning to do next?
5. What initial results do you expect from the project? (early wins)
6. What medium-term outcomes do you expect from the project (i.e. in the next year/18 months)?
7. What long-term outcomes do you expect from the project (i.e. after 5 years)?

21 Towards an evaluation design: reliable and valid information is key!
- Relate the evaluation to aims and objectives (SMART measures!)
- Data collection methods: direct and indirect measures, focus groups, interviews, questionnaires
- Analysis and reporting

22 Towards an evaluation design: reliable and valid information is key!
- Think about what information is needed for the evaluation: process and outcome data
- Think about the different ways you can collect additional information that might be useful (quantitative? qualitative?) and try to avoid collecting data that will not be useful
- Ensure consistent data collection mechanisms
- There is no single right or wrong way to collect information: be creative

23 Towards an evaluation design: think about the people you can collect information from
- professionals
- managers
- frequent users
- infrequent/sporadic users
- partner organisations
- the local community, and its different component parts
- politicians
- anyone involved in or affected by the service or intervention!

24 Reporting and analysis
- Often data are collected and never analysed: a waste!
- Data need to be analysed in order to LEARN from the evaluation
- Even basic analysis can reveal important findings
- The type of analysis needs to be closely related to the study design: choose appropriate statistical tests (for quantitative analysis) and an appropriate analytical method (for qualitative studies)
- A number of texts can help with this, or ask your PHO!
A minimal illustration of a simple quantitative test follows.
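As one hedged illustration of choosing a statistical test, the sketch below applies Welch’s t-test (which does not assume equal variances) to invented weight-change data for an intervention and a control group. The right test always depends on the study design and the data; this is only one plausible choice, not a recommendation for every evaluation.

# Hypothetical weight change (kg) per participant; negative = loss.
from scipy import stats

intervention = [-2.9, -1.5, -3.7, -0.5, -3.5, -2.1, -1.8]
control = [-0.4, 0.8, -1.1, 0.2, -0.5, 0.3, -0.9]

# Welch's t-test: do the two groups differ in mean weight change?
result = stats.ttest_ind(intervention, control, equal_var=False)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")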

25 Reporting and analysis: how to present the data analyses?
- Is there going to be a written report and/or other ways of reporting the findings of the evaluation?
- Will the findings need to be reported to different audiences using different formats?
- How can the clarity of the report be ensured? (it should use clear, plain language, and be honest about its findings!)
- Ensure the style of reporting suits the audience’s needs and the evaluation’s aims

26 Some things to think about (1)
- Every intervention/service should be evaluated
- Try to build evaluation into the intervention/service as early as possible
- Be clear and transparent about why you are evaluating it
- Evaluation time and resources should be in proportion to the overall project
- Think about what future funders would want to know
- ETHICS! Give consideration to this to ensure the rights, safety, dignity and well-being of participants

27 Some things to think about (2)
- Involve people from the start: all stakeholders
- Keep it simple and (if possible) fun
- Make sure that everyone is using the same terminology
- Provide plenty of feedback to stakeholders and participants
- Think qualitative as well as quantitative
- Share all outcomes: positive, negative, unexpected
- Share findings and learning as widely as possible

28 Useful additional reading
National Obesity Observatory, 2009. Standard Evaluation Framework for Weight Management Interventions. www.noo.org.uk\SEF
Nutbeam, D. & Bauman, A., 2006. Evaluation in a Nutshell: A Practical Guide to the Evaluation of Health Promotion Programs. Sydney: McGraw Hill.
Pawson, R. & Tilley, N., 2007. Realistic Evaluation. London: Sage Publications.

29 Case Study/Exercise: HEALTH FOR LIFE – weight management intervention

