Monitoring and evaluation during SALW Awareness programmes
Name, Organisation, Event Location, Date
© SEESAC, 2006
Part 1 – 30 minutes
1. Define monitoring and evaluation
2. Explain the differences between them
3. Explain why they are important
Part 2 – … minutes
1. Describe the different levels at which monitoring and evaluation can be carried out
Part 3 – 40 minutes
1. List some common tools for doing monitoring and evaluation
2. Explain why it is best to use a mix of tools
3. List some common criteria for selecting monitoring and evaluation tools
Part 1
Exercise 1
1. What is monitoring?
2. What is evaluation?
3. What are the similarities and differences?
4. Is it important to do M&E, and if so, why?
What are monitoring and evaluation?
An informal explanation: monitoring and evaluation are two forms of activity that aim to measure how your work is progressing.
- Measurement during a project = monitoring
- Measurement after a project = evaluation
A formal definition (SASP 2):
Monitoring and evaluation (M&E) = collecting and analysing information to determine whether those groups engaged by a SALW awareness programme have, as a result of the intervention, changed their awareness of, and attitudes and behaviour towards, SALW, in line with the stated programme objectives.
Table 9 (SASP 2):

Monitoring
- What it involves: tracking progress towards the achievement of objectives, in order to identify what is working and what isn't working so well, allowing a degree of adaptability in strategy and tactics as appropriate.
- When it occurs within the programme cycle: ongoing.

Evaluation
- What it involves: a more structured and formal process of reviewing achievements, in order to make judgements about past effectiveness and learn from experience to improve future practice.
- When it occurs within the programme cycle: at fixed times. For projects lasting more than 18 months, this would normally include a mid-term review as well as an evaluation conducted at the completion of the project.
Why are they important?
- Test for effectiveness
- Better appreciation of the area in which you are working
- Learn how events have affected the work
- Adapt the programme both during the project life-cycle (monitoring) and before the next phase (evaluation)
- Identify good practice to use elsewhere
- Information to share with others
The programme cycle (stages from the cycle diagram):
- Feasibility study
- Needs and capacity assessment
- Analysing and planning
- Planning for monitoring and evaluation
- Designing activities and materials
- Field-testing
- Implementing activities
- Monitoring and reviewing
- Evaluation
The time difference (timeline diagram): research + analysis → design → implement → evaluate. Monitoring runs throughout the project; evaluation comes at the end.
Part 2
Imagine you are doing SALW Awareness…
Levels of M&E (SASP 2):

Activities
- Purpose: to assess how well the programme has been organised and whether resources have been used efficiently.
- Key questions: Are we sending people the correct messages? Are the messages reaching the right people?

Outcomes
- Purpose: to identify changes in knowledge, attitudes and behaviour among target groups that can be reasonably attributed to the programme.
- Key questions: Are there any signs that knowledge, attitudes and beliefs are changing? Are there any signs that behaviour is changing?

Impact
- Purpose: to explore how a particular programme may have made a difference to the lives of specific groups of people, e.g. better security.
- Key questions: What impact has the programme had in terms of security / casualties etc.?
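Outcome-level questions like these are often answered by comparing baseline and follow-up survey results for the same indicator. A minimal sketch of that comparison, using entirely hypothetical survey figures (the indicator, counts and variable names below are illustrative, not from SASP 2):

```python
# Hypothetical baseline vs. follow-up survey results for one outcome-level
# indicator: share of respondents agreeing "keeping a weapon at home is unsafe".
baseline = {"agree": 210, "total": 600}   # before the awareness campaign
followup = {"agree": 342, "total": 620}   # after the campaign

def share(survey):
    """Proportion of respondents agreeing with the indicator statement."""
    return survey["agree"] / survey["total"]

# The change over time is the evidence that attitudes may be shifting;
# attribution to the programme still needs separate judgement.
change = share(followup) - share(baseline)
print(f"Baseline: {share(baseline):.1%}, "
      f"follow-up: {share(followup):.1%}, change: {change:+.1%}")
```

A positive change is only a sign, not proof, that the programme caused the shift; the impact level (and its confounding factors) is discussed below.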
Activity monitoring examples:
- Monitoring a TV spot: watch TV!
- Monitoring a community meeting: phone the local organiser afterwards.
Outcome monitoring example: Balkan Youth Union (BYU)
- 5th April 2003, central Belgrade: BYU and children destroyed 500 toy weapons
- Puppet show, T-shirts, MUP information leaflets to support the collection
- Media: RTS and RTV B-92, as well as the journalists of the dailies DANAS, BLIC and POLITIKA

Outcome evaluation by:
- Letters to BYU (hundreds)
- Media coverage of interviews with the public
Impact monitoring and evaluation:
- Difficult, because other factors also influence impact
- Possible indicators: casualty figures, crime levels, observed weapons visibility
Six questions…
1. How has your awareness programme reduced the number of casualties resulting from weapons in target communities?
2. Are the messages being promoted reaching the right people?
3. Are there any signs of changes in practice or behaviour?
4. How has your awareness programme changed security in targeted communities?
5. Are there any signs that knowledge, attitudes and beliefs are changing?
6. Are the messages being promoted the right ones?
Part 3
Things to consider:
- Cost
- Staffing
- Skill levels
- Representativeness
- Geographic coverage
- Depth of explanation
- Access to social groups
- Level of participation
Common M&E tools:
- Interviews
- Focus groups
- Questionnaires
- Secondary (desk) research
- Participatory methods
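One simple way to apply the selection criteria from the previous slide to these tools is a weighted scoring matrix. The sketch below uses entirely hypothetical scores and weights (none of the numbers come from SASP guidance); cost is scored so that cheaper tools score higher:

```python
# Hypothetical 1-5 scores for each tool against three selection criteria.
scores = {
    "Interviews":     {"cost": 2, "depth": 5, "coverage": 2},
    "Focus groups":   {"cost": 3, "depth": 4, "coverage": 3},
    "Questionnaires": {"cost": 4, "depth": 2, "coverage": 5},
    "Desk research":  {"cost": 5, "depth": 2, "coverage": 4},
}
# Weights reflect one programme's (hypothetical) priorities; they sum to 1.
weights = {"cost": 0.3, "depth": 0.4, "coverage": 0.3}

def weighted_score(tool_scores):
    """Combine a tool's criterion scores into one weighted total."""
    return sum(weights[c] * s for c, s in tool_scores.items())

ranked = sorted(scores, key=lambda t: weighted_score(scores[t]), reverse=True)
for tool in ranked:
    print(f"{tool}: {weighted_score(scores[tool]):.2f}")
```

Note that no single tool dominates every criterion here, which illustrates why the session recommends using a mix of tools rather than relying on one.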
Group exercise…
- Group 1: cheap; gives a deep understanding of target groups' feelings about SALW; captures women's views well.
- Group 2: allows generalisations to be made about the wider population; allows respondents to participate.
- Group 3: builds the skills of respondents; captures information about unexpected impacts.