Evaluation tools training


BBO Shropshire: Evaluation tools training

Introductions
Who you are, plus three things about your project:
- Who is it for?
- What will they do?
- How will they benefit?

Evaluation objectives
- To obtain feedback from participants, partners and stakeholders on what is working well and less well, in order to inform changes to the delivery model as the project progresses.
- To share good practice and improve the way partners work together to improve the employment prospects of those in need.
- To provide evidence of impact which will help inform future funding bids.
In addition to these objectives, the evaluation should be:
- Meaningful
- Objective, accurate and high quality
- Embedded in the delivery of the project

Evaluation framework
- Primary and secondary research
- Analysis and report
- Learning events
- Self-evaluation (data you collect)
- Independent evaluation (data we collect)

Evaluation logic: Outcomes → Questions → Tools → Analysis and report

Outcomes
Short and medium term:
- Increased skills
- Better partnership working
- Reduced social exclusion
- Improved self-esteem and relationships
- Improved wellbeing and quality of life
- Improved employment prospects
Long term:
- Less long-term dependency on benefits
- Improved long-term prosperity and quality of life

Methods
Collected by us (independent evaluation):
- Partner research: e-surveys and interviews
- Participant research: e-survey (process), interviews/focus groups
Collected by you (self-evaluation):
- Participant monitoring data: who, what, when etc.
- Entry and exit surveys (impact)
- Outcome star
- Qualitative tools: choose from a selection
- Your own tools

Surveys
- Entry survey: at the start of involvement in BBO
- Exit survey: at the end of involvement in BBO (it doesn't matter if a participant completes the exit survey more than once)
- Outcome star: at entry, at exit, and every 6 months

Market Research Society guidelines
- Respondents must give consent to take part
- Be honest
- Be clear about why you are doing the research
- Respect confidentiality
- Ensure that respondents are not harmed or adversely affected
- Balance the needs of individuals, clients and the research
- Research should be carried out by people who have been trained

Dos and don'ts
Common issues:
- Mis-reporting
- Misunderstanding
- Reluctance to complete
- Literacy issues
- Mental capacity issues
Other scenarios: what do you do?

Completing the survey
- The survey is designed for people to complete themselves, but be available to provide support.
- Reassure them about what the data will be used for.
- Make it entirely optional.
Practice session: ask each other the questions. Is there anything they don't understand or want clarified? Are any changes needed?

Completing the survey: before you start
- Do you understand the questions?
- What type of answers are expected?
- Be prepared to answer questions.

Sampling
Why is this important? The aim is to achieve a confidence level of 95% with a margin of error of +/-5%. The smaller the population, the greater the percentage of returns you need.

Total participants | Surveys needed | % of total
1000               | 278            | 28%
1500               | 306            | 20%
2000               | 322            | 16%
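The figures in the table above follow from the standard sample-size calculation with a finite-population correction. A minimal sketch, assuming z = 1.96 (95% confidence), a +/-5% margin of error, and the worst-case proportion p = 0.5:

```python
def surveys_needed(population: int,
                   z: float = 1.96,       # z-score for 95% confidence
                   margin: float = 0.05,  # +/-5% margin of error
                   p: float = 0.5) -> int:
    """Surveys needed for a finite population (Cochran's formula
    with finite-population correction)."""
    # Sample size for an effectively infinite population
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    # Adjust downward for the finite population, round to whole surveys
    return round(n0 / (1 + (n0 - 1) / population))

for total in (1000, 1500, 2000):
    n = surveys_needed(total)
    print(total, n, f"{n / total:.0%}")
```

Running this reproduces the table (278/28%, 306/20%, 322/16%) and lets you work out targets for other cohort sizes.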

Practice
Run through the survey with each other. Any issues?

Outcome star
Complete at start, at end, and every 6 months.
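Because the star is scored repeatedly, entry and exit readings can be compared to show the distance travelled per domain. A hypothetical sketch; the domain names and 1-10 scores below are illustrative only, not the actual BBO star:

```python
# Hypothetical outcome-star readings on a 1-10 scale (illustrative only)
entry = {"confidence": 3, "job skills": 2, "wellbeing": 4}
exit_ = {"confidence": 6, "job skills": 4, "wellbeing": 7}

# Distance travelled per domain, and the average across domains
change = {domain: exit_[domain] - entry[domain] for domain in entry}
average = sum(change.values()) / len(change)

for domain, delta in change.items():
    print(f"{domain}: {delta:+d}")
print(f"average change: {average:+.1f}")
```

The same comparison across all participants gives a simple impact measure to report alongside the entry/exit surveys.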

Qualitative tools
Quantitative tools (surveys) will tell you WHAT you have achieved; qualitative tools tell you WHY and HOW. You get to choose from:
- Observation
- Interviews
- Stories

Observation
About the tool: the person leading or observing the sessions tracks the progress of individuals or the group.
Advantages:
- It is easy to do and requires little time from participants in the group.
- It allows the observer to consider changes in individuals and in the way the whole group interacts.
- It works well with groups that struggle with surveys and with participants who may not recognise the changes in themselves.
Disadvantages:
- You are making a judgement about the change that has happened in people, so it is your opinion rather than the participants'.
- The observer needs to be systematic, so it takes time from the group leader.
- It can take time to record and analyse the data.

Interviews
About the tool: you ask people a set of questions about their involvement in the project.
Advantages:
- Interviews are easy to do and give you key insights into why or how your project is working.
- They provide lots of information.
- You can use formats like video or podcasts to collect and then share what participants say.
- They can be a great way of involving volunteers in the project.
Disadvantages:
- They are generally conducted one to one, so they can be time consuming and take the group leader away from the session.
- They generate a lot of information that needs to be analysed.

Stories
About the tool: people tell you about their experience of being involved in your project or activity.
Advantages:
- Stories allow participants to tell you in their own way what they have got out of a project.
- You can use writing, drawing, film or audio to collect people's feedback.
- You can do it formally or informally at the end of sessions.
- Stories can generate material for case studies and for showcasing to funders, as well as giving you insight into your project.
Disadvantages:
- Collecting lots of stories can generate a lot of information to analyse.

Your choice
Pick a tool! Every quarter, complete either:
- 3 interviews,
- 3 stories, or
- 3 observations.

Evaluation timetable
- Next week: final versions of tools and guidance issued.
- Data you have collected: send to us by end of November 2017.
- Sep-Nov 2017: we will run e-surveys/interviews with partners and with participants.
- Year 1 report: end of December 2017.
- Learning to Action workshop: January 2018.