Introduction to Evaluation Odette Parry & Sally-Ann Baker

Aim and objectives of our presentation
 Define evaluation and examine how it differs from research
 Briefly introduce different evaluation designs and data collection approaches - strengths and limitations
 Developing evaluation plans
 Governance and ethical issues

What is evaluation?
 Evaluation is "a set of procedures to judge a service's merit by providing a systematic assessment of its aims, objectives, activities, outputs, outcomes and costs." (NHS Executive, 1997)
 Evaluation is learning about 'what works' and lessons for future development
 Evaluation determines how a service is doing

So why evaluate?
 Provides evidence of:
- Is what we are doing working?
- What are the benefits and impacts?
- What was successful and what was not? And why?
- Have objectives been achieved, or are they being achieved?

So why evaluate?
 Provides evidence for:
- Stakeholders
- Further programme development
- Staff development
- Other organisations
- Funders

What do you want to find out?
 What is happening and how often?
 How is it happening and why is it happening as it is?

Approaches
 The 'What and How Much' question in evaluation addresses measurable 'OUTCOMES', associated with Quantitative approaches (e.g. RCTs and surveys)
 The 'How and Why' question addresses PROCESS, associated with Qualitative approaches (e.g. semi-structured interviews, observation and focus groups)

Horses for Courses
 Evaluation usually requires both Outcome & Process information
 Evaluation may be Formative and/or Summative
 Small scale evaluations, while most often qualitative, do collect some quantifiable data

Plan your evaluation
 Evaluation is not a 'bolt on'
 Evaluation is key to informing project development and delivery
 Plan early
 Involve funders and stakeholders in the planning process

Define your purpose
 Clearly define the purpose of the evaluation
 What are the main aims and objectives?
 This is a key step in the planning process and will guide how the findings will be used

Develop an evaluation plan
 What is being evaluated
 Purpose of the evaluation
 What is known already
 The questions the evaluation will address
 Evaluation method to be used
 Who & where the evaluation will take place
 Timescales
 The resources available
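As an illustration, the headings above could be captured as a simple planning template with a completeness check before fieldwork begins. All field names and values below are invented for the sketch, not taken from any actual evaluation.

```python
# Minimal sketch of an evaluation plan as a structured template.
# Every field name and value here is hypothetical.
evaluation_plan = {
    "what_is_evaluated": "Community health courses (example)",
    "purpose": "Assess whether course objectives are being met",
    "what_is_known": "Attendance figures from routine records",
    "questions": ["Are objectives achieved?", "What are the barriers?"],
    "methods": ["existing data", "interviews", "questionnaires"],
    "who_and_where": "Participants and staff, community venues",
    "timescale_months": 12,
    "resources": "0.5 FTE evaluator",
}

# A quick check that no planning heading has been left blank
missing = [field for field, value in evaluation_plan.items() if not value]
print("Missing fields:", missing)
```

Such a template makes it easy to see at a glance whether any of the slide's headings has been skipped during planning.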

Be realistic about what can be achieved
 Evaluation is often a compromise between the ideal and the achievable:
- the wants of different groups
- constraints of methodology
- evaluator skills
- resource limitations
- time limitations
- ethical and governance issues

Different ways to collect data
 Decide which methods to use in order to get the information you need. You may use one or more of the following:
- Existing data
- Document analysis
- Interviews
- Self-completed questionnaires
- Observation
- Focus groups

Using existing data
 Use routinely collected data to examine process and outcomes
 Can save time, but you need to ensure that the data are collected in a form that can be analysed and are consistent with the evaluation plan
 The evaluation of the All Wales Dietetics Scheme used minimum data sets, developed in conjunction with WAG and project dieticians to ensure consistency with the evaluation aims
 Collected data on project activity, e.g. courses run, start and completion dates, description of activities, participant details, course outcomes, partnerships built etc.
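As an illustration only (the scheme's actual minimum data set is not shown here), routinely collected activity records could be summarised with a short script. All field names and records below are hypothetical.

```python
from collections import Counter
from datetime import date

# Hypothetical minimum data set records; field names are illustrative only
records = [
    {"course": "Healthy Eating", "start": date(2008, 1, 7),
     "end": date(2008, 3, 31), "completed": True},
    {"course": "Healthy Eating", "start": date(2008, 4, 14),
     "end": date(2008, 6, 30), "completed": True},
    {"course": "Food Skills", "start": date(2008, 2, 4),
     "end": date(2008, 4, 28), "completed": False},
]

# Project activity summary: how often each course ran, and completions
courses_run = Counter(r["course"] for r in records)
completions = sum(r["completed"] for r in records)

print(courses_run)
print(completions, "of", len(records), "courses completed")
```

The point of the sketch is that data collected in a consistent, structured form can be summarised directly, whereas inconsistent routine records often cannot.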

Document analysis
 Policy documents, minutes, operational policies
- Track project development and the aims of the project
 Can provide rich information
 May only provide part of the picture
 May be open to subjective interpretation

Observation
 Observing participants in activities
 Participant vs non-participant
 Observer effects
 Not suitable for some settings
 Can be difficult to collect & record data

Interviews
 Enable in-depth exploration of how people think & feel about certain topics, the effectiveness of your intervention etc.
 Rich data, allowing in-depth understanding
 Can explore more sensitive areas
 Can be tailored to the individual
 No group influence
 Resource intensive

Focus groups
 Investigation of how groups perceive topics and view the effectiveness of your intervention
 Clarify issues identified in surveys
 Provide solutions to problems
 Less resource intensive than 1-to-1 interviews
 Group management issues
 Group process issues

Questionnaires
 Identification of patterns and trends
 Investigate needs, expectations, perspectives, preferences, satisfaction, knowledge
 Large samples, relatively lower cost
 Ease of analysis
 Low response rates
 Respondent bias
 Language and literacy issues
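Two of the points above, ease of analysis and low response rates, can be shown in a minimal sketch. The numbers and the 1-to-5 rating scale below are invented purely for illustration.

```python
from collections import Counter

# Hypothetical questionnaire returns: None marks a blank/spoiled return,
# otherwise a satisfaction rating on an assumed 1-5 scale
questionnaires_sent = 200
returned = [5, 4, 4, 3, 5, 2, 4, None]  # illustrative subset only

responses = [r for r in returned if r is not None]
response_rate = len(responses) / questionnaires_sent  # low rates are a known limitation

# Simple pattern: the distribution of satisfaction ratings
distribution = Counter(responses)
print(f"Response rate: {response_rate:.1%}")
print(distribution)
```

Reporting the response rate alongside the results lets readers judge how far respondent bias may affect the findings.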

Analysing your data
 To avoid bias, involve more than one person in the task
 Address your key questions
 Combine data types & findings from different sources
 Compare the views of different groups
 Don't anticipate results; look for unexpected findings
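Comparing the views of different groups often comes down to a simple cross-tabulation of coded responses. The groups and codes below are hypothetical examples, not data from any real evaluation.

```python
from collections import defaultdict

# Hypothetical coded responses: (stakeholder group, coded view of the service)
coded = [
    ("staff", "positive"), ("staff", "negative"), ("staff", "positive"),
    ("service users", "positive"), ("service users", "positive"),
    ("funders", "negative"),
]

# Cross-tabulate views by group so differing perspectives become visible
crosstab = defaultdict(lambda: defaultdict(int))
for group, view in coded:
    crosstab[group][view] += 1

for group, views in sorted(crosstab.items()):
    print(group, dict(views))
```

Laying the groups side by side like this makes unexpected divergences (say, funders and staff disagreeing) easy to spot rather than anticipated.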

Evidence of impact
Focus on whether:
 the purpose has been achieved
 the needs of those who take part have been met
 there are unintended outcomes arising from the intervention
 the intervention has resulted in changes in behaviour
 there are barriers to and facilitators of successful implementation

Using your findings
 Inform strategy and policy development
 Inform budgets
 Inform funding proposals
 Inform improvement plans & make changes
 Identification of training needs
 Identification of areas for future research/evaluation

Presenting evaluation findings
 Present a balanced view; don't just report the positive
 Consider the needs of your audience
 Different versions of reports or types of presentation might be needed

Reflect on the evaluation process
 Were aims and objectives met? If not, why not?
 Were the methods employed appropriate?
- Tools, recruitment methods, analysis etc.
 Did you reach target groups?
 Were resources sufficient?
 What changes resulted?
 Future impacts?

Governance issues
 Be careful about crossing over from evaluation into the realms of research, which carries additional requirements for ethics approval etc.
 Take advice
 Take an ethical/professional approach to the evaluation; consider the participant and yourself

Ethics and the rights of participants
 Requirements of the Data Protection Act (1998)
 Some evaluations may require ethical approval
 Issues to consider:
- Informed consent
- Deception
- Debrief
- Withdrawal
- Confidentiality

Data Protection Act 1998 – your responsibility
 Only hold genuinely required information
 Certain types of sensitive information carry added restrictions and may only be used with permission
 Information should only be collected for the purposes for which it was initially required
 Data must only be processed in accordance with the legislation
 People must not be harmed by how their information is used
 People have a right to view information being held about them
 Information must be accurate, kept up to date, kept secure and deleted when obsolete
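One of the principles above, that information must be deleted when obsolete, can be sketched as a routine retention check. The record structure, field names and dates below are entirely hypothetical.

```python
from datetime import date

# Hypothetical held records, each tagged with a retention date
held = [
    {"participant_id": 1, "retain_until": date(2019, 12, 31)},
    {"participant_id": 2, "retain_until": date(2030, 12, 31)},
]

today = date(2020, 6, 1)  # fixed date so the example is reproducible

# Keep only records still within their retention period; the rest are obsolete
current = [r for r in held if r["retain_until"] >= today]
print([r["participant_id"] for r in current])
```

Building the retention date into the record itself, rather than relying on ad hoc clean-ups, makes the "deleted when obsolete" obligation routine rather than exceptional.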

In summary
 Evaluation is an essential component
 Need to plan
 Variety of approaches
 Methods used depend on the questions to be asked
 Consideration of governance issues

Resources
 Data Protection Act: see the Information Commissioner's pages
 National Research Ethics Service: if your evaluation takes place on NHS premises or with staff or patients, ethical approval might be needed; check with your local R&D office and LREC administrator
 The UK Evaluation Society has a useful resource page