Note to evaluator… The overall purpose of this presentation is to guide evaluators through the completion of step 7 of the UFE checklist and to “level the field” for the simulation (step 8). The main goal is to guide the primary user through the definition of intended outcomes and the selection of required data and appropriate methods to respond to the key evaluation questions. (PLEASE READ THE NOTES SECTION OF THE DIFFERENT SLIDES) Please adapt this presentation to the context of the project that you are evaluating and to your facilitation style.

Facilitating UFE step-by-step: a process guide for evaluators
Joaquín Navas & Ricardo Ramírez
February 2010
Module 3: Step 7 of the UFE checklist

Meeting Objectives
1. Review the report of the previous meeting(s) and validate the preliminary KEQ analysis.
2. Define the program's intended outcomes.
3. Define the data required to respond to the KEQs.
4. Select appropriate methods for data collection (step 7 of UFE).

Agenda
1. Discussion of the report of the previous meeting(s) – validation of the preliminary KEQ analysis.
2. Identification of the intended outcomes of the program.
3. Break.
4. Definition of required data.
5. Selection of appropriate data collection methods.

What we have accomplished so far…
1. A first draft of the KEQs that seems useful for guiding the remainder of the evaluation process.
2. The first 6 steps of the UFE checklist have been covered.
3. The process has been well documented up to this point.

Comments on Previous Report

Comments on Second Report
- Does the report describe the process well?
- Is it worth documenting the process in such a detailed manner?
- Are you happy with the KEQs?
- Is the analysis presented in the report valid?

KEQ Validation Analysis

#      | Key Evaluation Question | Related primary intended use | KEQ category | Complies with the desired KEQ features? | Related specific program objective
KEQ #1 |                         |                              |              |                                         |
KEQ #2 |                         |                              |              |                                         |
KEQ #3 |                         |                              |              |                                         |
KEQ #4 |                         |                              |              |                                         |
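For facilitators who keep the matrix electronically, a minimal sketch of one way to record each row as structured data (Python; the class and field names are hypothetical, not part of the UFE checklist):

    from dataclasses import dataclass

    @dataclass
    class KEQRecord:
        """One row of the KEQ validation matrix above."""
        number: int                   # KEQ #1, #2, ...
        question: str                 # the key evaluation question itself
        primary_intended_use: str     # related primary intended use
        category: str                 # e.g. "OUTCOMES", "PROCESS"
        meets_desired_features: bool  # complies with the desired KEQ features?
        related_objective: str        # related specific program objective

    # Filled in during the meeting; "…" marks content to be supplied.
    matrix = [
        KEQRecord(1, "…", "…", "OUTCOMES", True, "Objective #1"),
    ]
    for row in matrix:
        print(f"KEQ #{row.number}: {row.category} -> {row.related_objective}")

Keeping the matrix in this form makes it easy to revisit and extend in later steps, for example when the required data per KEQ are added.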

Project's specific objectives
1. Objective #1.
2. Objective #2.
3. Objective #3.
This slide is only for reference, in case someone in the audience needs to look at the objectives to discuss the table on slide #8.

Categories of key evaluation questions
- INPUT / RESOURCES
- IMPACT
- OUTCOMES
- APPROACH / MODEL
- PROCESS
- QUALITY
- COST-EFFECTIVENESS
(Adapted from Dart, 2007.)
This slide is only for reference, in case someone in the audience needs to look at the KEQ categories to discuss the table on slide #8.

What makes good KEQs? (adapted from Dart, 2007)
- Specific enough to be useful in guiding you through the evaluation.
- Broad enough to be broken down – they are not the same as a question in a survey.
- Data (qualitative/quantitative) can be brought to bear on the KEQ.
- KEQs are open questions (they cannot be answered yes or no!).
- They have meaning for those developing the plan.
- They lead to a useful, credible evaluation.
- There aren't too many of them (2–4 is enough).
This slide is only for reference, in case someone in the audience needs to look at the desired KEQ features to discuss the table on slide #8.

Utilization-Focused Outcomes Framework as roadmap
Elements of the framework:
- Participant target group
- Desired outcomes for the target group
- Outcome indicators
- Performance targets
- KEQ
- Details of data collection
- How results will be used
Adapted from Patton (2008): Utilization-Focused Outcomes Framework.
DO NOT SHOW THIS SLIDE

The trajectory of change…
INPUT / RESOURCES ▼ ACTIVITIES ▼ OUTPUTS ▼ OUTCOMES ▼ IMPACT / RESULTS
(Diagram side label: CONTROL & PREDICTION, diminishing along the chain and ending in a “?” at the impact end.)

Focusing on outcomes (1/17)
DESIRED/EXPECTED OUTCOMES
Desired or expected outcomes that would result from the program that is the subject of this evaluation.
- What are you trying to achieve with your program?
- What types of changes do you want to see in the program participants in terms of behaviour, attitude, knowledge, skills, status, etc.?

Focusing on outcomes (2/17)
DESIRED/EXPECTED OUTCOMES

Specific objectives   | Outcomes (What do you want to achieve?) | Type of change
Project objective #1  | Outcome #1                              | X
Project objective #2  | Outcome #2                              | Y
Project objective #3  | Outcome #3                              | X, Y, Z
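If it helps to keep the objective–outcome pairing at hand during the discussion, a minimal sketch of the table above as data (Python; the variable names and the X/Y/Z labels are illustrative placeholders):

    # (specific objective, expected outcome, types of change) — mirrors the table above
    objective_outcomes = [
        ("Project objective #1", "Outcome #1", {"X"}),
        ("Project objective #2", "Outcome #2", {"Y"}),
        ("Project objective #3", "Outcome #3", {"X", "Y", "Z"}),
    ]
    for objective, outcome, changes in objective_outcomes:
        print(f"{objective} -> {outcome} (change types: {', '.join(sorted(changes))})")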

BREAK

Focusing on outcomes (3/17)
DETAILS OF DATA COLLECTION
What data do you need in order to answer the KEQs?

Focusing on outcomes (4/17)
DETAILS OF DATA COLLECTION

#      | Key Evaluation Question | Required data | Other considerations for the evaluation
KEQ #1 |                         |               |
KEQ #2 |                         |               |
KEQ #3 |                         |               |
KEQ #4 |                         |               |

Focusing on outcomes (5/17)
DETAILS OF DATA COLLECTION
What methods could be used to collect the required data?

Focusing on outcomes (6/17)
DETAILS OF DATA COLLECTION
1. There is no magic key to tell you the most appropriate method to answer your KEQ.
2. All methods have limitations, so try using a combination of methods.
3. Each type of question suits specific approaches/methods – so let them guide you. Other factors to consider: time, cost, resources, knowledge.
4. Primary users should be the ones to determine what constitutes credible evidence. The primary user should feel comfortable with the selected methods and the collected data.
Adapted from Dart, 2007.

Focusing on outcomes (7/17)
DETAILS OF DATA COLLECTION
COMPATIBILITY BETWEEN METHODS AND QUESTION CATEGORIES
Impact: Contribution Analysis / Data trawl & expert panel / GEM.
Outcomes: OM / MSC / GEM.
Approach/Model: Comparative studies of different approaches.
Process: Evaluation study: interview process, focus groups.
Quality: Audit against standards, peer review.
Cost-effectiveness: Economic modeling.
Adapted from Dart, 2007.
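If the group wants a quick lookup while matching questions to methods, a minimal sketch of the compatibility list above as a dictionary (Python; the structure and function name are illustrative, the method names are from Dart, 2007):

    # Candidate data collection methods per KEQ category (from the list above).
    METHODS_BY_CATEGORY = {
        "impact": ["Contribution Analysis", "Data trawl & expert panel", "GEM"],
        "outcomes": ["Outcome Mapping", "Most Significant Change", "GEM"],
        "approach/model": ["Comparative studies of different approaches"],
        "process": ["Interview process", "Focus groups"],
        "quality": ["Audit against standards", "Peer review"],
        "cost-effectiveness": ["Economic modeling"],
    }

    def candidate_methods(category: str) -> list[str]:
        """Return the candidate data collection methods for a KEQ category."""
        return METHODS_BY_CATEGORY.get(category.lower().strip(), [])

    print(candidate_methods("Outcomes"))  # ['Outcome Mapping', 'Most Significant Change', 'GEM']

Remember that the table is a guide, not a rule: point 1 on the previous slide still applies.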

Focusing on outcomes (8/17)
DETAILS OF DATA COLLECTION – METHODS SUMMARY (1/3)
Contribution Analysis: Seeks evidence of the link between a given activity and an outcome, in order to show change trends that have resulted from an intervention. It does not intend to show linear causality.
Data Trawl: Data search and analysis across dispersed literature in order to identify relationships between activities and outcomes.
GEM (Gender Evaluation Methodology): Links gender and ICT through relevant indicators. Read more:

Focusing on outcomes (9/17)
DETAILS OF DATA COLLECTION – METHODS SUMMARY (2/3)
Outcome Mapping: Focuses on mid-term outcomes, suggesting that in the best-case scenario these outcomes will lead to long-term impact in a non-linear way. Read more:
Most Significant Change: Seeks to identify the most significant changes based on participants' stories. Read more:
Expert panels: A group of experts is invited to comment on and analyze outcomes and how they relate to possible impacts. Read more:

Focusing on outcomes (10/17)
DETAILS OF DATA COLLECTION – METHODS SUMMARY (3/3)
Comparative studies of different approaches: Self-explanatory.
Interview process: Interviews about how participants experienced the process of the project that is the subject of the evaluation.
Focus groups: Self-explanatory.
Audit against standards: This might refer to a comparative analysis against specific standards.
Peer reviews: Self-explanatory.
Economic modeling: Not sure what this method refers to.

Focusing on outcomes (11/17)
DETAILS OF DATA COLLECTION
Given the primary intended USES of the evaluation, do you think that the results obtained with these methods will be:
- Credible (accurate)?
- Reliable (consistent)?
- Valid (true, believable and correct)?

Focusing on outcomes (12/17)
DETAILS OF DATA COLLECTION
Do you think that these methods are:
- Cost-effective?
- Practical?
- Ethical?

Focusing on outcomes (13/17)
DETAILS OF DATA COLLECTION
Do you think you will be able to use the results obtained with the selected methods according to the purposes and intended uses that you defined earlier in the process?

Focusing on outcomes (14/17)
DETAILS OF DATA COLLECTION

Evaluation purposes                | Findings’ primary intended uses
Formative improvement and learning | To improve the program that is the subject of the evaluation.
Knowledge generation               | To identify patterns of effectiveness.
Program development                | To adapt interventions to emerging conditions.

This is just an example; please adapt it to your particular scenario.

Focusing on outcomes (15/17)
DETAILS OF DATA COLLECTION
Who will do the data collection?
How will you, as primary users, be involved in the data collection?

Focusing on outcomes (16/17)
DETAILS OF DATA COLLECTION
Will the data collection be based on a sample?
How do you think the sampling should be done? Who will do it?
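If the primary users decide to sample, a minimal sketch of a simple random sample drawn from a participant list (Python; the participant IDs, sample size, and seed are hypothetical):

    import random

    participants = ["P01", "P02", "P03", "P04", "P05", "P06", "P07", "P08"]

    rng = random.Random(42)                 # fixed seed so the draw can be reproduced and audited
    sample = rng.sample(participants, k=4)  # simple random sample, without replacement
    print(sample)

Other designs (stratified, purposive) may suit the KEQs better; the choice belongs to the primary users.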

Focusing on outcomes (17/17)
DETAILS OF DATA COLLECTION
Who will manage and analyze the collected data?
How will you, as primary users, be involved in data management and analysis?

Conclusions and next steps

Conclusions and next steps (for the evaluator only)

References
Patton, M.Q. (2008). Utilization-Focused Evaluation (4th ed.). Sage.
Dart, J. (2007). “Key evaluation questions”. Presentation at the Evaluation in Practice Workshop, Kuala Lumpur, December.