Types of Evaluation.

Different types of evaluation

Formative evaluations examine the delivery of the program or technology, the quality of its implementation, and the organizational context, personnel, procedures, inputs, etc.:
a. Needs assessment
b. Process evaluation

Summative evaluations describe what happens subsequent to delivery of the program or technology: assessing whether the program can be said to have caused the outcome, determining the overall impact of the causal factor beyond the immediate target outcomes, and estimating the relative costs associated with the program:
c. Impact evaluation
d. Cost-benefit analysis

Needs assessment should provide:
- A clear sense of the target population: students who are not responding to other inputs; students who are falling behind
- A clear sense of the need the program will fill: What are teachers lacking? How should it be delivered? How much? What are the potential barriers?
- A clear articulation of the program's benefits: Is a wrong being righted? Is a right being expanded?
- A clear sense of the alternatives: Is this the most effective, efficient, cost-effective method of meeting teacher and student needs?

Tools: focus group discussions, structured and unstructured surveys. The implementer should know what it wants the program to do, why, and for whom.

Process evaluation
- Are the services being delivered? Is the money being spent? Are textbooks reaching the classroom and being used?
- Can the same service be delivered at lower cost? For example, by substituting cheaper inputs for expensive ones.
- Are the services reaching the right population? Are the books reaching students? Which students?
- Are the clients satisfied with the service? Teachers' and students' response to the teaching method.

Tools/resources: administrative data, surveys, group discussions.

Impact evaluation
The program happened: how did it change lives?
- What does the theory of change (ToC) say we might expect to change? Intermediate indicators and final outcomes.
- Primary question: did textbooks cause children to learn more?
- Secondary questions: distributional questions (who learned more?); if there are several treatments, what was the best program design?

How does impact evaluation differ from process evaluation?
- To answer a process question, we describe what happened. This can be done by reading documents, interviewing people, consulting administrative records, etc.
- To answer an impact question, we must compare what happened with what would have happened without the program. There are various ways to get at this, but all of them have in common that they must re-create what did not happen: the counterfactual.

Impact evaluation techniques
- Experimental evaluation: assignment to treatment is random.
- Quasi-experimental: multiple waves of data or multiple groups are available, but treatment assignment is not random.
- Non-experimental: only a single snapshot measurement is available.

[Blackboard demo: the problem of constructing a counterfactual.]
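The counterfactual problem can be made concrete with a small simulation. This is an illustrative sketch only: the student population, the +5-point "true effect," and the self-selection rule are all invented for demonstration, not drawn from any real evaluation.

```python
# Hypothetical simulation: why random assignment recovers the
# counterfactual while self-selection does not. All numbers invented.
import random

random.seed(0)

N = 10_000
# Each student's test score without the program; assume the program
# would add a true effect of +5 points (a made-up number).
baseline = [random.gauss(50, 10) for _ in range(N)]
TRUE_EFFECT = 5.0

def mean(xs):
    return sum(xs) / len(xs)

def diff_in_means(assigned):
    """Observed score gap between treated and untreated students."""
    observed = [b + TRUE_EFFECT if t else b
                for b, t in zip(baseline, assigned)]
    treated = [y for y, t in zip(observed, assigned) if t]
    control = [y for y, t in zip(observed, assigned) if not t]
    return mean(treated) - mean(control)

# Experimental: treatment assigned by coin flip.
randomized = [random.random() < 0.5 for _ in range(N)]
exp_estimate = diff_in_means(randomized)

# Non-experimental: stronger students opt in, so the naive comparison
# mixes the true effect with selection bias.
self_selected = [b > 55 for b in baseline]
naive_estimate = diff_in_means(self_selected)

print(f"true effect:          {TRUE_EFFECT:.1f}")
print(f"randomized estimate:  {exp_estimate:.1f}")   # close to the truth
print(f"self-selected naive:  {naive_estimate:.1f}") # badly inflated
```

Because the randomized control group re-creates "what did not happen" for the treated group, the difference in means is an unbiased estimate of the effect; the self-selected comparison is not.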

Evaluation and cost-benefit analysis
- Needs assessment gives you the metric for defining the cost-benefit ratio.
- Process evaluation gives you the costs of all the inputs.
- Impact evaluation gives you the quantified benefits.
- Identifying alternatives allows for comparative cost-benefit analysis.
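The four pieces above combine into a simple comparative cost-effectiveness calculation. The sketch below uses entirely invented program names, costs, and impact estimates; it only illustrates how cost data (from process evaluation) and benefit data (from impact evaluation) feed a comparison.

```python
# Hypothetical comparative cost-effectiveness: dollars spent per
# standard deviation (SD) of test-score gain. All figures invented.
programs = {
    # cost per child ($); impact in test-score SDs (from impact evaluation)
    "textbooks":     {"cost": 4.00,  "impact_sd": 0.02},
    "extra_teacher": {"cost": 10.00, "impact_sd": 0.04},
    "deworming":     {"cost": 0.50,  "impact_sd": 0.05},
}

def cost_per_sd(p):
    # Process evaluation supplies the cost; impact evaluation the benefit.
    return p["cost"] / p["impact_sd"]

ranked = sorted(programs, key=lambda name: cost_per_sd(programs[name]))
for name in ranked:
    print(f"{name}: ${cost_per_sd(programs[name]):.0f} per SD gained")
```

With these made-up numbers, deworming comes out cheapest per SD gained, mirroring the kind of comparative finding the deck cites.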

Example: comparative cost-benefit analysis. In comparison to other programs, deworming has been found to be the most cost-effective.

Linking back to the objectives for evaluation

Accountability
- Did we do what we said we were going to do? Process evaluation determines whether the books were delivered and used.
- Did we have a positive impact on people's lives? Impact evaluation of the link between books and test scores.

Lesson learning
- Do particular programs work or not? Impact evaluations of similar programs in different situations.
- What is the most effective route to achieve a certain outcome? Cost-benefit analysis comparing several programs.
- Are there similarities in successful strategies, for example in changing behavior, even across fields? Linking results back to theory.

The goal is reduced poverty through more effective programs, with future decisions based on lessons learned. Solid, reliable impact evaluations are the building blocks for more general lesson learning.

Things to be very clear about
a. Validity: internal and external
b. Study ethics

a. Internal validity
Internal validity reflects how well the study was run: the research design, the operational definitions used, how variables were measured, and what was or wasn't measured. In the case of an impact evaluation, it is how confidently one can conclude that the change in the dependent variable was produced solely by the independent variable and not by extraneous ones.

a. External validity
External validity is the extent to which a study's results (regardless of whether the study is descriptive or experimental) can be generalized or applied to other people or settings. Typically, group research employing random selection and assignment will initially possess higher external validity than studies that do not (e.g., case studies and single-subject experimental research).

b. Ethical principles
- Voluntary participation
- Informed consent
- Risk of harm
- Confidentiality
- Institutional Review Boards: enforce established procedures so that researchers consider all relevant ethical issues when formulating research plans.