 Evaluators are faced not only with methodological challenges but also with ethical challenges on a daily basis.

 Ethics in program evaluation refers to ensuring that the actions of the program evaluator do not cause harm, or create the potential for harm, to program participants, vested stakeholders, or the greater community.

 Established in 1975, the Joint Standards Committee was created to develop a set of standards to ensure the highest quality of program evaluation in educational settings.

 The Joint Standards Committee is made up of several member organizations. The American Evaluation Association (AEA) is one of those contributing organizations and sends delegates to Joint Standards Committee meetings.

 The standards are broken down into five main areas: 1) Utility, 2) Feasibility, 3) Propriety, 4) Accuracy, and 5) Evaluation Accountability. Use the link below to access the standards: American Evaluation Association. (n.d.). Program Evaluation Standards. Retrieved from

 The utility standards are intended to increase the likelihood that stakeholders will find both the process and the products of the evaluation valuable.

 The feasibility standards are intended to ensure that the evaluation is realistic and efficient, conducted with appropriate project management techniques and an appropriate use of resources.

 The propriety standards are designed to support what is fair, legal, and right in program evaluation.

 The accuracy standards are intended to ensure that evaluations are both dependable and truthful in their data collection and findings.

 The evaluation accountability standards call for both rigorous documentation of evaluations and the use of internal and external meta-evaluations in order to improve the ongoing processes and products associated with evaluation.

 An evaluation approach is the process by which the evaluator goes about collecting data.

 Objective-based Evaluation
o Most evaluation today is objective-based
o Evaluation objectives are aligned with program goals
o Typically there are 5-8 evaluation objectives
o After aligning the evaluation objectives to the program goals, the evaluator sets out to document the degree to which those goals have been accomplished
o Uses evaluation matrices and logic models

 The evaluation matrix is a very effective tool for evaluators to use
 Provides a plan or “blueprint” for the evaluation
 A table in which the evaluator delineates the evaluation objectives, tools, timeline, and stakeholders
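To make the idea concrete, here is a minimal sketch of an evaluation matrix as a simple data structure; the objectives, tools, timelines, and stakeholders below are invented placeholders, not drawn from any particular program.

```python
# A hypothetical evaluation matrix: each row links one evaluation
# objective to the data-collection tool, timeline, and stakeholders
# responsible for it. All entries are invented placeholders.
evaluation_matrix = [
    {
        "objective": "Document change in office referrals",
        "tool": "School discipline records",
        "timeline": "Quarterly",
        "stakeholders": ["Principal", "Program coordinator"],
    },
    {
        "objective": "Assess student satisfaction with the program",
        "tool": "Student survey",
        "timeline": "End of each semester",
        "stakeholders": ["Students", "Evaluator"],
    },
]

# Print the matrix as a simple "blueprint" table.
for row in evaluation_matrix:
    print(f"{row['objective']:<45} | {row['tool']:<25} | "
          f"{row['timeline']:<22} | {', '.join(row['stakeholders'])}")
```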

 A benchmark is a quantitative delineation of a program goal.
o Example benchmark: Students who participate in the after-school program will show a 20% decrease in office referrals each year.
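As a worked illustration of how such a benchmark might be checked against collected data (the referral counts below are invented for the example):

```python
def percent_decrease(baseline: float, current: float) -> float:
    """Percentage decrease from baseline to current (positive = a decline)."""
    return (baseline - current) / baseline * 100

# Hypothetical referral counts for one program year.
baseline_referrals = 120  # office referrals in the year before the program
current_referrals = 90    # office referrals after one year of the program

decrease = percent_decrease(baseline_referrals, current_referrals)
benchmark = 20.0  # the benchmark: a 20% decrease per year

print(f"Observed decrease: {decrease:.1f}%")  # 25.0%
print("Benchmark met" if decrease >= benchmark else "Benchmark not met")
```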

 Allows the evaluator to keep close track of the “linkages” between program goals and outcomes
 However, this approach can bias evaluators because the program goals are so clearly defined

 Goal-free evaluation is an unorthodox approach to program evaluation
 The evaluator is not told the program’s stated goals
 The evaluator conducts observations and collects data to determine what the program’s goals appear to be, based on the evidence

 Evaluators are not biased by the stated goals of the program
 However, the current emphasis on accountability and outcomes makes goal-free evaluation difficult to implement while keeping the program in compliance with its funding source

 In decision-based evaluation, the evaluator and the data collection methods are driven not by objectives but by “burning” questions.
 These questions are usually posed by a decision-making body or administrative group, not by participants or those most “affected” by the programming.

 The CIPP approach is the most noted decision-based approach
 It uses both formative and summative evaluation data within a prescribed framework.
 Four steps or phases guide the evaluation process: Context, Input, Process, and Product.

 Context is the first component of the CIPP model. In this phase the evaluator focuses on studying the context or situation in which the program will take place.

 What do teachers and staff think this program needs to address?
 What do teachers, staff, and the greater school community believe are the underlying elements of students’ behavior issues during the school day?
 What is not working in our building’s current student behavior program?

 The second component of the model is input. The input phase allows the evaluator to examine the relationship between the resources available (e.g., money, staff, equipment) and the program’s proposed activities.
 The question that must be answered at this juncture is: Will the current budget/funding support the proposed activities?
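A toy sketch of that input-phase question as a simple feasibility check; the budget and cost figures are invented for illustration.

```python
# Hypothetical input-phase check: do available resources cover the
# program's proposed activities? All figures are invented.
available_budget = 50_000  # dollars

proposed_activities = {
    "Staff training": 12_000,
    "After-school tutoring staff": 30_000,
    "Materials and equipment": 6_000,
}

total_cost = sum(proposed_activities.values())
shortfall = total_cost - available_budget

if shortfall > 0:
    print(f"Funding gap of ${shortfall:,}: scale back activities or seek additional funds.")
else:
    print(f"Budget supports the proposed activities (${-shortfall:,} remaining).")
```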

 Process evaluation is the third component of the CIPP model. In this component the evaluator asks the question: Are we doing the program as planned?

 Product evaluation is the fourth and final component of the CIPP model.
 It focuses evaluation efforts on the final outcomes of the program and on determining whether the program met its stated goals and objectives.
 This component is primarily summative evaluation and answers the question: Was the program successful?
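Putting the four components together, the framework can be summarized in a small structure pairing each phase with the guiding question it answers, as paraphrased from the slides above (a sketch, not a formal instrument):

```python
# The four CIPP components, each paired with the guiding question it
# answers (paraphrased from the description above) and whether the
# phase is primarily formative or summative.
CIPP_MODEL = {
    "Context": {
        "question": "What needs and conditions should the program address?",
        "role": "formative",
    },
    "Input": {
        "question": "Will the current budget/funding support the proposed activities?",
        "role": "formative",
    },
    "Process": {
        "question": "Are we doing the program as planned?",
        "role": "formative",
    },
    "Product": {
        "question": "Was the program successful?",
        "role": "summative",
    },
}

for phase, info in CIPP_MODEL.items():
    print(f"{phase:<8} ({info['role']}): {info['question']}")
```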

 In participatory evaluation, the evaluator “teaches” stakeholders to evaluate the program (or aspects of it) that serves them.
 For example, students collecting data to evaluate their after-school program (Youth Participatory Evaluation; Flores, 2008).

 Empowers underrepresented groups
 Provides a unique perspective on the data, the program, and the evaluation process that an external evaluator would not be able to “capture”

 Participatory evaluators may stray from the goals of the program or from collecting data that is of critical interest to the funding source
 Participatory evaluators may lack the technical expertise to collect, analyze, and interpret data, so data validity could be compromised

 In consumer-oriented evaluation, the evaluator’s role is to develop or select the criteria that will be used to judge the program or product.
 Scriven also believed that the purpose of this approach was to present the evaluation findings and let consumers (as well as potential consumers) make the final decision about whether or not to use the program or products.

 Today’s evaluators often take a more eclectic approach
 They use “bits and pieces” of the approaches above, applying them appropriately in order to extend and support quality evaluations for funders, stakeholders, and the greater community.