Review: Introduction Define Evaluation

Review: Introduction
Define Evaluation
How do formal/informal evaluation differ?
What are two uses of evaluation in education?
What are the pros/cons of using an external evaluator?

Alternative Approaches to Evaluation Dr. Suzan Ayers Western Michigan University (courtesy of Dr. Mary Schutten)

Alternative Approaches
Stakeholders: individuals and groups who have a direct interest in, and may be affected by, the evaluation; they should be involved early, actively, and continuously
Program: activities provided on a continuing basis; typically what is evaluated
There are a variety of alternative, often conflicting, views of what evaluation is and how it should be carried out

Why so many alternatives?
The way one views evaluation directly shapes the activities and methods used
The origins of alternative models stem from differences in:
Philosophical and ideological beliefs
Methodological preferences
Practical choices

Philosophical and Ideological Beliefs
Epistemologies (philosophies of knowing):
Objectivism: empiricist social-science base; findings should be replicable
Subjectivism: experientially based; draws on tacit knowledge
Pros/cons of each?
Principles for assigning value (parallel to objectivism/subjectivism):
Utilitarian: focus on group gains (e.g., average scores); the greatest good for the greatest number
Intuitionist-pluralist: value is individually determined
Room for both, or are these dichotomous?
Philosophical purists are rare (impractical?)
Choose the methods right for THAT evaluation
Understand the assumptions and limitations of different approaches

Methodological Preferences
Quantitative (numerical) vs. qualitative (non-numerical)
Evaluation is a transdiscipline; it crosses paradigms
"Law of the instrument" fallacy: give someone a hammer and nails, and everything appears to need hammering
Identify what is useful in each evaluation approach, use it wisely, and avoid being distracted by approaches designed to address different needs

Practical Considerations
Evaluators disagree over whether the intent of evaluation is to render a value judgment
Should decision makers or the evaluator render the judgment?
Evaluators differ in their views of evaluation's political role: Authority? Responsibility? These dictate evaluation style
Influence of evaluators' prior experience
Who should conduct the evaluation, and what expertise is needed to do so?
Desirability (?) of having a wide variety of evaluation approaches

Classification Schema for Evaluation Approaches
These are conceptual approaches to evaluation, NOT techniques:
Objectives-oriented: focuses on goals/objectives and the degree to which they are achieved
Management-oriented: identifies and meets the informational needs of decision makers
Consumer-oriented: generates information to guide consumers' use of products and services
Expertise-oriented: uses professional expertise to judge the quality of the evaluation object
Participant-oriented: stakeholders are centrally involved in the process
See Figure 3.1 (p. 68)

Objectives-oriented Approach
The purposes of an activity are specified, and evaluation focuses on the extent to which those purposes are achieved
Ralph W. Tyler popularized this approach in education (criterion-referenced testing)
Tylerian models:
Metfessel and Michael's paradigm (an enlarged vision of alternative instruments for collecting evaluation data)
Provus's Discrepancy Evaluation Model (agree on standards, determine whether a discrepancy exists between performance and standards, and use the discrepancy information to decide whether to improve, maintain, or terminate the program)
Logic models: determine long-term outcomes and backtrack to today

Objectives-oriented Steps
Establish broad goals or objectives tied to the mission statement
Classify the goals or objectives
Define objectives in behavioral terms
Find situations where achievement of the objectives can be shown
Select or develop measurement techniques
Collect performance data
Compare the data with the behaviorally stated objectives
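The final comparison step can be sketched as a simple discrepancy check in the spirit of Provus's Discrepancy Evaluation Model. This is a hypothetical illustration only; the objective names and standards below are invented, not drawn from the evaluation literature.

```python
# Hypothetical sketch: comparing performance data against behaviorally
# stated objectives (Provus-style discrepancy check). The objectives and
# threshold values are invented for illustration.

def discrepancy_report(standards, performance):
    """Return the gap (performance minus standard) for each objective."""
    return {
        objective: performance[objective] - standard
        for objective, standard in standards.items()
    }

# Each objective's standard: the minimum acceptable proportion of students.
standards = {"reads at grade level": 0.80, "completes homework": 0.90}
performance = {"reads at grade level": 0.72, "completes homework": 0.93}

report = discrepancy_report(standards, performance)
for objective, gap in report.items():
    status = "meets standard" if gap >= 0 else "discrepancy"
    print(f"{objective}: {gap:+.2f} ({status})")
```

A negative gap signals a discrepancy; under the model, that information then feeds the decision to improve, maintain, or terminate the program.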

Objectives-oriented Pros/Cons
Strengths: simplicity; easy to understand, follow, and implement; produces information relevant to the mission
Weaknesses: can lead to tunnel vision
Ignores outcomes not covered by the objectives
Neglects the value of the objectives themselves
Neglects the context in which the evaluation takes place

Goal-Free Evaluation
The opposite of objectives-oriented evaluation, yet the two supplement one another
Purposefully avoids awareness of goals; goals should not be taken as given but should themselves be evaluated
Predetermined goals are not allowed to narrow the focus of the evaluation study
Focuses on actual outcomes rather than intended ones
The evaluator has limited contact with the program manager and staff
Increases the likelihood of seeing unintended outcomes

Management-oriented Approach
Geared to serve decision makers
Identifies the decisions an administrator must make
Collects data on the pros/cons of each decision alternative
Success depends on teamwork between evaluators and decision makers
A systems approach to education in which decisions are made about inputs, processes, and outputs
The decision maker is always the audience to whom the evaluation is directed

CIPP Evaluation Model (Stufflebeam)
Context Evaluation: planning decisions. What needs should be addressed? What programs already exist?
Input Evaluation: structuring decisions. What resources are available? What alternative strategies exist?
Process Evaluation: implementing decisions. How well is the plan being implemented? What are the barriers to success? Are revisions needed?
Product Evaluation: recycling decisions. What were the results? Were needs reduced? What should be done after the program has 'run its course'?
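The structure of the CIPP model, where each evaluation type informs a different class of decision, can be sketched as a small data structure. The stage names and decision types follow the slide; the `CIPPStage` class and its field names are invented here purely for illustration.

```python
# Hypothetical sketch of the CIPP model's structure: four evaluation types,
# each informing a different class of decision. The dataclass and field
# names are invented; stage content follows Stufflebeam's model.
from dataclasses import dataclass, field

@dataclass
class CIPPStage:
    name: str                 # evaluation type
    decision_type: str        # class of decision it informs
    guiding_questions: list = field(default_factory=list)

CIPP = [
    CIPPStage("Context", "planning",
              ["What needs should be addressed?", "What programs exist?"]),
    CIPPStage("Input", "structuring",
              ["What resources are available?", "What alternative strategies?"]),
    CIPPStage("Process", "implementing",
              ["How well is the plan being implemented?", "What are the barriers?"]),
    CIPPStage("Product", "recycling",
              ["What were the results?", "Continue, modify, or terminate?"]),
]

for stage in CIPP:
    print(f"{stage.name} evaluation -> {stage.decision_type} decisions")
```

Note how the first letters of the four stage names spell the model's acronym, and how the decision maker, not the evaluator, is the audience for each stage's answers.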

CIPP Steps
Focusing the evaluation
Collection of information
Organization of information
Analysis of information
Reporting of information
Administration of the evaluation (timeline, staffing, budget, etc.)

Context Evaluation (Table 5.1)
Objective: define the institutional context and target population, and assess their needs
Method: systems analysis, surveys, hearings, interviews, diagnostic tests, the Delphi technique (expert panels)
Use: for deciding on the setting to be served, the goals associated with meeting needs, and the objectives for solving problems

Input Evaluation
Objective: identify and assess system capabilities, procedural designs for implementing strategies, budgets, and schedules
Method: inventory human and material resources; assess feasibility and economics via literature review and visits to exemplary programs
Use: for selecting sources of support and solution strategies in order to structure change activities, and to provide a basis for judging implementation

Process Evaluation
Objective: identify or predict defects in the process or procedural design; record and judge procedural events
Method: monitor potential procedural barriers; continual interaction with and observation of staff activities
Use: for implementing and refining the program design and procedure (i.e., process control)

Product Evaluation
Objective: collect descriptions and judgments of outcomes, relate them to context, input, and process, and interpret their worth and merit
Methods: measure outcomes, collect stakeholder information, analyze the data
Use: for deciding whether to continue, terminate, modify, or refocus an activity, and to document its effects (whether intended or unintended)

Uses of Management-oriented Approaches to Evaluation
CIPP has been used in school districts and in state and federal government agencies
A useful guide for program improvement and for accountability
Figure 5.1 (p. 94): formative and summative aspects of CIPP

Management-oriented Pros/Cons
Strengths: appeals to many who like rational, orderly approaches; gives focus to the evaluation; allows for both formative and summative evaluation
Weaknesses: gives preference to top management; can be costly and complex; assumes important decisions can be identified in advance of the evaluation

REVIEW/Qs
Why are there so many alternative approaches to evaluation?
What two conceptual approaches to evaluation did we discuss tonight? What are their pros/cons?
Which, if either, of these approaches do you think will work for your evaluation object?
Identify your most likely evaluation object