Chapter 6: Program-Oriented Approaches

Chapter 6: Program-Oriented Approaches
Presentation by Jay Kerstetter, Amanda Brown, Jody Yoos, & Jane Lightner J.K.

Orienting Questions: #1 What are the key concepts of the objectives-oriented evaluation approach? #2 How has this approach influenced evaluation?

What is it? The objectives-oriented evaluation approach helps determine whether some or all of a program's objectives have been achieved and, if so, how well. Evaluators may work with stakeholders to establish whether program objectives are being met. Information from this approach can assist in deciding whether to maintain, terminate, or change the program. J.K.

Tylerian Evaluation Approach
1. Ralph Tyler is credited with initiating this approach in the 1930s, when he began to formulate his views on education and evaluation.
2. His approach includes the following steps:
   - Establish goals or objectives
   - Classify the goals or objectives
   - Define objectives in behavioral terms
   - Identify situations in which achievement of objectives can be shown
   - Develop or select measurement techniques
   - Collect performance data
   - Compare performance data with the behaviorally stated objectives (see the sketch below)
3. This approach was readily adoptable by evaluators and had great influence on evaluation theorists. J.K.
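
A minimal sketch in Python of Tyler's final step: comparing collected performance data against behaviorally stated objectives. The objective names, target thresholds, and observed scores here are hypothetical illustrations, not taken from the chapter.

```python
# Tyler's final step: compare collected performance data against
# behaviorally stated objectives. All names and numbers are hypothetical.

objectives = {
    # objective -> minimum proportion of students expected to demonstrate it
    "reads_grade_level_text_aloud": 0.80,
    "solves_two_step_word_problems": 0.75,
}

performance = {
    # objective -> observed proportion of students demonstrating it
    "reads_grade_level_text_aloud": 0.84,
    "solves_two_step_word_problems": 0.61,
}

for objective, target in objectives.items():
    observed = performance[objective]
    status = "met" if observed >= target else "not met"
    print(f"{objective}: target {target:.0%}, observed {observed:.0%} -> {status}")
```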

Provus's Discrepancy Evaluation Model
Developed by Malcolm Provus, who viewed evaluation as a continuous information-management process. Provus drew key characteristics of his proposal from Tyler's work.
His process, the Discrepancy Evaluation Model (DEM), is broken into four developmental stages, with an optional fifth:
1. Definition
2. Installation
3. Process
4. Product
5. Cost-benefit analysis (optional)
The DEM was designed to facilitate the development of programs in large public school systems and was later applied to statewide evaluations by federal bureaus. It was one of the earliest approaches to evaluation, and elements of it can still be found in many evaluations today. A sketch of its core comparison follows. J.K.
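
A minimal sketch of the comparison at the heart of the DEM: at each developmental stage, observed performance is compared against a standard, and any discrepancy feeds a decision about the program. The stage names follow the slide; the standards and observations are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class StageCheck:
    stage: str      # one of: Definition, Installation, Process, Product
    standard: str   # what the program design says should be true
    observed: str   # what the evaluator actually found

    def has_discrepancy(self) -> bool:
        # A discrepancy exists whenever performance departs from the standard.
        return self.standard != self.observed

checks = [
    StageCheck("Installation", "all staff trained before launch",
               "80% of staff trained before launch"),
    StageCheck("Process", "tutoring sessions held weekly",
               "tutoring sessions held weekly"),
]

for check in checks:
    if check.has_discrepancy():
        print(f"{check.stage}: discrepancy found -> revise the program or the standard")
    else:
        print(f"{check.stage}: performance matches the standard")
```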

Orienting Question #3: How is the objectives-oriented evaluation approach used today?
- Standards-based testing
- Accountability in education
- Performance-monitoring systems used in many government programs
- Many refinements to the approach since the 1930s J.Y.

Ralph W. Tyler: Tylerian Evaluation Approach
- Influenced the Elementary and Secondary Education Act (ESEA) of 1965, the first act to require evaluation of educational programs
- Started the National Assessment of Educational Progress (NAEP), the only means of examining educational achievement across all 50 states J.Y.

Malcolm Provus: Provus's Discrepancy Evaluation Model
- Approach evaluated in Pittsburgh public schools
- Viewed evaluation as a continuous information-management process designed to serve as "the watchdog of program management" and the "handmaiden of administration in the management of program development through sound decision making" J.Y.

Orienting Question #4a: How are logic models used in evaluation?
- Developed as an extension of objectives-oriented evaluation
- Designed to fill in the steps between the program and its objectives
- Program planners and evaluators identify the program's:
  - Inputs
  - Activities
  - Outputs (immediate program impacts)
  - Outcomes (long-term objectives/goals)
A sketch of this structure follows. J.Y.
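
A minimal sketch of a logic model as a plain data structure, tracing the chain from inputs through activities and outputs to outcomes. The after-school-tutoring entries are hypothetical illustrations.

```python
# A logic model fills in the steps between a program and its objectives.
# All entries below are hypothetical.

logic_model = {
    "inputs":     ["funding", "trained tutors", "classroom space"],
    "activities": ["recruit students", "run twice-weekly tutoring sessions"],
    "outputs":    ["120 students tutored", "60 sessions delivered"],      # immediate impacts
    "outcomes":   ["improved reading scores", "higher graduation rates"], # long-term goals
}

# Reading the model left to right shows the assumed chain from resources to
# long-term goals; evaluators can choose which link to examine at a given time.
for stage in ("inputs", "activities", "outputs", "outcomes"):
    print(f"{stage}: {', '.join(logic_model[stage])}")
```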

Today, logic models are used in program planning and evaluation to:
- Help program staff articulate and discuss how the program might achieve its goals
- Determine which elements are important to evaluate at any given time
- Build internal evaluation capacity and encourage staff to think in an evaluative way
Example organizations: United Way of America, W.K. Kellogg Foundation, Annie E. Casey Foundation J.Y.

Orienting Question #4b: How are program theories used in evaluation?
Theory-based evaluation is used by evaluators to:
- Gain a better understanding of the program
- Better define the evaluation questions the study should address
- Aid their choices of what concepts to measure and when to measure them
- Improve their interpretation of results and their feedback to stakeholders, to enhance use J.Y.

Steps in Theory-Based Evaluation
1. Engage relevant stakeholders
2. Develop a first draft of the program theory (evaluator or evaluation team)
3. Present the draft to stakeholders for further discussion, reaction, and input
4. Conduct a plausibility check
5. Communicate findings to key stakeholders
6. Probe the model's arrows for specificity (see the sketch below)
7. Finalize the program impact theory J.Y.
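
A minimal sketch representing a program impact theory as a set of arrows, that is, assumed causal links, which the evaluator probes for specificity as in step 6 above. The tutoring-program links are hypothetical.

```python
# A program impact theory as a list of arrows: (cause, effect) pairs.
# Probing an arrow means asking what evidence would show the link holds.
# All links below are hypothetical.

program_theory = [
    ("tutoring sessions", "increased practice time"),
    ("increased practice time", "improved reading skills"),
    ("improved reading skills", "higher test scores"),
]

for cause, effect in program_theory:
    print(f"Arrow: {cause} -> {effect}")
    print("  Probe: what should be measured, and when, to test this link?")
```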

Theory-Driven Evaluation
- Work with stakeholders to identify the key questions to be answered in the evaluation and the appropriate designs and methods for answering those questions
- Emphasis is on testing the program model
- Provides guidance as to what to measure and when to measure it
- The selection of evaluation questions depends on the stage of the program and what stakeholders hope to learn
- Provides the evaluator with critical information that can be used throughout the evaluation J.Y.

Orienting Question #5: How do theory-based evaluation and objectives-oriented approaches differ?
Objectives-oriented:
- The objectives are identified by the group seeking the evaluation
- The purposes of the activity are specified, and the evaluation determines whether those purposes are being achieved
- Individual credited: Ralph W. Tyler
Theory-based:
- The evaluator discusses the goals, basics, and objectives of the program with the stakeholders
- The evaluator determines how the program should work and then sees whether it works that way
- Huey Chen and Leonard Bickman helped develop theory-based approaches
- Science-based and quantitative A.B.

Question #6: What are the strengths and limitations of program-oriented evaluation approaches?
Strengths:
- Objectives-oriented: the simplicity of the concept makes the approach easy to understand, follow, and implement
- Face validity: the evaluator is held accountable for evaluating exactly what was asked; stakeholders want to see whether the program is working based on their own goals and objectives A.B.

Strengths (cont'd):
- Theory-oriented: the chance for the evaluator to engage in dialogue with the stakeholders is a strength, because it expands their knowledge of the topic. This gives the evaluator a clear understanding of the program, so they know how to evaluate it properly. A.B.

Weaknesses:
- Objectives-oriented: the evaluator can have a single-minded focus on the objectives, causing them to overlook the complications, elements, and factors contributing to the program's success or failure. The approach does not ask the evaluator to understand the context in which the program operates, which may itself be affecting that success or failure. The evaluator may also ignore the actual value of the objectives: since the evaluator is told what the objectives are, they are not asked to judge whether those objectives even fit the program. A.B.

Weaknesses (cont'd):
- Theory-oriented: like the objectives-oriented approach, this can cause the evaluator to ignore important aspects of the program, because the evaluator is focused on the theory of how it should run rather than on how it is actually running. Evaluators may ignore the needs or values of stakeholders involved with the program. The approach may also oversimplify the program's complexity, making it feel easier to evaluate than it really is, because not all surrounding factors are accounted for in the process. A.B.

Orienting Question #7: What is goal-free evaluation?
- Rationale: "Goals should not be taken as givens."
- Developed by Scriven (1972), who believes the most important function of goal-free evaluation is to reduce the bias that comes from knowing the program's goals
- Thus, it increases objectivity in judging the program as a whole J.L.

Orienting Question #8: What does it teach us about conducting an evaluation?
- Goals can act as "blinders," causing us to miss important outcomes not related to the goals
- Goal-free evaluation was proposed primarily to identify the unanticipated side effects that an objectives-oriented evaluation might miss J.L.

Major Characteristics of a Goal-Free Evaluation
1. The evaluator purposely avoids becoming aware of the program's goals.
2. Predetermined goals are not permitted to narrow the focus of the evaluation study.
3. Goal-free evaluation focuses on actual outcomes rather than intended program outcomes.
4. The goal-free evaluator has minimal contact with the program manager and staff.
5. Goal-free evaluation increases the likelihood that unanticipated side effects will be noted. J.L.

Goal-Directed + Goal-Free = Can Work Together
Internal goal-directed evaluator vs. external goal-free evaluator:
- Goal-directed: How well is the program meeting its goals? Provides information to the administrator.
- Goal-free: What does the program do? Looks at ALL the program's outcomes, intended or not. J.L.

Information taken from Program Evaluation: Alternative Approaches and Practical Guidelines