Background to Program Evaluation

Origins of Program Evaluation
- Fairly recent in origin.
- Becoming more evident in Canada.
- Increasingly professional.

Evolution of Program Evaluation
- 1900-1930: Age of efficiency and testing. Evaluation and measurement were treated as much the same thing.
- 1930-1945: Age of Tyler. Ralph Tyler set out the basic principles of curriculum development and was the first to connect objectives with outcomes, asking: what should schools seek to attain?

Evolution of Program Evaluation (continued)
- 1946-1957: Age of Innocence. Emphasis on progress rather than effectiveness.
- 1957: Sputnik panic. What is wrong with education?
- 1958-1972: Age of Expansion. Development of the major evaluation theories.
- 1973-present: Age of Professionalism. Creation of the major professional organizations.

Stake - Countenance Model
- Argues the need for formalized evaluation: descriptive data is necessary, not just anecdote.
- Evaluation must include both description and judgement.
- Intents and observations are compared to standards, and a judgement is then made.

Stake - Countenance Model (continued)
- Congruence: is there agreement between what is intended and what is observed?
- Contingency: what is the relationship between the variables? Is there a logical connection between an event and its purpose?

Stufflebeam - CIPP Model
- Oriented toward decision making.
- Builds on Ralph Tyler's model of objectives and outcomes, making that approach more systematic.

Stufflebeam - CIPP
- C: Context
- I: Input
- P: Process
- P: Product

Provus - Discrepancy Model
Aims:
- Improve existing programs.
- Establish new and better programs.
- Greater accountability of educators to the public.
- Wiser decisions by administrators.
Method: each step of a program is compared to what it should be, and the discrepancy between standard and performance drives the evaluation.
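The comparison at the heart of the discrepancy model can be sketched in a few lines of Python. This is purely an illustration, not part of Provus's own formulation: the stage names, numeric ratings, and the `discrepancies` helper are all hypothetical, chosen only to show how observed performance might be checked against a stated standard at each stage.

```python
# Illustrative sketch of discrepancy-model logic (hypothetical ratings):
# each program stage has a standard (what should be) and a performance
# rating (what is); the evaluation reports the gap between them.

standards = {"design": 10, "installation": 8, "process": 9, "product": 7}
performance = {"design": 9, "installation": 8, "process": 6, "product": 7}

def discrepancies(standards, performance):
    """Return stage -> (standard - performance) for every stage."""
    return {stage: standards[stage] - performance[stage] for stage in standards}

for stage, gap in discrepancies(standards, performance).items():
    status = "meets standard" if gap <= 0 else f"discrepancy of {gap}"
    print(f"{stage}: {status}")
```

Here the "process" stage would surface a discrepancy of 3, flagging it for a decision (revise the program, revise the standard, or terminate), which is the kind of choice the model puts to administrators at each step.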

Provus - DIPPC
- D: Design (content)
- I: Installation
- P: Process
- P: Product
- C: Cost (may include cost-benefit analysis)

Scriven
- A simple approach to evaluation: goals versus roles.
- Goals: the intended outcomes of the program, its reason for being. The evaluator must genuinely study a program's goals.
- Roles: the political dimension of the evaluation; its underlying motivation.

Scriven - Formative and Summative
- Formative: gives feedback during the delivery of a program for immediate or future modification.
- Summative: evaluation at the end of a program to see whether it has been effective or has met its original goals.

Scriven - Goal-Free Evaluation
- An outside evaluator is used instead of existing staff, and is not made aware of the program's goals.
- Can be formative or summative.
- Eliminates "tunnel vision": the evaluator determines his or her own path through the process.

Levine - Adversary Model
- Quasi-judicial: involves two or more teams of evaluators.
- The teams take up opposing positive and negative viewpoints.
- Gives a very thorough look at both sides of a decision, but is expensive.

Rippey - Transactional Model
- Involves all those impacted by an issue.
- Does not begin with a particular focus; usually begins with some form of unrest, or the initial stages of a potentially damaging situation.
- Converts conflict into productive activity and goals.
- Similar to appreciative inquiry in many ways.

Rippey - Transactional Model: Stages
- Initial phase: meeting with all stakeholders to determine the issues.
- Instrumentation: collectively uncovering the sources of the conflict and creating an instrument to gauge responses to the perceived issues.
- Program development: creating ways to manage the issues.
- Program monitoring: assuming responsibility for ensuring program success.
- Recycling: feedback on all four previous stages.