A DESIGN OF THE METAEVALUATION MODEL
CES 20th Annual Conference, Taejon, Korea | Keun-bok Kang (CNU) & Chan-goo Yi (ETRI)

Presentation transcript:

A DESIGN OF THE METAEVALUATION MODEL
16 May
Keun-bok Kang, Professor
Chungnam National University (CNU), Taejon, Korea
Chan-goo Yi, Principal Researcher
Electronics and Telecommunications Research Institute (ETRI), Taejon, Korea

CONTENTS
1. Introduction
2. Definition of Metaevaluation
3. Previous Researches on Metaevaluation
4. Building the Metaevaluation Model
   4-1. Component Factors of Metaevaluation
   4-2. Design of the Metaevaluation Model

1. Introduction
▣ Purpose of this study
- to design a metaevaluation model that can be used in practice
- in order to achieve this:
  review the previous approaches to metaevaluation (especially the component factors of metaevaluation)
  identify the component factors of metaevaluation
  suggest the metaevaluation model

2. Definition of Metaevaluation
▣ Metaevaluation is usually understood as the evaluation of an evaluation.
▣ Metaevaluation is defined differently according to how the nature and purpose of policy evaluation are understood.
▣ Policy evaluation:
- (purpose of policy evaluation) to produce valid and useful policy-relevant information for stakeholders, in order to decide on making a policy or changing an existing policy, to improve policy implementation, and to identify the responsibility of participants in the policy process

▣ Definition of metaevaluation:
- evaluation of an evaluation and of the evaluation system, during or after the evaluation, in terms of the evaluation paradigm, evaluation resources, evaluation process, evaluation performance and evaluation utilization
▣ Purpose of metaevaluation:
- to value the evaluation (system): summative evaluation
- to improve the evaluation (system): formative evaluation
- to promote the utilization of the evaluation: summative / formative evaluation

3. Previous Researches on Metaevaluation (especially the components of metaevaluation)
Metaevaluation is an evaluation which examines the value and merits of policy evaluation itself (Stufflebeam, 1981)
- when policy evaluation is understood as examining the value and merits of policy activities
- metaevaluation is a process of classifying, acquiring and utilizing descriptive and judgmental information on the effectiveness, practicality, ethicality and technical appropriateness of certain evaluation activities
- the purpose of metaevaluation is to guide evaluation activities and to report openly on the merits and demerits of the evaluation itself

Metaevaluation is examining the validity of the evaluation design, data collection and analysis
- when the purpose of evaluation is understood as producing valid information about policy results
- metaevaluation is focused on the technical or structural adequacy of the evaluation (for example, Cook & Gruder, 1978; Chelimsky, 1987; Smith & Hauer, 1990; Greene, 1992)

Metaevaluation is assessing the usefulness of the evaluation
- when the purpose of evaluation is understood as providing useful information to stakeholders (policymakers, decision makers, program managers, clients, etc.)
- metaevaluation is focused on assessing evaluation utilization (for example, Mackay, 1992)

Metaevaluation is checking the reliability of the evaluative judgements (evaluation results), and whether the evaluation process ensures the reduction of bias
- when evaluation is understood as making unbiased evaluative judgements of the merits or worth of policy results
- for example, Scriven suggests his 'Metaevaluation Checklist' (not identified; quoted from Rogers, 1995)

Larson & Berliner's Standards
▣ Larson & Berliner (1983) classified three component factors of metaevaluation, and further divided each factor into detailed evaluation items
- factors of evaluation input: resources and techniques used in the evaluation; environment
- factors of evaluation process: actual evaluation activities carried out in accordance with the evaluation plan
- factors of evaluation outcome: number of decisions affected by the evaluation

Larson & Berliner's Standards (continued)
- this research was the first systematic approach to metaevaluation
- but they did not consider evaluation purposes and types
- they suggested 23 evaluation items for metaevaluation, but some items overlapped and the classification of the items is somewhat inappropriate
- they paid little attention to the rationality and due process of the evaluation

The Joint Committee on Standards for Educational Evaluation (1994)
- developed 30 principles (standards) for metaevaluation, grouped as: Utility, Feasibility, Propriety, Accuracy
- these standards are a useful checklist for metaevaluation, but the committee does not suggest a satisfactory framework for metaevaluation (little guidance on how to plan an evaluation so that it meets these standards)
- these standards still need to be checked for applicability in other countries and to other types of program (Rogers, 1995)
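A minimal sketch of how such a standards-based checklist might be operationalized, assuming a simple 1-to-5 rating per item; the four group names come from the slide above, while the rating scale, the function name overall_rating and the example scores are illustrative assumptions, not part of the Joint Committee's instrument.

```python
# Illustrative sketch only (not the Joint Committee's actual instrument).
# The 30 individual standards are not listed on the slide, so only the four
# groups are modelled here; the 1-5 rating scale is an assumption.
from statistics import mean

GROUPS = ("Utility", "Feasibility", "Propriety", "Accuracy")

def overall_rating(group_scores: dict) -> dict:
    """Average the 1-5 item ratings recorded under each standards group."""
    return {g: mean(group_scores[g]) if group_scores.get(g) else 0.0 for g in GROUPS}

# Hypothetical ratings a metaevaluator might record for one program evaluation
print(overall_rating({
    "Utility": [4, 5],
    "Feasibility": [3],
    "Propriety": [5, 4],
    "Accuracy": [4, 4, 3],
}))
```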

Patricia Rogers' Metaevaluation Framework (1995)
▣ a framework for evaluating evaluations in terms of their intended and actual impact on the implementation of existing programs
▣ her metaevaluation framework consists of:
- 5 evaluation criteria for the intermediate outcomes of program evaluation
  producing valid information
  providing useful information to decision-makers
  producing unbiased judgements of merit or worth
  involvement & illumination of relevant stakeholders
  empowerment of the intended program clients

Patricia Rogers' Metaevaluation Framework (1995) (continued)
- the evaluation's impact on the processes of implementing existing programs or replacing existing programs with alternatives
- the evaluation's contribution to the development of programs which meet needs
▣ this framework is very systematic, but its evaluation object is limited to existing programs that are being implemented

4. Building the Metaevaluation Model
4-1. Component Factors of Metaevaluation
- evaluation paradigm
- evaluation resources
- evaluation process
- evaluation performance
- evaluation utilization

① Evaluation Paradigm
- evaluation purpose : rationality
- evaluation type : appropriateness
- evaluation object : suitability
  (different levels and ranges of the evaluation object may lead to different evaluation results)

② Evaluation Resources
- evaluation manpower
  quality and quantity of evaluators
  user involvement
- evaluation organization
  structural and functional appropriateness of the evaluation organization
- evaluation budget and information
  appropriateness of the evaluation budget
  quality (adequacy, reliability, etc.) and quantity of information

③ Evaluation Process
- evaluation procedure : objectivity and fairness
- timing of evaluation : fitness to the evaluation type, etc.
- evaluation methodology : accuracy and validity
- evaluation criteria : appropriateness
- evaluation indicators : rationality

④ Evaluation Performance
- evaluation outcome : validity
- evaluation information : usefulness

⑤ Evaluation Utilization
▣ preparation and dissemination of the evaluation report
- clarity and impartiality of the report
- timeliness and dissemination of the report
▣ utilization of evaluation results
- instrumental utilization
  contribution to improving the implementation of the existing program, to changing the existing program, or to developing a new program
- conceptual utilization
  enlightenment of policy

4-2. Design of the Metaevaluation Model

component factor          item of metaevaluation      criteria
evaluation paradigm       purpose                     rationality
                          type                        appropriateness
                          object                      suitability (level and range)
evaluation resources      manpower                    quality / quantity; user involvement
                          organization                structural and functional appropriateness
                          budget                      appropriateness
                          information                 quantity and quality (adequacy, reliability, etc.)
evaluation process        procedure                   objectivity and fairness
                          timing                      fitness to evaluation type, etc.
                          methodology                 accuracy and validity
                          criteria                    appropriateness
                          indicators                  rationality
evaluation performance    outcome                     validity
                          information                 usefulness
evaluation utilization    report                      clarity and impartiality; timeliness and dissemination
                          instrumental utilization    improving & changing the existing program; developing the new program
                          conceptual utilization      enlightenment of policy
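Read as a data structure, the model is a nested checklist: each component factor contains items, and each item is judged against its criteria. The Python sketch below is a minimal illustration of that structure, not part of the original paper; the Item and ComponentFactor class names, the optional 1-to-5 rating field and the averaging helper are assumptions added for clarity, while the factor, item and criterion names are taken from the table above.

```python
# Illustrative encoding of the metaevaluation model (sketch, not the authors' code).
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Item:
    name: str
    criteria: list                 # criteria the metaevaluator judges the item against
    rating: Optional[int] = None   # assumed 1 (poor) .. 5 (excellent); not in the original model

@dataclass
class ComponentFactor:
    name: str
    items: list = field(default_factory=list)

    def mean_rating(self) -> Optional[float]:
        """Average the ratings of items that have been scored (assumed aggregation rule)."""
        scored = [i.rating for i in self.items if i.rating is not None]
        return sum(scored) / len(scored) if scored else None

METAEVALUATION_MODEL = [
    ComponentFactor("evaluation paradigm", [
        Item("purpose", ["rationality"]),
        Item("type", ["appropriateness"]),
        Item("object", ["suitability (level and range)"]),
    ]),
    ComponentFactor("evaluation resources", [
        Item("manpower", ["quality / quantity", "user involvement"]),
        Item("organization", ["structural and functional appropriateness"]),
        Item("budget", ["appropriateness"]),
        Item("information", ["quantity and quality (adequacy, reliability, etc.)"]),
    ]),
    ComponentFactor("evaluation process", [
        Item("procedure", ["objectivity and fairness"]),
        Item("timing", ["fitness to evaluation type, etc."]),
        Item("methodology", ["accuracy and validity"]),
        Item("criteria", ["appropriateness"]),
        Item("indicators", ["rationality"]),
    ]),
    ComponentFactor("evaluation performance", [
        Item("outcome", ["validity"]),
        Item("information", ["usefulness"]),
    ]),
    ComponentFactor("evaluation utilization", [
        Item("report", ["clarity and impartiality", "timeliness and dissemination"]),
        Item("instrumental utilization", ["improving & changing the existing program",
                                          "developing the new program"]),
        Item("conceptual utilization", ["enlightenment of policy"]),
    ]),
]
```

In use, a metaevaluator would assign ratings to the items of a concrete evaluation and compare mean_rating() across factors; the model itself only prescribes the factors, items and criteria.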