EVAL 6000: Foundations of Evaluation Carl D. Westine & Dr. Chris L. S. Coryn December 2, 2010.



Agenda The Program Evaluation Standards (45 min) –Overview of the JCSEE and the standards –The "new" 30 standards –2nd Edition vs. 3rd Edition –Research on overlaps across standards –Activity on the 3rd Edition Stufflebeam and Coryn 2010 (10 min) Stufflebeam 2001 (10 min) Break (15 min) Activity: Hard Won Lessons (60 min)

The JCSEE The Joint Committee on Standards for Educational Evaluation was created in 1975 and currently oversees the maintenance and updating of the Program Evaluation Standards It is made up of representatives from professional societies WMU folks involved in the JCSEE: Daniel Stufflebeam, James Sanders, Arlen Gullickson Currently Daniela Schröter is becoming more involved with the revision process (our point of contact)

What are Standards? The JCSEE defines an evaluation standard as a "principle commonly agreed to by experts in the conduct and use of evaluation, that when implemented will lead to greater evaluation quality" (JCSEE, 2010, p. 292) The standards focus on North America, but other countries have been adapting or adopting them Other standards exist (GAO, the AEA Guiding Principles, etc.) Written for educational programs, but… education is relied upon by everything

Early Standards The first edition was published in 1981 –Contained 30 standards –Defined the four groups (Utility, Feasibility, Propriety, and Accuracy) Spearheaded by Dan Stufflebeam (WMU) and a collection of education and evaluation experts

Second Edition Standards The second edition was published in 1994 –Contained 30 standards –Maintained the same group structure Accredited as official standards through the American National Standards Institute (ANSI) Update led by James Sanders (WMU)

Current Happenings Transition from the 2nd Edition to the 3rd Edition The Third Edition is being published, and copies can be ordered through the JCSEE website Don Yarbrough at the University of Iowa oversees the JCSEE and the development of the newest standards

The Latest Edition The 3rd Edition took more than 5 years to develop and finalize (several delays in publishing) The next update has already begun –The JCSEE continuously takes suggestions and feedback from users –Goal: update the standards more frequently

Organization of the PES Book Applying the Standards Functional Table of Standards –Shows which standards are relevant at points along the evaluation continuum Standards –Group Overview/Scenario –Standard Statements –Rationale/Clarification (Overview) –Implementing (Guidelines) –Hazards (Common Errors) –Application(s) –Documentation

The Program Evaluation Standards The Program Evaluation Standards, written for educational evaluations, should be used for: –Guiding an evaluation effort (formative) –Assessing the quality of educational programs (summative) –Assessing the quality of an evaluation of an educational program (metaevaluation) –A tool to help policy makers understand evaluations –[Research on evaluation] (in my own opinion)

The Program Evaluation Standards There are now 5 categories totaling 30 individual standards Designed to ensure that an evaluation will… –Utility (7 -> 8): "… serve the information needs of intended users." –Feasibility (3 -> 4): "… be realistic, prudent, diplomatic, and frugal." –Propriety (8 -> 7): "… be conducted legally, ethically, and with due regard for the welfare of those involved in the evaluation, as well as those affected by its results." –Accuracy (12 -> 8): "… reveal and convey technically adequate information about the features that determine worth or merit of the program being evaluated." –Evaluation Accountability (0 -> 3): "… be well documented and held subject to internal and external evaluation" (JCSEE, 2010)
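The per-group counts on this slide can be tallied in a quick sketch (group names and counts are taken from the slide; the code is only illustrative bookkeeping):

```python
# Number of standards per group: (2nd edition, 3rd edition),
# as listed on the slide above.
counts = {
    "Utility": (7, 8),
    "Feasibility": (3, 4),
    "Propriety": (8, 7),
    "Accuracy": (12, 8),
    "Evaluation Accountability": (0, 3),
}

old_total = sum(old for old, new in counts.values())
new_total = sum(new for old, new in counts.values())
print(old_total, new_total)  # both editions contain 30 standards
```

Despite the reshuffling across groups, both editions come out to 30 standards in total.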

Utility Standards U1 Evaluator Credibility Evaluations should be conducted by qualified people who establish and maintain credibility in the evaluation context. U2 Attention to Stakeholders Evaluations should devote attention to the full range of individuals and groups invested in the program and affected by its evaluation. U3 Negotiated Purposes Evaluation purposes should be identified and continually negotiated based on the needs of stakeholders. U4 Explicit Values Evaluations should clarify and specify the individual and cultural values underpinning purposes, processes, and judgments.

Utility Standards (cont.) U5 Relevant Information Evaluation information should serve the identified and emergent needs of stakeholders. U6 Meaningful Processes and Products Evaluations should construct activities, descriptions, and judgments in ways that encourage participants to rediscover, reinterpret, or revise their understandings and behaviors. U7 Timely and Appropriate Communicating and Reporting Evaluations should attend to the continuing information needs of their multiple audiences. U8 Concern for Consequences and Influence Evaluations should promote responsible and adaptive use while guarding against unintended negative consequences and misuse.

Feasibility Standards F1 Project Management Evaluations should use effective project management strategies. F2 Practical Procedures Evaluation procedures should be practical and responsive to the way the program operates. F3 Contextual Viability Evaluations should recognize, monitor, and balance the cultural and political interests and needs of individuals and groups. F4 Resource Use Evaluations should use resources effectively and efficiently.

Propriety Standards P1 Responsive and Inclusive Orientation Evaluations should be responsive to stakeholders and their communities. P2 Formal Agreements Evaluation agreements should be negotiated to make obligations explicit and take into account the needs, expectations, and cultural contexts of clients and other stakeholders. P3 Human Rights and Respect Evaluations should be designed and conducted to protect human and legal rights and maintain the dignity of participants and other stakeholders. P4 Clarity and Fairness Evaluations should be understandable and fair in addressing stakeholder needs and purposes.

Propriety Standards (cont.) P5 Transparency and Disclosure Evaluations should provide complete descriptions of findings, limitations, and conclusions to all stakeholders, unless doing so would violate legal and propriety obligations. P6 Conflicts of Interests Evaluations should openly and honestly identify and address real or perceived conflicts of interests that may compromise the evaluation. P7 Fiscal Responsibility Evaluations should account for all expended resources and comply with sound fiscal procedures and processes.

Accuracy Standards A1 Justified Conclusions and Decisions Evaluation conclusions and decisions should be explicitly justified in the cultures and contexts where they have consequences. A2 Valid Information Evaluation information should serve the intended purposes and support valid interpretations. A3 Reliable Information Evaluation procedures should yield sufficiently dependable and consistent information for the intended uses. A4 Explicit Program and Context Descriptions Evaluations should document programs and their contexts with appropriate detail and scope for the evaluation purposes.

Accuracy Standards (cont.) A5 Information Management Evaluations should employ systematic information collection, review, verification, and storage methods. A6 Sound Designs and Analyses Evaluations should employ technically adequate designs and analyses that are appropriate for the evaluation purposes. A7 Explicit Evaluation Reasoning Evaluation reasoning leading from information and analyses to findings, interpretations, conclusions, and judgments should be clearly and completely documented. A8 Communication and Reporting Evaluation communications should have adequate scope and guard against misconceptions, biases, distortions, and errors.

Evaluation Accountability Standards E1 Evaluation Documentation Evaluations should fully document their negotiated purposes and implemented designs, procedures, data, and outcomes. E2 Internal Metaevaluation Evaluators should use these and other applicable standards to examine the accountability of the evaluation design, procedures employed, information collected, and outcomes. E3 External Metaevaluation Program evaluation sponsors, clients, evaluators, and other stakeholders should encourage the conduct of external metaevaluations using these and other applicable standards.

Utility (Old vs. New)

2nd Edition (OLD)                       | 3rd Edition (NEW)
U1 Stakeholder Identification           | U1 Evaluator Credibility
U2 Evaluator Credibility                | U2 Attention to Stakeholders
U3 Information Scope and Selection      | U3 Negotiated Purposes
U4 Values Identification                | U4 Explicit Values
U5 Report Clarity                       | U5 Relevant Information
U6 Report Timeliness and Dissemination  | U6 Meaningful Processes and Products
U7 Evaluation Impact                    | U7 Timely and Appropriate Communicating and Reporting
                                        | U8 Concern for Consequences and Influence

Feasibility (Old vs. New)

2nd Edition (OLD)        | 3rd Edition (NEW)
F1 Practical Procedures  | F1 Project Management
F2 Political Viability   | F2 Practical Procedures
F3 Cost Effectiveness    | F3 Contextual Viability
                         | F4 Resource Use

Propriety (Old vs. New)

2nd Edition (OLD)               | 3rd Edition (NEW)
P1 Service Orientation          | P1 Responsive and Inclusive Orientation
P2 Formal Agreements            | P2 Formal Agreements
P3 Rights of Human Subjects     | P3 Human Rights and Respect
P4 Human Interactions           | P4 Clarity and Fairness
P5 Complete and Fair Assessment | P5 Transparency and Disclosure
P6 Disclosure of Findings       | P6 Conflicts of Interests
P7 Conflict of Interest         | P7 Fiscal Responsibility
P8 Fiscal Responsibility        |

Accuracy (Old vs. New)

2nd Edition (OLD)                       | 3rd Edition (NEW)
A1 Program Documentation                | A1 Justified Conclusions and Decisions
A2 Context Analysis                     | A2 Valid Information
A3 Described Purposes and Procedures    | A3 Reliable Information
A4 Defensible Information Sources       | A4 Explicit Program and Context Descriptions
A5 Valid Information                    | A5 Information Management
A6 Reliable Information                 | A6 Sound Designs and Analyses
A7 Systematic Information               | A7 Explicit Evaluation Reasoning
A8 Analysis of Quantitative Information | A8 Communication and Reporting
A9 Analysis of Qualitative Information  |
A10 Justified Conclusions               |
A11 Impartial Reporting                 |
A12 Metaevaluation                      |

Evaluation Accountability (Old vs. New)

2nd Edition (OLD)                    | 3rd Edition (NEW)
A3 Described Purposes and Procedures | EA1 Evaluation Documentation
A12 Metaevaluation                   | EA2 Internal Metaevaluation
                                     | EA3 External Metaevaluation

Significant Changes Managing focus (Process Use) –Project –Information Less emphasis on “actionable” statements Evaluation Accountability –New standard group –Transparency/documentation –Metaevaluation

Significant Changes (cont.) Combining of Standards –Quant/Qual into Design –Human Rights/Interactions into Respect –Clarity/Timeliness/Dissemination into Appropriate Communication Expansion of the drawn out examples/scenarios

Using the Standards Three principles guide the use of the Program Evaluation Standards 1. Standards require adaptive, responsive, and mindful use –The user must discover how to apply them in each specific situation 2. The order of the standard groups does not matter (contrary to what others suggest, e.g., Utilization-Focused evaluation's emphasis on Utility, or a privileging of Accuracy) 3. In-depth knowledge is required –Don't just read the individual standards

Overlaps Across the Standards The Functional Table of Standards begins the process of identifying standards that are related to each other at specific points in the evaluation process –How should we identify overlaps (dependency relationships) among the standards? –Could this enhance metaevaluation efficiency? –Do specific standards have more significance than others?
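One simple, illustrative way to begin hunting for such overlaps (an assumption for discussion, not the JCSEE's own method) is to score the shared vocabulary of pairs of standard statements, e.g., with Jaccard similarity over content words. The statements are quoted from the slides above; the stopword list is deliberately minimal:

```python
import itertools

# Candidate dependency relationships: pairs of standard statements
# whose wording overlaps. Statements quoted from the slides above.
statements = {
    "U5": "Evaluation information should serve the identified and "
          "emergent needs of stakeholders.",
    "A2": "Evaluation information should serve the intended purposes "
          "and support valid interpretations.",
    "A8": "Evaluation communications should have adequate scope and "
          "guard against misconceptions, biases, distortions, and errors.",
}

STOPWORDS = {"the", "and", "of", "should", "to", "be", "a", "an"}

def content_words(text):
    # Lowercase, strip punctuation, drop common function words.
    words = text.lower().replace(",", "").replace(".", "").split()
    return {w for w in words if w not in STOPWORDS}

def jaccard(a, b):
    # Overlap of content vocabularies: |A ∩ B| / |A ∪ B|.
    wa, wb = content_words(a), content_words(b)
    return len(wa & wb) / len(wa | wb)

for (s1, t1), (s2, t2) in itertools.combinations(statements.items(), 2):
    print(f"{s1}-{s2}: {jaccard(t1, t2):.2f}")
```

A high score (here, U5 and A2 share "evaluation information should serve") only flags a pair for closer reading; whether the relationship is a genuine dependency still takes the in-depth knowledge the previous slide calls for.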

Activity Four groups: Utility, Feasibility/Evaluation Accountability, Propriety, and Accuracy. Identify the keywords in the standard statements. Example: U1 Evaluator Credibility "Evaluations should be conducted by qualified people who establish and maintain credibility in the evaluation context."
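As a hypothetical warm-up for this activity, a stopword filter can surface candidate keywords in the U1 statement quoted above (the stopword list here is ad hoc, and the output is a starting point for group discussion, not a definitive keyword list):

```python
# Sketch: pull candidate keywords from a standard statement by
# dropping common function words.
U1 = ("Evaluations should be conducted by qualified people who "
      "establish and maintain credibility in the evaluation context.")

STOPWORDS = {"should", "be", "by", "who", "and", "in", "the"}

keywords = [w.strip(".,").lower() for w in U1.split()
            if w.strip(".,").lower() not in STOPWORDS]
print(keywords)
```

For U1 this leaves words such as "qualified", "credibility", and "context" — close to the terms a group would likely underline by hand.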