EVAL 6000: Foundations of Evaluation. Dr. Chris L. S. Coryn, Kristin A. Hobson. Fall 2011.

Agenda
Stage Three theories – Peter Rossi
Use-oriented theories and theorists
– Utilization-focused evaluation – Michael Patton
– Participatory evaluation – Brad Cousins
Questions and discussion
Encyclopedia of Evaluation entries

“Evaluation research is more than the application of methods…it is also a political and managerial activity, an input into…policy decisions and allocations” — Peter H. Rossi

Biographical Sketch
Born in 1921 in New York City
Ph.D. in Sociology, Columbia University
B.S. in Sociology, City College
Professor Emeritus of Sociology at the University of Massachusetts; held positions at Harvard, the University of Chicago, and Johns Hopkins University
Published numerous books, research monographs, and articles
Led many high-stakes national-level evaluations

Rossi’s View of Evaluation
Influenced by Campbell, Cronbach, and Scriven
Major function of social research in public policy formulation and change is to evaluate the effectiveness of public programs
Emphasis on empirically testing social theories as part of program evaluation

Rossi’s Influence
Extensive and diverse
Sociological (e.g., books on life histories of American families)
Methodological (e.g., survey research)
Primarily evaluation theory and methodology

Rossi’s Major Contributions
Tailored evaluation
Comprehensive evaluation
Theory-driven evaluation
Demystification
The “good enough” rule
The metallic and plastic laws of evaluation

Rossi’s Theory of Social Programming
Social interventions are conservative and incremental
Central task is to design programs that serve the disadvantaged well
Recognizes the political and economic constraints placed on social programs

Rossi’s Theory of Knowledge Construction
Both realist and empiricist in orientation
Simultaneously emphasizes fallibilism and multiplism
Questions the philosophical warrants for a singular epistemology, and questions the legitimacy and value of epistemology more generally

Rossi’s Theory of Valuing
Similar to Scriven in many respects
Social need is a crucial criterion for value claims
Integrates both prescriptive and descriptive theories (though never clear in explaining how to integrate them)

Rossi’s Theory of Knowledge Use
Distinguishes between instrumental, conceptual, and persuasive uses
Not clear about contingencies to guide choices to facilitate types of use
Demystification (e.g., the nature of social problems and their amelioration) has been criticized for being too “scientistic”

Rossi’s Theory of Evaluation Practice
Clearly describes trade-offs and priorities depending on various circumstances (e.g., innovations, modifications, established programs)
Recognizes constraints associated with trade-offs and priorities (e.g., comprehensive versus tailored evaluations)
See Table 9.1, p. 383

Evaluation Theory Tree

Use-Oriented Theorists
Fetterman, King, Preskill

“This class of theories [use] is concerned with designing evaluations that are intended to inform decision making…to ensure that evaluation results have a direct impact on decision making and organizational change” — Marvin C. Alkin

Use-Oriented Theories
Originated from decision-oriented theories
Decision-oriented theorists emphasize evaluation as assisting key decision makers in making informed decisions
Evaluations should be designed to ensure direct impact on decision making and organizational change

“Evaluations should be judged by their utility and actual use…[and]…evaluators should facilitate the evaluation process and design any evaluation with careful consideration of how everything that is done, from beginning to end, will affect use” — Michael Q. Patton

Utilization-Focused Evaluation
Explicitly geared to ensure that evaluations make an impact and are used
Evaluation is guided in collaboration with a targeted group of priority users

All aspects are chosen and applied to help targeted users obtain and apply evaluation findings to their intended use and maximize the likelihood that they will
In the interest of getting findings used, draws on any legitimate evaluation approach

Situational Analysis
What decisions, if any, are the evaluation findings expected to influence?
When will decisions be made? By whom?
When, then, must the evaluation findings be presented to be timely and influential?
What is at stake in the decisions? For whom?
What controversies or issues surround the decision?
What is the history and context of the decision-making process?
What other factors (values, politics, personalities, promises already made) will affect the decision making?

Situational Analysis
How much influence do you expect the evaluation to have—realistically?
To what extent has the outcome of the decision already been determined?
What data and findings are needed to support decision making?
What needs to be done to achieve that level of influence?
How will we know afterward if the evaluation was used as intended?

“[Practical participatory evaluation]…seeks to understand programs with the expressed intention of informing and improving their implementation” — J. Bradley Cousins

Participatory Evaluation
Evaluator works collaboratively in partnership with a select group of intended users
The evaluator’s role is to provide technical support and training, and to assure and maintain quality control
Involves a broad group of stakeholder participants

Participatory Evaluation
Modified from more limited stakeholder-based approaches
Stakeholders are engaged in the entire evaluation process (e.g., design, data collection, analysis, reporting, application of findings)
Assumes that involvement will increase buy-in, credibility, and use

“[The CIPP model encourages evaluators to engage a]…representative stakeholder review panel to help define the evaluation questions, shape evaluation plans, review draft reports and disseminate findings” — Daniel L. Stufflebeam

Improvement- and Accountability-Oriented Approaches
Expansive and seek comprehensiveness in considering the full range of questions and criteria needed to assess a program
Often employ the assessed needs of a program’s stakeholders as the foundational criteria for assessing a program

Improvement- and Accountability-Oriented Approaches
They usually reference all pertinent technical and economic criteria for judging the merit or quality of programs
Examine all relevant outcomes, not just those keyed to program objectives
Use multiple qualitative and quantitative assessment methods to provide cross-checks on findings

Decision- and Accountability-Oriented Studies
Emphasizes that program evaluation should be used proactively to help improve a program as well as retrospectively to judge value
Philosophical underpinnings include an objectivist orientation to finding best answers to context-limited questions and subscription to the principles of a well-functioning democratic society, especially human rights, an enlightened citizenry, equity, excellence, conservation, probity, and accountability

Decision- and Accountability-Oriented Studies
Serves stakeholders by engaging them in focusing an evaluation and assessing draft evaluation reports; addressing their most important questions plus those required to assess the program’s value; providing timely, relevant information to assist decision making; producing an accountability record; and issuing needed summative evaluation reports
This approach is best represented by Stufflebeam’s context, input, process, and product (CIPP) model for evaluation

CIPP Model

Evaluation roles across the four CIPP components (Context, Input, Process, Product):

Formative evaluation: Prospective application of CIPP information to assist decision making and quality assurance.
– Context: Guidance for determining areas for improvement and for choosing and ranking goals (based on assessing needs, problems, assets, and opportunities, plus contextual dynamics).
– Input: Guidance for choosing a program strategy (based on identifying and assessing alternative strategies and resource allocation plans). Examination of the work plan.
– Process: Guidance for implementing the operational plan (based on monitoring and judging activities and delivering periodic evaluative feedback).
– Product: Guidance for continuing, modifying, adopting, or terminating the effort (based on assessing outcomes and side effects).

Summative evaluation: Retrospective use of CIPP information to sum up the effort’s merit, worth, probity, equity, feasibility, efficiency, safety, cost, and significance.
– Context: Comparison of goals and priorities to assessed needs, problems, assets, opportunities, and relevant contextual dynamics.
– Input: Comparison of the program’s strategy, design, and budget to those of critical competitors and to goals and targeted needs of beneficiaries.
– Process: Full description of the actual process and record of costs. Comparison of the designed and actual processes and costs.
– Product: Comparison of outcomes and side effects to goals and targeted needs and, as feasible, to results of competitive programs. Interpretation of results against the effort’s assessed context, inputs, and processes.

Encyclopedia Entries
CIPP Model (Context, Input, Process, Product)
Cost-Benefit Analysis
Cost-Effectiveness
Goal
Indicators
Meta-Analysis
Monitoring
Needs Assessment
Objectives
Objectives-Based Evaluation
Outcomes
Outputs
Success Case Method
Tyler, Ralph W.