OST Certificate Program Program Evaluation Course


OST Certificate Program, Program Evaluation Course
Chapter 12: Identifying and Selecting the Evaluation Questions and Criteria
Fitzpatrick, Sanders, and Worthen. (2004). Program Evaluation. Boston, MA: Pearson Education Inc.

Objectives for Identifying and Selecting the Evaluation Questions and Criteria
1. Identify, clarify, and select evaluation questions.
Four things inform evaluation questions: clarification of the evaluation purpose; development of the program definition; the nature of the evaluation (formative or summative); and stakeholders' needs.
If important questions are overlooked or trivial questions are allowed, the following could result: an evaluation that has little or no payoff for the expense; an evaluation focus that misdirects future efforts; loss of goodwill or credibility because an audience's important questions are omitted; disenfranchisement of legitimate stakeholders; and unjustified conclusions.

Objectives for Identifying and Selecting the Evaluation Questions and Criteria (cont'd)
2. Identify criteria that will be used to judge the object of the evaluation (e.g., attendance or performance on a test).
3. Specify the standards the object must achieve on the criteria to be considered successful (e.g., a benchmark of 85% attendance, or a score of 68).
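The distinction between criteria and standards can be sketched in code. This is a minimal illustration, not part of the chapter; the dictionary keys and function name are hypothetical, and the numbers come from the slide's examples (85% attendance, a score of 68).

```python
# Criteria are the characteristics judged (attendance, test performance);
# standards are the levels each criterion must reach to count as success.
standards = {
    "attendance_rate": 0.85,  # benchmark of 85% attendance
    "test_score": 68,         # minimum score of 68
}

def meets_standards(results, standards):
    """Return, per criterion, whether the observed result meets its standard."""
    return {criterion: results[criterion] >= minimum
            for criterion, minimum in standards.items()}

program_results = {"attendance_rate": 0.91, "test_score": 64}
print(meets_standards(program_results, standards))
# {'attendance_rate': True, 'test_score': False}
```

Note that the program can succeed on one criterion while falling short on another, which is why each evaluation question needs its own criteria and standards.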

Two Stages of Identifying and Selecting Questions for an Evaluation
Divergent phase: generate a "laundry list" of potentially important questions and concerns.
Convergent phase: select the most critical questions to be addressed from the "laundry list," from which criteria will be developed.

The Divergent Phase: Identifying Appropriate Sources of Questions and Criteria

Sources of Questions During the Divergent Phase
1. Questions, concerns, and values of stakeholders
2. Evaluation models, frameworks, and approaches
3. The literature in the field of the program
4. Professional standards, guidelines, or criteria developed or used elsewhere
5. Expert consultants
6. Evaluator's professional judgment

Sources of Questions During the Divergent Phase (cont'd)
1. Questions, concerns, and values of stakeholders
Stakeholders are the single most important source of questions during this phase. Interview them to determine what they would like to know about the program (questions, concerns, perceptions, ideas for change). Including stakeholders adds validity to the study because they are the program "experts," but the evaluator must remember that she is the evaluation expert.

Sources of Questions During the Divergent Phase (cont'd)
1. Questions, concerns, and values of stakeholders (cont'd)
Three procedural rules for using stakeholders:
1. Use them in their area of expertise.
2. Consider carefully the methods you use to elicit information from them.
3. Ensure equitable participation: judge the thoughtfulness and importance of their questions, generate all possible questions, and don't be openly judgmental.
See Figure 12.1 on page 239 for ways to generate questions with the stakeholders.

Sources of Questions During the Divergent Phase (cont'd)
2. Evaluation models, frameworks, and approaches
Don't start with models to generate questions, but these conceptual frameworks can stimulate questions that might not emerge from other sources, particularly if stakeholders focus their questions only on outcomes. Stakeholders often assume that outcomes are the only thing evaluations can address; using the models as a heuristic allows the evaluator to educate the stakeholders.

Sources of Questions During the Divergent Phase (cont'd)
3. The literature in the field of the program
The literature draws the evaluator's attention to issues that should be raised. Previous evaluations are useful sources: they suggest questions, provide causative models that guide question development, and give insight into possible methods.

Sources of Questions During the Divergent Phase (cont'd)
4. Professional standards, guidelines, or criteria developed or used elsewhere
The Program Evaluation Standards, for example, provide insight into appropriate questions and also into criteria for judging results.
5. Expert consultants
Evaluators are often not experts in the targeted content area. An outside consultant can provide a more neutral and broad view, and their input can reflect current knowledge and practice. The Joint Committee recommends teams of experts for most evaluations.

Sources of Questions During the Divergent Phase (cont’d) 6. Evaluator’s professional judgment Prior experience with similar evaluations may inform an experienced evaluator as to which questions are going to be useful. Important questions may be omitted unless the evaluator raises them herself.

The Convergent Phase: Selecting the Questions, Criteria, and Issues to Be Addressed

The Convergent Phase This phase is always necessary because: 1. There is nearly always a budget limit. 2. A study becomes increasingly complicated and hard to manage without it. 3. The attention span of the audience is limited.

Selecting the Questions
This step must include both the evaluator and stakeholders. Selected evaluation questions should:
Hold the interest of key audiences. Who would use the information? Who will be affected if this question is dropped?
Reduce present uncertainty. Would an answer to the question reduce uncertainty or provide information that is not readily available?
Yield important information. Would the answer provide important information and have an impact on the course of events (as opposed to being merely "nice to know")? (cont'd)

Selecting the Questions (cont'd)
Selected evaluation questions should:
Be of continuing interest. Is this question merely of passing interest, or does it focus on critical dimensions of continued interest?
Be critical to the study's scope and comprehensiveness. Would the scope or comprehensiveness of the evaluation be limited if this question were dropped?
Be answerable in terms of resources. Is it feasible to answer this question given available financial and human resources, time, methods, and technology?

Selecting the Questions (cont'd)
Final steps:
Using a matrix is one way of selecting questions from the laundry list (see Figure 12.2 on p. 249).
Meet with the stakeholders/client; review the "laundry list," the selected questions, and any outstanding issues, and get reactions to the selected list.
If the sponsor or client demands too much control over the selection of questions, the evaluator must decide whether the evaluation will be compromised.
At the end of the process, there are usually between 3 and 12 final evaluation questions.
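A selection matrix of the kind referenced in Figure 12.2 can be sketched as a table that rates each candidate question against the six screening criteria above. This is a hypothetical illustration, not the book's figure: the rating scale (0 to 2), the function name, and the sample questions are all invented for the sketch.

```python
# Six screening criteria from the convergent phase, one column per criterion.
CRITERIA = [
    "holds interest of key audiences",
    "reduces present uncertainty",
    "yields important information",
    "of continuing interest",
    "critical to scope/comprehensiveness",
    "answerable with available resources",
]

def select_questions(ratings, max_questions=12):
    """ratings maps each candidate question to one 0-2 score per criterion.
    Total the scores and keep the top-rated questions (3-12 is typical)."""
    totals = {question: sum(scores) for question, scores in ratings.items()}
    ranked = sorted(totals, key=totals.get, reverse=True)
    return ranked[:max_questions]

ratings = {
    "Did attendance improve?":                 [2, 2, 2, 1, 2, 2],
    "Do staff like the new logo?":             [0, 1, 0, 0, 0, 2],
    "Was the program implemented as planned?": [2, 2, 2, 2, 2, 1],
}
print(select_questions(ratings, max_questions=2))
```

In practice the ratings would come from the evaluator and stakeholders together, and a low-scoring question might still be kept if a key stakeholder's legitimate concern depends on it; the matrix informs the conversation rather than replacing it.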

Specifying Criteria and Standards
Identify standards and criteria for each question that requires a final judgment. Having standards prior to getting results helps groups be clear, realistic, and concrete about what constitutes acceptable expectations for program success.
Criteria specify those characteristics of the program that are critical to its success. Standards represent the level of performance a program must reach on the criteria to be considered successful.
Absolute standards: e.g., 80% of students must pass the state assessment.
Relative standards: e.g., there must be a 10% improvement in the pass rate this year compared to the year before the program was implemented.
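The absolute/relative distinction can be made concrete with a small sketch. The function names are hypothetical, and the relative standard is interpreted here as a 10% proportional gain over the baseline year (the slide's wording could also mean 10 percentage points; the choice is an assumption of this sketch).

```python
def meets_absolute(pass_rate, threshold=0.80):
    """Absolute standard: e.g., 80% of students must pass the state assessment."""
    return pass_rate >= threshold

def meets_relative(pass_rate, baseline_rate, required_gain=0.10):
    """Relative standard: pass rate must improve 10% over the pre-program year.
    Interpreted as a proportional gain: new rate >= baseline * 1.10."""
    return pass_rate >= baseline_rate * (1 + required_gain)

print(meets_absolute(0.83))        # True: 83% clears the 80% threshold
print(meets_relative(0.83, 0.78))  # False: 83% falls short of 0.78 * 1.10 = 85.8%
```

The same result (an 83% pass rate) succeeds under the absolute standard but fails under the relative one, which is why the standard must be agreed on before results come in.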

Specifying Criteria and Standards (cont'd)
By now you have a fairly good working list of questions, criteria, and standards, but remain flexible. Things that might change over the course of an evaluation and require new or revised questions include: changes in scheduling, personnel, or funding; unanticipated problems in program implementation; evaluation procedures that aren't working; and new critical issues that emerge.

Working With Stakeholders
1. Commitment and Support
Managers and supervisors must: demonstrate that evaluation leads to more informed decisions; reinforce and reward employees' use of evaluation knowledge and skills; and integrate evaluation into the organization's daily work (make it a part of everyone's job).
It is important to consider whose buy-in is needed and whose name and involvement would add credibility.

Working With Stakeholders (cont'd)
2. Use Participatory and Collaborative Approaches
These provide opportunities for: ensuring that all voices are heard; addressing a diverse set of evaluation questions; increasing the credibility of the data and results; sharing the workload; and surfacing individuals' values, beliefs, and assumptions about the program and evaluation.

Working With Stakeholders (cont'd)
3. Choose an Evaluator Role
Decide on a role based on: the evaluation context; the role the organization will accept; which role is your strength; and which role will support the greatest use of evaluation findings.

Strategies for Getting Buy-In Link evaluation work to the organization’s mission. Involve stakeholders throughout and communicate with them. Link evaluation work to management. Start with small projects involving only a few stakeholders, then widely disseminate findings throughout the organization.

Major Concepts and Theories from Chapter 12
Evaluation questions help to focus the evaluation and guide data selection choices. Criteria specify characteristics of the program that are critical to a program's success; standards point to the level of performance a program must attain to be considered successful. The divergent phase of question development involves all key stakeholders and results in a comprehensive list of potential evaluation questions and concerns. Additional sources of questions are evaluation models, existing standards, the research literature, and the evaluator's experience. The convergent phase involves the final selection of questions for the evaluation.