Overview of Chapters 11 – 13, & 17



Chapter 11: Clarifying the Evaluation Request and Responsibilities
An organization has requested that you complete an evaluation.
What questions do you feel are important to ask before beginning the evaluation?
Are there times when you would decline a request for an evaluation? Why?
What are some of the advantages and disadvantages of having an evaluation conducted by an external evaluator? By an internal evaluator?
(take 5 minutes with a partner to discuss)

Clarifying the Evaluation Request and Responsibilities
1. Identify the stakeholders
Key questions: Who are the sponsors and clients? Who are the managers and staff? Are there other interest groups, such as agencies, elected officials, or the public at large?
2. Identify the purpose of the evaluation
Why is the evaluation being requested? What questions will it answer? How will the results be used? By whom?

Clarifying the Evaluation Request and Responsibilities
3. Determine whether the evaluation is appropriate
Key questions: How will the results be used? Will the results produce only trivial information? Is the program too new? Is the evaluation purpose ethical?

Clarifying the Evaluation Request and Responsibilities
Steps to follow:
1. Clarify the intended program model or theory
2. Examine the program as implemented to determine whether it matches the model and could achieve its goals or objectives
3. Explore different approaches to determine whether they meet the stakeholders' information needs
4. Agree on evaluation priorities and the intended uses of the study

Who will evaluate the program? Internal vs. external evaluator
Internal evaluators know the organization, its history, and its decision-making style, and will be around afterward to encourage use of the evaluation results.
External evaluators bring greater objectivity and, sometimes, specialized skills suited to a particular project or evaluation.

Chapter 12: Setting Boundaries and Analyzing the Evaluation Context
1. After determining who the stakeholders are, how do you determine who should be involved in the evaluation process?
2. What should the evaluator consider in analyzing the political context in which an evaluation will occur?
(take 5 minutes to discuss with a partner)

Setting Boundaries and Analyzing the Evaluation Context
The evaluator should identify and communicate with each stakeholder group or its representative:
1. To learn about the group's concerns
2. To understand how the group will use the evaluation results
This will help the evaluator weigh each group's input during the evaluation.

Setting Boundaries and Analyzing the Evaluation Context
Do evaluators need to be aware of squeaky wheels and powerful stakeholders? Yes. If possible, bring about a "democratic dialogue," with balance in the number of people involved and diversity within the group.

Setting Boundaries and Analyzing the Evaluation Context
Greene (2005) identified four groups:
1. People who have decision authority over the program
2. People who have direct responsibility for the program
3. People who are the intended beneficiaries of the program
4. People who are disadvantaged by the program

Setting Boundaries and Analyzing the Evaluation Context
Scriven (2007) adds to the list:
Political supporters or opponents
Community leaders
The general public

Setting Boundaries and Analyzing the Evaluation Context
Political context questions:
1. Who stands to gain or lose from the evaluation?
2. Which individuals and groups have power in the setting?
3. How should the evaluator relate and communicate with these groups?
4. Will all groups cooperate?
5. Which groups have a vested interest in the outcomes?
6. What safeguards should be incorporated into the evaluation agreement?

Chapter 13: Identifying and Selecting the Evaluation Questions and Criteria
What is the focus of evaluation questions?
What are good sources for evaluation questions?
Who should be involved in the divergent and convergent phases? Why?

Identifying and Selecting the Evaluation Questions and Criteria
Evaluation questions:
Provide focus to the evaluation
Specify the information the evaluation will provide
Guide choices for data collection, analysis, and interpretation

Identifying and Selecting the Evaluation Questions and Criteria
Good sources:
Existing standards in the field
Research literature
Content experts
The evaluator's own experience

Identifying and Selecting the Evaluation Questions and Criteria
Divergent phase: Cronbach (1982) urges opening one's mind so that many questions can "be entertained at least briefly as prospects for investigation" (p. 210). Fitzpatrick et al. (2011) advise evaluators to "throw a broad net and learn from many possible sources" (p. 316).

Identifying and Selecting the Evaluation Questions and Criteria
Convergent phase: three reasons for reducing the range of questions:
1. Budget
2. Manageability/complexity
3. Attention span of the audience
The evaluator aims for the maximum bandwidth (Cronbach, 1982).
*Typically completed by an advisory group with representatives from different stakeholder groups.

Chapter 17: Reporting Evaluation Results: Maximizing Use and Understanding
What considerations are important in tailoring the reporting of evaluation results to audiences?
What are some of the ways results can be communicated to stakeholders?
What should be included in an evaluation report?
(take 5 minutes to discuss with a partner)

Chapter 17: Reporting Evaluation Results: Maximizing Use and Understanding
Different audiences have different information needs and knowledge levels (Torres, Preskill, & Piontek, 2005):
Reading ability
Familiarity with the program
Attitude toward the program
Role in decision making
Familiarity with evaluation practices/methods
Attitude toward and interest in evaluation
Experience in using evaluation results

Chapter 17: Reporting Evaluation Results: Maximizing Use and Understanding
Patton (2008) points out that evaluation data are used more if the evaluator discusses and negotiates the format, style, and organization of reports with primary users.
Other key points: avoid jargon; use simple, direct language; use examples and anecdotes; be interesting.

Chapter 17: Reporting Evaluation Results: Maximizing Use and Understanding
Key elements of an evaluation report (p. 470):
Executive summary
Introduction to the report
Focus of the evaluation
Brief overview of the evaluation plan and procedures
Presentation of results
Conclusions and recommendations
Minority reports or rejoinders
Appendices