Presentation transcript:

Sociology 3322a

“…the systematic assessment of the operation and/or outcomes of a program or policy, compared to a set of explicit or implicit standards as a means of contributing to the improvement of the program or policy…”*

* Carol Weiss, quoted in Introduction to Program Evaluation. Quote used by permission.

Genres of Evaluation (Greene) (see Table 38.1)
1st Genre  Centred around interests of policy makers and funders
2nd Genre  Centred around practical needs of decision makers
3rd Genre  Focus on understanding needs and values of all stakeholders and their contexts of meaning
4th Genre  To empower stakeholders in the interest of social change

Genres (cont.) What are the underlying ideologies and values of each genre? What type of methodology is typical for each?

Postmodern Evaluation How is it different from other genres? What is the value of “playfulness” (Abma, 1997a) to postmodern evaluation practice?

One main goal of program evaluation is “contributing to the improvement of the program or policy.” However, as the readings point out, organizations that provide programs are often resistant to evaluation. Why is this the case? How could you, as a program evaluator, reassure them and gain their cooperation?

How can qualitative program evaluators strengthen evaluation narratives?
Credibility vs internal validity
Applicability vs external validity
Dependability vs reliability
Confirmability vs neutrality
What is the importance of:
 Sampling for diversity
 Triangulating for agreement
 Monitoring bias
 Explicit commitment to inclusiveness and diversity

“assessment of the operation and/or outcomes of a program or policy” Evaluations can generally answer two types of questions: 1. What is the outcome of the program? Did the program have any impact? Was there any improvement in people's lives? 2. How did the program get to that outcome? Did the program have some set of procedures? Were these procedures followed? Were they reasonable? Was there a better way to get to the outcomes?

Program evaluation includes more than evaluating program effectiveness. When many people think of “program evaluation,” they think of one question: “Is this program effective?” That is, can the program actually be shown to prevent the (re)occurrence of the problem it was set up to address? However, evaluating the ultimate effectiveness of a program is rarely an easy process.

Questions to Ask: For programs delivering human services, the following questions need to be addressed in order to demonstrate that the program is effective: o Do those who receive the program achieve better outcomes than those who do not? For example, do youth who participate in an after-school sports program show a greater reduction in violent behaviour than a comparable group who do not attend? Do youth who attend such programs develop fewer problems with the justice system than a comparable group of youth who do not receive such programs?
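
To make that comparison concrete, here is a minimal sketch in Python. The counts, the follow-up measure, and the use of a chi-square test are all illustrative assumptions, not taken from the readings:

```python
# Hypothetical counts: youth with a new justice-system contact within one year,
# for program participants vs. a comparable group of non-participants.
from scipy.stats import chi2_contingency

participants = {"contact": 12, "no_contact": 88}   # n = 100
comparison = {"contact": 25, "no_contact": 75}     # n = 100

table = [
    [participants["contact"], participants["no_contact"]],
    [comparison["contact"], comparison["no_contact"]],
]

chi2, p_value, dof, expected = chi2_contingency(table)

rate_participants = participants["contact"] / sum(participants.values())
rate_comparison = comparison["contact"] / sum(comparison.values())

print(f"Contact rate, participants: {rate_participants:.0%}")
print(f"Contact rate, comparison:   {rate_comparison:.0%}")
print(f"Chi-square = {chi2:.2f}, p = {p_value:.3f}")
```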

Why do people who attend the program achieve better outcomes than those who do not? If the program consists of many services, which services are effective? Is the program 100% effective for all who receive it? If not, do some kinds of people benefit more than others, and why or how so?

Evaluation techniques provide methods for framing quality enhancement questions, for collecting data to address those questions, and for testing possible solutions to them.

Other Aspects: Program evaluation also includes monitoring implementation of programs. This approach can be used to collect data to address specific quality enhancement questions as well as to identify new questions. Program evaluation also includes approaches to measuring costs and outcomes, both of which are relevant to many quality enhancement questions.

Program manager
Program staff
Program clients/consumers
Program board
Program funder
Other programs
Community groups
Researchers

Stakeholder Engagement Why is stakeholder engagement important? How did Mobley achieve this in the Swindon project? What was the outcome?

In Class Exercise: We are now going to look specifically at various models of program evaluation. I will first summarize the main objective of each model. For each one, work with someone next to you to come up with some questions you think you would need to ask if you were to use that model. Let's say that you were to evaluate an Addiction Treatment Program (or any substance abuse program).

Rationale Assessment This model examines the assumptions of a program or system. The process of systematically examining the rationale of a program can identify potential problems that form the basis of quality enhancement questions.

Possible questions for a rationale assessment…. Why are certain things done? How do these relate to the program’s objectives?

Needs Assessment: To gather information to be used in developing appropriate interventions regarding a specific problem. What are possible questions a researcher could investigate or ask?

Possible questions for a needs assessment….  Is there a need for this kind of service in the community?  What are the characteristics of people who might use the services?  How many beds/staff/other facilities are likely to be needed?  When should services be provided (days of week, time, etc.)?  What types of programs should be offered? Different ones for different target groups?

Logic Model  The purpose is to identify linkages between what is done by the program and the program’s goals.

Possible questions for a logic assessment….  What are the components of the program?  What are the objectives of each component?  How does each component relate to the overall goal of the program?
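
One way to make these linkages explicit is to record the logic model as structured data. A minimal sketch follows; the components, activities, and objectives shown are hypothetical examples for an addiction treatment program:

```python
# Hypothetical logic model for an addiction treatment program: each component
# is linked to its own objectives and to the overall goal of the program.
logic_model = {
    "goal": "Reduce substance abuse and support long-term recovery",
    "components": [
        {
            "name": "Individual counselling",
            "activities": ["weekly one-on-one sessions"],
            "objectives": ["identify personal triggers", "build coping strategies"],
        },
        {
            "name": "Drug education workshops",
            "activities": ["group sessions on the risks of drug use"],
            "objectives": ["increase knowledge of the risks of drug use"],
        },
        {
            "name": "Peer support group",
            "activities": ["facilitated weekly group meetings"],
            "objectives": ["reduce isolation", "sustain motivation to stay in treatment"],
        },
    ],
}

# A logic assessment can then ask, component by component, how each set of
# objectives is expected to contribute to the overall goal.
print("Goal:", logic_model["goal"])
for component in logic_model["components"]:
    print(f'- {component["name"]}: {", ".join(component["objectives"])}')
```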

Client/Consumer Characteristics Assessment The purpose is to measure what is done by the program and to whom, and to obtain feedback from users regarding their perceptions of the program.

Possible questions for client/consumer assessment…. What are the characteristics of people who attend the program? To what extent is the program reaching the people it was intended to reach? Is it accessible to them? Which parts of the program did clients find most helpful? Least helpful? To what extent did clients feel their needs were met? Why do people drop out of the program?
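
As a sketch of how such questions could be answered from intake records, the snippet below tabulates a few hypothetical client records; the categories and the drop-out calculation are illustrative assumptions:

```python
# Hypothetical intake records for clients who entered the program.
from collections import Counter

clients = [
    {"age_group": "16-24", "referral": "self", "completed": True},
    {"age_group": "25-34", "referral": "court", "completed": False},
    {"age_group": "16-24", "referral": "family", "completed": True},
    {"age_group": "35-44", "referral": "court", "completed": False},
    {"age_group": "16-24", "referral": "self", "completed": True},
]

print("Age groups reached:", Counter(c["age_group"] for c in clients))
print("Referral sources:", Counter(c["referral"] for c in clients))

drop_out_rate = sum(not c["completed"] for c in clients) / len(clients)
print(f"Drop-out rate: {drop_out_rate:.0%}")
```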

Outcome Evaluation  Outcome evaluation involves monitoring the status of clients/consumers following an intervention compared to their status prior to the intervention so as to assess the overall effectiveness of a program.

Possible questions for an outcome assessment…. What proportion of clients complete the program? What proportion of clients have reduced or eliminated substance abuse? To what extent have clients improved their level of self-esteem after the program? To what extent did users receiving drug education increase their knowledge of the risks of drug use? Have clients learned which environmental factors can trigger a return to substance abuse? Have they learned ways to deal with stress without resorting to substance abuse?
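
To illustrate the before/after comparison behind several of these questions, here is a minimal sketch using hypothetical self-esteem scores for the same clients at intake and at completion; the scores and the paired t-test are illustrative assumptions, not prescribed by the readings:

```python
# Hypothetical self-esteem scores for the same clients, measured at intake
# (before) and at program completion (after).
from statistics import mean
from scipy.stats import ttest_rel

before = [22, 25, 19, 30, 27, 24, 21, 26]
after = [28, 27, 24, 33, 29, 30, 23, 31]

t_stat, p_value = ttest_rel(after, before)

print(f"Mean score before: {mean(before):.1f}")
print(f"Mean score after:  {mean(after):.1f}")
print(f"Mean change: {mean(a - b for a, b in zip(after, before)):.1f}")
print(f"Paired t = {t_stat:.2f}, p = {p_value:.3f}")
```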

Economic Evaluation  Attempts to determine whether the beneficial consequences of a program justify its cost; to assess the net monetary benefit gained as a result of a specified program, compared to an alternative program, or a “no program” condition. It compares alternative program options.

Possible questions for an economic assessment…. What is the average cost per client of providing the program? What is the cost per client of the day program compared to the outpatient program? What proportion of the program’s costs is attributable to no-shows?
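
A minimal sketch of the arithmetic behind these questions, using hypothetical budget and attendance figures for a day program and an outpatient alternative:

```python
# Hypothetical annual figures for two delivery options of the same program.
day_program = {
    "total_cost": 250_000,        # annual operating cost ($)
    "clients_served": 100,
    "no_show_sessions": 300,      # sessions booked but not attended
    "cost_per_session": 80,       # cost of one scheduled session ($)
}
outpatient = {"total_cost": 180_000, "clients_served": 120}

def cost_per_client(option):
    return option["total_cost"] / option["clients_served"]

print(f"Day program cost per client: ${cost_per_client(day_program):,.0f}")
print(f"Outpatient cost per client:  ${cost_per_client(outpatient):,.0f}")

# Share of day-program costs attributable to no-shows.
no_show_cost = day_program["no_show_sessions"] * day_program["cost_per_session"]
print(f"No-show share of day-program costs: {no_show_cost / day_program['total_cost']:.1%}")
```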