PPA 503 – The Public Policy-Making Process


PPA 503 – The Public Policy-Making Process
Lecture 9a – Evaluation

Evaluating Public Programs
Program evaluation is a way of bringing to public decision-makers the available knowledge about a problem, about the relative effectiveness of past and current strategies for addressing or reducing that problem, and about the observed effectiveness of particular programs.

Administrative Purposes for Evaluation
Policy formulation – to assess or justify the need for a new program and to design it optimally on the basis of past experience.
- Information on the problem addressed by the program: How big is it? What are its frequency and direction? How is it changing?
- Information on the results of past programs that dealt with the problem: Were those programs feasible? Were they successful? What difficulties did they encounter?
- Information allowing the selection of one program over another: What are the comparative costs and benefits? What kinds of growth records were experienced?

Administrative Purposes for Evaluation
Policy execution – to ensure that a program is implemented in the most cost-effective and technically competent way.
- Information on program implementation: How operational is the program? How similar is it across sites? Does it conform to the policies and expectations formulated? How much does it cost? How do stakeholders feel about it? Are there delivery problems or error, fraud, and abuse?

Administrative Purposes for Evaluation
Policy execution – to ensure that a program is implemented in the most cost-effective and technically competent way.
- Information on program management: What degree of control exists over expenditures? What are the qualifications and credentials of the personnel? How are resources allocated? How is program information used in decision making?
- Ongoing information on the current state of the problem or threat addressed by the program: Is the problem growing? Is it diminishing? Is it diminishing enough that the program is no longer needed? Is it changing in terms of its significant characteristics?

Administrative Purposes for Evaluation
Accountability in public decision making – to determine the effectiveness of an operating program and the need for its continuation, modification, or termination.
- Information on program outcomes or effects: What happened as a result of program implementation?
- Information on the degree to which the program made or is making a difference: What change in the problem or threat has occurred that can be directly attributed to the program?
- Information on the unexpected (and expected) effects of the program.

Functions and Roles of Evaluation Sponsors
Executive branch (federal, state, local):
- Program managers (cost-effectiveness).
- Agency heads and top policy makers (need, effectiveness).
- Central budget or policy authorities (effectiveness, need).

Functions and Roles of Evaluation Sponsors
Legislative branch:
- Congressional and legislative policy and evaluation offices (all aspects).
- Legislative authorization, appropriations, and budget committees (program funding and refunding).
- Oversight committees (all aspects).
Regardless of sponsor, evaluators should clearly specify the objectives and limitations of each evaluation.

Functions and Roles of Evaluation Sponsors
As a general rule, public administrators should expect their work on program effectiveness and feasibility to be of more general use than their work on implementation, which will be of most use to program managers and agency heads. Information needs will be greater for large programs than for small ones, and for new programs than for old ones.

Evaluation Approaches
- Front-end analysis – evaluative work conducted before a decision to move ahead with a program.
- Evaluability assessment – reasonableness of assumptions and objectives, comparison of objectives to program activities, feasibility of full-scale evaluation.

Evaluation Approaches
- Process evaluation – describe and analyze the processes of implemented program activities: management strategies, operations, costs, interactions, etc.
- Effectiveness or impact evaluation – how well has a program been working? Are the changes the result of the program?
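
A minimal sketch of the attribution question ("are the changes the result of the program?") is a difference-in-differences comparison: the change observed for the program group is set against the change for a similar comparison group over the same period. The sketch below is illustrative only; the outcome values and group labels are hypothetical, not drawn from any actual program.

```python
# Illustrative difference-in-differences calculation.
# All figures are hypothetical placeholders, not real program data.

program_group = {"before": 42.0, "after": 55.0}      # mean outcome at served sites
comparison_group = {"before": 41.0, "after": 46.0}   # mean outcome at unserved sites

change_program = program_group["after"] - program_group["before"]            # 13.0
change_comparison = comparison_group["after"] - comparison_group["before"]   # 5.0

# The gap between the two changes estimates the effect attributable to the
# program, under the strong assumption that both groups would otherwise have
# changed by the same amount.
estimated_effect = change_program - change_comparison                        # 8.0

print(f"Change in program group:    {change_program:.1f}")
print(f"Change in comparison group: {change_comparison:.1f}")
print(f"Estimated program effect:   {estimated_effect:.1f}")
```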

Evaluation Approaches
- Program and problem monitoring – continuous rather than snapshot; informs on problem characteristics or tracks program or problem progress in several areas (see the sketch below).
- Metaevaluation or evaluation synthesis – reanalyzes findings from several evaluations to determine what has been learned.
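
Because monitoring is continuous rather than a snapshot, in practice it means tracking the same indicators period after period and asking whether the problem is growing or diminishing. A small sketch of that idea, using hypothetical quarterly values for a single indicator, follows.

```python
# Hypothetical quarterly values for one monitored problem indicator
# (for example, reported cases per 1,000 residents).
indicator_by_quarter = [12.4, 11.8, 11.1, 10.5, 10.2]

# Period-to-period changes show the direction the problem is moving.
changes = [later - earlier
           for earlier, later in zip(indicator_by_quarter, indicator_by_quarter[1:])]

total_change = indicator_by_quarter[-1] - indicator_by_quarter[0]
print("Quarter-to-quarter changes:", [round(c, 1) for c in changes])  # all negative: diminishing
print("Change over the whole period:", round(total_change, 1))
```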


Introduction to Evaluation Procedures
Program evaluation is the use of social research methods to systematically investigate the effectiveness of social intervention programs.
- It draws on the techniques and concepts of the social science disciplines.
- It is intended to be used for improving programs and informing social action aimed at ameliorating social problems.

Introduction to Evaluation Procedures
Modern evaluation research grew from pioneering efforts in the 1930s and burgeoned in the post-war years as new methodologies were developed. The social policy and public administration movements have contributed to the professionalization of the field and to the sophistication of the consumers of evaluation research.

Introduction to Evaluation Procedures
The need for program evaluation is undiminished in the 2000s and may even be expected to grow. Contemporary concern over the allocation of scarce resources makes it more essential than ever to evaluate the effectiveness of social interventions.

Introduction to Evaluation Procedures
Evaluation must be tailored to the political and organizational context of the program to be evaluated.

Introduction to Evaluation Procedures
The assessment of one or more program domains:
- The need for the program
- The design of the program
- The program implementation and service delivery
- The program impact or outcomes
- Program efficiency
Accurate description of program performance and assessment against relevant standards or criteria.

Introduction to Evaluation Procedures
Program evaluation presents many challenges to the evaluator:
- Changes in circumstances and activities during an evaluation.
- Appropriate balance between science and pragmatism.
- Diversity of perspectives and approaches.

Introduction to Evaluation Procedures
Most evaluators are trained as social scientists or social researchers. Complex evaluations may require specialized staffs. A basic knowledge of evaluation is valuable both to researchers and to consumers of evaluation results.

Tailoring evaluations
Every evaluation must be tailored to the circumstances of the program to yield credible and useful answers to specific questions while still allowing practical implementation.

Tailoring evaluations
Influences on evaluation plans include the purpose of the evaluation:
- Provide feedback for program improvement to program managers and sponsors.
- Establish accountability to decision-makers with responsibility to ensure that the program is effective.
- Contribute to knowledge about some form of social intervention.

Tailoring evaluations
Influences also include the nature of the program's structure and circumstances. The evaluation must be responsive to:
- How new or open to change the program is.
- The degree of consensus or conflict among stakeholders about the nature and mission of the program.
- The values and concepts inherent in the program rationale and design.
- The way in which the program is organized and administered.

Tailoring evaluations
Evaluation planning must also accommodate limitations on resources. Resources include:
- Funding;
- Time for completion;
- Pertinent technical expertise;
- Program and stakeholder cooperation;
- Access to important records and program material.
The plan must strike a balance between what is desirable and what is feasible.

Tailoring evaluations
The evaluation design can be structured around three issues:
- The questions the evaluation is to answer;
- The methods and procedures to be used to answer these questions;
- The nature of the evaluator-stakeholder interactions during the course of the evaluation.

Tailoring evaluations
Deciding on the appropriate relationship between the evaluator and the evaluation sponsor, as well as other major stakeholders, is an often neglected but critical aspect of an evaluation plan. An independent relationship is often expected, but a participatory or collaborative relationship may enhance stakeholders' skills or political influence.

Tailoring evaluations
Evaluation questions and methods fall into five categories:
- Need for services;
- Program conceptualization and design;
- Program implementation;
- Program outcomes; and
- Program efficiency.

Tailoring evaluations
Evaluation terms corresponding to these categories include needs assessment, process evaluation, and impact assessment. Much of evaluation planning consists of identifying the evaluation approach corresponding to the type of questions to be answered and tailoring specifics to the program situation.

Identifying issues and formulating questions
A critical phase in evaluation planning is the identification and formulation of the questions that the evaluation will address. These questions focus the evaluation on the areas of program performance most at issue for key stakeholders and guide the design so that it will provide meaningful information about program performance.

Identifying issues and formulating questions
Good evaluation questions must identify clear, observable dimensions of program performance that are relevant to the program’s goals and represent domains in which the program can realistically be expected to have accomplishments.

Identifying issues and formulating questions
What most distinguishes evaluation questions, however, is that they involve criteria by which the identified dimensions of program performance can be judged. If the formulation of the evaluation questions can include performance standards on which key stakeholders agree, evaluation planning will be easier and the potential for disagreement with the results reduced.

Identifying issues and formulating questions
To ensure that matters of greatest significance are covered in the evaluation design, the evaluation questions are best formulated through interaction and negotiation with the evaluation sponsors and other stakeholders representative of significant groups or distinctly positioned in relation to program decision-making.

Identifying issues and formulating questions
Although stakeholder input is critical, the evaluator must be prepared to identify program issues that warrant inquiry. The evaluator should conduct a somewhat independent analysis of the assumptions and expectations on which the program is based.

Identifying issues and formulating questions
Make the program theory explicit. Program theory encompasses the program’s organizational plan, its service utilization plan, and its impact theory.


Identifying issues and formulating questions
Program theory describes the assumptions inherent in a program. It encompasses:
- Impact theory, which links program actions to intended outcomes; and
- Process theory, which describes a program’s organizational plan and its scheme for ensuring utilization of its services by the target population.
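
One way to make these elements explicit is simply to write them down in a structured form that separates impact theory from process theory. The sketch below is a hypothetical illustration; the field names and the example program are invented for this purpose, not a standard schema.

```python
from dataclasses import dataclass

@dataclass
class ImpactTheory:
    """Chain linking program actions to intended outcomes."""
    activities: list[str]
    proximal_outcomes: list[str]   # changes expected soon after services are received
    distal_outcomes: list[str]     # longer-term changes the program ultimately targets

@dataclass
class ProcessTheory:
    """Organizational plan plus the scheme for reaching the target population."""
    organizational_plan: list[str]
    service_utilization_plan: list[str]

@dataclass
class ProgramTheory:
    impact: ImpactTheory
    process: ProcessTheory

# Hypothetical example: a job-training program.
theory = ProgramTheory(
    impact=ImpactTheory(
        activities=["skills workshops", "job placement assistance"],
        proximal_outcomes=["improved job-search skills"],
        distal_outcomes=["higher employment rate among participants"],
    ),
    process=ProcessTheory(
        organizational_plan=["trained instructors", "workshop facilities", "program funding"],
        service_utilization_plan=["outreach through unemployment offices", "open enrollment"],
    ),
)
print(theory.impact.distal_outcomes)
```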

Identifying issues and formulating questions
When these procedures have generated a full set of evaluation questions, the evaluator must organize them into related clusters, drawing on stakeholder input and professional judgment to set priorities. With the priority evaluation questions determined, the evaluator is ready to design the part of the evaluation devoted to answering them.

Meeting the Need for Evaluation
Three basic questions:
- Can the results of the evaluation influence decisions about the program?
- Can the evaluation be done in time to be useful?
- Is the program significant enough to merit evaluation?

Choices Facing Evaluators
Evaluation design:
- What are the evaluation questions?
- What comparisons are needed?
- What measurements are needed?
- How will the resulting information be used?
- What “breakouts” (disaggregations of data) are needed, such as by facility or type of client?
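
For the "breakouts" question in particular, a common step is to compute the same outcome measure disaggregated by facility or type of client. The sketch below uses pandas and a handful of hypothetical client records purely for illustration.

```python
import pandas as pd

# Hypothetical client-level records; in practice these come from program data systems.
records = pd.DataFrame({
    "facility":    ["North", "North", "South", "South", "South"],
    "client_type": ["new", "returning", "new", "new", "returning"],
    "outcome":     [1, 0, 1, 1, 0],   # 1 = successful outcome, 0 = not
})

# Breakouts: the same success rate, disaggregated two different ways.
by_facility = records.groupby("facility")["outcome"].mean()
by_client_type = records.groupby("client_type")["outcome"].mean()

print("Success rate by facility:\n", by_facility, sep="")
print("Success rate by client type:\n", by_client_type, sep="")
```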

Choices Facing Evaluators
Data collection:
- What are the primary data sources?
- How should data be collected?
- Is sampling required? Where and how? How large a sample is needed? (See the sketch below.)
- How will data quality be ensured?
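
For the sample-size question, a common starting point when estimating a proportion (for example, the share of clients served within a target time) is the formula n = z^2 * p * (1 - p) / e^2. The sketch below is illustrative; the confidence level, margin of error, and expected proportion are choices the evaluator must make for the particular study.

```python
import math

def sample_size_for_proportion(margin_of_error: float,
                               confidence_z: float = 1.96,
                               expected_proportion: float = 0.5) -> int:
    """Sample size needed to estimate a proportion within a given margin of error.

    Implements n = z^2 * p * (1 - p) / e^2; p = 0.5 is the most conservative choice.
    """
    n = ((confidence_z ** 2) * expected_proportion * (1 - expected_proportion)
         / (margin_of_error ** 2))
    return math.ceil(n)

# Hypothetical choices: 95% confidence (z = 1.96), margin of +/- 5 percentage points.
print(sample_size_for_proportion(0.05))  # 385
```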

Choices Facing Evaluators
Data analysis:
- What analytical techniques are available (given the data)?
- Which analytical tools will be most appropriate?
- In what format will the data be most useful?
Getting evaluation information used:
- How should evaluation findings be packaged for different audiences?
- Should specific recommendations accompany evaluation reports to encourage action?
- What mechanisms can be used to check on implementation of recommendations?