Evaluating the Options: the analyst's job is to gather the best evidence possible in the time allowed to compare the potential impacts of policies.



Time dimension (Before / During / After) of activities, policies, and programs, and their outcomes:
Before: What should we do? – Policy analysis, benefit-cost analysis, cost-effectiveness analysis, needs assessment
During: What are we doing? How does it work? – Process evaluation
After: What was the impact? – Outcome evaluation, performance measurement

WSIPP: “Return on Investment: Evidence-Based Options to Improve Statewide Outcomes”
What information do we get from the WSIPP study?
How did they create it?
What principles can we take away for our predictions?

Benefits: To whom? For what period?

Costs: What’s included? How can they be positive?

Summary stats: What are they? How are they different?

Net present value (WSIPP 2011, technical appendix p. 6): NPV = Σ_y (Q_y × P_y − C_y) / (1 + Dis)^(y − proage), where:
– Q is how much of the outcome you get with the program
– P is the value of the outcome
– C is the cost of the program
– Dis is the discount rate
– proage is the age of the participant when the program starts
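
The net present value calculation can be sketched in code. A minimal Python version, assuming annual streams indexed from the participant's starting age; the function name and the example numbers are illustrative, not WSIPP's:

```python
def npv(q, p, c, dis, proage):
    """Net present value in the slide's notation.

    q[i]   -- outcome quantity in year i (Q), counting from program start
    p[i]   -- dollar value per unit of outcome in year i (P)
    c[i]   -- program cost in year i (C)
    dis    -- annual discount rate (Dis)
    proage -- participant's age at program start; year i corresponds to
              age proage + i, and discounting is relative to the start year
    """
    total = 0.0
    for i, (qy, py, cy) in enumerate(zip(q, p, c)):
        total += (qy * py - cy) / (1.0 + dis) ** i
    return total

# Illustrative stream: a $1,000 cost today, then $400/year of valued
# outcomes for four years, discounted at 3.5%.
q = [0, 1, 1, 1, 1]
p = [0, 400, 400, 400, 400]
c = [1000, 0, 0, 0, 0]
value = npv(q, p, c, dis=0.035, proage=18)
```

Note that later benefits are worth less: the same $400 received in year 4 contributes only about $349 of present value at a 3.5% discount rate.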

Benefit-cost ratio (WSIPP 2012, technical appendix p. 6): the ratio of the present value of benefits to the present value of costs. Internal rate of return: the discount rate at which NPV equals zero.
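
The internal rate of return usually has no closed form, so it is found numerically. A minimal bisection sketch; the cash-flow numbers and search bracket are illustrative assumptions:

```python
def npv_at(rate, flows):
    """NPV of annual net cash flows; flows[0] occurs today."""
    return sum(f / (1.0 + rate) ** t for t, f in enumerate(flows))

def irr(flows, lo=-0.99, hi=10.0, tol=1e-9):
    """Find the discount rate where NPV crosses zero by bisection.

    Assumes a conventional stream (costs up front, benefits later),
    so NPV is positive at `lo`, negative at `hi`, with one root.
    """
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if npv_at(mid, flows) > 0:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return (lo + hi) / 2.0

# Illustrative stream: -$1,000 today, +$400/year for four years.
flows = [-1000, 400, 400, 400, 400]
rate = irr(flows)
```

A program with a positive NPV at the chosen discount rate will have an IRR above that rate, which is why the two criteria usually agree for conventional streams.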

WSIPP study elements:
Meta-analysis: averages results across multiple studies to get program impacts, Q
Estimates monetary benefits, P:
–private gains to participants,
–public value of avoiding outcomes like abuse and crime, and
–private value of not being a victim
Puts these together to get the impact on outcomes over a lifetime, with a discount for later benefits and costs (Dis)
Benefit-cost analysis adds up the benefits and costs
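
Meta-analytic averaging is typically precision-weighted: studies with smaller standard errors count for more. The inverse-variance sketch below is the generic textbook scheme, not necessarily WSIPP's exact procedure, and the study numbers are hypothetical:

```python
def inverse_variance_average(effects, std_errors):
    """Precision-weighted average effect size across studies:
    a study with a smaller standard error gets a larger weight."""
    weights = [1.0 / se ** 2 for se in std_errors]
    return sum(w * e for w, e in zip(weights, effects)) / sum(weights)

# Three hypothetical evaluations of the same program:
effects = [0.30, 0.10, 0.20]      # estimated effect sizes
std_errors = [0.05, 0.10, 0.20]   # their standard errors
q_hat = inverse_variance_average(effects, std_errors)
```

Here the pooled estimate lands closest to the most precise study's result, rather than at the simple mean of the three.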

WSIPP adjusts estimates for the quality of the evaluation evidence it collects (technical appendix p. 17): where enough studies exist, it determines empirically which types of estimates are largest.

From the WSIPP case:
What outcomes will you account for?
Impacts on whom? –(state or local budget, participants, bystanders)
What time frame will you use? –(discounting and NPV)
How will you weight multiple sources of evidence given their quality?
How will you add it all up?
Will you demonstrate sensitivity analysis of the results?
What other criteria “count” besides monetized ones?

General starting points for predictions:
Need detailed descriptions of the alternatives (but not TOO detailed)
Focus on key impacts and the most important costs
Common metrics (dollars, DALYs, etc.) are useful if they capture key outcomes
May need to adjust estimates for your scale or context
Get the best possible evidence; you won't get perfect information
Need to understand the strengths and weaknesses of your evidence and communicate them

Where to get evidence on costs and impacts (from Hatry):
Previous experience with similar changes
Pilot study in your organization
Information from other organizations that implemented similar policies (program evaluations)
Academic or think-tank studies (academic journal search and web search)
Modeled or “engineered” estimates
Theories and logical inference about causal connections (tragically leads to “high,” “medium,” or “low”!) WEAKEST!

Does the evidence from elsewhere apply to your organization (external validity)?
Is the policy or political context different in important ways?
Are the economic conditions different?
Is the target of the policy (e.g., client population or location) different in critical ways?
Would the policy or program be implemented in the same way? At the same scale?
You must assess the severity of the differences and predict their impacts on your outcomes.

Sources of uncertainty in estimates:
Validity of the comparison and study methodology (see WSIPP report)
Statistical uncertainty (randomness)
Uncertainty in how the policy would be implemented in the new context
Possible changes in other policies or conditions (e.g., economic or social)

What do you do with uncertainty?
Give explicit range estimates for costs or impacts
Perform sensitivity analysis and discuss the effects on trade-offs (e.g., Monte Carlo)
Use worst-case/best-case scenarios
Give best-guess estimates with caveats
Build resilience into your policy options
But… clients like certainty, there is limited time/space to explain details, and you need to make decisions in the face of uncertainty.
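
The Monte Carlo idea can be made concrete: draw the uncertain inputs from assumed distributions, recompute net benefits each draw, and report a range instead of a point estimate. All distributions and numbers below are illustrative assumptions, not estimates from any study:

```python
import random

def simulate_npv(n_draws=10000, seed=42):
    """Monte Carlo sensitivity analysis: draw uncertain inputs from
    assumed distributions, recompute net benefits each draw, and
    summarize the spread instead of reporting a single point."""
    random.seed(seed)
    draws = []
    for _ in range(n_draws):
        benefits = random.gauss(1500, 300)  # uncertain PV of benefits
        costs = random.gauss(1000, 100)     # uncertain PV of costs
        draws.append(benefits - costs)
    draws.sort()
    low = draws[int(0.05 * n_draws)]        # 5th percentile
    high = draws[int(0.95 * n_draws)]       # 95th percentile
    p_positive = sum(d > 0 for d in draws) / n_draws
    return low, high, p_positive

low, high, p_positive = simulate_npv()
```

A client-friendly summary then reads something like "90% of simulated outcomes fall between `low` and `high`, and net benefits are positive in `p_positive` of the draws," which conveys uncertainty without a wall of statistics.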

How do you add it all up? Most likely you don't! Any adding-up scheme inherently weights the criteria.
Can use cost-benefit analysis (monetize)
Use go/no-go (minimum threshold) for each criterion and pick the policy that meets all, then maximize one
You don't have to recommend one policy, but you MUST point out KEY trade-offs across policy options
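
The go/no-go rule sketches easily: screen out any option that misses a minimum threshold on any criterion, then maximize a single criterion among the survivors. The option names, criteria, and scores below are hypothetical:

```python
def screen_and_pick(options, thresholds, maximize):
    """Go/no-go screening: drop any option below a minimum threshold
    on any criterion, then maximize one criterion among survivors."""
    survivors = [o for o in options
                 if all(o[crit] >= floor for crit, floor in thresholds.items())]
    if not survivors:
        return None
    return max(survivors, key=lambda o: o[maximize])

# Hypothetical scored policy options:
options = [
    {"name": "A", "equity": 3, "feasibility": 4, "net_benefit": 200},
    {"name": "B", "equity": 5, "feasibility": 2, "net_benefit": 500},
    {"name": "C", "equity": 4, "feasibility": 4, "net_benefit": 350},
]
best = screen_and_pick(options, {"equity": 3, "feasibility": 3}, "net_benefit")
```

Notice that option B has the largest net benefit but is screened out on feasibility; the thresholds do the weighting implicitly, which is exactly why the trade-offs still need to be pointed out to the client.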

Your mandate:
Find at least one quantitative outcome criterion that you can find evidence to estimate.
You must provide cost estimates of your options.
Use at least one academic or think-tank study as evidence for at least one outcome (and preferably more).
I challenge you to find the most informative quantitative and qualitative evidence from the broadest sources.