
Agenda:
–Quasi-experimental design: basics
–WSIPP drug court evaluation
–Outcomes and indicators for your projects
–Next time: bring qualitative instrument for your project (focus group or interview protocol)

Quasi-experimental design:
Goal: pick a comparison group that looks like the intervention group would have looked without the intervention
–Can compare aggregate control and intervention groups OR can match individual cases across the intervention and control groups
–Can only use cases that HAVE a good match
–Can match to only 1 case or to more (e.g., 5 cases)
–Can match on pre-intervention characteristics OR previous outcomes
–Can use simple or multivariate techniques to match, to control for differences when assessing outcomes, or both
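The matching logic above can be sketched in a few lines. This is a hypothetical illustration (not from the slides): 1-to-1 nearest-neighbor matching on a single pre-intervention score, with a caliper so that treated cases without a good match are dropped, mirroring the rule that only cases that HAVE a good match can be used.

```python
def match_one_to_one(treated, controls, caliper=0.1):
    """Match each treated case to its nearest control on a
    pre-intervention score (e.g., a prior risk score).

    Matching is without replacement; treated cases with no control
    within `caliper` are left unmatched and excluded from the analysis.
    Returns a list of (treated_score, matched_control_score) pairs.
    """
    available = list(controls)
    pairs = []
    for t in treated:
        if not available:
            break
        best = min(available, key=lambda c: abs(c - t))
        if abs(best - t) <= caliper:
            pairs.append((t, best))
            available.remove(best)  # match without replacement
    return pairs

# The treated case at 0.90 has no control within the caliper,
# so it is dropped rather than matched badly.
pairs = match_one_to_one([0.30, 0.55, 0.90], [0.28, 0.52, 0.60])
```

Matching to several controls per treated case (e.g., 5) works the same way, taking the k nearest available controls instead of one.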


Quasi-experimental methods:
–Propensity score matching
–Regression discontinuity design
–Instrumental variables
–Multivariate regression of post-period outcomes with controls
–Difference-in-differences regression
–Fixed-effects regression models
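As one example from the list, the difference-in-differences estimator compares the change over time in the treated group to the change in the comparison group. A minimal sketch, with invented recidivism rates for illustration:

```python
def diff_in_diff(treated_before, treated_after, control_before, control_after):
    """Difference-in-differences impact estimate:
    (change in treated group) minus (change in control group).
    The control group's change stands in for the trend the treated
    group would have followed without the intervention.
    """
    return (treated_after - treated_before) - (control_after - control_before)

# Hypothetical recidivism rates (as fractions) before/after a drug court opens:
# treated county falls 0.45 -> 0.35; comparison county falls 0.44 -> 0.42.
impact = diff_in_diff(0.45, 0.35, 0.44, 0.42)
# impact is about -0.08: an 8 percentage-point drop beyond the comparison trend
```

The same four-cell logic is what a difference-in-differences regression estimates, with the advantage that regression can also control for other covariates.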

WSIPP drug court evaluation:
–What are the biggest challenges in picking a control group for people going through drug courts?
–Worksheet on the evaluation
–Results

Results:
–A meta-analysis of 30 evaluations shows that drug courts decrease recidivism by about 13.3 percent (about 6 percentage points)
–Six comparisons:
–Similar counties, or pre-drug-court cases from an earlier period
–Regression, propensity score, or risk score matching
–Most drug courts in WA have reduced recidivism by levels similar to other studies, but not King Co.
–Cost-benefit analysis shows higher costs for drug courts, but they return $1.74 in benefits for each dollar of extra cost
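The distinction between a percent reduction and a percentage-point reduction, and the benefit-cost ratio, can be made concrete with back-of-the-envelope arithmetic. The baseline rate and dollar figures below are assumptions for illustration, not WSIPP's actual inputs:

```python
# A relative (percent) reduction is applied to the baseline rate;
# the result is the absolute (percentage-point) drop.
baseline_recidivism = 0.45   # assumed comparison-group recidivism rate
relative_reduction = 0.133   # the 13.3 percent reduction from the meta-analysis
pp_drop = baseline_recidivism * relative_reduction
# pp_drop is about 0.06, i.e., roughly 6 percentage points

# A benefit-cost ratio of 1.74 means $1.74 in benefits per $1 of extra cost;
# the per-participant dollar amounts here are made up.
extra_cost_per_participant = 100.0
benefits_per_participant = 174.0
bc_ratio = benefits_per_participant / extra_cost_per_participant
```

So a program can have higher costs than business as usual and still pass a cost-benefit test, as long as the ratio exceeds 1.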

Projects:
–What are your outcomes and indicators (these can be for process or impact)?
–How do they capture the key impacts of the program?
–How are the indicators appropriate and collectable?
–In the next memo you will need a chart listing each research question, each outcome and indicator, and where the data will come from (see student examples)