Research Design and Outcomes

1 Research Design and Outcomes
PBAF 526

2 Today
Zinc recap
Where are we?
Outcomes and comparisons
Intimate Partner Violence Judicial Oversight Evaluation
Projects: Program Theory and Evaluation Questions

3 Zinc lessons about process evaluation
We need description, data, and analysis to understand what activities took place.
To understand why a program works or doesn't work, we need to connect activities to outcomes.
Process evaluation assesses the early causal links in program theory.

4 Next Stop: Impact Evaluation
Start on the left: what are our outcomes, and what should we do? Then: what are we doing? Then: what did we accomplish, and what was the impact?

5 Comparison
We need to disentangle the effects of the program from other influences.
We need comparisons for both impact and process evaluation.
We want to know the outcomes or activities the program group would have had IF they had not received the program: the "counterfactual."
Use a comparison group as similar as possible to the intervention group.
Experimental comparison groups are randomized (by individual or other unit); see the sketch below.
Quasi-experimental research compares to a non-random group that did not receive the intervention.
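A minimal sketch of how a randomized comparison supports an impact estimate: with random assignment, the comparison group's mean outcome stands in for the counterfactual, so the impact estimate is simply the difference in group means. The outcome values below are invented for illustration.

```python
# Minimal sketch: estimating impact with a randomized comparison group.
# All numbers are hypothetical; in practice they come from evaluation data.
import numpy as np
from scipy import stats

# Outcome scores (e.g., a 0-100 well-being scale) for each group
program = np.array([62, 71, 58, 66, 75, 69, 64, 73])
comparison = np.array([55, 60, 52, 63, 58, 61, 57, 59])

# With randomization, the comparison mean approximates the counterfactual,
# so the impact estimate is the simple difference in means.
impact = program.mean() - comparison.mean()

# A t-test gives a rough sense of whether the difference exceeds chance.
t_stat, p_value = stats.ttest_ind(program, comparison)

print(f"Estimated impact: {impact:.1f} points (p = {p_value:.3f})")
```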

6 Comparisons may be:
- over time
- across intervention groups: with and without the program; levels of intervention ("dosage")

7 Comparisons may be:
- over time
- across intervention groups: with and without the program; levels of intervention ("dosage")
Impact here! One way to combine the over-time and across-group comparisons is sketched below.
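Where outcomes are measured before and after the program for both the program group and a comparison group, a common way to combine the two comparisons is a difference-in-differences calculation. The numbers below are invented purely for illustration.

```python
# Difference-in-differences sketch with hypothetical numbers:
# compare the over-time change in the program group to the
# over-time change in the comparison group.

program_pre, program_post = 40.0, 55.0        # program group outcome, before / after
comparison_pre, comparison_post = 42.0, 48.0  # comparison group outcome, before / after

program_change = program_post - program_pre            # 15.0
comparison_change = comparison_post - comparison_pre   # 6.0

# The comparison group's change stands in for what would have happened
# to the program group without the program (time trends, context changes).
impact_estimate = program_change - comparison_change   # 9.0

print(f"Difference-in-differences impact estimate: {impact_estimate:.1f}")
```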

8 What types of comparisons were used for these impact evaluations?
School-based mentoring program
Jamaica PATH program
Zinc in Nepal
Intimate Partner Violence Judicial Oversight Demonstration

9 Outcomes reflect more than program influence
Characteristics of participants
Other pre-existing differences (e.g., other services)
Changes over time (maturation)
Changes in the social or economic context
Impact of the evaluation itself (e.g., testing, instrumentation)

10 Properties of Comparisons
Internal validity: Does the measured impact reflect only the contribution of the program?
External validity: Does the measured impact reflect what we could expect from a similar program elsewhere?

11 Outcomes and Indicators
Outcomes capture key conceptual results:
- How would you know if the program worked? What would you see? What would change for participants?
- Include both shorter-run and longer-run outcomes
- Come from program theory, previous evaluations/research, and stakeholders
Indicators are measures of the outcomes:
- Use multiple indicators to triangulate findings, assess multiple dimensions of an outcome, and prevent distortion of a single measure
- Come from available data, previous research and evaluation, theory of how best to capture the outcome, and stakeholders
A sketch of an outcome-to-indicator mapping follows below.
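As a concrete illustration, here is a hypothetical sketch of how several indicators can be attached to each outcome so findings can be triangulated; the outcome names and indicators are drawn loosely from the JOD example later in this deck and are illustrative only.

```python
# Hypothetical sketch: each outcome is measured with several indicators
# so findings can be triangulated rather than resting on a single measure.
outcome_indicators = {
    "victim safety": [
        "victim reports of new IPV on follow-up survey",
        "% of offenders re-arrested for IPV",
    ],
    "offender accountability": [
        "number of probation requirements",
        "% convicted and sentenced",
        "% referred to batterer intervention programs",
    ],
}

for outcome, indicators in outcome_indicators.items():
    print(f"{outcome}: {len(indicators)} indicators")
    for indicator in indicators:
        print(f"  - {indicator}")
```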

12 Properties of Indicators
Validity: Does the indicator measure what we say it does?
Reliability: Does the indicator give the same reading when the outcome is the same (with different evaluators, or over time with no change in the outcome)?
Sensitivity: Can the indicator detect changes due to our program?

13 Judicial Oversight Demonstration: Coordinated response to intimate partner violence
Program theory
Outcomes and indicators
Comparison

14 Judicial Oversight Demonstration: Program Theory
Collaboration between law enforcement, courts, and service providers
Consistent responses to IPV offenses
Coordinated victim advocacy and services
Offender accountability and oversight
These lead to the goals of victim safety, offender accountability, and reduced repeat IPV.

15 Judicial Oversight Demonstration: Comparisons to assess outcomes
Three sites were chosen based on capacity and interest in the demonstration (MA, MI, and WI).
Goals of the evaluation:
- Test impact
- Learn from implementation analysis
Comparisons:
- Two sites: JOD cases compared to cases in a similar comparison county
- One site: JOD cases compared to cases prior to JOD

16 Are JOD samples similar to comparisons?
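The slide's table is not reproduced in the transcript, but the underlying check is a balance test: compare baseline characteristics of the JOD and comparison samples and flag any large differences. A minimal sketch with invented data for a single characteristic:

```python
# Balance-check sketch: compare a baseline characteristic (here, offender age)
# between the JOD and comparison samples. Data are invented for illustration.
import numpy as np
from scipy import stats

jod_age = np.array([31, 27, 35, 29, 40, 33, 26, 38, 30, 34])
comparison_age = np.array([30, 28, 36, 32, 41, 29, 27, 39, 31, 35])

t_stat, p_value = stats.ttest_ind(jod_age, comparison_age)

# A small, non-significant difference suggests the groups are comparable on
# this characteristic; large differences would threaten internal validity.
print(f"Mean age: JOD {jod_age.mean():.1f} vs comparison {comparison_age.mean():.1f}")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```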

17 Judicial Oversight Demonstration: Research tools
IMPACT (in JOD and comparison sites):
- Interviews with victims and offenders, conducted in person via computer, early on and 9 months later
- Administrative data from courts, the probation system, and service providers
IMPLEMENTATION:
- Site observation and interviews with service providers
- Focus groups with victims and offenders


19 Judicial Oversight Demonstration: Outcomes and Indicators
Victim Safety
- Perceptions of quality of treatment and safety/well-being:
  - Satisfaction ratings on surveys
  - Safety and well-being ratings on surveys
  - Number of contacts with probation officials
- Safety:
  - Victim reports of new IPV
  - % of offenders re-arrested for IPV
Offender Accountability
- Accountability:
  - Number of probation requirements
  - % convicted and sentenced
  - % with a public defender or defense attorney
  - % reported to Batterer Intervention Programs by follow-up
- Perceptions:
  - Rating of clarity of the legal process
  - Rating of fairness of judges and probation officials
  - Rating of perceived certainty and severity of future penalties
Recidivism

20 Example comparison in outcomes:
The chi-square test compares the JOD sample to the comparison sample; a sketch of this comparison follows below.
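The table from this slide is not reproduced in the transcript, but a chi-square comparison of a categorical outcome between the JOD and comparison samples would look roughly like this sketch; the counts are hypothetical.

```python
# Chi-square sketch: compare a categorical outcome (e.g., re-arrested for IPV,
# yes/no) between the JOD and comparison samples. Counts are hypothetical.
from scipy.stats import chi2_contingency

#            re-arrested  not re-arrested
observed = [[45, 255],   # JOD sample (n = 300)
            [68, 232]]   # comparison sample (n = 300)

chi2, p_value, dof, expected = chi2_contingency(observed)

print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.3f}")
```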

21 Judicial Oversight Demonstration: Results
JOD is feasible and may benefit the justice system.
JOD did not increase victim perceptions of safety or well-being.
JOD increased offender accountability, but not key perceptions related to future offending.
JOD reduced repeat IPV where offenders were incarcerated.
Batterer intervention programs did not reduce IPV.

22 Judicial Oversight Demonstration
What can we say about internal validity? What can we say about external validity?

23 Projects: What does your program do?
By what pathways do or will those activities affect outcomes?
What questions will your evaluation address?
Your job is to help each project in your group move forward.

