QUICK OVERVIEW: REDUCING REOFFENDING EVALUATION PACK - INTRODUCTION TO A 4-STEP EVALUATION APPROACH



WHAT IS THE EVALUATION RESOURCE PACK? The evaluation resource pack is a comprehensive, user-friendly slide show which provides step-by-step evaluation guidance and resources. It was produced by Justice Analytical Services at the Scottish Government to help evaluators and funders conduct better quality evaluations and assess the value of their interventions. The main purposes of the full version of the pack are to:
- Emphasise the importance of using the evidence base to design interventions.
- Promote a rigorous 4-step approach to evaluation which interventions of ALL SIZES and at ALL STAGES OF DEVELOPMENT can conduct.
- Help interventions carry out a realistic and rigorous alternative to impact evaluations, which are very difficult to do in Scotland.
- Describe how to structure an evaluation report, to increase consistency and quality in report writing.
- Advise funders on how to judge the merit of interventions in Scotland.

WHO IS THE EVALUATION RESOURCE PACK FOR? The evaluation resource pack is designed to support anyone commissioning or evaluating criminal justice interventions. In particular it should help:
Contractors and practitioners to:
- Evaluate criminal justice interventions using a robust 4-step method
- Structure an evaluation report
Funders to:
- Commission evaluations
- Judge the strength of evaluation reports
- Assess the value of interventions

WHY DO WE NEED AN EVALUATION RESOURCE PACK? THE BACKGROUND TO THE 4-STEP METHOD Assessing the impact of funded interventions in Scotland is difficult due to methodological constraints, but the pressure to show that interventions are effective has led to some poor and at times dubious evaluations. To see whether your intervention had an impact and made a real difference to users, you need to compare users with a randomised or matched control group who have the same risk of reoffending as the users. You also need large sample sizes and statistical testing of the outcomes. Very few evaluations are able to use this method, so we devised an alternative which mitigates the lack of impact information by emphasising the need to embed robust evidence from elsewhere and by evaluating the extent to which an intervention is evidence-based as part of the evaluation itself. The evaluation then collects data to test whether the intervention was implemented as intended and whether short- and medium-term outcomes materialised. This pack was devised to support evaluators in conducting this type of evaluation, which can be described in 4 steps:
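The comparison described above can be sketched as a simple statistical test. The pack itself does not prescribe a particular test, and all figures below are invented for illustration; a two-proportion z-test is one common way to compare reoffending rates between an intervention group and a matched control group:

```python
import math

def two_proportion_z_test(reoffend_a, n_a, reoffend_b, n_b):
    """Two-proportion z-test: did the intervention group (a) reoffend
    less often than a matched control group (b)? Illustrative only."""
    p_a, p_b = reoffend_a / n_a, reoffend_b / n_b
    # Pooled proportion under the null hypothesis of equal rates
    p_pool = (reoffend_a + reoffend_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # One-sided p-value for "intervention rate is lower": P(Z <= z),
    # using the identity Phi(z) = 0.5 * erfc(-z / sqrt(2))
    p_value = 0.5 * math.erfc(-z / math.sqrt(2))
    return z, p_value

# Hypothetical figures: 40 of 200 intervention users reoffended,
# versus 60 of 200 matched controls
z, p = two_proportion_z_test(40, 200, 60, 200)
print(f"z = {z:.2f}, one-sided p = {p:.4f}")
```

Note how quickly statistical power disappears with small samples: rerunning the same rates with 20 users per group gives a non-significant result, which is exactly why the pack argues that most Scottish interventions cannot rely on impact evaluation alone.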

THE 4 STEP APPROACH TO EVALUATION
Step 1 - Review the evidence: Interventions should be clearly structured and designed using robust evidence, so it is important to be familiar with the results from the 'what works' and desistance literature. This knowledge should be used to evaluate the extent to which the intervention is grounded in strong and consistent evidence.
Step 2 - Draw a logic model: Describe how your intervention works in practice by setting out the links between inputs, outputs and outcomes. The logic model forms the basis for evaluating the whole intervention, so it may provide better clues as to why an intervention achieved its outcomes or why it did not.
Step 3 - Identify indicators and collect monitoring data: Use the logic model to identify indicators for inputs, outputs and outcomes, and collect data using relevant methods. Put as much emphasis on describing and evaluating inputs as on outputs and outcomes. You could also be explicit about how much the intervention cost and how the funds were spent.
Step 4 - Evaluate the logic model: Analyse the data (and collect more if necessary) to find out the extent to which your intervention was evidence-based and whether it worked as the logic model predicted it would.
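As a rough illustration of Steps 2 and 3, a logic model can be thought of as a small data structure linking inputs, outputs and outcomes to the indicators that will evidence them. The sketch below is hypothetical; the class, field names and supervised-bail entries are invented for illustration and are not taken from the pack:

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """Minimal logic model: inputs, outputs and outcomes, each mapped
    to the indicator that will be used to evidence it (Step 3)."""
    inputs: dict = field(default_factory=dict)
    outputs: dict = field(default_factory=dict)
    outcomes: dict = field(default_factory=dict)

    def indicators(self):
        """List every indicator the evaluation must collect data for."""
        return [ind for part in (self.inputs, self.outputs, self.outcomes)
                for ind in part.values()]

# Hypothetical supervised-bail example
model = LogicModel(
    inputs={"2 bail officers": "staff hours recorded"},
    outputs={"weekly supervision meetings": "meetings held per client"},
    outcomes={"bail conditions kept": "% of clients completing bail period"},
)
print(model.indicators())
```

Writing the model down this explicitly makes Step 4 concrete: the evaluation checks each indicator in turn, so gaps between what the model predicted and what the monitoring data show point directly at the link in the chain that failed.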

CONTENTS OF THE FULL VERSION
The purpose of the pack
Background: Why a 4 step approach?
- The challenges of measuring impact in Scotland
- Impact evaluations – why are they so hard to do?
- Control and comparison groups – which characteristics need to be similar?
- So if measuring impact is tricky, what can we do?
The 4 step approach to evaluation
- 4 step approach to evaluation – overview
Step 1: Review the evidence
- What does the evidence say?
- Evidence summaries
- An example – an evidence-based justification for a fictitious intervention
Step 2: Draw a logic model showing how the service or intervention works
- What are logic models
- What logic models can do
- A very simple evidence-based model
- A logic model template to use
- Logic model column content – a quick guide
- The importance of designing a structured intervention
- An evidence-based logic model – reducing reoffending
- An example – the Reducing Reoffending evidence model
- An example – a simple supervised bail logic model
Step 3: Identifying indicators and collecting monitoring data
- Use the logic model to identify indicators
- Use the logic model to set evaluation questions and guide the collection of data
- Example: indicators for outputs/activities and outcomes
- Data collection
- Quantitative and qualitative data – uses, benefits and limitations
- Data capture and analysis
- An example data collection framework for a criminal justice intervention

CONTENTS CONTINUED...
Step 4: Evaluate logic model
- Test the logic model
- Measuring and reporting outcomes
- Measuring and reporting impact
- Caveats to measuring impact
- Subjective measures of impact
- A note on cost benefit analysis
Evaluation Report Structure
- Structure and content
Judging the worth of an intervention
- Assessing an evaluation report
- Example judging criteria matrix for a reducing reoffending intervention
- Features and advantages of a scoring system
Advantages and disadvantages of the 4 step logic model approach to evaluation
Helpful Resources