Improving Justice Sector Assessments

Insanity: doing the same thing over and over, while expecting the outcome to be different.

Project preparation
- Diagnostic/assessment
- Formulation of priorities for reform
- Design of the project: solutions crafted to meet the priority problems identified

Project implementation
- Support/oversight
- Monitoring and evaluation
- Feedback loop, learning, adjustments to the project
- Measuring outcomes and impact
- Recording lessons learned

What do we need to change?

TC:
- Develop new methodologies
- Define a research agenda
- Create centers for research and dissemination

LH:
- Consolidate information
- Build a knowledge base
- Disseminate better, and require that it be used

Why focus on diagnostics?
- Measuring impact requires a baseline.
- Changing to a problem focus (rather than capacity building) requires a definition of the problem.
- Understanding the political context of a reform requires analysis of who stands to benefit (and who stands to lose).
- It is hard to know where to start without an understanding of priorities.

Doing diagnostics differently can be part of building the knowledge base to change what we know about, and what we learn from, justice reforms, and perhaps even change their impact. The goal is reforms that are fact-based rather than purely assertion-based:
- an empirically based process to determine what is working, what is wrong, and what might work as a solution
- grounded in sound social-science methodology
- used to test and/or complement qualitative assertions

The Bank has been a leader in good diagnostic practice, both in general and in the justice area. Do we need to change our diagnostic process? How solid is Bank practice with respect to assessments and diagnostics in the justice sector? To get a sense of that, we commissioned an analysis of Bank diagnostics and held a workshop with Bank staff who have worked on diagnostics.

Key findings:
- Tap into the wealth of research capacity within the Bank (from the social sciences, DEC, etc.) regarding the diagnostic/research process.
- Place more emphasis on collaborative engagement with local actors.
- Improve the relevance, quality, and sophistication of the questions asked (and thus of the answers).

Key findings (continued):
- Be explicit about the choice of focus/scope, the underlying theory of what constitutes justice (i.e., what we are measuring the system against), and the tools used.
- Add some tools to the toolkit: surveys, observation of court process, review of court decisions, etc.
- Increase the empirical accuracy of descriptions, and analyze the problems found in terms of the factors that contribute to them.

Would it be possible to prepare a manual on assessments, building on the Bank's existing guidelines and experience? Project concept note reviewed at the end of

Methodology:
- Analyze existing assessments for good practices.
- Hold a workshop with Bank staff.
- Look for other manuals for diagnostics of systems that might be useful.
- In a relatively short timeframe, try writing an initial draft, to get a sense of what we know and don't know, and what team skills are needed to do this.

Methodology (continued):
- Put together a team pulling in Bank staff from various departments with expertise in aspects of diagnostics.
- Get regular input and review from Bank staff.
- Produce a manual, both in hard copy and online.
- Make it interactive, at least in the sense that we can continually take feedback and comments into account to improve the manual.

Some of the challenges in attempting a manual: different types of studies are appropriate for different purposes:
- A desk review, pulling together what is known and identifying what knowledge is lacking.
- A comprehensive look at the justice sector: where the major weaknesses and strengths lie, and the priorities for reform.
- An examination of one area of concern or interest.

(continued)
- Support for stakeholders in coming to a consensus on what is wrong, the priorities for change, and where it may be possible to bring about the desired change.

Can we write a manual appropriate to all the different sorts of diagnostic one might engage in? Can we write a manual appropriate to the different levels of expertise and experience that our target audience (i.e., people starting out to do assessments) will have?

If we review the list of types of diagnostics, each one, to be done well, requires an understanding of which aspects of the justice sector need to be examined in order to grasp how the sector is functioning. Each one also requires the assessor to know what sorts of questions, measurement methods, and likely sources of data exist, or can be created, to understand the aspects of the sector under review.

For example, the assessor may hear from all sides that the system is slow, that delay is an obstacle to access and a hindrance to the system's credibility.
- How slow is too slow? What is the standard?
- Which courts are slow?
- What kinds of cases?
- Which proceedings?

(continued)
- In what part of the process does delay occur: preparation or pre-trial? The hearing itself? Interlocutory appeals, continuances, or other procedural maneuvers? The time between hearing and judgment? Enforcement of the judgment?
- Who is reporting delay: judges, parties, lawyers?
- What incentives reinforce delay?
- What is causing delay?

Are these the relevant questions? Under what circumstances? What methods exist to get at answers to those questions, and how best to employ them? This is only a cursory review of some of the issues raised in analyzing delay; the manual's goal would be to identify key questions and methods for probing them, and to provide references to more detailed information about both.
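As a rough, hypothetical illustration of how some of these delay questions might be probed empirically, the sketch below assumes a case-level dataset with filing, hearing, judgment, and enforcement dates plus court and case-type fields (the file and column names are assumptions, not taken from the presentation) and computes simple stage-by-stage disposition times. It is one possible approach, not a prescribed Bank method.

```python
# Minimal sketch (hypothetical data and column names) of how case-level
# records could help answer "which courts are slow?", "what kinds of
# cases?", and "in what part of the process does delay occur?".
import pandas as pd

# Assumed columns: court, case_type, filed, first_hearing, judgment, enforced
cases = pd.read_csv(
    "cases.csv",
    parse_dates=["filed", "first_hearing", "judgment", "enforced"],
)

# Duration (in days) of each stage of the process.
cases["pretrial_days"] = (cases["first_hearing"] - cases["filed"]).dt.days
cases["decision_days"] = (cases["judgment"] - cases["first_hearing"]).dt.days
cases["enforcement_days"] = (cases["enforced"] - cases["judgment"]).dt.days
cases["total_days"] = (cases["judgment"] - cases["filed"]).dt.days

# Median and 90th-percentile total time, by court and case type,
# to see which courts and which kinds of cases are slow.
summary = (
    cases.groupby(["court", "case_type"])["total_days"]
    .agg(median="median", p90=lambda s: s.quantile(0.9), n="count")
    .sort_values("median", ascending=False)
)
print(summary)

# Median time spent in each stage, by court, to see where delay accumulates.
stage_cols = ["pretrial_days", "decision_days", "enforcement_days"]
print(cases.groupby("court")[stage_cols].median())
```

Comparing these distributions against whatever time standard the jurisdiction has set would then be one way to approach the "how slow is too slow?" question.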

If the manual achieves that, it can help produce diagnostics that provide:
- a first step toward forming a solid empirical basis for reform efforts,
- a first step toward establishing a baseline from which to measure change,
- and thus an important element in making justice reform a fact-based rather than a purely assertion-based enterprise.

Topic/Diagnostic stage — Substance — Stakeholders

1. Getting started
   Substance: Desk review, logistics, choosing between a general or narrow approach.
   Stakeholders: Initial outreach to fill gaps in the desk review and as a first step for the field assessment.

2. Identifying and selecting problems
   Substance: The most common problems to analyze, guidance on choosing among them, what factors to look at, and the risks in analyzing a particular theme.
   Stakeholders: Who to include as subjects of the diagnostic, and additional persons to be targeted by the diagnostic.

3. Tools/methods
   Substance: Checklists, survey techniques (pros and cons), short-cuts and best practices.
   Stakeholders: Who to interview, who to use on the diagnostic team, etc.

4. Report writing and analysis
   Substance: How to organize the report, different levels of analysis and description, techniques for presenting data.
   Stakeholders: Tailoring report content to have the strongest impact on the report's audience(s).

5. Dissemination
   Substance: How to disseminate and best reach the audience (political considerations).

[Figure: Assessment Manual "Iceberg" — 1. Getting started; 2. Identifying/selecting problems; 3. Tools/methods; 4. Report writing/analysis; 5. Dissemination]

Does this process of improving diagnostics sound useful and make sense to you? Would it be likely to have a positive impact on our project work and help fill the knowledge gap? Would you be willing to review and comment on the draft manual?