
1 Improving Justice Sector Assessments

2 Insanity: doing the same thing over and over, while expecting the outcome to be different.

3 Project preparation:
- Diagnostic/assessment
- Formulation of priorities for reform
- Design of the project – solutions crafted to meet the priority problems identified
Project implementation:
- Support/oversight
- Monitoring and evaluation
- Feedback loop, learning, adjustments to the project
- Measuring outcomes and impact
- Recording lessons learned

4 What do we need to change?
TC:
- Develop new methodologies
- Define a research agenda
- Create centers for research and dissemination
LH:
- Consolidate information
- Build the knowledge base
- Disseminate better, and require that it be used

5 Why focus on diagnostics?
- Measuring impact requires a baseline.
- Changing to a problem focus (rather than capacity-building) requires a definition of the problem.
- Understanding the political context of a reform requires analysis of who stands to benefit (and lose).
- It's hard to know where to start without an understanding of priorities.

6 Doing diagnostics differently can be part of building the knowledge base to change what we know about justice reforms, what we learn from them, and perhaps even their impact. The goal is reforms that are fact-based rather than purely assertion-based:
- an empirically based process to determine what is working, what is wrong, and what might work as a solution
- based on sound social-science methodology
- used to test and/or complement qualitative assertions

7 The Bank has been a leader in good diagnostic practice, both in general and in the justice area. Do we need to change our diagnostic process? How solid is Bank practice with respect to assessments and diagnostics in the justice sector? To get a sense of that, we commissioned an analysis of Bank diagnostics and held a workshop with Bank staff who have worked on them.

8 Key findings:
- Tap into the wealth of research capacity within the Bank (the social sciences, DEC, etc.) regarding the diagnostic/research process.
- Place more emphasis on collaborative engagement with local actors.
- Improve the relevance, quality, and sophistication of the questions asked (and thus of the answers obtained).

9 (continued)
- Be explicit about the choice of focus/scope, the underlying theory of what constitutes justice (i.e., what we are measuring the system against), and the tools used.
- Add some tools to the toolkit: surveys, observation of court proceedings, review of court decisions, etc.
- Increase the empirical accuracy of descriptions, and analyze the problems found in terms of the factors that contribute to them.

10 Would it be possible to prepare a manual on assessments, building on the Bank's existing guidelines and experience? The project concept note was reviewed at the end of 2005.
Methodology:
- analyze existing assessments for good practices
- hold a workshop with Bank staff
- look for other manuals for diagnostics of systems that might be useful
- in a relatively short timeframe, try writing an initial draft, to get a sense of what we know and don't know and what team skills are needed to do this

11 (continued)
- put together a team drawing in Bank staff from various departments with expertise in aspects of diagnostics
- get regular input and review from Bank staff
- produce a manual, both in hard copy and online
- make it interactive, at least in the sense that feedback and comments are continually taken into account to improve the manual

12 Some of the challenges to attempting a manual: different types of studies are appropriate for different purposes:
- A desk review, pulling together what is known and identifying what knowledge is lacking
- A comprehensive look at the justice sector: where are the major weaknesses and strengths, and what are the priorities for reform?
- An examination of one area of concern or interest

13 (continued)
- Support for stakeholders in coming to a consensus on what is wrong, the priorities for change, and where it may be possible to bring about the desired change.
Can we write a manual appropriate to all the different sorts of diagnostic one might engage in? Can we write a manual appropriate to the different levels of expertise and experience that our target audience (i.e., people starting out to do assessments) will have?

14 Reviewing the list of types of diagnostics, each one, to be done well, requires an understanding of which aspects of the justice sector must be examined in order to grasp how the sector is functioning. Each one requires the assessor to know what sorts of questions, measurement methods, and likely sources of data exist, or can be created, to understand the aspects of the sector under review.

15 For example, the assessor may hear from all sides that the system is slow, that delay is an obstacle to access and a hindrance to the system's credibility.
- How slow is too slow? What's the standard?
- Which courts are slow?
- What kinds of cases?
- Which proceedings?

16 (continued)
- In what part of the process does delay occur: preparation or pre-trial? The hearing itself? Interlocutory appeals, continuances, or other procedural maneuvers? The time between hearing and judgment? Enforcement of the judgment?
- Who is reporting delay: judges, parties, lawyers?
- What incentives reinforce delay?
- What is causing delay?

17 Are these the relevant questions? Under what circumstances? What methods exist to get at answers to those questions, and how best to employ them? This is a cursory review of some of the issues raised in analyzing delay; the manual's goal would be to identify the key questions and the methods for probing them, and to provide references to more detailed information about both.
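To make the idea of probing such questions with data concrete, here is a minimal, hypothetical sketch (not part of the presentation) of how an assessor might summarize case-level registry data to see which courts are slow, for which kinds of cases, and how often cases exceed a time standard. The column names, the sample records, and the 365-day standard are illustrative assumptions only.

```python
# Hypothetical illustration only: summarizing case disposition times by court and
# case type. The data, column names, and 365-day standard are assumed for the example.
import pandas as pd

# Illustrative case-level records, as an assessor might compile them from court registries.
cases = pd.DataFrame({
    "court":     ["Civil Court A", "Civil Court A", "Civil Court B", "Civil Court B"],
    "case_type": ["contract", "land", "contract", "land"],
    "filed":     pd.to_datetime(["2004-01-10", "2004-02-01", "2004-01-15", "2004-03-20"]),
    "decided":   pd.to_datetime(["2005-06-30", "2004-11-15", "2004-07-01", "2006-01-10"]),
})

# Disposition time in days for each case.
cases["days_to_decision"] = (cases["decided"] - cases["filed"]).dt.days

STANDARD_DAYS = 365  # assumed benchmark: "how slow is too slow?" depends on the agreed standard

# Median disposition time and the share of cases exceeding the standard, by court and case type.
summary = cases.groupby(["court", "case_type"])["days_to_decision"].agg(
    median_days="median",
    share_over_standard=lambda s: (s > STANDARD_DAYS).mean(),
)
print(summary)
```

Even a small summary like this turns the assertion "the system is slow" into a statement about specific courts, case types, and a stated standard.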

18 If the manual achieves that, it can help produce diagnostics that provide:
- a first step toward forming a solid empirical basis for reform efforts,
- a first step toward establishing a baseline from which to measure change,
- and thus an important element in making justice reform a fact-based rather than a purely assertion-based enterprise.

19 Topic/Diagnostic Stage | Substance | Stakeholders
1. Getting started | Desk review, logistics, choosing between a general or narrow approach. | Initial outreach to fill gaps in the desk review and as a first step for the field assessment.
2. Identifying and selecting problems | The most common problems to analyze, guidance on choosing, what factors to look at, and the risks in analyzing a particular theme. | Who to include as subjects of the diagnostic, and additional persons to be targeted by it.
3. Tools/methods | Checklists, survey techniques (pros and cons), short-cuts, and best practices. | Who to interview, who to use on the diagnostic team, etc.
4. Report writing and analysis | How to organize the report, different levels of analysis and description, techniques for presenting data. | Tailoring report content to have the strongest impact on the report's audience(s).
5. Dissemination | How to disseminate and best reach the audience (political considerations). |

20 Assessment Manual "Iceberg" [diagram]: 1. Getting Started; 2. Identifying/Selecting Problems; 3. Tools/Methods; 4. Report Writing/Analysis; 5. Dissemination.

21 Is this process of improving diagnostics one that sounds useful and makes sense to you? Is it one that would be likely to have a positive impact on our project work and on filling the knowledge gap? Would you be willing to review and comment on the draft manual?

