
0 Performance Management for Justice Information Sharing
David J. Roberts, Global Justice Consulting
Steve Prisoc, Chief Information Officer, New Mexico State Courts
Elizabeth Zwicker, Program Specialist, US Bureau of Justice Assistance
2006 BJA/SEARCH Regional Information Sharing Conference, March 27, 2006, Minneapolis, Minnesota

1 “Measurement is the first step that leads to control and eventually to improvement. If you can’t measure something, you can’t understand it. If you can’t understand it, you can’t control it. If you can’t control it, you can’t improve it.” —H. James Harrington

2 WHY evaluate performance?
Information is control
Provides feedback to improve program performance
Provides information for resource allocation
Enables effective planning
Tests generalizations based on experiences and assumptions
Helps market the program and develop support among funding bodies, constituents, and staff

3 Landscape of Performance Management
Investment appraisal and benefits realization: What is the actual investment we're making? How are benefits going to be collected and tracked?
Solid program management and tracking: Is the project on track? How do we ensure it remains on track?
Achievement of strategic objectives: Fundamentally, what is it that we're trying to achieve in our information sharing initiative?

4 Process vs. Impact Evaluations
Process evaluations focus on how the initiative was executed: the activities, efforts, and workflow associated with the response. Process evaluations ask whether the response occurred as planned and whether all components worked as intended. Fundamentally, a process evaluation poses the question, "Are we doing the thing right?"

Impact evaluations focus on the outcomes of the initiative: the outputs (products and services) and outcomes (results, accomplishments, impact). Did the problem decline or cease? And if so, was the response the proximate cause of the decline? Fundamentally, an impact evaluation poses the question, "Are we doing the right thing(s)?"

5 Balanced Scorecard
Originally developed in business by Kaplan & Norton
Financial – How do we look to stakeholders?
Customer – How well do we satisfy our internal and external customers' needs?
Internal Business Process – How well do we perform at key internal business processes?
Learning and Growth – Are we able to sustain innovation, change, and continuous improvement?

6 Balanced Scorecard for Law Enforcement (Mark Moore, et al)
Reduce criminal victimization
Call offenders to account
Reduce fear and enhance personal security
Guarantee safety in public spaces
Use financial resources fairly, efficiently, and effectively
Use force and authority fairly, efficiently, and effectively
Satisfy customer demands/achieve legitimacy with those policed

7 Trial Court Performance Standards
Access to Justice
Expedition and Timeliness
Equality, Fairness and Integrity
Independence and Accountability
Public Trust and Confidence

8 Corrections Performance
Security: drug use, significant incidents, community exposure…
Safety: …of inmates, …of staff, …of environment…
Order: inmate misconduct, use of force, perceived control…
Care: stress & illness, health care, dental care…
Activity: work & industry, education & training, religion…
Justice: staff fairness, use of force, grievances (# & type)…
Conditions: space, population density, freedom of movement…
Management: satisfaction, stress & burnout, turnover…

9 Universal IJIS Elements
Definition: The ability to access and share critical information at key decision points throughout the whole of the justice enterprise.
Scope: Recognition that the boundaries of the enterprise are increasingly elastic, engaging not only justice, but also emergency & disaster management, intelligence, homeland security, first responders, health & social services, private industry, the public, etc.
Goal: Get the right information to the right people, all of the time. This underscores the need for dynamic information exchange.

10 Information Sharing Objectives
What is the problem we're addressing? What information do we have regarding current levels of performance? What is it that we're trying to do?
Three universal objectives:
Improve public safety and homeland security
Enhance the quality and equality of justice
Gain operational efficiencies and effectiveness, and demonstrate return on investment (ROI)

11 Sample Public Safety Measures
Increase the percentage of court dispositions that can be matched to an arrest, which will improve the quality of the computerized criminal history records (a computation sketch follows this list)
Decrease the average response time to establish a positive identification following an arrest
Reduce the number of incidents of criminal records being associated with the wrong person
Reduce recidivism
Reduce the fear of crime in target neighborhoods
Decrease the amount of time it takes to serve a warrant
Decrease the amount of time it takes for law enforcement to obtain details on protection orders
Reduce the amount of time it takes users of the integrated justice system to respond to a request from the public
Reduce the time it takes to complete a criminal history background check
Reduce the number of agencies that can't communicate with each other
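As an illustration of the first measure above, here is a minimal Python sketch of a disposition-to-arrest match-rate computation. The record layouts and the arrest_id field are assumptions invented for the example, not an actual repository schema.

```python
# Hypothetical sketch: percentage of court dispositions that can be matched
# to an arrest. Field names (arrest_id, case) are invented for illustration.

def disposition_match_rate(dispositions, arrests):
    """Return the percent of dispositions whose arrest_id appears in arrests."""
    arrest_ids = {a["arrest_id"] for a in arrests}
    matched = sum(1 for d in dispositions if d.get("arrest_id") in arrest_ids)
    return 100.0 * matched / len(dispositions) if dispositions else 0.0

arrests = [{"arrest_id": "A-1001"}, {"arrest_id": "A-1002"}, {"arrest_id": "A-1003"}]
dispositions = [
    {"case": "C-1", "arrest_id": "A-1001"},
    {"case": "C-2", "arrest_id": "A-1002"},
    {"case": "C-3", "arrest_id": None},  # disposition with no matching arrest
]
print(f"Match rate: {disposition_match_rate(dispositions, arrests):.1f}%")  # 66.7%
```

Tracked over time, a rising match rate would indicate improving criminal history record quality.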

12 JNET: Improved Public Safety & Homeland Security
Notifications: timely notification of critical events (arrest, disposition, warrant, violation, death, etc.) supports offender accountability and increased public safety.
Confirmed notifications:
FY01/02: 3,645
FY02/03: 18,349
FY03/04: 29,980
FY04/05: 33,264
FY05/06: 46,424
Total: 178,339 confirmed notifications

13 Sample Quality of Justice Measures
Reduce the number of civilian complaints against local law enforcement
Reduce the number of continuances per case that result from scheduling conflicts between the courts, law enforcement, and prosecution
Reduce the number of cases without a next scheduled event
Reduce the average number of days or hours from arrest to arraignment
Reduce the average time a defendant is held while waiting for a bond decision
Reduce the time it takes for correctional facility intake
Reduce the number of days it takes to process cases from arrest to disposition
Reduce the number of false arrests
Reduce the amount of missing data

14 JNET: Improvement in the Quality of Justice
Improved decision making: providing the required information at key decision points in a timely, usable form
Traffic stop example:
 Who is this person? Positive identification (photo, SID, etc.)
 Is this person wanted? Outstanding warrants/wants
 Is this person a threat? Previous history of violent behavior, firearms, etc.
Enhanced overall data quality: reduction of errors; accurate and timely statistical reporting
Improved business processes: minimized offender processing time; reduction in "holding" time

15 Sample Efficiency/Effectiveness Measures
Reduce the number of hours that staff spend entering data manually or electronically
Reduce the costs of copying documents for justice organizations
Reduce the number of hours spent filing documents manually
Reduce the number of hours spent searching other governmental databases
Increase the number of law enforcement personnel performing community policing tasks instead of administrative tasks
Reduce the amount of missing information in criminal justice databases
Reduce the number of corrections needed in databases maintained by CJIS agencies
Decrease the number of warrants that never get entered into the state registry
Increase the number of query hits on each agency database
Reduce the number of hours it takes to enter a court disposition into the state criminal history repository

16 JNET: Efficient and Effective ROI
PennDOT (DMV) Certified Driver Histories via JNET
In 2003, PennDOT processed 157,840 certified driving history requests for local police, district attorneys, and the courts. One clear performance measure is highlighted by the dramatic reduction in processing costs for PennDOT. The personnel cost metric is based on the time required to process a paper copy of a driver history request, including the manual application of an embossed certification seal. PennDOT calculates its personnel cost at $1.50 per certified history processed; when a combined printing and mailing cost of $0.50 per copy is added, the total cost to manually generate a certified driver history comes to $2.00 per request. During August 2006, the 56,126 certified driving history requests processed via JNET saved PennDOT $112,252 in monthly operating expenses; only 4,767 were processed in the traditional fashion. PennDOT has reallocated personnel to support and process other areas of business, such as paid requests from individual citizens and pre-employment screeners.
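The slide's cost arithmetic can be reproduced in a few lines. The dollar figures come from the slide itself; the function is only an illustrative sketch, not JNET or PennDOT code.

```python
# Reproducing the slide's PennDOT cost figures; the code itself is a sketch.

PERSONNEL_COST = 1.50   # per certified history processed manually (from slide)
PRINT_MAIL_COST = 0.50  # combined printing and mailing cost per copy (from slide)
MANUAL_COST = PERSONNEL_COST + PRINT_MAIL_COST  # $2.00 per request

def monthly_savings(requests_via_jnet: int) -> float:
    """Operating expense avoided by serving requests electronically."""
    return requests_via_jnet * MANUAL_COST

print(f"${monthly_savings(56_126):,.2f}")  # $112,252.00, matching the slide
```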

17 Critical Assumptions
Baseline data exist regarding current or historical performance of the system
Access, ability, and willingness to capture data regarding ongoing performance
Timely, accurate, and complete data collection
Appropriate and sufficiently detailed analysis techniques
Staff to conduct the analysis and produce reports
Effective communication mechanisms to:
 Monitor ongoing baseline performance
 Constantly assess impact and operations
Political will and operational capacity to do something as a result of what the measures show!

18 Performance Dashboards
What we're NOT talking about: a static status display such as "The threat level in the airline sector is HIGH or Orange" (3/1/07)

19 What we ARE talking about…

20 Sample Performance Dashboard
A draft dashboard assessing performance on a series of dimensions that have been agreed upon by key decision makers. This requires effective data collection and routine reporting from operational systems in place throughout the County, and agreement that we're going to do something with the data in order to respond to critical performance elements.

21

22 Establishing a Performance Management Program
The Six Steps to Establishing a Performance-Based Management Program Source: Will Artley, DJ Ellison and Bill Kennedy, The Performance-Based Management Handbook, Volume 1: Establishing and Maintaining a Performance-Based Management Program (Washington, DC: U.S. Department of Energy, 2001)

23 Outcomes and performance measures
Outcomes are the benefits or results gained by reaching goals, achieving objectives, and resolving strategic issues
Performance measures are specific, measurable, time-bound expressions of future accomplishment that relate to goals, objectives, and strategic initiatives
Goals, objectives, and strategic initiatives should ideally lead to outcomes
Pragmatic performance measurement planners recognize that not all things that need to be measured can always be empirically linked to outcomes

24 Not all outcomes easily lend themselves to measurement
Everything that can be counted does not necessarily count; everything that counts cannot necessarily be counted. —Albert Einstein
It is important that performance measures be based on criteria that correspond to desired outcomes; however, it is often difficult or even impossible to obtain objective measures of certain key outcomes.

25 Program Logic Model and Chain of Events
A populated logic model, tracing a chain of events from program activity to final outcomes:
Program feature and activity: rap sheet information of appropriate scope, timeliness, accuracy, and ease of use available to the magistrate judge at the first court appearance/bond hearing
Initial outcome: greater use of rap sheet information when setting bail/bond and conditions of release
Intermediate outcome: more appropriate conditions of release, and establishment of bail/bond appropriate to both the arrest charges and the criminal history and past warrant information
Final outcomes: fewer crimes committed by those awaiting trial; fewer failures to appear; more timely disposition of criminal cases
Outcomes reached: enhanced justice process; positive influence on lessening the total number of crimes committed

26 Use scenario approach to reach agreement and define performance
Bring stakeholders together to reach consensus on the desired state of integration
Define the current state of integration (baseline)
Quantify the gap between the current state and the desired state
Define desired outcomes
Develop objectives and performance measures that can be linked to desired outcomes

27 Stakeholders must agree on performance measures in advance
Perceived performance is an agreed-upon construct
Criteria for defining performance should be negotiated by stakeholders (and the governing body) prior to developing measures
Stakeholders will value outcomes differently depending on their role within (or relative to) the justice enterprise

28 Characteristics of good measures
Measures link back to goals, objectives, and mission statements
Measures drive the right behavior from employees, partners, and consultants
Collecting data on measures is feasible and cost effective
Measures are understood by most employees
Measures lend themselves to being tracked on an ongoing basis, so that drops in performance can be detected while there is still time to do something about them
Measures represent aspects of performance that we can actually change

29 Performance measurement caveats
Most people (including your employees and consultants) can learn to make measures come out the way they think you want them to, without actually improving a process
Always question the measures you've defined, keeping in mind that the people applying them could find ways of boosting the measures without really improving anything
Test each measure to determine whether it operates as expected: does it always go one way when things get better, and the other way when things get worse? (A sketch of such a directional test follows.)
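Here is a minimal sketch of the directional test mentioned in the last point. The measure (average days from arrest to disposition) and the sample data are invented for the example: feed the measure a scenario known to be better and one known to be worse, and confirm it moves the right way.

```python
# Hypothetical directional sanity check for a measure. The measure and the
# sample case durations below are invented for illustration.

def avg_days_to_disposition(case_durations_days):
    """Average days from arrest to disposition across a set of cases."""
    return sum(case_durations_days) / len(case_durations_days)

better_scenario = [30, 45, 50]   # faster case processing (better)
worse_scenario = [90, 120, 150]  # slower case processing (worse)

# Lower is better for this measure, so "better" must score below "worse".
assert avg_days_to_disposition(better_scenario) < avg_days_to_disposition(worse_scenario)
print("Measure moves in the expected direction.")
```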

30 The Russian Nail
Manipulating a single metric allowed Soviet managers to appear successful even though their efforts did not always lead to expected outcomes. Success was typically measured by singular metrics of gross output, such as weight, quantity, square feet, or surface area. Gross output indicators played havoc with assortments, sizes, quality, etc., and frequently resulted in products like Khrushchev's chandeliers, so heavy "that they pull the ceilings down on our heads." A famous Soviet cartoon depicted the manager of a nail factory being given the Order of Lenin for exceeding his tonnage quota: two giant cranes were pictured holding up one giant nail. (Paul Craig Roberts, "My Time with Soviet Economics," The Independent Review, vol. VII, no. 2, Fall 2002, pp. 259–264.)

31 Behavior driven the wrong way
The Soviet Union wasted billions searching for oil because it rewarded drilling crews on the basis of the number of feet drilled. Because it is easier to drill many shallow wells than a few deep wells, drillers drilled lots of shallow wells, regardless of what was advisable geologically. A 1983 Chicago Sun-Times article reported that a Soviet hospital had turned away a seriously ill patient because "they were nearing their yearly quota for patient deaths—and would be criticized by authorities if they exceeded it."

32 Family of related measures
Produce x widgets per hour
Produce x widgets per hour without exceeding y dollars
Produce x widgets per hour without exceeding y dollars and with only one full-time employee
Produce x widgets per hour without exceeding y dollars, with only one full-time employee, and generating z units of waste
Produce x widgets per hour without exceeding y dollars, with only one full-time employee, generating z units of waste, and at a zero defect rate
Produce x widgets per hour without exceeding y dollars, with only one full-time employee, generating z units of waste, at a zero defect rate, and without contributing to global warming

33 The "widget" family
Number produced within a specified time period
Cost of producing widgets
People required
Waste generated
Defect rate
CO2 produced
(A data-structure sketch of this family of measures follows.)
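One way to read this slide is that the family of measures travels together: a single record carries every dimension, so no one metric can be optimized in isolation. The sketch below illustrates that idea; the field names and target thresholds are assumptions invented for the example, not values from the slide.

```python
# A sketch of the widget family of measures as one record. All thresholds
# in meets_targets() are invented placeholders.

from dataclasses import dataclass

@dataclass
class WidgetMeasures:
    produced_per_hour: int
    cost_dollars: float
    full_time_employees: int
    waste_units: float
    defect_rate: float  # fraction of output that is defective
    co2_kg: float

    def meets_targets(self) -> bool:
        """Every dimension must hold at once; hitting one while missing
        another does not count as success."""
        return (self.produced_per_hour >= 100
                and self.cost_dollars <= 500.0
                and self.full_time_employees <= 1
                and self.waste_units <= 5.0
                and self.defect_rate == 0.0
                and self.co2_kg <= 10.0)

print(WidgetMeasures(120, 450.0, 1, 3.0, 0.0, 8.0).meets_targets())  # True
```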

34 Specific Justice Example

35 Legislatively imposed measure
Number of calls from users requesting assistance (lower number indicates superior performance)

36 Replacement multi-dimensional measure
Measure: the length of time to resolve a call for service and the quality of the resolution, as measured by the following two dimensions:
1. Average time from the opening of a service ticket to the closing of a service ticket. JID will also report the median and standard deviation along with the average (a computation sketch follows below).
2. The quality of service as measured by regular user surveys designed to measure the quality of the service provided to the caller. Survey respondents are selected randomly.
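A minimal sketch of the measure's statistics, using Python's standard statistics module. The ticket durations and survey scores are invented sample values; nothing here is actual JID code.

```python
# Sketch of the replacement measure's reporting: average, median, and
# standard deviation of ticket resolution time, plus survey quality.
import statistics

ticket_hours = [2.5, 4.0, 1.5, 8.0, 3.0, 2.0]  # invented sample durations

print(f"average: {statistics.mean(ticket_hours):.2f} h")
print(f"median:  {statistics.median(ticket_hours):.2f} h")
print(f"std dev: {statistics.stdev(ticket_hours):.2f} h")

# Second dimension: quality scores from randomly sampled user surveys
# (1-5 scale, invented values).
survey_scores = [4, 5, 3, 4, 5]
print(f"mean survey score: {statistics.mean(survey_scores):.1f} / 5")
```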

37 Strategic Goal 3: Identify and recommend cost effective biometric identification applications
Objective 3.1: By September 2004, research, identify, and recommend technological applications that support biometrics for rapid identification.
Objective 3.2: By September 2004, research, identify, and evaluate the costs and benefits of biometric identification applications.
Outcomes:
• Increased knowledge of biometric technologies
• Improved cost-effective biometric identification solutions
Performance Measures:
• Number of research projects on biometric technological solutions completed by September 2004
• Number of research projects on costs and benefits of biometrics completed by September 2004
• Number of research reports presented to the Governing Body

38 Justice performance measures
Average law enforcement response time to calls for service for incidents involving a threat to citizen safety
Percent of arrest records in the state repository with final dispositions
Number of automated information exchanges within and between criminal justice agencies
Number of crimes cleared using AFIS system(s)
Number of arrests of wanted individuals resulting from the use of electronically available warrant and detainer information
Number of electronic checks of justice databases performed to identify high-risk individuals
Average time from arrest to final case disposition for felony arrests

39 What makes a performance measure effective?
First and foremost, to be effective a measure must give a person or group an incentive to change behavior in a way that actually improves things.
A performance measure should provide feedback to a person or group. Without feedback, no information is available on whether the target implied by the measure is being met.
A performance measure (or family of measures) should be precise and comprehensive, so as to prevent the measure from being met without actually leading to the expected outcomes.

40 Three-legged Stool
Strategic Planning
Project Management
Performance Management

41 The role of project plans
Project plans can augment a performance plan by ensuring that outputs are completed on time and on budget Rigorous project management can ensure that tasks are actually performed before they are measured. Project planning, along with strategic planning, is an essential adjunct to any performance management program.

42 There’s more to management than measurement
If you can't measure it, you can't manage it. —Peter Drucker
Drucker's saying has convinced some managers that measurement is management, which is a bit of an overstatement; nevertheless, measurement is one of the most powerful tools in the management toolbox.

43 Final points
If you don't monitor your performance, it will probably get worse.
You can't devise performance measures in a vacuum; you must involve stakeholders and measure what's valued.
Don't devise measures for which you lack data.
Performance measurement can be expensive and time consuming, so don't bother unless you intend to use the results to provide ongoing process feedback.
Errors in devising measures will lead to unexpected consequences.

44 So inscrutable is the arrangement of causes and consequences in this world that a two-penny duty on tea, unjustly imposed in a sequestered part of it, changes the condition of its inhabitants. —Thomas Jefferson

45 Performance Measurement BJA’s Perspective
Elizabeth Zwicker

46 Purposes: Performance Measures
Linking people and dollars to performance
Linking programs and resources to results
Justification of continued funding
Learning and management tools for us, for you

I'm here to tell you what performance measures are and why they're so important to us and to you, and to introduce myself as a resource for you today. What are performance measures? They're listed in the solicitation as the data to be collected from grantees. We're not collecting the data for a "gotcha" exercise; we're trying to use the information grantees gather to make better decisions, which will affect you at the state and local level. It's about collecting information to compile into "brag-worthy" reports for superiors and for those who make funding and programmatic decisions. Your success is our success: we want information so we can make good decisions to help you succeed. We may have to make some course corrections if the outcomes of the program aren't what we expected; maybe grantees need more TA. We need information before we can best help you. We understand that when you're in the field doing the work of the grant, it's not always easy to see the connection between reporting information and how that affects the program. It may seem like a circuitous route, but I'd like to explain the connection to you.

47 What Does BJA Do With the Data?
GPRA: Government Performance and Results Act
PART: Program Assessment Rating Tool
Budget formulation
MD&A: Management Discussion and Analysis

We're not asking for data for fun; we don't want to require you to report if we're not going to analyze or report the results, and you're not wasting time by ensuring the data is accurately collected. GPRA was enacted in 1993 to encourage greater accountability within the government for results; it ensures that programs are managed in ways that maximize performance, minimize cost, and achieve desired results. PART: starting in 2002, the Office of Management and Budget developed this tool to assess and improve performance. It is a set of questions that provides consistent analysis across government programs, links performance to budget decisions, and informs management actions and legislative proposals. For more information, see the White House's website or go to expectmore.gov. Programs that don't demonstrate results do not get funding recommendations; this is the justification for BJA's request for funding in the President's budget. MD&A: a concise description of mission, financial position, condition, and program performance, reported to Main Justice and the White House. Submitted data is subject to an audit during the annual DOJ-wide audit.

48 How You’ll Report Performance Measures
Via the semi-annual progress report submitted electronically via GMS (Grant Management System)
Due January 30 and July 30
Report only on grant-funded activities during the specified reporting period
Progress reports will not be accepted without complete data

The performance measures have their own section of GMS. The first 7 questions are the same for all programs and all states, BJA-wide; the following questions are program specific, and the answers must be contained in the answer fields. Do not attach a narrative to GMS in place of answering the questions. If a question does not apply, enter 0 and then explain in the narrative question; do not put letters in the answer fields. This will make it easier to interpret and aggregate the data. State Policy Advisors will review the reports; if a report is incomplete or in the wrong format, they'll click the Change Request button, which will send you an email requesting changes. Until your SPA is satisfied with your progress report, you will not be able to submit your next report, and if your report is delinquent, the funds will be frozen. We'll report results on our website and at our regional conferences, and release them to evaluators. The data belongs to the public, so once it is in a good format, we will release it.

49 Resources
Will Artley, DJ Ellison, and Bill Kennedy, The Performance-Based Management Handbook, Volume 1: Establishing and Maintaining a Performance-Based Management Program (Washington, DC: U.S. Department of Energy, 2001)
John E. Eck, Assessing Responses to Problems: An Introductory Guide for Police Problem-Solvers (Washington, DC: Center for Problem-Oriented Policing, no date)
Michael Geerken, The Art of Performance Measurement for Criminal Justice Information System Projects (Washington, DC: U.S. Department of Justice, Bureau of Justice Assistance, 2006 [forthcoming])
Robert H. Langworthy (ed.), Measuring What Matters: Proceedings from the Policing Research Institute Meetings (Washington, DC: NIJ/COPS, July 1999)
David J. Roberts, Law Enforcement Tech Guide: Creating Performance Measures that Work! A Guide for Law Enforcement Executives and Managers (Washington, DC: U.S. Department of Justice, Office of Community Oriented Policing Services, 2006)

