Performance Management for Justice Information Sharing David J. Roberts Global Justice Consulting Steve Prisoc Chief Information Officer New Mexico State Courts Elizabeth Zwicker Program Specialist US Bureau of Justice Assistance 2006 BJA/SEARCH Regional Information Sharing Conference March 27, 2007 Minneapolis, Minnesota

“Measurement is the first step that leads to control and eventually to improvement. If you can’t measure something, you can’t understand it. If you can’t understand it, you can’t control it. If you can’t control it, you can’t improve it.” —H. James Harrington

WHY evaluate performance?
• Information is control
• Provides feedback to improve program performance
• Provides information for resource allocation
• Enables effective planning
• Tests generalizations based on experiences and assumptions
• Builds and markets support among funding bodies, constituents, and staff

Landscape of Performance Management
• Investment appraisal and benefits realization: What is the actual investment we're making? How are benefits going to be collected and tracked?
• Solid program management and tracking: Is the project on track? How do we ensure it remains on track?
• Achievement of the strategic objectives: Fundamentally, what is it that we're trying to achieve in our information sharing initiative?

Process vs. Impact Evaluations
Process evaluations focus on how the initiative was executed: the activities, efforts, and workflow associated with the response. They ask whether the response occurred as planned and whether all components worked as intended. Fundamentally, a process evaluation asks, "Are we doing the thing right?"
Impact evaluations focus on what the initiative accomplished: the outputs (products and services) and outcomes (results, accomplishments, impact). Did the problem decline or cease? If so, was the response the proximate cause of the decline? Fundamentally, an impact evaluation asks, "Are we doing the right thing(s)?"

Balanced Scorecard
Originally developed for business by Kaplan & Norton, the balanced scorecard views performance from four perspectives:
• Financial: How do we look to stakeholders?
• Customer: How well do we satisfy our internal and external customers' needs?
• Internal Business Process: How well do we perform at key internal business processes?
• Learning and Growth: Are we able to sustain innovation, change, and continuous improvement?

Balanced Scorecard for Law Enforcement (Mark Moore, et al.)
• Reduce criminal victimization
• Call offenders to account
• Reduce fear and enhance personal security
• Guarantee safety in public spaces
• Use financial resources fairly, efficiently, and effectively
• Use force and authority fairly, efficiently, and effectively
• Satisfy customer demands/achieve legitimacy with those policed

Trial Court Performance Standards
• Access to Justice
• Expedition and Timeliness
• Equality, Fairness and Integrity
• Independence and Accountability
• Public Trust and Confidence

Corrections Performance
• Security: drug use, significant incidents, community exposure...
• Safety: ...of inmates, ...of staff, ...of environment...
• Order: inmate misconduct, use of force, perceived control...
• Care: stress & illness, health care, dental care...
• Activity: work & industry, education & training, religion...
• Justice: staff fairness, use of force, grievances (# & type)...
• Conditions: space, population density, freedom of movement...
• Management: satisfaction, stress & burnout, turnover...

Universal IJIS Elements
• Definition: The ability to access and share critical information at key decision points throughout the whole of the justice enterprise.
• Scope: Recognition that the boundaries of the enterprise are increasingly elastic, engaging not only justice but also emergency & disaster management, intelligence, homeland security, first responders, health & social services, private industry, the public, etc.
• Goal: Get the right information to the right people, all of the time, which underscores the need for dynamic information exchange.

Information Sharing Objectives
• What is the problem we're addressing?
• What information do we have regarding current levels of performance?
• What is it that we're trying to do?
Three universal objectives:
1. Improve public safety and homeland security
2. Enhance the quality and equality of justice
3. Gain operational efficiencies and effectiveness, and demonstrate return on investment (ROI)

Sample Public Safety Measures
• Increase the percentage of court dispositions that can be matched to an arrest, improving the quality of the computerized criminal history records
• Decrease the average response time to establish a positive identification following an arrest
• Reduce the number of incidents of criminal records being associated with the wrong person
• Reduce recidivism
• Reduce the fear of crime in target neighborhoods
• Decrease the amount of time it takes to serve a warrant
• Decrease the amount of time for law enforcement to have details on protection orders
• Reduce the amount of time it takes users of the integrated justice system to respond to a request from the public
• Reduce the time it takes to complete a criminal history background check
• Reduce the number of agencies that can't communicate with each other

JNET: Improved Public Safety & Homeland Security
Notifications: timely notification of critical events (arrest, disposition, warrant, violation, death, etc.) supports offender accountability and increased public safety.
Confirmed notifications:
FY01/02: 3,645
FY02/03: 18,349
FY03/04: 29,980
FY04/05: 33,264
FY05/06: 46,424
Total = 178,339 confirmed notifications

Sample Quality of Justice Measures
• Reduce the number of civilian complaints against local law enforcement
• Reduce the number of continuances per case that result from scheduling conflicts between the courts, law enforcement, and prosecution
• Reduce the number of cases without a next scheduled event
• Reduce the average number of days or hours from arrest to arraignment
• Reduce the average time a defendant is held while waiting for a bond decision
• Reduce the time it takes for correctional facility intake
• Reduce the number of days it takes to process cases from arrest to disposition
• Reduce the number of false arrests
• Reduce the amount of missing data

JNET: Improvement in the Quality of Justice
Improved decision making: providing the required information at key decision points in a timely, usable form. For example, at a traffic stop:
• Who is this person? Positive identification (photo, SID, etc.)
• Is this person wanted? Outstanding warrants/wants
• Is this person a threat? Previous history of violent behavior, firearms, etc.
Enhanced overall data quality: reduction of errors; accurate and timely statistical reporting.
Improved business processes: minimized offender processing time; reduction in "holding" time.

Sample Efficiency/Effectiveness Measures
• Reduce the number of hours that staff spend entering data manually or electronically
• Reduce the costs of copying documents for justice organizations
• Reduce the number of hours spent filing documents manually
• Reduce the number of hours spent searching other governmental databases
• Increase the number of law enforcement personnel performing community policing tasks instead of administrative tasks
• Reduce the amount of missing information in criminal justice databases
• Reduce the number of corrections needed in databases maintained by CJIS agencies
• Decrease the number of warrants that never get entered into the state registry
• Increase the number of query hits on each agency database
• Reduce the number of hours it takes to enter a court disposition into the state criminal history repository

JNET: Efficient and Effective ROI
PennDOT (DMV) Certified Driver Histories via JNET
In 2003, PennDOT processed 157,840 certified driving history requests for local police, district attorneys, and the courts. One clear performance measure is the dramatic reduction in processing costs for PennDOT. The personnel cost metric is based on the time required to process a paper copy of a driver history request, including the manual application of an embossed certification seal. PennDOT calculates its personnel cost at $1.50 per certified history processed; when a combined printing and mailing cost of $0.50 per copy is added, the total cost to manually generate a certified driver history is $2.00 per request. During August 2006, the 56,126 certified driving history requests processed by JNET saved PennDOT $112,252 in monthly operating expenses; only 4,767 were processed in the traditional fashion. PennDOT has reallocated personnel to support other lines of business, such as "paid" requests from individual citizens and pre-employment screeners.
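As a sanity check, the slide's cost-avoidance arithmetic can be reproduced in a few lines; the function name and structure are illustrative, while the dollar figures and request count are those quoted above:

```python
# Sketch: reproducing the PennDOT cost-savings arithmetic from the slide.
# The function name is illustrative; the figures come from the slide text.

def monthly_savings(requests_via_jnet, personnel_cost, print_mail_cost):
    """Cost avoided by handling certified driver-history requests
    electronically instead of on paper."""
    cost_per_manual_request = personnel_cost + print_mail_cost
    return requests_via_jnet * cost_per_manual_request

# August 2006: $1.50 personnel + $0.50 printing/mailing per manual request
savings = monthly_savings(56_126, 1.50, 0.50)
print(f"${savings:,.0f}")  # $112,252
```

The 56,126 electronic requests times $2.00 per avoided manual request yields exactly the $112,252 monthly savings cited.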

Critical Assumptions
• Baseline data exist regarding current or historical performance of the system
• Access, ability, and willingness to capture data regarding ongoing performance
• Timely, accurate, and complete data collection
• Appropriate and sufficiently detailed analysis techniques
• Staff to conduct the analysis and produce reports
• Effective communication mechanisms to monitor ongoing baseline performance and constantly assess impact and operations
• Political will and operational capacity to do something as a result of what the measures show!

Performance Dashboards
What we're NOT talking about: "The threat level in the airline sector is HIGH or Orange" (3/1/07)

What we ARE talking about…

Sample Performance Dashboard
A draft dashboard assessing performance on a series of dimensions that have been agreed upon by key decision makers. This requires effective data collection and routine reporting from operational systems in place throughout the county, and agreement that we're going to act on the data in order to respond to critical performance elements.
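A minimal sketch of how such a dashboard might roll agreed-upon thresholds into a red/yellow/green status; the measure names, threshold values, and function are illustrative, not drawn from any actual county system:

```python
# Sketch of a simple red/yellow/green dashboard roll-up. Each measure carries
# agreed-upon target and warning thresholds (all values here are invented).

def status(value, target, warning, higher_is_better=True):
    """Classify one measure against its agreed thresholds."""
    if not higher_is_better:
        # Negate so the same comparison logic handles "lower is better"
        value, target, warning = -value, -target, -warning
    if value >= target:
        return "green"
    if value >= warning:
        return "yellow"
    return "red"

# (value, target, warning, higher_is_better) -- illustrative figures only
measures = {
    "dispositions matched to arrest (%)": (92, 95, 85, True),
    "days from arrest to arraignment":    (4,  2,  5,  False),
}

for name, (value, target, warning, hib) in measures.items():
    print(f"{name}: {status(value, target, warning, hib)}")
```

The point of the sketch is the precondition the slide names: a status light is only meaningful once decision makers have negotiated the thresholds behind it.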

Establishing a Performance Management Program
The six steps to establishing a performance-based management program. Source: Will Artley, D.J. Ellison, and Bill Kennedy, The Performance-Based Management Handbook, Volume 1: Establishing and Maintaining a Performance-Based Management Program (Washington, DC: U.S. Department of Energy, 2001).

Outcomes and performance measures
• Outcomes are the benefits or results gained by reaching goals, achieving objectives, and resolving strategic issues
• Performance measures are specific, measurable, time-bound expressions of future accomplishment that relate to goals, objectives, and strategic initiatives
• Goals, objectives, and strategic initiatives should ideally lead to outcomes
• Pragmatic performance measurement planners recognize that not all things that need to be measured can be empirically linked to outcomes

Not all outcomes easily lend themselves to measurement
"Everything that can be counted does not necessarily count; everything that counts cannot necessarily be counted." —Albert Einstein
It is important that performance measures be based on criteria that correspond to desired outcomes; however, it is often difficult or even impossible to obtain objective measures of certain key outcomes.

Program Logic Model and Chain of Events (populated logic model)
• Program feature and activity: Rap sheet information of appropriate scope, timeliness, accuracy, and ease of use available to the magistrate judge at first court appearance/bond hearing
• Initial outcome: Greater use of rap sheet information when setting bail/bond and conditions of release
• Intermediate outcome: More appropriate conditions of release, and bail/bond appropriate to both the arrest charges and the criminal history and past warrant information
• Final outcomes: (1) fewer crimes committed by those awaiting trial; (2) fewer failures to appear; (3) more timely disposition of criminal cases
• Measures reached: enhanced justice process; positive influence on lessening the total number of crimes committed

Use a scenario approach to reach agreement and define performance
• Bring stakeholders together to reach consensus on the desired state of integration
• Define the current state of integration (baseline)
• Quantify the gap between current state and desired state
• Define desired outcomes
• Develop objectives and performance measures that can be linked to desired outcomes

Stakeholders must agree on performance measures in advance
• Perceived performance is an agreed-upon construct
• Criteria for defining performance should be negotiated by stakeholders (and the governing body) prior to developing measures
• Stakeholders will value outcomes differently depending on their role within (or relative to) the justice enterprise

Characteristics of good measures
• Measures link back to goals, objectives, and mission statements
• Measures drive the right behavior from employees, partners, and consultants
• Collecting data on measures is feasible and cost-effective
• Measures are understood by most employees
• Measures lend themselves to being tracked on an ongoing basis, so that drops in performance can be detected while there is still time to do something about them
• Measures represent aspects of performance that we can actually change

Performance measurement caveats
• Most people (including your employees and consultants) can learn to make measures come out the way they think you want them to, without actually improving a process
• Always question the measures you've defined, keeping in mind that the people applying them could find ways of boosting the measures without really improving anything
• Test each measure to determine whether it operates as expected: does it always go one way when things get better and the other way when things get worse?

The Russian Nail
Manipulating a single metric allowed Soviet managers to appear successful even when their efforts did not lead to the expected outcomes. Success was typically measured by a single metric of gross output, such as weight, quantity, square feet, or surface area. Gross output indicators played havoc with assortments, sizes, quality, etc., and frequently resulted in products like Khrushchev's chandeliers, so heavy "that they pull the ceilings down on our heads." A famous Soviet cartoon depicted the manager of a nail factory being given the Order of Lenin for exceeding his tonnage quota: two giant cranes were pictured holding up one giant nail.
Source: Paul Craig Roberts, "My Time with Soviet Economics," The Independent Review, v. VII, n. 2 (Fall 2002), pp. 259–264.

Behavior driven the wrong way
The Soviet Union wasted billions searching for oil because it rewarded drilling crews on the basis of the number of feet drilled. Because it is easier to drill many shallow wells than a few deep ones, drillers drilled lots of shallow wells, regardless of what was advisable geologically. A 1983 Chicago Sun-Times article reported that a Soviet hospital had turned away a seriously ill patient because "they were nearing their yearly quota for patient deaths—and would be criticized by authorities if they exceeded it."

Family of related measures
1. Produce x widgets per hour
2. Produce x widgets per hour without exceeding y dollars
3. Produce x widgets per hour without exceeding y dollars with only one full-time employee
4. Produce x widgets per hour without exceeding y dollars with only one full-time employee and generating z units of waste
5. Produce x widgets per hour without exceeding y dollars with only one full-time employee, generating z units of waste, and at a zero defect rate
6. Produce x widgets per hour without exceeding y dollars with only one full-time employee, generating z units of waste, at a zero defect rate, and without contributing to global warming

The "widget" family
• Number produced within a specified time period
• Cost of producing widgets
• People required
• Waste generated
• Defect rate
• CO2 produced
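One way to operationalize such a family is to carry all of the dimensions as a single record and require every one to be met, so that no single metric can be gamed in isolation. A minimal sketch, with field names and limits that are purely illustrative:

```python
# Sketch: the "widget" family of measures as one record. A production run
# succeeds only if every measure in the family meets its limit (all limits
# below are invented for illustration).

from dataclasses import dataclass

@dataclass
class WidgetRun:
    widgets_per_hour: float
    cost_dollars: float
    full_time_staff: int
    waste_units: float
    defect_rate: float
    co2_kg: float

def meets_targets(run, min_rate=100, max_cost=500, max_staff=1,
                  max_waste=2.0, max_defect_rate=0.0, max_co2=10.0):
    """True only when every dimension of the family is satisfied."""
    return (run.widgets_per_hour >= min_rate
            and run.cost_dollars <= max_cost
            and run.full_time_staff <= max_staff
            and run.waste_units <= max_waste
            and run.defect_rate <= max_defect_rate
            and run.co2_kg <= max_co2)

run = WidgetRun(120, 450, 1, 1.5, 0.0, 8.0)
print(meets_targets(run))  # True
```

A run that exceeds the cost limit fails the whole family even if its hourly output is excellent, which is exactly the behavior the Russian-nail examples argue for.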

Specific Justice Example

Legislatively imposed measure
Number of calls from users requesting assistance (a lower number indicates superior performance)

Replacement multi-dimensional measure
Measure: the length of time to resolve a call for service, and the quality of service call resolution, as measured by the following two dimensions:
1. Average time from the opening of a service ticket to the closing of a service ticket. JID will also report the median and standard deviation along with the average.
2. Quality of service as measured by regular user surveys designed to gauge the quality of the service provided to the caller. Survey respondents are selected randomly.
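The first dimension can be computed directly with the standard library; the resolution times below are invented sample data, not figures from JID:

```python
# Sketch: average, median, and standard deviation of ticket-resolution
# times, as the replacement measure requires. Data are illustrative.

import statistics

resolution_hours = [2.0, 3.5, 1.0, 8.0, 2.5, 4.0, 3.0]

avg = statistics.mean(resolution_hours)
med = statistics.median(resolution_hours)
sd  = statistics.stdev(resolution_hours)   # sample standard deviation

print(f"mean={avg:.2f}h median={med:.2f}h stdev={sd:.2f}h")
```

Reporting the median and standard deviation alongside the mean matters here: a few very long tickets (like the 8-hour outlier above) can pull the average well away from the typical caller's experience.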

Strategic Goal 3: Identify and recommend cost-effective biometric identification applications
Objective 3.1: By September 2004, research, identify, and recommend technological applications that support biometrics for rapid identification.
Objective 3.2: By September 2004, research, identify, and evaluate the costs and benefits of biometric identification applications.
Outcomes:
• Increased knowledge of biometric technologies
• Improved cost-effective biometric identification solutions
Performance measures:
• Number of research projects on biometric technological solutions completed by September 2004
• Number of research projects on costs and benefits of biometrics completed by September 2004
• Number of research reports presented to the Governing Body

Justice performance measures
• Average law enforcement response time to calls for service for incidents involving a threat to citizen safety
• Percent of arrest records in the state repository with final dispositions
• Number of automated information exchanges within and between criminal justice agencies
• Number of crimes cleared using AFIS system(s)
• Number of arrests of wanted individuals resulting from the use of electronically available warrant and detainer information
• Number of electronic checks of justice databases performed to identify high-risk individuals
• Average time from arrest to final case disposition for felony arrests

What makes a performance measure effective?
• First and foremost, an effective measure must give a person or group an incentive to change behavior in such a way that things really improve.
• A performance measure should provide feedback to a person or group. Without feedback, no information is available on whether the target implied by the measure is being met.
• A performance measure (or family of measures) should be precise and comprehensive, so as to prevent the measure being met without actually producing the expected outcomes.

Three-legged Stool
• Strategic Planning
• Project Management
• Performance Management

The role of project plans
• Project plans can augment a performance plan by ensuring that outputs are completed on time and on budget
• Rigorous project management can ensure that tasks are actually performed before they are measured
• Project planning, along with strategic planning, is an essential adjunct to any performance management program

There's more to management than measurement
"If you can't measure it, you can't manage it." —Peter Drucker
Drucker's saying has convinced some managers that measurement is management, which is an overstatement; measurement is, however, one of the most powerful tools in the management toolbox.

Final points
• If you don't monitor your performance, it will probably get worse.
• You can't devise performance measures in a vacuum; involve stakeholders and measure what's valued.
• Don't devise measures for which you lack data.
• Performance measurement can be expensive and time-consuming, so don't bother unless you intend to use the results to provide ongoing process feedback.
• Errors in devising measures will lead to unexpected consequences.

"So inscrutable is the arrangement of causes and consequences in this world that a two-penny duty on tea, unjustly imposed in a sequestered part of it, changes the condition of its inhabitants." —Thomas Jefferson

Performance Measurement: BJA's Perspective
Elizabeth Zwicker

Purposes: Performance Measures
• Linking people and dollars to performance
• Linking programs and resources to results
• Justification of continued funding
• Learning and management tools, for us and for you

I'm here to tell you what performance measures are and why they're so important to us and to you, and to introduce myself as a resource for you today. What are performance measures? They're listed in the solicitation as the data to be collected from grantees. We're not collecting the data for a "gotcha" exercise; we're trying to use the information grantees gather to make better decisions, which will affect you at the state and local level. It's about collecting information to compile into "brag-worthy" reports for superiors and for those who make funding and programmatic decisions. Your success is our success: we want information that lets us make good decisions to help you succeed. We may have to make some course corrections if the outcomes of the program aren't what we expected; maybe grantees need more TA, etc. We need information before we can best help you. We understand that when you're in the field doing the work of the grant, it's not always easy to see the connection between reporting information and how that affects the program. It may seem like a circuitous route, but I'd like to explain the connection to you.

What Does BJA Do With the Data?
• GPRA: Government Performance and Results Act
• PART: Program Assessment Rating Tool (www.expectmore.gov)
• Budget formulation
• MD&A: Management Discussion and Analysis

We're not asking for data for fun; we don't want to require you to report if we're not going to analyze or report the results, so you're not wasting time by ensuring the data are accurately collected. GPRA was enacted in 1993 to encourage greater accountability within the government for results; it ensures that programs are managed in ways that maximize performance, minimize cost, and achieve desired results. PART: starting in 2002, the Office of Management and Budget developed this tool to assess and improve performance. It is a set of questions that provides consistent analysis across government programs, links performance to budget decisions, and informs management actions and legislative proposals; for more information, see the White House's website or go to expectmore.gov. Programs that don't demonstrate results do not get funding recommendations, and the data justify BJA's request for funding in the President's budget. MD&A: a concise description of mission, financial position, condition, and program performance, reported to Main Justice and the White House. Data submitted are subject to an audit during the annual DOJ-wide audit.

How You'll Report Performance Measures
• Via the semi-annual progress report submitted electronically via GMS (Grants Management System)
• Due January 30 and July 30
• Report only on grant-funded activities during the specified reporting period
• Progress reports will not be accepted without complete data

The performance measures have their own section of GMS. The first seven questions are the same for all programs and all states, BJA-wide. The following questions are program-specific, and the answers must be contained in the answer fields. Do not attach a narrative to GMS in place of answering the questions. If a question does not apply, enter 0 and then explain in the narrative question; do not put letters in the answer fields. This will make it easier to interpret and aggregate the data. State Policy Advisors will review the reports and, if a report is incomplete or in the wrong format, click the Change Request button, which will send you an email requesting changes. Until your SPA is satisfied with your progress report, you will not be able to submit your next report. If your report is delinquent, the funds will be frozen. We'll report results on our website and at our regional conferences, and release them to evaluators. The data belong to the public, so once they are in a good format, we will release them.

Resources
• Will Artley, D.J. Ellison, and Bill Kennedy, The Performance-Based Management Handbook, Volume 1: Establishing and Maintaining a Performance-Based Management Program (Washington, DC: U.S. Department of Energy, 2001), at http://www.orau.gov/pbm/pbmhandbook/pbmhandbook.html
• John E. Eck, Assessing Responses to Problems: An Introductory Guide for Police Problem-Solvers (Washington, DC: Center for Problem-Oriented Policing, no date), at http://www.popcenter.org/Tools/tool-assessing.htm
• Michael Geerken, The Art of Performance Measurement for Criminal Justice Information System Projects (Washington, DC: U.S. Department of Justice, Bureau of Justice Assistance, 2006 [forthcoming])
• Robert H. Langworthy (ed.), Measuring What Matters: Proceedings from the Policing Research Institute Meetings (Washington, DC: NIJ/COPS, July 1999, NCJ 170610), pp. 37–53
• David J. Roberts, Law Enforcement Tech Guide: Creating Performance Measures that Work! A Guide for Law Enforcement Executives and Managers (Washington, DC: U.S. Department of Justice, Office of Community Oriented Policing Services, 2006), at http://www.cops.usdoj.gov/mime/open.pdf?Item=1968