9-1-1 Quality Assurance & Performance Measurement Best Practices


1 9-1-1 Quality Assurance & Performance Measurement Best Practices

2 Introductions The State of Public Safety Communications Performance Measurement Standards, Benchmarks, Best Practices and Tools Quality Assurance Measurement Best Practices and Tools Questions & Answers

3 The State of Public Safety Communications

4 PSAP FirstNet /ESN NG9-1-1
PSAP at Center of Convergence Between NG9-1-1 & FirstNet Networks
Mobile LOCATION | Telematics & IoT DATA | Text-to-911 | IP Audio | VIDEO & Photos | STREAMING VIDEO | Situational Awareness | Mission-Critical Audio | Dispatch & Messaging | Mission-Critical Data | PSAP | NG9-1-1 | FirstNet/ESN
Together, the creation of FirstNet and the migration to NG9-1-1 are ushering in the next evolution of public safety communications, and with it big data. NG9-1-1 and FirstNet are two complementary parts of one whole: pillars of a more efficient and effective emergency response ecosystem that will enable the seamless exchange of broadband-rich multimedia communications between the public, 9-1-1 and first responders. The convergence of the FirstNet and NG9-1-1 networks at the PSAP will dramatically enhance public safety communications, but it will also introduce new complexities for PSAPs and for the 9-1-1 telecommunicators who will need to coalesce and manage far more data. NG9-1-1 and FirstNet are the two halves of public safety's request and response activities, with the PSAP, the emergency response nerve center, in the middle. In this broadband-enabled environment, the key to success will be the telecommunicators' ability to coalesce a wide array of new, potentially life-saving information, make sense of it, and share it with first responders in emergency situations. NG9-1-1 and FirstNet will flood PSAPs with many different types of new data that need to be received, processed and acted upon; PSAPs will become touchpoints for managing more types and greater volumes of multimedia information.
As the APCO P43 Report states “PSAPs of the future will be a nerve center, managing data-rich communications via broadband technology with callers and first responders.” FirstNet and NG911 will connect PSAPs and first responders to more sources of data for informed decision-making – for example, text messages, videos and photos from citizens; streaming audio and video from connected drones, surveillance cameras, first responder body cameras; automatic crash notifications and collision data (telematics), biometric data from wearable devices; various other digitally connected alarms and sensors; as well as databases that house building plans, hazmat information, and other vital information. INTELLIGENCE CENTER

5 Performance Measurement Standards & Benchmarks

6 Performance Measurement Industry Standards
NENA Call Answering Standard/Model Recommendation
90% of all calls arriving at the PSAP shall be answered within 10 seconds during the busy hour (i.e., the hour with the greatest call volume). 95% of all calls should be answered within 20 seconds.
NFPA® 1221 Standard for the Installation, Maintenance, and Use of Emergency Services Communications Systems
95% of alarms received on emergency lines shall be answered within 15 seconds, and 99% within 40 seconds. [7.4.1]
Where alarms are transferred from a primary to a secondary PSAP, the transfer procedure shall not exceed 30 seconds for 95% of all alarms processed. [7.4.4]
National Fire Protection Association Standards 1710 & 1221
CALEA Standards for Public Safety Communications Agencies
The agency has established performance measurements for processing times for all incoming emergency lines.
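As an illustration, the NENA and NFPA thresholds above can be checked against a PSAP's own answer-time data with a few lines of Python. This is a minimal sketch; the sample data is hypothetical:

```python
def answer_time_compliance(answer_times_secs, threshold_secs):
    """Return the percentage of calls answered within threshold_secs."""
    if not answer_times_secs:
        return 0.0
    within = sum(1 for t in answer_times_secs if t <= threshold_secs)
    return 100.0 * within / len(answer_times_secs)

# Hypothetical busy-hour sample of answer times, in seconds
times = [2, 3, 4, 5, 6, 8, 9, 11, 14, 22]

nena_10s = answer_time_compliance(times, 10)  # NENA: target 90% within 10 s
nena_20s = answer_time_compliance(times, 20)  # NENA: target 95% within 20 s
nfpa_15s = answer_time_compliance(times, 15)  # NFPA 1221: target 95% within 15 s
```

In this sample, only 70% of calls are answered within 10 seconds, so the PSAP would miss the NENA busy-hour benchmark even though it meets none of the thresholds by a wide margin either way; the same function works against a month of real call detail records.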

7 New Performance Measurement Standards in the Works
“Topics may include”…
# of calls
# of emergency/nonemergency incidents
# of police/fire/EMS calls handled
# of abandoned calls
# of calls transferred
Time to answer
Length of call
Wired/wireless/text/TDD/TTY and next-generation sessions
Trunk group
Customer satisfaction
Frequency of review

8 Chesterfield County (VA)
Call Answering Benchmarks
Answer time goals:
Sugar Land (TX): 95% within 10 secs
Chesterfield County (VA): 90% within 5 secs
Portland (OR): 90% within 20 secs
San Francisco (CA): 90% within 10 secs
Louisville (KY): <10% not answered within 10 seconds
NENA: 90% within 10 secs during the busy hour
NFPA: 95% within 15 secs, 99% within 40 seconds
Size:
Sugar Land (TX): 26 full-time, 3 part-time
Chesterfield County (VA): 597K (2010)
Portland (OR): 874K (2015)
San Francisco (CA): 900K (2014)
Louisville (KY): 1.523 million calls annually (2016)
Sources: National Emergency Number Association Call Answering Standard/Model Recommendation, Document at page 8 (June 10, 2006), Sugar Land (TX) Public Safety Dispatch 2016 Business Plan

9 Anatomy of a 9-1-1 Call “Hello” to “Hello”
Telephony Switch → Answer Queue → 9-1-1 Greeting → Call Taking → Transfer to Dispatch → Dispatch Queue → Units Dispatched → Units En Route → Units On Scene
Timing segments between milestones: Pick-up Time (call arrival to Call Pick-up), Call Taking Time (Call Pick-up to Incident Creation), Dispatching Time (Incident Creation to Dispatch Time), Travel Time (Dispatch Time to Unit On Scene).
There are important metrics along the entire anatomy of a call.
Source: NYC 911 Performance Reporting
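The timing segments in the anatomy above can be derived from milestone timestamps pulled from telephony and CAD records. A minimal sketch, with hypothetical field names and sample timestamps:

```python
from datetime import datetime

def segment_durations(timestamps):
    """Compute the call's timing segments, in seconds, from milestone
    timestamps (ISO-8601 strings). Field names are illustrative only."""
    t = {k: datetime.fromisoformat(v) for k, v in timestamps.items()}
    return {
        "pick_up_time":     (t["call_pickup"]      - t["call_arrival"]).total_seconds(),
        "call_taking_time": (t["incident_created"] - t["call_pickup"]).total_seconds(),
        "dispatch_time":    (t["units_dispatched"] - t["incident_created"]).total_seconds(),
        "travel_time":      (t["units_on_scene"]   - t["units_dispatched"]).total_seconds(),
    }

# Hypothetical call record
d = segment_durations({
    "call_arrival":     "2023-05-01T14:02:00",
    "call_pickup":      "2023-05-01T14:02:06",
    "incident_created": "2023-05-01T14:03:10",
    "units_dispatched": "2023-05-01T14:03:55",
    "units_on_scene":   "2023-05-01T14:09:40",
})
```

Aggregating these per-call segments over a month gives exactly the "hello to hello" breakdown the slide describes, without manual spreadsheet work.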

10 Other Important Performance Measures
NFPA® 1221 [7.4.7] Records of the dispatch of emergency response units to alarms shall be maintained in accordance with the records retention policy of the AHJ and shall identify the following:
Unit designation for each emergency response unit (ERU) dispatched
Time of dispatch acknowledgment by each ERU responding
En route time of each ERU
Time of arrival of each ERU at the scene
Time of patient contact, if applicable
Time each ERU is returned to service
National Fire Protection Association Standards 1710 & 1221

11 Other Examples of Performance Goals Measured
CALL HANDLING
Call Processing Times
Clark (WA) Regional Emergency Services Agency: 80 seconds (EMD calls)
Lincoln (NE) Emergency Communications/911 Center: avg. call duration of 70 seconds
Abandoned Calls
Chesterfield County (VA) Communications: abandoned calls exceeding 3 seconds should not exceed 10% of all emergency calls
Radio Dispatch
Sugar Land (TX): within 60 seconds of initiating
QUALITY ASSURANCE
Quality Assurance Score
Sugar Land (TX): 80%
# of Calls Evaluated
Sugar Land (TX): 7 random calls per operator per month
Maine Emergency Services Communication Bureau: EMD calls per week (per IAED requirements)
OTHER
Audio Recording Request Processing Time
Sugar Land (TX): 3 days
Citizen Complaints
Sugar Land (TX): 3 or fewer substantiated complaints regarding dispatch response
Chesterfield County (VA) Communications: 2% or less of received telephone calls
Sources: Sugar Land (TX) Public Safety Dispatch 2016 Business Plan; Recommendations for Establishing and Maintaining a Quality Assurance Program Related to PSAP Quality Assurance (MissionCritical Partners)

12 Other Performance Measures
DISPATCH / CALL HANDLING
EMD Responses per Population
# of incidents with multiple calls for the same incident
First Call Resolution
% Non-Emergency Calls
Transfers to other PSAPs
STAFFING / SATISFACTION
Citizen Satisfaction
Call Taker Satisfaction
Overtime as a % of Total Personnel Costs
QA / COMPLIANCE / CAD ENTRY
EMD case compliance levels at or above accreditation levels
Data entry error rate
Time to locate a requested call-for-service event number in the CAD system

13 Today’s Performance Reporting Challenges

14 Broadband = More Multimedia & Data to Manage & Report On
TEXT-TO-911 | SCREENS | CAD | TELEMATICS | TELEPHONY | VIDEO | RADIO | PHOTOS
NICE Inform is our NG9-1-1-ready solution: i3 SIP-based traffic through the ESInet is captured and stored, combined with legacy telephony and radio, and all of it is accessible through NICE Inform. NICE Inform provides the infrastructure to manage multimedia, making it a future-proof solution.
Fully compatible with NRX hardware; seamless upgrade of database and archives from NRX.
Support for Motorola ASTRO 7.18, Agent 511 SMS Text, NENA i3 Audio and SMS Text (i3-routed, SIP-converted), Telematics, EIDD, LogEvent (Audit Log)

15 Reporting Requirements
CALEA Standards for Public Safety Communications Agencies
A performance measurement methodology which addresses specific techniques for data collection, processing, data cleaning, and reporting.
Operational Records:
Call and dispatch performance statistics shall be compiled and maintained in accordance with Section 7.4. [12.5.1]
Statistical analysis for call and dispatch performance measurement shall be done monthly and compiled over a 1-year period. [12.5.2]
A management information system (MIS) program shall track incoming calls and dispatched alarms and provide real-time information and strategic management reports. [ ]
National Fire Protection Association Standards 1710 & 1221

16 How Most PSAP Performance Reporting is Still Done Today
Data from 9-1-1, CAD and MIS systems is manually cut, copied and pasted into reports at the beginning of the next month.

17 Performance Reporting Advancements

18 Add More Context to Your Recordings
Associate valuable CAD data with recorded communications: Incident ID, Incident Type, Incident Location, Incident Priority, Incident Status, CAD Events.
Volusia County tags Incident Number, Incident Type, and Incident Location (often the emergency is not where the caller is, so ANI/ALI doesn't help).
To give you an idea of the power of a CAD integration, consider an example: if you were standing behind a call taker, could you tell what type of call they're handling and whether you would want it to be evaluated? What if you could automatically tag all that data to the call recording? Now it's easy.

19 CAD Incident Analytics: Expedites Search & Reconstruction
Reconstruct Incidents & Complete Media Requests in Half the Time
CAD data associated with recordings: Incident ID / Report Number; Incident Type, Status, Priority, Location; Call Taker & Dispatcher Involved
SEARCH USING CAD DATA
NICE Inform CAD Integration & Incident Analytics saves time and improves the value of incident evidence. It eliminates unnecessary steps in searching for recordings as you respond to media requests for incident investigation. Instead of manually matching time ranges of recorded communications to CAD incident records, you now have everything you need in one place. Significant CAD system events that occurred over the course of each incident, such as dispatch of units or on-scene arrival, can be displayed on a timeline alongside synchronized media recordings: calls, radio transmissions, text-to-911 communications and screen video.
Examples of incident data accessed from CAD systems: Incident ID / Report Number, Incident Type, Incident Status, Incident Severity, Incident Date/Time, Call Taker and Dispatcher Involved, Incident Location, Comments and Other CAD Data
CAD DATA & EVENTS ON TIMELINE

20
“CAD Integration takes away the guesswork from trying to match CAD records to audio recordings – it does it for us automatically. It has cut our audio request processing time in half. Now we can get back to our other work.” – Karin Marquez, Communication Supervisor, Westminster Police Department, Colorado

21 CAD Details Displayed on Incident Reconstruction Timeline
Unit Dispatch | Unit En Route | Unit On Scene | Scene Clear | Operator A CAD Position
CAD integration enables:
Rapid reconstruction of an incident based on CAD incident ID
More complete incident view for investigations
Easy identification of discrepancies in operator performance: CAD input/update vs. radio communications
Most authentic incident reconstruction
Planned future release

22 Incident Intelligence Dashboards
Improve Operational Performance and Uncover Critical Insights Visualize all of your important metrics from: CAD Telephony / Call Taking Radio Quality Assurance DRILL THROUGH TO DETAILS MAPS & WORD CLOUDS

23 Empower Your Team with Performance Metric Updates
VIEW ON PC MONITORS & WALLBOARDS | MOBILE TABLETS
CALL HANDLING
911 Call Answer Times
Text-to-911 Response Times
Non-Emergency Answer Times
Call Duration / Processing
Transfers to other PSAPs
EVALUATIONS
Quality Assurance Score
# of QA Evaluations Completed per Call Taker
INCIDENTS / DISPATCH
Speed of Answer / Hold Time
Time to Dispatch
CAD Case Entry Time
# Abandoned Calls
Non-Emergency Call Hold Times
Transfer Conference Time
911 Calls per Hour
Non-Emergency Calls per Hour
Hello to Hello
CALL VOLUME
Emergency Calls
Non-Emergency Calls
Abandoned Calls
View by: OPERATOR | SHIFT | AGENCY | INCIDENT TYPE | PRIORITY
Supports multiple browsers and client operating systems, including Internet Explorer 11+ (Windows), Firefox (Windows), Chrome (Windows and Android), and Safari (Mac and iOS).
* Not all metrics supported in all environments

24 Quality Assurance Measurement Best Practices

25 Commitment to Quality Assurance
We’re committed to consistent Quality Assurance. We’ve partnered with the Denise Amber Lee Foundation to help educate public safety professionals on the importance of, and best practices for, formalized, ongoing Quality Assurance and Training (QA/QI) programs. Our goal is to see every PSAP employing Quality Assurance processes within 5 years.
Free E-Book

26 “All of the training in the world is useless if the professional telecommunicator isn’t being continuously monitored and reinforced for proper procedures.” APCO NENA QA/QI ANSI Standard Foreword

27 Highlights of APCO/NENA QA/QI Standard

28 Collaborative Effort APCO Denise Amber Lee Foundation IAED NENA
PowerPhone PSAP Leaders

29 Overview
Starting point for any size agency
No cost
Voluntary (not mandatory)
Easy to implement
Complete system for all call types
Vendor/product agnostic

30 Review the Whole Call, Not Just the Intake Piece
“One of the things that sets the new APCO/NENA QA standard apart is that it embraces QA for the entire call taking and call dispatch process. Typically, the quality assurance programs that are out there only cover the call intake piece. What we decided to do here is cover the whole thing, stem to stern. So we not only have processes in place for evaluating the three types of disciplines that come into our centers, we also have evaluation templates for the dispatch piece as well.” – Eric Parry, ENP
Perhaps no story illustrates more poignantly the critical need for a QA/QI program than that of Denise Amber Lee. On January 17, 2008, Denise was abducted from her home in North Port, Florida. In the hours that followed, Denise managed to dial 9-1-1 using her captor’s cell phone. At least four other 9-1-1 calls were placed, one from her distraught husband and three from eyewitnesses. But despite all the calls, no help was ever dispatched.

31 Position and Discipline Monitoring
Consistently administered and randomly selected review of recordings Call taking for police incidents Dispatching police incidents Call taking for fire incidents Dispatching fire incidents Call taking for EMS incidents Dispatching EMS incidents

32 APCO/NENA Quality Assurance & Improvement Standard
Review in the normal course of business:
At least 2% of all calls for service
When the 2% factor would not apply or would be overly burdensome due to low or excessively high call volumes, agencies must decide on realistic levels of case review.
All cases involving catastrophic loss and/or high-acuity events: as soon as possible after receipt of the call and/or following the radio dispatch, or at least within 5 days.
Any other call or event types as defined by your agency
Source: APCO/NENA ANS Standard for the Establishment of a Quality Assurance and Quality Improvement Program for Public Safety Answering Points.
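The 2% random-review target above lends itself to simple scripted selection rather than ad-hoc picking. A minimal sketch, with hypothetical call IDs, that enforces a floor of at least one review for low-volume agencies:

```python
import random

def select_for_qa(call_ids, pct=2.0, minimum=1, seed=None):
    """Randomly select pct% of calls for QA review, never fewer
    than `minimum` and never more than the calls available."""
    rng = random.Random(seed)
    n = max(minimum, round(len(call_ids) * pct / 100))
    n = min(n, len(call_ids))
    return rng.sample(call_ids, n)

# Hypothetical month of 1,000 calls for service
calls = [f"CALL-{i:05d}" for i in range(1, 1001)]
sample = select_for_qa(calls, pct=2.0, seed=42)  # 20 calls selected
```

Seeding the generator makes a given month's selection reproducible for audit purposes; catastrophic-loss and high-acuity cases would be pulled separately, per the standard, rather than left to chance.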

33 Quality Assurance Metric Goals
Addendum 1* of APCO/NENA QA/QI Standard
Consider starting the program with a lower threshold score of 80%. Once your staff members have become accustomed to this level of review, consider raising the bar every two months by 2% to 5%. Continue to raise the bar until management believes it is at the appropriate level. A level of 90% is recommended.
*Informative material and not a part of the American National Standard (ANS)
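The ramp-up Addendum 1 suggests (start at 80%, raise the bar every two months until reaching the recommended 90%) can be laid out as a simple schedule. A small sketch using a 2-point step; the step size and interval are tunable within the 2% to 5% range the addendum describes:

```python
def threshold_schedule(start=80.0, target=90.0, step=2.0, interval_months=2):
    """Return (month, passing_score) pairs, raising the QA threshold
    by `step` points every `interval_months` until `target` is reached."""
    month, score = 0, start
    schedule = [(month, score)]
    while score < target:
        month += interval_months
        score = min(score + step, target)
        schedule.append((month, score))
    return schedule

# 80% -> 90% in 2-point steps every two months
plan = threshold_schedule()
```

With these defaults the program reaches the recommended 90% threshold at month 10, giving staff five review cycles to adjust before the final bar is in place.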

34 Setting Up QA Forms
Question answer choices:
Yes/No
Refused (when the caller refuses to provide information)
Not Applicable
Tools: Sample QA evaluation templates for EMS, Fire and Police call taking and dispatching

35 12 Tips for Updating Your Quality Assurance Program
The APCO Project 43 ‘Broadband Implications for the PSAP’ Report offers the following 12 recommended updates to your QA/QI program:
1. Set clearly defined minimum standards and expectations for processing SMS/text-to-911 and multimedia/MMS calls. The QA/QI program must be understood by PSTs.
2. Update pre-scripted “interview” questions for each public safety discipline (police, fire, EMS).
3. Set minimum expectations for gathering critical criteria, particularly for callers sending multimedia information (address, callback telephone number, nature of emergency, etc.).
4. Establish new requirements for objective scoring categories and supporting standard evaluation guidelines for the handling of broadband information (below expectations, meets expectations, exceeds expectations, etc.).
5. Maintain a log of all incoming SMS/text-to-911 and multimedia/MMS calls which are subject to random or requested/special review in the QA program.

36 12 Tips for Updating Your Quality Assurance Program
6. Access and print transcripts of SMS/text-to-911, and record and store multimedia/MMS calls along with other associated information (CAD event, ANI/ALI data, etc.).
7. Review data, photos, videos, etc. associated with incidents to assess how this information was utilized by the PST.

37 12 Tips for Updating Your Quality Assurance Program
8. Provide QA evaluators with appropriate training for conducting reviews of SMS/text-to-911 and multimedia/MMS calls.
9. Establish timeline benchmarks for conducting QA reviews on SMS/text-to-911 calls and multimedia/MMS calls (e.g., weekly, monthly, etc.).
10. Establish an accountability process, training, performance improvement plans, and/or corrective action specific to SMS/text-to-911 and multimedia/MMS calls as required.
11. Align standard operating procedures (SOPs) with those areas identified for improvement so that the SOPs can be used in future training related to use of broadband technologies (in-service training, remedial training, training bulletins, etc.).
12. Implement or expand Critical Incident Stress Debriefing to address Post Traumatic Stress Disorder experienced by PSTs exposed to disturbing multimedia/MMS data.
New stress-induced PST issues: incident-related photos and video messaging will have a different and more disturbing impact on PSTs than voice calls (PTSD).

38 Getting Buy-In to Your QA Program
Explain the objectives of the program Clarify exactly how they’ll be monitored, what criteria they’ll be measured on, how evaluations will be conducted, how the data will be used, and why it matters to them Get them involved in QA form design Allow them to listen to their own calls and self-evaluate Phase the program in over a period of time, soliciting feedback and making adjustments along the way

39 Creative Ways to Reward Top Performers
Callout in an employee publication Opportunity to attend a regional or national APCO/NENA conference Letter of commendation from a supervisor Incorporate best practice calls into Training Curriculum Pat on the back in an employee meeting Preferred parking spot

40 Quality Assurance Advancements

41 Typical QA Workflow
DAY 1: Search through CAD records → sudden emergency, get distracted → return to search in Logging Recorder → burn recordings to CD and send to QA → QA evaluator scores call in spreadsheet → supervisor receives printout → call taker receives instruction (2 TO 4 WEEKS)
In the absence of integrations, this process looks something like this:
1. Search through CAD records to find the prescribed number of calls matching the incident types that must be evaluated, like heart attacks, house fires, or domestic violence in some counties, depending on local requirements.
2. Get distracted by a sudden emergency that requires your attention.
3. Come back, find your notes from the CAD search, and now search through call recordings in your logging recorder, hoping to match the dates and times you found in CAD to the records in your recorder interface.
4. Get distracted by urgent events.
5. Continue to interact with the call logger; several agencies reported that this takes as long as 2-3 hours per week.
6. Burn those recordings onto a CD and send them for QA evaluation.
7. The QA evaluator pulls up the recordings in a media player and finds the Excel spreadsheets with various categories of QA rating and protocol compliance questions. Numerous distractions are likely to occur, each requiring this step to be restarted.
8. The supervisor receives the printout of the QA evaluation with follow-up steps and coaching recommendations.
9. The call taker or dispatcher receives instruction, typically some 3-6 weeks after they took the call that was evaluated. Some employees receive this level of attention three times per month; others are lucky enough to get it weekly.
10. Due to the somewhat random nature of this process, inherent delays due to interruptions, and the “generalized” quality of the feedback mostly related to protocol compliance, employees benefit very little.

42 CAD-Driven Call Selection for Quality Assurance
Increase the Productivity, Objectivity & Consistency of Your Quality Assurance Program
EVALUATOR TO-DO LIST | QA FORM OPENS UP WITH INCIDENT RECORDING
High-priority incidents and call types, or a random sample of calls, automatically assigned to supervisors/QA staff for evaluation
Objective evaluation of adherence to standards and protocols
Eliminates hunting and pecking
Automated workflows can reduce QA time by 50%: 2X evaluator productivity

“We used to struggle to perform 90 QA checks a month. Now we are at 1,680 QA evaluations per month, and keep increasing.” – Training and QA Supervisor, Hamilton County 9-1-1, Tennessee
18X+ Evaluator Productivity

44 Hamilton County 9-1-1
Serves 26 agencies; 130+ telecommunicators answering 2,000 to 2,500 calls per day
Dispatches responders for law enforcement, fire and EMS
Reviews and rates 5 to 12 calls per dispatcher per month
Calls associated with one of 150 CAD incident types are automatically categorized for QA via business rules: 7% EMS, 7% FIRE, 3% POLICE
Challenge: A manual QA review and reporting process made it challenging to complete the appropriate number of monthly QA checks.
Solution: QA tasks are loaded and ready when a QA reviewer logs in, along with the required QA forms and automated notification back to the reviewed employee; CAD integration tags 150 incident types to recorded calls, and business rules automatically categorize and schedule 7% of EMS calls, 7% of Fire calls and 3% of Police calls for QA.
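Business-rule selection of this kind (a fixed review percentage per discipline) can be sketched as a simple random filter. This is an illustration only, not Hamilton County's actual system; the rate table mirrors the percentages described above, and the data model is hypothetical:

```python
import random

# Review rates per discipline, mirroring the percentages described above
REVIEW_RATES = {"EMS": 0.07, "FIRE": 0.07, "POLICE": 0.03}

def schedule_for_qa(calls, rates=REVIEW_RATES, seed=None):
    """Given (call_id, discipline) pairs, randomly flag each call for
    QA review at its discipline's configured rate. Disciplines with no
    configured rate are never selected."""
    rng = random.Random(seed)
    return [call_id for call_id, discipline in calls
            if rng.random() < rates.get(discipline, 0.0)]

# Hypothetical calls from one day's CAD export
day_calls = [("2023-00001", "EMS"), ("2023-00002", "POLICE"), ("2023-00003", "FIRE")]
to_review = schedule_for_qa(day_calls, seed=7)
```

Over a month of real volume, the law of large numbers keeps the reviewed share close to each configured rate, while per-call randomness keeps the selection unpredictable to staff.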

45 Screen Recording Enhances QA Reviews
Capture multiple workstation screens, configurable to either:
24 x 7 screen recording
Screen recording triggered by audio recording
Enhance Quality Assurance: comprehensive view of the entire incident; identifies gaps between what was said and what was done; the QA Scheduler will prefer calls with screen recordings
Liability Protection: proof of what took place from the operator's perspective; compare actions in CAD with communication over the radio
Optimize IT Application Workflow: spot system bottlenecks
Use screen recordings for training
OPERATOR WORKSTATION | REPLAY WORKSTATION
SSR 3.0 can be configured to record either all screens of a multi-screen workstation or only the primary screen.

46 AQUA Call Playback Recording Integration
Reduce the time it takes to review calls by 50%

47 APCO Adviser 9-1-1 Call Recording Playback Integration
Reduce the time it takes to review calls by 50%

48 NICE – Unmatched Experience
Public Safety Partnership
Founded: 1986 | NASDAQ: NICE | Revenue: $1 Billion+ | Employees: 5,500+ | Customers using NICE Inform: 3,000+ | Countries: ~150
12 out of 15 of the largest cities in the US & Canada
Market leader with the largest public safety R&D and support organizations: invented and patented VoIP recording; 80 dedicated R&D professionals; 75 NICE-certified implementation and support engineers, and dozens of regional partners.
NICE is a publicly listed (NASDAQ: NICE) company with more than 25K customers globally across multiple businesses. Our core competency is capturing customers' disparate data and then analyzing it to uncover new insights, something they didn't know before, that could help them solve a difficult problem or potentially even transform their business. Unmatched breadth and depth of human and financial resources and industry partnerships. 30+ years of implementing and supporting more than 4,000 public safety systems worldwide, including 12 of the 15 largest emergency communication centers in the US and Canada. Long-term product roadmap backed by 80 R&D professionals dedicated solely to public safety. We invented and patented VoIP recording: 1 million+ VoIP and 200K+ RoIP channels deployed.
Largest services and support organization dedicated to public safety. Rely on NICE's dedication to customer satisfaction, backed by a 30-year history of successful worldwide deployments and support of mission-critical recording systems: 75 NICE-certified implementation and support engineers, dozens of regional support partners in the NICE support ecosystem, and unified service and support with Motorola teams. Repeatable deployment model and proven support methods and processes. 90% of service tickets are resolved within 24 hours. Professional trainers ensure customers get the most value from their NICE solution.
The NICE Public Safety User Group (PUBNUG), Q3 2016, will be the largest user group for sharing PSAP recording, incident management and QA/QI best practices, adding additional value for NICE customers. “It’s clear that NICE Public Safety is executing well on all cylinders – product reliability and support have been outstanding. Keep up the great work.” – Diane Sanchez, Deputy Director of Emergency Communications, Harlingen Police Dept., Texas

49 Exclusive Customer User Group
Community Forum for sharing best practices, tips and tools
Training webinars, videos & resources
Searchable PUBNUG member database
Jobs board
500+ members
PUBLIC SAFETY NICE USER GROUP
Tools to prepare for and implement NG9-1-1, QA and FirstNet

50 Questions? We’re here to help.
YOUR NAME Your Phone Address

