ISDS Service Support Performance – May 2019

ISDS Service Support Performance – May 2019

Incident Response – volumes in this graph:

  Month    Met    Breached
  Mar 19    820      48
  Apr 19    519      34
  May 19    739      54

Incident Resolution – volumes in this graph:

  Month    Met    Breached
  Mar 19   1863     181
  Apr 19   1354     150
  May 19   1877     205

Change Success Rate / Ticket Volumes by Channel / First Contact Resolution: see the following slides.

Customer Satisfaction:
  May-19 satisfaction: 97%
  May-19 responses: 236
  12-month average: 97%

ISDS Service Support Performance Dashboard Explained

Incident Response
Description: Time to assign an incident to a technician, based on the following targets:

  Priority      Time to Assign
  1 – Critical  15 mins
  2 – High      30 mins
  3 – Medium    2 hours
  4 – Low       1 day
  5 – Minor     2 days

Calculation: All ISDS teams (excluding the Service Desk) and all priorities as a combined total, showing whether we met or breached the agreed targets over the last 3 months.
Comments and data observations: Incident response is taken seriously by all teams and is consistently above the 90% SLA target each month. Every ISDS team comfortably met its SLA response targets this month.

Incident Resolution
Description: Time to resolve an incident, based on the following targets:

  Priority      Time to Resolve
  1 – Critical  2 hours
  2 – High      4 hours
  3 – Medium    1 day
  4 – Low       3 days
  5 – Minor     5 days

Calculation: All ISDS teams (including the Service Desk) and all priorities as a combined total, showing whether we met or breached the agreed targets over the last 3 months.
Comments and data observations: 205 cases breached their SLA; 53 belonged to Student Journey, which closed a significant number of aged cases. 9 were P2 teaching emergencies, part of the 94 breached cases within the Campus Teams.

Change Success Rate
Description: 13-month high-level view of whether IT changes were performed with no issues, some issues or major issues. The categorisation is based on the following mapping:

  Report Status                Actual Change Status
  Completed with no issues     Completed with no issues
  Minor issues or cancelled    Completed with minor issues; Cancelled
  Major issues or rolled back  Completed with major issues; Rolled back; Failed

Comments and data observations: 18 changes completed with no issues, 2 with minor issues, 1 with major issues, 1 was cancelled and 1 was rolled back, giving a success rate of 78% (18 of 23).
Note: The success rate % includes only changes completed with NO issues out of the total number of changes attempted. Excludes pre-authorised changes.
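The percentages above can be reproduced from the met/breached volumes in this report; a minimal sketch (the function names are my own, the figures are from the May 2019 slides):

```python
def sla_met_pct(met: int, breached: int) -> float:
    """Share of tickets that met their SLA target."""
    return 100 * met / (met + breached)

def change_success_rate(no_issues: int, total_attempted: int) -> float:
    """Only changes completed with NO issues count as successes."""
    return 100 * no_issues / total_attempted

# May 2019 figures from this report:
print(round(sla_met_pct(739, 54), 1))      # incident response -> 93.2
print(round(sla_met_pct(1877, 205), 1))    # incident resolution -> 90.2
print(round(change_success_rate(18, 23)))  # change success -> 78
```

Both SLA figures land above the 90% target noted in the commentary, and the change figure matches the reported 78%.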
Ticket Volumes by Channel
Description: 13-month view of the total number of cases in our ITSM tool processed by the IT Helpline per month, per channel.
Calculation: Includes all incidents, service requests and requests for information, plus cases recorded by NorMAN (our out-of-hours service).
Comments and data observations: The Service Desk processed 5227 cases during May. This is 547 more cases than in May 2018, representing an increase across all channels with the exception of Face-to-Face.

First Contact Resolution
Description: 13-month view of incidents versus service requests resolved by the Service Desk at first contact.
Calculation: Includes only incidents and service requests. Calculated as cases logged via phone or face-to-face and marked as resolved in our ITSM tool within 20 minutes of being logged. Excludes emails.
Comments and data observations: Total FCR was 54%, an increase of 2 percentage points from April 2019. The overall trend reflects long-term work with technical teams to increase FCR through Shift Left initiatives.

Customer Satisfaction
Description: Responses to our ongoing customer satisfaction survey at the end of each ticket resolution. Excludes face-to-face feedback cards.
Calculation: The number of customers who responded highly satisfied or satisfied, versus the number who responded unsatisfied or highly unsatisfied. Includes feedback for all ISDS teams.
Comments and data observations: All feedback received for PoB cases is analysed and shared with the relevant teams each month. Customers who were unsatisfied are contacted by the responsible team or the Service Management Office.
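The FCR rule above (phone and face-to-face cases only, resolved within 20 minutes, emails excluded) can be sketched as follows; the sample case data here is illustrative, not taken from the report:

```python
from datetime import timedelta

FCR_CHANNELS = {"phone", "face-to-face"}
FCR_WINDOW = timedelta(minutes=20)

def fcr_pct(cases):
    """cases: (channel, time_to_resolve) pairs.
    Only phone / face-to-face cases are eligible (emails excluded);
    a case counts as FCR if resolved within 20 minutes of logging."""
    eligible = [(ch, ttr) for ch, ttr in cases if ch in FCR_CHANNELS]
    if not eligible:
        return 0.0
    hits = sum(1 for _, ttr in eligible if ttr <= FCR_WINDOW)
    return 100 * hits / len(eligible)

sample = [
    ("phone", timedelta(minutes=5)),         # FCR
    ("phone", timedelta(hours=3)),           # escalated, not FCR
    ("email", timedelta(minutes=2)),         # excluded from the measure
    ("face-to-face", timedelta(minutes=15)), # FCR
]
print(round(fcr_pct(sample)))  # -> 67 (2 of 3 eligible cases)
```

Excluding email from the denominator matters: counting it here would shift the figure even though email cases are out of scope for FCR.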

ISDS Service Level Overview – Critical Services – May 2019

  Service                     Availability SLA   Result    Reliability   Outages
  Student Records Management  Not met            94.46%    Met           2
  Coursework                                     100%
  Timetabling
  Student Portal
  SAP Payroll
  Core Network
  Identity Management
  Data Centre Environments
  Moodle
  Turnitin
  Classroom Technology

Student Records Management comments:
  (1) U4SM overran its change window by 39.25 hours from 00:00 on 8th May. Users were informed they could access the system again at 15:16 on 9th May.
  (2) Intermittent QLS log-in issues for 2 hours from 08:10 on 10th May.

ISDS Service Level Performance Explained

Incident Prioritisation
Urgency (preconfigured in PoB):
  1 – Critical service
  2 – Essential service
  3 – Supporting service

Impact (selected manually):
  High – Global impact: affecting more than 80% of users of a service, OR with the potential to affect more than 1,000 people AND more than 80% of system functionality.
  Medium – Affecting 20%–80% of users of a service; affecting a critical piece of system functionality; affecting more than 10 people; teaching emergency.
  Low – Affecting fewer than 10 people; affecting non-critical functionality of the system; a single printer or single PC.

Availability Target
Availability measures the total uptime of the entire service: the time when the service can be used by users in accordance with the definition in the service catalogue. Availability target: 99.9%.

Reliability Target
Reliability focuses on minimising the number of outages the University experiences for a given service during designated periods, regardless of their duration. Target: fewer than 2 outages per month.

Availability and reliability reporting explained:
  - Only Business Critical services are included in the report.
  - Measured and reported monthly, calculated on a 24/7 basis.
  - Any downtime approved in advance via the IT Change Management process is excluded from these calculations.
  - Downtime is recorded and reported manually, i.e. not via an automated monitoring system.

Availability and reliability comments and data observations:
  - Work to produce a service catalogue is ongoing, to define the components of Business Critical services and help clarify uptime and downtime.
  - U4SM outage: there was a significant overrun of an approved change window. Users were advised not to attempt to sign in for the duration, making it an effective service outage. Work is ongoing through our Problem Management process to address the downtime associated with U4SM changes.
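Under the reporting rules above (measured 24/7, with only unapproved downtime counted), the 94.46% reported for Student Records Management can be reproduced; a minimal sketch, assuming the 39.25-hour U4SM overrun and the 2 hours of QLS log-in issues both count as unapproved downtime:

```python
def availability_pct(hours_in_period: float, downtime_hours: float) -> float:
    """Availability on a 24/7 basis. Downtime approved in advance via
    Change Management should be excluded before calling this."""
    return 100 * (hours_in_period - downtime_hours) / hours_in_period

may_hours = 31 * 24    # May 2019, measured 24/7 -> 744 hours
downtime = 39.25 + 2   # U4SM change-window overrun + intermittent QLS issues
print(round(availability_pct(may_hours, downtime), 2))  # -> 94.46
```

The result sits well below the 99.9% target, consistent with the "Not met" status in the service level overview.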