HDI San Diego Chapter May 2012 Desktop Support Metrics: A Case Study Mike Russell V.P. Communications, SDHDI.

About the Author
Over 26 years in I.T. in San Diego
Experienced in Programming, Operations, Infrastructure and Support
12 years in Service Delivery Management
Specialized in Desktop Support Management and Device Deployment
Member of the San Diego HDI chapter for 5 years
Member of HDI Desktop Support Forums
Member of HDI Desktop Support Advisory Board

The Challenges of Developing Desktop Support Metrics
Analysis and development of Desktop Support standards are not as mature as Service Desk standards
–Different responsibilities require different processes
–Different processes require different metrics, which are difficult to capture
–History of using subjective rather than objective measures
Desktop Support staff have multiple inputs for service
–Mobile, dynamic and dispersed workforce
–Have a closer relationship with the customer base
–Receive escalations and triage requests from different departments in I.T.

Common Perceptions of Desktop Support
Perceptions of Desktop Support can vary widely:
–Customers feel the Desktop Support team is an extension or replacement of the Service Desk
–I.T. partners feel the Desktop Support team can be engaged at will for immediate assistance and may be an inexhaustible resource
–Project Managers feel the Desktop Support team is a replacement for project labor and may be an inexhaustible resource
–Executive Management does not fully understand the scope of work performed by Desktop Support
–Desktop Support analysts can feel misunderstood, underappreciated and overutilized

The Problem
I needed a way to accurately measure, analyze and market the services delivered by Desktop Support:
–Demonstrate Staff Productivity
–Measure Staff Effectiveness
–Measure Performance Quality
–Track Customer Satisfaction
–Illustrate the effects of bad changes
–Identify Opportunities for Service Improvement
–Demonstrate Improved Performance
–Improve Staff Satisfaction
–Market the value of Desktop Support to I.T. and Executive Management

Must-Have Measurements
Staff Effectiveness
Staff Productivity
–The ability to track tickets by:
 Incidents
 Service Requests
 Problems
 Changes
Quality
Customer Satisfaction

Staff Effectiveness
The effective use of time by staff
Actual time staff has available to work on issues
–Does not include meetings, breaks, non-ticketed project time, sick days, PTO, etc.
–May require process changes in time and attendance
Actual time spent on individual tickets
–Does not mean time received to time resolved
–May be in several chunks of time
–Will require manual best judgment by staff
–May require modification to your ticketing system
Actual Ticket Time / Available Time = Effectiveness

Staff Effectiveness Example:
Bobby works a standard 40-hour week (37.5 hours w/breaks)
Bobby attends 4 hours of meetings (37.5 – 4 = 33.5)
Bobby is sick one day (33.5 – 8 = 25.5)
Bobby documents 22.3 hours spent on ticket resolution
Bobby's Effectiveness is 22.3/25.5 = 87%
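The per-analyst calculation above can be sketched in a few lines of Python. This is a minimal illustration of the Actual Ticket Time / Available Time formula; the function name and parameter names are assumptions, not part of any ticketing system:

```python
def effectiveness(scheduled_hours, meeting_hours, absence_hours, ticket_hours):
    """Effectiveness = actual ticket time / actual available time."""
    available_hours = scheduled_hours - meeting_hours - absence_hours
    return ticket_hours / available_hours

# Bobby's week: 37.5 scheduled hours (after breaks), 4 hours of
# meetings, one 8-hour sick day, 22.3 hours logged on tickets.
rate = effectiveness(37.5, 4, 8, 22.3)
print(f"{rate:.0%}")  # → 87%
```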

Staff Effectiveness
Expect to see initial rates between 30% and 200%
–Low numbers indicate that staff may not be estimating times correctly or not reporting all issues
–High numbers may indicate duplicate tickets or a lack of understanding of what is being tracked
This can take 2–3 months to settle in with staff
–Review monthly with staff to find the cause of out-of-range effectiveness
–Do not assume staff are purposely misleading with stats
–Tip: do NOT show total time spent on tickets to staff at first (or possibly at all)
Industry standard: 80% Effectiveness Rating (48 minutes per hour)

Staff Productivity
You should already be tracking the number of tickets being closed per day or month
–Decide which metrics are related to productivity and customer satisfaction (e.g. initial response time, days to resolve)
–In your ticketing system, automatically classify tickets as incidents or requests
–Have the analyst resolving the ticket verify the CLOSING classification (do NOT default to the opening classification!)
–The closing analyst should document whether the ticket is related to a change or problem, and if so, which one
–Try using the time captured in the effectiveness metric to calculate tickets closed per working hour
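Putting the bullets above together — tallying tickets by their verified closing classification and dividing tickets closed by logged working hours — can be sketched as follows. The record fields here are hypothetical; real data would come from your ticketing system:

```python
from collections import Counter

# Hypothetical closed-ticket records: "closing_class" is the
# classification verified by the resolving analyst at close, and
# "hours" is the same logged time captured for the effectiveness metric.
closed = [
    {"closing_class": "incident", "hours": 1.5},
    {"closing_class": "request",  "hours": 0.5},
    {"closing_class": "incident", "hours": 2.0},
]

by_class = Counter(t["closing_class"] for t in closed)
ticket_hours = sum(t["hours"] for t in closed)
per_hour = len(closed) / ticket_hours  # tickets closed per working hour
print(dict(by_class), f"{per_hour:.2f} tickets/hour")
```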

Quality
Use a monthly quality survey to track SLA adherence and other factors critical to the delivery of superior support:
Customer Contact Time (SLA < 2 business hours)
Resolution Time (SLA < 8 business hours)
Work log information complete and correct
–Document all customer contacts, including names
–Should be clear enough that the CUSTOMER can understand it
Appointments Made and Kept
–If appointments are made, are they kept?
Asset Information Correct
Closing CTI Appropriate for the issue
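The two SLA targets in the survey can be checked automatically from ticket timestamps. The sketch below uses elapsed clock hours for simplicity; a real check would count business hours only, and the function names are assumptions:

```python
from datetime import datetime

def hours_between(start, end):
    """Elapsed clock hours between two timestamps."""
    return (end - start).total_seconds() / 3600

def meets_slas(opened, first_contact, resolved, contact_sla=2, resolve_sla=8):
    """True if the ticket met both SLA targets from the quality survey."""
    return (hours_between(opened, first_contact) < contact_sla
            and hours_between(opened, resolved) < resolve_sla)

opened = datetime(2012, 5, 1, 9, 0)
contacted = datetime(2012, 5, 1, 10, 15)  # 1.25 hours after opening
resolved = datetime(2012, 5, 1, 16, 0)    # 7 hours after opening
print(meets_slas(opened, contacted, resolved))  # → True
```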

Quality
Contains objective and subjective measurements
Measurement standards should be clear and documented
Scoring should not be performed by one individual
Sampling size needs to remain consistent
Because some subjective judgments must be made, staff members must have the ability to review and challenge the results
As a manager, you have the right and responsibility to adjust the results in order to remain fair

Customer Satisfaction
Send an automated survey to customers for each ticket
Expect a 5%–15% rate of return
Very low or very high return rates are a red flag, especially on an individual basis
Design reports so that customer satisfaction can be trended against other metrics (ticket volumes, time to respond, problems, projects, etc.)
Customer satisfaction transcends all levels of management, and can be the most important factor in the perception, success and survival of the desktop support team
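The 5%–15% red-flag band is simple to check programmatically. A minimal sketch, with the thresholds taken from the slide and the function names assumed:

```python
def survey_return_rate(surveys_returned, tickets_closed):
    """Fraction of closed tickets that produced a survey response."""
    return surveys_returned / tickets_closed

def return_rate_red_flag(rate, low=0.05, high=0.15):
    """Flag rates outside the expected 5%-15% band for follow-up."""
    return rate < low or rate > high

rate = survey_return_rate(12, 100)  # 12% return rate
print(return_rate_red_flag(rate))   # → False (within the band)
print(return_rate_red_flag(0.40))   # → True (suspiciously high)
```

Running the same check per analyst, not just per team, matches the slide's note that outliers matter most on an individual basis.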

Quick Review
Perceptions:
–Unlimited resources; unknown scope of services; always immediately available (not doing anything else); misunderstood, underappreciated, overutilized
Metrics driving the solution:
–Staff Productivity
–Staff Effectiveness
–Quality
–Customer Satisfaction
Putting this all together, what does it look like?

The Solution
The Balanced Team Scorecard
–Productivity metrics for Incidents, Service Requests, Problems and Changes (SLAs/OLAs)
–Average Team Effectiveness
–Average Quality Scores
–Average Customer Satisfaction
–Trending report for the last 12 months of Key Performance Indicators and SLAs
Subtext describing significant factors in changes
Distributed monthly to I.T. Management and Customers

What does the end result look like?

The Results
It works!
–Received praise from executive levels on the metrics reported
–Adopted as the standard for metrics reporting across the I.T. operations teams
–Received praise from staff, who felt recognized and valued in the organization
–Captures data that can be used for further research, e.g. cost of bad changes, most costly services
–Recognized as a best practice by HDI and presented at the 2012 National Conference in Orlando

Thank you! Questions? Mike Russell, I.T. Service Delivery Management