Performance Dashboards A Common Sense Approach


1 Performance Dashboards A Common Sense Approach
Eddie Vidal Measurement is important because it puts vague concepts in context; it is simply not enough to say you want to deliver quality service—you must define it before you can know if you’re succeeding. In this session, Eddie Vidal will introduce the tools and templates you’ll need to grade and rank your service desk analysts in eight different categories (based on the same program he implemented at the University of Miami). Since all of these numbers and measurements will get you nowhere without the buy-in and contribution of the analysts you are measuring, Eddie will walk attendees through the steps he followed to gain that buy-in, and provide you with a starting point for implementing analyst dashboards in your organization and improving the quality of service you provide to your customers.

2 Objectives
Common Sense Approach – Keeping it simple
Useful information – Why it's important to use metrics
Obtain buy-in
Create a professional development program
We spend our whole lives being measured. How much did you weigh at birth? What was your GPA? What score did you earn on your SAT? As a manager, how do you measure yourself? Many measurements are subjective, which can lead to differences of opinion. This presentation will walk through the steps taken at the University of Miami and help you apply them to your organization. You will be able to clearly define the expectations of your team, track individual and team performance, recognize top contributors, and obtain consistent performance from your team members on a daily basis. Eddie will walk you through the steps he navigated and provide you with a starting point to implement an analyst dashboard within your organization and improve the quality of service provided to your customers. Take ideas from this presentation back to your organization, be able to apply the templates right away, and share our approach.

3 Setting Expectations
Do you know what is expected of you? If you knew, would you do your job better? If you knew the results of your work?
Know your strengths
Work on weaknesses
Praise, praise, praise
Do you have goals set by your management? If you knew what was expected of you, wouldn't you do a better job? When was the last time you received a nod or a pat on the back from your manager for a job well done? Is the feedback specific?

4 Family
We set expectations for our family and kids.
Young kids: please and thank you
Teenagers: participate in family outings and dinners, no phones at dinner, be respectful of adults, make smart choices, school work, report cards, grades

5 Common Sense Approach – Why?
Recognize top performers
Demonstrate that management cares
Our goal is to achieve better morale, treat each team member fairly, and obtain consistent performance on a daily basis
Failure is an option
When I started in 2006, there were no metrics. Morale was down. We couldn't communicate how well we were doing. There were too many generalizations: "customers are unhappy." We held 1-on-1 meetings at first, but there was not much discussion about work-related issues.

6 Common Sense Approach – Why?
Specify required performance levels
Track individual and team performance
Plan for headcount
Allocate resources
Justify promotions and salary increases
Example: during a Microsoft Exchange upgrade that affected 15,000 users, the Service Desk said they were getting killed. When asked to pull reports on open incidents, there were only 40. Why the low number? Maybe there weren't that many incidents, or maybe the team did not open incidents for all the calls.

7 Common Sense Approach
Researched best practices, contacted ITSM peers, and used HDI Focus Books and the HDI Research Corner
Several revisions
Involved and gained acceptance from the team
Obtained buy-in from management
I started thinking about creating a dashboard after attending my first HDI conference in 2007 at Mandalay Bay, Las Vegas, NV. Changing the culture is not easy; I'm one person and there are 15,000 employees. Scripps example with call volume and resource scheduling.

8 Acceptance
As a team member of the IT Support Center, I have participated, provided feedback, and helped develop the measurements used for our annual review and recognition plan. I hereby acknowledge that I have read and understand the IT Support Center measurement procedures. By signing, I acknowledge and agree to the criteria by which I will be measured and understand what is expected of me.
_________________ Employee Signature      ____________________ Print Name
_________________ Authorized Signature    ____________________ Print Name
_________________ Date                    _____________________ Date

9 Service Desk Analyst Employee of the Month Spotlight on Success
Reward: must reach a score of 90% or higher
One employee eligible per calendar month*
*If there is a tie, the employee entering the most service requests and incidents is the winner.
Ever receive a letter or a card from someone? Where do you keep it now?

10
Year   Phone Calls   Incidents   Service Requests
1      26,344        9,830       5,044
2      35,922        15,008      5,339
3      40,719        25,447      6,076
4      40,270        27,791      5,832

11 What is Measured?
Call Monitoring – 15%
Incident Tracking – 15%
Average Talk Time – 10%
Percent Available/Logged-in Time – 10%
First Call Resolution – 10%
Percent of Service Requests Entered – 15%
Percent of Team Calls Answered – 10%
Service Request/Incident Tracking Accuracy – 15%
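These eight weights sum to 100%, so an analyst's overall grade is a weighted average of the category scores. A minimal sketch of that roll-up, assuming each category is first normalized to a 0–100 score (the category keys below are shorthand, not the deck's exact labels):

```python
# Sketch of the overall roll-up: eight category scores (assumed 0-100 each)
# combined with the weights from this slide. Category keys are shorthand.
WEIGHTS = {
    "call_monitoring": 0.15,
    "incident_tracking": 0.15,
    "average_talk_time": 0.10,
    "available_logged_in_time": 0.10,
    "first_call_resolution": 0.10,
    "service_requests_entered": 0.15,
    "team_calls_answered": 0.10,
    "tracking_accuracy": 0.15,
}
assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # weights cover exactly 100%

def overall_score(category_scores: dict[str, float]) -> float:
    """Weighted average of the per-category scores (each 0-100)."""
    return sum(WEIGHTS[name] * category_scores[name] for name in WEIGHTS)

# An analyst scoring 90 in every category earns 90 overall,
# the bar set for Employee of the Month on the earlier slide.
print(overall_score({name: 90.0 for name in WEIGHTS}))  # 90.0
```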

12 Metrics – Call Tracking
Percent of incidents entered based on total calls answered. Example: 75 incidents entered / 100 calls received = 75%.
Weight: 15%. Goal: 70%.
70% or higher = 15 points
50% to 69% = 12 points
40% to 49% = 9 points
30% to 39% = 6 points
20% to 29% = 3 points
Below 20% = 0 points
Why would this metric ever be over 100%? Example: an analyst enters incidents from the day before because they couldn't enter them that day. Did the analysts understand the impact? Educate them! We changed the name to call tracking; it allows us to build a repository of what is going on. We may want to clarify this as trouble tickets.
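A minimal sketch of this tier scoring, assuming the point bands above apply to the percentage of answered calls that were logged:

```python
# Hypothetical helper mapping the call-tracking percentage to points,
# using the bands from this slide: (minimum percent, points awarded).
TIERS = [(70, 15), (50, 12), (40, 9), (30, 6), (20, 3)]

def call_tracking_points(incidents_entered: int, calls_answered: int) -> int:
    pct = 100 * incidents_entered / calls_answered
    for minimum, points in TIERS:
        if pct >= minimum:
            return points
    return 0  # below 20%

# The slide's example: 75 incidents entered / 100 calls received = 75% -> 15 points.
print(call_tracking_points(75, 100))  # 15
```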

13 Why Do We Track Incidents?
To build a repository that identifies customer training and education needs
To build self-help solutions that allow customers to resolve many issues with less impact on the support staff – Level 0 support
Leads to Problem, Change, and Knowledge Management
To migrate from a dispatch center to a true support center
Call avoidance – allowing customers to help themselves before contacting the support center frees up more time for difficult issues

14 Service Requests/Incident Accuracy
Weight: 15%. Goal: 95% accuracy.
Criteria used for grading: location. With over 50 buildings at UM, we need to ensure the incident information is accurate.

15 Service Request/Incident Tracking Accuracy
Has the customer been contacted within 24 hours?
Are work notes and comments user friendly? Does the customer understand them? Was the customer kept in the loop? Was customer sign-off obtained?
Time: how long do you think this takes? The grading is manually completed by students. At FIU it was a 48-hour turnaround.
The first report our team received scored 65%, and the boss was not happy. I had to create a plan for how I was going to improve this; it took us three months to bring the score up to 90+%. We start each meeting the way The Five Dysfunctions of a Team suggests: what are our goals?
Grading questions: Has the customer been contacted within 24 hours? Are diary entries user friendly; would the customer understand the ticket if they read it? Was the customer kept in the loop? Was customer sign-off obtained (3 attempts in 10 business days)?
Kept the customer in the loop? How is this determined? If the customer needs to call the Support Center for the status of their service request, do they really know what the status is? Would that phone call be required?
1) Are the diary entries updated to state when the next expected entry or work is to occur with the user?
2) Is the customer informed when the next visit or work on their service request is to be performed?
3) If you have set an appointment with the customer in advance, has it been entered in the service request?
4) If the customer was not reachable when contacted, is there an entry stating the means of communication, for example via email, voicemail, or in person?
5) Is there a statement noting the user was informed by leaving a service receipt form or a voicemail?
6) If you answered yes to all of the questions, then it's a yes for keeping the customer in the loop. If the answer is no to any one of them, the customer service objective has not been met.
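A minimal sketch of how the kept-in-the-loop check could be recorded, assuming each question above is captured as a yes/no answer on the ticket; the field names are illustrative, not from the deck:

```python
# Illustrative checklist for the "kept in the loop" objective described above:
# the objective is met only when every question is answered yes.
LOOP_QUESTIONS = [
    "next_work_date_noted_in_diary",
    "customer_told_when_next_visit_occurs",
    "advance_appointments_entered_in_request",
    "contact_attempts_documented",          # email, voicemail, or in person
    "service_receipt_or_voicemail_left",
]

def kept_in_loop(answers: dict[str, bool]) -> bool:
    """True only if every loop question is answered yes; one no fails the objective."""
    return all(answers.get(question, False) for question in LOOP_QUESTIONS)

# Example: one missing answer means the customer service objective was not met.
print(kept_in_loop({q: True for q in LOOP_QUESTIONS[:-1]}))  # False
```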

16 Ticket Evaluation Template
Eddie

17 Common Sense Approach - Scoring
Subjective: maybe, not sure, hmm, I think so
Objective: yes, no

18 Percent of Calls Answered / Percent of Service Requests Entered
Are users calling the published number? Do you have one analyst answering most of the calls? Explain why voice mail was removed. Stress to customers that they must call 6565; this allows us to properly track all calls into our ACD.
Based on 7 employees: >14% = 10 points, > % = 8 points, 8–10% = 6 points, <8% = 0 points
Answering 33% of all calls incoming on the ACD for 3 employees, with a range of +/- 5%
Answering 25% of all calls incoming on the ACD for 4 employees, with a range of +/- 5%
Answering 20% of all calls incoming on the ACD for 5 employees, with a range of +/- 5%
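The per-analyst targets above work out to roughly an equal share of the team's calls (100% divided by team size) with a five-point tolerance. A minimal sketch of that check, under that assumption:

```python
# Assumed reading of this slide: each analyst should answer roughly
# 100% / team_size of the ACD calls, within +/- 5 percentage points
# (33% for 3 analysts, 25% for 4, 20% for 5, ~14% for 7).
def within_fair_share(analyst_calls: int, team_calls: int, team_size: int,
                      tolerance: float = 5.0) -> bool:
    share = 100 * analyst_calls / team_calls
    target = 100 / team_size
    return abs(share - target) <= tolerance

# Example: one of 4 analysts answering 210 of 1,000 team calls (21%)
# is within 5 points of the 25% target.
print(within_fair_share(210, 1000, 4))  # True
```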

19 First Call Resolution (FCR)
Percentage of incidents resolved on the initial contact with the customer. Used to measure the knowledge and skill level of the analyst.
Weight: 10%. Telecom goal: 60%.
60% = 10 points, 50% = 8 points, 40% = 6 points, 30% = 4 points, 20% = 2 points, <20% = 0 points

20 Percent Available Time
Percentage of total time the analyst has been available to take incoming or make outgoing calls.
Talk time + Ready time + After-Call Work (ACW) – Not Ready time = % Available
Weight: 10%. Goal: 6 hours 30 minutes logged in to the ACD.
85% = 10 points, 70% = 5 points, <60% = 0 points
When availability goes up: average speed to answer goes down, abandon rate goes down, more calls are answered, service level goes up, and customer service ratings improve.
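A minimal sketch of one common reading of that formula, assuming available time (talk + ready + after-call work) is divided by total logged-in time; the point bands between the slide's listed thresholds are an assumption:

```python
# Sketch of percent available time: talk + ready + after-call work over total
# logged-in time, with not-ready time counting against the analyst.
def percent_available(talk_min: float, ready_min: float,
                      acw_min: float, not_ready_min: float) -> float:
    logged_in = talk_min + ready_min + acw_min + not_ready_min
    return 100 * (talk_min + ready_min + acw_min) / logged_in

def availability_points(pct: float) -> int:
    if pct >= 85:
        return 10
    if pct >= 70:
        return 5
    return 0  # the slide lists <60% as 0 points; 60-69% is assumed to score 0 too

# Example: 390 minutes logged in (the 6.5-hour goal) with 40 minutes not ready.
pct = percent_available(talk_min=200, ready_min=100, acw_min=50, not_ready_min=40)
print(round(pct), availability_points(pct))  # 90 10
```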

21 Average Talk Time
Average time an analyst spends talking to a customer on each call. Used to determine staffing and training needs.
Weight: 10%. Goal: 5 minutes.
5 minutes or less = 10 points; over 5 minutes = 0 points

22 Call Monitoring
In order to improve the customer experience, calls are evaluated, reviewed, and graded. 10 / 70%

23 Four-Part Scoring
Greeting the customer
Key points during the call
Ending the call
Behavioral questions

24

25

26 Bonus Points
Knowledge database document contribution
Training and online learning
Seminars attended – you must return and present to the team what you learned from the seminar and how it can be applied to the job or team
Presentations to the team (SME)
Unsolicited customer commendations

27 Additional Performance Appraisal Requirements
Professional development: 20 hours of class time per calendar year (Conflict Resolution in Everyday Life, Customer Service for the Professional, Setting Personal Goals)
Certification once per year: Microsoft Certified Desktop Support Technician (MCDST), Microsoft Certifications for IT Professionals, A+, Network+, Security+, ITIL, VoIP

28 Who Performs Customer Surveys?

29 Customer Surveys – I am satisfied with…
The courtesy of the support representative? The technical skills/knowledge of the support representative? The timeliness of the service provided? The quality of the service provided? The overall service experience? We stopped doing the surveys. WHY?

30 Customer Surveys – Overall Survey Results
Customer satisfaction with: 1 (Very Dissatisfied) / 2 (Dissatisfied) / 3 (Neutral) / 4 (Satisfied) / 5 (Very Satisfied) / Total Satisfaction (Combined 4/5 Percentage)
The courtesy of the representative: 1% / 0% / 3% / 4% / 91% / 96%
The technical skills/knowledge of the support representative: Satisfied 6% / Very Satisfied 89% / Total 95%
The timeliness of the service provided: Satisfied 5% / Very Satisfied 87% / Total 92%
The quality of the service provided: Satisfied 2% / Very Satisfied 90%
The overall service experience: Satisfied 8% / Very Satisfied 88%
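A minimal sketch of how the combined 4/5 column is derived from individual survey responses; the sample ratings are illustrative, chosen to roughly reproduce the courtesy row:

```python
# Sketch: "Total Satisfaction (Combined 4/5 Percentage)" is the share of
# responses rated 4 (Satisfied) or 5 (Very Satisfied).
from collections import Counter

def combined_4_5(ratings: list[int]) -> float:
    counts = Counter(ratings)
    return 100 * (counts[4] + counts[5]) / len(ratings)

# Illustrative sample shaped like the courtesy row: 91 fives, 4 fours,
# 3 threes, 1 one -> (91 + 4) / 99 rounds to 96%.
sample = [5] * 91 + [4] * 4 + [3] * 3 + [1] * 1
print(round(combined_4_5(sample)))  # 96
```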

31 Celebrate Your Success
Write a blog, write something!
University of Miami User Support Services scores higher customer satisfaction ratings than the 2012 HDI customer satisfaction benchmarking study report
September 5, 2012 – Eddie Vidal, Manager, Enterprise Support Services
In June 2012 the University of Miami User Support Services (USS) team developed a customer survey process to obtain customer feedback. The survey asks the customer to rate the quality of service, timeliness, technical skills, courtesy, and overall satisfaction and experience on a per-incident basis. An email with a link to a five-question survey is sent to customers after an incident/ticket is closed. USS created an incident tracking system using SharePoint forms in January 2012 and has closed approximately 8,500 incidents. The survey results shown in the table below indicate USS customers are very satisfied with their service.
HDI is the leading professional association and certification body for technical service and support professionals. HDI published a customer satisfaction benchmarking study report in September; the results are based on over 350,000 survey responses and 510 support centers representing 215 companies. As the results show, USS customers are very satisfied with support, achieving higher results than HDI's benchmarking study. Compared to other educational organizations, USS exceeded the results published in the HDI report.
The customer surveys were developed following IT service management best practices and to obtain customers' opinions of the services and support provided by USS. With the results of the survey, USS began a continual service improvement program. The results were used to determine weaknesses and find ways to improve and correct them. This was accomplished by reviewing the survey results submitted and contacting users who rated USS service with a score of 3 or below (the rating scale is 1 for very dissatisfied and 5 for very satisfied). By contacting customers, USS learned many things and found ways to improve the delivery of our services. The data shows that the recent effort of implementing the customer survey process has delivered the desired result of improving the delivery of IT services to the university community.

32

33

34 Dashboard

35

36 Performance Dashboards
Who is your customer/audience? What is the information requested? How often do they want it? What format? How do they want to receive information? To make reports meaningful to the reader, you must make them actionable by putting the right information in the right people’s hands at the right time in the right format using the right medium.

37 USS Metrics Dashboard Information Requested
Month   Emails Received   Calls Offered   Calls Answered   Calls Abandoned   Abandon Rate   Tickets Opened   Tickets Closed   Requests (ITES)
Jan          365               775             564              145              19%              893              894
Feb           58               581             411              133              23%              162              155
Mar          134               536             447               73              14%              282              296
Apr          503               657             467              154                              1061             1129
May          805               569             465               80                              1357             1396        3 / 2
Jun          525               659             524              103              16%             1318             1269        1
Jul          617               545             463               65              12%             1257             1250        8 / 11
Aug          588               594             500               83                              1055              995        318
Sep          423               396             328               61              15%              275              272        575 / 611
Oct                                            258               87              24%              287              285        762 / 717
Nov          360               332             216               90              27%              214              213        613 / 595
Dec          552               233             194               29                               851              849        426
Total       5377              6242            4837             1103              18%             9012             9003        2707 / 2728
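A minimal sketch (the field names are assumed, not the dashboard's actual schema) of how the abandon-rate column above can be derived from the raw ACD counts for each month:

```python
# Sketch: abandon rate per month = calls abandoned / calls offered,
# using two sample rows from the dashboard table above.
monthly = [
    {"month": "Jan", "offered": 775, "answered": 564, "abandoned": 145},
    {"month": "Feb", "offered": 581, "answered": 411, "abandoned": 133},
]

for row in monthly:
    rate = row["abandoned"] / row["offered"]
    print(f'{row["month"]}: {rate:.0%} abandoned')  # Jan: 19%, Feb: 23%

# The grand-total rate is computed from the summed counts, not averaged rates.
total_offered = sum(r["offered"] for r in monthly)
total_abandoned = sum(r["abandoned"] for r in monthly)
print(f'Total: {total_abandoned / total_offered:.0%} abandoned')
```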

38 Channel Tracking

39

40 Trending Year-to-Year

41 Three Takeaways
Common Sense Approach – Keeping it simple
Useful information – Why it's important to use metrics
What did you learn today – can you apply it to your job?
Recap of the objectives: Common Sense Approach – keeping it simple; useful information – why it's important to use metrics; obtain buy-in; create a professional development program.

42 Managing Director – EJV Corp
Eddie Vidal, Managing Director – EJV Corp, @eddievidal
HDI Hall of Fame – Class of 2016
HDI & Fusion Track Chair & Speaker
HDI Strategic Advisory Board
Founder of the South Florida HDI local chapter
Published in SupportWorld magazine & HDI Connect
itSMF monthly podcast producer
2014 itSMF President's Award
Eddie Vidal has over twenty years' experience in information technology, where he focuses primarily on service delivery and support for IT infrastructures. In his current position as the manager of enterprise support services for the information technology department at the University of Miami, Eddie supports over 35,000 faculty, staff, and students. In addition to higher education, Eddie's experience includes the hospitality and travel industries. Eddie currently serves as the president of the HDI South Florida local chapter and a member of the HDI Desktop Support Advisory Board (DSAB). He has spoken at local, regional, and national events and has been published in HDI's SupportWorld magazine.

43 Thank you for attending this session.
Please don't forget to complete an evaluation form! Eddie Vidal

