
Slide 1: Supporting Sakai: Lessons from the University of Michigan
Sean DeMonner and Jeff Ziegler
Usability, Support & Evaluation Lab, Digital Media Commons, University of Michigan

Slide 2: What we’d like to share with you
- Brief history, scale, and scope of the implementation
- Support group structure
- Descriptive statistics related to supporting a large-scale Sakai implementation
- Support metrics and satisfaction indicators
- Common issues and strategies for addressing them
- Best practices in support and lessons learned

Slide 3: Overview - History
- Sakai in use at U-M for about three years
- Successfully migrated from the legacy CMS to the “CTools” CLE in Summer 2005
- Currently running Sakai 2.1.2

Slide 4: Overview - Scale
- Total existing sites:
  - 10,760 course sites
  - 7,217 project sites
- Winter 06 statistics:
  - 2,806 course sites in use
  - 44,740 unique users
  - 12,700 daily users (avg)
  - 4,190 concurrent users at peak

Slide 5: Overview - Scope
- Email and phone support available during standard business hours
- Remote email support on weekends and evenings (when warranted)
- 24x7 system availability, except maintenance windows

Slide 6: Usage Statistics - NetTracker (chart)

Slide 7: Daily Usage Pattern (chart)

Slide 8: Weekly Usage Pattern (chart)

Slide 9: Support Structure
- Two-tiered tech support model
  - 1st Tier: 90 hours/wk temporary staff; front-line technical issues
  - 2nd Tier: 1.5 FTE professional staff; policy issues and issues escalated by 1st Tier
- Training and documentation: 1.25 FTE professional staff
  - 0.5 FTE staff training
  - 0.75 FTE help documentation

Slide 10: Support Structure
- Quality assurance: 30 hours/wk temporary staff
- Local unit support
  - CTools Affiliate program
  - Department IT staff
  - Sites computing consultants

Slide 11: User Training
- Ongoing, with a pre-term peak
- Instructor-focused
- Group/department presentations preferred
- Low-stakes student introduction
- Coordination with release timing desirable

Slide 12: Documentation
- Do not assume “no one reads the docs”: 16% of faculty and 7% of students list docs as the “most effective way to get help”
- Include docs in the localization effort
- Occasional requests for a manual
- Online and task-based PDF “chapters”
- Flash visual tutorials are being implemented

Slide 13: Quality Control
- Internal staff test the local instance
  - Risk-based testing
  - Test integration points with campus systems
  - Test across the campus browser pool
  - 1-2 weeks preferred; semi-formal reports; go/no-go meetings
- Cross-testing with Sakai
- Operations Group runs load testing
  - Evolution of the protocol
  - Automated load testing (a sketch follows this list)
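The slides do not include the load-testing scripts themselves, so the following is a minimal sketch of the kind of automated load test described above, in Python using only the standard library. The target URL, concurrency level, and request count are hypothetical placeholders, not the actual CTools test configuration.

```python
# Minimal concurrent load-test sketch. TARGET_URL, CONCURRENT_USERS, and
# REQUESTS_PER_USER are invented placeholders, not U-M's real settings.
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

TARGET_URL = "https://ctools.example.edu/portal"  # hypothetical endpoint
CONCURRENT_USERS = 50
REQUESTS_PER_USER = 20

def simulate_user(user_id: int) -> list[float]:
    """Issue a series of requests, recording each response time in seconds."""
    timings = []
    for _ in range(REQUESTS_PER_USER):
        start = time.monotonic()
        with urlopen(TARGET_URL, timeout=30) as resp:
            resp.read()  # drain the body so timing covers the full response
        timings.append(time.monotonic() - start)
    return timings

def main() -> None:
    # Run all simulated users in parallel and pool their timings.
    with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
        per_user = pool.map(simulate_user, range(CONCURRENT_USERS))
    timings = sorted(t for user in per_user for t in user)
    print(f"requests: {len(timings)}")
    print(f"mean: {sum(timings) / len(timings):.3f}s")
    print(f"p95:  {timings[int(0.95 * len(timings))]:.3f}s")

if __name__ == "__main__":
    main()
```

In practice a dedicated load-testing tool would drive realistic login sessions and tool usage; the sketch only illustrates the core idea of applying concurrent load and summarizing latency.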

Slide 14: Tech Support Performance Metrics
- Number of support requests: email (91%), phone (9%)
- Response time
- Number of touches
- Satisfaction data from the annual survey
- Daily and weekly status reports

Slide 15: Issue Tracking
- FootPrints
- Electronic queuing mechanism
- Feeds metrics gathering and reporting
- Issue classification (a sketch of the idea follows this list)
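The classification scheme itself is not preserved in the transcript, but the underlying idea, tagging every ticket with a category so metrics can be rolled up by issue type, can be sketched as follows. The category names and sample tickets are invented for illustration; they are not the actual FootPrints taxonomy used at U-M.

```python
# Ticket-classification rollup sketch. The Category values are hypothetical
# examples, not the real FootPrints classification used at U-M.
from collections import Counter
from dataclasses import dataclass
from enum import Enum

class Category(Enum):
    HOW_TO = "how-to question"
    BUG = "suspected defect"
    SITE_CREATION = "site creation request"
    PERMISSIONS = "roles and permissions"
    POLICY = "policy question"

@dataclass
class Ticket:
    ticket_id: int
    channel: str  # "email" or "phone"
    category: Category

def category_report(tickets: list[Ticket]) -> Counter:
    """Count tickets per category: the rollup a status report would show."""
    return Counter(t.category.value for t in tickets)

# Invented sample data.
tickets = [
    Ticket(1, "email", Category.HOW_TO),
    Ticket(2, "email", Category.SITE_CREATION),
    Ticket(3, "phone", Category.HOW_TO),
]
print(category_report(tickets))  # Counter({'how-to question': 2, ...})
```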

Slide 16: Support Requests - Email (chart; annotated events: LS&A mandate, Fall site creation, Winter site creation)

Slide 17: Support Requests - Email, W06 (chart)

Slide 18: Response Time
- Service-level goal: 2 hours or less
- Business hours constrain this goal (a measurement sketch follows this list)
- *On average*, response in 15 minutes or less
- Peak times see increases in wait times
- Daily “clear the queue” push
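Measuring response time against a two-hour goal is subtle because the clock should only run during business hours: a request arriving at 4:50 p.m. Friday and answered at 8:20 a.m. Monday is a 30-minute response, not a 63-hour one. Below is a minimal sketch of that calculation; the 8:00-17:00, Monday-Friday window is an assumption for illustration, since the slides do not state the exact support hours.

```python
# Business-hours-aware response-time sketch. The 8:00-17:00 Mon-Fri window
# is an assumed schedule; the actual support hours are not in the slides.
from datetime import datetime, timedelta

OPEN_HOUR, CLOSE_HOUR = 8, 17  # assumed business hours

def business_seconds(start: datetime, end: datetime) -> float:
    """Seconds between start and end, counting business hours only."""
    total = 0.0
    t = start
    while t < end:
        if t.weekday() < 5:  # Monday-Friday
            day_open = t.replace(hour=OPEN_HOUR, minute=0, second=0, microsecond=0)
            day_close = t.replace(hour=CLOSE_HOUR, minute=0, second=0, microsecond=0)
            lo, hi = max(t, day_open), min(end, day_close)
            if hi > lo:
                total += (hi - lo).total_seconds()
        # Jump to midnight of the next calendar day.
        t = (t + timedelta(days=1)).replace(hour=0, minute=0, second=0, microsecond=0)
    return total

# Request at 4:50 p.m. Friday, answered 8:20 a.m. Monday: 30 business minutes.
received = datetime(2006, 3, 3, 16, 50)  # a Friday
answered = datetime(2006, 3, 6, 8, 20)   # the following Monday
print(business_seconds(received, answered) / 60)  # 30.0
```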

Slide 19: Touches Per Ticket
- N = 5,518; does not include site creations
- These numbers should improve thanks to “up front” requests for user info (a rollup sketch follows this list)
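A "touch" is one staff interaction on a ticket, so touches per ticket is simply the interaction count divided by the ticket count; asking for complete user information up front removes a round of back-and-forth and lowers the average. A minimal rollup sketch, with invented sample data:

```python
# Touches-per-ticket rollup sketch (the touch log below is invented sample data).
from collections import Counter

# One entry per staff interaction ("touch"), keyed by ticket id.
touch_log = [101, 101, 102, 103, 103, 103, 104]

touches_by_ticket = Counter(touch_log)
n_tickets = len(touches_by_ticket)
n_touches = sum(touches_by_ticket.values())
print(f"tickets: {n_tickets}")                             # 4
print(f"avg touches/ticket: {n_touches / n_tickets:.2f}")  # 1.75
```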

Slide 20: 2006 Support Survey - Instructors
- 1,357 respondents
- 33% have contacted support
- 18% report email is the “most effective way” they get help
- Instructors are “very satisfied” with support services (4.21 out of 5 quality rating)

Slide 21: 2006 Support Survey - Students
- 2,485 respondents
- 15% have contacted support
- 36% report “keep trying on my own” is the most effective way to get help
- Students are “moderately satisfied” with support services (3.7 out of 5 quality rating)

Slide 22: 2006 Support Survey - Comparison (chart)

Slide 23: Support Requests by Feature (chart)

Slide 24: Status Reports
- Daily and weekly communications

Slide 25: Voice of the Customer
- Customer feedback informs the organization.

Slide 26: Voice of the Customer
- Policy issues often come through Support.

Slide 27: Common Issues
- Sites created before registrar data is available
- Maintenance windows
- Large files and upload/download problems
- Site reuse vs. creating new sites

Slide 28: Common Issues
- Integration with immature or unsupported services/browsers
- Lack of auditing tools
- Distributed product delivery (e.g., identity management, file systems, registrar data)

Slide 29: Common Issues
- Managing large resource collections
- Managing peak support loads

Slide 30: Lessons Learned
- Training is key
- Low-stakes introduction to tools
- Use a ticket-tracking system
- Multi-tiered support staffing works well
- Hire and train Tier 1 staff early (July for Fall)

Slide 31: Lessons Learned
- Distributed support model (Affiliates in units)
- Escalation of issues / communication flow
- Advisory committee(s) / policy-making bodies
- Embrace the workaround
- Establish support accounts and use Jira
- Combine the MOTD with known issues

Slide 32: Lessons Learned
- Know browser targets and test the local instance
- Migration never ends
- “Eat your own dogfood”
- Changing expectations of online systems
- There’s no substitute for talking with customers, attending trainings, etc.
- Support is the ear of the organization and should “have a spot at the table”

Slide 33: Questions?
Sean DeMonner: demonner@umich.edu
Jeff Ziegler: ziegler@umich.edu

Slide 34: What else?
- FootPrints demo
- NetTracker demo
- Selecting support staff

Slide 35: Abstract
Sakai has been in use at the University of Michigan for the past three years, serving tens of thousands of customers. Hear the ins and outs of supporting a large-scale Sakai deployment from the folks who answer the phones and respond to the email.

Description: This presentation will review the experiences of the team supporting Sakai at the University of Michigan, including:
- brief history, scope, and scale of the implementation
- support group structure and relations with other groups (Development, Operations, QC)
- descriptive statistics related to supporting a large-scale Sakai implementation
- common issues and strategies for addressing them
- best practices in support and lessons learned
- support metrics and satisfaction indicators

Attendees will come away with quantitative and qualitative information related to supporting a large-scale Sakai implementation.

