Using Technology to see how our users navigate online interfaces
Cynthia Bail and Cameron Metcalf, uOttawa Library
Friday, January 30, 2009, 9:05 a.m.
SuperConference 2009, Toronto
Presentation Overview
– Methodology of usability studies
– Design and implementation of user testing
  – Past studies (2004, 2007)
  – Current study
– In-person & remote testing
  – Set-up & procedures
  – Reporting and follow-up
– Benefits & challenges
Get hold of some representatives… …and observe!
User testing is described as the most basic method for studying usability. Nielsen (2003) suggested:
– Get hold of some representative users
– Ask the users to perform representative tasks
– Observe what users do, where they succeed, and where they have difficulties with the interface
And the golden rules are:
– Test users individually & let them solve any problems on their own
– Help them, or direct their attention to a particular part of the screen, and you contaminate the test results
– Testing 5 users is typically enough to identify a design's most important usability problems
Nielsen, Jakob. "Usability Testing With 5 Users (Jakob Nielsen's Alertbox)." March 19, 2000. http://www.useit.com/alertbox/ html
Nielsen, Jakob. "Usability 101: Definition and Fundamentals - What, Why, How (Jakob Nielsen's Alertbox)." August 25, 2003. http://www.useit.com/alertbox/ html
Methods available for usability testing
– Observation
– Observation & screen capture
– Remote software (keystroke & screen capture)
– Eye-tracking technology
Observation – Users Recorded on Video
Methodology:
– Individual video recording device (camera on tripod), with integrated or separate audio recording device
– Software and technology (webcam): simultaneous video (of the participant) and screen capture
Goal: capture voice, face, posture… physical cues
Observation – Users Recorded on Video
Benefits:
– Record of body language and eye movement
– Secondary copy of the audio narrative
– A comprehensive record of the experiment
– Facilitates recall and identification of individual sessions
Observation – Users Recorded on Video
Drawbacks:
– Misinterpretation by observers
– Additional technology adds an extra presence => stress
– Potential participants may be discouraged (privacy concerns)
– Additional overhead in planning, costs, and execution
– Difficulty in definitively demonstrating a return on investment
Observation – Users Recorded on Video: UO Marketing 2003
– Webcam option with Morae software (TechSmith)
– One-way glass for additional observers in the Faculty of Education
– Captures screen, webcam feed, separate audio, and mouse clicks
– Marketing Department prefers two people for facilitation
– Resumption in 2009 with "Listening Labs":
  – Format: five-to-ten-minute usability studies
  – New staff: "web content strategist"
  – Frequency: establish a "routine" for usability testing
2004 Library Web Usability Study – Observation and Screen Capture
ViewletBuilder (Qarbon):
– Feature-comparable to Camtasia at the time
– Existing in-house experience and expertise
– Exportable formats (Flash, WMV, native Viewlet)
– Powerful without being resource-intensive
– Captured audio as a distinct exportable file
2004 Library Web Usability Study – Observation and Screen Capture (cont'd)
Technical setup:
– Larger room (computer lab); PC; projector and screen
– Microphone (rented from AV Services)
On the PC:
– IE and Firefox + necessary plugins
– Wiped the browser cache prior to each session
– Keyboard shortcuts to start / stop / pause the recording
2004 Library Web Usability Study – Observation and Screen Capture (cont'd)
– Three facilitators (with justification)
– Scripts: preamble, 12 tasks
– Room set-up
– Recording mechanisms for observation / discussion
2004 Library Web Usability Study – Challenges in Data Collection
– Rough notes: compiling, tabulating, and dissecting
– Review of screen / video capture
– Develop your strategies ahead of time:
  – mouse clicks
  – elapsed time between clicks (hesitation?)
  – elapsed time to complete each task
  – screen scrolling
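Metrics like these can be tabulated from the capture once a strategy is fixed. A minimal sketch, assuming a simple in-memory click log (task names and timestamps here are illustrative, not from the study):

```python
# Sketch: tabulating per-task timing metrics from a click log.
# The log format (task id -> click timestamps in seconds) is hypothetical;
# real capture tools export their own formats.
click_log = {
    "task_1": [0.0, 4.2, 6.1, 19.8],
    "task_2": [25.0, 31.5, 58.9],
}

def task_metrics(clicks):
    """Return elapsed time, click count, and the longest pause
    between clicks (a possible sign of hesitation)."""
    gaps = [b - a for a, b in zip(clicks, clicks[1:])]
    return {
        "elapsed": clicks[-1] - clicks[0],
        "clicks": len(clicks),
        "longest_pause": max(gaps) if gaps else 0.0,
    }

for task, clicks in click_log.items():
    m = task_metrics(clicks)
    print(f"{task}: {m['elapsed']:.1f}s, {m['clicks']} clicks, "
          f"longest pause {m['longest_pause']:.1f}s")
```

Deciding these definitions (e.g. what counts as a "pause") before the first session makes later tabulation mechanical rather than interpretive.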
Remote software – keystroke & screen capture
Methodology:
– Software to display and capture screen activity on a remote PC
– Ethics process: letter to be signed by researcher and participant
– Chat windows to push & receive web links and instructions
– Output files produced by the UserVue capture software on the facilitator's PC:
  – Media file (can be viewed in any AV player)
  – Proprietary file (can be viewed and analysed in Morae Manager)
Demo of UserVue for a screen capture session
Note: Demonstrations at SuperConference only – media links not included in the website copy of this presentation.
Remote software – Data collection & analysis
– If you don't buy Morae, you can manually calculate time per task.
– UserVue states that you can create markers as you observe & record, and that this data can be saved to a .csv file. I would find this too distracting during observation:
  – I prefer to mark up the files in the Morae Manager module after the fact, when more time is available.
  – This also allows me to decide where the start & end markers should be placed after I've seen all participants.
– If you plan to include other colleagues in the observation or analysis phase, or if you anticipate others might want to share your raw data, you should confirm the privacy and ethics policies in your organization.
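Once markers are exported, time per task is a simple calculation. A sketch using a hypothetical CSV layout (the column names and timings below are assumptions for illustration; a real UserVue or Morae export has its own format):

```python
# Sketch: computing average time per task from exported session markers.
# The CSV layout (participant, task, start_s, end_s) is hypothetical.
import csv
import io

marker_csv = """participant,task,start_s,end_s
P1,find_book,12,47
P1,renew_loan,50,190
P2,find_book,8,39
"""

def time_per_task(text):
    """Average seconds spent per task across all participants."""
    times = {}
    for row in csv.DictReader(io.StringIO(text)):
        elapsed = float(row["end_s"]) - float(row["start_s"])
        times.setdefault(row["task"], []).append(elapsed)
    return {task: sum(t) / len(t) for task, t in times.items()}

print(time_per_task(marker_csv))
# find_book averages (35 + 31) / 2 = 33 seconds across participants
```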
2007 Catalogue search patterns study – Remote usability testing
Participant:
– Completed online pre & post surveys (participant profile, user-experience survey)
– Performed 4 similar tasks in 5 different library catalogues
  – 1st catalogue set up as a 'model' with step-by-step instructions
  – Opportunity to complete the tasks in 1 or 2 sittings
– Completed a survey question between each catalogue on the perceived correctness of the task
Researcher:
– Analysed the time required to complete 'similar tasks'
– Compared the level of difficulty encountered while completing 'similar tasks'
– Contrasted time per task, task success & perceived success
Catalogue search patterns study – Results
Results:
– Graphs depicted time spent and difficulty encountered
– Graphs contrasted similar tasks across the 5 catalogues
– Results were analysed to determine any patterns
Patterns:
– Certain users were much faster completing task #1, others much slower
  – Faster searchers all completed the first task in < 40 s
  – Slower searchers took > 120 s despite having step-by-step instructions
– For certain fast users, none of the 5 catalogues changed the speed with which they executed their searches (though accuracy did not always follow)
* This is just a sample of the patterns that emerged from this study.
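The fast/slow grouping above can be mechanised once thresholds are chosen. A sketch using the < 40 s and > 120 s cut-offs mentioned above; the participant timings are illustrative, not the study's actual data:

```python
# Sketch: flagging fast vs. slow searchers on a task using fixed
# thresholds. Sample timings (seconds) are hypothetical.
timings = {"P1": 28, "P2": 35, "P3": 145, "P4": 65, "P5": 182}

def classify(seconds, fast=40, slow=120):
    """Label a completion time as fast, slow, or in between."""
    if seconds < fast:
        return "fast"
    if seconds > slow:
        return "slow"
    return "middle"

groups = {}
for participant, secs in timings.items():
    groups.setdefault(classify(secs), []).append(participant)
print(groups)
```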
Eye tracking technology
– Research on eye tracking can provide some interesting results about how users concentrate on certain parts of a web page (Nielsen and Pernice, 2007).
  – Eye-tracking heat maps display different colours, with red on the parts of a web page on which a user concentrates most
  – Gaze 'patterns' emerge, showing which features of a web page hold more interest
– Equipment and facility costs, as well as setup and analysis time, are significant
– Can public and not-for-profit organizations justify this type of costly study?
– Libraries gain insight from the results emerging out of eye-tracking research
  – Annual conferences devoted to this topic are discussing low-cost eye tracking, e.g. ETRA (Eye Tracking Research & Applications)
Nielsen, Jakob and Kara Pernice. "Eyetracking Research into Web Usability." http://www.useit.com/eyetracking/
2008 citation management study – comparing remote vs. in-person usability testing
– Compared the two methods we had each experienced
– Designed a research project to contrast both
– Framed it around the use of citation management software
– Screen-captured each participant in two separate sessions:
  – Remote (UserVue)
  – In-person (Morae Recorder)
2008 citation management study – requirements and outputs
Participants:
– Have used bibliography-generating software actively for the past 6 months
– Be willing to spend 30 minutes at the Library and 30 minutes off campus on tasks
– Demonstrate searching & saving records from various Library databases
– Create a bibliography from these records
Researchers:
– Analysed the time required to complete the search & save task from each database
– Compared the time required to produce the bibliography in each product
– Contrasted the time per task
– Noted the variety of strategies employed to extract citation data for each product
Could in-person or remote testing alone… have achieved the same results?
For the task of saving records:
During both in-person & remote testing, certain databases were more of a challenge in terms of strategy and time.
Note: Due to the small sample size, problems with access to 1 database during 1 participant's session, and the fact that 1 participant never attempted to search for an interface save option, we cannot conclusively state the percentage of users who had difficulties with a particular database. We can, however, demonstrate the types of difficulties users encountered in certain databases.
Conclusion: It can be suggested that the databases for which most users were able to complete the task in under 2 minutes were easier to navigate when looking for a save-records option. Examining the videos confirmed that the databases which caused even 1 user to spend more than 2 minutes required multiple strategies on that user's part before saving records was successful.
Could in-person or remote testing alone… have achieved the same results?
For the task of creating a bibliography:
During in-person & remote testing, 75% of the users were able to create accurate bibliographies that included consistent data amongst the 9 records.
Note: 1 participant seemed unfamiliar with the importance of consistency in a bibliography. Their save strategy consisted entirely of cutting & pasting citations as they appeared in the result list (i.e. no attention to the order or pieces of information required for a complete citation), so the 9 citations became a bibliography that lacked uniformity in data elements (e.g. dates, if they appeared, were in different positions within a reference).
Conclusion: Users who understand the purpose of a citation – to credit an author and to serve as an effective finding tool for themselves and others – will be more likely to ensure that a bibliography is accurately created within their preferred software.
Benefits of a remote usability study
– An interesting user experience regardless of level of technology experience
– It's possible to begin and end the session with a 'live' phone call (to review test expectations, the importance of closing personal information before screen capture, and the reason search questions cannot be answered if the person runs into difficulties)
– Easy to focus on the task & avoid chit-chat with the user during testing
– Chat can be used during testing to present web pages describing tasks, and to push or re-push URL links to the user (if a window is accidentally closed or buried)
– Standard 'assistance' texts can be created and pushed through the chat window, as needed
– Markers can be added to the screen-capture file while testing is in progress
– The facilitator can express enthusiasm (privately) if the user sails through a task or misses the mark!
Challenges of a remote usability study
– Difficulty impressing upon some users the importance of closing personal data to prevent inadvertent screen capture of their name (despite recommendations, some users leave e-mail, Facebook, etc. open)
– Connection problems may cause time delays that are misinterpreted as hesitations
– The most time-consuming portion is the 1st run-through, but once the chat texts are established they can easily be pushed to each user
– Determining the appropriate time to push some chat messages is an art, since some users hesitate while thinking and others are genuinely stuck
Benefits of an in-person usability study
– Lots of anecdotal information
– A chance to further advocate library services and answer questions on an individual basis: "I always wondered but was afraid to ask…"
– Other questions come up that you might think to probe on the spur of the moment, as the relationship develops
– When reviewing notes, it is easier to associate sessions by recalling someone's face
– During the session it is easy to communicate small cues to reassure participants when they are on the right path, and to express appreciation for their contribution (sincere smiles)
Challenges of an in-person usability study
– Difficult to offer suggestions and help to ease a participant's frustration without skewing the results
– Time-consuming (room booking, arranging schedules among four people)
– Requires extra commitment on the part of participants; out of twelve, we had at least two "no-shows"
Conclusions
Remote testing is preferred to in-person testing:
– Scenarios were easy to repeat
– It's more convenient for both facilitator and participant
We recommend setting timelines based on:
– Any upcoming interface improvements for test products
– The potential availability of your participants
In our next study, we aim to:
– Improve randomization of task order
– Increase the sample size
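One common way to improve randomization of task order is counterbalancing, so that each task appears in each serial position equally often across participants. A minimal sketch (the task and participant names are illustrative; this shows a balanced rotation, not the design the study will actually use):

```python
# Sketch: counterbalancing task order across participants so that
# order effects average out. Task names are hypothetical.
import random

tasks = ["find_book", "renew_loan", "request_item", "export_citation"]

def rotated_orders(tasks):
    """Balanced rotation: each task occupies each position exactly once."""
    return [tasks[i:] + tasks[:i] for i in range(len(tasks))]

orders = rotated_orders(tasks)
random.shuffle(orders)  # assign the rotations to participants at random
for participant, order in zip(["P1", "P2", "P3", "P4"], orders):
    print(participant, order)
```

With 4 tasks this yields 4 orders, so the sample size is ideally a multiple of 4 to keep the design balanced.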