
1 Lecture #8 SPECIAL METHODS OF TESTING Y39TUR Spring 2011 Tvorba uživatelského rozhraní (User Interface Design)

2 Remote Testing

3 Software for X
Example: a piece of software needs to be tested for the market of country X. Possibilities:
– Invite 10 people from X to the Czech Republic: air tickets, accommodation, visas; not their own environment
– Go to X: use a local recruitment agency, rent a usability lab, get vaccinations
– Is that always necessary?
– Use remote testing instead

4 Traditional Methods vs. Remote Testing
Traditional methods:
– Participants sit in the lab
– Testers physically observe & record
Remote testing:
– Participants sit in their office/home
– Testers observe their screen over a network connection & record

5 Hierarchy of remote testing methods
– Same place, same time: classic usability testing (up until now)
– Same place, different time: (not included in the course)
– Different place, same time: remote testing, teleconferencing
– Different place, different time: surveys, on-line evaluation tools, …

6 Same time, different place

7 Same time, different place
Synchronous:
– People connected via teleconferencing
[Diagram: moderator, participant, and stakeholders linked by teleconferencing]

8 Remote Testing
The testers observe the participants remotely:
– Via telephone
– Via videoconferencing
– Via screen-capturing and streaming software; could be a combination of a remote desktop (VNC, …) + a screen grabber (Camtasia, …)
The methodology is similar to that of classic usability tests, with certain differences.

9 Quality Comparison
Selvaraj & Houck-Whitaker: remote tests are at least as effective as traditional ones.
Benefits:
– Time and cost savings: you and your participants don't need to spend time traveling
– Realistic context of use: you reach people in their own environment
– Geographic representation: different parts of the globe can be covered
– Access to professionals: it's easier to ask a $500/hr professional to take part, because the test claims less of their time

10 Quality Comparison
Limitations:
– Lack of nonverbal cues: communication delay; low resolution of the video, or perhaps no video link at all
– No control over the participants' conditions: checking that the software is installed properly; making sure the participant is not being disturbed
– The moderator can't assist the users on-site: the users are on their own when using the system
– A higher level of user IT literacy is expected: cannot test with novice users

11 Quality Comparison
Limitations (continued):
– Will the users trust our application? People are afraid of spyware; privately owned vs. corporate computers
– Will the stakeholders believe that the results are not fake?

12 Costs Comparison
Traditional tests:
– Lab
– Equipment
– Recruitment
– Travel costs
– User incentives: physical presents, money
Remote tests:
– Online meeting
– Users' own equipment
– Recruitment
– No travel costs
– User incentives: electronic coupons, money

13 Remote Testing Overview
Very similar to classic usability testing:
– Define objectives & target audience
– Set up the test scenario
– Recruit test users
– Carry out the tests
– Analyze the findings
– Write the report & brief the stakeholders

14 Remote Testing Overview
What are you testing? Whom are you testing?
Representative tasks:
– Within time limits & user capabilities
– In line with the test objectives
Methods of data collection:
– Screen capture
– Questionnaires, interviews

15 Remote Testing: Test Preparation
– Consult the objectives with the project stakeholders
– Develop instructions for the participants
– Run a pilot test with home users
– Apply the changes suggested by the results of the pilot test

16 Remote Testing: Recruitment
Define the user profile & recruitment criteria.
Set up a recruitment screener:
– The screener can be filled out on the web: questionnaires feed a database of potential participants, from which the selection is made
– Telephone screener: very low success rate (people associate it with telemarketing)
Decide on incentives. A minimal eligibility-check sketch follows below.
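As an illustration of processing a web screener automatically (the fields and criteria below are hypothetical examples, not from the lecture), a minimal sketch in Python:

```python
# Hypothetical screener record and eligibility criteria; the real
# fields depend on the user profile defined for the study.
from dataclasses import dataclass

@dataclass
class ScreenerResponse:
    age: int
    uses_product_weekly: bool
    works_in_it: bool      # often screened out to avoid expert bias
    has_broadband: bool    # a remote test needs a usable connection

def is_eligible(r: ScreenerResponse) -> bool:
    """Apply the recruitment criteria to one screener response."""
    return (18 <= r.age <= 65
            and r.uses_product_weekly
            and not r.works_in_it
            and r.has_broadband)

responses = [
    ScreenerResponse(34, True, False, True),
    ScreenerResponse(22, True, True, True),   # rejected: works in IT
]
candidates = [r for r in responses if is_eligible(r)]
print(f"{len(candidates)} of {len(responses)} respondents match the profile")
```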

17 Remote Testing: Recruitment
Recruitment channels:
– Web: social networks, mailing lists, job portals
– Traditional: newspapers, ads (with a URL to enter)
– Recruitment agency: may be important when testing in an unknown market; perhaps better-targeted participants
– Web, advanced services: ethnio.com, clicktale.com

18 ethnio.com – Example of a Screener [screenshot]

19 ethnio.com
Recruiting people directly from a website. Procedure:
– Set up a screener in your ethnio.com profile
– Set up your website to display the screener
– A website visitor sees the screener
– If the visitor responds, you are notified immediately
– You contact the person by telephone / e-mail

20 Remote Testing: Recruitment
Specific requirements:
– The users must be able to install the software that is to be tested and the tools used for the test
– The task sheet for the participants must be more specific, since there is no moderator at their location
Consent solicitation:
– By voice, saying "Yes, I agree."
– By clicking "I agree" on the screener form

21 Remote Testing: Technology to Use
Teleconferencing:
– Skype
Screen capture and streaming:
– VNC
– Remote Desktop in MS Windows

22 Carry Out the Test
During the test:
– Confirm user-profile eligibility
– Ask for permission to record the session
– Limit moderator intrusion
– Encourage thinking aloud
– Take notes
– Deliver the incentive/payment
– Have fun

23 Analysis & Reports
– During tests: track all usability issues
– After each test: compare notes & analyze
– After all tests: summarize patterns & major problems
– Put together the report & sample videos
– Communicate to all stakeholders

24 Same place, different time

25 Data are physically acquired; the data are picked up later on.
Examples:
– Customer satisfaction surveys
– Elections
– Geocaching

26 Different time, different place

27 Different time, different place
Asynchronous:
– Messages pass between the testers and the participants
– The whole test can take a considerable amount of time due to the communication delays between the testers and the participants
– Testers provide instructions through a website / e-mail message
– Participants provide data by answering a questionnaire or by monitored interaction with the product
– The data are aggregated automatically

28 Different time, different place
Features:
– Can be done automatically
– Good for quantitative data collection
– Good when there are a lot of participants (25–100)
Drawback:
– We can't control the conditions well

29 Questionnaire-based Testing
A questionnaire is a set of questions:
– With defined responses ([yes][no], [1][2][3][4][5], …)
– Or open-ended
The same questionnaire is administered to all participants.
Easy to administer:
– Point to a web form
– Send a structured e-mail
Easy to process (see the aggregation sketch below):
– Automatic processing of the web forms
– Automatic processing of the returned e-mails
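To make "easy to process" concrete, a minimal sketch that aggregates closed-form responses from a CSV export of a web form (the file name and the 1–5 column layout are assumptions for the example):

```python
# Aggregate closed-form questionnaire responses exported as CSV.
# Assumed layout: one row per participant, one column per question,
# answers on a 1-5 scale (hypothetical; adapt to the real export).
import csv
from collections import Counter
from statistics import mean

def aggregate(path: str) -> None:
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    for question in rows[0].keys():
        answers = [int(r[question]) for r in rows if r[question].strip()]
        if not answers:
            continue
        dist = Counter(answers)
        print(f"{question}: n={len(answers)}, mean={mean(answers):.2f}, "
              f"distribution={dict(sorted(dist.items()))}")

aggregate("responses.csv")   # hypothetical export file
```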

30 Questionnaire-based Testing
Not many people respond to questionnaires:
– The study needs to be marketed well
How to aim for a specific target group?
– The questionnaire should contain some screening questions (the questionnaire contains the screener)
Danger of … self-selection!

31 SUMI
Software Usability Measurement Inventory: measuring software quality from the user's point of view ("quality of use").
Input:
– The software or its prototype must exist
– 10 users minimum
Output:
– Five grades: Efficiency, Affect, Helpfulness, Control, Learnability
– Based on an existing database of gathered questionnaires, kept by the authors of SUMI

32 SUMI: Use
How it can be used:
– Assess new products during product evaluation
– Make comparisons between products or versions of products
– Set targets for future application development: test verifiable goals for quality of use and track the achievement of targets during product development, in a quantitative manner
Source: http://www.ucc.ie/hfrg/questionnaires/sumi/whatis.html

33 SUMI: Scales
– Efficiency: tasks are completed by the user in a direct and timely manner
– Affect: how much the product captures emotional responses
– Helpfulness: the product seems to assist the user
– Control: users feel that they set the pace, not the product (they are in control)
– Learnability: the ease with which the user can learn to use the software and/or its new features

34 SUMI: Questionnaire
50 fixed, predefined statements, such as:
– "This software responds too slowly to inputs."
– "I would recommend this software to my colleagues."
– "The instructions and prompts are helpful."
– "I sometimes wonder if I am using the right command."
– "I think this software is consistent."
Responses to each statement:
– Yes
– No
– Undecided

35 SUMI: Processing
The assignment of questions to scales is not disclosed:
– SUMI is a commercial service
– It is the know-how of the authors
Procedure:
– Participants try the tested system
– The testers administer the SUMI questionnaires to the participants
– The responses are sent to the authors of SUMI (e-mail, web form, …)
– The testers receive the grades from SUMI for a nominal fee (hundreds of USD)
A hedged sketch of this style of scoring follows.
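Because the real item-to-scale mapping and the point values are proprietary, the following is only an illustrative sketch of how Yes/No/Undecided answers could be turned into per-scale scores; the mapping and weights here are invented for the example:

```python
# Illustrative only: SUMI's real item-to-scale assignment and scoring
# are undisclosed, so the mapping and point values below are invented.

# Hypothetical: which item feeds which scale, and which items are
# reverse-keyed (agreeing is a negative signal).
ITEM_SCALE = {1: "Efficiency", 2: "Affect", 3: "Helpfulness",
              4: "Control", 5: "Learnability"}
NEGATIVE_ITEMS = {1}   # e.g. "responds too slowly": Yes counts against
POINTS = {"yes": 2, "undecided": 1, "no": 0}

def score(answers: dict) -> dict:
    """Sum per-scale points for one participant's answers."""
    totals = {scale: 0 for scale in set(ITEM_SCALE.values())}
    for item, answer in answers.items():
        pts = POINTS[answer]
        if item in NEGATIVE_ITEMS:     # flip reverse-keyed statements
            pts = 2 - pts
        totals[ITEM_SCALE[item]] += pts
    return totals

print(score({1: "no", 2: "yes", 3: "undecided", 4: "yes", 5: "yes"}))
```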

36 SUMI: Example Evaluation [chart of the five scale scores plotted against the reference value]

37 SUMI: Evaluation
The data provided are relative to the corpus of previously gathered data:
– The values show the usability of the system compared to the reference score (50 on each scale)
The data can be used to compare two different systems:
– Better score vs. worse score

38 SUMI: Evaluation
Enough to provide unbiased and objective results?
– YES
Enough to give insights into particular problems?
– NO: we only have 5 numbers as the output, so we know nothing about the sources of the errors

39 Automated User Testing

40 Automating Usability Testing (slides by Federico M. Facca)
Usability testing:
– A prototype or the final application is provided to a set of users, and the evaluator collects and analyzes the usage data
– Can be based on a set of predetermined tasks
What can be automated in such a method?
– The capture of usage data
– Analysis based on predefined metrics or a model
Usability evaluation of:
– Navigation
– Functionality

41 Capturing Data
The information is either:
– Easy to record but difficult to interpret (e.g., keystrokes)
– Or meaningful but difficult to label correctly (e.g., when can a task be considered completed?)
Method types:
– Performance logging (e.g., events and their times of occurrence; no evaluator)
– Remote testing (e.g., an assigned task performed by the user and monitored by the evaluators)

42 Capturing Data on the Web – Server-side Logging
Web servers commonly log each user request.
Available information:
– IP address, request time, requested page, referrer
We can derive (see the parsing sketch below):
– The number of visitors
– A breakdown by countries
– Coverage by robots
– …
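As a small illustration, a sketch that derives the visitor and page-hit counts from a log in the common Apache/nginx "combined" format (the log path is an assumption; a country breakdown would additionally need a GeoIP database, omitted here):

```python
# Parse a web-server access log in "combined" format and derive
# simple aggregates: unique visitors (approximated by IP) and page hits.
import re
from collections import Counter

# Matches lines like:
# 203.0.113.5 - - [10/Mar/2011:13:55:36 +0100] "GET /index.html HTTP/1.1" 200 2326 "-" "Mozilla/..."
LINE = re.compile(r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
                  r'"(?P<method>\S+) (?P<page>\S+) [^"]*" (?P<status>\d{3})')

ips, pages = set(), Counter()
with open("access.log") as f:              # hypothetical log file
    for line in f:
        m = LINE.match(line)
        if not m or m["status"].startswith(("4", "5")):
            continue                       # skip unparsable lines and errors
        ips.add(m["ip"])
        pages[m["page"]] += 1

print(f"unique visitors (approximate, by IP): {len(ips)}")
for page, hits in pages.most_common(10):
    print(f"{hits:6d}  {page}")
```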

43 Server-side Logging
Pros:
– A huge quantity of easily available data
– Does not require specially recruited users
Typical questions we can answer:
– Which content is interesting?
– Do people reach all the content? Is all the content necessary? (Which is not the same as: is the navigation good?)
– Does the new design keep people longer on the site?
– Does the new design make people buy more?

44 Server-side Logging
Disadvantages:
– A highly quantitative method
– Almost no data about the exact user interaction with the interface

45 Client-side Logging
Dedicated tools and settings:
– The web client must be enhanced to log information about the interaction
– The client pushes the information into a repository on the tester's server (a minimal endpoint is sketched below)
Available information:
– IP address, request time, requested page, referring page, mouse position on the screen, clicked links, back button, …
Pros:
– Actual data about the exact user interaction with the interface
– Sessions are reconstructed automatically
Against:
– The participant must use the enhanced browser
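To illustrate the "client pushes into a repository" part, a minimal sketch of a server-side collection endpoint using Flask (the route name, event fields, and JSON-lines storage are choices made for this example; real tools such as ClickTale ship this infrastructure ready-made):

```python
# Minimal event-collection endpoint: an instrumented page would POST
# interaction events (clicks, mouse positions, ...) here as JSON.
import json
import time
from flask import Flask, request

app = Flask(__name__)
LOG_FILE = "events.jsonl"     # one JSON event per line (illustrative)

@app.route("/collect", methods=["POST"])
def collect():
    event = request.get_json(force=True)
    event["server_time"] = time.time()    # stamp arrival on the server
    event["ip"] = request.remote_addr
    with open(LOG_FILE, "a") as f:
        f.write(json.dumps(event) + "\n")
    return "", 204                        # empty response keeps traffic low

if __name__ == "__main__":
    app.run(port=8080)
```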

46 Tools
Formal client-side user tracking/analysis:
– Commercial tools:
  Ethnio (http://www.ethnio.com/)
  uLog/Observer (http://www.noldus.com)
  UserZoom (http://www.userzoom.com)
  ClickTale (http://www.clicktale.com/)
  Usabilla (http://www.usabilla.com)
  Nielsen Eye Tracking (example in the next slides)
– Other tools (some are a bit old):
  WebQuilt (http://guir.berkeley.edu/projects/webquilt/)
  SCONE/TEA (http://www.scone.de/docus.html)
  NIST WebMetrics (http://zing.ncsl.nist.gov/WebTools/; not only for tracking and the related analysis)

47 Tools
Informal client-side tracking/analysis:
– Commercial tools:
  Google Analytics (http://www.google.com/analytics/)
  Fireclick (http://www.fireclick.com/)
  SiteCatalyst (http://www.omniture.com/products/web_analytics)
  Hitslink (http://os.hitslink.com/)
  Crazy Egg (http://crazyegg.com) – a nice example
  Usabilla (http://usabilla.com)
  … tons, really
– Free tools:
  Search with Google and you will find some
Server-side analysis:
– Again, tons of solutions!

48 usabilla.com
A Web 2.0 application. The main principle:
– The testers present a website screenshot
– The participants mark points on the screenshot according to tasks, e.g.: "Click on the element that you would remove from the page."
– Comments can be added
– The testers see the results in an aggregate form; the individual points and comments are anonymous

49 Task: "Click on the element that you found most interesting." [screenshot with the responses from the users]

50 usabilla.com: Example
Object of the test:
– The Czech version of the home page of the DCGI website
Participants:
– CTU students
Tasks:
– Click on the element that you found most interesting
– Click on the elements that you like most
– Click on the elements that you would remove from the page
– Where would you look for contact information?
– Where would you look for the CVs of the faculty members?

51 usabilla.com: Example
Recruitment:
– 18 people asked to participate (people who were online at the moment, via ICQ and Skype)
– 10 users participated
Data acquired:
– 95 points (across all tasks)
– 3 comments
Total time to carry out this test: 1 hour

52 usabilla.com: Example – All points (for all tasks)

53 usabilla.com: Example – Most interesting

54 usabilla.com: Example – What to remove [screenshot with comment markers]

55 usabilla.com: Example – What to remove – Comments
The users clicked almost all of the elements to be removed.
– Can we trust such data?
– We cannot assume that all the users preferred to remove all the elements
– We need to interpret this as a possible dissatisfaction with the layout of the website; this needs to be verified and made concrete by a separate test (a different method)

56 usabilla.com: Example – What to remove – Comments
Actual responses from the participants:
– "osklive menu" (the menu is ugly)
– "mno nevim jestli odstranil, ale mail to moc nepripomina" (well, I am not sure about removing it, but it does not look much like an e-mail)
– "no tohle je asi z nejakeho publikacniho systemu, me to trochu rozciluje" (well, this is probably from some content-management system; it irritates me a bit)
How to interpret these?
– Only as suggestions for further testing
– (Not enough data for conclusive statements)

57 usabilla.com: Example – Where to find the faculty CVs? [screenshot] A very nice, conclusive cluster.

58 usabilla.com: Example – Where to find contact info? [annotated screenshot: "Most people would look here"; "But some assumed there were other ways"]

59 usabilla.com
General rule:
– We should trust a point only when it is confirmed by multiple instances (a clustering sketch follows below)
Benefits:
– A rapid method of testing
– Very easy analysis of the data
Drawbacks:
– No protection from malevolent participants ("I'll click you to death!")
– The motivation for placing the points is not always understood
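To illustrate the "multiple instances" rule, a sketch that bins click points into a coarse grid and reports only the cells hit by several distinct participants (the grid size, the threshold, and the sample data are arbitrary choices for the example):

```python
# Bin click points into grid cells and keep only the cells confirmed
# by multiple distinct participants ("trust multiple instances").
from collections import defaultdict

CELL = 50          # grid cell size in pixels (arbitrary)
MIN_USERS = 3      # a cluster counts only with >= 3 distinct participants

# (participant_id, x, y) click records; hypothetical sample data
clicks = [(1, 102, 210), (2, 110, 205), (3, 118, 220),
          (4, 400, 80), (1, 405, 82)]

cell_users = defaultdict(set)
for user, x, y in clicks:
    cell_users[(x // CELL, y // CELL)].add(user)

for (cx, cy), users in sorted(cell_users.items()):
    if len(users) >= MIN_USERS:
        print(f"cluster in cell x={cx * CELL}..{(cx + 1) * CELL}, "
              f"y={cy * CELL}..{(cy + 1) * CELL}: {len(users)} participants")
```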

60 SCONE / TEA
Support for formal user testing:
– Task specification
– Browser control
– User tracking according to the task specification

61 Specific User-Behavior Testing – Eye Tracking
– Records eye movements
– Originally designed for mobility-impaired users
– Relies on the eye-mind hypothesis
– Used in neuroscience, cognitive psychology, advertising, and now … usability

62 Eye Tracking
Equipment:
– Head-mounted systems
– Remote systems (ERICA)
– Computer-monitor camera systems

63 Eye Tracking
Eye movements collected:
– Fixations: where the eye stops long enough to absorb information
– Saccades: movements of the eye from one fixation to the next
Visual representation:
– Scanpath: the temporal sequence of fixations and saccades
Other representations:
– Numerical
– Real-time
A sketch of extracting fixations from raw gaze samples follows.
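As an illustration of how fixations can be separated from saccades in raw gaze data, a sketch of the common dispersion-threshold idea (in the style of the I-DT algorithm); the thresholds and the sample format are assumptions, not values from the lecture:

```python
# Dispersion-threshold (I-DT style) fixation detection sketch.
# Gaze samples are (timestamp_ms, x, y); thresholds are ballpark
# values, not calibrated for any particular tracker.
MAX_DISPERSION = 30.0   # px: a window this tight is one fixation
MIN_DURATION = 100.0    # ms: shortest span reported as a fixation

def dispersion(window):
    xs = [x for _, x, _ in window]
    ys = [y for _, _, y in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(samples):
    """Yield (start_ms, end_ms, cx, cy); the gaps in between are saccades."""
    i = 0
    while i < len(samples):
        j = i + 1
        # grow the window while the gaze points stay tightly clustered
        while j <= len(samples) and dispersion(samples[i:j]) <= MAX_DISPERSION:
            j += 1
        window = samples[i:j - 1]
        if len(window) >= 2 and window[-1][0] - window[0][0] >= MIN_DURATION:
            cx = sum(x for _, x, _ in window) / len(window)
            cy = sum(y for _, _, y in window) / len(window)
            yield (window[0][0], window[-1][0], cx, cy)
            i = j - 1          # continue right after the fixation
        else:
            i += 1             # saccade sample; move on

gaze = [(t * 10, 200 + t % 3, 150 + t % 2) for t in range(30)]  # synthetic
for fixation in detect_fixations(gaze):
    print("fixation:", fixation)
```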

64 Eye Tracking
– Provides a higher level of granularity than other data-collection methods, plus quantitative measures of user behavior
– Reveals behavior not evident in a concurrent think-aloud protocol: scanning continues while people are silent or using verbal fillers ("ums" and "ahs"), and eye movement occurs faster than verbalization
– Shows which parts of a user interface / web page receive the user's attention, and how the visual search is distributed

65 Eye Tracking [figure]

66 Eye Tracking [figure]

67 Eye Tracking – Heat Map

