
Slide 1: Discount Usability Engineering
Marti Hearst (UCB SIMS)
SIMS 213, UI Design & Development
March 2, 1999

Slide 2: Last Time (adapted from a slide by James Landay)
- UI tools are good for testing more developed UI ideas
- Two styles of tools: "prototyping" tools vs. UI builders
- Most ignore the "insides" of the application

Slide 3: Today
- Discount Usability Engineering
  - How to do heuristic evaluation
  - Comparison of techniques

Slide 4: Evaluation Techniques
- UI specialists
  - User studies / usability testing
  - Heuristic evaluation
- Interface designers and implementors
  - Software guidelines
  - Cognitive walkthroughs

Slide 5: What is Heuristic Evaluation?
- A "discount" usability testing method
- UI experts inspect an interface
  - a prototype initially; later, the full system
- Check the design against a list of design guidelines or heuristics
- Present a list of problems to the UI designers and developers, ranked by severity

Slide 6: Phases of Heuristic Evaluation (adapted from a slide by James Landay)
1) Pre-evaluation training: give evaluators the needed domain knowledge and information on the scenario
2) Evaluation: individuals evaluate and then aggregate results
3) Severity rating: determine how severe each problem is (priority)
4) Debriefing: discuss the outcome with the design team

Slide 7: Heuristic Evaluation
- Developed by Jakob Nielsen
- Helps find usability problems in a UI design
- A small set (3-5) of evaluators examines the UI
  - independently check it for compliance with usability principles ("heuristics")
  - different evaluators will find different problems
  - evaluators communicate only afterwards; findings are then aggregated
- Can be performed on a working UI or on sketches

Slide 8: Why Multiple Evaluators?
- No single evaluator finds every problem
- Good evaluators find both easy and hard problems

Slide 9: Heuristics (original)
- H1-1: Simple & natural dialog
- H1-2: Speak the users' language
- H1-3: Minimize users' memory load
- H1-4: Consistency
- H1-5: Feedback
- H1-6: Clearly marked exits
- H1-7: Shortcuts
- H1-8: Precise & constructive error messages
- H1-9: Prevent errors
- H1-10: Help and documentation

Slide 10: How to Perform H.E. (adapted from a slide by James Landay)
- At least two passes for each evaluator
  - first to get a feel for the flow and scope of the system
  - second to focus on specific elements
- Assistance from implementors/domain experts
  - if the system is walk-up-and-use or the evaluators are domain experts, no assistance is needed
  - otherwise, supply evaluators with scenarios and have implementors standing by

Slide 11: How to Perform H.E. (adapted from a slide by James Landay)
- Each evaluator produces a list of problems (a recording sketch follows this slide)
  - explain each problem with reference to heuristics or other information
  - be specific
  - list each problem separately
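
A minimal sketch of how such a problem list might be recorded so that several evaluators' independent findings can be merged later. The dataclass and field names are illustrative assumptions, not a form prescribed in the lecture.

    from collections import defaultdict
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Problem:
        location: str     # specific place in the UI
        description: str  # one problem per entry, described specifically
        heuristic: str    # the heuristic it violates, e.g. "H1-3"

    # Each evaluator submits an independent list; identical entries from
    # different evaluators are merged during aggregation.
    reports = {
        "evaluator-A": [Problem("transfer dialog",
                                "can't copy info between windows", "H1-3")],
        "evaluator-B": [Problem("transfer dialog",
                                "can't copy info between windows", "H1-3")],
    }

    found_by = defaultdict(list)
    for evaluator, problems in reports.items():
        for p in problems:
            found_by[p].append(evaluator)

    for p, evaluators in found_by.items():
        print(f"{p.heuristic}: {p.description} (found by {len(evaluators)})")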

Slide 12: Example Problem Descriptions (adapted from a slide by James Landay)
- Can't copy info from one window to another
  - violates "Minimize the users' memory load" (H1-3)
  - fix: allow copying
- Typography uses a mix of upper/lower-case formats and fonts
  - violates "Consistency and standards" (H2-4, from Nielsen's revised heuristic set)
  - slows users down
  - probably wouldn't be found by user testing
  - fix: pick a single format for the entire interface

Slide 13: How to Perform Evaluation (adapted from a slide by James Landay)
- Where problems may be found
  - a single location in the UI
  - two or more locations that need to be compared
  - a problem with the overall structure of the UI
  - something that is missing
    - missing elements are hard to find with paper prototypes, so work extra hard on those

Slide 14: Debriefing (adapted from a slide by James Landay)
- Conduct with evaluators, observers, and development team members
- Discuss general characteristics of the UI
- Suggest potential improvements to address major usability problems
- Development team rates how hard each problem is to fix
- Make it a brainstorming session
  - little criticism until the end of the session

Slide 15: Severity Rating (adapted from a slide by James Landay)
- Used to allocate resources to fix problems
- Estimates the need for further usability efforts
- A combination of
  - frequency
  - impact
  - persistence (one-time or recurring)
- Should be calculated after all evaluations are done
- Should be done independently by all evaluators

Slide 16: Severity Ratings (from Nielsen & Mack 94; adapted from a slide by James Landay)
0 - don't agree that this is a usability problem
1 - cosmetic problem
2 - minor usability problem
3 - major usability problem; important to fix
4 - usability catastrophe; imperative to fix
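
One simple way to combine the independent 0-4 ratings is to average them. The averaging rule below is an assumption for illustration (the slides do not specify a combination formula), though it matches the average-severity thresholds used later in the Jeffries et al. comparison.

    from statistics import mean

    # Independent 0-4 ratings for one problem, collected after all
    # evaluations are done (scale from Nielsen & Mack 94).
    ratings = {"evaluator-A": 3, "evaluator-B": 4, "evaluator-C": 3}

    severity = mean(ratings.values())  # averaging is an assumed convention
    print(f"average severity: {severity:.2f}")  # 3.33 -> major problem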

Slide 17: Severity Ratings Example (adapted from a slide by James Landay)
1. [H1-4 Consistency] [Severity 3] The interface used the string "Save" on the first screen for saving the user's file, but used the string "Write file" on the second screen. Users may be confused by this different terminology for the same function.

Slide 18: Results of Using HE (adapted from a slide by James Landay)
- Discount: benefit-cost ratio of 48 [Nielsen94]
  - cost was $10,500 for a benefit of $500,000
  - value of each problem found: ~$15K (Nielsen & Landauer)
  - how might we calculate this value? in-house: productivity; open market: sales
- There is a correlation between a problem's severity and its being found by HE
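
Worked check on the slide's own numbers: $500,000 / $10,500 ≈ 47.6, which rounds to the cited benefit-cost ratio of 48; and at roughly $15K of value per problem, the $500,000 benefit corresponds to on the order of 33 problems found.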

Slide 19: Results of Using HE (cont.) (adapted from a slide by James Landay)
- A single evaluator achieves poor results
  - finds only 35% of usability problems
  - 5 evaluators find ~75% of usability problems
- Why not more evaluators? 10? 20?
  - adding evaluators costs more
  - each additional evaluator finds fewer and fewer problems that are new (unique)

Slide 20: Decreasing Returns (adapted from a slide by James Landay)
[Two graphs: problems found vs. number of evaluators, and benefits/cost vs. number of evaluators]
- Caveat: the graphs are for a specific example
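
The shape of both curves can be reproduced with the Nielsen & Landauer prediction model, found(i) = N(1 - (1 - λ)^i), where λ is the probability that a single evaluator finds a given problem. The λ and N values in the sketch below are assumptions for illustration; note that no single λ fits both figures from the previous slide exactly (λ = 0.35 would predict about 88%, not 75%, for five evaluators), since those figures are empirical averages over case studies.

    # Nielsen & Landauer model of problems found by i evaluators:
    #   found(i) = N * (1 - (1 - lam)**i)
    # lam and N are illustrative assumptions, not values from the study.
    lam = 0.25  # assumed per-evaluator detection rate
    N = 100     # assumed total number of problems in the interface

    for i in range(1, 11):
        found = N * (1 - (1 - lam) ** i)
        print(f"{i:2d} evaluators: ~{found:5.1f} problems ({found / N:.0%})")

The shrinking gap between successive lines of output is the "decreasing returns" the graphs illustrate.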

Slide 21: Summary of H.E. (adapted from a slide by James Landay)
- Heuristic evaluation is a "discount" method
- Procedure
  - evaluators go through the UI twice
  - check to see if it complies with the heuristics
  - note where it doesn't and say why
- Follow-up
  - combine the findings from 3 to 5 evaluators
  - have evaluators independently rate severity
  - discuss problems with the design team
- Alternate with user testing

Slide 22: Study by Jeffries et al. 91
- Compared the four techniques
  - evaluating HP-VUE, a windows-like OS interface
  - used a standardized usability problem reporting form

Slide 23: Study by Jeffries et al. 91
- Compared the four techniques
  - Usability test
    - conducted by a human factors professional
    - six participants (subjects)
    - 3 hours learning, 2 hours doing ten tasks
  - Software guidelines
    - 62 internal guidelines
    - 3 software engineers (had not done the implementation, but had relevant experience)

Slide 24: Study by Jeffries et al. 91
- Compared the four techniques
  - Cognitive walkthrough
    - 3 software engineers, as a group (had not done the implementation, but had relevant experience)
    - procedure: developers "step through" the interface in the context of core tasks a typical user will need to accomplish
    - used tasks selected by Jeffries et al.
    - a pilot test was done first

Slide 25: Study by Jeffries et al. 91
- Compared the four techniques
  - Heuristic evaluation
    - 4 HCI researchers
    - two-week period (interspersed evaluation with the rest of their tasks)

Slide 26: Results of Study
- The H.E. group found the largest number of unique problems
- The H.E. group found more than 50% of all problems found

Slide 27: Results of Study [chart: problems discovered by each technique; data not preserved in the transcript]

Slide 28: Time Required (in person-hours) [chart; data not preserved in the transcript]

Slide 29: Average Severity Ratings
- Partition the problems into three groups by average severity rating
- Classify results into two of these groups
  - most severe: average >= 3.86; least severe: average <= 2.86

  Technique   Most severe   Least severe
  HE          28            52
  UT          18            2
  SG          12            11
  CW          9             10

Slide 30: Benefit/Cost Ratios
- Ratio = average severity rating / time
  - HE: 12
  - UT: 1
  - SG: 2
  - CW: 3
  - (S.G. becomes 6 if you factor in the time required for HP-VUE training)

Slide 31: HE vs. User Testing (adapted from a slide by James Landay)
- HE is much faster
  - 1-2 hours per evaluator vs. days to weeks
  - finds some problems user testing does not find
- HE doesn't require interpreting users' actions
- User testing is more accurate (by definition)
  - takes into account actual users and tasks
  - finds some problems HE doesn't find
  - HE may miss problems and report "false positives"
- Good to alternate between HE and user testing
  - they find different problems
  - don't waste participants

Slide 32: Next Time
- Work on low-fi prototypes in class

