SIMS 213: User Interface Design & Development Marti Hearst Tues, Feb 25, 2003.

Evaluating UI Designs
Usability testing is a major technique
“Discount” methods don’t require users
– Heuristic Evaluation
– Cognitive Walkthrough

Heuristic Evaluation
Developed by Jakob Nielsen
Helps find usability problems in a UI design
Small set (3-5) of evaluators examine the UI
– independently check for compliance with usability principles (“heuristics”)
– different evaluators will find different problems
– evaluators only communicate afterwards; findings are then aggregated
Can be performed on a working UI or on sketches

Phases of Heuristic Evaluation (adapted from slide by James Landay)
1) Pre-evaluation training
– give evaluators needed domain knowledge and information on the scenarios
2) Evaluation
– individuals evaluate and then aggregate results
3) Severity rating
– determine how severe each problem is (priority)
4) Debriefing
– discuss the outcome with the design team

Heuristics (original)
H1-1: Simple & natural dialog
H1-2: Speak the users’ language
H1-3: Minimize users’ memory load
H1-4: Consistency
H1-5: Feedback
H1-6: Clearly marked exits
H1-7: Shortcuts
H1-8: Precise & constructive error messages
H1-9: Prevent errors
H1-10: Help and documentation

How to Perform Heuristic Evaluation (adapted from slide by James Landay)
At least two passes for each evaluator
– first to get a feel for the flow and scope of the system
– second to focus on specific elements
Assistance from implementors/domain experts
– if the system is walk-up-and-use or the evaluators are domain experts, no assistance is needed
– otherwise, supply evaluators with scenarios and have implementors standing by

How to Perform Evaluation (adapted from slide by James Landay)
Where problems may be found:
– a single location in the UI
– two or more locations that need to be compared
– a problem with the overall structure of the UI
– something that is missing

Example Problem Descriptions (adapted from slide by James Landay)
Can’t copy info from one window to another
– violates “Minimize the users’ memory load” (H1-3)
– fix: allow copying
Typography uses a mix of upper/lower case formats and fonts
– violates “Consistency and standards” (H2-4)
– slows users down
– probably wouldn’t be found by user testing
– fix: pick a single format for the entire interface

Severity Ratings (adapted from slide by James Landay)
Used to allocate resources to fix problems
Estimates of the need for more usability efforts
A combination of
– frequency
– impact
– persistence (one time or repeating)
Should be calculated after all evaluations are done
Should be done independently by all evaluators

Severity Ratings (from Nielsen & Mack 1994; adapted from slide by James Landay)
0 – don’t agree that this is a usability problem
1 – cosmetic problem
2 – minor usability problem
3 – major usability problem; important to fix
4 – usability catastrophe; imperative to fix

Severity Ratings Example (adapted from slide by James Landay)
1. [H1-4 Consistency] [Severity 3] The interface used the string "Save" on the first screen for saving the user's file, but used the string "Write file" on the second screen. Users may be confused by this different terminology for the same function.
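To make the reporting format and rating scale concrete, here is a minimal sketch (not part of the original slides) of how one evaluator’s finding could be recorded and how independent severity ratings might be averaged. The `Finding` class and its field names are hypothetical illustrations, not a prescribed format.

```python
from dataclasses import dataclass
from statistics import mean

# Severity scale from Nielsen & Mack (1994), as on the slide above:
# 0 = not a problem, 1 = cosmetic, 2 = minor, 3 = major, 4 = catastrophe.

@dataclass
class Finding:
    description: str   # what the evaluator observed
    heuristic: str     # which heuristic it violates, e.g. "H1-4 Consistency"
    severity: int      # this evaluator's independent 0-4 rating

def mean_severity(ratings):
    """Average the independent severity ratings given to one problem."""
    return mean(ratings)

# Example: the "Save" vs. "Write file" inconsistency above,
# rated independently by three evaluators.
ratings = [3, 3, 2]
print(round(mean_severity(ratings), 2))  # 2.67 -> treat as a major problem
```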

Debriefing (adapted from slide by James Landay)
Conduct with evaluators, observers, and development team members
Discuss general characteristics of the UI
Suggest potential improvements to address major usability problems
Development team rates how hard things are to fix
Make it a brainstorming session

Results of Using HE (cont.) (adapted from slide by James Landay)
A single evaluator achieves poor results
– only finds 35% of usability problems
– 5 evaluators find ~75% of usability problems
Why not more evaluators? 10? 20?
– adding evaluators costs more
– adding more evaluators doesn’t increase the number of unique problems found

Decreasing Returns (adapted from slide by James Landay)
[Graphs from Nielsen: problems found and benefits/cost as a function of the number of evaluators]
Caveat: these graphs are for a specific example; this is a controversial point.
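The decreasing-returns curve is commonly modeled with Nielsen & Landauer’s formula, found(i) = N(1 − (1 − λ)^i), where λ is the share of problems a single evaluator finds. The sketch below is only an illustration: it assumes λ ≈ 0.35 to match the single-evaluator figure on the previous slide, so it will not reproduce the ~75% quoted for 5 evaluators, which comes from a specific study.

```python
def proportion_found(n_evaluators: int, detection_rate: float = 0.35) -> float:
    """Expected share of problems found by n independent evaluators,
    using the 1 - (1 - lambda)^n model (Nielsen & Landauer, 1993).
    detection_rate (lambda) is assumed here; it varies with the
    interface and the evaluators' expertise."""
    return 1.0 - (1.0 - detection_rate) ** n_evaluators

for n in (1, 3, 5, 10):
    print(n, round(proportion_found(n), 2))
# 1 0.35, 3 0.73, 5 0.88, 10 0.99 -- most of the gain comes from
# the first few evaluators, hence the decreasing-returns curve.
```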

Why Multiple Evaluators?
Every evaluator doesn’t find every problem
Good evaluators find both easy & hard ones

Comments from Today’s CHI-Web
From Gilbert Cockton (2/19/02):
– Inspection methods are discount methods for practitioners. They are not rigorous scientific methods.
– All inspection methods are subjective.
– No inspection method can compensate for inexperience or poor judgement.
– Using multiple analysts results in an inter-subjective synthesis. However, this also a) raises the false alarm rate, unless a voting system is applied, and b) reduces the hit rate if a voting system is applied!
– Group synthesis of a prioritized problem list seems to be the most effective current practical approach.
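As a concrete illustration of the voting trade-off Cockton describes, the hypothetical sketch below (not from the slides) keeps only problems reported by at least a minimum number of evaluators: raising the threshold suppresses false alarms but also discards real problems that only one evaluator noticed.

```python
from collections import Counter

def vote_filter(reports, min_votes=2):
    """Keep only problems flagged by at least min_votes evaluators.
    `reports` maps each evaluator to the set of problem IDs they
    reported (an illustrative structure, not a standard format)."""
    counts = Counter(pid for found in reports.values() for pid in found)
    return {pid for pid, n in counts.items() if n >= min_votes}

reports = {
    "eval_a": {"save-vs-write", "no-undo"},
    "eval_b": {"save-vs-write"},
    "eval_c": {"no-undo", "tiny-font"},
}
print(vote_filter(reports, min_votes=2))
# {'save-vs-write', 'no-undo'} -- "tiny-font" is dropped, which cuts
# false alarms but would also lose it if it were a real problem.
```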

In-Class Heuristic Evaluation
Windows Character Map Program