SIMS 213: User Interface Design & Development Marti Hearst Tues, Feb 21, 2006.

Today Modes Selected Design Guidelines Heuristic Evaluation –How to do it –How well does it work? –In-class H.E.

Modes What are they? –The same user actions have different effects in different situations. –Examples? Caps Lock, Adobe Reader drawing example vs. PowerPoint drawing example

Modes When are they useful? –Temporarily restrict users’ actions –When logical and clearly visible and easily switchable Drawing with paintbrush vs. pencil Autocorrect (if easy to switch the mode) Why can they be problematic? –Big memory burden –Source of many serious errors How can these problems be fixed? –Don’t use modes – redesign the system to be modeless –Redundantly visible –Raskin -- quasimodes
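The contrast between a sticky mode and a Raskin-style quasimode can be sketched in a few lines. This is a hypothetical toy (the class and method names are invented for illustration): in the modal editor the user must remember which tool is active, while in the quasimodal editor the alternate behavior lasts only while a key is physically held down, so there is no hidden state to forget.

```python
class ModalEditor:
    """Sticky mode: the same 'drag' action means different things
    depending on state the user must remember."""
    def __init__(self):
        self.tool = "pencil"          # persistent, often invisible mode

    def toggle_tool(self):
        self.tool = "paintbrush" if self.tool == "pencil" else "pencil"

    def drag(self):
        return f"draw with {self.tool}"


class QuasimodalEditor:
    """Quasimode: holding a key selects the paintbrush; releasing it
    always returns to the pencil, so the mode cannot be forgotten."""
    def __init__(self):
        self.key_held = False

    def drag(self):
        return "draw with paintbrush" if self.key_held else "draw with pencil"


modal = ModalEditor()
modal.toggle_tool()               # user switched tools earlier...
print(modal.drag())               # ...so the same drag now surprises them

quasi = QuasimodalEditor()
quasi.key_held = True             # behavior is tied to a visibly held key
print(quasi.drag())
```

The quasimode trades a small physical effort (holding the key) for the elimination of a whole class of mode errors.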

Modal Blooper

Modal Blooper (and other problems too)

A Summary Statement Raskin, p. 69 –“We must make sure that every detail of an interface matches both our cognitive capabilities and the demands of the task…”

Design Guidelines There are LOTS of them –Based on common sense and experience Not necessarily proven –Often conflict with one another –Often don’t say HOW to implement them

Design Guidelines What to do: –Focus on those guidelines most applicable to the kind of interface under development –Focus on those emphasized in our readings Bloopers, chapter 1 Usability Engineering, chapter 5 Raskin, chapter 3 All of Don Norman’s concerns –Use common sense

All-Star Usability Design Guidelines An edited selection

Slide adapted from Saul Greenberg 1 Simple and natural dialogue Use the user’s conceptual model Match the users’ task in as natural a way as possible –minimize mapping between interface and task semantics

1 Simple and natural dialogue Present exactly the information the user needs –less is more less to learn, to get wrong, to distract... –information should appear in natural order related information is graphically clustered order of accessing information matches user’s expectations –remove or hide irrelevant or rarely needed information competes with important information on screen –use windows frugally don’t make navigation and window management excessively complex

Slide adapted from Saul Greenberg

2 Speak the users’ language Examples?

Slide adapted from Saul Greenberg 3 Minimize user’s memory load Computers are good at remembering things; people aren’t! Promote recognition over recall –menus, icons, choice dialog boxes vs command lines, field formats –relies on visibility of objects to the user (but less is more!)

Slide adapted from Saul Greenberg 3 Minimize user’s memory load

Slide adapted from Saul Greenberg 3: Minimize user’s memory load Describe the required input format, with an example and a default Small number of rules applied universally –generic commands same command can be applied to all interface objects copy, cut, paste, drag ’n drop,... for characters, words, paragraphs, circles, files
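The "small number of rules applied universally" idea is essentially polymorphism at the UI level: one clipboard implementation serves every object type, so users learn copy and paste exactly once. A minimal sketch (hypothetical class, not from any real toolkit):

```python
class Clipboard:
    """One generic clipboard for characters, shapes, files, etc."""
    def __init__(self):
        self._content = None

    def copy(self, obj):
        self._content = obj           # same command for any object type

    def paste(self):
        return self._content


clip = Clipboard()
clip.copy("some text")
print(clip.paste())                   # works for text...
clip.copy({"shape": "circle", "r": 5})
print(clip.paste())                   # ...and identically for a shape
```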

Slide adapted from Saul Greenberg 3: Minimize user’s memory load

Slide adapted from Saul Greenberg 4: Be consistent Consistency of effects –same words, commands, actions will always have the same effect in equivalent situations Consistency of language and graphics –same information/controls in same location on all screens / dialog boxes –same visual appearance across the system (e.g. widgets) e.g. different scroll bars in a single window system! Consistency of input –consistent syntax across complete system

Slide adapted from Saul Greenberg 4. Be Consistent These are labels with a raised appearance. Is it any surprise that people try to click on them?

Slide adapted from Saul Greenberg 5: Provide feedback Continuously inform the user about –what it is doing –how it is interpreting the user’s input –user should always be aware of what is going on > Doit What’s it doing? > Doit This will take 5 minutes... Time for coffee.

Slide adapted from Saul Greenberg 5. Provide feedback Should be as specific as possible, based on user’s input Best within the context of the action

Slide adapted from Saul Greenberg 5. Provide feedback What did I select? What mode am I in now? How is the system interpreting my actions?

Slide adapted from Saul Greenberg 5. Provide feedback Drawing Board LT: multiple files being copied, but feedback is file by file.

Slide adapted from Saul Greenberg 5. Provide feedback Response time –how users perceive delays 0.1 second max: perceived as “instantaneous” 1 second max: user’s flow of thought stays uninterrupted, but delay noticed 10 seconds: limit for keeping user’s attention focused on the dialog > 10 seconds: user will want to perform other tasks while waiting
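The response-time thresholds above translate directly into a rule for choosing a feedback style. A sketch (the function name and return strings are invented for illustration):

```python
def feedback_for(delay_seconds):
    """Pick a feedback style from an operation's expected delay,
    following the 0.1s / 1s / 10s thresholds on the slide."""
    if delay_seconds <= 0.1:
        return "none needed: feels instantaneous"
    if delay_seconds <= 1.0:
        return "cursor change: delay noticed, flow of thought intact"
    if delay_seconds <= 10.0:
        return "progress indicator: keeps attention on the dialog"
    return "progress bar plus cancel: user will switch tasks"


print(feedback_for(0.05))
print(feedback_for(12))
```

The key design point is that the feedback mechanism is chosen per operation, from its expected duration, rather than one-size-fits-all.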

Slide adapted from Saul Greenberg 6. Provide clearly marked exits How do I get out of this?

Slide adapted from Saul Greenberg 6. Provide clearly marked exits Users don’t like to feel trapped by the computer! –should offer an easy way out of as many situations as possible Strategies: –Cancel button (for dialogs waiting for user input) –Universal Undo (can get back to previous state) –Interrupt (especially for lengthy operations) –Quit (for leaving the program at any time) –Defaults (for restoring a property sheet)

Slide adapted from Saul Greenberg 7. Provide shortcuts Experienced users should be able to perform frequently used operations quickly Strategies: –keyboard and mouse accelerators abbreviations command completion menu shortcuts function keys double clicking vs menu selection –type-ahead (entering input before the system is ready for it) navigation jumps e.g., going to location directly, and avoiding intermediate nodes –history systems WWW: ~60% of pages are revisits
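One of the shortcut strategies listed above, command completion, is easy to sketch: typing a unique prefix expands to the full command, saving keystrokes for expert users. The command set and function below are hypothetical, for illustration only:

```python
COMMANDS = ["copy", "cut", "close", "paste", "print"]

def complete(prefix):
    """Expand a unique prefix to its command; otherwise list candidates."""
    matches = [c for c in COMMANDS if c.startswith(prefix)]
    if len(matches) == 1:
        return matches[0]             # unambiguous: expand it
    return matches                    # ambiguous: offer the candidates


print(complete("pa"))                 # -> "paste"
print(complete("c"))                  # -> ["copy", "cut", "close"]
```

Note that completion supports experts without penalizing novices: a full command typed out still works unchanged.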

Keyboard accelerators for menus Customizable toolbars and palettes for frequent actions Split menu, with recently used fonts on top Scrolling controls for page-sized increments Double-click raises object- specific menu Double-click raises toolbar dialog box

Slide adapted from Saul Greenberg 8: Deal with errors in a positive and helpful manner People will make errors! Errors we make –Mistakes arise from conscious deliberations that lead to an error instead of the correct solution –Slips unconscious behaviour that gets misdirected en route to satisfying a goal –e.g. drive to the store, end up at the office shows up frequently in skilled behaviour –usually due to inattention often arises from similarities of actions

Slide adapted from Saul Greenberg Designing for slips General rules –Prevent errors before they occur –Detect and correct errors when they do occur –User correction through feedback and undo

Slide adapted from Saul Greenberg Designing for slips Examples –mode errors have as few modes as possible (preferably none) make modes highly visible –capture errors instead of confirmation, make actions undoable allows reconsideration of action by user –loss of activation if system knows goal, make it explicit if not, allow person to see path taken –description errors in icon-based interfaces, make sure icons are not too similar, check for reasonable input, etc.

Slide adapted from Saul Greenberg 8 Deal with errors in a positive and helpful manner Prevent errors –try to make errors impossible –modern widgets: only “legal commands” selected, or “legal data” entered Provide reasonableness checks on input data –on entering an order for office supplies 5000 pencils is an unusually large order. Do you really want to order that many?
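The office-supplies example distinguishes three cases: invalid input (reject), implausible input (confirm), and plausible input (accept). A sketch with a hypothetical threshold table:

```python
PLAUSIBLE_MAX = {"pencils": 500}      # hypothetical per-item thresholds

def check_order(item, quantity):
    """Hard-reject impossible input, but only *confirm* implausible
    input: huge quantities are legal, just suspicious."""
    if quantity <= 0:
        return "error: quantity must be positive"
    if quantity > PLAUSIBLE_MAX.get(item, 1000):
        return (f"{quantity} {item} is an unusually large order. "
                "Do you really want to order that many?")
    return "ok"


print(check_order("pencils", 12))     # accepted silently
print(check_order("pencils", 5000))   # asks for confirmation
```

The design point is the middle case: a reasonableness check asks rather than blocks, since the unusual value might be exactly what the user intends.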

Slide adapted from Saul Greenberg 9. Provide help Help is no substitute for good design! Simple systems: –walk up and use; minimal instructions Most other systems: –feature rich –some users will want to become “experts” rather than “casual” users –intermediate users need reminding, plus a learning path Volume 37: A user's guide to...

Slide adapted from Saul Greenberg Types of help Tutorial and/or getting started manuals –short guides that people are likely to read when first obtaining their systems encourages exploration and getting to know the system tries to get conceptual material across and essential syntax –on-line “tours”, exercises, and demos demonstrates very basic principles through working examples

Slide adapted from Saul Greenberg Types of help Reference manuals –used mostly for detailed lookup by experts –on-line hypertext search / find table of contents index cross-index

Slide adapted from Saul Greenberg Types of help Reminders –short reference cards expert user who just wants to check facts novice who wants to get overview of system’s capabilities –keyboard templates shortcuts/syntactic meanings of keys; recognition vs. recall; capabilities –tooltips text over graphical items indicates their meaning or purpose

Slide adapted from Saul Greenberg Types of help Context-sensitive help –system provides help on the interface component the user is currently working with Tool tips Macintosh “balloon help” Microsoft “What’s this” help –brief help explaining whatever the user is pointing at on the screen

Slide adapted from Saul Greenberg Types of help Wizards –walks user through typical tasks – but problematic if user gets stuck

Slide adapted from Saul Greenberg Types of help Tips –migration path to learning system features –also context-specific tips on being more efficient –must be “smart”, otherwise boring and tedious

Discount Usability Testing Heuristic Evaluation

Evaluating UI Designs Usability testing is a major technique “Discount” methods don’t require users –Heuristic Evaluation –Cognitive Walkthrough

Heuristic Evaluation Developed by Jakob Nielsen Helps find usability problems in a UI design Small set (3-5) of evaluators examine UI –independently check for compliance with usability principles (“heuristics”) –different evaluators will find different problems –evaluators only communicate afterwards findings are then aggregated Can perform on working UI or on sketches

Adapted from slide by James Landay Phases of Heuristic Evaluation 1) Pre-evaluation training –give evaluators needed domain knowledge and information on the scenarios 2) Evaluation –individuals evaluate and then aggregate results 3) Severity rating –determine how severe each problem is (priority) 4) Debriefing –discuss the outcome with design team

Heuristics (original) H1-1: Simple & natural dialog H1-2: Speak the users’ language H1-3: Minimize users’ memory load H1-4: Consistency H1-5: Feedback H1-6: Clearly marked exits H1-7: Shortcuts H1-8: Precise & constructive error messages H1-9: Prevent errors H1-10: Help and documentation

Adapted from slide by James Landay How to Perform H.E. At least two passes for each evaluator –first to get feel for flow and scope of system –second to focus on specific elements Assistance from implementors/domain experts –If system is walk-up-and-use or evaluators are domain experts, then no assistance needed –Otherwise might supply evaluators with scenarios and have implementors standing by

Adapted from slide by James Landay How to Perform Evaluation Where problems may be found –single location in UI –two or more locations that need to be compared –problem with overall structure of UI –something that is missing

Adapted from slide by James Landay Example Problem Descriptions Can’t copy info from one window to another –Violates “Minimize the users’ memory load” (H1-3) –Fix: allow copying Typography uses mix of upper/lower case formats and fonts –Violates “Consistency and standards” (H2-4) –Slows users down –Probably wouldn’t be found by user testing –Fix: pick a single format for entire interface

Adapted from slide by James Landay Severity Ratings Used to allocate resources to fix problems Estimates of need for more usability efforts Combination of –frequency –impact –persistence (one time or repeating) Should be calculated after all evaluations are done Should be done independently by all evaluators

Adapted from slide by James Landay Severity Ratings (from Nielsen & Mack 94) 0 - don’t agree that this is a usability problem 1 - cosmetic problem 2 - minor usability problem 3 - major usability problem; important to fix 4 - usability catastrophe; imperative to fix
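Since each evaluator rates every problem independently on the 0–4 scale above, the ratings must then be combined to set fix priority. A sketch with made-up data (the two problems echo the earlier example descriptions; the averaging scheme is one common choice, not prescribed by the slides):

```python
# Each problem maps to the independent 0-4 severity ratings it received.
ratings = {
    "Save vs. Write file wording": [3, 3, 4],
    "Mixed upper/lower case labels": [1, 2, 1],
}

def prioritize(ratings):
    """Average each problem's ratings and sort worst-first."""
    means = {p: sum(r) / len(r) for p, r in ratings.items()}
    return sorted(means.items(), key=lambda kv: kv[1], reverse=True)


for problem, score in prioritize(ratings):
    print(f"{score:.2f}  {problem}")
```

Averaging across evaluators is what makes the independent ratings useful: a single evaluator's severity estimate is noisy, but the mean of several is a reasonable basis for allocating fixing effort.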

Adapted from slide by James Landay Severity Ratings Example 1. [H1-4 Consistency] [Severity 3] The interface used the string "Save" on the first screen for saving the user's file, but used the string "Write file" on the second screen. Users may be confused by this different terminology for the same function.

Adapted from slide by James Landay Debriefing Conduct with evaluators, observers, and development team members Discuss general characteristics of UI Suggest potential improvements to address major usability problems Dev. team rates how hard things are to fix Make it a brainstorming session

Next Time Evaluation of using Heuristic Evaluation Low-Fi Prototyping