Overview of User Interface Design, Prototyping, and Evaluation (or, All of HCI in 80 Minutes) Jason I. Hong January 18, 2007

User Interface Hall of Fame or Shame? IE5 page setup for printing. Problems: the codes for header & footer information require recall (recognition is better than recall); there is no equivalent GUI; help is the only way to find out what the codes mean, but that is not obvious.

Human-Computer Interaction (HCI). Human: the end-user of a program, and the others in the organization. Computer: the machine the program runs on; clients & servers, PDAs, cars, microwaves. Interaction: the user tells the computer what they want (input); the computer communicates results (output). So far you have probably studied lots about Computers, but little about Humans and Interaction. This course will concentrate on how these three areas come together.

HCI Approach to UI Design: humans, tasks, design, and technology, set within organizational & social issues. Examples of tasks: high level (writing a paper, drawing a picture); low level (copying a word from one paragraph to another, coloring a line). Other considerations we won't look at: business models, level of fun.

Why is HCI Important? The UI is a major part of the work for “real” programs (~50%). Bad user interfaces cost: money (reduced profits, call centers; WiFi Alliance: 30% of WiFi boxes returned), the reputation of the organization (e.g., brand loyalty), time (wasted effort and energy by users, rework), and lives (Therac-25). Studies have shown that the design, programming, and evaluation of the UI can take up to 50% of the project time and cost for a wide range of commercial and in-house software. Nearly 25% of all applications projects fail. Why? Some overrun budgets and management pulls the plug; others complete, but are too hard to learn or use. The solution is user-centered design. Why? Products that are easier to learn & use sell better; it can help keep a product on schedule (finding problems early makes them easier to fix!); and training costs are reduced.

Why is HCI Important? Privacy and security: phishing scams, accidental disclosures (e.g., location info, cookies), difficulty diagnosing the situation (intrusion detection), and users intentionally circumventing security mechanisms. User interfaces are hard to get right: people are unpredictable and the intuition of designers is often wrong, so we need good design methods.

Four Myths about Good Design Myth 1: Only experts create good designs experts faster, simple and effective techniques anyone can apply Myth 2: We can fix the user interface at the end good design is more than just user interface having right features, building those features right Myth 3: Good design takes too long / costs too much simple and effective techniques that can reduce total development time & cost (finds problems early on) Myth 4: Good design is just cool graphics graphics part of bigger picture of what to communicate & how

Outline: Overview of the Design Process; Design; Prototyping; Evaluation; User Models (Affordances, Mappings, Metaphors).

Who Builds User Interfaces? A team of specialists (ideally) graphic designers interaction / interface designers information architects technical writers marketers test engineers usability engineers software engineers users In this course you will wear the hats of many of these specialists.

Iterative design is crucial: prototype, evaluate, and repeat.

Design. Design is driven by requirements: focus on the core needs, not on how they are implemented (e.g., a specific device such as the Nokia N80 is not as important as the app being “mobile”); there might be multiple ways of achieving your goals. A design is a simplified representation of the desired artifact: text descriptions of tasks, screen sketches or storyboards, flow diagrams / outlines showing task structure, executable prototypes. Example task outline: Write essay (start word processor, write outline, fill out outline); Start word processor (find word processor icon, double-click on icon); Write outline (write down high-level ideas).

Web Design Representations Site Maps Storyboards Schematics Mock-ups Designers create representations of sites at multiple levels of detail Web designs are iteratively refined at all levels of detail

Usability Goals? According to the ISO: The effectiveness, efficiency, and satisfaction with which specified users achieve specified goals in particular environments This does not mean you have to create a “dry” design or something that is only good for novices – it all depends on your goals

Usability Goals. Set goals early and use them to measure progress. Goals often have tradeoffs, so prioritize. Example goals: Learnable (faster the 2nd time and so on), Memorable (from session to session), Flexible (multiple ways to accomplish tasks), Efficient (perform tasks quickly), Robust (minimal error rates, good feedback so the user can recover), Pleasing (high user satisfaction), Fun.

Know Thy User. Cognitive abilities: perception, physical manipulation, memory. Organizational / job abilities: what skills are required, who they talk to.

You Are Not the User. You already know too much. (Richard Gregory's Dalmatian image.)

You Are Not the User. You already know too much: it is easy to think of yourself as a typical user, and easy to make mistaken assumptions. Keep users involved throughout the design: understand the work process and get constant feedback. Adopt a user-centered design mind-set: think of the world in users' terms (empathy); don't be technology-centered or feature-driven; think of the benefit to users.

Outline: Overview of the Design Process; Design; Prototyping; Evaluation; User Models (Affordances, Mappings, Metaphors).

Example of Design Failure BART “Charge-a-Ticket” Machines allow riders to buy BART tickets or add fare takes ATM cards, credit cards, & cash

Example of Design Failure BART “Charge-a-Ticket” Machines allow riders to buy BART tickets or add fare takes ATM cards, credit cards, & cash Problems (?) no visual flow (Where do I start? Where do I go next?) one “path” of operation ticket type -> payment type -> payment -> ticket BART Plus has minimum of $28, no indication of this until after inserting >= $1 can’t switch to regular BART ticket too many instructions large dismiss transaction button does nothing

Lessons from the BART machine Can’t we just define “good” interfaces? “good” has to be taken in context of users might be acceptable for office work, not for play infinite variety of tasks and users guidelines are too vague to be generative e.g., “give adequate feedback” How can we avoid similar results? “What is required to perform the user’s tasks?”

Task Analysis. Find out who the users are and what tasks they need to perform. Observe existing work practices. Create scenarios of actual use. This lets us try new ideas before building software, and get rid of problems early in the design process while they are still cheap to fix!

Task Analysis Questions Who is going to use the system? What tasks do they now perform? What tasks are desired? How are the tasks learned? Where are the tasks performed? What’s the relationship between user & data?

Task Analysis Questions (cont.) What other tools does the user have? How do users communicate with each other? How often are the tasks performed? What are the time constraints on the tasks? What happens when things go wrong?

Who? Identity (an in-house or specific customer is easy; a broad product needs several prototypical users), Background, Skills, Work habits and preferences, Physical characteristics.

Who (BART)? Identity: people who ride BART; business people, students, disabled, elderly, tourists. Background: may have an ATM or credit card; have used other fare machines before. Skills: may know how to put cards into an ATM; know how to buy BART tickets.

Who (BART cont.)? Work habits and preferences? Some people: use BART 5 days a week Others: first time use Physical characteristics? varying heights -> don’t make it too high or too low!

What Tasks? Important for both automation and new functionality Understand relative importance of tasks Example: on-line billing small dentists office installed new billing system assistants became unhappy with new system old forms contained hand-written margin notes e.g., patient A’s insurance takes longer than most

Where is the Task Performed? Office, laboratory, point of sale? Effects of environment on users? Users under stress? Confidentiality required? Wet, dirty, or slippery hands? Soft drinks? Lighting? Noise?

What is the Relationship Between Users & Data? Public data? Open government records, public web sites Personal data? Ex. health records, bank records always accessed at same machine? do users move between machines? Common data? used concurrently? passed sequentially between users? Remote access required? Access to data restricted?

What Other Tools Does the User Have? More than just compatibility How user works with collection of tools to get things done Example: automating lab data collection how is data collected now? by what instruments and manual procedures? how is the information analyzed? are the results transcribed for records or publication? what media/forms are used and how are they handled?

How Often Do Users Perform the Tasks? Frequent users remember more details Infrequent users may need more help even for simple operations make these tasks possible to do Which function is performed most frequently? by which users? optimizing system for these tasks will improve perception of good performance

Involve Users to Answer Task Analysis Questions Users help designers learn: what they do and how they do it Developers reveal technical capabilities builds rapport and ideas of what is possible users can comment on whether ideas make sense How do we do this? observe & interview prospective users in work place!

Design from Data. Heated arguments with others on the design team over what tools, skills, and knowledge users have tend to generate lots of heat, little light. Go out to real users and get real data from them: find out what they really do and how your system would fit in. Are they too busy? Buy their time (t-shirts, coffee mugs) or find substitutes (medical students).

Can't we just ask users what they want? They are not familiar with what is possible with technology, with design constraints (budget, legacy code, time, etc.), with good design, or with security and privacy. Sometimes users don't know what they want (e.g., remote controls). Contextual inquiry is an important method for understanding users' needs. Also consider attitude vs. actual behavior. The relationship is master-apprentice, in contrast to other kinds of relationships, e.g., scientist-observer or parent-child.

Contextual Inquiry. Go to the workplace & see the work as it unfolds. People summarize, but we want details. Keep it concrete when people start to abstract: if they say “We usually get reports by email”, ask “Can I see one?” You are not there to get a list of questions answered, and not there to answer their questions either (easy traps to fall into). Contextual inquiry is not an interview; it is more of a process for understanding.

Contextual Inquiry. Facts are only the starting point; you want a design based on correct interpretations. Validate & rephrase: share interpretations to check your reasoning, e.g., “So accountability means a paper trail?” People will be uncomfortable until the phrasing is right. (In this case it turned out that accountability actually meant safety for personnel and equipment.)

Conducting a Contextual Inquiry. Use recording technologies: notebooks, tape recorders, still & video cameras. Structure: a conventional interview (15 minutes) to introduce the focus, deal with ethical issues, and get used to each other by collecting summary data; a transition (30 seconds) to state the new rules (they work while you watch & interrupt); the contextual interview itself (1-2 hours): take notes, draw, be nosy! (“Who was on the phone?”); a wrap-up (15 minutes) to summarize your notes & confirm what is important.

Selecting Tasks Choose real tasks users face Have a mixture of simple & complex tasks easy tasks (common or introductory) moderate tasks difficult tasks (infrequent or for power users) Good tasks are fundamental to good usability does your system support key tasks desired by users? how well does your system support these? we’ll see more of tasks in evaluation

Using Tasks in Design Write up a description of tasks (informal ok) tells a cohesive and plausible story run by users and rest of the design team do they make sense? Note that task descriptions focus on what, not how Manny is in the city at a club and would like to call his girlfriend, Sherry, to see when she will be arriving at the club. She called from a friend’s house while he was driving, so he couldn’t answer the phone. He would like to check his missed calls and find the number so that he can call her back.

Using Tasks in Design (contd.) Okay to have some freeform tasks Ex: “purchase tickets for a movie you want to see” navigation, reviews, shopping cart, etc specific tasks good for understanding usability, freeform tasks good for understanding usefulness

Outline: Overview of the Design Process; Design; Prototyping; Evaluation; User Models (Affordances, Mappings, Metaphors).

Why Do We Prototype? Quickly experiment with alternative designs Get feedback on your design faster fix problems before code is written saves time and money Keeps the design centered on the user must test & observe ideas with users

Fidelity in Prototyping. Fidelity refers to the level of detail. High fidelity: the prototype looks like the final product. Low fidelity: an artist's rendition with many details missing.

Low-fi Sketches & Storyboards

Low-fi Sketches & Storyboards

Ink Chat

Why Use Low-fi Prototypes? Traditional methods take too long: sketches -> build prototype -> evaluate -> iterate; we don't want to program for weeks or months before getting feedback. Instead, simulate the prototype: sketches -> evaluate -> iterate. The sketches act as prototypes: the designer “plays computer” while other design team members observe & record. It requires only kindergarten implementation skills, allows non-programmers to participate, and helps make sure everyone on the team is together.

Qualities of Lo-Fi Prototypes. Advice: avoid high-fidelity tools until necessary. An informal visual representation communicates “unfinished”, encourages creativity, is faster to create, and elicits higher-level feedback. A formal visual representation communicates “finished”, inhibits creativity (detailing), and is slower to create.

The Basic Materials Large, heavy, white paper (11 x 17) 5x8 in. index cards Post-its Tape, stick glue, correction tape Pens & markers (many colors & sizes) Overhead transparencies Scissors, X-acto knives, etc.

ESP

Constructing the Model

Constructing the Paper Prototype Set a deadline a few hours or 1-2 days don’t think for too long - build it! Draw a window frame on large paper Put different screen regions on cards anything that moves, changes, appears/disappears Ready response for any user action e.g., have those pull-down menus already made Use photocopier to make many versions

Advantages of Low-fi Prototyping Takes only a few hours no expensive equipment needed Can test multiple alternatives fast iterations number of iterations is tied to final quality Almost all interaction can be faked

Outline: Overview of the Design Process; Design; Prototyping; Evaluation; User Models (Affordances, Mappings, Metaphors). A lot here applies to evaluating both paper prototypes and real systems.

Preparing for a Test Select people representative of target users minimize use of friends or family job-specific vocab / knowledge approximate if needed medical students, engineering students use incentives to get participants Prepare tasks that are: typical of the product during actual use ideally, same defined tasks from beforehand Practice to avoid “bugs”

Conducting a Test. Roles: greeter (puts users at ease & gets demographic data), facilitator (the only team member who speaks; gives instructions & encourages thoughts and opinions), computer (simulates responses, without explanation), observers (take notes & recommendations). A typical session is 1 hour: preparation, the test, debriefing. Make a recording or take good notes.

Instructions to Participants Describe the purpose of the evaluation “I’m testing the product; I’m not testing you” Tell them they can quit at any time Demonstrate any equipment Explain how to think aloud tell us what they are trying to do tell us questions that arise as they work tell us things they read Explain that you will not provide help Explain the basic concept of the UI, but not too much Describe the task give written instructions, one task at a time

Conducting a Test Do a debriefing at the end Were tasks realistic? What parts made sense? Confusing? Any features missing?

Ethical Considerations Sometimes tests can be distressing users have left in tears You have a responsibility to alleviate make voluntary with informed consent avoid pressure to participate let them know they can stop at any time stress that you are testing the system, not them make collected data as anonymous as possible Often must get human subjects approval (IRB)

Deciding on Data to Collect. Two types of data: process data (observations of what users are doing & thinking) and bottom-line data (a summary of what happened: time, errors, success; the independent and dependent variables).

Which Type of Data to Collect? Always focus on process data first gives good overview of where problems are Bottom-line doesn’t tell you where to fix just says: “too slow”, “too many errors”, etc. Hard to get reliable bottom-line results need many users for statistical significance

Using the Test Results. Sort & prioritize observations: what was important? Were there lots of problems in the same area? Summarize the data: make a list of all critical incidents, positive & negative, and try to judge why each difficulty occurred. What does the data tell you? Did the UI work the way you thought it would? Are features missing?

Using the Results (cont.). Update the task analysis & rethink the design: rate the severity & ease of fixing the critical incidents; fix the severe problems & make the easy fixes. Will thinking aloud give the right answers? Not always: if you ask a question, people will always give an answer, even if it has nothing to do with the facts (the panty hose example), so try to avoid specific questions.

Measuring Bottom-Line Usability. Situations in which numbers are useful: time on task, # tasks successfully completed, # errors; define in advance what these mean. Do not combine with thinking-aloud. Why? Talking can affect speed & accuracy.
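To make these numbers concrete, here is a minimal sketch, with entirely invented session records and field names, of how time on task, completion rate, and error counts might be tallied; it is an illustration only, not part of the original lecture.

```python
# Minimal sketch (not from the lecture) of summarizing bottom-line usability data:
# time on task, # tasks successfully completed, and # errors. Records are hypothetical.
from statistics import mean

sessions = [
    # one record per participant per task
    {"participant": "P1", "task": "buy ticket", "seconds": 95,  "errors": 2, "completed": True},
    {"participant": "P2", "task": "buy ticket", "seconds": 140, "errors": 5, "completed": False},
    {"participant": "P3", "task": "buy ticket", "seconds": 80,  "errors": 1, "completed": True},
]

completion_rate = sum(1 for s in sessions if s["completed"]) / len(sessions)
avg_time = mean(s["seconds"] for s in sessions)
avg_errors = mean(s["errors"] for s in sessions)

# The slide's caveat applies: decide in advance what "completed" and "error" mean,
# and do not collect these numbers while participants are thinking aloud.
print(f"completion rate: {completion_rate:.0%}")   # 67%
print(f"mean time on task: {avg_time:.0f} s")      # 105 s
print(f"mean errors per task: {avg_errors:.1f}")   # 2.7
```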

Measuring Subjective User Preference Can rate system on a Likert scale Example: The user interface was easy to use 1 - Strongly disagree 2 - Disagree 3 - Neither agree nor disagree 4 - Agree 5 - Strongly agree Can be hard to be sure what data means novelty of UI, not realistic setting … Can also get some useful data by asking what they liked, disliked, where they had trouble, best part, worst part, etc. (redundant questions are OK)
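Purely as an illustrative sketch (the ratings below are invented, not course data), here is one way to tally Likert responses to the statement above; reporting the full distribution rather than a single average helps when it is hard to be sure what the data mean.

```python
# Hedged sketch (invented data): tallying Likert responses to
# "The user interface was easy to use", 1 = Strongly disagree ... 5 = Strongly agree.
from collections import Counter
from statistics import median

responses = [4, 5, 3, 4, 2, 4, 5, 3, 4, 4]   # one rating per participant (hypothetical)

counts = Counter(responses)
print("distribution:", {score: counts.get(score, 0) for score in range(1, 6)})
print("median rating:", median(responses))   # the median is safer than the mean for ordinal data
```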

Comparing Two Alternatives. Between-groups experiment: two groups of test users; each group uses only one of the systems. Within-groups experiment: one group of test users; each person uses both systems; you can't use the same tasks or order (learning effects); best for low-level interaction techniques. Between groups requires many more participants than within groups. See if the differences are statistically significant (assumes a normal distribution & the same standard deviation).
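As a hedged sketch of the "statistically significant" step, and not a method the lecture prescribes, the comparison could be run with SciPy's t-tests: an independent-samples test for a between-groups design (which, per the slide, assumes normal distributions and equal standard deviations) and a paired test for a within-groups design. The timing numbers below are invented.

```python
# Hedged sketch (not from the lecture): is the difference between design A and B significant?
from scipy import stats

# Between-groups: different participants used each design.
times_a = [62, 71, 58, 90, 77, 66]   # seconds on task, design A group
times_b = [55, 60, 49, 70, 64, 58]   # seconds on task, design B group
t, p = stats.ttest_ind(times_a, times_b)   # assumes normality and equal variances
print(f"between groups: t={t:.2f}, p={p:.3f}")

# Within-groups: the same participants used both designs (counterbalance task order!).
times_a_within = [62, 71, 58, 90, 77, 66]
times_b_within = [57, 66, 55, 81, 70, 60]  # same participants, design B
t, p = stats.ttest_rel(times_a_within, times_b_within)
print(f"within groups (paired): t={t:.2f}, p={p:.3f}")
```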

Reporting the Results Report what you did & what happened Images & graphs help people get it! Video clips can be quite convincing Especially for convincing other engineers

Discount Usability. A reaction to “expensive” user tests; meant to be cheap, fast, and effective. Methods: walkthroughs (put yourself in the shoes of a user, like a code walkthrough), low-fi prototyping, action analysis (GOMS), on-line / remote usability tests, heuristic evaluation.

Outline: Overview of the Design Process; Design; Prototyping; Evaluation; User Models (Affordances, Mappings, Metaphors).

Design of Everyday Things, by Don Norman (UCSD, Apple, HP, NN Group). The design of everyday objects illustrates the problems faced by designers of systems. Explains conceptual models: doors, washing machines, digital watches, telephones, ... with resulting design guides. Highly recommend this book. I re-read this book for the course. My lord, it's *really* good. A bit reductive, certainly not at the forefront of research in cognitive science. (For a more academic overview of the field, I'd recommend Pinker's book.) But it's very readable, offers compelling, reusable anecdotes, and can be read in a few hours.

Conceptual Models. A mental representation of how an object works and how its interface controls affect it. People may have preconceived models that are hard to change: infix (4 + 5) vs. postfix (4 5 +) calculators; dragging to the trash deletes a file but ejects a disk. The interface must communicate the model visually; online help and documentation can help, but shouldn't be necessary.
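To make the (4 5 +) example concrete, here is a tiny illustrative postfix (RPN) evaluator, not taken from the lecture: operands go on a stack and the operator consumes them, which is exactly the model that trips up users who expect infix (4 + 5).

```python
# Minimal postfix (RPN) evaluator, illustrating the "(4 5 +)" conceptual model.
# Purely an illustration of the calculator model, not code from the lecture.

def eval_postfix(tokens):
    """Evaluate a postfix expression given as a list of tokens."""
    stack = []
    ops = {
        "+": lambda a, b: a + b,
        "-": lambda a, b: a - b,
        "*": lambda a, b: a * b,
        "/": lambda a, b: a / b,
    }
    for tok in tokens:
        if tok in ops:
            b = stack.pop()   # second operand comes off the stack first
            a = stack.pop()
            stack.append(ops[tok](a, b))
        else:
            stack.append(float(tok))
    return stack.pop()

# "(4 + 5)" in the familiar infix model becomes "4 5 +" in the postfix model.
print(eval_postfix(["4", "5", "+"]))   # 9.0
```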

Affordances as Perceptual Clues. Well-designed objects have affordances: clues to their operation, often visual, but not always (e.g., speech). In DOET, affordances are visual. But affordances are more than that. Affordances can be ...

Affordances as Perceptual Clues. Siemens Pocket PC Phone: pen input, no keypad. Handspring Treo: pen input and keypad input.

Affordances as Perceptual Clues. Poorly-designed objects: no clues or misleading clues. (French artist Jacques Carelman's crazy design for a screw punch!)

Refrigerator (freezer and fresh food compartments). Problem: the freezer is too cold, but the fresh food is just right.

Refrigerator Controls (one dial labeled A B C D E, another labeled 7 6 5 4 3):
Normal settings: C and 5
Colder fresh food: C and 6-7
Coldest fresh food: B and 8-9
Colder freezer: D and 7-8
Warmer fresh food: C and 4-1
OFF (both): 0
What is your conceptual model?

A Common Conceptual Model: independent controls, with the letter dial and the number dial each driving its own cooling unit.

Actual Conceptual Model: both dials feed a single cooling unit. Now can you fix the problem? Possible solutions: make the controls map to the user's model, or make the controls map to the actual system.
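As a toy illustration of why this design confuses people, here is a sketch that assumes, as the "Make Things Visible" slide below hints, that one dial sets the single cooling unit's thermostat and the other sets what percentage of the cold air goes to each compartment; which dial does which, the numbers, and the scaling are all made up.

```python
# Toy model (not from the lecture) of the refrigerator's actual design:
# a single cooling unit, where one dial acts as the thermostat and the other
# splits the cold air between the two compartments. All numbers are invented.

def compartment_temps(thermostat, freezer_share):
    """
    thermostat:    0.0 (off) .. 1.0 (maximum cooling)
    freezer_share: 0.0 (all cold air to fresh food) .. 1.0 (all to freezer)
    Returns rough (freezer_temp_C, fresh_food_temp_C).
    """
    total_cooling = 30.0 * thermostat            # total cold air produced
    freezer = 5.0 - total_cooling * freezer_share
    fresh_food = 5.0 - total_cooling * (1.0 - freezer_share)
    return freezer, fresh_food

# Goal from the earlier slide: warm up the freezer WITHOUT changing fresh food.
print(compartment_temps(0.7, 0.6))   # current setting: freezer -7.6, fresh food -3.4
print(compartment_temps(0.7, 0.5))   # freezer warmer, but fresh food gets colder
print(compartment_temps(0.6, 0.6))   # both compartments get warmer
# Under the "two independent cooling units" model, one dial would do the job;
# with the real single-unit design, every adjustment affects both compartments.
```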

Design Model & User Model System Image Users get model from experience & usage through system image What if the two models don’t match?

Conceptual Model Mismatch Mismatch between designer’s and user’s conceptual model leads to… Slow performance Errors And inability to recover Frustration ...

Notorious Example

A Personal Example

Make Things Visible. Refrigerator: make the A..E dial indicate something about the percentage of cooling between the two compartments? Controls available on a watch with 3 buttons: too many functions and the controls are not visible! Compare to the controls on a simple car radio: #controls = #functions, and the controls are labeled and grouped together.

Map Interface Controls. Controls should mirror the real world. Which is better for a dashboard front/back speaker control? (Dashboard image.)

Map Interface Controls

Map Interface Controls

Four Key Ideas: Iterative design is crucial; Know Thy User; You Are Not the User; Design from Data.

Summary: Overview of the Design Process; Design (Task Analysis, Contextual Inquiry, Tasks); Prototyping (Paper prototyping); Evaluation (Lab tests, Heuristic evaluation); User Models (Affordances, Mappings, Metaphors).

Using the Data You Learn Say who the users are (use personas or profiles)

Example Persona. Name: Patricia. Age: 31. Occupation: Sales Manager, IKEA Store. Hobbies: painting; fitness/biking; taking son Devon to the park. Likes: emailing friends & family; surprises for her husband; talking on her cell phone with friends; Top 40 radio stations; eating Thai food; going to sleep late. Dislikes: slow service at checkout lines; smokers. “Wixon and colleagues were developing an interface for a file management system. It passed lab tests with flying colors, but bombed as soon as customers got it. The problem was that it had a scrolling file list that was (say) twenty characters wide, but the file names customers used were very long, and in fact often identical for more than twenty characters (the names were made up by concatenating various qualifiers, and for many names the first several qualifiers would be the same.) Customers were not amused by needing to select from a scrolling list of umpty-ump identical entries that stood for different files. And this wasn't an oddball case, it was in fact typical. How had it been missed in the lab tests? Nobody thought it would matter what specific file names you used for the test, so of course they were all short.” – from Chapter 2 of Task-Centered User Interface Design: A Practical Introduction, by Clayton Lewis and John Rieman.

Using the Data You Learn Say who the users are (use personas or profiles) personas do not have to be a real person, but should be based on real facts and details design can really differ depending on who the target is provide names (easier to reference) characteristics of the users (job, expertise, etc.) Might have one persona for each class of users helps the design team think in terms of the users Keep in mind we already use personas “I wouldn’t like that” “My mom wouldn’t be able to use that”

1 Minute Break Good design matters in all areas of our lives The little things really do matter A designer’s proposed changes to airport screenings Picture from NYTimes, Molotch I believe

Metaphor. Definition? Lakoff & Johnson, Metaphors We Live By: “The transference of the relation between one set of objects to another set for the purpose of brief explanation.” Lakoff & Johnson, Metaphors We Live By: “...the way we think, what we experience, and what we do every day is very much a matter of metaphor.” In our language & thinking, e.g., “argument is war”: he attacked every weak point; criticisms right on target; if you use that strategy. We can use metaphors to leverage existing conceptual models.

Desktop Metaphor. Suggests a conceptual model: the “user illusion”. Not really an attempt to simulate a real desktop; it leverages existing knowledge about files, folders, and trash, and is a way to explain why some windows seemed blocked.

Example Metaphors. Global metaphors: personal assistant, e-wallet, e-mail, digital library. Data & function: rolodex, to-do list, calendar, documents. Collections: drawers, files, books, newspapers, photo albums. Develop an interface metaphor or conceptual model, communicate that metaphor to the user, and provide high-level task-oriented operations, not low-level implementation commands.

What Happens When Things Go Wrong? How do people deal with task-related errors? practical difficulties? catastrophes? Is there a backup strategy?

Discount Usability Engineering Reaction to excuses for not doing user testing “too expensive”, “takes too long”, … Cheap no special labs or equipment needed the more careful you are, the better it gets Fast on order of 1 day to apply standard usability testing may take a week or more Easy to use some techniques can be taught in 2-4 hours

Example Heuristic H2-8: Aesthetic and minimalist design no irrelevant information in dialogues

Example Heuristic H2-9: Help users recognize, diagnose, and recover from errors error messages in plain language precisely indicate the problem constructively suggest a solution