Prof. James A. Landay, University of Washington, Winter 2007
(1) Action Analysis (2) Automated Evaluation
January 8, 2007



1/8/2007, CS490f II - User Interface, Design, Prototyping, & Evaluation

Hall of Fame or Hall of Shame?
Bryce 2
–for building 3D models

Hall of Shame!
Icons all look similar
–what do they do?
How do you exit?
Note
–nice visuals, but must be usable
–what if purely for entertainment & exploration?


Outline
Action analysis
–GOMS? What's that?
–the G, O, M, & S of GOMS
–how to do the analysis
Announcements
Automated evaluation tools

Action Analysis Predicts Performance
Cognitive model
–models some aspect of human understanding, knowledge, intentions, or processing
–two types
  competence – predicts behavior sequences
  performance – predicts performance, but limited to routine behavior
Action analysis uses a performance model to analyze goals & tasks
–generally done hierarchically (similar to task analysis)

GOMS – Most Popular Action Analysis Technique
Family of UI modeling techniques
–based on the Model Human Processor
GOMS stands for
–Goals
–Operators
–Methods
–Selection rules
Input: detailed description of UI & task(s)
Output: qualitative & quantitative measures

Quick Example
Goal (the big picture)
–go from hotel to the airport
Methods (or subgoals)
–walk, take bus, take taxi, rent car, take train
Operators (or specific actions)
–locate bus stop; wait for bus; get on the bus; ...
Selection rules (choosing among methods)
–example: walking is cheaper, but tiring and slow
–example: taking a bus is complicated abroad

Goals
Something the user wants to achieve
Examples
–go to airport
–delete file
–create directory
Hierarchical structure
–may require many subgoals

Methods
Sequence of steps to accomplish a goal
–goal decomposition
–can include other goals
Assumes method is learned & routine
Examples
–drag file to trash
–retrieve command from long-term memory

Operators
Specific actions (small scale or atomic)
Lowest level of analysis
–can associate times with them
Examples
–locate icon for item on screen
–move cursor to item
–hold mouse button down
–locate destination icon
–read the dialog box

Selection Rules
If there is more than one method to accomplish a goal, selection rules pick the method to use
Examples
–IF <condition> THEN accomplish <goal> using <method 1>
–IF <other condition> THEN accomplish <goal> using <method 2>
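The four components just defined fit together naturally as a small data structure: a goal owns a set of selection rules, each rule picks a method, and each method decomposes into operators. The sketch below is purely illustrative (the function and field names are made up, not from any GOMS tool), using the hotel-to-airport example:

```python
# Minimal illustrative GOMS sketch (hypothetical names, not a real tool).
# A goal is achieved by one of several methods; a selection rule picks
# the method; a method is a sequence of operators (specific actions).

def select_method(goal, context):
    """Selection rule: pick a method for a goal given the context."""
    for condition, method in goal["rules"]:
        if condition(context):
            return method
    return goal["default"]

go_to_airport = {
    "rules": [
        (lambda ctx: ctx["abroad"], "take taxi"),     # buses are complicated abroad
        (lambda ctx: ctx["short_on_money"], "walk"),  # walking is cheaper, but slow
    ],
    "default": "take bus",
}

methods = {
    # each method decomposes into operators (could also invoke subgoals)
    "take bus": ["locate bus stop", "wait for bus", "get on the bus"],
    "walk": ["get directions", "walk to airport"],
    "take taxi": ["hail taxi", "state destination", "ride", "pay"],
}

method = select_method(go_to_airport, {"abroad": True, "short_on_money": False})
print(method)           # the method the rules chose
print(methods[method])  # the operator sequence for that method
```

In a fuller model the method entries would be nested goals rather than flat strings, mirroring the hierarchical decomposition the slides describe.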

GOMS Output
Execution time
–add up times from operators
–assumes expert users (who have mastered the tasks) & error-free behavior
–very good rank ordering
–absolute accuracy ~10-20%
Procedure learning time (NGOMSL only)
–accurate for relative comparison only
–doesn't include time for learning domain knowledge
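The "add up times from operators" step is purely mechanical. The sketch below uses the commonly cited Keystroke-Level Model operator times (K ≈ 0.2 s per keystroke for an average skilled typist, P ≈ 1.1 s to point with a mouse, H ≈ 0.4 s to home hands between devices, M ≈ 1.35 s of mental preparation); the task encoding at the bottom is a made-up example, not from the lecture:

```python
# KLM-style execution time: sum per-operator times over the sequence.
# Operator times are the commonly cited KLM averages, in seconds.
KLM_TIMES = {
    "K": 0.2,   # keystroke (average skilled typist)
    "P": 1.1,   # point with mouse at a target
    "H": 0.4,   # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def execution_time(operators):
    """Predicted expert, error-free execution time for an operator list."""
    return sum(KLM_TIMES[op] for op in operators)

# Hypothetical encoding of "delete file by dragging it to the trash":
# think, home hands to mouse, point at the icon, point at the trash.
drag_to_trash = ["M", "H", "P", "P"]
print(round(execution_time(drag_to_trash), 2))
```

Note how this directly reflects the caveat above: the prediction is for a practiced expert who makes no errors, so it is most trustworthy for rank-ordering designs rather than as an absolute time.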

GOMS Output Used To
Ensure frequent goals can be achieved quickly
Making the hierarchy is often the value
–functionality coverage & consistency
  does the UI contain the needed functions?
  consistency: are similar tasks performed similarly?
–operator sequence
  in what order are individual operations done?

How to do GOMS Analysis
Generate task description
–pick high-level user Goal
–write Method for accomplishing Goal – may invoke subgoals
–write Methods for subgoals
  this is recursive
  stops when Operators are reached
Evaluate description of task
Apply results to UI
Iterate!

Comparative Example – DOS
Goal: delete a file
Method for accomplishing goal of deleting file
–retrieve from long-term memory that command verb is "del"
–think of directory name & file name and make it the first listed parameter
–accomplish goal of entering & executing command
–return with goal accomplished

Comparative Example – Mac
Goal: delete a file
Method for accomplishing goal of deleting file
–find file icon
–accomplish goal of dragging file to trash
–return with goal accomplished

Comparative Example – DOS
Goal: remove a directory
Method for accomplishing goal of removing a directory
–?

Comparative Example – DOS
Goal: remove a directory
Method for accomplishing goal of removing a directory
–accomplish goal of making sure directory is empty
–retrieve from long-term memory that command verb is "RMDIR"
–think of directory name and make it the first listed parameter
–accomplish goal of entering & executing command
–return with goal accomplished

Comparative Example – Mac
Goal: remove a directory
Method for accomplishing goal of removing a directory
–?

Comparative Example – Mac
Goal: remove a directory
Method for accomplishing goal of removing a directory
–find folder icon
–accomplish goal of dragging folder to trash
–return with goal accomplished
Note the consistency with delete file on the Mac! This makes it much easier.
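Once the methods are written down like this, the two designs can be compared mechanically, for example by counting method steps per goal and by checking whether related goals share a method shape (the consistency point above). The step lists below paraphrase the slides; this is an illustrative toy, not measured data:

```python
# Toy comparison of the two designs by number of method steps per goal.
# Step lists paraphrase the lecture's methods; purely illustrative.
dos = {
    "delete file": ["recall 'del'", "think of dir & file name",
                    "enter & execute command"],
    "remove directory": ["ensure directory is empty", "recall 'RMDIR'",
                         "think of directory name", "enter & execute command"],
}
mac = {
    "delete file": ["find file icon", "drag file to trash"],
    "remove directory": ["find folder icon", "drag folder to trash"],
}

for goal in dos:
    print(f"{goal}: DOS {len(dos[goal])} steps, Mac {len(mac[goal])} steps")

# Consistency check: do the Mac's goals share one method shape
# (same verb sequence)? Shared shape = easier to learn.
mac_shapes = {tuple(step.split()[0] for step in m) for m in mac.values()}
print(len(mac_shapes))  # 1 -> both goals are find-then-drag
```

The same shape-comparison idea scales up: a full GOMS hierarchy lets you ask whether every container-like object in the UI is deleted the same way.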

Applications of GOMS
Compare different UI designs
Profiling (time)
Building a help system – why?
–modeling makes user tasks & goals explicit
–can suggest questions users might ask & the answers

What GOMS Can Model
Task must be goal-directed
–some activities are more goal-directed than others
–creative activities may not be as goal-directed
Task must use routine cognitive skills
–as opposed to problem solving
–good for things like machine operators
Serial & parallel tasks (CPM-GOMS)

Real-world GOMS Applications
Keystroke-Level Model (KLM)
–mouse-based text editor
–mechanical CAD system
NGOMSL
–TV control system
–nuclear power plant operator's associate
CPM-GOMS
–telephone operator workstation

Advantages of GOMS
Gives qualitative & quantitative measures
Model explains the results
Less work than a user study – no users needed!
Easy to modify when the UI is revised
Research: tools to aid the modeling process, since it can still be tedious

Disadvantages of GOMS
Not as easy as heuristic evaluation, guidelines, etc.
Takes lots of time, skill, & effort
Only works for goal-directed tasks
Assumes tasks are performed by experts without error
Does not address several UI issues
–e.g., readability & memorizability of icons and commands

Announcements
Teams
–Cluster: Carolyn, Sierra, Fred, Chris
–Panlingual Mobile Camera: Jonathan, Tim, Martin, Peter, Kinsley
–Don't Forget: Andy, Kenneth, Elisabeth, Kevin
1st homework due Friday – update Denim prototype based on HE results & other issues you are aware of
Questions?

Rapid Iterative Design is the Best Practice for Creating Good UIs
Design → Prototyping → Evaluation
We have seen how computer-based tools can improve the Design (e.g., Denim) & Prototyping (e.g., VB) phases

Automated GOMS Tools
Can save, modify & re-use the model
Automate calculations such as execution time

QGOMS tool

CRITIQUE
Hudson et al. (1999)
1. Prototype system
–in this case with the SubArctic toolkit
2. Demonstrate a procedure (task)
–record events
–apply rules
3. Automatically generate KLMs
4. Semi-automatically generate classic GOMS models
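The "record events, apply rules" step of a CRITIQUE-style tool can be approximated as below. The event names and the chunking rule used to insert mental (M) operators are deliberate simplifications of my own, not the actual CRITIQUE rules:

```python
# Simplified sketch of generating a KLM from a recorded event stream:
# map each UI event to a physical operator, and insert a mental-preparation
# operator (M) at each "chunk" boundary. Here the chunk rule is crude:
# a boundary is wherever the event type changes. All names are made up.
EVENT_TO_OP = {"click": "P", "keypress": "K"}

def events_to_klm(events):
    ops, prev = [], None
    for ev in events:
        if ev != prev:          # crude chunk boundary -> mental prep
            ops.append("M")
        ops.append(EVENT_TO_OP[ev])
        prev = ev
    return ops

# A demonstrated task: click a field, type three characters, click OK.
trace = ["click", "keypress", "keypress", "keypress", "click"]
print(events_to_klm(trace))
```

Real tools use much more careful heuristics for M placement (that is where most of the modeling subtlety lives), but the overall pipeline of events → operators → model is as sketched.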

Factors Driving Repeat Visits Should Drive Evaluation
–High-quality content: 75%
–Ease of use: 66%
–Quick to download: 58%
(Source: Forrester Research, 1/99)

Warning I was a founder of the following company – watch for bias!

The Trouble With Current Site Analysis Tools
Unknowns about visitors before they leave:
–Who are they? What do they want? Why?
–Did they find it? Were they satisfied?

NetRaker Provided User-centric Remote Evaluation Using Key Metrics
NetRaker Index
–short pop-up survey shown to 1 in n visitors
–on-going tracking & evaluation data
Market Research & Usability Templates
–surveys & task testing
–invitation delivered through email, links, or pop-ups

NetRaker Usability Research: see how customers accomplish real tasks on the site


WebQuilt: Visual Analysis
Goals
–link page elements to user actions
–identify behavior/navigation patterns
–highlight potential problem areas
Solution: interactive graph based on web content
–nodes represent web pages
–edges represent aggregate traffic between pages
–designers can indicate expected paths
–color-code common usability interests
–filtering to show only target participants
–zooming for analyzing data at varying granularity
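At its core, the aggregate graph described above is just a count of page-to-page transitions over all logged sessions. A minimal sketch (the log format and page names are invented for illustration):

```python
# Build a WebQuilt-style aggregate navigation graph from per-user click logs.
# Nodes are pages; edge weight = number of traversals from page A to page B.
# The session format and page names here are invented for illustration.
from collections import Counter

def aggregate_traffic(sessions):
    edges = Counter()
    for path in sessions:                      # one ordered page list per user
        for src, dst in zip(path, path[1:]):
            edges[(src, dst)] += 1
    # Heavy edges reveal common navigation patterns; edges off the
    # designer's expected path flag potential problem areas.
    return edges

sessions = [
    ["home", "search", "results", "item"],
    ["home", "search", "results", "search"],   # backtracking behavior
    ["home", "about"],
]
edges = aggregate_traffic(sessions)
print(edges[("home", "search")])     # 2 users took this edge
print(edges[("results", "search")])  # 1 backtrack edge
```

Everything else in the tool (expected-path highlighting, participant filtering, zooming) is a view over this weighted graph, restricted or re-rendered per the designer's query.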


Advantages of Remote Usability Testing
Fast
–can set up research in 3-4 hours
–get results in 36 hours
More accurate
–can run with large samples → statistical significance
–uses real people (customers) performing tasks
–natural environment (home/work/own machine)
Easy to use
–templates make set-up easy
Can compare with competitors
–indexed to national norms

Disadvantages of Remote Usability Testing
Miss observational feedback
–facial expressions
–verbal feedback (critical incidents)
Need to involve human participants
–costs some money (typically $20-$50/person)
People often do not like pop-ups
–need to be careful when using them

Summary
GOMS
–provides info about important UI properties
–doesn't tell you everything you want to know about a UI
  only gives performance for expert, error-free behavior
–hard to create the model, but still easier than user testing
  changing it later is much less work than the initial generation
Automated usability
–faster than traditional techniques
–can involve more participants → convincing data
–easier to do comparisons across sites
–tradeoff: lose observational data

Next Time
Advanced User Testing
–Appendix A from The Design of Sites
–Gomoll paper
–Statistica Ch. 1, and parts of Ch. 3
–Lewis & Rieman Ch. 5

BACKUP

Max – WebCriteria's GOMS Model
Predicts how long information-seeking tasks would take on a particular web site
Automated procedure: seed with start page and goal page
–procedure reads page
–model predicts how long to find & click the proper link
  load time, scan time, and mouse movement time
–repeat until goal page is found
Claim: time is directly related to usability
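The loop the slide describes can be sketched as follows. The site structure, per-page time constants, and the link-relevance function below are all placeholders of my own, not details of the actual Max model:

```python
# Sketch of a Max/WebCriteria-style robot: from a start page, repeatedly
# pick the link the model rates most relevant to the goal, accumulating
# predicted load + scan + mouse time per page, until the goal is reached.
# Site graph, time constants, and scoring rule are invented placeholders.

SITE = {  # page -> outgoing links
    "home":     ["products", "about"],
    "products": ["widget", "gadget"],
    "about":    [],
    "widget":   [],
    "gadget":   [],
}

def score(link, goal):
    """Placeholder relevance model: crude character overlap with the goal."""
    return len(set(link) & set(goal))

def predicted_time(start, goal, load=2.0, scan=1.5, mouse=1.1):
    page, total = start, 0.0
    while page != goal:
        total += load + scan + mouse                      # per-page cost terms
        links = SITE[page]
        if not links:
            return None                                   # dead end: model gives up
        page = max(links, key=lambda l: score(l, goal))   # best path only
    return total

print(predicted_time("home", "widget"))
```

The sketch also makes the model's key weakness concrete, as the next slides argue: the robot always takes the single best-scoring path and never errs, whereas real users explore many paths and make mistakes.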

Advantages of Max-style Model
Inexpensive (no users needed)
Fast (robot runs & then computes model)
Can run on many sites & compare → benchmarks

Disadvantages of Max-style Model
Focus on time (much of it download time)
–only 3rd among important factors driving repeat visits
–can't tell you anything about your content
–doesn't say anything directly about usability problems
Robots aren't humans
–a robot doesn't make mistakes
  remember, GOMS assumes expert behavior!
–doesn't account for understanding text
–only tries the best path – users will use many
Major flaw is the lack of real users in the process

NetRaker Index: On-going customer intelligence gathering
–small number of rotated questions increases response rate


NetRaker Index: On-going customer intelligence gathering
–increasing these indices (e.g., retention) moderately (5%) leads to a large increase in revenue growth