SIMS 213: User Interface Design & Development Marti Hearst Tues, April 6, 2004.



Today
– Evaluation based on Cognitive Modeling
– Comparing Evaluation Methods

Another Kind of Evaluation
Evaluation based on cognitive modeling:
– Fitts' Law: used to predict a user's time to select a target
– Keystroke-Level Model: low-level description of what users would have to do to perform a task
– GOMS: structured, multi-level description of what users would have to do to perform a task

Slide adapted from Melody Ivory
GOMS at a glance
– Proposed by Card, Moran & Newell in 1983: applying psychology to computer science
– Employs a user model (the Model Human Processor) to predict task performance in a UI: task completion time, short-term memory requirements
– Applicable to user interface design and evaluation, training, and documentation
– An example of automating usability assessment

Slide adapted from Melody Ivory
Model Human Processor (MHP)
– Card, Moran & Newell (1983): the most influential model of user interaction; used in GOMS analysis
– Three interacting subsystems: cognitive, perceptual & motor
– Each subsystem has its own processor & memory, described by parameters (e.g., capacity, cycle time)
– Serial & parallel processing
Adapted from slide by Dan Glaser

Slide adapted from Melody Ivory
Original GOMS (CMN-GOMS)
Card, Moran & Newell (1983): an engineering model of user interaction
– Goals: the user's intentions (tasks), e.g., delete a file, edit text, assist a customer
– Operators: actions to complete a task; cognitive, perceptual & motor (MHP); low-level (e.g., move the mouse to a menu)

Slide adapted from Melody Ivory
CMN-GOMS
Engineering model of user interaction (continued)
– Methods: sequences of actions (operators); based on an error-free expert; there may be multiple methods for accomplishing the same goal (e.g., shortcut key or menu selection)
– Selections: rules for choosing the appropriate method; the method is predicted based on context; hierarchy of goals & sub-goals

Keystroke-Level Model
– Simpler than CMN-GOMS
– Developed to predict the time to accomplish a task on a computer
– Predicts expert, error-free task-completion time given the following inputs: a task or series of subtasks; the method used; the command language of the system; motor-skill parameters of the user; response-time parameters of the system
– The prediction is the sum of the subtask times plus overhead

Slide adapted from Newstetter & Martin, Georgia Tech
KLM-GOMS (keystroke-level model)
1. Predict (what Raskin refers to as GOMS): sum the times of the individual actions, e.g., Action 1 (x sec) + Action 2 (y sec) + Action 3 (z sec) = t sec
2. Evaluate: compare the predicted time using interface 1 against the predicted time using interface 2

Slide adapted from Newstetter & Martin, Georgia Tech
Symbols and values (assumption: expert user)
Operator  Remarks                        Time (s)
K         Press key                      0.2
B         Press mouse button             0.2
P         Point with mouse               1.1
H         Home hands to/from keyboard    0.4
D         Drawing                        domain dependent
M         Mentally prepare               1.35
R         Response from system           measured (Raskin excludes)

Slide adapted from Newstetter & Martin, Georgia Tech
Raskin's rules
– Rule 0 (initial insertion of candidate M's): insert an M before each K, and before each P iff that P selects a command (i.e., not when P points to arguments)
– Rule 1 (deletion of anticipated M's): if the operator following an M is fully anticipated, delete that M (e.g., when you point and click)

Slide adapted from Newstetter & Martin, Georgia Tech
Raskin's rules (continued)
– Rule 2 (deletion of M's within cognitive units): if a string of MK's belongs to a single cognitive unit (e.g., typing the digits of one number), delete all M's but the first
– Rule 3 (deletion of M's before consecutive terminators): if a K is a redundant delimiter (e.g., the second of the consecutive terminators in )' ), delete the M before it

Slide adapted from Newstetter & Martin, Georgia Tech
Raskin's rules (continued)
– Rule 4 (deletion of M's that are terminators of commands): if a K is a delimiter that follows a constant string, delete the M in front of it
– Rule 5 (deletion of overlapped M's): do not count any M that overlaps an R
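Rules 0 and 2 can be applied mechanically to an operator string. The sketch below (written for this transcript, not from Raskin) simplifies Rule 0 by inserting an M before every K, P, and B, ignoring the command-vs-argument distinction; Rule 1 and the judgment of where one cognitive unit ends still require a human analyst.

```python
import re

def rule0(ops: str) -> str:
    """Rule 0 (simplified): insert a candidate M before each K, P, and B."""
    return "".join("M" + op if op in "KPB" else op for op in ops)

def rule2(ops: str) -> str:
    """Rule 2 (simplified): within each run of MKMK..., keep only the
    first M, treating the whole run as one cognitive unit."""
    return re.sub(r"MK((?:MK)+)",
                  lambda m: "MK" + m.group(1).replace("M", ""), ops)

# Raw operators from the temperature-converter example on the next slide:
seq = rule0("HPBHKKKKK")
print(seq)         # HMPMBHMKMKMKMKMK  (matches the slide's Rule 0 step)
print(rule2(seq))  # HMPMBHMKKKKK  (Rule 1 and the Enter key's own
                   # cognitive unit still need human judgment)
```

The deck's final answer (HMPBHMKKKKMK) differs because Rule 1 removes the anticipated M before B and the trailing Enter keeps its own M; those steps depend on task semantics a regex cannot see.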

Slide adapted from Newstetter & Martin, Georgia Tech
Example 1: Temperature Converter
Interface: choose which conversion is desired (Convert F to C / Convert C to F), then type the temperature and press Enter.
Raw operators:          H P B H K K K K K
Apply Rule 0:           H M P M B H M K M K M K M K M K
Apply Rules 1 and 2:    H M P B H M K K K K M K
Convert to numbers:     0.4 + 1.35 + 1.1 + 0.2 + 0.4 + 1.35 + (4 × 0.2) + 1.35 + 0.2 = 7.15 sec
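The totals in this example (and in the alternatives later in the deck) are plain sums of operator times, which a few lines of Python can reproduce. The values are the standard KLM constants assumed throughout this deck, with the mouse-button press B given the same 0.2 s as K:

```python
# Minimal KLM time calculator (a sketch, not a full GOMS tool).
KLM_TIMES = {
    "K": 0.20,  # press a key
    "B": 0.20,  # press a mouse button (treated like K here)
    "P": 1.10,  # point with the mouse
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mentally prepare
}

def klm_time(sequence: str) -> float:
    """Sum operator times for a string such as 'HMPBHMKKKKMK'."""
    return round(sum(KLM_TIMES[op] for op in sequence), 2)

print(klm_time("HMPBHMKKKKMK"))  # temperature-converter example: 7.15
print(klm_time("MKKKKMK"))       # first alternative below: 3.7
print(klm_time("MKKKK"))         # second alternative below: 2.15
```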


Example 2
– GUI temperature interface; assume a button for compressing the scale
– Ends up being much slower: average prediction of 16.8 seconds

Using KLM and Information Theory to Design More Efficient Interfaces (Raskin)
Armed with knowledge of the minimum information the user has to specify:
– Assume inputting 4 digits on average
– One more keystroke to choose C vs. F
– Another keystroke for Enter
Can we design a more efficient interface?

Using KLM to Make More Efficient Interfaces
First alternative: to convert temperatures, type in the numeric temperature, followed by C for Celsius or F for Fahrenheit; the converted temperature is displayed.
MKKKKMK = 3.7 sec

Using KLM to Make More Efficient Interfaces
Second alternative: type the number and the interface translates to both C and F simultaneously.
MKKKK = 2.15 sec

Slide adapted from Melody Ivory
GOMS in Practice
– Mouse-driven text editor (KLM)
– CAD system (KLM)
– Television control system (NGOMSL)
– Minimalist documentation (NGOMSL)
– Telephone assistance operator workstation (CPM-GOMS): saved about $2 million a year

Drawbacks
– Assumes an expert user
– Assumes error-free usage
– Overall, very idealized

Slide adapted from Newstetter & Martin, Georgia Tech
Fitts' Law
Models movement time for selection tasks. The movement time for a well-rehearsed selection task:
– increases as the distance to the target increases
– decreases as the size of the target increases

Slide adapted from Newstetter & Martin, Georgia Tech
Fitts' Law
Time (in msec) = a + b log2(D/S + 1)
where
– a, b = constants (empirically derived)
– D = distance to the target
– S = size of the target
– ID, the Index of Difficulty, = log2(D/S + 1)
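The formula is easy to sketch in code. The constants a and b below are made-up placeholders for illustration, not empirically derived values from the lecture:

```python
import math

def index_of_difficulty(distance: float, size: float) -> float:
    """ID = log2(D/S + 1), in bits."""
    return math.log2(distance / size + 1)

def movement_time(distance: float, size: float,
                  a: float = 50.0, b: float = 150.0) -> float:
    """Predicted movement time in msec; a and b are placeholder constants."""
    return a + b * index_of_difficulty(distance, size)

# Doubling both D and S leaves ID, and thus the predicted time, unchanged:
print(index_of_difficulty(200, 20) == index_of_difficulty(400, 40))  # True
print(movement_time(30, 10))  # ID = log2(4) = 2 bits -> 50 + 150*2 = 350.0
```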

Slide adapted from Pourang Irani
Fitts' Law illustrations, each comparing two targets (Target 1, Target 2) under Time = a + b log2(D/S + 1):
– Same ID → same difficulty
– Smaller ID → easier
– Larger ID → harder

Slide adapted from Pourang Irani
Determining Constants for Fitts' Law
To determine a and b, design a set of tasks with varying values of D and S (conditions).
– For each task condition, multiple trials are conducted; the time to execute each is recorded and stored electronically for statistical analysis
– Accuracy is also recorded, either through the x-y coordinates of selection or through the error rate: the percentage of trials selected with the cursor outside the target
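The fitting step this slide describes can be sketched as ordinary least squares of measured time on ID. The (ID, time) pairs below are invented, perfectly linear illustration data, not real measurements:

```python
def fit_fitts(ids, times):
    """Return (a, b) minimizing squared error of time = a + b * ID."""
    n = len(ids)
    mean_id = sum(ids) / n
    mean_t = sum(times) / n
    # Slope and intercept of simple linear regression.
    b = (sum((x - mean_id) * (t - mean_t) for x, t in zip(ids, times))
         / sum((x - mean_id) ** 2 for x in ids))
    a = mean_t - b * mean_id
    return a, b

ids = [1.0, 2.0, 3.0, 4.0]            # index of difficulty (bits)
times = [250.0, 400.0, 550.0, 700.0]  # measured times (msec), made up
a, b = fit_fitts(ids, times)
print(a, b)  # exactly linear data, so a = 100.0, b = 150.0
```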

Slide adapted from Pourang Irani
A Quiz Designed to Give You Fitts
Microsoft toolbars offer the user the option of displaying a label below each tool. Name at least one reason why labeled tools can be accessed faster. (Assume the user knows the tool and does not need the label simply to identify it.)

Slide adapted from Pourang Irani
A Quiz Designed to Give You Fitts
1. The label becomes part of the target, so the target is bigger. Bigger targets, all else being equal, can always be accessed faster (Fitts' Law).
2. When labels are not used, the tool icons crowd together.

Slide adapted from Pourang Irani A Quiz Designed to Give You Fitts You have a palette of tools in a graphics application that consists of a matrix of 16x16-pixel icons laid out as a 2x8 array that lies along the left-hand edge of the screen. Without moving the array from the left-hand side of the screen or changing the size of the icons, what steps can you take to decrease the time necessary to access the average tool?

Slide adapted from Pourang Irani
A Quiz Designed to Give You Fitts
1. Change the array to 1x16, so all the tools lie along the edge of the screen.
2. Ensure that the user can click on the very first row of pixels along the edge of the screen to select a tool. There should be no buffer zone.

Comparing Evaluation Methods Jeffries et al., 1991

Comparing Evaluation Methods
"User Interface Evaluation in the Real World: A Comparison of Four Techniques" (Jeffries et al., CHI 1991)
Compared:
– Heuristic Evaluation (HE): 4 evaluators, 2 weeks' time
– Software Guidelines (SG): 3 software engineers, familiar with Unix
– Cognitive Walkthrough (CW): 3 software engineers, familiar with Unix
– Usability Testing (UT): a usability professional, 6 participants
The interface: HP-VUE, a GUI for Unix (beta version)

Comparing Evaluation Methods Jeffries et al., CHI ‘91

On a 9-point scale; higher is more critical.

Comparing Evaluation Methods Jeffries et al., CHI ‘91

Conclusions:
– HE is best from a cost/benefit standpoint, but requires access to several experienced designers
– Usability testing was second best: it found recurring, general, and critical errors, but is expensive to conduct
– Guideline-based evaluators missed a lot but did not realize it (they were software engineers, not usability specialists)
– The cognitive walkthrough process was tedious