User Interface Evaluation: Heuristic Evaluation (Lecture #17)

Agenda
Evaluation through Expert Analysis:
– Cognitive walkthrough
– Heuristic evaluation
– Model-based evaluation
– Cognitive dimensions of notations

Heuristic Evaluation
Nielsen et al. [1994] devised the method, which is widely treated as the most efficient usability inspection method.
References:
1. "Heuristic Evaluation" by Jakob Nielsen, in Usability Inspection Methods, edited by J. Nielsen and R. L. Mack, John Wiley, New York, 1994.
2. Jakob Nielsen's website

Heuristic Evaluation
The method is a cost-effective, fast, relatively simple, and flexible approach:
– It can be performed on a design specification, so it can be used for evaluation early in the design process
– It can also be used on prototypes, storyboards, and fully functioning systems

HE: Basic Concept
Several evaluators evaluate the interface and come up with potential usability problems. It is important that these evaluations be done independently.
To aid the evaluators in discovering usability problems, Nielsen proposed 10 usability heuristics:
– A number of these are recognizably derived from Ben Shneiderman's principles of direct manipulation, although they apply to a wide range of interaction styles
– They are called heuristics because they are more in the nature of rules of thumb than specific usability guidelines

HE: 10 Usability Heuristics
1. Visibility of system status
– The system should always keep users informed about what is going on, through appropriate feedback within reasonable time.
2. Match between system and the real world
– The system should speak the users' language, with words, phrases and concepts familiar to the user, rather than system-oriented terms. Follow real-world conventions, making information appear in a natural and logical order.
3. User control and freedom
– Users often choose system functions by mistake and will need a clearly marked "emergency exit" to leave the unwanted state without having to go through an extended dialogue. Support undo and redo.

HE: 10 Usability Heuristics
4. Consistency and standards
– Users should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions.
5. Error prevention
– Even better than good error messages is a careful design which prevents a problem from occurring in the first place.
6. Recognition rather than recall
– Make objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.

HE: 10 Usability Heuristics
7. Flexibility and efficiency of use
– Accelerators, unseen by the novice user, may often speed up the interaction for the expert user, such that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions.
8. Aesthetic and minimalist design
– Dialogues should not contain information which is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility.
9. Help users recognize, diagnose, and recover from errors
– Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.

HE: 10 Usability Heuristics
10. Help and documentation
– Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user's task, list concrete steps to be carried out, and not be too large.
Jakob Nielsen originally developed the heuristics for heuristic evaluation in collaboration with Rolf Molich in 1990. Nielsen has since refined the heuristics based on a factor analysis of 249 usability problems to derive a set of heuristics with maximum explanatory power, resulting in this revised set.
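The ten heuristics are easy to keep at hand as a checklist. A minimal sketch in Python (an illustrative data structure of our own, not part of Nielsen's method) that later examples can reference by number:

```python
# Nielsen's ten usability heuristics as a checklist, so that evaluation
# findings can reference them by number (1-10). Illustrative only.
NIELSEN_HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
    "Recognition rather than recall",
    "Flexibility and efficiency of use",
    "Aesthetic and minimalist design",
    "Help users recognize, diagnose, and recover from errors",
    "Help and documentation",
]
```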

HE: Evaluation Procedure
Each evaluator assesses the system and notes violations of any of these usability heuristics that would indicate a potential usability problem.
The evaluator also assesses the severity of each usability problem, based on four factors:
1. How common is the problem?
2. How easy is it for the user to overcome?
3. Will it be a one-off problem or a persistent problem?
4. How seriously will the problem be perceived?

HE: Evaluation Procedure
All these factors can be combined into an overall severity rating on a scale of 0-4 (Nielsen):
0 = I don't agree that this is a usability problem at all
1 = Cosmetic problem only; need not be fixed unless extra time is available on the project
2 = Minor usability problem; fixing this should be given low priority
3 = Major usability problem; important to fix, so should be given high priority
4 = Usability catastrophe; imperative to fix this before the product can be released
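A minimal sketch of how an evaluator's reports might be recorded against the heuristics and prioritized with this 0-4 scale (the example findings are invented for illustration):

```python
from dataclasses import dataclass

# Nielsen's 0-4 severity scale, as listed above.
SEVERITY_LABELS = {
    0: "Not a usability problem",
    1: "Cosmetic problem only",
    2: "Minor usability problem",
    3: "Major usability problem",
    4: "Usability catastrophe",
}

@dataclass
class Finding:
    description: str  # what the evaluator observed
    heuristic: int    # which of the 10 heuristics is violated (1-10)
    severity: int     # 0-4 on Nielsen's scale

# Invented example findings, for illustration only.
findings = [
    Finding("No feedback while a report is generating", heuristic=1, severity=3),
    Finding("'Abort' used where users expect 'Cancel'", heuristic=2, severity=1),
    Finding("No undo after deleting a record", heuristic=3, severity=4),
]

# Prioritize: the most severe problems should be fixed first.
for f in sorted(findings, key=lambda f: f.severity, reverse=True):
    print(f"[{f.severity}] {SEVERITY_LABELS[f.severity]} "
          f"(heuristic #{f.heuristic}): {f.description}")
```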

HE: Number of Evaluators
Nielsen's 10 usability heuristics are clearly an important ingredient of HE. In addition, Nielsen argued that the number of evaluators required in an HE is another important issue:
– In general, HE is difficult for a single individual to do, because one person will never be able to find all the usability problems in an interface
– Research reveals that different people find different usability problems

HE: Number of Evaluators
[Figure: matrix of evaluators (rows) versus usability problems (columns), described on the next slide]

HE: Number of Evaluators
Experiment by Nielsen: a case study where 19 evaluators were used to find 16 usability problems.
– Each row represents one of the 19 evaluators and each column represents one of the 16 usability problems
– Each square shows whether the problem was detected (black) or not (white)
– The rows and columns are presented in sorted order

HE: Number of Evaluators
Observation:
– It is therefore essential that HE involves multiple evaluators (preferably with different backgrounds) in order to consider the system from different perspectives
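To see why, here is a small illustration (the evaluators and problem IDs are invented): each evaluator independently finds a different subset of problems, and aggregating the reports as a set union recovers more than any single report:

```python
# Each set holds the problem IDs one evaluator reported independently.
evaluator_reports = [
    {"P1", "P2", "P5"},        # evaluator A
    {"P2", "P3"},              # evaluator B
    {"P1", "P4", "P5", "P6"},  # evaluator C
]

# Aggregating the independent reports: the union of all findings.
all_found = set().union(*evaluator_reports)
print(f"Individually: {[len(r) for r in evaluator_reports]} problems each")
print(f"Aggregated:   {len(all_found)} distinct problems: {sorted(all_found)}")
```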

HE: Number of Evaluators
[Figure: curve showing the proportion of usability problems found as the number of evaluators increases, summarized on the next slide]

HE: Number of Evaluators
Experiment by Nielsen:
– The number of usability problems in 6 case studies varied from 16 to 50
– A single evaluator found only 35% of the usability problems in the interface
– The more evaluators used (up to 15), the higher the proportion of usability problems detected
– On average, just five evaluators detect almost 75% of the usability problems

HE: Number of Evaluators
Conclusion:
– Normally, use three to five evaluators
– The exact number of evaluators to use depends on a cost-benefit analysis
– More evaluators should be used where usability is critical, or when large payoffs can be expected due to extensive, mission-critical use of a system

HE: Nielsen & Landauer's Model
Nielsen & Landauer [1993] give a quantitative model of the number of usability problems detectable in heuristic evaluation:

ProblemsFound(i) = N * (1 - (1 - K)^i)

where
– ProblemsFound(i) = the number of problems found by aggregating the reports of i independent evaluators
– N = the total number of usability problems in the interface
– K = the proportion of all usability problems found by a single evaluator (in some case studies, the value of K ranged from 19% to 51%, with a mean of 34%)
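The model is easy to compute. A short sketch that evaluates the formula for the slide's reported range of K values (N here is a hypothetical total, chosen only for illustration):

```python
def problems_found(i: int, n: int, k: float) -> float:
    """Expected number of problems found by aggregating the reports of
    i independent evaluators: ProblemsFound(i) = N * (1 - (1 - K)^i)."""
    return n * (1.0 - (1.0 - k) ** i)

N = 40  # hypothetical total number of usability problems in the interface
for K in (0.19, 0.34, 0.51):  # min, mean, and max of K from the case studies
    proportions = [problems_found(i, N, K) / N for i in (1, 3, 5, 10, 15)]
    print(f"K={K:.2f}: " + ", ".join(
        f"{i} evaluators: {p:.0%}" for i, p in zip((1, 3, 5, 10, 15), proportions)))
```

Note that the proportion found, 1 - (1 - K)^i, depends only on K and i, not on N, which is why the "three to five evaluators" rule of thumb generalizes across systems of different sizes.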

Recommended Materials
See the course web page for the presentation slides of the current lecture and other materials.
Book: Human-Computer Interaction by Alan Dix et al., Pearson Education, Chapter 9.
