Heuristic Evaluation
Jon Kolko, Professor, Austin Center for Design

Heuristic Evaluation
Compare an interface to an established list of heuristics – best practices – to identify usability problems.

Heuristic Evaluation
Compare an interface to an established list of heuristics – best practices – to identify usability problems.

A HEURISTIC/
- Is defined by a person or a group of people
- Is deemed to be a “good principle” to follow
- Is recognized by others as a “good principle”
- Is not a hard-and-fast rule
- Is not always right


Heuristic Evaluation: The Basics
1. On your own, compare an interface to a list of heuristics
2. Determine which heuristics are violated by the interface
3. As a group, combine these lists into a more exhaustive listing
4. Write a report and include redesign suggestions
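The group step above ("combine these lists into a more exhaustive listing") can be sketched as a de-duplicating merge. This is a minimal sketch: the dictionary keys and the matching rule (same screen, same heuristic, case-insensitive problem text) are illustrative assumptions, not part of the method as described; in practice teams often reconcile overlapping findings by discussion.

```python
# Sketch of the group-combination step: merge each evaluator's individual
# findings list and drop duplicates. The dict keys ("screen", "heuristic",
# "problem") and the de-duplication rule are illustrative assumptions.

def merge_findings(*evaluator_lists):
    seen, merged = set(), []
    for findings in evaluator_lists:
        for f in findings:
            key = (f["screen"], f["heuristic"], f["problem"].lower())
            if key not in seen:
                seen.add(key)
                merged.append(f)
    return merged
```

The first evaluator's wording wins for duplicated findings; a real team would instead pick (or rewrite) the clearest description when consolidating.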

Heuristic Evaluation: The Heuristics
1. Visibility of system status
2. Match between system and the real world
3. User control and freedom
4. Consistency and standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility and efficiency of use
8. Aesthetic and minimalist design
9. Help users recognize, diagnose and recover from errors
10. Help and documentation

Heuristic Evaluation: The Heuristics
1. Visibility of system status
“The system should always keep users informed about what is going on, through appropriate feedback within reasonable time.”

Heuristic Evaluation: The Heuristics
2. Match between system and the real world
“The system should speak the users’ language, with words, phrases, and concepts familiar to the user, rather than system-oriented terms. Follow real-world conventions, making information appear in a natural and logical order.”

Heuristic Evaluation: The Heuristics
3. User control and freedom
“Users often choose system functions by mistake and will need a clearly marked ‘emergency exit’ to leave the unwanted state without having to go through an extended dialogue. Support undo and redo.”

Heuristic Evaluation: The Heuristics
4. Consistency and standards
“Users should not have to wonder whether different words, situations, or actions mean the same thing. Follow software/hardware platform conventions.”

Heuristic Evaluation: The Heuristics
5. Error prevention
“Even better than good error messages is a careful design which prevents a problem from occurring in the first place.”

Heuristic Evaluation: The Heuristics
6. Recognition rather than recall
“Make objects, actions and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.”

Heuristic Evaluation: The Heuristics
7. Flexibility and efficiency of use
“Accelerators – unseen by the novice user – may often speed up the interaction for the expert user to such an extent that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions.”

Heuristic Evaluation: The Heuristics
8. Aesthetic and minimalist design
“Dialogues should not contain information which is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility.”

Heuristic Evaluation: The Heuristics
9. Help users recognize, diagnose and recover from errors
“Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.”

Heuristic Evaluation: The Heuristics
10. Help and documentation
“Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user’s task, list concrete steps to be carried out, and not be too large.”

Heuristic Evaluation: Evaluators
No users are required in a heuristic evaluation. Instead, do the initial evaluation yourself, and then combine your results with a team. Different evaluators find different problems, with diminishing returns after 5-6 evaluators.
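The diminishing-returns claim is often modeled with Nielsen and Landauer's aggregate formula: the proportion of all problems found by i independent evaluators is 1 - (1 - p)^i, where p is the probability that a single evaluator finds a given problem. The sketch below assumes their reported average of roughly p = 0.31; your own detection rate will vary by product and evaluator expertise.

```python
# Sketch of the diminishing-returns curve for heuristic evaluation,
# assuming Nielsen & Landauer's model with a single-evaluator detection
# probability of p = 0.31 (their published average; an assumption here).

def proportion_found(i: int, p: float = 0.31) -> float:
    """Fraction of all usability problems found by i independent evaluators."""
    return 1 - (1 - p) ** i

for i in range(1, 11):
    print(f"{i:2d} evaluators: {proportion_found(i):.0%}")
```

Under this assumed p, five evaluators find roughly 84% of the problems, and each evaluator after that adds only a few percentage points, which is why 5-6 evaluators is the usual stopping point.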

Heuristic Evaluation: How to Do It
1. Get prepared. You need a prototype, which can be any level of fidelity, and some expert evaluators.
2. Print every screen of your interface.
3. Walk through each control area, screen by screen, and identify areas where the interface conflicts with any heuristic. You will probably find multiple problems per screen.
4. Document each problem area, noting specifically which screen(s) are affected and which controls are problematic. Err on the side of ‘too many’ rather than ‘too few’.
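The walkthrough above amounts to a nested loop over screens, controls, and heuristics. The sketch below makes that structure explicit; the Finding record, the judge callback, and all names here are hypothetical scaffolding, not an API from the slides, and the actual judgment of whether a heuristic is violated remains a human task.

```python
# Sketch of a single evaluator's pass: for each control area on a screen,
# check every heuristic and log a finding when one is violated. The names
# (Finding, evaluate_screen, judge) are illustrative assumptions.
from dataclasses import dataclass

NIELSEN_HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
    "Recognition rather than recall",
    "Flexibility and efficiency of use",
    "Aesthetic and minimalist design",
    "Help users recognize, diagnose and recover from errors",
    "Help and documentation",
]

@dataclass
class Finding:
    screen_id: str
    control: str
    heuristic: str
    note: str

def evaluate_screen(screen_id, controls, judge):
    """judge(control, heuristic) returns a note string if violated, else None."""
    findings = []
    for control in controls:
        for heuristic in NIELSEN_HEURISTICS:
            note = judge(control, heuristic)
            if note:
                findings.append(Finding(screen_id, control, heuristic, note))
    return findings
```

Running this over every screen of the prototype yields one evaluator's raw list, which is then combined with the other evaluators' lists in the group step.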

Logging Heuristic Evaluation Findings

Unique Identifier | Screen ID | Problem Description | Heuristic Violated | Severity (1-5, 5 is high) | Frequency (1-5, 5 is common) | Proposed Solution
JK_1 | 1.4.A | Required fields during signup are not obvious | Visibility of system status | 3 | 4 | Indicate required fields with a red asterisk (*)
JK_2 | 1.4.B | Errors during form validation are not uniquely defined | Help users recognize, diagnose and recover from errors | | | Use a verbose description of the changes that have to occur when alerting the user of an error

- Use a unique identifier that combines the initials of the evaluator with a running number tally (JK = Jon Kolko, 1 = incident number one)
- Identify the heuristic that is violated
- Assign severity and frequency ratings to indicate the relative impact of the critical incident and the number of times the incident is likely to be identified by a user
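A minimal sketch of this logging scheme, assuming the columns shown above. The LoggedFinding record and the make_logger helper are my own illustrative names; the unique-identifier format (evaluator initials plus a running tally) follows the slide.

```python
# Illustrative sketch of the findings log. Field names follow the slide's
# columns; LoggedFinding and make_logger are hypothetical helper names.
from dataclasses import dataclass
from itertools import count

@dataclass
class LoggedFinding:
    uid: str          # e.g. "JK_1": evaluator initials + running tally
    screen_id: str    # e.g. "1.4.A"
    problem: str
    heuristic: str    # which of the 10 heuristics is violated
    severity: int     # 1-5, 5 is high
    frequency: int    # 1-5, 5 is common
    solution: str

def make_logger(initials: str):
    """Return a log() function that stamps each finding with initials_N."""
    tally = count(1)
    def log(screen_id, problem, heuristic, severity, frequency, solution):
        return LoggedFinding(f"{initials}_{next(tally)}", screen_id, problem,
                             heuristic, severity, frequency, solution)
    return log

log = make_logger("JK")
finding = log("1.4.A", "Required fields during signup are not obvious",
              "Visibility of system status", 3, 4,
              "Indicate required fields with a red asterisk (*)")
```

One common triage move (an assumption, not from the slides) is to sort the combined log by severity times frequency so the highest-impact problems surface first in the report.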

Heuristic Evaluation
Compare an interface to an established list of heuristics – best practices – to identify usability problems.

A HEURISTIC/
- Is defined by a person or a group of people
- Is deemed to be a “good principle” to follow
- Is recognized by others as a “good principle”
- Is not a hard-and-fast rule
- Is not always right

RUNNING A HEURISTIC EVALUATION/
- Prepare a prototype
- Compare the interface on each screen to the 10 heuristics
- Identify problems, both alone and in a group
- Catalog the problems in a spreadsheet
- Propose changes to address the problems
- Prepare a presentation that describes the top findings