Usability Methods: Cognitive Walkthrough & Heuristic Evaluation Dr. Dania Bilal IS 588 Spring 2008 Dr. D. Bilal.


Purposes Measures multiple components of the user interface Addresses relationships between the system and its users Bridges the gap between humans and machines

Purposes Measures the quality of system design in relation to its intended users Involves several methods, each applied at an appropriate time in the design and development process

Usability Attributes As described by Nielsen Learnability Efficiency Memorability Errors & their severity Subjective satisfaction

Learnability System must be easy to learn, especially for novice users Hard-to-learn systems are usually designed for expert users Learning curves differ for novice and expert users

Efficiency System should be efficient to use so that once the user has learned how to use it, the user can achieve a high level of productivity Efficiency increases with learning

Memorability System should be easy to remember, especially for casual users No need to learn how to use the system all over again after a period of not using it

Errors System should have a low error rate System should provide the user with a recovery mechanism Minor errors Major errors

Minor Errors Errors that do not greatly slow down the user's interaction with the system User is able to recover from them through system feedback or through awareness of the error made

Major Errors Difficult to recover from Lead to faulty work if high in frequency May not be discovered by the user Can be catastrophic
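The low-error-rate attribute above can be quantified from logged test sessions. A minimal sketch in Python; the session data, the task names, and the minor/major labels are invented for illustration and are not from the slides:

```python
# Hypothetical illustration: computing an error rate and counting
# hard-to-recover ("major") errors from a logged usability session.
from dataclasses import dataclass

@dataclass
class ObservedError:
    task: str
    severity: str    # "minor" = recovered via system feedback; "major" = hard to recover
    recovered: bool

# Invented sample data for the sketch
errors = [
    ObservedError("search by subject", "minor", True),
    ObservedError("save a record", "major", False),
    ObservedError("search by subject", "minor", True),
]

tasks_attempted = 10  # total task attempts across all users in the session

error_rate = len(errors) / tasks_attempted
major = [e for e in errors if e.severity == "major"]

print(f"Error rate: {error_rate:.0%}")
print(f"Major (hard-to-recover) errors: {len(major)}")
```

In practice the minor/major classification would follow the criteria on the two preceding slides: whether the user noticed the error and could recover through system feedback.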

Subjective Satisfaction System should be likeable to users (affective) Satisfaction varies with the purpose of the system and with user goals

Assumptions The designer’s best guess is not good enough The user is always right The user is not always right Users are not designers Designers are not users More features are not always better Minor interface details matter Online help does not really help Source: Nielsen, J. (1993). Usability Engineering. San Diego: Morgan Kaufmann.

Cognitive Walkthrough Method Involves experts acting on behalf of actual users Characteristics of typical users are identified & documented Tasks focusing on aspects of design to be evaluated are developed

Cognitive Walkthrough Method An observer (“experimenter”) is present Prepares tasks Takes notes Provides help, etc. Coordinates and oversees the final report

Cognitive Walkthrough Method Each expert walks through the interface for each task Expert records problems that the user may experience Assumptions about what would cause problems, and why, are noted Benchmarks may be used for each task

Sample Questions for Walkthrough Will the user know what to do to complete part of, or the whole of, the task successfully? Can the user see the button or icon to use for the next action? Can the user find a specific subject category in the hierarchy?

Cognitive Walkthrough Each expert documents his/her experience of the walkthrough for each task Critical problems are documented Problems and their causes are explained Draft reports/notes are compiled and shared with the other experts and the experimenter

Debriefing Session Experts and experimenter meet & discuss findings Experimenter shares his/her observational notes with the experts Findings include success stories & failure stories, as applicable A consolidated report is generated

Walkthrough Report Include the questions posed to experts for each of the tasks and the consolidated answers Use benchmarks and map out the findings for each task See Assignment 4: Usability for additional information on benchmarks

Heuristic Evaluation Evaluators interact with an interface several times and map interface problems to specific heuristics or guidelines Example: Nielsen’s ten heuristics Each evaluator generates a report Reports are aggregated and a final report is generated An observer may be present
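The aggregation step above can be sketched as a small script that merges each evaluator's report into one consolidated list, keyed by the heuristic each problem violates. The evaluator names, heuristic labels, and problem descriptions are invented for this sketch:

```python
# Hypothetical sketch: consolidating individual evaluators' heuristic-evaluation
# reports into one de-duplicated list per violated heuristic.
from collections import defaultdict

# Each evaluator's report: (heuristic violated, problem description) pairs
reports = {
    "evaluator_1": [
        ("Visibility of system status", "No feedback during a long search"),
        ("Error prevention", "Delete has no confirmation dialog"),
    ],
    "evaluator_2": [
        ("Visibility of system status", "No feedback during a long search"),
    ],
}

consolidated = defaultdict(set)
for evaluator, findings in reports.items():
    for heuristic, problem in findings:
        consolidated[heuristic].add(problem)   # set() drops duplicate findings

for heuristic, problems in consolidated.items():
    print(f"{heuristic}: {len(problems)} unique problem(s)")
```

Using a set per heuristic means a problem reported independently by several evaluators appears once in the final report, which is the point of the aggregation stage.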

Stages of Heuristic Evaluation Stage 1: Briefing session Experts are told what to do Written instructions are provided to each expert Heuristics are provided to each expert as part of the written instructions Verbal instructions may be included

Stages of Heuristic Evaluation Stage 2: Evaluation sessions Each expert tests the system against the heuristics Expert may also use specific tasks Two passes are taken through the interface First pass: overview and familiarization Second pass: focus on specific features & identify usability problems

Stages of Heuristic Evaluation Stage 3: Debriefing session Experts meet to discuss outcome and compare findings Experts consolidate findings Experts prioritize usability problems found & suggest solutions
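The prioritization step in Stage 3 is commonly done with Nielsen's 0–4 severity scale, averaging each evaluator's independent rating per problem. A minimal sketch; the problems and ratings below are invented:

```python
# Hypothetical sketch: ranking usability problems by mean severity across
# evaluators, using Nielsen's 0-4 severity scale (4 = usability catastrophe).
from statistics import mean

# problem -> one severity rating per evaluator (invented sample data)
ratings = {
    "Delete has no confirmation dialog": [4, 3, 4],
    "No feedback during a long search":  [2, 3, 2],
    "Inconsistent button labels":        [1, 2, 1],
}

# Highest mean severity first: these problems get fixed first
prioritized = sorted(ratings.items(), key=lambda kv: mean(kv[1]), reverse=True)

for problem, scores in prioritized:
    print(f"{mean(scores):.1f}  {problem}")
```

Averaging independent ratings matters because single-evaluator severity judgments are noisy; the debriefing session is where these consolidated rankings are agreed on.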

Nielsen’s Heuristics Ten heuristics found at tic_list.html For additional rules, see the text Some heuristics can be combined under categories and given a general description

Usability Heuristics uation.html (how to conduct a heuristic evaluation) (collection of articles) Learning about usability testing (Jared Spool) (severity ratings)