Heuristic Evaluation IS 485, Professor Matt Thatcher.


Heuristic Evaluation IS 485, Professor Matt Thatcher

2 Agenda
- Administrivia
- Heuristic evaluation

3 Heuristic Evaluation
- Helps find usability problems in a UI design
- Can be performed on a working UI or on sketches
- Small set (3-5) of evaluators examine the UI
  - Each evaluator independently goes through the UI several times
    - Inspects various dialogue/design elements
    - Compares them with a list of usability principles (heuristics of good interface design)
    - Identifies any violations of these heuristics
  - Evaluators only communicate afterwards (i.e., no interaction during the inspection), and their findings are aggregated
- Usability principles --> Nielsen's heuristics
- Use the violations to redesign / fix problems

4 Heuristics
- H2-1: Visibility of system status
- H2-2: Match between system and real world
- H2-3: User control and freedom
- H2-4: Consistency and standards
- H2-5: Error prevention
- H2-6: Recognition over recall
- H2-7: Flexibility and efficiency of use
- H2-8: Aesthetic and minimalist design
- H2-9: Help users recognize, diagnose, and recover from errors
- H2-10: Help and documentation
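
For the worked examples later in this deck it helps to keep the heuristic codes and names in one place. The following Python snippet is only an illustrative sketch (the deck prescribes no tooling); it simply records the H2-1 through H2-10 labels so findings can be tagged consistently.

    # Nielsen's ten heuristics, keyed by the codes used in this deck (H2-1 .. H2-10).
    NIELSEN_HEURISTICS = {
        "H2-1": "Visibility of system status",
        "H2-2": "Match between system and real world",
        "H2-3": "User control and freedom",
        "H2-4": "Consistency and standards",
        "H2-5": "Error prevention",
        "H2-6": "Recognition over recall",
        "H2-7": "Flexibility and efficiency of use",
        "H2-8": "Aesthetic and minimalist design",
        "H2-9": "Help users recognize, diagnose, and recover from errors",
        "H2-10": "Help and documentation",
    }

    # Example: look up the full name of a violated heuristic when writing up a finding.
    print(NIELSEN_HEURISTICS["H2-4"])  # Consistency and standards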

5 Phases of Heuristic Evaluation
1) Pre-evaluation training
   - Give evaluators the list of principles with which to evaluate
   - Give evaluators the needed domain knowledge
   - Give evaluators information on the scenario
2) Evaluation
   - Individuals evaluate and then aggregate results
3) Severity rating
   - Determine how severe each problem is (priority)
4) Debriefing
   - Discuss the outcome with the design team

6 How to Perform the Evaluation
- At least two passes for each evaluator
  - First pass to get a feel for the flow and scope of the system
  - Second pass to focus on specific elements
- If the system is walk-up-and-use or the evaluators are domain experts, no assistance is needed
  - Otherwise, you might supply evaluators with scenarios
- Each evaluator produces a list of problems (see the sketch below)
  - Explain why, with reference to a heuristic or other information
  - Be specific, and list each problem separately
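
A minimal sketch of what one evaluator's independently produced problem list might look like as structured records. The field names ("heuristic", "description") and the variable name are illustrative assumptions, not part of the method; the example problems are the ones discussed on the next slide.

    # One evaluator's problem list: each problem is listed separately and tied to
    # the heuristic it violates, with a short explanation of why.
    evaluator_1_problems = [
        {
            "heuristic": "H2-6",  # Recognition over recall
            "description": "Can't copy info from one window to another.",
        },
        {
            "heuristic": "H2-4",  # Consistency and standards
            "description": "Typography uses a mix of upper/lower-case formats and fonts.",
        },
    ]
    print(len(evaluator_1_problems), "problems reported by evaluator 1")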

7 Examples
- Can't copy info from one window to another
  - Violates "Recognition over recall" (H2-6)
  - Fix: allow copying
- Typography uses a mix of upper/lower-case formats and fonts
  - Violates "Consistency and standards" (H2-4)
  - Slows users down
  - Probably wouldn't be found by user testing
  - Fix: pick a single format for the entire interface

8 Aggregate the Results
- Take all the lists and aggregate the results into a single list of violations (see the sketch below)
- Eliminate redundancies and make clarifications
- You will end up with entries of the following form:

  Problem # [Heuristic violated] Brief description of the problem found
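
The aggregation step can be sketched as follows. This is an illustration under the assumption that findings are kept as simple dictionaries (as in the earlier sketch), and it treats two findings as redundant only when heuristic and description match exactly; in practice, merging and clarification are done by hand.

    def aggregate_findings(per_evaluator_lists):
        """Merge every evaluator's problem list into one list of violations,
        dropping exact duplicates (same heuristic and same description)."""
        seen = set()
        aggregated = []
        for problems in per_evaluator_lists:
            for p in problems:
                key = (p["heuristic"], p["description"].strip().lower())
                if key not in seen:
                    seen.add(key)
                    aggregated.append(p)
        return aggregated

    # Example: three evaluators, two of whom reported the same inconsistency.
    lists = [
        [{"heuristic": "H2-4", "description": '"Save" vs. "Write file" wording'}],
        [{"heuristic": "H2-4", "description": '"Save" vs. "Write file" wording'},
         {"heuristic": "H2-6", "description": "Can't copy info between windows"}],
        [{"heuristic": "H2-8", "description": "Cluttered toolbar"}],  # hypothetical finding
    ]
    print(aggregate_findings(lists))  # three unique violations remain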

9 An Example of Aggregated Results

Aggregated List of Violations
1. [H2-4 Consistency and Standards] The interface used the string "Save" on the first screen for saving the user's file, but used the string "Write file" on the second screen. Users may be confused by this different terminology for the same function.
2. [H2-5 Error Prevention] ...

10 Severity Ratings
- Used to allocate resources to fix the most serious problems
- Estimate the need for further usability efforts
- A combination of frequency, impact, and persistence
- Should be calculated after all evaluations are in
- Should be done independently by all judges (see the averaging sketch below)
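
Because every judge rates every problem independently, the summary report carries an average. A small sketch of that averaging step, using the 2.7 figure from the later slides as the worked example; the judge names and the individual ratings of 3, 2, and 3 are illustrative assumptions.

    def average_severity(ratings_by_judge):
        """Average the independent 0-4 severity ratings given by each judge
        for one problem, as reported in the summary report."""
        ratings = list(ratings_by_judge.values())
        return sum(ratings) / len(ratings)

    # Three judges independently rate the "Save" vs. "Write file" inconsistency.
    ratings = {"judge_1": 3, "judge_2": 2, "judge_3": 3}
    print(round(average_severity(ratings), 1))  # 2.7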

11 Severity Ratings
0 - Don't agree that this is a usability problem
1 - Cosmetic problem only
2 - Minor usability problem; fixing this should be given low priority
3 - Major usability problem; important to fix
4 - Usability catastrophe; imperative to fix

12 Example of Severity Ratings

Evaluator #1
1. [H2-4 Consistency and Standards] [Severity 3] The interface used the string "Save" on the first screen for saving the user's file, but used the string "Write file" on the second screen. Users may be confused by this different terminology for the same function.
2. [H2-5 Error Prevention] [Severity 4] ...

Problem # [Heuristic violated] [Severity rating] Problem description

13 Summary Report
1. [H2-4 Consistency and Standards] [Severity 2.7] The interface used the string "Save" on the first screen for saving the user's file, but used the string "Write file" on the second screen. Users may be confused by this different terminology for the same function.
2. [H2-5 Error Prevention] [Severity 3.3] ...

Problem # [Heuristic violated] [Average severity] Problem description

14 Debriefing
- Conduct with evaluators, observers, and development team members
- Discuss general characteristics of the UI
- Suggest potential improvements to address major usability problems
- Add ratings on how hard things are to fix
  - e.g., technological feasibility, time issues, etc.
- Make it a brainstorming session
  - Little criticism until the end of the session

15 Fix Ratings
- Together, the team should also identify a fix rating for each usability problem identified in the summary report
- How much time, resources, and effort would it take to fix each usability problem?
  - Programmers and techies are crucial here
- Fix the important ones (see severity ratings)
- Fix the easy ones (see fix ratings)
- Make a decision about the rest (see the prioritization sketch after the next slide)

16 Fix Ratings
0 - Very easy to fix; only takes a few minutes
1 - Relatively simple to fix; takes a few hours
2 - Difficult to fix; takes a few days or more
3 - Impossible to fix
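
The "fix the important ones, fix the easy ones" rule from the previous slide can be approximated by a simple ordering: highest average severity first, and cheapest fix first among equally severe problems. This ordering rule is an assumption for illustration only; the deck leaves the final trade-off to the team.

    def prioritize(problems):
        """Order problems by descending average severity, then by ascending
        fix rating (cheaper fixes first). Illustrative rule, not prescribed."""
        return sorted(problems, key=lambda p: (-p["avg_severity"], p["fix_rating"]))

    problems = [
        {"id": 1, "heuristic": "H2-4", "avg_severity": 2.7, "fix_rating": 1},
        {"id": 2, "heuristic": "H2-5", "avg_severity": 3.3, "fix_rating": 0},
    ]
    for p in prioritize(problems):
        print(p["id"], p["heuristic"], p["avg_severity"], p["fix_rating"])
    # Problem 2 (severity 3.3, fix 0) comes before problem 1 (severity 2.7, fix 1).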

17 Final Adjustment

Final Report for the Heuristic Evaluation
1. [H2-4 Consistency and Standards] [Severity 2.7] [Fix 1] The interface used the string "Save" on the first screen for saving the user's file, but used the string "Write file" on the second screen. Users may be confused by this different terminology for the same function.
2. [H2-5 Error Prevention] [Severity 3.3] [Fix 0] ...

Problem # [Heuristic violated] [Avg severity rating] [Fix rating] Problem description
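
Since final-report entries follow a fixed layout, rendering them can be automated. This helper is a sketch only; the function name and argument names are made up for illustration.

    def report_line(number, heuristic, avg_severity, fix_rating, description):
        """Render one final-report entry in the deck's
        'Problem # [Heuristic] [Avg severity] [Fix rating] description' layout."""
        return (f"{number}. [{heuristic}] [Severity {avg_severity:.1f}] "
                f"[Fix {fix_rating}] {description}")

    print(report_line(1, "H2-4 Consistency and Standards", 2.7, 1,
                      'The interface uses "Save" on one screen but "Write file" on another.'))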

18 Independent Evaluations → Aggregated List of Violations → Independent Severity Ratings → Summary Report with Avg Severity Ratings (SR) → Final HE Report with SR and Fix Ratings

19 Some Summary Statistics
- Number of violations for the entire interface
- For each heuristic, the number of violations
- For each evaluator, the % of violations found (see the sketch below)
- For each evaluator and severity rating, the % of total violations of that rating found by that evaluator
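
A sketch of how the first three statistics might be computed once the aggregated list and each evaluator's individual findings are available. The data structures (violation dicts with an "id" and "heuristic", and a map from evaluator to the set of violation ids found) are assumptions for illustration; the per-severity breakdown in the last bullet would follow the same pattern.

    from collections import Counter

    def summary_statistics(all_violations, found_by):
        """Return the total violation count, the count per heuristic, and the
        percentage of all violations found by each evaluator."""
        total = len(all_violations)
        per_heuristic = Counter(v["heuristic"] for v in all_violations)
        pct_found = {e: 100.0 * len(ids) / total for e, ids in found_by.items()}
        return total, per_heuristic, pct_found

    violations = [
        {"id": 1, "heuristic": "H2-4"},
        {"id": 2, "heuristic": "H2-5"},
        {"id": 3, "heuristic": "H2-4"},
    ]
    found = {"evaluator_1": {1, 2}, "evaluator_2": {1, 3}, "evaluator_3": {2}}
    total, per_heuristic, pct_found = summary_statistics(violations, found)
    print(total)                 # 3
    print(dict(per_heuristic))   # {'H2-4': 2, 'H2-5': 1}
    print(pct_found)             # each evaluator's % of violations found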

20 Summary
- Expert reviews are discount usability engineering methods
- Heuristic evaluation is very popular
  - Have evaluators go through the UI twice
  - Ask them to see if it complies with the heuristics
    - Note where it doesn't, and say why
  - Combine the findings from 3 to 5 evaluators
  - Have evaluators independently rate severity
  - Discuss problems with the design team
  - Alternate with user testing

21 TRAVELweather Example