Acknowledgements: Most of this course is based on the excellent course offered by Prof. Kellogg Booth at the University of British Columbia, Vancouver, Canada. Please acknowledge the original source when reusing these slides for academic purposes.
Mestrado em Informática Médica, SIntS 13/14 – T7: Discount Evaluation Methods
Miguel Tavares Coimbra

Summary
Discount Evaluation Methods
Cognitive Walkthrough
Heuristic Evaluation

Discount usability engineering
Cheap (thus ‘discount’)
– No special labs or equipment needed
– Doesn’t need to involve users directly
– The more careful (and informed by users) you are, the better it gets
Fast
– On the order of 1 day to apply
– Standard usability testing may take a week
Easy to use
– Can be taught in 2–4 hours

Cognitive walkthrough
Evaluate the “mental model”
– Assesses the “exploratory learning” stage (new users, occasional users)
– What mental model does the system image facilitate?
– Done by non-experts and/or domain experts

Heuristic Evaluation
“Fine-tune” the system image
– Targets a broader range of use: new, occasional and expert users
– Fine-tunes the interface
– HCI professionals apply a list of heuristics while simulating task execution

Cognitive Walkthrough

Cognitive walkthrough
What for: assessing how well a new user will be able to figure out the interface
Not for: assessing performance at highly skilled, frequently performed tasks, or finding radically new approaches
Additional advantages: helps work out task sequence models through observation
Disadvantages: limited utility for frequent-use interfaces, narrow focus, relatively time-consuming & laborious (compared to HE)

Cognitive walkthrough
Possible outputs:
– Loci & sources of confusion, errors, dead ends
– Estimates of success rates and error recovery; performance speed is less evident
– Helps to figure out what activity sequences could or should be
What’s required: a complete interface description
– Horizontal prototype (paper prototype, Balsamiq, etc.)
Who does it:
– Anyone – different benefits will accrue from using design team members, naïve users or expert outside analysts. More distance = better!

How? Roughly:
Start with a scenario (a design-specific task)
Ask these questions at each step, as relevant:
– Q1: Will the correct action be evident?
– Q2: Will the user recognize the correct action?
– Q3: Will the user interpret the result correctly?
– Q4: Will the user be able to progress towards the goal?
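The four questions can be treated as a per-step checklist. Below is a minimal sketch (not from the original slides) of how an evaluator might record answers for each step of a scenario and flag the problematic ones; the scenario, step names and data layout are hypothetical.

```python
# Minimal sketch: recording cognitive-walkthrough answers per step.
# The scenario, steps and notes below are illustrative, not from the slides.

QUESTIONS = [
    "Q1: Will the correct action be evident?",
    "Q2: Will the user recognize the correct action?",
    "Q3: Will the user interpret the result correctly?",
    "Q4: Will the user be able to progress towards the goal?",
]

def walkthrough_report(scenario, steps):
    """steps: list of (step_name, [yes/no answer per question], notes)."""
    print(f"Scenario: {scenario}")
    for name, answers, notes in steps:
        failed = [q for q, ok in zip(QUESTIONS, answers) if not ok]
        status = "OK" if not failed else "PROBLEM"
        print(f"  Step '{name}': {status}")
        for q in failed:
            print(f"    - {q} -> {notes}")

# Hypothetical usage for a 'send a message' scenario:
walkthrough_report("Send a message", [
    ("Open compose window", [True, True, True, True], ""),
    ("Attach a file", [True, False, True, True], "Attach icon is not labelled"),
])
```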

Suggestion: Follow Norman’s diagram for each task (see T6 – Mental Models)

Simpler approach
Score each user on each task:
1 – User completed the task successfully
2 – User completed the task with difficulty
3 – User did not complete the task

          User 1   User 2   User 3
Task 1
Task 2
Task 3
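A minimal sketch of how the user-by-task scores might be tallied once the table is filled in; the scores shown are hypothetical examples, not data from the slides.

```python
# Minimal sketch: tallying the user-by-task completion scores (1/2/3).
# The scores below are hypothetical examples.

scores = {
    "Task 1": {"User 1": 1, "User 2": 2, "User 3": 1},
    "Task 2": {"User 1": 3, "User 2": 2, "User 3": 2},
    "Task 3": {"User 1": 1, "User 2": 1, "User 3": 1},
}

LABELS = {1: "completed", 2: "completed with difficulty", 3: "did not complete"}

for task, per_user in scores.items():
    worst = max(per_user.values())  # 3 is the worst outcome, 1 the best
    n_failed = sum(1 for s in per_user.values() if s == 3)
    print(f"{task}: worst outcome = {LABELS[worst]}, failures = {n_failed}/{len(per_user)}")
```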

Heuristic Evaluation

Heuristic evaluation
What for: identifying (listing & describing) problems with existing prototypes (any kind of interface)
Not for: coming up with radically new solutions
Additional advantages: contributes valuable insights from objective observers
Disadvantages:
– Reinforces the existing design – better solutions might exist
– Not very repeatable

Heuristic evaluation
What’s required:
– A good model of the proposed interface (e.g., at least a paper prototype)
– A list of design heuristics to be applied
– A scenario (task example + design prototype)
Who does it:
– A team of 3 to 5 experienced, objective people (“experts”) who aren’t on the design team
General idea:
– Independently check compliance with usability principles (“heuristics”)
  Step 1: each evaluator works with the interface alone (different evaluators will find different problems)
  Step 2: evaluators aggregate findings afterwards

One list of heuristics (Nielsen, ’93)
H2-1: visibility of system status
H2-2: match between system & the real world
H2-3: user control & freedom
H2-4: consistency and standards
H2-5: error prevention
H2-6: recognition rather than recall
H2-7: flexibility and efficiency of use
H2-8: aesthetic and minimalist design
H2-9: help users recognize, diagnose & recover from errors
H2-10: help and documentation

Step 1: Individual evaluation
At least two passes for each evaluator
– First to get a feel for the flow and scope of the system
– Second to focus on specific elements
Each evaluator produces a list of problems
– Explain each problem with reference to a heuristic or other information
– Be specific and list each problem separately
– Assign a severity rating to each violation

Severity ratings
Each violation is assigned a severity rating
A combination of:
– Frequency
– Impact
– Persistence (one-time or repeating)
Used to:
– Allocate resources to fix problems
– Estimate the need for more usability efforts
Done independently by all evaluators

Severity & extent scales
One severity scale (others are possible):
0 – don’t agree that this is a usability problem
1 – cosmetic problem
2 – minor usability problem
3 – major usability problem; important to fix
4 – usability catastrophe; imperative to fix
One extent scale:
1 = single case
2 = several places
3 = widespread
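A minimal sketch of how one evaluator might record a finding against these scales; the field names and the example finding are illustrative assumptions, not part of the slides.

```python
# Minimal sketch: recording one evaluator's findings with the scales above.
# Field names and the example finding are illustrative.
from dataclasses import dataclass

SEVERITY = {0: "not a problem", 1: "cosmetic", 2: "minor",
            3: "major; important to fix", 4: "catastrophe; imperative to fix"}
EXTENT = {1: "single case", 2: "several places", 3: "widespread"}

@dataclass
class Finding:
    description: str   # specific, one problem per finding
    heuristic: str     # e.g. "H2-6: recognition rather than recall"
    severity: int      # 0-4, per the severity scale
    extent: int        # 1-3, per the extent scale

f = Finding("Search icon has no label", "H2-6: recognition rather than recall",
            severity=3, extent=2)
print(f"{f.description}: {SEVERITY[f.severity]} ({EXTENT[f.extent]})")
```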

Step 2: Aggregating results & making recommendations
The evaluation team meets and compares results
Through discussion and consensus, each violation is documented and categorized in terms of severity and extent
Violations are ordered by severity → the combined report goes back to the design team
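A minimal sketch of the aggregation step: merging the individual lists and ordering violations by severity. The finding tuples and example data are hypothetical, and the max() consensus rule is only a stand-in for the discussion the slides describe.

```python
# Minimal sketch: merging evaluators' findings and ordering by severity.
# Each finding is a (description, heuristic, severity, extent) tuple;
# the example data are hypothetical.

def aggregate(reports):
    """reports: list of per-evaluator lists of finding tuples."""
    merged = {}
    for findings in reports:
        for desc, heuristic, severity, extent in findings:
            # In practice, agreeing that two findings describe the same
            # violation happens by discussion; here we key on description.
            merged.setdefault(desc, []).append((severity, extent, heuristic))
    combined = [
        # Consensus severity/extent would be negotiated; max() is a stand-in.
        (max(s for s, _, _ in dupes), max(e for _, e, _ in dupes), desc)
        for desc, dupes in merged.items()
    ]
    return sorted(combined, reverse=True)  # most severe problems first

evaluator_a = [("Search icon has no label", "H2-6", 3, 2)]
evaluator_b = [("Search icon has no label", "H2-6", 2, 2),
               ("No undo after delete", "H2-3", 4, 1)]
for sev, ext, desc in aggregate([evaluator_a, evaluator_b]):
    print(f"severity {sev}, extent {ext}: {desc}")
```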

Heuristic evaluation
Advantages:
– The “minimalist” approach
  General guidelines can correct for the majority of usability problems
  Easily remembered, easily applied with modest effort
  “Black box”: a systematic technique that is reproducible with care
– Discount usability engineering
  Cheap and fast way to inspect a system
  Can be done by usability experts and end users
Problems:
– Principles must be applied intuitively and carefully
  Can’t be treated as a simple checklist
  Subtleties are involved in their use
– Doesn’t necessarily predict users’/customers’ overall satisfaction
– May not have the same “credibility” as user test data
A solution: include the design team & developers in the usability evaluation

Resources
1. Kellogg S. Booth, Introduction to HCI Methods, University of British Columbia, Canada. nt-term/