PostPC Computing Heuristic Evaluation Prof. Scott Kirkpatrick, HUJI


PostPC Computing: Heuristic Evaluation. Prof. Scott Kirkpatrick, HUJI; Amnon Dekel, Bezalel.

Heuristic Evaluation
A usability engineering technique developed by Jakob Nielsen.

Heuristic Evaluation
HE is done by examining an interface and forming an opinion about what is good and bad about it. Most people already perform an informal heuristic evaluation of this kind, based on their own common sense and intuition. HE as described by Nielsen is a systematic inspection of a user interface design for usability: a small number of evaluators examine the interface and judge its compliance with recognized usability principles.

How Many Evaluators Are Needed?
Nielsen shows that any single evaluator will miss most usability problems in an interface; in his studies, single evaluators found only about 35% of the problems. Different evaluators tend to find different problems, so aggregating the results of several evaluators gives much better coverage. Nielsen recommends using no fewer than three evaluators, while five evaluators can find close to 75% of the problems in an interface.
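The aggregation effect behind these numbers follows Nielsen and Landauer's prediction model, found(i) = N(1 - (1 - λ)^i), where N is the total number of problems in the interface and λ is the proportion a single evaluator finds. The sketch below evaluates that curve for a hypothetical λ of 0.35, taken from the single-evaluator figure above; actual percentages depend on the interface and the evaluators, so the exact values vary from study to study.

```python
# Sketch of Nielsen and Landauer's aggregation model for heuristic evaluation:
#   found(i) = N * (1 - (1 - lambda)^i)
# lambda_ = 0.35 is a hypothetical per-evaluator detection rate taken from the
# "single evaluator finds ~35% of problems" figure above; real values vary by study.

def proportion_found(evaluators: int, lambda_: float = 0.35) -> float:
    """Expected proportion of all usability problems found by `evaluators` people."""
    return 1.0 - (1.0 - lambda_) ** evaluators

if __name__ == "__main__":
    for i in (1, 3, 5, 10):
        print(f"{i:2d} evaluator(s) -> {proportion_found(i):.0%} of problems")
```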

Performing HE – Method 1
In this method each evaluator performs the inspection alone. This is important in order to ensure unbiased, independent evaluations. Results can be captured either as written evaluator reports, or as observer reports of the evaluators' spoken comments (the evaluator thinks out loud while an observer takes notes). Using an observer adds to the overhead involved, but it reduces the evaluator's workload and helps make results available fairly quickly.

Performing HE – Method 1
Observers can also assist evaluators with an unstable interface and answer evaluators' questions, but only after the evaluator is clearly in trouble and has already commented on the usability problem. Note the contrast with a user test: there, the observer watches a user and is responsible for analyzing the interface design and reaching conclusions about its problems. In a HE, it is the evaluator who is responsible for analyzing the interface; the observer only records the evaluator's comments (and helps if the need arises, under the constraints above).

Performing HE – Method 1
In a HE session, the evaluator goes through the interface several times. During these passes they inspect the design and dialogue elements and compare them with a list of recognized usability principles, e.g. Nielsen's 10 Usability Heuristics (http://www.useit.com/papers/heuristic/heuristic_list.html). It is generally recommended (Nielsen 93, p. 158) that evaluators go through the design at least twice: the first pass gives them a feel for the flow of interaction and the general scope of the system, and the second pass lets them focus on specific interface elements while knowing how they fit into the larger picture.

10 Usability Heuristics
1. Visibility of system status
2. Match between system and the real world
3. User control and freedom
4. Consistency and standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility and efficiency of use
8. Aesthetic and minimalist design
9. Help users recognize, diagnose, and recover from errors
10. Help and documentation
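As a concrete illustration of how an evaluator's findings can be tied back to these heuristics, the sketch below records each problem together with the heuristic it violates and a severity rating. The field names and the 0-4 severity scale are illustrative conventions assumed here, not prescribed by the method.

```python
# Hypothetical finding-recording template for one evaluator's HE pass.
# Field names and the 0-4 severity scale are illustrative assumptions.
from dataclasses import dataclass

HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
    "Recognition rather than recall",
    "Flexibility and efficiency of use",
    "Aesthetic and minimalist design",
    "Help users recognize, diagnose, and recover from errors",
    "Help and documentation",
]

@dataclass
class Finding:
    location: str      # where in the interface the problem was seen
    description: str   # what the evaluator observed
    heuristic: str     # which heuristic it violates (one of HEURISTICS)
    severity: int      # 0 = not a problem ... 4 = usability catastrophe

finding = Finding(
    location="Checkout screen",
    description="No feedback after pressing 'Pay'; users cannot tell if the order went through.",
    heuristic=HEURISTICS[0],  # Visibility of system status
    severity=3,
)
print(f"[{finding.severity}] {finding.heuristic}: {finding.description}")
```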

Performing HE – Helping?
If the system is intended as a walk-up-and-use interface, or if the evaluators are domain experts, they can use the system without any further help. If the system is domain-dependent and the evaluators lack domain expertise, they will need help in using it. One successful approach is to supply the evaluators with a typical usage scenario listing the steps a user would take to perform a few realistic tasks (Nielsen 93, p. 159).
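A usage scenario of the kind described above can be handed to evaluators as a simple ordered task list. The scenario below is entirely hypothetical and only illustrates the level of detail such a script might have.

```python
# Hypothetical usage scenario handed to evaluators who lack domain expertise.
# The task and its steps are invented purely for illustration.
scenario = {
    "task": "Order a repeat prescription",
    "steps": [
        "Log in with the patient ID printed on the handout",
        "Open the 'My medications' list",
        "Select the medication marked 'repeat'",
        "Choose a pharmacy for pickup",
        "Confirm the order and note the confirmation number",
    ],
}

for n, step in enumerate(scenario["steps"], start=1):
    print(f"{n}. {step}")
```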

Results of HE
Once finished, a HE provides the team with a list of usability problems in the interface. HE does not in itself provide a systematic way to generate fixes, but because the method links each usability problem to an established usability principle, it is often fairly simple to produce a revised design following the guidelines of the principle that was violated. Additionally, many usability problems have obvious fixes once they are identified.
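Since each evaluator works alone (see Method 1 above), the individual lists have to be merged into one problem list for the team. The sketch below shows one possible way to aggregate findings, grouping duplicates by location and heuristic and ranking by how many evaluators reported the problem and by the worst severity; the grouping key and the report format are assumptions for illustration, not part of the method itself.

```python
# Sketch: merge independent evaluators' findings into one prioritized problem list.
# Grouping by (location, heuristic) and ranking by count/max severity are assumptions.
from collections import defaultdict

# findings[evaluator] = list of (location, heuristic, severity, description)
findings = {
    "Evaluator A": [("Checkout", "Visibility of system status", 3,
                     "No feedback after pressing 'Pay'")],
    "Evaluator B": [("Checkout", "Visibility of system status", 4,
                     "Payment result never shown"),
                    ("Settings", "Consistency and standards", 2,
                     "'OK'/'Cancel' button order differs from other dialogs")],
}

merged = defaultdict(list)
for evaluator, items in findings.items():
    for location, heuristic, severity, description in items:
        merged[(location, heuristic)].append((severity, evaluator, description))

# Problems reported by more evaluators, or with higher severity, float to the top.
ranked = sorted(merged.items(),
                key=lambda kv: (len(kv[1]), max(s for s, _, _ in kv[1])),
                reverse=True)

for (location, heuristic), reports in ranked:
    worst = max(s for s, _, _ in reports)
    print(f"[severity {worst}] {location} - {heuristic} "
          f"(reported by {len(reports)} evaluator(s))")
```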

Debriefing
A debriefing session can be held after the last evaluation session; it extends the HE method to provide some design advice. Participants: evaluators, observers, and members of the design team. Method: brainstorming.