
Evaluation of User Interface Design
4. Predictive Evaluation (continued)

Different kinds of predictive evaluation:
1. Inspection methods
2. Usage simulations
3. Heuristic evaluation
4. Discount usability evaluation
5. Walkthroughs
6. Modelling: the keystroke level model

1. Inspection methods

Inspection of technology aspects by specialists who know about both the users and the technology. Usually the specialists focus on the interaction dialogue between individual users and a system. The main purpose is to generate a list of usability problems.

2. Usage simulations

An expert review of the system to uncover possible usability problems. The reviewers simulate the behaviour of less experienced users and try to anticipate the problems those users would have.

2. Usage simulations (continued)

This is typically seen as very effective: a small number of reviewers can identify a range of usability problems that would otherwise require a much larger number of real users to uncover. The expert reviewers can also give quick advice on what to do better.

3. Heuristic evaluation

An expert review in which the inspection is guided by a set of high-level heuristics, e.g.:
* Use simple and natural dialogue; minimise the user's memory load
* Provide feedback, shortcuts, good error messages and clearly marked exits
* Prevent errors in the first place
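The deliverable of a heuristic evaluation is a list of usability problems, each tied to the heuristic it violates. A minimal sketch of how a reviewer's findings could be recorded, assuming a simple severity scale (the screen names, severity values and notes below are hypothetical):

```python
from dataclasses import dataclass, field

# The high-level heuristics named on the slide (wording lightly paraphrased).
HEURISTICS = [
    "Simple and natural dialogue",
    "Minimise the user's memory load",
    "Provide feedback",
    "Provide shortcuts",
    "Provide good error messages",
    "Provide clearly marked exits",
    "Prevent errors",
]

@dataclass
class Finding:
    """One usability problem noted by a reviewer."""
    screen: str      # where the problem occurs
    heuristic: str   # which heuristic it violates
    severity: int    # e.g. 1 = cosmetic ... 4 = catastrophic (assumed scale)
    note: str

@dataclass
class Review:
    reviewer: str
    findings: list = field(default_factory=list)

    def record(self, screen, heuristic, severity, note):
        # Force every problem to be attributed to an agreed heuristic.
        assert heuristic in HEURISTICS, "use one of the agreed heuristics"
        self.findings.append(Finding(screen, heuristic, severity, note))

# Usage: one reviewer inspects the interface and logs problems.
r = Review("Reviewer A")
r.record("Login", "Provide good error messages", 3,
         "Failed login only says 'Error 17'.")
r.record("Checkout", "Provide clearly marked exits", 2,
         "No obvious way to cancel the order.")

# The main purpose: a problem list, worst first.
for f in sorted(r.findings, key=lambda f: -f.severity):
    print(f.severity, f.screen, "-", f.heuristic, ":", f.note)
```

Attributing each finding to a heuristic is what distinguishes this from a free-form expert review: it makes the problem list comparable across reviewers.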

4. Discount usability evaluation

...is intended for developers with few resources in terms of time, money and expertise, e.g. small companies. Discount usability evaluation is a mixture of empirical usability testing and heuristic evaluation.

4. Discount usability evaluation (continued)

In its first part, this method typically involves constructing small scenarios (e.g. paper mock-ups, HyperCard simulations) that are tested with informal think-aloud protocols. On the basis of these results, the scenario is changed and tested again.

4. Discount usability evaluation (continued)

In its second part, the method typically involves testing the scenario with the heuristic evaluation method described under the previous point. Only a small number of reviewers are involved in a discount usability evaluation.

5. Walkthroughs

...are aimed at discovering problems early on so that they can be removed. They involve carefully defined tasks, e.g. walking through the cognitive and operational activities required to get from one screen to the next.

5. Walkthroughs (continued)

First, the experts specify the exact task, the context and any assumptions about the likely users. They then carefully walk through the task, reviewing the actions needed to achieve it and predicting the most likely user behaviour and problems.
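The walkthrough procedure just described (specify the task, then predict behaviour and problems for each action) can be sketched as a small data structure. The task, context and step texts below are hypothetical examples, not from the slides:

```python
from dataclasses import dataclass

@dataclass
class Step:
    """One action in the walkthrough, with the expert's predictions."""
    action: str                  # what the user must do
    likely_behaviour: str        # what the user will most plausibly try
    predicted_problem: str = ""  # empty string means no problem expected

# Hypothetical walkthrough of a 'save under a new name' task.
task = "Save the current document under a new name"
context = "First-time user, desktop application, mouse and keyboard"
steps = [
    Step("Open the File menu",
         "Scans the menu bar and finds 'File'"),
    Step("Choose 'Save As'",
         "May pick 'Save' instead",
         "Labels 'Save' and 'Save As' are easily confused"),
    Step("Type the new file name and confirm",
         "Types the name, presses Enter"),
]

# The walkthrough report: every step where a problem is predicted.
problems = [(i + 1, s.predicted_problem)
            for i, s in enumerate(steps) if s.predicted_problem]
print(f"Task: {task} ({context})")
for n, p in problems:
    print(f"  step {n}: {p}")
```

Recording a predicted behaviour for every step, not just the problematic ones, is what makes this more detailed than an ordinary review.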

5. Walkthroughs (continued)

A walkthrough resembles a review in many respects, except that a walkthrough requires a more detailed prediction of user behaviour.

6. Modelling: the keystroke level model

Modelling is somewhat more remote from the actions of real users. It requires a specification of the system's functionality and a task analysis (a list of all the proposed user tasks, with each task broken down into its components).
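A task analysis of the kind required here can be as simple as a mapping from each proposed user task to its component steps. The task names below are hypothetical illustrations:

```python
# A minimal sketch of a task analysis: each proposed user task is
# broken down into its component steps.
task_analysis = {
    "Send an email": [
        "Open the mail client",
        "Start a new message",
        "Fill in recipient, subject and body",
        "Send the message",
    ],
    "Delete an email": [
        "Select the message",
        "Press Delete",
        "Confirm the deletion",
    ],
}

for task, components in task_analysis.items():
    print(task)
    for step in components:
        print("  -", step)
```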

6. Modelling: the keystroke level model (continued)

A number of analytic methods have been developed for this purpose, of which the keystroke level model is just one simple example (though it is one of the best known).

6. Modelling: the keystroke level model (continued)

The keystroke level model deals with very short tasks, e.g. single commands, and estimates the task performance times of expert users, giving the designer an idea of the minimum task performance time.

6. Modelling: the keystroke level model (continued)

The keystroke level model does not consider novice users, who are prone to making errors; it assumes expert, error-free performance.

6. Modelling: the keystroke level model (continued)

The keystroke level model consists of four physical motor operators: K (keystroke), P (pointing), H (homing) and D (drawing). It also includes a mental operator M (the user's mental preparation) and a response operator R (the system's response time).

6. Modelling: the keystroke level model (continued)

The keystroke level model is quite old and was developed for keyboard-driven command interfaces, so interactions such as dragging and clicking in today's graphical user interfaces were not considered in the model.

6. Modelling: the keystroke level model (continued)

Execution time in the keystroke level model is simply the sum of the times for each operator: T = T(K) + T(P) + T(H) + T(D) + T(M) + T(R).
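The sum above can be computed directly once each operator is assigned a time. The slides give no numbers, so the values below are the commonly cited estimates from the original keystroke level model literature and should be treated as illustrative; the drawing operator D, whose time depends on the number and length of the segments drawn, is omitted here:

```python
# Commonly cited operator times in seconds (illustrative, not from the slides).
T = {
    "K": 0.20,   # keystroke, average skilled typist
    "P": 1.10,   # pointing at a target with the mouse
    "H": 0.40,   # homing the hand between keyboard and mouse
    "M": 1.35,   # mental preparation
    "R": 0.0,    # system response time; assumed instantaneous here
}

def klm_time(operators):
    """Sum the operator times for a sequence such as 'HMPK'."""
    return sum(T[op] for op in operators)

# Worked example: home the hand onto the mouse (H), mentally prepare (M),
# point at a menu item (P), and press a key or button (modelled as K).
total = klm_time("HMPK")
print(f"Predicted expert time: {total:.2f} s")  # 0.40 + 1.35 + 1.10 + 0.20 = 3.05 s
```

Because the model assumes expert, error-free performance, such a prediction is a lower bound on real task times, useful for comparing candidate designs rather than for absolute estimates.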

6. Modelling: the keystroke level model (continued)

The model assumes that the task starts with a homing action, when the user places his or her hand on the mouse.

Some final remarks about evaluation

It is important to carry out evaluation: not even the best organised and planned design implementation can replace it. Before doing an evaluation we must state which question the evaluation is intended to answer, or which hypothesis it is meant to test.

Some final remarks about evaluation (continued)

It is generally considered a good idea to run a pilot study first: a small study to test the procedures that will be used in the larger evaluation study. The benefit of carrying out a pilot study is that the main evaluation study can be planned better afterwards (e.g. it gives us a chance to practise the evaluation).

End of lecture