CSM18 Usability Engineering


CSM18 Usability Engineering
Evaluation: test the usability and functionality of an interactive system
Goals of an evaluation
• assess the extent of the system’s functionality
• assess its usability - see 10 heuristics
• assess the effect & affect of the interface on the user
• identify any specific problems with the system or with its use
UniS Department of Computing Dr Terry Hinton 17 April, 2019

Evaluation Methods for Interactive Systems
• Analytical Methods
• Experimental Methods
• Observational Methods
• Query Methods

Evaluation Methods for Interactive Systems
Analytical Methods
• predict performance based on a model, e.g. analysis of a cash dispenser based on the number of keystrokes required, the time needed to press a key, the time needed to think, and the time needed to react
Experimental Methods
• design experiments in the laboratory, e.g. speed of recognition of key words depending on font & colour
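The keystroke-counting analysis of the cash dispenser is essentially a Keystroke-Level Model (KLM) prediction. A minimal sketch, assuming the commonly cited average operator times from Card, Moran & Newell; the `R` (system response) value is a placeholder you would measure on the real system:

```python
# Keystroke-Level Model (KLM) sketch: predict expert task time by
# summing per-operator time estimates. Values are illustrative.
KLM_TIMES = {
    "K": 0.28,  # press a key (average typist)
    "M": 1.35,  # mental preparation (thinking)
    "R": 1.20,  # system response time (assumed; measure your system)
}

def predict_time(operators):
    """Sum operator times for a sequence such as 'MKKKK'."""
    return sum(KLM_TIMES[op] for op in operators)

# Cash dispenser example: think, then type a 4-digit PIN.
pin_entry = "M" + "K" * 4
print(round(predict_time(pin_entry), 2))  # 1.35 + 4*0.28 = 2.47
```

Longer tasks are modelled the same way: write out the operator sequence, then sum the per-operator estimates.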

Evaluation Methods for Interactive Systems
Observational Methods - in the field
• users: expert users, typical users, novice users
Query Methods
• survey opinions and attitudes (e.g. is it easy? enjoyable?); also consider contextual issues
• interviews
• questionnaires

Usability
Usability defined: usability = efficiency + effectiveness + enjoyment
• a single usability parameter cannot be computed - J Nielsen proposed 10 usability heuristics (see later)
• heuristics - a set of rules for solving problems other than by an algorithm (Collins English Dictionary, 2nd Ed.)

Experimental methods - in the laboratory
• design an experiment for laboratory conditions
• make a testable hypothesis
• select your subjects - sample size
• select the variables - change one at a time
• statistical measures - time, speed, number of events, number of errors (recoverable, fatal)
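A laboratory experiment like the font-recognition example typically compares one statistical measure across two conditions. A sketch using only the Python standard library; the condition names and completion times are invented for illustration:

```python
import statistics as st

def welch_t(a, b):
    """Welch's t statistic for two independent samples.
    Compare the result against t tables for significance."""
    va, vb = st.variance(a), st.variance(b)
    return (st.mean(a) - st.mean(b)) / ((va / len(a) + vb / len(b)) ** 0.5)

# Hypothetical word-recognition times (seconds) under two fonts.
serif = [12.1, 11.4, 13.0, 12.6, 11.9]
sans  = [10.2, 10.8,  9.9, 10.5, 10.4]
print(round(welch_t(serif, sans), 2))
```

Changing one variable at a time (here, the font) is what lets a difference in the measure be attributed to that variable.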

Observational techniques - in the field
• observe behaviour - either arbitrary activity, or set the tasks
Task analysis
• specifying a set of tasks gives insight into usability
• specifying a goal gives insight into the cognitive strategy used
• record actions, time, errors etc.

Observational techniques - in the field
Verbal protocol - think-aloud
Protocol analysis
• paper and pencil
• audio recording
• video recording
• computer logging
• user notebooks
• automatic protocol analysis tools
Post-event protocol - teach-back, or post-task “walkthroughs”
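Computer logging from the list above can be as simple as timestamping each user action for later protocol analysis. A minimal sketch; `InteractionLog` is a hypothetical helper, not a standard library class:

```python
import time

class InteractionLog:
    """Record timestamped user actions for protocol analysis."""
    def __init__(self):
        self.start = time.monotonic()
        self.events = []

    def record(self, action, detail=""):
        # Store (seconds since session start, action, detail).
        elapsed = round(time.monotonic() - self.start, 3)
        self.events.append((elapsed, action, detail))

    def errors(self):
        return [e for e in self.events if e[1] == "error"]

log = InteractionLog()
log.record("click", "Save button")
log.record("error", "dialog dismissed without saving")
print(len(log.events), len(log.errors()))  # 2 1
```

A log like this yields the actions, times and error counts that the task-analysis slide asks evaluators to record.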

Query techniques - Attitudinal Data
Interviews (see page 35 et seq.)
• design an interview schedule
Questionnaires
• general
• open-ended
• scalar
• multi-choice

Planning an evaluation
Factors to be considered in planning an evaluation:
• purpose - who are the stakeholders
• laboratory vs field studies
• qualitative vs quantitative measures
• information provided
• immediacy of response
• intrusiveness
• resources

Ten usability heuristics by J Nielsen
Visibility of system status
• the system should keep users informed about what is going on
Match between system and the real world
• the system should speak the users’ language - words, phrases and concepts familiar to the user (rather than system-oriented terms)
User control and freedom
• users often choose system functions by mistake - support undo/redo
Consistency and standards
• follow platform conventions (users shouldn’t have to wonder whether different words, situations, or actions mean the same thing)

Ten usability heuristics
Error prevention
• better than error messages
Recognition rather than recall
• make objects, actions, and options visible (users shouldn’t have to remember information)
Flexibility and efficiency of use
• accelerators (unseen by novice users) may speed up interaction for expert users - the system allows users to tailor frequent actions
Aesthetic and minimalist design
• simplicity is beauty

Ten usability heuristics
Help users recognise, diagnose, and recover from errors
• express error messages in plain language (no codes), indicate the problem, and suggest a solution
Help and documentation
• ideally a system can be used without documentation, but most often it is necessary to provide help and documentation; such information should be easy to search, focused on the user’s task, list concrete steps to be carried out, and not be too long - examples are always helpful

Questionnaire Design (see page 35 et seq.)
A simple checklist: yes / no / don’t know (e.g. for each of the commands copy and paste)
Example of a 6-point rating scale (avoid a middle value), with endpoints “very useful” and “of no use”

Questionnaire Design
An example of a Likert scale:
strongly agree / agree / slightly agree / neutral / slightly disagree / disagree / strongly disagree
An example of a semantic differential scale, anchored by “easy” and “difficult”:
extremely / slightly / neutral / slightly / extremely
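Analysing Likert responses usually means mapping each point on the scale to a number and averaging. A sketch using the seven-point scale from the slide; the numeric mapping and the sample responses are illustrative assumptions:

```python
# Map each Likert point to a score (7 = strongly agree ... 1 = strongly disagree).
LIKERT = {
    "strongly agree": 7, "agree": 6, "slightly agree": 5, "neutral": 4,
    "slightly disagree": 3, "disagree": 2, "strongly disagree": 1,
}

def mean_score(responses):
    """Average numeric score for a list of Likert responses."""
    scores = [LIKERT[r] for r in responses]
    return sum(scores) / len(scores)

responses = ["agree", "strongly agree", "neutral", "agree"]
print(mean_score(responses))  # (6 + 7 + 4 + 6) / 4 = 5.75
```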

Questionnaire Design
An example of a ranked-order question:
Place the following commands in order of usefulness using the numbers 1 to 4, 1 being the most useful: copy, paste, group, clear
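Answers to a ranked-order question like this are often summarised by mean rank, where a lower mean rank means more useful. A sketch with invented respondent data:

```python
# Each dict holds one respondent's answer: command -> rank (1 = most useful).
rankings = [
    {"copy": 1, "paste": 2, "group": 4, "clear": 3},
    {"copy": 2, "paste": 1, "group": 3, "clear": 4},
    {"copy": 1, "paste": 2, "group": 3, "clear": 4},
]

def mean_ranks(rankings):
    """Average the rank each command received across respondents."""
    commands = rankings[0]
    return {c: sum(r[c] for r in rankings) / len(rankings) for c in commands}

ranks = mean_ranks(rankings)
print(min(ranks, key=ranks.get))  # copy: mean rank (1 + 2 + 1) / 3
```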

Evaluation in the Design Phase
Participatory Design - the user is involved in the whole design life cycle
A number of methods help convey information between user and designer:
• brainstorming
• storyboarding
• workshops
• pencil & paper exercises
• role playing

Evaluating the design
• cognitive walkthrough
• heuristic evaluation
• review-based evaluation
• model-based evaluation

Choosing an evaluation method
Ref: Dix, A., Finlay, J., Abowd, G., Beale, R. (1994)
