Usability Engineering Dr. Dania Bilal IS 587 Fall 2007
Purposes
 Measure multiple components of the user interface
 Address the relationship between the system and its users
 Bridge the gap between humans and machines

Purposes
 Measure the quality of the system design in relation to its intended users
 Assess the user's experience
 Improve the user's experience with the system
 Suggest system design improvements (appearance, navigation, content, etc.) to meet users' needs

Usability Attributes
 As described by Nielsen:
   Learnability
   Efficiency
   Memorability
   Errors and their severity
   Subjective satisfaction

Learnability
 The system must be easy to learn, especially for novice users
 Hard-to-learn systems are usually designed for expert users
 Learning curves differ for novice and expert users

Efficiency
 The system should be efficient to use, so that once the user has learned it, the user can achieve a high level of productivity
 Efficiency increases with learning (learning curve)

Memorability
 The system should be easy to remember, especially for casual users
 There should be no need to relearn the system after a period of not using it

Errors
 The system should have a low error rate
 The system should provide the user with a recovery mechanism
 Minor errors vs. major errors
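The error-rate attribute above can be made concrete with a small sketch: count errors observed across test sessions and divide by task attempts. The data structure and field names here are hypothetical, invented for illustration.

```python
# Hypothetical sketch: computing an overall error rate from observed
# user-testing sessions. Field names are illustrative only.

def error_rate(sessions):
    """Errors per task attempt across all observed sessions."""
    total_errors = sum(s["errors"] for s in sessions)
    total_tasks = sum(s["tasks_attempted"] for s in sessions)
    return total_errors / total_tasks if total_tasks else 0.0

sessions = [
    {"user": "P1", "tasks_attempted": 5, "errors": 2},
    {"user": "P2", "tasks_attempted": 5, "errors": 0},
    {"user": "P3", "tasks_attempted": 5, "errors": 3},
]
print(error_rate(sessions))  # 5 errors over 15 attempts ≈ 0.33
```

In practice, minor and major errors would be tallied separately, since the slide distinguishes their severity.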

Minor Errors
 Errors that do not greatly slow down the user's interaction with the system
 The user is able to recover from them, through system feedback or through awareness of the error made

Major Errors
 Difficult to recover from
 Lead to faulty work if high in frequency
 May not be discovered by the user
 Can be catastrophic

Subjective Satisfaction
 The system should be likeable to users (affective dimension)
 Satisfaction varies with the purpose of the system and with user goals

Assumptions
 The designer's best guess is not good enough
 The user is always right
 The user is not always right
 Users are not designers
 Designers are not users
 More features are not always better
 Minor interface details matter
 Online help does not really help
Source: Nielsen, J. (1993). Usability Engineering. San Diego: Morgan Kaufmann.

Methods
 Several methods are used for usability evaluation
 Each method is applied at an appropriate point in the design and development process
 Usability evaluation is also performed after the system is implemented, to test the user's experience

HCI Techniques
 Observe users
 Gather user opinion
 Gather expert opinion
 Test user performance
 Model user performance
 Use mixed methods (two or more techniques)

Cognitive Walkthrough
 Involves experts acting on behalf of actual users
 Characteristics of typical users are identified and documented
 Tasks focusing on the aspects of the design to be evaluated are developed

Cognitive Walkthrough
 An observer ("experimenter") is present, who:
   Prepares tasks
   Takes notes, provides help, etc.
   Coordinates and oversees the final report

Cognitive Walkthrough
 Experts walk through the interface for each task
 Experts record problems that users may experience
 Assumptions about what would cause problems, and why, are noted
 Benchmarks may be used for each task
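The walkthrough record described above can be sketched as a simple data structure: for each step of a task, the expert answers the walkthrough questions and notes the assumed cause of any predicted problem. The structure, task, and field names below are hypothetical examples, not from the lecture.

```python
# Illustrative sketch of a cognitive-walkthrough log for one task.
# Each step records the expert's answer to a walkthrough question
# and any predicted problem with its assumed cause.

walkthrough = {
    "task": "Find a book by subject",
    "steps": [
        {"action": "Open the subject hierarchy",
         "will_user_know_what_to_do": True,
         "problem": None},
        {"action": "Select the 'History' category",
         "will_user_know_what_to_do": False,
         "problem": "Category label is ambiguous",
         "assumed_cause": "Label does not match user vocabulary"},
    ],
}

# Collect the steps where the expert predicted a problem.
problems = [s for s in walkthrough["steps"] if s["problem"]]
print(len(problems))  # prints 1
```

A log like this makes it straightforward to compile the per-task draft notes that are later shared with the other experts and the experimenter.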

Sample Questions for a Walkthrough
 Will the user know what to do to complete part or all of the task successfully?
 Can the user see the button or icon to use for the next action?
 Can the user find a specific subject category in the hierarchy?

Cognitive Walkthrough
 Each expert documents his or her experience of the walkthrough for each task
   Critical problems are documented
   Problems, and what causes them, are explained
   Draft reports/notes are compiled and shared with the other experts and the experimenter

Debriefing Session
 Experts and the experimenter meet and discuss findings
 The experimenter shares his/her observational notes with the experts
 Findings include success stories and failure stories, as applicable
 A consolidated report is generated

Walkthrough Report
 Includes the questions the experts answered for each task and the consolidated answers
 Uses benchmarks and maps out the findings for each task

Heuristic Evaluation
 Evaluators interact with an interface several times and map the interface to specific heuristics or guidelines (example: Nielsen's ten heuristics)
 Each evaluator generates a report
 Reports are aggregated and a final report is generated
 An observer may be present

Stages of Heuristic Evaluation
 Stage 1: Briefing session
   Experts are told what to do
   Written instructions are provided to each expert
   Heuristics are provided to each expert as part of the written instructions
   Verbal instructions may be included

Stages of Heuristic Evaluation
 Stage 2: Evaluation sessions
   Each expert tests the system based on the heuristics
   Experts may also use specific tasks
   Two passes are taken through the interface:
     First pass: overview and familiarization
     Second pass: focus on specific features and identify usability problems

Stages of Heuristic Evaluation
 Stage 3: Debriefing session
   Experts meet to discuss the outcome and compare findings
   Experts consolidate findings
   Experts prioritize the usability problems found and suggest solutions
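The prioritization step in the debriefing stage can be sketched with one common scoring scheme (an assumption for illustration, not the scheme from the lecture): weight each consolidated problem by how many evaluators found it and by an impact rating, then sort by the resulting severity score. All problem descriptions and numbers below are invented.

```python
# Hypothetical sketch: prioritizing consolidated heuristic-evaluation
# findings. severity = (number of evaluators who found it) x (impact 0-4).

findings = [
    {"problem": "No feedback after search", "evaluators": 3, "impact": 3},
    {"problem": "Jargon in menu labels",    "evaluators": 5, "impact": 2},
    {"problem": "Tiny click targets",       "evaluators": 1, "impact": 4},
]

for f in findings:
    f["severity"] = f["evaluators"] * f["impact"]

# Highest-severity problems first, for the consolidated report.
prioritized = sorted(findings, key=lambda f: f["severity"], reverse=True)
print([f["problem"] for f in prioritized])
```

Whatever weighting is chosen, the point is that prioritization is done on the consolidated findings, not on each evaluator's individual report.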

Nielsen’s Heuristics
 The ten heuristics are available online in Nielsen’s heuristic list
 Some heuristics can be combined under specific categories and given a general description

User Testing with Experimentation
 Used to test the usability of a product with its intended user population
 A systematic approach to evaluating user performance
 Used to improve the usability of a design

User Testing Requirements
 Goals
 Selection of participants (how many is appropriate?)
 Development of tasks; types of tasks:
   Assigned or imposed
   Self-generated
   Semi-assigned or semi-imposed
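The "how many participants is appropriate?" question has a well-known answer from Nielsen and Landauer: the proportion of usability problems found by n users is modeled as 1 - (1 - L)^n, where L is the proportion a single user uncovers (about 0.31 in their studies). A quick computation shows why five users is often cited as enough:

```python
# Nielsen & Landauer's model of problem discovery in user testing:
# proportion of problems found by n users, with L ≈ 0.31 being the
# average proportion found by a single user in their studies.

def proportion_found(n, L=0.31):
    return 1 - (1 - L) ** n

for n in (1, 3, 5, 10):
    print(n, round(proportion_found(n), 2))
```

With these numbers, five participants uncover roughly 85% of the problems, which is why small iterative tests are usually preferred over one large test.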

User Testing Procedures
 Development of procedures:
   Script to greet participants
   Script to explain the procedure
   Script to introduce/describe tasks
   Script to direct users to think aloud
   Script to ask questions after task completion

Data and the User’s Experience
 Data collection (varies with the method used)
 Data analysis
 Findings
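Two of the most basic analyses of user-testing data are task completion rate and mean time-on-task. A minimal sketch, with invented data values and field names:

```python
# Minimal sketch of analyzing user-testing data: completion rate and
# mean time-on-task for one task. All values are invented examples.

results = [
    {"user": "P1", "task": "T1", "completed": True,  "seconds": 95},
    {"user": "P2", "task": "T1", "completed": True,  "seconds": 120},
    {"user": "P3", "task": "T1", "completed": False, "seconds": 180},
]

completion_rate = sum(r["completed"] for r in results) / len(results)
mean_time = sum(r["seconds"] for r in results) / len(results)

print(round(completion_rate, 2), round(mean_time, 1))
```

Richer analyses (error counts, satisfaction ratings, think-aloud coding) depend on which data-collection method was used, as the slide notes.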

Usability Sources
 How to conduct a heuristic evaluation
 Collection of usability articles
 Learning about usability testing (Jared Spool)
 Severity ratings