Usability Engineering Dr. Dania Bilal IS 587 Fall 2007.


1 Usability Engineering Dr. Dania Bilal IS 587 Fall 2007

2 Purposes  Measure multiple components of the user interface  Address relationships between the system and its users  Bridge the gap between humans and machines

3 Purposes  Measure the quality of the system design in relation to its intended users  Assess the user’s experience  Improve the user’s experience with the system  Suggest system design improvements in appearance, navigation, content, etc. to meet users’ needs

4 Usability Attributes  As described by Nielsen: Learnability Efficiency Memorability Errors & their severity Subjective satisfaction

5 Learnability  System must be easy to learn, especially for novice users  Hard-to-learn systems are usually designed for expert users  The learning curve differs for novice and expert users

6 Efficiency  System should be efficient to use, so that once the user has learned it, the user can achieve a high level of productivity  Efficiency increases with learning (the learning curve)
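
The efficiency gains described on this slide can be quantified by tracking time-on-task across repeated trials of the same task. A minimal sketch of that measurement (the trial timings below are hypothetical example data, not from an actual study):

```python
# Sketch: measuring efficiency gains across repeated trials of one task.
# Times (in seconds) per trial are hypothetical example data for one user.
trial_times = [185.0, 140.0, 112.0, 98.0, 95.0]  # trials 1..5

# Percent improvement from first to last trial summarizes the learning effect.
improvement = (trial_times[0] - trial_times[-1]) / trial_times[0] * 100
print(f"Time on task fell from {trial_times[0]:.0f}s to {trial_times[-1]:.0f}s "
      f"({improvement:.0f}% improvement)")

# Trial-over-trial deltas show the learning curve flattening out.
for i in range(1, len(trial_times)):
    delta = trial_times[i - 1] - trial_times[i]
    print(f"Trial {i} -> {i + 1}: saved {delta:.0f}s")
```

Plotting the same per-trial times gives the familiar learning-curve shape: steep savings early, flattening as the user approaches expert-level productivity.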

7 Memorability  System should be easy to remember, especially for casual users  No need to learn how to use the system all over again after a period of not using it

8 Errors  System should have a low error rate  System should provide the user with a recovery mechanism  Two classes: minor errors and major errors
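
The "low error rate" attribute is typically operationalized by logging errors per task attempt and tagging each by severity class. A minimal sketch, where the session log, task names, and minor/major labels are hypothetical example data:

```python
# Sketch: computing an error rate per task attempt and separating
# minor from major errors. "major" marks errors the user could not
# recover from on their own; all data below is hypothetical.
sessions = [
    {"task": "search catalog", "errors": [("wrong menu chosen", "minor"),
                                          ("lost results page", "major")]},
    {"task": "search catalog", "errors": [("wrong menu chosen", "minor")]},
    {"task": "renew book",     "errors": []},
]

total_errors = sum(len(s["errors"]) for s in sessions)
error_rate = total_errors / len(sessions)  # errors per task attempt
major = [e for s in sessions for e in s["errors"] if e[1] == "major"]

print(f"{total_errors} errors over {len(sessions)} attempts "
      f"(rate {error_rate:.2f} per attempt), {len(major)} major")
```

Splitting the tally this way supports the next two slides: minor errors feed into feedback-design fixes, while any nonzero major-error count flags a recovery-mechanism gap.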

9 Minor Errors  Errors that do not greatly slow down the user’s interaction with the system  User is able to recover from them, through system feedback or through awareness of the error made

10 Major Errors  Difficult to recover from  Lead to faulty work if high in frequency  May not be discovered by the user  Can be catastrophic

11 Subjective Satisfaction  System should be likeable to users (the affective dimension)  Satisfaction varies with the purpose of the system and with user goals

12 Assumptions  The designer’s best guess is not good enough  The user is always right  The user is not always right  Users are not designers  Designers are not users  More features are not always better  Minor interface details matter  Online help does not really help Source: Nielsen, J. (1993). Usability Engineering. San Diego: Morgan Kaufmann.

13 Methods  Several methods are used for usability evaluation  Each method is applied at an appropriate point in the design and development process  Usability testing is also performed after the system is implemented, to test the user’s experience.

14 HCI Techniques  Observe users  Gather user opinion  Gather expert opinion  Test user performance  Model user performance  Use mixed methods (two or more techniques)

15 Cognitive Walkthrough  Involves experts acting on behalf of actual users  Characteristics of typical users are identified & documented  Tasks focusing on aspects of design to be evaluated are developed

16 Cognitive Walkthrough  An observer (“experimenter”) is present who: Prepares tasks Takes notes Provides help, etc. Coordinates and oversees the final report

17 Cognitive Walkthrough  Experts walk through the interface for each task  Experts record problems that users may experience  Assumptions about what would cause problems, and why, are noted  Benchmarks may be used for each task

18 Sample Questions for a Walkthrough  Will the user know what to do to complete part of, or the whole of, a task successfully?  Can the user see the button or icon to use for the next action?  Can the user find a specific subject category in the hierarchy?

19 Cognitive Walkthrough  Each expert documents the walkthrough experience for each task: Critical problems are documented Problems, and what causes them, are explained Draft report/notes are compiled and shared with the other experts and the experimenter

20 Debriefing Session  Experts and experimenter meet & discuss findings  Experimenter shares his/her observational notes with experts  Findings include success stories & failure stories, as applicable  Consolidated report is generated

21 Walkthrough Report  Include the questions posed to the experts for each task and the consolidated answers  Use benchmarks and map out the findings for each task

22 Heuristic Evaluation  Evaluators interact with an interface several times and map the interface to specific heuristics or guidelines Example: Nielsen’s ten heuristics  Each evaluator generates a report  Reports are aggregated and a final report is generated  An observer may be present
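
The report-aggregation step on this slide can be as simple as counting how many evaluators flagged each problem under each heuristic. A sketch of that tally; the evaluator names, problem labels, and heuristic assignments are hypothetical example data:

```python
from collections import Counter

# Sketch: aggregating individual evaluators' heuristic-evaluation reports.
# Each report lists (problem, violated heuristic) pairs; all entries below
# are hypothetical example data.
reports = {
    "evaluator_1": [("no undo on delete", "User control and freedom"),
                    ("jargon in menus", "Match between system and real world")],
    "evaluator_2": [("no undo on delete", "User control and freedom")],
    "evaluator_3": [("jargon in menus", "Match between system and real world"),
                    ("no search feedback", "Visibility of system status")],
}

# Count how many evaluators reported each (problem, heuristic) pair.
counts = Counter(pair for findings in reports.values() for pair in findings)
for (problem, heuristic), n in counts.most_common():
    print(f"{n}/{len(reports)} evaluators: {problem} [{heuristic}]")
```

Problems flagged by several evaluators independently are strong candidates for the final report; singletons are reviewed in the debriefing session before being kept or dropped.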

23 Stages of Heuristic Evaluation  Stage 1: Briefing session Experts are told what to do Written instructions are provided to each expert Heuristics are provided to each expert as part of the written instructions Verbal instructions may be included

24 Stages of Heuristic Evaluation  Stage 2: Evaluation sessions Each expert tests the system against the heuristics Experts may also use specific tasks Two passes are made through the interface  First pass: overview and familiarity  Second pass: focus on specific features & identify usability problems

25 Stages of Heuristic Evaluation  Stage 3: Debriefing session Experts meet to discuss outcome and compare findings Experts consolidate findings Experts prioritize usability problems found & suggest solutions
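
The prioritization step in Stage 3 is often done with severity ratings on a 0 (not a problem) to 4 (usability catastrophe) scale, as in the severity-rating article linked under Usability Sources. A sketch of sorting consolidated findings by mean severity; the findings and individual ratings are hypothetical example data:

```python
# Sketch: prioritizing consolidated usability problems by averaged
# severity ratings (0 = not a problem .. 4 = usability catastrophe).
# One rating per expert; all findings and ratings below are hypothetical.
findings = {
    "no undo on delete":  [4, 3, 4],
    "jargon in menus":    [2, 2, 1],
    "no search feedback": [3, 2, 2],
}

# Sort highest mean severity first, so fixes are proposed in priority order.
prioritized = sorted(findings.items(),
                     key=lambda kv: sum(kv[1]) / len(kv[1]),
                     reverse=True)
for problem, ratings in prioritized:
    mean = sum(ratings) / len(ratings)
    print(f"{mean:.1f}  {problem}")
```

Averaging across experts matters because individual severity judgments vary; the ordered list then drives which suggested solutions go into the consolidated report first.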

26 Nielsen’s Heuristics  The ten heuristics can be found at http://www.useit.com/papers/heuristic/heuristic_list.html  Some heuristics can be combined under specific categories and given a general description.

27 User Testing with Experimentation  Used to test usability of a product by intended user population  Systematic approach to evaluate user performance  Used to improve usability design

28 User Testing Requirements  Goals  Selection of participants (how many is appropriate?)  Development of tasks Types of tasks:  Assigned or imposed  Self-generated  Semi-assigned or semi-imposed
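
One common way to answer "how many participants is appropriate?" is the problem-discovery model Found(n) = 1 − (1 − L)^n, where L is the probability that a single user uncovers a given problem; L ≈ 0.31 is the average Nielsen and Landauer reported, but it should be treated here as an assumption that varies by system and user group:

```python
# Sketch: estimating the share of usability problems n test users will find,
# using Found(n) = 1 - (1 - L)**n. L = 0.31 is an assumed average per-user
# problem-discovery rate; measure your own L where possible.
L = 0.31

for n in (1, 3, 5, 10, 15):
    found = 1 - (1 - L) ** n
    print(f"{n:2d} users -> ~{found:.0%} of problems found")
```

The curve flattens quickly, which is why small iterative tests (around five users per round) are often recommended over one large test.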

29 User Testing Procedures  Development of procedures Script to greet participants Script to explain procedure Script to introduce/describe tasks Script to direct users to think aloud Script to ask questions after task completion

30 Data and the User’s Experience  Data collection Varies with method used  Data Analysis  Findings

31 Usability Sources  http://www.usabilityfirst.com/methods  http://www.useit.com/papers/heuristic/heuristic_evaluation.html (how to conduct a heuristic evaluation)  http://www.uie.com/articles (collection of articles)  http://www.uie.com/articles/usability_tests_learn/ (learning about usability tests, by Jared Spool)  http://www.useit.com/papers/heuristic/severityrating.html (severity ratings)

