Usability

The maxim of HCI designers: Know Thy Users, For They Are Not You.
Who are your users?
- How old are they?
- What do they know?
- What do they want?
- How do they work?

Key Pillars of Design
- Guidelines
- Prototyping (software tools)
- Reviews and usability testing

Interaction, Tasks and Users
- Computers are good at remembering; people are not.
- Command languages are not good for occasional users or novices.
- Do what people expect: model the interface after real-world objects or follow user interface guidelines; both approaches make the interface better and more consistent.
- Experts do remember: give them shortcuts, even command languages.
- Give people individual options; avoid global modes.

Scenario development
- Study the range and distribution of task frequencies and sequences.
- Tabulate user communities x tasks (see the sketch below).
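
As a concrete illustration, here is a minimal sketch of a user-communities x tasks frequency table; the community names, task names, and counts are hypothetical placeholders, not data from any study.

```python
# Hypothetical user-communities x tasks frequency matrix.
# Rows are user communities, columns are tasks; each cell is how
# often that community performs that task (counts from observation).
tasks = ["search", "print", "save", "configure"]
frequencies = {
    "novices":   [120,  40,  30,   2],
    "clerks":    [300, 250, 180,   5],
    "managers":  [ 80,  20,  60,   1],
    "sysadmins": [ 30,   5,  25,  90],
}

# Report each community's most frequent task -- scenarios should
# cover at least these high-frequency community/task pairs.
for community, counts in frequencies.items():
    top_count, top_task = max(zip(counts, tasks))
    print(f"{community:10s} most frequent task: {top_task} ({top_count} uses)")
```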

Ethnographic observation guidelines:
- Preparation
- Field study
- Analysis
- Reporting

Participatory design: having end-users participate in design.
Pros:
- More accurate information about tasks
- Opportunity for users to influence design decisions
- Increased user acceptance
Cons:
- Very costly
- Lengthens the implementation process
- Builds antagonism with users whose ideas are rejected
- Forces designers to compromise designs to satisfy incompetent participants

Usability
A. Expert reviews
- Guidelines review
- Consistency inspection
- Cognitive walkthrough
- Formal usability inspection
www.bgu.ac.il/~dleiser

B. Usability testing
Typical excuse for skipping it: nice idea, but time and resources are limited.
Steps (sketched below):
- Determine tasks and users
- Design test activities
- Develop and test prototypes
- Collect data
- Analyze data
- Repeat
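
A minimal sketch of this test-analyze-revise loop, assuming simulated sessions; run_session and the severity values are hypothetical stand-ins for real observation, not an actual testing API.

```python
# Hypothetical skeleton of the iterative usability-testing cycle.
# The data is simulated; in practice each "session" is a real
# observation of a real user performing a real task.
import random

def run_session(prototype, user, task):
    # Stand-in for observing one user on one task; records a problem
    # at random so the loop below has something to analyze.
    severity = random.choice(["none", "low", "high"])
    return {"user": user, "task": task, "severity": severity}

def usability_test_cycle(prototype, tasks, users, max_rounds=3):
    """Test, analyze, revise, repeat -- stop when no severe problems remain."""
    problems = []
    for round_no in range(1, max_rounds + 1):
        observations = [run_session(prototype, u, t)
                        for u in users for t in tasks]
        problems = [o for o in observations if o["severity"] != "none"]
        severe = [p for p in problems if p["severity"] == "high"]
        print(f"round {round_no}: {len(problems)} problems, {len(severe)} severe")
        if not severe:
            break
        prototype = prototype + "'"   # stand-in for revising the prototype
    return prototype, problems

usability_test_cycle("v1", ["search", "save"], ["user_a", "user_b"])
```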

Cognitive walkthrough
The cognitive walkthrough approach to evaluation originates in the code walkthrough familiar from software engineering. Walkthroughs require a detailed review of a sequence of actions, i.e., the steps an interface requires a user to perform in order to accomplish some task. The evaluators then step through that action sequence to check it for potential usability problems. Usually the focus is on learning through exploration: experience shows that many users prefer to learn how to use a system by exploring its functionality hands-on, rather than after sufficient training or examination of a user's manual. To do this, the evaluators go through each step in the task and provide a story about why that step is or is not good for a new user.

Walkthrough requirements
1. A description of the prototype of the system. It doesn't have to be complete, but it should be fairly detailed; details such as the location and wording of a menu can make a big difference.
2. A description of the task the user is to perform on the system. This should be a representative task that most users will want to do.
3. A complete, written list of the actions needed to complete the task with the given prototype.
4. An indication of who the users are and what kind of experience and knowledge the evaluators can assume about them.
Given this information, the evaluators step through the action sequence (item 3 above) to critique the system. (A sketch of these inputs as a data structure follows.)
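
A minimal sketch of the four walkthrough inputs gathered into one record; the field names and the example values are hypothetical, chosen only to mirror the numbered list above.

```python
# Hypothetical record of the four inputs a cognitive walkthrough needs.
from dataclasses import dataclass

@dataclass
class WalkthroughInput:
    prototype: str        # 1. description of the prototype (fairly detailed)
    task: str             # 2. representative task the user is to perform
    actions: list[str]    # 3. complete, written action sequence
    user_profile: str     # 4. who the users are and what they can be assumed to know

example = WalkthroughInput(
    prototype="Mail client mock-up; 'Send' is the leftmost toolbar button",
    task="Send a saved draft to a colleague",
    actions=["Open Drafts folder", "Select the draft", "Click Send"],
    user_profile="Office workers; familiar with web mail, new to this client",
)
```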

The iteration cycle
For each action, the evaluators try to answer the following four questions (the loop is sketched below):
A. Will the users be trying to produce whatever effect the action has? Are the assumptions about what task the action is supporting correct, given the user's experience and knowledge up to this point in the interaction?
B. Will users be able to notice that the correct action is available? Will users see the button or menu item, for example, through which the next action is actually achieved by the system? This is not asking whether they will know that the button is the one they want; it is merely asking whether it is visible to them at the time when they need to invoke it. For example, a VCR remote control may have a hidden panel of buttons that are not obvious to a new user.
C. Once users find the correct action at the interface, will they know that it is the right one for the effect they are trying to produce? This complements the previous question.
D. After the action is taken, will users understand the feedback they get?
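
A minimal sketch of stepping through an action sequence and recording a yes/no answer to each of questions A-D; the answer callback and the record format are hypothetical illustrations, while the four questions come from the slide above.

```python
# Hypothetical per-action loop over the four walkthrough questions.
QUESTIONS = {
    "A": "Will users be trying to produce the effect this action has?",
    "B": "Will users notice that the correct action is available?",
    "C": "Will users know this action is the right one for the effect?",
    "D": "Will users understand the feedback after the action?",
}

def walk_through(actions, answer):
    """answer(action, question_id) -> True/False, supplied by the evaluators."""
    failures = []
    for step, action in enumerate(actions, start=1):
        for qid, question in QUESTIONS.items():
            if not answer(action, qid):
                # Any "no" becomes a usability problem report (next slide).
                failures.append({"step": step, "action": action,
                                 "question": qid, "text": question})
    return failures

# Example: flag question B (visibility) for the 'Click Send' step.
problems = walk_through(
    ["Open Drafts folder", "Select the draft", "Click Send"],
    answer=lambda action, qid: not (action == "Click Send" and qid == "B"),
)
print(problems)
```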

What is good and what needs improvement in the design is recorded on standard evaluation forms. For each action (from item 3 on the cover form), a separate standard form is filled out that answers each of questions A-D above. Any negative answer to any of the questions for any particular action should be documented on a separate usability problem report sheet. This problem report sheet should indicate the system being built (and the version, if necessary), the date, the evaluators, and a detailed description of the usability problem. It should also record the severity of the problem: whether the evaluators think the problem will occur often, and an impression of how serious it will be for the users. This information will help the designers decide priorities for correcting the design. (A sketch of such a report record follows.)
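
A minimal sketch of a problem report sheet as a record with the fields listed above; the field names and the filled-in example are hypothetical.

```python
# Hypothetical usability problem report sheet as a record.
from dataclasses import dataclass

@dataclass
class ProblemReport:
    system: str          # system (and version) being evaluated
    date: str
    evaluators: list[str]
    description: str     # detailed description of the usability problem
    frequency: str       # how often evaluators think it will occur
    severity: str        # how serious it will be for users

report = ProblemReport(
    system="Mail client mock-up v0.3",
    date="2024-01-15",
    evaluators=["evaluator 1", "evaluator 2"],
    description="'Send' button not visible while a draft is open (question B)",
    frequency="often",
    severity="high: users cannot complete the core task",
)
print(report)
```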

Evaluation during active use
- Interviews
- Focus-group discussions
- Data logging (sketched below)
- Online consultants
- Newsgroups
- Newsletters
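
A minimal sketch of data logging during active use, assuming a simple append-only JSON-lines file; the event names and log format are hypothetical.

```python
# Hypothetical data logging of user actions during active use.
import json
import time

LOG_PATH = "usage_log.jsonl"   # one JSON record per line

def log_event(user_id, event, **details):
    record = {"t": time.time(), "user": user_id, "event": event, **details}
    with open(LOG_PATH, "a") as f:
        f.write(json.dumps(record) + "\n")

# The running application calls log_event at interesting points:
log_event("u42", "command", name="print", source="menu")
log_event("u42", "error", message="printer not found")
log_event("u42", "help_opened", topic="printing")
```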

Experimental evaluation
Subjects
- Match the expected user population; use actual users if possible.
- Choose a sample size that yields statistically significant results.
Variables
- Independent: variables that are manipulated (interface style, number of items).
- Dependent: variables that are measured (speed, errors).
Hypotheses
- A prediction of the outcome in terms of the variables.
Experimental design
- Between-groups: subjects are assigned to different conditions.
- Within-groups: all subjects use all conditions.
Statistical measures
- Two rules: look at the data, save the data.
(A between-groups comparison is sketched below.)
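
A minimal sketch of a between-groups comparison of two interface styles on task completion time, using Welch's t statistic as one common statistical measure; the timing numbers are simulated stand-ins, not experimental results.

```python
# Hypothetical between-groups comparison: two interface styles,
# dependent variable = task completion time in seconds (simulated).
from math import sqrt
from statistics import mean, stdev

menu_ui    = [41.2, 38.5, 44.0, 39.9, 42.7, 40.1]   # group 1 (simulated)
command_ui = [35.8, 33.1, 37.4, 36.0, 34.5, 32.9]   # group 2 (simulated)

# Rule 1: look at the data before testing anything.
print("menu    mean=%.1f sd=%.1f" % (mean(menu_ui), stdev(menu_ui)))
print("command mean=%.1f sd=%.1f" % (mean(command_ui), stdev(command_ui)))

# Welch's t statistic for two independent (between-groups) samples.
n1, n2 = len(menu_ui), len(command_ui)
se = sqrt(stdev(menu_ui)**2 / n1 + stdev(command_ui)**2 / n2)
t = (mean(menu_ui) - mean(command_ui)) / se
print("t = %.2f" % t)   # compare against a t table for significance
```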

Observational techniques
Think-aloud
- The user describes what they believe is happening, why they act, and what they want to do.
- Simple, requires little expertise, and yields useful insight.
Protocol analysis
- Paper and pencil
- Audio recording
- Video recording
- Computer logging
- User notebooks
- Automatic protocol analysis tools (sketched below)
Post-task walkthroughs
- Discuss alternative (but not pursued) actions.
- Reflect back on actions.
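
A minimal sketch of an automatic protocol analysis tool: it replays a computer log (in the hypothetical JSON-lines format from the data-logging sketch above) and flags long gaps where the user may have hesitated.

```python
# Hypothetical automatic protocol analysis: compute gaps between
# logged events to spot where the user hesitated.
import json

log_lines = [
    '{"t": 0.0,  "user": "u42", "event": "command", "name": "print"}',
    '{"t": 2.1,  "user": "u42", "event": "error"}',
    '{"t": 38.6, "user": "u42", "event": "help_opened"}',   # long pause
]

events = [json.loads(line) for line in log_lines]
for prev, cur in zip(events, events[1:]):
    gap = cur["t"] - prev["t"]
    flag = "  <-- hesitation?" if gap > 10 else ""
    print(f"{prev['event']} -> {cur['event']}: {gap:.1f}s{flag}")
```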

Query techniques
A. Interviews
- The level of questioning can be varied to suit the context.
- The evaluator can probe the user on interesting issues.
- Pro: high-level evaluation, information about preferences, reveals problems.
- Con: hard to plan, needs skilled interviewers and willing participants.

B. Questionnaires
Styles: general, open-ended, scalar, multi-choice, ranked.
General
- Establish background: gender, experience, personality.
Open-ended
- Unprompted opinion on a question: "Can you suggest...?", "How would you...?"
- Often result in brief answers and cannot be summarized statistically.
Scalar
- Judge a statement on a numeric scale: 1-5, 1-7, or -2 to 2; the scale is chosen to balance coarse vs. fine judgments.
- Choices can be phrased negatively (hostile, vague, misleading) or positively (friendly, specific, beneficial).
Multi-choice
- Choose one or more from a list of explicit responses. Example: How do you get help with the system: manual, online, colleague?
- Useful for gathering information on the user's previous experience.
Ranked
- Place an ordering on a list to indicate preferences. Example: Please rank the usefulness of these methods: menu, command line, accelerator key.
(A sketch of summarizing scalar responses follows.)
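
A minimal sketch of summarizing scalar responses statistically, which open-ended answers do not allow; the statement and the ratings are hypothetical.

```python
# Hypothetical scalar questionnaire item on a 1-5 scale.
from collections import Counter
from statistics import mean, median

statement = "The system's error messages are specific and helpful."
ratings = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5]   # one rating per respondent

print(statement)
print(f"n={len(ratings)}  mean={mean(ratings):.1f}  median={median(ratings)}")
counts = Counter(ratings)
for score in range(1, 6):
    print(f"{score}: {'#' * counts[score]}")
```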