SIE 515 Design Evaluation Lecture 7

Today's Schedule
- Goals of design evaluation
- Evaluation by expert analysis (six approaches)
- Evaluation through user-based participation (five approaches)
- Laboratory vs. field studies
- Discussion of the design project

Goals of Design Evaluation
Two components of evaluation:
- Assessing the system design
- Checking user requirements
Evaluation is done throughout the design cycle (a feedback loop).
Three goals of design evaluation:
- Assess system functionality
- Assess the user experience
- Identify problems

Two Forms of Evaluation
Evaluation through expert analysis:
- Goal: guide initial specifications and development
- No user input
User-based evaluation:
- Input from users based on real-time use of the system
- Occurs later in the design cycle

Expert Evaluation
- Conducted when it is not possible to perform user-based evaluations
- A usability expert assesses the design for core user groups based on specific metrics
- Goal: guide initial specifications and identify design problems

Approaches to Expert Analysis
- Cognitive walkthroughs: applying cognitive principles to user interaction
- Heuristic evaluation: usability assessed against rules of thumb
- Use of models: merging formal cognitive models with design principles
- Use of prior research: existing results inform the current design
- Guidelines review: conformance with internal guidelines documents
- Consistency inspection: consistency across multiple application interfaces

Expert Analysis
Cognitive walkthroughs: applying cognitive principles to user interaction
- Evaluate each step of the interaction required to perform a task
- Support learning through user-directed exploration
Heuristic evaluation: usability assessed against rules of thumb
- Critiques the design using accepted principles, guidelines, and standards
- Five evaluators will typically identify about 75% of all usability problems (see the sketch below)
- Identified problems are rated on a severity scale
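The five-evaluators figure is usually traced to Nielsen and Landauer's model, in which the proportion of problems found by i independent evaluators is 1 - (1 - λ)^i, where λ is the probability that a single evaluator detects any given problem (about 0.31 on average in their case studies). A minimal sketch of that curve, assuming the published average λ:

```python
# Nielsen & Landauer: the proportion of usability problems found by i
# evaluators is 1 - (1 - lam)**i, where lam is the chance that one
# evaluator detects a given problem (~0.31 averaged across their studies).

def proportion_found(i: int, lam: float = 0.31) -> float:
    """Expected fraction of all usability problems found by i evaluators."""
    return 1 - (1 - lam) ** i

if __name__ == "__main__":
    for i in range(1, 11):
        print(f"{i:2d} evaluators -> {proportion_found(i):.0%} of problems")
    # With lam = 0.31, five evaluators find ~84% and three find ~67%;
    # the often-quoted "five find ~75%" corresponds to lam near 0.24.
```

The curve flattens quickly, which is the usual argument for several small heuristic evaluations rather than one large one.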

Nielsen's Ten Heuristics
1. Visibility of system status
2. Match between the system and the real world
3. User control and freedom
4. Consistency and standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility and efficiency of use
8. Aesthetic and minimalist design
9. Help users recognize, diagnose, and recover from errors
10. Help and documentation
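Findings from a heuristic evaluation are commonly recorded as one entry per problem, tagged with the violated heuristic and a severity rating (Nielsen's scale runs from 0 = not a problem to 4 = usability catastrophe). A minimal sketch; the record fields and the two example findings are illustrative, not from the lecture:

```python
from dataclasses import dataclass

# Nielsen's severity scale: 0 = not a problem, 1 = cosmetic, 2 = minor,
# 3 = major, 4 = usability catastrophe.

@dataclass
class Finding:
    heuristic: str    # which of the ten heuristics is violated
    description: str  # what the evaluator observed
    severity: int     # 0-4 on Nielsen's scale
    evaluator: str    # who reported it (individual lists are merged later)

findings = [
    Finding("Visibility of system status",
            "No progress indicator during file upload", 3, "evaluator-1"),
    Finding("Error prevention",
            "Delete button has no confirmation step", 4, "evaluator-2"),
]

# Group synthesis: merge the evaluators' lists and surface the worst first.
for f in sorted(findings, key=lambda f: f.severity, reverse=True):
    print(f"[{f.severity}] {f.heuristic}: {f.description}")
```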

Other Expert Evaluation
Model-based evaluation: merging formal cognitive models with design principles
- Examples: GOMS models, dialog models, and network models (a keystroke-level sketch follows)
Use of prior research: existing results inform the current design
- Often sufficient to make usability design decisions
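To make model-based evaluation concrete, the Keystroke-Level Model (the simplest member of the GOMS family) predicts expert task time by summing standard operator times. A minimal sketch using commonly published operator estimates (exact values vary by source); the example task is hypothetical:

```python
# Keystroke-Level Model (KLM): predict expert, error-free task time by
# summing per-operator times. Values below are commonly published
# estimates in seconds; sources differ slightly.
OPERATORS = {
    "K": 0.20,  # keystroke or button press
    "P": 1.10,  # point at a target with a mouse
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation before an action
}

def klm_time(sequence: str) -> float:
    """Total predicted time for a sequence of KLM operators, e.g. 'MHPK'."""
    return sum(OPERATORS[op] for op in sequence)

# Hypothetical task: think (M), reach for the mouse (H), point at a text
# field (P), click it (K), then type a four-character code (KKKK).
print(f"Predicted time: {klm_time('MHPK' + 'KKKK'):.2f} s")  # 3.85 s
```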

Evaluation Through User Participation
- Observational methods
- Think-aloud procedures
- Query techniques: interviews and questionnaires
- Physiological monitoring
- Experimental methods: using empirical results from human testing

Laboratory User Evaluation
Good experimental control, but less real-world validity

Field Study User Evaluation
Realistic context, but less experimental control
Good for longitudinal investigations

Usability Lab
[Image: a usability lab test with participant and observer seated at a workstation, separated by a one-way mirror.]

Observational Methods
- Based on observing the user interacting with the system on a pre-defined task
- The most common user-based evaluation technique
- Observations must be made consistently across sessions
- Observation alone does not reveal the user's decision-making processes
- Combine with a running description of user actions (the think-aloud technique)

Observational Techniques
- External observation: being present and viewing a group, but not participating
- Non-participant observation: secretly observing a group
- Participant observation: joining the routines of a group and observing its activity
- Covert observation: totally immersing yourself in a group's culture without identifying yourself

Think-Aloud Techniques
- The user talks through every action
- The data provide insight into the user's information-processing capacity and mental model
Cooperative evaluation:
- The user acts as an evaluator rather than a participant
- Users and experimenters can ask each other questions
Protocol analysis: scoring of the verbal record (a coding sketch follows)
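Scoring a verbal record usually means segmenting the transcript into utterances, assigning each a code, and tallying the codes. A minimal sketch; the code set and transcript lines are invented for illustration:

```python
from collections import Counter

# Hypothetical coded think-aloud transcript: (utterance, code) pairs.
# The code set (GOAL, ACTION, CONFUSION, RECOVERY) is illustrative; a
# real study defines a coding scheme suited to its research questions.
transcript = [
    ("I want to export this report", "GOAL"),
    ("I'll click the share icon",    "ACTION"),
    ("Hmm, where did the menu go?",  "CONFUSION"),
    ("Oh, it's under File instead",  "RECOVERY"),
    ("Now I'll pick PDF",            "ACTION"),
]

# Score the record by tallying how often each code occurs.
for code, n in Counter(code for _, code in transcript).most_common():
    print(f"{code:10s} {n}")
```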

Query Techniques
- Query evaluation: direct input from users
- Problem: the data are subjective and hard to generalize
- The most common query techniques are interviews and questionnaires
- Interviews should be focused and based on specific questions
- Questionnaires provide more control but less flexibility (see the scoring sketch below)
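Questionnaire responses are often reduced to a single usability score. The System Usability Scale (SUS), a standard instrument not named on this slide, scores ten 1-5 Likert items: odd items contribute (response - 1), even items contribute (5 - response), and the sum is multiplied by 2.5 to give a 0-100 score. A minimal sketch:

```python
# System Usability Scale (SUS) scoring: ten Likert items rated 1-5.
# Odd-numbered items are positively worded (contribute response - 1);
# even-numbered items are negatively worded (contribute 5 - response).
# The summed contributions are scaled by 2.5 onto a 0-100 range.

def sus_score(responses: list[int]) -> float:
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    total = sum((r - 1) if i % 2 == 0 else (5 - r)  # i = 0 is item 1 (odd)
                for i, r in enumerate(responses))
    return total * 2.5

# Hypothetical respondent, fairly positive about the system.
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))  # -> 85.0
```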

Physiological Evaluation
- The most objective and direct approach for obtaining user data
- Works best when combined with behavioral measures
- Problems: expensive equipment, complex technology, and data that are hard to analyze

Eye-Tracking
[Images: equipment and example output]

Galvanic Skin Response
[Images: equipment and example output]

Electrophysiological Sensors
[Images: equipment and example output]

fMRI
[Images: equipment and example output]

For Next Class
Assignment 7:
- Task 1: Read Shneiderman Chapter 4, Section 4.7 onward.
- Task 2: Answer the following questions: What are the most important contributions from the reading? How do these factors relate to good design? Give an example.
- Task 3: Identify questions or issues that you would like to discuss further in class or on the participation blog.