Methodology Overview
Dr. Saul Greenberg, John Kelleher


1 Role of Evaluation
The Star Life Cycle (Hix and Hartson): Evaluation sits at the centre, connected to Task analysis/functional analysis, Requirements specification, Conceptual design/formal design, Prototyping, and Implementation.


3 Goals of Evaluation
- Assess the extent of the system's functionality, including its 'articulatory directness'
- Assess the effect of the interface on users: learnability, users' attitudes
- Identify any specific problems with the system, e.g. behaviour that shows a lack of awareness of the users' 'real-world' context

4 When to evaluate?
Initial stages: evaluate choices of initial design ideas and representations
- is the representation appropriate?
- does it reflect how people think of their task?
Iterative development: fine-tune the interface, looking for usability bugs
- can people use this system?
Evaluation produces:
- user reactions to the design
- validation and a list of problem areas (bugs)
- new design ideas

5 Why Evaluate? (contd.)
Post-design
- acceptance test: did we deliver on promises? verify that the human/computer system meets expected performance criteria: ease of learning, usability, user's attitude, performance
- e.g. a first-time user will take 1-3 minutes to learn how to withdraw $50 from the automatic teller (made testable in the sketch below)
- revisions: what do we need to change?
- effects: what did we change in the way people do their tasks?
Evaluation produces:
- testable usability metrics
- actual reactions
- validation and a list of problem areas (bugs)
- changes in original work practices/requirements
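To make the ATM criterion concrete, here is a minimal sketch (Python; the timing data, the 80% pass-rate threshold, and the variable names are illustrative assumptions, not part of the lecture) of how such an acceptance criterion could be checked against measured learning times:

# Hypothetical acceptance check: do first-time users learn the withdrawal task in 1-3 minutes?
# The timing data and the required pass rate are invented for illustration.
learning_times_sec = [95, 130, 170, 62, 150, 178, 110]  # seconds per first-time user

CRITERION_MIN, CRITERION_MAX = 60, 180   # "1-3 minutes" expressed in seconds
REQUIRED_PASS_RATE = 0.80                # assumed threshold: 80% of users must meet the criterion

passes = [CRITERION_MIN <= t <= CRITERION_MAX for t in learning_times_sec]
pass_rate = sum(passes) / len(passes)

print(f"pass rate: {pass_rate:.0%}")
print("criterion met" if pass_rate >= REQUIRED_PASS_RATE else "criterion not met")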

6 Why Evaluate? (contd.)
Generalized knowledge
- are there general design principles?
- are there theories?
- can we validate ideas / visions / hypotheses?
Evaluation produces:
- principles and guidelines
- evidence supporting or rejecting theories

7 Two Forms of Evaluation
Formative
- a continuous process throughout the design; expect a number of iterations
- requires only a few users each time
- must be structured, but can be informal; don't just ask users to try the system!
- on average, 3 major 'design-test-redesign' cycles
Summative
- typically at the end of the process: a "go/no-go" decision, a quality-control issue
- should involve many users
- must be structured relative to usability (and other) requirements
- formal, often involving statistical inferences (see the sketch below)
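As a rough illustration of what 'statistical inferences' can mean in a summative test, the sketch below (Python with SciPy; the completion times, the 90-second requirement, and the 0.05 significance level are all assumed for illustration) runs a one-sided, one-sample t-test asking whether mean task-completion time beats a stated usability requirement:

# Hypothetical summative go/no-go check against a usability requirement; data are invented.
from scipy import stats

completion_times = [72, 85, 91, 68, 77, 88, 95, 70, 82, 79]  # seconds, one per participant
REQUIREMENT = 90.0  # assumed requirement: mean completion time under 90 seconds

# One-sided, one-sample t-test: H0 mean >= 90 s, H1 mean < 90 s
t_stat, p_value = stats.ttest_1samp(completion_times, REQUIREMENT, alternative="less")

mean_time = sum(completion_times) / len(completion_times)
print(f"mean = {mean_time:.1f} s, t = {t_stat:.2f}, p = {p_value:.3f}")
print("go" if p_value < 0.05 else "no-go: requirement not demonstrably met")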

8 Evaluating the 1984 OMS (Olympic Message System)
- early tests of printed scenarios & user guides
- early simulations of the telephone keypad
- an Olympian joined the team to provide feedback
- interviews & demos with Olympians outside the US
- overseas interface tests with friends and family
- free coffee and donut tests
- usability tests with 100 participants
- a 'try to destroy it' test
- pre-Olympic field test at an international event
- reliability testing of the system under heavy traffic

9 Why Should We Use Different Methods?
Information requirements differ
- pre-design, iterative design, post-design, generalisable knowledge…
Information produced differs
- outputs should match the particular problem/needs
Cost/benefit of using a method
- the cost of a method should match the benefit gained from the result
One method's strength can complement another's weakness
- no one method can address all situations
Constraints may force you to choose quick-and-dirty 'discount usability' methods

10 How Can We Compare Methods?
Relevance
- does the method provide information relevant to our question / problem?
Naturalistic
- is the method applied in an ecologically valid situation?
- do the observations reflect real-world settings: real environment, real tasks, real people, real motivation?
Setting
- field vs. laboratory?
Support
- are there tools for supporting the method and analyzing the data?

11 How Can We Compare Methods? (contd.)
Quickness
- can I do a good job with this method within my time constraints?
Cost
- is the cost of this method reasonable for my question?
Equipment
- what special equipment / resources are required?
Personnel, training and expertise
- what people are required to run this method, and what expertise must they have?
Subject selection
- how many subjects do I need, who do they have to be, and can I get them?

12 Usability Metrics
- time to complete a task
- time spent in error
- time spent using help or documentation
- ratio of successes to failures
- number of commands used
- number of times the user is "misled" by the interface
- number of times the user expresses frustration or satisfaction
- number of good or bad features recalled by the user
- time to reach a specific level of performance
- frequency and type of errors made by users
- user reaction times to particular tasks
- frequency of command usage, e.g. help
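A minimal sketch of how a few of these metrics might be computed from logged test sessions (Python; the per-session log format and its field names are assumptions for illustration, not the output of any real logging tool):

# Hypothetical per-participant session logs; the dictionary format is assumed for illustration.
sessions = [
    {"task_time": 74.2,  "error_time": 6.1,  "help_time": 0.0,  "completed": True,  "errors": 2},
    {"task_time": 91.5,  "error_time": 14.8, "help_time": 9.3,  "completed": True,  "errors": 4},
    {"task_time": 120.0, "error_time": 31.0, "help_time": 22.5, "completed": False, "errors": 7},
]

n = len(sessions)
mean_task_time  = sum(s["task_time"] for s in sessions) / n   # time to complete a task
mean_error_time = sum(s["error_time"] for s in sessions) / n  # time spent in error
mean_help_time  = sum(s["help_time"] for s in sessions) / n   # time spent using help
success_rate    = sum(s["completed"] for s in sessions) / n   # proportion of successful completions
mean_errors     = sum(s["errors"] for s in sessions) / n      # frequency of errors

print(f"mean task time: {mean_task_time:.1f} s, mean error time: {mean_error_time:.1f} s")
print(f"mean help time: {mean_help_time:.1f} s, success rate: {success_rate:.0%}, errors/user: {mean_errors:.1f}")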


14 What methods are there?
Lab tests
- Experimental methodologies: highly controlled observations and measurements to answer very specific questions, i.e. hypothesis testing
- Usability testing: mostly qualitative, less controlled observations of users performing tasks
Interface inspection
- Heuristic evaluation
- Walkthroughs: experts and others analyze an interface by considering what a user would have to do, a step at a time, while performing their task

15 What methods are there? (contd.)
Field studies
- Ethnography: the field worker immerses themselves in a culture to understand what that culture is doing
- Contextual inquiry: an interview methodology that gains knowledge of what people do in their real-world context

16 What methods are there? (contd.)
Cognitive modeling
- Fitts' Law: used to predict a user's time to select a target
- Keystroke-Level Model (KLM): low-level description of what users would have to do to perform a task
- GOMS: structured, multi-level description of what users would have to do to perform a task
Self reporting
- questionnaires
- surveys
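To show what these predictive models look like in practice, here is a small sketch (Python; the Fitts' Law coefficients and the KLM operator times are typical textbook-style placeholder values assumed for illustration, not values given in this lecture):

# Hypothetical sketch of two predictive models; all constants are illustrative placeholders.
import math

def fitts_time(distance: float, width: float, a: float = 0.1, b: float = 0.15) -> float:
    """Shannon formulation of Fitts' Law: MT = a + b * log2(D/W + 1), in seconds."""
    return a + b * math.log2(distance / width + 1)

# Keystroke-Level Model operator times in seconds (commonly cited values, assumed here).
KLM = {"K": 0.28, "P": 1.10, "H": 0.40, "M": 1.35}

def klm_time(operators: str) -> float:
    """Sum KLM operator times for a sequence such as 'HMPK' (home, mental prep, point, keystroke)."""
    return sum(KLM[op] for op in operators)

print(f"Fitts' Law: select a 20 px target 300 px away: {fitts_time(300, 20):.2f} s")
print(f"KLM: 'HMPKKKK' (home, think, point, 4 keystrokes): {klm_time('HMPKKKK'):.2f} s")

For example, the Fitts' Law prediction above works out to 0.1 + 0.15 * log2(300/20 + 1) = 0.70 s.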

17 Comparative Study of the 'Big 4'
"User Interface Evaluation in the Real World: A Comparison of Four Techniques" (Jeffries et al., 1991)
- Heuristic Evaluation (HE)
- Software Guidelines (SG)
- Cognitive Walkthrough (CW)
- Usability Testing (UT)

18 Problem Identification
Source: "User Interface Evaluation in the Real World: A Comparison of Four Techniques" (Jeffries, Miller, Wharton, Uyeda), HP

19 Cost-Benefit Analysis
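The cost-benefit comparison boils down to simple arithmetic; the sketch below (Python, with invented problem counts and person-hours rather than the figures reported by Jeffries et al.) computes a benefit-to-cost ratio as problems found per person-hour spent:

# Hypothetical cost-benefit arithmetic; the counts and hours are invented, not Jeffries et al.'s data.
techniques = {
    # technique: (usability problems found, person-hours spent)
    "Heuristic Evaluation":  (50, 20),
    "Software Guidelines":   (20, 15),
    "Cognitive Walkthrough": (30, 25),
    "Usability Testing":     (40, 100),
}

for name, (problems, hours) in techniques.items():
    print(f"{name:<22} {problems / hours:.2f} problems found per person-hour")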

20 Summary of Results