Utah School of Computing. HCI Validation. Richard F. Riesenfeld, University of Utah, Fall 2004.

Acknowledgements: Most of this course is based on the excellent course offered by Prof. Kellogg Booth at the University of British Columbia, Vancouver, Canada.

Slide 2: Major Considerations - 1
- Stage of design: conceptual, preliminary, detail
- Novelty of project: do we know what we are doing?
- Number of expected users: how important is this? How amenable to change will it be?

Slide 3: Major Considerations - 2
- Criticality of the interface: are lives at stake if there are problems?
- Cost of product: allocation for testing
- Time available for testing
- Experience of designers and evaluators

Slide 4: Expert Review Methods - 1
- Heuristic evaluation: experts critique the interface with respect to established criteria
- Guidelines review: does it meet "spec"? Can be an overwhelming list; a bureaucratic approach
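Heuristic-evaluation findings are usually recorded against a fixed list of heuristics with a severity rating. Below is a minimal Python sketch of such a log, assuming Nielsen-style heuristics and a 0-4 severity scale; the heuristic names, findings, and data structures are illustrative, not part of the original slides.

```python
# Minimal sketch of a heuristic-evaluation findings log (illustrative).
from collections import Counter
from dataclasses import dataclass

HEURISTICS = [
    "Visibility of system status",
    "Match between system and real world",
    "User control and freedom",
    "Consistency and standards",
]

@dataclass
class Finding:
    heuristic: str   # which guideline the problem violates
    location: str    # where in the UI it was observed
    severity: int    # 0 = not a problem ... 4 = usability catastrophe

findings = [
    Finding("Consistency and standards", "Settings dialog", 3),
    Finding("User control and freedom", "Checkout wizard", 4),
]

# Summarize how many problems each heuristic surfaced.
by_heuristic = Counter(f.heuristic for f in findings)
for heuristic, count in by_heuristic.items():
    print(f"{heuristic}: {count} finding(s)")
```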

Slide 5: Expert Review Methods - 2
- Consistency inspection: experts check style, function, form, etc.
- Cognitive walkthrough: experts perform the role of users and try to assess the interface's success from experience

Slide 6: Expert Review Methods - 3
- Formal usability inspection: a "moot court" with countervailing opinions; can be unwieldy

Slide 7: Comparative Evaluations - 1
- Different experts see different issues: you can get caught with conflicting advice*; limit the number of experts
- Get a "bird's-eye" view in the beginning: throw images on a wall, etc.
* "For every PhD, there is an equal and opposite PhD"

Slide 8: Comparative Evaluations - 2
- Formal (statistical) methods: form a hypothesis, determine dependent variables, identify independent variables
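To make the hypothesis / dependent variable / independent variable recipe concrete, here is a sketch of one common instantiation: a two-sample t-test on task completion time (the dependent variable) across two interface designs (the independent variable). The timing data is invented for illustration, and scipy is assumed to be available.

```python
# Illustrative sketch of a formal (statistical) comparison of two designs.
from scipy import stats

# H0: mean task completion time is the same for both designs.
times_design_a = [41.2, 38.5, 45.1, 39.9, 42.3, 44.0, 40.7, 43.5]
times_design_b = [35.0, 33.8, 37.2, 36.4, 34.1, 38.0, 35.9, 36.7]

t_stat, p_value = stats.ttest_ind(times_design_a, times_design_b)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Reject H0 at the 95% confidence level.")
else:
    print("Cannot reject H0 at the 95% confidence level.")
```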

Slide 9: Usability Labs, etc.
- Hard to employ because of time and money constraints in product development: development cycle schedule, budgets, corporate/cultural attitude

Slide 10: Controlled Experiments
- Statistical testing: establish a level of statistical significance ("at the 95% confidence level we know...")
- Usability testing: find flaws in the interface through more informal (inconclusive), empirical methods
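A sketch of what a statement like "at the 95% confidence level we know..." rests on: a confidence interval for mean task time, computed here with only the Python standard library. The timings are invented, and the t critical value is hard-coded for this particular sample size.

```python
# Sketch: 95% confidence interval for mean task time (illustrative data).
import math
import statistics

times = [41.2, 38.5, 45.1, 39.9, 42.3, 44.0, 40.7, 43.5, 39.2, 41.8]
n = len(times)
mean = statistics.mean(times)
sem = statistics.stdev(times) / math.sqrt(n)  # standard error of the mean
t_crit = 2.262  # two-sided 95% t critical value for df = n - 1 = 9

low, high = mean - t_crit * sem, mean + t_crit * sem
print(f"Mean task time: {mean:.1f}s, 95% CI [{low:.1f}, {high:.1f}]")
```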

Slide 11: Human Subjects - 1
- Careful, "This isn't Kansas anymore!" Many new dimensions need attention
- Human respect and dignity: voice-generated checkouts violated privacy; the military has NO privacy; other cultures treat these matters differently

Slide 12: Human Subjects - 2
- Real LEGAL issues, so get it right: informed consent; understand your liability; get it in writing, copy to each party
- Government or institutional rules: we are not accustomed to this; need cognizant approvals (IRBs, research proposals, etc.)

Slide 13: Observation Methods
- Have subjects "think aloud": will subjects be honest, etc.?
- Use video recording
- Field tests: study the successes/failures of the interface; getting access; reliance on memories ("How is it going?" We tend to react to the most recent experience)

Slide 14: Destructive Testing
- Hey, can you break this?
- Good for security; good for games
- Durability testing appropriate for some environments: an ATM in a high-crime area; the military; students (they can't resist a challenge)

Slide 15: Competitive Testing - 1
- Consumers Union / Road & Track style: take several products into the lab and have a "shoot-out"
- Expensive
- Takes skill (like a movie review): depends on the criteria; depends on good and representative judgment

Slide 16: Competitive Testing - 2
- Major limitations: limited coverage of features; depends on initial user experiences

Slide 17: Surveys
- Tricky business; can lead to nearly any conclusion: population selection, question choices, sample size, leading questions and other bias
- Negative bias: users with complaints are more likely to respond
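One way to make the sample-size concern concrete: the standard formula n = z^2 * p(1-p) / e^2 estimates how many respondents are needed to measure a proportion within a given margin of error. The sketch below assumes 95% confidence and a margin of +/- 5 percentage points; all numbers are illustrative.

```python
# Back-of-the-envelope survey sample size for estimating a proportion.
import math

z = 1.96      # z-score for 95% confidence
p = 0.5       # assumed proportion (0.5 is the most conservative choice)
e = 0.05      # desired margin of error (+/- 5 percentage points)

n = math.ceil(z**2 * p * (1 - p) / e**2)
print(f"Need about {n} respondents")  # roughly 385
```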

Slide 18: Online Surveys
- More issues...

Slide 19: Conclusions
- HCI is a new game, not an exact science
- Old methods are not entirely applicable
- Need newer, faster, lightweight, flexible, informal, subjective, intelligent approaches

Slide 20: Recommendations
- Use good judgment
- Trust good judgment: yours, and that of others whom you trust
- Be open to criticism and suggestion

Slide 21: Interpretation
- What is being said? What is the real issue? What is the real fix?
- RSI is a problem: pain; keyboard or mouse? Need different devices, or speech, or simply a better mouse and keyboard?

Slide 22: acm.org/~perlman/question.cgi
- Please rate the usability of the system. Try to respond to all the items. For items that are not applicable, use NA.
- Make sure the "System" and "To" fields are filled in.
- Add a comment about an item by clicking on its icon, or add comment fields for all items by clicking on Comment All.
- List the most negative aspect(s); list the most positive aspect(s).

Slide 23: acm.org/~perlman/question.cgi
- To mail in your results, click on Mail Data.
- Optionally provide comments and your address in the box.

Slide 24: acm.org/~perlman/question.cgi
Each item is rated from "strongly disagree" to "strongly agree", or NA:
1. Overall, I am satisfied with how easy it is to use this system
2. It was simple to use this system
3. I can effectively complete my work using this system
4. I am able to complete my work quickly using this system
5. I am able to efficiently complete my work using this system
6. I feel comfortable using this system

Slide 25: RHS of Each Row
- Each row ends with an NA option and a scale running from "strongly disagree" to "strongly agree"

Slide 26: acm.org/~perlman/question.cgi
7. It was easy to learn to use this system
8. I believe I became productive quickly using this system
9. The system gives error messages that clearly tell me how to fix problems
10. Whenever I make a mistake using the system, I recover easily and quickly
11. The information (such as online help, on-screen messages, and other documentation) provided with this system is clear
12. It is easy to find the information I needed

Slide 27: acm.org/~perlman/question.cgi
13. The information provided for the system is easy to understand
14. The information is effective in helping me complete the tasks and scenarios
15. The organization of information on the system screens is clear
16. The interface of this system is pleasant
17. I like using the interface of this system
18. This system has all the functions and capabilities I expect it to have
19. Overall, I am satisfied with this system
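Questionnaire items like these are typically scored by averaging the Likert responses per item while dropping NA answers. Below is a minimal Python sketch under those assumptions (a 1-7 scale with strongly disagree = 1 and strongly agree = 7, and None standing in for NA); the item keys and responses are invented for illustration.

```python
# Sketch: aggregating Likert-scale questionnaire responses (illustrative).
import statistics

responses = {
    "satisfied_overall": [6, 7, 5, None, 6],
    "simple_to_use":     [5, 6, 4, 5, None],
    "clear_error_msgs":  [3, 2, None, 4, 3],
}

for item, scores in responses.items():
    valid = [s for s in scores if s is not None]  # drop NA before averaging
    print(f"{item}: mean {statistics.mean(valid):.2f} (n={len(valid)})")
```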

Slide 28: acm.org/~perlman/question.cgi
- List the most negative aspect(s)
- List the most positive aspect(s)

END: HCI Validation (Utah School of Computing)