1 CS 430 / INFO 430 Information Retrieval Lecture 24 Usability 2

2 Course Administration

3 Assignment 3 Grades were returned yesterday

4 The Design/Evaluate Process
An iterative cycle, from start to release:
Requirements (needs of users and other stakeholders)
Design (creative application of design principles)
Implementation (may be a prototype)
Evaluation

5 Usability Factors in Searching: User Interface
Design of an interface for a simple fielded search:
Interface: fill-in boxes, text string, ...? Presentation of results...? Manipulation of results...?
Functions: specify field(s), content, operators, ...? Retain results for manipulation...? Query options...?
Data: metadata formats...? Data structures and file structures...?
Systems: performance...?
How do we evaluate the various designs?

6 Usability Factors in Searching: Ordering of Results
The order in which the hits are presented to the user:
Ranked by similarity of match (e.g., term weighting)
Sorted by a specified field (e.g., date)
Ranked by importance of the document as calculated by some algorithm (e.g., Google PageRank)
Duplicates shown separately or merged into a single record
Filters and other user options
What impact do these choices have on usability? How do we evaluate the various designs?
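A minimal sketch in Python of the first three orderings; the Hit record and its field names are illustrative assumptions, not from the lecture:

    from dataclasses import dataclass

    @dataclass
    class Hit:
        doc_id: str
        score: float       # similarity from term weighting
        date: str          # ISO date string, e.g. "2007-03-15"
        importance: float  # e.g. a PageRank-style score

    hits = [Hit("d1", 0.82, "2006-11-02", 0.40),
            Hit("d2", 0.91, "2005-06-30", 0.10),
            Hit("d3", 0.75, "2007-01-12", 0.95)]

    by_similarity = sorted(hits, key=lambda h: h.score, reverse=True)       # term weighting
    by_date = sorted(hits, key=lambda h: h.date, reverse=True)              # specified field
    by_importance = sorted(hits, key=lambda h: h.importance, reverse=True)  # PageRank-style

Each ordering is a one-line policy change, which is exactly why the usability question is about the choice, not the implementation.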

7 Evaluation
What is usability? Usability comprises the following aspects:
Effectiveness – the accuracy and completeness with which users achieve certain goals. Measures: quality of solution, error rates.
Efficiency – the resources expended in relation to the effectiveness achieved. Measures: task completion time, learning time, number of clicks.
Satisfaction – the users' comfort with, and positive attitudes towards, the use of the system. Measures: attitude rating scales.
(From the ISO definition of usability, ISO 9241-11.)
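As an illustration only (the session records and the 1-5 satisfaction scale are assumptions, not from the lecture), the three kinds of measures might be computed from test data like this:

    # Hypothetical session records from a usability test
    sessions = [
        {"completed": True,  "errors": 1, "seconds": 95,  "rating": 4},
        {"completed": True,  "errors": 0, "seconds": 60,  "rating": 5},
        {"completed": False, "errors": 3, "seconds": 180, "rating": 2},
    ]

    n = len(sessions)
    completion_rate = sum(s["completed"] for s in sessions) / n  # effectiveness
    mean_errors = sum(s["errors"] for s in sessions) / n         # effectiveness
    mean_seconds = sum(s["seconds"] for s in sessions) / n       # efficiency
    mean_rating = sum(s["rating"] for s in sessions) / n         # satisfaction (1-5 scale)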

8 Evaluation
The process of determining the worth of, or assigning a value to, the usability on the basis of careful examination and judgment:
Making sure that a system is usable before launching it
Iterative improvements after launch
Categories of evaluation methods:
– Analytical evaluation: without users
– Empirical evaluation: with users
– Measurements of operational systems

9 Evaluation without Users
Assessing systems using established theories and methods. Evaluation techniques:
Heuristic Evaluation (Nielsen, 1994)
– Evaluate the design using "rules of thumb"
Cognitive Walkthrough (Wharton et al., 1994)
– A formalized way of imagining people's thoughts and actions when they use the interface for the first time
Claims Analysis – based on scenario-based analysis
– Generating positive and negative claims about the effects of features on the user

10 Evaluation with Users
Testing the system, not the users!
Stages of evaluation with users:
Preparation
Conducting the sessions
Analysis of results
User testing is time-consuming and expensive.

11 Evaluation with Users: Preparation
Determine the goals of the usability testing, e.g., "The user can find the required information in no more than 2 minutes."
Write the user tasks, e.g., "Answer the question: how hot is the sun?"
Recruit participants: use the descriptions of users from the requirements phase to identify potential users.
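One way to keep such goals checkable is to pair each task with its measurable target, as in this hypothetical sketch (the structure and names are assumptions, not part of the lecture):

    # A user task paired with the measurable goal it tests
    task = {
        "question": "How hot is the sun?",
        "goal": "Find the required information in no more than 2 minutes",
        "time_limit_seconds": 120,
    }

    def goal_met(found_answer: bool, seconds_taken: float) -> bool:
        """The goal is met if the answer was found within the time limit."""
        return found_answer and seconds_taken <= task["time_limit_seconds"]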

12 Usability Laboratory
Concept: monitor users while they use the system.
(Diagram: evaluators observe the user through a one-way mirror.)

13 Evaluation with Users: Conducting the Sessions
Conduct the session
– Usability lab
– Simulated working environment
Observe the user
– Human observer(s)
– Video camera
– Audio recording
Collect satisfaction data

14 Evaluation with Users: Analysis of Results
If possible, use statistical summaries.
Pay close attention to areas where users:
– were frustrated
– took a long time
– could not complete tasks
Respect the data and the users' responses; do not make excuses for designs that failed.
Note designs that worked, and make sure they are incorporated in the final product.
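For instance, a simple per-task summary might be computed as below; the result tuples and the flagging thresholds are illustrative assumptions:

    from statistics import mean, median

    # Hypothetical observations: (task_id, seconds_taken, completed)
    results = [("t1", 45, True), ("t1", 50, True), ("t1", 300, False),
               ("t2", 80, True), ("t2", 95, True), ("t2", 70, True)]

    by_task = {}
    for task_id, seconds, completed in results:
        by_task.setdefault(task_id, []).append((seconds, completed))

    for task_id, rows in sorted(by_task.items()):
        times = [s for s, _ in rows]
        failure_rate = sum(1 for _, c in rows if not c) / len(rows)
        print(task_id, "median time:", median(times), "mean time:", round(mean(times), 1),
              "failure rate:", round(failure_rate, 2))
        if failure_rate > 0.2 or median(times) > 120:  # illustrative thresholds
            print("  -> users struggled here; examine this task closely")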

15 Measurements on Operational Systems
Analysis of system logs:
Which user interface options were used?
When was the help system used?
What errors occurred, and how often?
Which hyperlinks were followed (click-through data)?
Human feedback:
Complaints and praise
Bug reports
Requests made to customer service
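A sketch of the kind of tally such log analysis involves; the log format here is invented for illustration:

    from collections import Counter

    # Hypothetical log lines: "<timestamp> <action> <detail>"
    log_lines = [
        "2007-04-02T10:01 click rank=1",
        "2007-04-02T10:03 click rank=3",
        "2007-04-02T10:04 help opened",
        "2007-04-02T10:05 error query_timeout",
        "2007-04-02T10:06 click rank=1",
    ]

    clicks_by_rank, errors = Counter(), Counter()
    help_uses = 0
    for line in log_lines:
        _, action, detail = line.split(" ", 2)
        if action == "click":
            clicks_by_rank[int(detail.split("=")[1])] += 1  # click-through by rank
        elif action == "help":
            help_uses += 1
        elif action == "error":
            errors[detail] += 1

    print(clicks_by_rank, help_uses, errors)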

16 The Search Explorer Application: Reconstructing a User Session

17 Refining the Design Based on Evaluation
Designers and evaluators need to work as a team. Designers are poor evaluators of their own work, but know the requirements, constraints, and context of the design:
Some user problems can be addressed with small changes.
Some user problems require major changes.
Some user requests (e.g., lots of options) are incompatible with other requests (e.g., simplicity).
Do not allow evaluators to become designers, and vice versa.

18 Experiment on the Google Interface
Methodology:
10 information-seeking tasks in 2 categories
Users randomized across tasks
Click-through data to see what the user did
Eye-tracking data to see what the user viewed
Google results presented with ranks changed or reversed
An example of interdisciplinary information science research by Cornell's Human-Computer Interaction Group and Computer Science Department.
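A hedged sketch of the rank-manipulation part of such a design; the condition names and the per-user assignment scheme are assumptions, and the actual study's protocol may differ:

    import random

    def present(results, condition):
        """Order the engine's ranked results for display under a condition."""
        if condition == "reversed":
            return list(reversed(results))  # ranks reversed
        if condition == "swapped":
            out = list(results)             # e.g. exchange hits 1 and 2
            if len(out) >= 2:
                out[0], out[1] = out[1], out[0]
            return out
        return list(results)                # normal ranking

    tasks = ["task%d" % i for i in range(1, 11)]  # 10 information-seeking tasks

    def assignment(user_id):
        rng = random.Random(user_id)        # reproducible per-user randomization
        order = tasks[:]
        rng.shuffle(order)                  # randomize users across tasks
        return [(t, rng.choice(["normal", "swapped", "reversed"])) for t in order]

Comparing click behavior across conditions reveals how much users trust the presented rank order, independent of result quality.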

19 Evaluation Example: Eye Tracking

20 Evaluation Example: Eye Tracking

21 Google Evaluation: Click-Through Data
(Chart: number of users who clicked on a link, plotted against the rank of the hit.)

22 Google Evaluation: Eye-Tracking Data
(Chart: number of users who viewed the short record before the first click, plotted against the rank of the hit.)

23 Google Evaluation: Eye-Tracking Data
Part of the short record viewed before the first click (% of users):
Title: 17.4%
Snippet: 42.1%
URL: 30.4%
Category: 1.9%
Other: 8.2% (includes cached, similar pages, description)

24 Google Experiment: Click-Through Data with Ranks Reversed
(Chart: percentage of users who clicked on a link, plotted against the rank of the hit, with the ranks reversed.)