1 CS 430 / INFO 430 Information Retrieval Lecture 24 Usability 2

2 Course Administration

3 Assignment 3
Grades were returned yesterday.

4 The Design/Evaluate Process
[Diagram: an iterative cycle from start to release]
Requirements (needs of users and other stakeholders)
Design (creative application of design principles)
Implementation (may be prototype)
Evaluation

5 Usability Factors in Searching: User Interface
Design of an interface for a simple fielded search.
Interface: fill-in boxes, text string, ...? Presentation of results...? Manipulation of results...?
Functions: specify field(s), content, operators, ...? Retain results for manipulation...? Query options...?
Data: metadata formats...? Data structures and file structures...?
Systems: performance...?
How do we evaluate various designs?

6 Usability Factors in Searching: Ordering of Results
The order in which the hits are presented to the user:
Ranked by similarity of match (e.g., term weighting)
Sorted by a specified field (e.g., date)
Ranked by importance of document as calculated by some algorithm (e.g., Google PageRank)
Duplicates shown separately or merged into a single record
Filters and other user options
What impact do these choices have on usability? How do we evaluate various designs? (See the sketch below.)
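A minimal sketch of three of these orderings in Python; the Hit record, its fields, and all sample values are hypothetical, for illustration only.

```python
from dataclasses import dataclass

@dataclass
class Hit:
    doc_id: str
    score: float   # similarity of match, e.g., from term weighting
    date: str      # ISO date string, so lexicographic order is date order
    checksum: str  # used here to detect duplicate documents

hits = [
    Hit("d1", 0.82, "2004-11-18", "a1"),
    Hit("d2", 0.91, "2003-05-02", "b2"),
    Hit("d3", 0.91, "2004-01-09", "b2"),  # duplicate of d2 (same checksum)
]

# Ranked by similarity of match (highest score first)
by_score = sorted(hits, key=lambda h: h.score, reverse=True)

# Sorted by a specified field (newest date first)
by_date = sorted(hits, key=lambda h: h.date, reverse=True)

# Duplicates merged into a single record (keep the highest-scoring copy)
merged = {}
for h in by_score:
    merged.setdefault(h.checksum, h)

print([h.doc_id for h in by_score])         # ['d2', 'd3', 'd1']
print([h.doc_id for h in by_date])          # ['d1', 'd3', 'd2']
print([h.doc_id for h in merged.values()])  # ['d2', 'd1']
```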

7 Evaluation: What is Usability?
Usability comprises the following aspects (from ISO 9241-11):
Effectiveness – the accuracy and completeness with which users achieve specified goals. Measures: quality of solution, error rates.
Efficiency – the resources expended in relation to the accuracy and completeness with which goals are achieved. Measures: task completion time, learning time, number of clicks.
Satisfaction – the users' comfort with and positive attitudes towards the use of the system. Measures: attitude rating scales.
(A sketch of computing such measures follows.)
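A minimal sketch of one measure of each kind, computed from test sessions; the session records and all values are invented for illustration.

```python
from statistics import mean

# One record per user/task session: success, elapsed time, clicks, and a
# 1-5 satisfaction rating. All values are invented.
sessions = [
    {"success": True,  "seconds": 95,  "clicks": 7,  "rating": 4},
    {"success": True,  "seconds": 140, "clicks": 12, "rating": 3},
    {"success": False, "seconds": 300, "clicks": 25, "rating": 2},
]

# Effectiveness: proportion of sessions where the goal was achieved
effectiveness = sum(s["success"] for s in sessions) / len(sessions)

# Efficiency: resources (time, clicks) expended on successful sessions
done = [s for s in sessions if s["success"]]
mean_time = mean(s["seconds"] for s in done)
mean_clicks = mean(s["clicks"] for s in done)

# Satisfaction: mean attitude rating on the scale
satisfaction = mean(s["rating"] for s in sessions)

print(f"effectiveness {effectiveness:.0%}, time {mean_time:.0f}s, "
      f"clicks {mean_clicks:.1f}, satisfaction {satisfaction:.1f}/5")
```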

8 Evaluation
The process of determining the worth of, or assigning a value to, the usability on the basis of careful examination and judgment:
Making sure that a system is usable before launching it
Iterative improvements after launch
Categories of evaluation methods:
– Analytical evaluation: without users
– Empirical evaluation: with users
– Measurements of operational systems

9 Evaluation without Users
Assessing systems using established theories and methods. Evaluation techniques:
Heuristic Evaluation (Nielsen, 1994) – evaluate the design against usability "rules of thumb"
Cognitive Walkthrough (Wharton et al., 1994) – a formalized way of imagining people's thoughts and actions when they use the interface for the first time
Claims Analysis – based on scenario-based analysis; generating positive and negative claims about the effects of features on the user

10 Evaluation with Users
Testing the system, not the users!
Stages of evaluation with users:
Preparation
Conducting the sessions
Analysis of results
User testing is time-consuming and expensive.

11 Evaluation with Users: Preparation
Determine the goals of the usability testing, e.g., "The user can find the required information in no more than 2 minutes."
Write the user tasks, e.g., "Answer the question: how hot is the sun?"
Recruit participants: use the descriptions of users from the requirements phase to identify potential users.

12 Usability Laboratory
Concept: monitor users while they use the system.
[Diagram: evaluators watch the user through a one-way mirror]

13 Evaluation with Users: Conducting the Sessions
Conduct the session:
– Usability lab
– Simulated working environment
Observe the user:
– Human observer(s)
– Video camera
– Audio recording
Collect satisfaction data.

14 Evaluation with Users: Analysis of Results
If possible, use statistical summaries (a sketch follows below).
Pay close attention to areas where users:
– were frustrated
– took a long time
– couldn't complete tasks
Respect the data and users' responses; don't make excuses for designs that failed.
Note designs that worked and make sure they're incorporated in the final product.
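A minimal sketch of such a statistical summary: per-task completion rates and times, flagging tasks where users struggled. The tasks, observations, and 80% threshold are all invented.

```python
from statistics import median

# (task, completed?, seconds) triples from a hypothetical test
observations = [
    ("find article", True, 75), ("find article", True, 90),
    ("find article", False, 240),
    ("save search", True, 40), ("save search", True, 55),
]

tasks = {}
for task, done, secs in observations:
    tasks.setdefault(task, []).append((done, secs))

for task, results in tasks.items():
    rate = sum(done for done, _ in results) / len(results)
    # median time over successful attempts (each task here has at least one)
    times = [secs for done, secs in results if done]
    flag = "  <-- investigate" if rate < 0.8 else ""  # assumed threshold
    print(f"{task}: {rate:.0%} completed, median time {median(times):.0f}s{flag}")
```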

15 Measurements on Operational Systems
Analysis of system logs (see the sketch below):
Which user interface options were used?
When was the help system used?
What errors occurred and how often?
Which hyperlinks were followed (click-through data)?
Human feedback:
Complaints and praise
Bug reports
Requests made to customer service
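A minimal sketch of extracting these measurements from a system log. The log format (timestamp, action, key=value detail) is hypothetical; a real system's logs would need their own parser.

```python
from collections import Counter

log_lines = [
    "2004-11-18T10:02:11 click rank=1",
    "2004-11-18T10:05:43 click rank=3",
    "2004-11-18T10:06:02 help",
    "2004-11-18T10:09:27 click rank=1",
    "2004-11-18T10:12:55 error code=timeout",
]

clicks_by_rank = Counter()  # click-through data: which hits were followed
help_uses = 0               # how often the help system was used
errors = Counter()          # what errors occurred and how often

for line in log_lines:
    _ts, action, *detail = line.split()
    if action == "click":
        clicks_by_rank[detail[0].split("=")[1]] += 1
    elif action == "help":
        help_uses += 1
    elif action == "error":
        errors[detail[0].split("=")[1]] += 1

print("clicks by rank:", dict(clicks_by_rank))  # {'1': 2, '3': 1}
print("help uses:", help_uses)                  # 1
print("errors:", dict(errors))                  # {'timeout': 1}
```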

16 The Search Explorer Application: Reconstruct a User Session

17 Refining the Design Based on Evaluation
Designers and evaluators need to work as a team.
Designers are poor evaluators of their own work, but know the requirements, constraints, and context of the design:
Some user problems can be addressed with small changes
Some user problems require major changes
Some user requests (e.g., lots of options) are incompatible with other requests (e.g., simplicity)
Do not allow evaluators to become designers, and vice versa.

18 Experiment on the Google Interface
Methodology:
10 information-seeking tasks in 2 categories
Users randomized across tasks
Click-through data to see what the user did
Eye-tracking data to see what the user viewed
Google results presented with ranks changed or reversed (see the sketch below)
An example of interdisciplinary information science research by Cornell's Human Computer Interaction Group and Computer Science Department.
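A minimal sketch of the rank manipulation: presenting the same ranked list unchanged, with ranks changed (here, the top two hits swapped, an assumed choice), or fully reversed.

```python
def present(results, condition="normal"):
    """Return the result list in the order shown to the participant."""
    results = list(results)  # copy, so the true ranking is untouched
    if condition == "reversed":
        results.reverse()    # last-ranked hit shown first
    elif condition == "swapped":
        results[0], results[1] = results[1], results[0]  # swap top two hits
    return results

ranked = [f"hit{r}" for r in range(1, 6)]
print(present(ranked))              # ['hit1', 'hit2', 'hit3', 'hit4', 'hit5']
print(present(ranked, "swapped"))   # ['hit2', 'hit1', 'hit3', 'hit4', 'hit5']
print(present(ranked, "reversed"))  # ['hit5', 'hit4', 'hit3', 'hit2', 'hit1']
```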

19 Evaluation Example: Eye Tracking

20 Evaluation Example: Eye Tracking

21 Google Evaluation: Click-Through Data
[Chart: number of users who clicked on a link, by rank of hit]

22 Google Evaluation: Eye-Tracking Data
[Chart: number of users who viewed the short record before the first click, by rank of hit]

23 Google Evaluation: Eye-Tracking Data
Part of the short record viewed before the first click (% of users):
Title: 17.4%
Snippet: 42.1%
Category: 1.9%
URL: 30.4%
Other: 8.2% (includes cached, similar pages, description)

24 Google Experiment: Click-Through Data with Ranks Reversed
[Chart: percentage of users who clicked on a link, by rank of hit]

