Computer Science Department California Polytechnic State University San Luis Obispo, CA, U.S.A. Franz J. Kurfess CPE/CSC 484: User-Centered Design and Development


2 © Franz J. Kurfess Logistics
❖ Assignments
 A4 - Data Collection trial runs complete
 A4 presentation schedule: Team 3 today, rest Thu during the lab time
 A5 - Usability Evaluation: selection of systems to evaluate is on the Project Teams Web page
❖ Guest presentation on Universal Design and ADA compliance on Thu, May 17
 Trey Duffy, John Lee, Disability Resource Center
❖ HCI Lab Opening Ceremony on Thu, May 31, 9: :00 am
 poster boards, demos from 484 teams?
❖ CEng Project Fair Thu, May 31, 4:00 - 7:00 pm
 final project displays
❖ CSC IAB presentations Fri, June 1, 10: :00
 20 min project presentations

© Franz J. Kurfess Copyright Notice These slides are a revised version of the originals provided with the book “Interaction Design” by Jennifer Preece, Yvonne Rogers, and Helen Sharp (Wiley). I added some material, made some minor modifications, and created a custom show to select a subset.  Slides added or modified by me use a different style, with my name at the bottom.

© Franz J. Kurfess Chapter Overview
user testing and evaluations
experiments
– variables and conditions
– data collection and analysis
predictive models
– GOMS
– keystroke level model
FJK 2009

5 © Franz J. Kurfess FJK 2008 Motivation
❖ user modeling tries to predict user performance for tasks performed on a system
❖ heuristic evaluations and walkthroughs can provide quick feedback without the overhead of user testing

6 © Franz J. Kurfess FJK 2005 Objectives
❖ know the advantages and disadvantages of analytical evaluation
 become familiar with the GOMS user model, the keystroke level model, and Fitts’ law
 know how to do a keystroke level analysis
❖ understand the heuristic evaluation and walkthrough methods
❖ know how heuristic evaluation can be adapted to evaluate different products
 determine when these techniques can be applied

©2011 Analytical evaluation Chapter 15

©2011 Aims: Describe the key concepts associated with inspection methods. Explain how to do heuristic evaluation and walkthroughs. Explain the role of analytics in evaluation. Describe how to perform two types of predictive methods, GOMS and Fitts’ Law.

©2011 Inspections Several kinds. Experts use their knowledge of users & technology to review software usability. Expert critiques (crits) can be formal or informal reports. Heuristic evaluation is a review guided by a set of heuristics. Walkthroughs involve stepping through a pre-planned scenario noting potential problems.

©2011 Heuristic evaluation Developed by Jakob Nielsen in the early 1990s. Based on heuristics distilled from an empirical analysis of 249 usability problems. These heuristics have been revised for current technology. Heuristics are being developed for mobile devices, wearables, virtual worlds, etc. Design guidelines form a basis for developing heuristics.

©2011 Nielsen’s original heuristics Visibility of system status. Match between system and real world. User control and freedom. Consistency and standards. Error prevention. Recognition rather than recall. Flexibility and efficiency of use. Aesthetic and minimalist design. Help users recognize, diagnose, and recover from errors. Help and documentation.

©2011 Discount evaluation Heuristic evaluation is referred to as discount evaluation when 5 evaluators are used. Empirical evidence suggests that on average 5 evaluators identify 75-80% of usability problems.

©2011 No. of evaluators & problems
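The evaluators-vs-problems curve behind the 75-80% figure comes from Nielsen & Landauer's problem-discovery model, found(n) = 1 - (1 - λ)^n, where λ is the chance that a single evaluator detects a given problem. The λ value of 0.31 below is the average they report across studies; treat it as an illustrative assumption, since λ varies by project.

```python
# Sketch of Nielsen & Landauer's problem-discovery model:
#   found(n) = 1 - (1 - lam)^n
# lam is the per-evaluator detection rate; 0.31 is an average reported
# across their studies, used here as an assumption, not a constant.

def proportion_found(n_evaluators, lam=0.31):
    """Expected proportion of usability problems found by n evaluators."""
    return 1 - (1 - lam) ** n_evaluators

if __name__ == "__main__":
    for n in (1, 3, 5, 10, 15):
        print(f"{n:2d} evaluators -> {proportion_found(n):.0%} of problems")
```

Note how quickly the curve flattens: each additional evaluator mostly rediscovers problems already found, which is the economic argument for stopping at about five.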

©2011 3 stages for doing heuristic evaluation Briefing session to tell experts what to do. Evaluation period of 1-2 hours in which: –Each expert works separately; –Take one pass to get a feel for the product; –Take a second pass to focus on specific features. Debriefing session in which experts work together to prioritize problems.

©2011 Advantages and problems Few ethical & practical issues to consider because users are not involved. Can be difficult & expensive to find experts. The best experts have knowledge of the application domain & users. Biggest problems: –Important problems may get missed; –Many trivial problems are often identified; –Experts have biases.

©2011 Heuristics for websites focus on key criteria (Budd, 2007)
Clarity
Minimize unnecessary complexity & cognitive load
Provide users with context
Promote a positive & pleasurable user experience

©2011 Cognitive walkthroughs Focus on ease of learning. Designer presents an aspect of the design & usage scenarios. Expert is told the assumptions about user population, context of use, task details. One or more experts walk through the design prototype with the scenario. Experts are guided by 3 questions.

©2011 The 3 questions Will the correct action be sufficiently evident to the user? Will the user notice that the correct action is available? Will the user associate and interpret the response from the action correctly? As the experts work through the scenario they note problems.

©2011 Pluralistic walkthrough Variation on the cognitive walkthrough theme. Performed by a carefully managed team. The panel of experts begins by working separately. Then there is managed discussion that leads to agreed decisions. The approach lends itself well to participatory design.

©2011 A project for you …
– provides heuristics and a template so that you can evaluate different kinds of systems.
– More information about this is provided in the interactivities section of the id-book.com website.

©2011 Analytics A method for evaluating user traffic through a system or part of a system. Many examples, including Google Analytics and Visistat. The Visistat display shows times of day & visitor IP addresses.

©2011 Social action analysis (Perer & Shneiderman, 2008)

©2011 Predictive models Provide a way of evaluating products or designs without directly involving users. Less expensive than user testing. Usefulness limited to systems with predictable tasks, e.g., telephone answering systems, mobile/cell phones, etc. Based on expert error-free behavior.

©2011 GOMS
Goals – what the user wants to achieve, e.g., find a website.
Operators – the cognitive processes & physical actions needed to attain the goals, e.g., decide which search engine to use.
Methods – the procedures to accomplish the goals, e.g., drag mouse over field, type in keywords, press the go button.
Selection rules – decide which method to select when there is more than one.
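The four GOMS components can be sketched as plain data: a goal, a set of methods (each a sequence of operators), and a selection rule that picks a method. Everything below, including the goal and method names, is a hypothetical illustration of the structure, not part of the original slides.

```python
# Hypothetical GOMS sketch for the goal "find a website".
# Methods are sequences of operators; a selection rule picks one.

goal = "find a website"

methods = {
    "use-search-engine": ["decide on keywords", "type keywords",
                          "press go", "scan results"],
    "type-url-directly": ["recall URL", "type URL", "press enter"],
}

def select_method(url_is_known):
    """Selection rule: type the URL if it is known, otherwise search."""
    return "type-url-directly" if url_is_known else "use-search-engine"

chosen = select_method(url_is_known=False)
print(f"Goal: {goal}")
print(f"Method: {chosen}")
for operator in methods[chosen]:
    print(" -", operator)
```

The point of the sketch is that a GOMS analysis is essentially this decomposition written out for every task the expert user performs.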

©2011 Keystroke Level Model (KLM)
– a quantitative model based on GOMS
– allows predictions to be made about how long it takes an expert user to perform a task
– only models time for keystrokes
– does not consider time to think about the task

©2011 Response times for keystroke level operators (Card et al., 1983)

©2011 Summing together
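Summing together can be sketched as adding up the standard operator times from the Card et al. (1983) table. The values below (K ≈ 0.2 s for an average skilled typist, P ≈ 1.1 s, B ≈ 0.1 s, H ≈ 0.4 s, M ≈ 1.35 s) are the commonly cited ones, used here as assumptions; the example task is illustrative.

```python
# Keystroke-Level Model sketch: predict expert, error-free task time by
# summing standard operator times (Card, Moran & Newell, 1983).
# The times are the commonly cited values; treat them as assumptions.

OPERATOR_TIME = {
    "K": 0.20,  # keystroke, average skilled typist
    "P": 1.10,  # point with a mouse at a target on screen
    "B": 0.10,  # press or release a mouse button
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def klm_time(sequence):
    """Total predicted time (seconds) for a sequence of KLM operators."""
    return sum(OPERATOR_TIME[op] for op in sequence)

# Example: mentally prepare, move hand to mouse, point at a text field,
# click (press + release), home back to keyboard, type a 4-letter word.
task = ["M", "H", "P", "B", "B", "H"] + ["K"] * 4
print(f"Predicted time: {klm_time(task):.2f} s")  # 4.25 s
```

Because the model assumes expert, error-free behavior, the sum is a lower bound on real novice performance; it is most useful for comparing two designs on the same task.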

©2011 Gaze Change Time Using keystroke level models (KLM) to calculate time to change gaze –(Holleis et al., 2007)

©2011 Fitts’ Law (Fitts, 1954) Fitts’ Law predicts that the time to point at an object using a device is a function of the distance from the target object & the object’s size. The further away & the smaller the object, the longer the time to locate it & point to it. Fitts’ Law is useful for evaluating systems for which the time to locate an object is important, e.g., a cell phone or a handheld device.
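In its commonly used Shannon formulation, Fitts' Law reads MT = a + b · log2(D/W + 1), where D is the distance to the target, W its width, and a, b are constants fitted empirically for a given device. The a and b values below are illustrative assumptions, not measured ones.

```python
import math

# Fitts' Law sketch (Shannon formulation):
#   MT = a + b * log2(D / W + 1)
# a and b are device-specific constants fitted from data; the defaults
# here are illustrative assumptions.

def movement_time(distance, width, a=0.1, b=0.15):
    """Predicted time (s) to point at a target of given distance/width."""
    return a + b * math.log2(distance / width + 1)

# The further away and the smaller the target, the longer it takes:
for d, w in [(100, 50), (100, 10), (500, 10)]:
    print(f"D={d:3d}, W={w:2d} -> {movement_time(d, w):.2f} s")
```

This is why large, close targets (e.g., screen edges and corners, which act as infinitely deep targets) are fast to hit, while small, distant controls are slow.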

©2011 A project for you …
– Use the web & other resources to research claims that heuristic evaluation often identifies problems that are not serious & may not even be problems.
– Decide whether you agree or disagree.
– Write a brief statement arguing your position.
– Provide practical evidence & evidence from the literature to support your position.

©2011 A Project for you …Fitts’ Law Visit Tog’s website and do Tog’s quiz, designed to give you fitts! 2DesignedToGiveFitts.html

©2011 Key points Inspections can be used to evaluate requirements, mockups, functional prototypes, or systems. User testing & heuristic evaluation may reveal different usability problems. Walkthroughs are focused, so they are suitable for evaluating small parts of a product. Analytics involves collecting data about users’ activity on a website or product. The GOMS and KLM models and Fitts’ Law can be used to predict expert, error-free performance for certain kinds of tasks.