Chapter 15: Analytical evaluation

Inspections
- Heuristic evaluation
- Walkthroughs
Reviewers tend to use guidelines, heuristics and checklists as they carry out inspections. Here are some you can count on your hands!

Heuristic evaluation
Developed by Jakob Nielsen in the early 1990s. Based on heuristics distilled from an empirical analysis of 249 usability problems.

Nielsen's heuristics
1. Visibility of system status.
2. Match between system and the real world.
3. User control and freedom.
4. Consistency and standards.
5. Error prevention.
6. Recognition rather than recall.
7. Flexibility and efficiency of use.
8. Aesthetic and minimalist design.
9. Help users recognize, diagnose, and recover from errors.
10. Help and documentation.

Other heuristics
As you can see, heuristics are not always right. They can also be seen as "biases" when you consider the ways in which they can be wrong.

Heuristics for Facebook
Sociability:
- Why should I join? Why should I come back?
- What are the rules of behavior?
- Safety
- Freedom of expression
Usability:
- How do I join?
- How do I use it? Is it easy to use?
- Can I leave the community?
- Is it easy to find information?

Discount evaluation Heuristic evaluation is referred to as discount evaluation when 5 evaluators are used. Empirical evidence suggests that on average 5 evaluators identify 75-80% of usability problems.
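Where does the 75-80% figure come from? A commonly cited model is Nielsen and Landauer's problem-discovery curve, found(i) = N(1 - (1 - lambda)^i), where lambda is the probability that a single evaluator finds a given problem. The sketch below is purely illustrative; the lambda values are typical figures from the literature, not numbers given on these slides.

```python
def proportion_found(evaluators: int, lam: float) -> float:
    """Nielsen & Landauer problem-discovery curve: expected proportion of
    usability problems found by `evaluators` independent evaluators,
    where `lam` is the probability a single evaluator finds a problem."""
    return 1 - (1 - lam) ** evaluators

# With lambda in the commonly reported range of roughly 0.25-0.31,
# five evaluators find on the order of 75-85% of the problems:
for lam in (0.25, 0.31):
    print(f"lambda={lam}: {proportion_found(5, lam):.0%}")
```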

Three stages for doing heuristic evaluation
1. Briefing session to tell experts what to do.
2. Evaluation period of 1-2 hours in which:
- each expert works separately;
- one pass is taken to get a feel for the product;
- a second pass focuses on specific features.
3. Debriefing session in which experts work together to prioritize problems.
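The slides do not prescribe a recording format; as a purely illustrative sketch (the field names and the 0-4 severity scale are assumptions, loosely modelled on Nielsen's severity ratings), findings from the evaluation period could be captured per evaluator and then merged and ranked at the debriefing:

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Problem:
    evaluator: str     # who found it
    heuristic: str     # e.g. "Visibility of system status"
    description: str   # where and how the heuristic is violated
    severity: int      # hypothetical 0-4 scale (4 = usability catastrophe)

def debrief(problems: list[Problem]) -> list[tuple[str, int, int]]:
    """Group duplicate findings by description and rank them so the
    debriefing session can prioritize: highest severity first, then
    by how many evaluators independently reported the problem."""
    grouped: dict[str, list[Problem]] = defaultdict(list)
    for p in problems:
        grouped[p.description].append(p)
    ranked = [
        (desc, max(p.severity for p in ps), len({p.evaluator for p in ps}))
        for desc, ps in grouped.items()
    ]
    return sorted(ranked, key=lambda r: (r[1], r[2]), reverse=True)
```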

Advantages and problems
- Few ethical and practical issues.
- It can be difficult and expensive to find experts.
- The best experts have knowledge of the application domain and its users.
- Biggest problems:
  - important problems may get missed;
  - many trivial problems are often identified;
  - experts have biases.

Cognitive walkthroughs
- Focus on ease of learning.
- The designer presents an aspect of the design and usage scenarios.
- The expert is told the assumptions about user population, context of use, and task details.
- One or more experts walk through the design prototype with the scenario.

The 3 questions
1. Will the correct action be sufficiently evident to the user?
2. Will the user notice that the correct action is available?
3. Will the user associate and interpret the response from the action correctly?
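A convenient way to record a walkthrough is to note, for each step in the scenario, the answer to each of the three questions together with the reasoning behind it. The structure below is a hypothetical sketch, not something prescribed by the method:

```python
from dataclasses import dataclass

@dataclass
class WalkthroughStep:
    action: str                # what the user is supposed to do at this step
    evident: bool              # Q1: is the correct action sufficiently evident?
    noticeable: bool           # Q2: will the user notice the action is available?
    feedback_understood: bool  # Q3: will the user interpret the response correctly?
    notes: str = ""            # evidence and reasoning for the answers

def learnability_problems(steps: list[WalkthroughStep]) -> list[WalkthroughStep]:
    """Any step with a 'no' answer flags a potential learnability problem."""
    return [s for s in steps
            if not (s.evident and s.noticeable and s.feedback_understood)]
```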

Pluralistic walkthrough Variation on the cognitive walkthrough theme. Performed by a carefully managed team.

Predictive models
- Provide a way of evaluating products or designs without directly involving users.
- Less expensive than user testing.
- Usefulness limited to systems with predictable tasks, e.g. telephone answering systems and mobile phones.
- Based on expert error-free behavior.

GOMS (Card, Moran and Newell)
- Goals
- Operators
- Methods
- Selection rules
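As a small, hypothetical illustration (the task and its decomposition are not from the slides), a GOMS analysis of "delete a word in a text editor" could be written down like this:

```python
# Illustrative GOMS decomposition (hypothetical example, not from the slides).
goms = {
    "goal": "Delete a word in a text editor",
    "methods": {
        "menu_method": [       # operators for the menu-based method
            "point to word", "double-click to select",
            "point to Edit menu", "click", "point to Cut", "click",
        ],
        "keyboard_method": [   # operators for the shortcut-based method
            "point to word", "double-click to select", "press Ctrl+X",
        ],
    },
    # Selection rule: which method an expert user chooses, and when.
    "selection_rule": "If the hand is already on the mouse, use menu_method; "
                      "otherwise use keyboard_method.",
}
```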

Keystroke level model
The keystroke level model allows predictions to be made about how long it takes an expert user to perform a task.
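A task is broken into a sequence of standard operators, each with an empirically measured time; summing them gives the predicted expert, error-free execution time. The operator times below are the commonly cited averages from Card, Moran and Newell's work; the example task at the end is hypothetical:

```python
# Commonly cited average KLM operator times (seconds), after Card, Moran & Newell.
OPERATOR_TIMES = {
    "K": 0.20,   # keystroke or button press (average skilled typist)
    "P": 1.10,   # point with a mouse to a target on screen
    "H": 0.40,   # home the hand between keyboard and mouse
    "M": 1.35,   # mental preparation for a physical action
}

def predict_time(operators: str) -> float:
    """Predicted expert, error-free execution time for a sequence of operators."""
    return sum(OPERATOR_TIMES[op] for op in operators)

# Hypothetical example: move hand to mouse, think, point at a field, click,
# home back to the keyboard, think, then type a 4-digit number.
print(f"{predict_time('HMPK' + 'HM' + 'KKKK'):.2f} s")
```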

Fitts' Law (Fitts, 1954)
Fitts' Law predicts that the time to point at an object using a device is a function of the distance to the target object and the object's size.
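A widely used form is the Shannon formulation, T = a + b * log2(D/W + 1), where D is the distance to the target, W is its width, and a and b are device-specific constants found by regression. The sketch below uses purely illustrative values for a and b:

```python
import math

def fitts_time(distance: float, width: float,
               a: float = 0.1, b: float = 0.15) -> float:
    """Predicted pointing time (seconds) using the Shannon formulation of
    Fitts' Law: T = a + b * log2(D/W + 1). The constants a and b are
    device-specific and found by regression; the defaults here are
    purely illustrative, not measured values."""
    index_of_difficulty = math.log2(distance / width + 1)  # in bits
    return a + b * index_of_difficulty

# A distant, small target takes longer to acquire than a near, large one.
print(fitts_time(distance=800, width=20))   # small, far target
print(fitts_time(distance=100, width=100))  # large, near target
```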