CS305: HCI in SW Development Continuing Evaluation: Asking Experts Inspections and walkthroughs.

Overview

The aims  Describe how do heuristic evaluation & walkthroughs.  Discuss strengths & limitations of these techniques Old topics in this unit (not covered this year)  Discuss the role of interviews & questionnaires in evaluation.  Teach basic questionnaire design.  Describe how to collect, analyze & present data.

Overview: Evaluation By Experts
Several approaches:
– Heuristic evaluation
– Walkthroughs (several flavors)
In general:
– Inexpensive and quick compared to asking users ("discount evaluation")
– Experts may suggest solutions (users probably don't)

Asking experts
– Experts use their knowledge of users & technology to review software usability.
– Expert critiques ("crits") can be formal or informal reports.
– Heuristic evaluation is a review guided by a set of heuristics.
– Walkthroughs involve stepping through a pre-planned scenario, noting potential problems.

Heuristic evaluation
– Developed by Jakob Nielsen in the early 1990s.
– Based on heuristics distilled from an empirical analysis of 249 usability problems.
– These heuristics have been revised for current technology, e.g., HOMERUN for the web.
– Heuristics are still needed for mobile devices, wearables, virtual worlds, etc.
– Design guidelines form a basis for developing heuristics.

Nielsen's "general" heuristics (Remember these?)
– Visibility of system status
– Match between system and real world
– User control and freedom
– Consistency and standards
– Help users recognize, diagnose, recover from errors
– Error prevention
– Recognition rather than recall
– Flexibility and efficiency of use
– Aesthetic and minimalist design
– Help and documentation

Nielsen's HOMERUN
Derived from the general heuristics; more specific for commercial web sites:
– High-quality content
– Often updated
– Minimal download time
– Ease of use
– Relevant to users' needs
– Unique to the online medium
– Netcentric corporate culture

Discount evaluation
– Heuristic evaluation is referred to as discount evaluation when about 5 evaluators are used.
– Empirical evidence suggests that on average 5 evaluators identify 75-80% of usability problems.
– Note how similar this is to the results of "quick and dirty" user studies.
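The 5-evaluator rule of thumb comes from modeling the fraction of problems found as a function of the number of independent evaluators. A minimal sketch in Python, assuming the Nielsen & Landauer model 1 − (1 − λ)^i with the commonly cited λ ≈ 0.31 (the average fraction of all problems one evaluator finds); the exact λ varies by project:

```python
def problems_found(i, lam=0.31):
    """Expected fraction of usability problems found by i independent
    evaluators, each finding a fraction lam of all problems:
    1 - (1 - lam)^i (Nielsen & Landauer's model)."""
    return 1 - (1 - lam) ** i

# Diminishing returns: each added evaluator mostly re-finds known problems.
for i in (1, 3, 5, 10):
    print(f"{i} evaluator(s): {problems_found(i):.0%} of problems found")
```

With λ = 0.31 this gives roughly 84% for 5 evaluators, which is why small panels are considered a good cost/benefit trade-off.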

3 stages for doing heuristic evaluation
– Briefing session to tell experts what to do.
– Evaluation period of 1-2 hours in which:
  - Each expert works separately.
  - Experts take one pass to get a feel for the product.
  - They take a second pass to focus on specific features.
– Debriefing session in which experts work together to prioritize problems.
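The debriefing stage amounts to merging each expert's independent problem list and prioritizing. A hypothetical sketch (the expert names, problem descriptions, and ratings are invented; the 0-4 severity scale follows Nielsen's convention):

```python
from collections import defaultdict
from statistics import mean

# Each expert's independent pass yields (problem, severity 0-4) pairs.
expert_findings = {
    "expert_a": [("no undo on delete", 4), ("inconsistent button labels", 2)],
    "expert_b": [("no undo on delete", 3), ("jargon in error messages", 3)],
    "expert_c": [("inconsistent button labels", 2)],
}

# Debriefing: merge duplicate reports of the same problem.
merged = defaultdict(list)
for findings in expert_findings.values():
    for problem, severity in findings:
        merged[problem].append(severity)

# Prioritize worst-first by mean severity, then by how many experts saw it.
prioritized = sorted(
    ((mean(sevs), len(sevs), problem) for problem, sevs in merged.items()),
    reverse=True,
)
for sev, count, problem in prioritized:
    print(f"severity {sev:.1f} (found by {count} expert(s)): {problem}")
```

Problems found by several experts independently are usually the ones worth fixing first.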

Advantages and problems
– Few ethical & practical issues to consider.
– Can be difficult & expensive to find experts; the best experts have knowledge of both the application domain & its users.
– Biggest problems:
  - Important problems may get missed.
  - Many trivial problems are often identified.
– One study found, for each true problem, 1.2 false alarms and 0.6 missed problems.

Evaluating websites
– Heuristics like Nielsen's "general" list are less applicable for websites; hence Nielsen's HOMERUN.
– Important: heuristics become more useful when they become guidelines for development.
– Analysis of outcomes might lead to particular suggestions, e.g., for one set of published results: (1) layout of pages, (2) arrangement of topics, (3) depth of navigation.

Preece's Web Heuristics/Guidelines
– Navigation: (1) avoid long pages with wasted space, (2) provide navigation support like a site map, (3) good menu structures, (4) standard link colors, (5) consistent look and feel.
– Access: (1) avoid complex URLs, (2) avoid long download times.
– Information design (both content and comprehension): (1) avoid outdated or incomplete info, (2) good graphical design, (3) good use of color, (4) avoid gratuitous graphics and animation, (5) consistency.

Topic: Heuristics for… What are a good set of heuristics for a cellphone's UI?
– Status: status of call should be visible (call, connection, roaming, battery); mode (vibrate, etc.); unread text messages, voice mails.
– Navigation: one-button access to phonebook numbers; consistent navigation button for back, etc.
– Error prevention: prevent accidental button presses in pocket, backpack, purse.
– Efficiency

Overview: Walkthroughs
Like heuristic evaluation because:
– Experts are involved.
– Criteria are used to evaluate things.
Different because:
– Defining characteristic: they "walk through" one or more tasks.
– In addition to experts, they may involve designers and/or users.

Cognitive walkthroughs
– Focus on ease of learning.
– The designer presents an aspect of the design & usage scenarios.
– One or more experts walk through the design prototype with the scenario.
– Experts are told the assumptions about the user population, context of use, and task details.
– Experts are guided by 3 questions (on next slide).
– Disadvantages? Time-consuming, laborious, narrow focus (maybe that's OK).

The 3 questions
– Will the correct action be sufficiently evident to the user?
– Will the user notice that the correct action is available?
– Will the user associate and interpret the response from the action correctly?
As the experts work through the scenario, they note problems.
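One way to record a walkthrough is to step through the scenario's actions and log a problem whenever an expert answers "no" to one of the three questions. A minimal sketch (the task, actions, answers, and notes are invented for illustration):

```python
QUESTIONS = (
    "Will the correct action be sufficiently evident to the user?",
    "Will the user notice that the correct action is available?",
    "Will the user associate and interpret the response correctly?",
)

def walk_through(actions):
    """Record a problem for every 'no' answer at every step.

    actions: list of (action, (q1, q2, q3) booleans, note) tuples.
    Returns a list of (step number, action, failed question, note).
    """
    problems = []
    for step, (action, answers, note) in enumerate(actions, start=1):
        for question, ok in zip(QUESTIONS, answers):
            if not ok:
                problems.append((step, action, question, note))
    return problems

# Hypothetical two-step scenario for a phone-settings task.
scenario = [
    ("Open the settings menu", (True, True, True), ""),
    ("Enable vibrate mode", (True, False, True), "toggle hidden below the fold"),
]
for problem in walk_through(scenario):
    print(problem)
```

Keeping the notes tied to a specific step and question makes the debriefing discussion concrete.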

Pluralistic walkthrough
– A variation on the cognitive walkthrough theme.
– Performed by a carefully managed team of experts, users, and developers.
– The panel begins by working separately, going through a task scenario, perhaps using screens from a prototype.
– Then there is a managed discussion that leads to agreed decisions.
– The approach lends itself well to participatory design.
– Disadvantages: a larger group to schedule; only a few scenarios are examined.

Key points
Expert evaluation: heuristics & walkthroughs
– Relatively inexpensive because no users are needed.
– Heuristic evaluation is relatively easy to learn.
– May miss key problems & identify false ones.
– Walkthroughs: more task-focused, but more time and cost.

Heuristic Categories from the ID-Book
– Visibility of system status
– Match between system and real world
– User control and freedom
– Consistency and standards
– Error prevention
– Recognition rather than recall
– Flexibility and efficiency of use
– Aesthetic and minimalist design
– Help users recover from errors
– Help and documentation
– Navigation
– Structure of information
– Physical constraints
– Extraordinary users

Class Exercise: Choose One
(A) Heuristic evaluation for a cell phone
– Define a few criteria for cell phones: use the categories but make them phone-specific, e.g., visibility of status, error prevention, or recognition vs. recall.
– Produce your own list of important criteria.
– Be experts and review the UI for issues: list issues for one (or more) phones.
(B) Cognitive walkthrough for one or more department web pages
– Identify tasks: find requirements for major electives for that major; find rules for major(s) in that department; find faculty who are working in a certain research area; etc.
– Walk through these tasks, identifying issues based on particular criteria (ease of learning).
(B1) Same, but do a shopping site, etc.

End, Spring 2008