CS 575 Spring 2012 CSULA Bapa Rao Lecture 6

Agenda for today: review of previous meeting; student comments; heuristic evaluation presentation; team reports; heuristic evaluation task assignment; in-class heuristic evaluations; [time permitting] an Alan Kay video.

Review of previous meeting: app ideas / use cases; team finalization. Each team should own one app idea that it will pursue.

Team Reports
- Red: Armando, Albert, Phu
- Orange: Phanti, Rain, Long
- Green: Ali, Behin
- Blue: Yin yin, Sina, Amir
- Pink: Raudel, Dinesh, Tony
- Purple: Gavik, Hardik, Mohammed
- Brown: Shweta, Sowmya
- Grey: Harshil, Haresh, Vishrut

Work assignment from the previous week
- (Revise and re-)write the high-level use case spec for your assigned app; for some, this will be catch-up. The goal is to have a paper prototype.
- Point of view: what experience do you want to provide for the user? Relate this to the needfinding exercise results for validation.
- Storyboard two alternative design ideas. Use web, library, or bookstore resources along with Aziz's guide/sketch. Generate paper storyboards and upload pictures to your wiki page.
- Discuss the two ideas and choose one.
- Make a Wizard-of-Oz-style paper prototype of the chosen idea (see HanMail).
- Make a video of the paper prototype in use, then upload and link it.
- Bring the paper prototype to the next class.
- Refer to Assignment 3 in the Stanford course link.

Heuristic Evaluation: motivation; Nielsen's usability heuristics; Nielsen's severity ratings; how many evaluators?; how to conduct a heuristic evaluation.

Motivation: Usability engineering is costly. Heuristic evaluation uses guidelines rather than hard criteria ("discount user testing") and delivers most of the value for a low investment.

The Scenario Concept (e.g., Wizard of Oz)

Cost-Benefit Analysis

Nielsen's Usability Heuristics
1. Visibility of system status
2. Match between system and the real world
3. User control and freedom
4. Consistency and standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility and efficiency of use
8. Aesthetic and minimalist design
9. Help users recognize, diagnose, and recover from errors
10. Help and documentation

Usability Heuristics
1. Visibility of system status: The system should always keep users informed about what is going on, through appropriate feedback within reasonable time.
2. Match between system and the real world: The system should speak the users' language, with words, phrases, and concepts familiar to the user, rather than system-oriented terms. Follow real-world conventions, making information appear in a natural and logical order.
3. User control and freedom: Users often choose system functions by mistake and will need a clearly marked "emergency exit" to leave the unwanted state without having to go through an extended dialogue. Support undo and redo.
4. Consistency and standards: Users should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions.
5. Error prevention: Even better than good error messages is a careful design which prevents a problem from occurring in the first place. Either eliminate error-prone conditions or check for them and present users with a confirmation option before they commit to the action.
6. Recognition rather than recall: Minimize the user's memory load by making objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.
7. Flexibility and efficiency of use: Accelerators, unseen by the novice user, may often speed up the interaction for the expert user, such that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions.
8. Aesthetic and minimalist design: Dialogues should not contain information which is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility.
9. Help users recognize, diagnose, and recover from errors: Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.
10. Help and documentation: Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user's task, list concrete steps to be carried out, and not be too large.
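As a concrete (hypothetical) illustration of putting the list to work, the ten heuristics can be encoded as a small lookup table for a review-logging script; the short keys are our own shorthand, not Nielsen's:

```python
# Nielsen's ten usability heuristics, keyed by a short identifier.
# The identifiers are our own shorthand for tagging findings.
NIELSEN_HEURISTICS = {
    "visibility":  "Visibility of system status",
    "match":       "Match between system and the real world",
    "control":     "User control and freedom",
    "consistency": "Consistency and standards",
    "prevention":  "Error prevention",
    "recognition": "Recognition rather than recall",
    "flexibility": "Flexibility and efficiency of use",
    "minimalism":  "Aesthetic and minimalist design",
    "recovery":    "Help users recognize, diagnose, and recover from errors",
    "help":        "Help and documentation",
}

# Sanity check: exactly ten heuristics.
assert len(NIELSEN_HEURISTICS) == 10
```

An evaluator's finding can then reference a key such as `"prevention"` instead of retyping the full heuristic name.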

Nielsen's Severity Ratings
Factors: frequency, impact, persistence.
0: Not a problem
1: Cosmetic
2: Minor usability problem
3: Major usability problem
4: Usability catastrophe
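In practice each evaluator rates every problem independently and the ratings are then averaged. A minimal sketch of the scale and that averaging step; the function name and the rounding policy are our own, not part of Nielsen's definition:

```python
# Nielsen's 0-4 severity scale, as listed on the slide.
SEVERITY = {
    0: "Not a problem",
    1: "Cosmetic",
    2: "Minor usability problem",
    3: "Major usability problem",
    4: "Usability catastrophe",
}

def overall_severity(ratings):
    """Combine independent evaluators' ratings for one problem by
    taking the mean and rounding to the nearest scale point."""
    return round(sum(ratings) / len(ratings))

# Three evaluators rate the same problem 2, 3, 3 -> overall 3 (major).
assert overall_severity([2, 3, 3]) == 3
```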

Number of evaluators vs. problems found

How Many Evaluators? Cost-benefit analysis suggests 3-5 evaluators.

Not Every Evaluator Catches Every Problem

How Many Evaluators, contd. Taking the mean of 3 evaluations usually works
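The 3-5 figure follows from Nielsen and Landauer's model of problem discovery, in which n independent evaluators find a fraction 1 - (1 - L)^n of the problems, where L is the average per-evaluator detection rate (about 0.31 in their published data). A small sketch; the default rate is their reported average, and a given project's rate may differ:

```python
def fraction_found(n, detection_rate=0.31):
    """Expected fraction of usability problems found by n independent
    evaluators, per Nielsen & Landauer's model: 1 - (1 - L)^n.
    The default L ~= 0.31 is their reported average; treat it as an
    assumption, not a constant of nature."""
    return 1 - (1 - detection_rate) ** n

# Diminishing returns: roughly 67% of problems with 3 evaluators and
# roughly 84% with 5, which is why 3-5 is the usual sweet spot.
assert 0.66 < fraction_found(3) < 0.68
assert 0.83 < fraction_found(5) < 0.85
```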

How to Do Heuristic Testing
- Do "walkthroughs" first.
- Each evaluator inspects the interface alone; afterwards, the evaluators can communicate and their findings can be aggregated.
- Record findings either as written reports from each evaluator or as observer notes. The observer can assist the evaluator.
- Allow 1-2 hours per session (probably closer to 30 minutes for our app).
- The evaluator goes through the interface several times, inspects the various dialogue elements, and compares them with a list of recognized usability principles (the heuristics). The evaluator is also allowed to consider any additional usability principles or results that come to mind that may be relevant for a specific dialogue element or category.
- Go through the interface at least twice: first to get a feel for the flow of the interaction and the general scope of the system, then to focus on specific interface elements while knowing how they fit into the larger whole.

Heuristic evaluation follow-up: design advice. Hold a debriefing session with the evaluators, the observer, and the design team, run in brainstorming mode: address the design problems found and identify the positive aspects of the design.

Output from the heuristic testing process: a list of usability problems in the interface, each annotated with the usability principles that, in the evaluator's opinion, the design violates. It is not enough to say "I don't like this"; explain why with respect to the heuristic criteria.
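Merging the individual reports into one prioritized problem list can be sketched as a short script. The tuple layout, field names, and averaging policy here are our own illustration, not part of Nielsen's method:

```python
from collections import defaultdict

def aggregate_reports(reports):
    """Merge evaluators' findings into one deduplicated list, sorted
    so the worst problems come first. Each finding is a tuple of
    (problem description, heuristic violated, severity 0-4)."""
    by_problem = defaultdict(list)
    for evaluator_findings in reports:
        for problem, heuristic, severity in evaluator_findings:
            by_problem[(problem, heuristic)].append(severity)
    merged = [
        (problem, heuristic, sum(ratings) / len(ratings))
        for (problem, heuristic), ratings in by_problem.items()
    ]
    return sorted(merged, key=lambda row: -row[2])

# Two evaluators; both flag the missing undo, one also flags jargon.
reports = [
    [("no undo", "control", 4), ("jargon labels", "match", 2)],
    [("no undo", "control", 3)],
]
merged = aggregate_reports(reports)
assert merged[0] == ("no undo", "control", 3.5)
```

This matches the slide's point that a finding must name the violated heuristic: a bare "I don't like this" has nowhere to go in this structure.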

In-class heuristic evaluation Pick a team to be the evaluators for a paper design

Assignment for the coming week
- Self-evaluation of your work from the previous week (storyboard / paper prototype).
- Start a journal on your personal page documenting the work you have personally done, plus your personal thoughts and ideas. Back-fill it with work you have already done, and include your self-evaluations.
- For every activity done by your team, post documentation of your personal contribution and a self-evaluation. Not everyone has to do every task, but everyone has to do significant tasks in the team activity. This will, in part, determine your personal final grade: I can't grade you if I don't know what you have done.
- I will provide evaluation templates as appropriate.

Assignment for the coming week (contd.)
- Read the papers on heuristic evaluation posted on the wiki.
- Complete the heuristic evaluation of your paper prototype and post your evaluation report on the team wiki page. Be sure to include your personal (course-related) activity and contribution on your personal wiki page.
- Do a debriefing and make the needed design changes.
- Post a self-evaluation of your heuristic evaluation.
- Put together an implementation proposal and plan (see the sample on the wiki). You have two weeks to build the prototype, starting May 18, plus one week for testing, so limit your scope accordingly. Identify the platform, and decide which backend functions you will "fake" and how. The prototype should be demo-able in class on June 9, 2012.
- No coding until you finish the heuristic evaluation step.