CS5714 Usability Engineering Formative Evaluation of User Interaction: After Evaluation Session Copyright © 2003 H. Rex Hartson and Deborah Hix.


Slide 2: Topics
– Analyzing the data
– Cost-importance analysis
– Drawing conclusions
– Connecting back to UE management
– Team exercise on formative evaluation

Slide 3: Analyzing the Data
– Major decision: accept the design as is, or redesign
– Compare observed results to usability specifications
  – Remember, it's formative, not summative (therefore, statistical significance is not at issue)
– If usability specifications are not met, identify interaction design problems and solve them in order of cost and effect on usability

Slide 4: Analyzing the Data
– In most situations, finding usability problems is bad; in formative evaluation, finding usability problems is good
– Structured identification of interaction design problems and potential solutions
– (More details on individual usability problem analysis and reporting coming later)

Slide 5: Cost-Importance Analysis
– Cost-importance table
  – Best to do in a spreadsheet
– Problem: something that has an impact on usability, observed as the user interacts
  – Deduced from critical incident data

Slide 6: Cost-Importance Analysis
For the Calendar example:

| Problem | Imp | Solution(s) | Cost | Prio ratio | Prio rank | Cum cost | Resolution |
|---|---|---|---|---|---|---|---|
| User didn't know to select appt before trying to delete | | | | | | | |

Slide 7: Cost-Importance Analysis
– Importance to fix: effect on usability (independent of cost), based on a best engineering guess (include the risk of not fixing)
– Importance = M: must fix, regardless of cost
– Importance = 5: the interaction feature is mission critical, or the usability problem has a major impact on task performance or user satisfaction (e.g., the user cannot complete a key task), is expected to occur frequently, causes costly errors, or is a major source of dissatisfaction

Slide 8: Cost-Importance Analysis
– Importance = 3: the user can complete the task, but with difficulty (e.g., the problem caused confusion and required extra effort), or the problem was a source of dissatisfaction
– Importance = 1: the problem did not much affect task performance or satisfaction (e.g., an irritant or cosmetic issue), but is still worth listing

Slide 9: Cost-Importance Analysis
Adjustment factors for "Importance":
– Probability of occurrence
  – Over all affected user classes, how often will users encounter this problem?
  – Example: if the task cannot be completed (Importance = 5) but the usability problem represents a situation that will arise rarely, in a non-critical task, downgrade Importance to 4 or 3
  – Example: if the impact is moderately significant (3) but occurs frequently, upgrade to 4
– Learnability
  – If users can learn to solve it immediately, it won't affect subsequent usage (reduce Importance by 1)
  – Boneheaded
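The adjustment factors above can be sketched as a small helper. This is our own illustration, not from the slides: `adjust_importance` and its flag arguments are hypothetical names; only the 1-5 scale, the "M" rating, and the up/down adjustments come from the deck (which allows downgrading a rare problem by one or two steps; this sketch applies a single step each way).

```python
def adjust_importance(importance, occurs_rarely=False,
                      occurs_frequently=False, learnable_immediately=False):
    """Adjust a 1-5 Importance rating per the slide's factors.

    'M' (must fix) is never adjusted. Rare occurrence and immediate
    learnability each downgrade by one step; frequent occurrence
    upgrades by one step. The result is clamped to the 1-5 scale.
    """
    if importance == "M":
        return importance
    if occurs_rarely:
        importance -= 1
    if occurs_frequently:
        importance += 1
    if learnable_immediately:
        importance -= 1
    return max(1, min(5, importance))
```

For instance, a moderately significant problem (3) that users hit frequently becomes a 4, matching the slide's second example.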

Slide 10: Cost-Importance Analysis
For the Calendar example, assume an Importance = 3:

| Problem | Imp | Solution(s) | Cost | Prio ratio | Prio rank | Cum cost | Resolution |
|---|---|---|---|---|---|---|---|
| User didn't know to select appt before trying to delete | 3 | | | | | | |

Slide 11: Cost-Importance Analysis
– Solution(s): proposed changes to solve the problem, drawn from:
  – Design principles and guidelines
  – Brainstorming
  – Study of other similar designs
  – Solutions suggested by users and experts
– A given problem can have more than one possible solution; list each on a separate line
– Typically not a good solution: more training or documentation (adjust the design, not the user!)

Slide 12: Cost-Importance Analysis
For the Calendar example:

| Problem | Imp | Solution(s) | Cost | Prio ratio | Prio rank | Cum cost | Resolution |
|---|---|---|---|---|---|---|---|
| User didn't know to select appt before trying to delete | 3 | Allow user to click in time slot then press delete; add pop-up hint 'Click in appointment to select for deletion' | | | | | |

Slide 13: Cost-Importance Analysis
– Cost: resources (time, money) needed for each proposed solution
  – Changes to a paper prototype are minimal cost
  – Cost is more significant in computer-based prototypes and in versions of the real product
  – Units of cost are usually person-hours
  – Usually round fractional values up to the next integer (this is an engineering estimate, anyway)
  – For problems with multiple possible solutions, give the cost for each on a separate line (presumably the importance is the same)

Slide 14: Cost-Importance Analysis
For the Calendar example, assume a cost of 4 person-hours for recoding this feature:

| Problem | Imp | Solution(s) | Cost | Prio ratio | Prio rank | Cum cost | Resolution |
|---|---|---|---|---|---|---|---|
| User didn't know to select appt before trying to delete | 3 | Allow user to click in time slot then press delete; add pop-up hint 'Click in appointment to select for deletion' | 4 | | | | |

Slide 15: Cost-Importance Analysis
– Priority ratio: a metric to establish priorities
  priority ratio = (importance / cost) × 1000
– Intuitively, high priority means high importance and low cost
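The formula above is a one-liner to compute; a minimal sketch (the function name `priority_ratio` is ours, the formula and the "M" convention are the deck's):

```python
def priority_ratio(importance, cost):
    """Priority ratio = (importance / cost) * 1000.

    'M' (must fix) problems bypass the ratio: they go to the top of
    the table regardless of cost.
    """
    if importance == "M":
        return "M"
    return round(importance / cost * 1000)
```

For the Calendar example, `priority_ratio(3, 4)` gives 750, matching the next slide.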

Slide 16: Cost-Importance Analysis
For the Calendar example, priority = (3/4) × 1000 = 750:

| Problem | Imp | Solution(s) | Cost | Prio ratio | Prio rank | Cum cost | Resolution |
|---|---|---|---|---|---|---|---|
| User didn't know to select appt before trying to delete | 3 | Allow user to click in time slot then press delete; add pop-up hint 'Click in appointment to select for deletion' | 4 | 750 | | | |

Slide 17: Cost-Importance Analysis
– Compute priority ratios for all individual usability problems
– Group together related problems (those that can share a common solution)
  – Represent them as a single problem and recalculate importance, cost, and priority ratio
  – Usually lower cost than the total for the individual problems
  – Often, for a problem with more than one solution, the broadest solution (though more costly) groups best with others

Slide 18: Cost-Importance Analysis
– Move all "must fix regardless of cost" problems to the top of the table
– Sort the rest of the table in descending order by priority ratio
  – This puts the "must fix" problems first
  – The high-importance, low-cost problems come next
  – These are the problems that, when fixed, give the biggest improvement in usability for the least cost
(Figure: Importance vs. Cost quadrant chart; high importance with low cost is "Fix first", low importance with high cost is "Probably not fix")
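The two-step ordering above ("must fix" first, then descending priority ratio) can be expressed as a single sort key. A sketch using the deck's example problems (names like "abc" are the slides' own placeholders; the dict layout and `sort_key` are our illustration):

```python
problems = [
    {"problem": "select-before-delete", "imp": 3, "cost": 4},
    {"problem": "abc", "imp": 5, "cost": 5},
    {"problem": "pqr", "imp": "M", "cost": 20},
    {"problem": "xyz", "imp": 1, "cost": 3},
]

def sort_key(p):
    # 'M' rows sort into a first bucket; the rest sort by descending
    # importance/cost, i.e., descending priority ratio.
    if p["imp"] == "M":
        return (0, 0.0)
    return (1, -p["imp"] / p["cost"])

problems.sort(key=sort_key)
# Order: pqr (M), abc (1000), select-before-delete (750), xyz (333)
```

Because Python's sort is stable, ties in priority ratio keep their original relative order; slide 24 discusses how to break such ties deliberately.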

Slide 19: Cost-Importance Analysis
Filling in more problems for the Calendar example:

| Problem | Imp | Solution(s) | Cost | Prio ratio | Prio rank | Cum cost | Resolution |
|---|---|---|---|---|---|---|---|
| User didn't know to select appt before trying to delete | 3 | Allow user to click in time slot then press delete; add pop-up hint 'Click in appointment to select for deletion' | 4 | 750 | | | |
| abc | 5 | Yada yada | 5 | 1000 | | | |
| pqr | M | Blah blah | 20 | M | | | |
| xyz | 1 | Sure, right | 3 | 333 | | | |

Slide 20: Cost-Importance Analysis
Putting the M problems at the top, sort the rest by priority ratio and fill in "Cum cost":

| Problem | Imp | Solution(s) | Cost | Prio ratio | Prio rank | Cum cost | Resolution |
|---|---|---|---|---|---|---|---|
| pqr | M | Blah blah | 20 | M | 1 | 20 | |
| abc | 5 | Yada yada | 5 | 1000 | 2 | 25 | |
| User didn't know to select appt before trying to delete | 3 | Allow user to click in time slot then press delete; add pop-up hint 'Click in appointment to select for deletion' | 4 | 750 | 3 | 29 | |
| xyz | 1 | Sure, right | 3 | 333 | 4 | 32 | |

Slide 21: Cost-Importance Analysis
Assume we have 26 person-hours available for changes; draw the line just before "Cum cost" exceeds 26:

| Problem | Imp | Solution(s) | Cost | Prio ratio | Prio rank | Cum cost | Resolution |
|---|---|---|---|---|---|---|---|
| pqr | M | Blah blah | 20 | M | 1 | 20 | |
| abc | 5 | Yada yada | 5 | 1000 | 2 | 25 | |
| ----- cutoff line (26 person-hours available) ----- | | | | | | | |
| User didn't know to select appt before trying to delete | 3 | Allow user to click in time slot then press delete; add pop-up hint 'Click in appointment to select for deletion' | 4 | 750 | 3 | 29 | |
| xyz | 1 | Sure, right | 3 | 333 | 4 | 32 | |
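The cutoff step can be sketched as a running total over the sorted table (the costs and the 26-hour budget are the slides'; the variable names are ours):

```python
# Rows in sorted order: (problem, cost in person-hours)
sorted_costs = [("pqr", 20), ("abc", 5), ("select-before-delete", 4), ("xyz", 3)]
budget = 26  # available person-hours

to_fix, cum = [], 0
for name, cost in sorted_costs:
    if cum + cost > budget:
        break  # the cutoff line falls here; rows below wait for the next version
    cum += cost
    to_fix.append(name)
# to_fix == ["pqr", "abc"]; cum == 25, leaving 1 spare person-hour
```

This reproduces the table above: pqr and abc fall above the line, and the Calendar problem (cumulative cost 29) is the first row below it.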

Slide 22: Drawing Conclusions
– Deal with the "must fix" problems
  – If there are enough resources, fix them
  – Otherwise, someone (e.g., the project manager) must decide
– Sometimes fixing the "must fix" problems leaves no resources for any other problems
– The extreme cost of a "must fix" problem could make it infeasible to fix in the current version
– Exceptions (with cost overruns) can be dictated by corporate policy, management decision, marketing, etc.
– In our example, we have enough resources

Slide 23: Drawing Conclusions
Our resolution:

| Problem | Imp | Solution(s) | Cost | Prio ratio | Prio rank | Cum cost | Resolution |
|---|---|---|---|---|---|---|---|
| pqr | M | Blah blah | 20 | M | 1 | 20 | Fix |
| abc | 5 | Yada yada | 5 | 1000 | 2 | 25 | Fix, if time |
| User didn't know to select appt before trying to delete | 3 | Allow user to click in time slot then press delete; add pop-up hint 'Click in appointment to select for deletion' | 4 | 750 | 3 | 29 | Table until next version |
| xyz | 1 | Sure, right | 3 | 333 | 4 | 32 | Probably never |

Slide 24: Drawing Conclusions
Other considerations:
– Sometimes priority ratios tie; break ties by importance, personal preference, etc.
– Similarly, if importance (which reflects severity, impact on the user, etc.) is the more significant factor in your environment, weight it a little more heavily in the priority-ratio formula
– How to report your usability results to management

Slide 25: Connection Back to the Usability Engineering Lifecycle
– Implement the chosen design solutions
– Realize the benefits of improved usability
– Cycle back through the life cycle process and evaluate again
– Usability testing controls the iterative process (this is what we have been doing)
  – Stop when the usability specifications are achieved
  – The goal is not perfection

Slide 26: Connection Back to the Usability Engineering Lifecycle
– Cost-effectiveness
  – If the schedule for the first release is too tight for thorough testing in the lab, use usability inspection or rapid usability analysis
  – Isolate the most severe problems, the "show stoppers"
  – Leave the rest until the next release
– But be careful not to let a tight production schedule force you to release something that could embarrass your company
– Quality is remembered long after schedules are forgotten!
(Videoclip: Analysis of Envision evaluation sessions)

Slide 27: Team Exercise – Usability Data Analysis
Goal: perform the analysis part of a very simple formative usability evaluation
Activities:
– Assemble in the same teams (including your new participants)
– The team compiles results to determine whether the usability specifications were met

Slide 28: Team Exercise – Usability Data Analysis
– Fill in the "Observed results" column on the usability specification table
– Using the cost-importance table, list 2 or 3 usability problems from critical incidents
– Assign an "Importance" rating, 1 through 5, to some of the observed problems, based on severity
– Propose "Solutions" (without doing all the work of redesign)

Slide 29: Team Exercise – Usability Data Analysis
– Assign "Cost" values (in person-hours) to each solution, based on computer (not paper) prototype changes
– Compute "Priority ratios"
– Group related problems together and list each group as a single problem
– Move the "must fix" problems to the top

Slide 30: Team Exercise – Usability Data Analysis
– Sort the rest by decreasing priority ratio to determine the "Priority rank" of the usability problems
– Fill in the cumulative cost column
– Assume a hypothetical value for the available time resources (something that makes this exercise work)
– Draw the cutoff line
– Finalize your "management" decisions about which changes to make in the next version

Slide 31: Team Exercise – Usability Data Analysis
Deliverables:
– Summary of quantitative results, written in the "Observed results" column of the usability specification table, on a transparency (for comparison)
– List of raw critical incidents
– Cost-importance table, with the "Resolutions" column completed, on a transparency
– Choose someone to give a brief report on the evaluation results