Code Inspections and Heuristic Evaluation

Acknowledgements: Most of this course is based on the excellent course offered by Prof. Kellogg Booth at the University of British Columbia, Vancouver, Canada.

Objectives
Today:
- The inspection process; practice inspection
- The heuristic evaluation process; practice evaluation
Next time:
- The rationale behind why inspections and heuristic evaluation are so valuable
We would normally do this in the opposite order, but this way you should be able to better prepare your materials over the weekend.

Presentation Order
Email me your materials (code to review, or, if doing the heuristic evaluation, software to run if possible) by the Friday prior to your presentation if presenting Tuesday, or by the Monday prior if presenting Thursday.
- Tuesday 3/25: Curtis Conner, Devin Lyons, Matt Grimm, Scott Mahar
- Thursday 3/27: Devin Homan, Dean Sawyer, Hector Sanchez, Joshua Tester
- Tuesday 4/1: Richard To, Britny Herzog, Bobby Porter, Robert Bailey

Code Inspection / Fagan Inspection
Definition: A formal review of a work product by peers. A standard process is followed with the purpose of detecting defects early in the development lifecycle.
- Can inspect many different kinds of documents
- We will focus on just the code

Defects
Inspections are used to find defects. A defect is a deviation from specified or expected behavior:
- Something wrong
- Missing information
- Common error
- Standards violation
- Ambiguity
- Inconsistency
- Perception error
- Design error

A Defect Is a Defect
- A defect is based on the opinion of the person doing the review; any defect that is found IS a defect
- Not open to debate; no voting or consensus process on what is a defect
- Not all defects are necessarily bugs, and many defects may not be "fixed" in the end
- How to fix a defect should be debated later, not while defects are being logged

What Should Be Inspected?
For existing code or documentation, select:
- The piece most critical to the program's operation
- The most used section
- The most costly if defects were to exist
- The most error-prone
- The least well-known
- The most frequently changed
For new code or documentation: inspect between 20% and 100%

Our Inspection Exercise
Individual work:
- Owner planning: 30-60 mins
- Inspectors review code: 20-60 mins
Group in class:
- Introduction: 2-5 mins
- Inspect and log defects: 10-15 mins
- Owner rework: ? mins

Owner Planning
- Owner decides what code/documents to review; provide a copy of the code listing for everyone
- Send me the code by the class prior to the inspection date and I'll post it on the calendar page for everyone to get
- Code should be numbered by line
- Not all the code, just the selected code (see the previous slide on "What should be inspected?")
- Up to the owner's discretion as to what/how much, but we will stop after 20 minutes; probably about 2-3 pages

Preparation
- Each inspector should have the materials to inspect in advance
- Identify defects on their own to ensure independent thought; note defects and questions
- Complete a defect log, rating each defect High/Medium/Low
- Without this preparation, group review might find only 10% of the defects that could otherwise be found (Fagan)
- Rule of thumb: 2 hours for 10 full pages of text

Common Defects
- Mistakes you've made in the past
- Anything we discussed in class
- Code techniques, e.g. variable names, location, initialization, refactoring, defensive programming, error checking, magic numbers, loop length, etc.
- Security
- Usability
- Etc.
Similar issues apply to other languages.

Inspection Day
Prior to inspection:
- Code has already been posted
- Inspectors have prepared by inspecting the code and noting their defects
Inspection process:
- Owner provides a brief introduction to the code
- Round-robin where each inspector describes a defect found, or passes if no defects noted
- Might find new defects during the inspection exercise
- Total of 10-20 minutes in our exercise
- Scribe writes down defects in the defect log

Defect Logging
- Rate each entry High, Medium, Low, or Question
- Brief description: ~7 words or less, or until the owner understands
- If possible, resolve questions as defect or not
- Also log defects found in the parent document (e.g. requirements), the common errors list, or the work product guidelines
- It will be up to the work owner whether or not to fix a defect

Inspection Example
Requirement: Support authentication based upon user@host using regular expressions.
(Slide callouts: line 9 opens the file containing operators; line 13 returns true if wildcards match.)

 1  /*********************************************************
 2   * Returns a 1 if the user is on the ops list, and
 3   * returns a 0 if the user is not on the ops list.
 4   *********************************************************/
 5  int Authorized(char *user)
 6  {
 7     FILE *f;
 8
 9     f=fopen(OPSPATH,"r");                /* open authorized file */
10     while (fgets(tempstr,80,f)!=NULL)
11     {
12        tempstr[strlen(tempstr)-1]='\0';  /* annoying \r at end */
13        if (!fnmatch(tempstr,user,FNM_CASEFOLD)) { fclose(f); return(1); }
14     }
15     fclose(f);
16     return(0);
17  }

Defect Log
Severity (H M L Q) | Location | Description
1.
2.
3.
4.
5.
6.
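For illustration, one possible rework of Authorized() after such an inspection. This is a sketch, not the course's official fix: the value of OPSPATH and the 128-byte line limit are assumptions, and FNM_CASEFOLD is a glibc extension. It addresses defects an inspection might log in the original: no fopen() failure check, the undeclared buffer tempstr, the magic number 80, and a strlen()-1 underflow on an empty line.

```c
#define _GNU_SOURCE              /* FNM_CASEFOLD is a glibc extension */
#include <stdio.h>
#include <string.h>
#include <fnmatch.h>

#define OPSPATH "ops.txt"        /* assumed path; not shown on the slide */

/* Returns 1 if user matches a pattern in the ops file, 0 otherwise. */
int Authorized(const char *user)
{
    char line[128];              /* assumed maximum pattern length */
    FILE *f = fopen(OPSPATH, "r");
    if (f == NULL)
        return 0;                /* original dereferenced a NULL FILE* here */
    while (fgets(line, sizeof line, f) != NULL) {
        line[strcspn(line, "\r\n")] = '\0';  /* strip \r and/or \n safely,
                                                even on an empty line */
        if (line[0] != '\0' &&
            fnmatch(line, user, FNM_CASEFOLD) == 0) {
            fclose(f);
            return 1;
        }
    }
    fclose(f);
    return 0;
}
```

Whether each of these changes is worth making is exactly the kind of decision left to the work owner after logging.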

In-Class Exercise
- Take 5-10 minutes to find defects in the code posted online: http://www.math.uaa.alaska.edu/~afkjm/cs470/handouts/CodeReview.pdf
- This is C# code that highlights the location of my pen on the tablet screen
- We will then do a short round-robin to note defects

Defect Log
Severity (H M L Q) | Location | Description

Heuristic Evaluation
"Discount" Usability Testing

Heuristic Evaluation
- Developed by Jakob Nielsen
- Helps find usability problems in a UI design
- Small set (3-5) of evaluators examine the UI independently, checking for compliance with usability principles ("heuristics")
- Can also refer to any of the GUI Bloopers we covered
- Different evaluators will find different problems
- Evaluators only communicate afterwards, during a meeting, where findings are aggregated
- Can perform on a working UI or on sketches

Jakob Nielsen's Heuristics
- Aesthetic and minimalist design
- Match between system and real world
- Recognition rather than recall
- Consistency and standards
- Visibility of system status
- User control and freedom
- Flexibility and efficiency of use
- Help users recognize, diagnose, and recover from errors
- Error prevention
- Help and documentation

Evaluation Day
Similar to code inspection.
Prior to evaluation:
- Ideally, the program has already been posted (by the class prior to the inspection) and inspectors have prepared by running the program and noting issues
- This may not be possible depending upon the nature of your project; if so, you may give an in-class "demo" and do the evaluation on the fly
Evaluation process:
- Owner provides a brief introduction to the program
- Round-robin where each evaluator describes an issue found, or passes if no issues noted
- Might find new issues during the exercise
- Total of 15-20 minutes in our exercise
- Scribe writes down issues in the issue log

Example Problem Descriptions
- Have to remember command codes
  - Violates "Minimize the users' memory load" (H3)
  - Fix: add a drop-down box with selectable codes
- Typography uses a mix of upper/lower case formats and fonts
  - Violates "Consistency and standards" (H4)
  - Slows users down; probably wouldn't be found by user testing
  - Fix: pick a single format for the entire interface
(Adapted from slide by James Landay)

Severity Ratings
- Used to allocate resources to fix problems
- Should be calculated after all evaluations are done
- Should be done independently by all evaluators
- Based on: frequency the problem will occur, impact of the problem (hard or easy to overcome), and persistence (will users learn a workaround or be bothered every time?)
Scale:
1 - cosmetic problem
2 - minor usability problem
3 - major usability problem; important to fix
4 - usability catastrophe; must fix

Heuristic Evaluation Issue Log
Heuristic | Severity | Description
Consistency | 3 | The interface used the string "Save" on the first screen for saving the user's file, but used the string "Write file" on the second screen. Users may be confused by this different terminology for the same function.
Error message | 4 | Entering invalid input into the dialog box on the first form results in "Error 103"
…
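Since severity ratings exist to allocate fixing resources, an aggregated issue log is usually worked highest-severity first. A minimal sketch in C (kept in the same language as the code-inspection example; the Issue struct and its field names are hypothetical, chosen only to mirror the log columns above):

```c
#include <stdlib.h>

/* Hypothetical issue-log entry; fields mirror the log columns above. */
struct Issue {
    const char *heuristic;
    int severity;            /* 1 cosmetic .. 4 usability catastrophe */
    const char *description;
};

/* qsort comparator: highest severity first, so the aggregated log
 * reads top-down in fix-priority order */
static int by_severity_desc(const void *a, const void *b)
{
    const struct Issue *x = a;
    const struct Issue *y = b;
    return y->severity - x->severity;
}

/* Usage: qsort(log, n, sizeof log[0], by_severity_desc); */
```

Sorting after aggregation (rather than per evaluator) matches the advice above that severity is settled only once all evaluations are merged.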

Exercise
- Evaluate an application using heuristic evaluation
- Use your computer or share with a neighbor
- Java required: http://www.math.uaa.alaska.edu/~afkjm/GeoPixelCounter/
- The program computes the percentage of a mineral in a thin section of rock
- Refer back to the slide with the 10 heuristics
- Fill out issues found on paper
- We will discuss and debrief after you are done

Heuristic Evaluation Issue Log
Heuristic | Severity | Description

Theory – Code Inspections

Why Inspections?
- Inspections can be applied to many different things by many different groups
- Inspections are a "Best Known Method" (BKM) for increasing quality
- Developed by Michael Fagan at IBM; paper published in 1976
- Estimates: inspections of design and code usually remove 50-90% of defects before testing
- Very economical compared to testing
- Formal inspections are more productive than informal reviews

Formal Inspections
- By formalizing the process, inspections become systematic and repeatable
- Each person in the inspection process must understand their role
- Checklists focus concentration on detecting defects that have been problematic
- Metrics: feedback and data collection are quantifiable, and feed into future inspections to improve them
- Designers and developers learn to improve their work through inspection participation

More Reasons to Use Inspections
- Inspections are measurable, giving the ability to track progress
- Reduces rework and debug time
- Cannot guarantee that a deadline will be met, but can give early warning of impending problems
- Information sharing with other developers and testers

Definition
What is an inspection? A formal review of a work product by peers. A standard process is followed with the purpose of detecting defects early in the development lifecycle.
Examples of work products: code, specs, web pages, presentations, guides, requirements, documentation.

When Are Inspections Used?
Possible anytime code or documents are complete:
- Requirements: inspect specs, plans, schedules
- Design: inspect architecture, design doc
- Implementation: inspect technical code
- Test: inspect test procedure, test report


Other Review Methods
Presentation:
- What: present an idea or proposal
- Audience: management/technical
- Objective: provide info, evaluate specs or plan (give status)
Walkthrough:
- What: technical presentation of work
- Audience: technical
- Objective: explain work; may find design or logic defects (give context)
Inspection:
- What: formal review by peers
- Audience: technical
- Objective: find defects early (find defects)

Other Defect Detection Methods
Buddy:
- What: developers work in pairs
- Audience: technical
- Objective: develop, explain work, find defects
Testing:
- What: formal testing
- Objective: find defects by symptom, usability, performance
Inspection:
- What: formal review by peers
- Objective: find defects where they occur

Why a Formal Review?
- Provides a well-defined process: repeatability, measurement
- Avoids some scenarios seen with less formal processes:
  - "My work is perfect": the point is not to criticize the author
  - "I don't have time": the formal process proceeds only when all are prepared and have inspected the code in advance

Walkthrough vs. Inspection
Focus:
- Walkthrough: improve the product
- Inspection: find defects
Activities:
- Walkthrough: find defects, examine alternatives; a forum for learning and discussion
- Inspection: find defects; only defect explanation allowed; learning through defects and inspection
Process:
- Walkthrough: informal
- Inspection: formal
Quality:
- Walkthrough: variable; personalities can modify the outcome
- Inspection: repeatable, with a fixed process
Time:
- Walkthrough: preparation ad hoc, less formal
- Inspection: preparation required; efficient use of time

Typical Inspection Process
We are using a shortened process in class, but it is essentially the same as the "normal" process:
- Planning: 45 mins
- Preparation: 15-120 mins
- Log defects: 60-120 mins
- Causal analysis and rework
- Follow-up

Roles
- Moderator
- Inspectors
- Work Owner
- Scribe

Causal Analysis Meeting
- Purpose: a brainstorming session on the root cause of specific defects
- Takes place sometime after the inspection has been completed
- Supports continuous improvement
- Initiates thinking and action about the most common or severe defects
- Can help prevent future defects from occurring
- Specific action items may be assigned to achieve this goal

Rework
Purpose: address defects found during the logging process.
Rules:
- Performed by the product owner
- All defects must be addressed; this does not mean they are all fixed, but that sufficient analysis/action has taken place
- All defects found in any other documents should be recorded
- Owner should keep a work log

Follow-Up
Purpose: verify resolution of defects.
- Work product redistributed for review
- Inspection team can re-inspect, or assign a few inspectors to review
- Unfixed defects are reported to the team and discussed to resolution
We're skipping these last few phases for the class, but I would like to see how you addressed defects in your final writeup.

Theory - Heuristic Evaluation
(Adapted from material by Marti Hearst, Loren Terveen)

Evaluating UI Designs
- Usability testing is a major technique
- Formal techniques require users, rigidly controlled experiments, statistical analysis
- "Discount" methods don't require users:
  - Heuristic evaluation
  - Cognitive walkthrough

Heuristic Evaluation
- Developed by Jakob Nielsen
- Helps find usability problems in a UI design
- Small set (3-5) of evaluators examine the UI independently, checking for compliance with usability principles ("heuristics")
- Different evaluators will find different problems
- Evaluators only communicate afterwards; findings are then aggregated
- Can perform on a working UI or on sketches

Phases of Heuristic Evaluation
1) Pre-evaluation training: give evaluators needed domain knowledge and information on the scenarios
2) Evaluation: individuals evaluate and then aggregate results
3) Severity rating: determine how severe each problem is (priority)
4) Debriefing: discuss the outcome with the design team
(Adapted from slide by James Landay)

Jakob Nielsen's Heuristics (1.0 → 2.0)
- H1. Simple and natural dialog → Aesthetic and minimalist design
- H2. Speak the user's language → Match between system and real world
- H3. Minimize user memory load → Recognition rather than recall
- H4. Be consistent → Consistency and standards
- H5. Provide feedback → Visibility of system status
- H6. Provide clearly marked exits → User control and freedom
- H7. Provide shortcuts → Flexibility and efficiency of use
- H8. Provide good error messages → Help users recognize, diagnose, and recover from errors
- H9. Prevent errors → Error prevention
- H10. Help and documentation → Help and documentation

Pros / Cons
Pros:
+ Cheap (no special lab or equipment)
+ Easy
+ Fast (about 1 day)
+ Cost-effective
+ Detects many problems without users
+ Complementary to task-centered approaches
+ Good coverage; catches cross-task interactions
Cons:
- Requires subjective interpretation
- Does not specify how to fix problems
- Performance depends on evaluator knowledge; improves as it increases

Procedure
- A set of evaluators (3-5 is about optimal) evaluates a UI (some training may be needed)
- Each one independently checks for compliance with the heuristics
- Different evaluators find different problems
- Individually rate the severity of the problems
- Evaluators then get together and merge their findings
- Debriefing/brainstorming: how to fix the problems (and point out what's really good)

How to Perform Heuristic Evaluation
- At least two passes for each evaluator:
  - First, to get a feel for the flow and scope of the system
  - Second, to focus on specific elements
- Assistance from implementors/domain experts:
  - If the system is walk-up-and-use or the evaluators are domain experts, no assistance is needed
  - Otherwise, might supply evaluators with scenarios and have implementors standing by
(Adapted from slide by James Landay)

How to Perform Evaluation
Where problems may be found:
- A single location in the UI
- Two or more locations that need to be compared
- A problem with the overall structure of the UI
- Something that is missing
(Adapted from slide by James Landay)

Severity Ratings
- Used to allocate resources to fix problems
- Should be calculated after all evaluations are done
- Should be done independently by each evaluator
- Based on: frequency the problem will occur, impact of the problem (hard or easy to overcome), and persistence (will users learn a workaround or be bothered every time?)
Scale:
1 - cosmetic problem
2 - minor usability problem
3 - major usability problem; important to fix
4 - usability catastrophe; must fix

Debriefing
- Conduct with evaluators, observers, and development team members
- Discuss general characteristics of the UI
- Suggest potential improvements to address major usability problems
- Developer team rates how hard things are to fix
- Make it a brainstorming session
(Adapted from slide by James Landay)

Results of Using HE (cont.)
- A single evaluator achieves poor results: finds only 35% of usability problems
- 5 evaluators find ~75% of usability problems
- Why not more evaluators? 10? 20?
  - Adding evaluators costs more
  - Each additional evaluator adds fewer and fewer unique problems
(Adapted from slide by James Landay)

Adapted from slide by James Landay Decreasing Returns problems found benefits / cost (from Nielsen) Caveat: these graphs are for a specific example This is a controversial point. Adapted from slide by James Landay

Why Multiple Evaluators?
- Every evaluator doesn't find every problem
- Good evaluators find both easy and hard ones

Summary
- Inspections and heuristic evaluation are considered Best Known Methods to improve software quality
- Relatively cheap to perform
- Find errors directly
- Developers share their knowledge with one another, leading to quality improvements on future projects