1 User interface design: A software engineering perspective
Søren Lauesen. Slides for Chapter 1, November 2004.
© 2005, Pearson Education retains the copyright to the slides, but allows restricted copying for teaching purposes only. It is a condition that the source and copyright notice is preserved on all the material.

2 Fig 1.1A System interfaces
[Diagram: an accounting system with user interfaces to its users and technical interfaces to the factory. Annotations ask what else the users need: courses? a hotline? a manual?]

3 Fig 1.1B Quality factors
Easy to make a user interface: just give access to the database (see, edit, create, delete records). Hard to make a good user interface.
Quality factors: correctness, availability, performance, security, ease of use, maintainability, ...
Functionality: the necessary features.
All factors are important. Hard to measure, but possible.

4 Fig 1.2 What is usability?
Naive answers: max three menu levels? On-line help? Follow the Windows standard? ??
Usability factors:
a. Fit for use (adequate functionality)
Ease of use:
b. Ease of learning
c. Task efficiency
d. Ease of remembering
e. Subjective satisfaction
f. Understandability
Who is responsible? Programmers? Other developers? The user department?
The factors are measurable, and their priorities vary. Game programs: what would factor a even mean there??

5 Fig 1.3 Usability problems
Examples: the system works as intended by the programmer, but the user:
P1. Cannot figure out how to start the search; finally finds out to use F10.
P2. Believes he has completed the task, but forgot to push Update.
P3. Sees the discount code field, but cannot figure out which code to use.
P4. Says it is crazy to use six screens to fill in ten fields.
P5. Wants to print a list of discount codes, but the system cannot do it.
Severity classes:
1. Missing functionality
2. Task failure
3. Annoying
4. Medium problem (user succeeds, but only after a long time)
5. Minor problem (user succeeds after a short time)
Critical problem = missing functionality, task failure, or annoying.

6 Fig 1.4 Usability test - think aloud
Purpose: find usability problems.
User: performs the tasks and thinks aloud ("I try this because ...").
Facilitator: listens and asks as needed (e.g. noting that the user doesn't notice something).
Logkeeper: listens and records the problems.

7 (Fig 1.4 cont.)
Plan: choose test users and test tasks; study the system yourself.
Carry out: explain the purpose (we want to find problems when using the system - they are the system's fault, not yours). Give a task and ask the user to think aloud. Observe, listen, and note down. Ask cautiously: what are you looking for? why? Help users out when they are surely lost.
Reporting: list the usability problems - within 12 hours.

8 Fig 1.5 Heuristic evaluation
Purpose: find usability problems.
A usability specialist (expert reviewer) looks at the system using common sense and/or guidelines and lists the problems, possibly consulting other experts.
First law of usability: heuristic evaluation has only a 50% hit rate. The predicted problems and the actual problems overlap only partly: the overlap is the hits, predictions outside it are false problems, and actual problems outside it are missed problems.

9 Fig 1.6A Measuring usability - task time (performance)
ATM example:
Users: 20 bank customers, random selection.
Task 1: withdraw $100 from the ATM, no instructions. Measure: how many succeed in 2 min?
Task 2: withdraw as much as possible ($174). Measure: how many succeed in 5 min?
Reqs: Task 1: 18 succeed; Task 2: 12 succeed.
Internal ordering system example:
Users: 5 secretaries in the company; they have tried the internal ordering system, but have not used it for a month.
Task 1: order two boxes of letter paper. Measure: average time per user. (Averages are risky!)
Req: average time below 5 min.
The pattern in both: specify how to measure, what to measure, and the requirement (the target).
Pros: the classic approach; good when buying a system. Cons: not good for development; not possible early; little feedback.
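The pattern (how to measure, what to measure, requirement) can be sketched as a small check. This is an illustrative sketch only: the completion times, the helper name `count_successes`, and the use of Python are our assumptions, not part of the slides.

```python
# Hedged sketch of the Task 1 requirement: 18 of 20 bank customers
# must withdraw $100 within 2 minutes (120 s).

def count_successes(times, limit_s):
    """Count users who completed the task within the time limit.
    None means the user gave up."""
    return sum(1 for t in times if t is not None and t <= limit_s)

# Hypothetical per-user completion times in seconds (invented data):
task1_times = [45, 80, 110, 60, None, 95, 70, 130, 50, 88,
               40, 75, 115, 66, 90, 55, 100, 72, 112, 85]

succeeded = count_successes(task1_times, 120)
meets_req = succeeded >= 18          # requirement: 18 of 20 succeed
print(succeeded, meets_req)          # 18 True for this invented data
```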

10 Fig 1.6B Choosing the numbers
Users: 20 bank customers ... Measure: in 2 min? Reqs: Task 1: 18 succeed; Task 2: 12 succeed.
Why 20 users? Cost versus reliability. During development: one user, later two, later ...
Why 2 min? Best practice, the ideal way ...
Why 18? 90% of the customers should succeed; Task 2 is harder, so its target is lower.
Open target: "18 out of 20 must succeed within ____ min. We expect around 2 min." Specify how to measure, what to measure, and your expectations - then wait and see what is possible.

11 Fig 1.6C Measuring usability - problem counts
Users: 3 potential users; think-aloud test; record the usability problems.
Task 1: order two boxes of letter paper. Task 2: ...
Measure: number of critical problems per user; number of medium problems on the list.
Reqs: at most one user encounters critical problems; at most 5 medium problems on the list.
Pros: possible early - a mockup is sufficient; good feedback to developers. Cons: best for ease of learning; only indications for the other factors.
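The two problem-count requirements can be checked mechanically. A minimal sketch with hypothetical test data - the severity labels follow slide 5's classes, but the logged problems and variable names are invented for illustration:

```python
# Hypothetical log from a 3-user think-aloud test: one list of
# problem severities per user. "critical" = missing functionality,
# task failure, or annoying (slide 5).
users = [
    ["critical", "medium", "minor"],   # user A
    ["medium", "medium"],              # user B
    ["minor"],                         # user C
]

# Requirement 1: at most one user encounters critical problems.
users_with_critical = sum(1 for problems in users if "critical" in problems)

# Requirement 2: at most 5 medium problems on the list.
# (Naive count: duplicate problems across users are not merged here.)
medium_on_list = sum(problems.count("medium") for problems in users)

ok = users_with_critical <= 1 and medium_on_list <= 5
print(users_with_critical, medium_on_list, ok)   # 1 3 True
```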

12 Fig 1.6D Measuring usability - keystroke counts
Task 1: withdraw a standard amount from the ATM. Task 2: ...
Measure: number of keystrokes and mouse clicks.
Reqs: max 6 keystrokes, including the PIN code; total system response time max 8 s.
Estimated task time: 6 keystrokes x 0.6 s = 3.6 s, plus 8 s total system response time = 11.6 s (plus other user actions?).
Pros: no users needed; possible early - a mockup is sufficient. Cons: not sure users find the fast way; covers only task efficiency.
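The slide's arithmetic as a tiny model: estimated task time = keystrokes times time-per-keystroke plus total system response time. The 0.6 s per keystroke and the 8 s response time come from the slide; the function name and its defaults are our framing, not the book's.

```python
# Keystroke-count estimate of task time, as on the slide.
def estimated_task_time(keystrokes, t_per_key_s=0.6, response_s=8.0):
    """Seconds to complete the task: typing time plus system response."""
    return keystrokes * t_per_key_s + response_s

total = estimated_task_time(6)   # 6 keystrokes incl. the PIN code
print(total)                     # 6 * 0.6 + 8 = 11.6 s, as on the slide
```

The model deliberately ignores "other user actions" (reading, deciding), which is exactly the caveat the slide raises.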

13 Fig 1.6E Measuring usability - opinion poll
Ask 20 novice users to complete the questionnaire below. Measure: count the number of entries per box. Reqs: 80% find the system easy to learn; 50% will recommend it to others.
Questionnaire (agree / neutral / disagree):
- The system was easy to learn
- The system is easy to use
- The system helps me ...
- It is fun to use
- I will recommend it to others
Pros: widely used; you may ask about any usability factor. Cons: doesn't match objective evidence; gives only indications during development; little feedback to developers.
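Tallying such a questionnaire is straightforward. A sketch with hypothetical responses - the 80% threshold is the slide's requirement, but the answer data is invented for illustration:

```python
from collections import Counter

# Hypothetical answers from 20 novice users to
# "The system was easy to learn" (invented data):
easy_to_learn = ["agree"] * 17 + ["neutral"] * 2 + ["disagree"]

counts = Counter(easy_to_learn)            # entries per box
pct_agree = 100 * counts["agree"] / len(easy_to_learn)

req_met = pct_agree >= 80    # requirement: 80% find it easy to learn
print(counts, pct_agree, req_met)          # 85.0 True for this data
```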

14 Fig 1.6F Measuring usability - score for understanding
Ask 5 potential ATM users what these error messages mean: "Amount too large", "PIN code invalid", ... Also ask them: what would the system do if ...?
Measure: assess the answers on a scale from A to D. Req: 80% of the answers marked A or B.
Pros: an easy way to test understandability; the best way to cover error messages; useful both early and late in development. Cons: measures only understandability.

15 Fig 1.6G Measuring usability - guideline adherence
Ask an expert to review the user interface and identify deviations from guideline X (or ask two experts to come up with a joint list).
Measure: number of deviations per screen. Req: at most one deviation per screen.
Pros: adherence helps users switch between systems; company-specific guidelines for internal systems can help even more. Cons: cannot guarantee high usability; developers find guidelines hard to follow - examples help best.

16 Fig 1.6H Which usability measure?
[Table: for each measurement technique (task time, problem counts, keystroke counts, opinion poll, score for understanding, guidelines), the figure rates its use for each usability factor (fit for use, ease of learning, task efficiency, ease of remembering, subjective satisfaction, understandability) and for three situations (development early, development late, buying a system) as "highly useful", "some use", or "indications only". The individual cell ratings did not survive in this transcript.]

17 User interface design: A software engineering perspective
Søren Lauesen. Slides for Chapter 2, November 2004.
© 2005, Pearson Education retains the copyright to the slides, but allows restricted copying for teaching purposes only. It is a condition that the source and copyright notice is preserved on all the material.

18 Fig 2.1 The development process
Traditional systems development: Analysis -> Design (experts? guidelines?) -> Program -> Test -> Operation. A usability test at the test stage gives scary results - and by then it is too late to correct them.
HCI classic, iterative design: Analysis (study users and tasks) -> Design prototype -> Usability test -> redesign as needed -> Program.

19 Fig 2.2 Hotel system
Task list: book guest, check in, check out, change room, record services, breakfast list.
Breakfast list, 23/9:
Room  Buffet  In room
11    1
12    2
13    1       1
15    . . .

20 Fig 2.3A Hotel system prototype

21 (Fig 2.3A Cont.)

22 Fig 2.3B Defect list for hotel system mockup

23 Fig 2.3C Hit rate of the hotel system evaluation
Heuristic evaluation predicted 21 problems: 6 hits, 7 false problems, and 8 that were likely but not observed.
The usability test observed 20 problems: the 6 hits plus 14 that the evaluation missed.
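From these numbers the hit rate falls out directly. A small sketch - "hit rate" here means the share of observed problems that were predicted, and "precision" (our term, not the slide's) the share of predictions that were hits:

```python
# Hit-rate calculation for the slide-23 figures.
hits = 6
observed = 20     # problems found by the usability test
predicted = 21    # problems predicted by the heuristic evaluation

hit_rate = hits / observed     # 6/20 = 0.30
precision = hits / predicted   # 6/21 ~ 0.29
print(round(hit_rate, 2), round(precision, 2))
```

Note that 30% is even below the "only 50% hit rate" that slide 8 calls the first law of usability.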

24 Fig 2.4 Various prototypes
Hand-drawn mockup: 15-30 min.
Tool-drawn mockup: 30-60 min.
Screen prototype: 1-4 hours.
Functional prototype: 2-8 hours.
Which prototype is the best?

25 (Fig 2.4 cont.) Full contents of a mockup
- Empty screens for copying
- Screens with realistic data
- Screens to be filled in by the user
- Menus, lists, dialog boxes
- Error messages
- Help texts
- Notes about what the functions do
Handling a system with 100 screens? The accelerator effect: if the central screens are good, the rest are okay almost automatically.

