Presentation on theme: "CSM18 Usability Engineering" — Presentation transcript:

1 CSM18 Usability Engineering
Evaluation: test the usability and functionality of an interactive system.
Goals of an evaluation:
• assess the extent of the system's functionality
• assess its usability - see the 10 heuristics later
• assess the effect & affect of the interface on the user
• identify any specific problems with the system or with its use

2 Evaluation Methods for Interactive Systems
• Analytical Methods
• Experimental Methods
• Observational Methods
• Query Methods

3 Evaluation Methods for Interactive Systems
Analytical Methods - predict performance based on a model, e.g. analysis of a cash dispenser based on the number of keystrokes required, the time needed to press a key, the time needed to think and the time needed to react (see the sketch below).
Experimental Methods - design experiments in the laboratory, e.g. speed of recognition of key words depending on font & colour.
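A rough Python sketch of this kind of analytical prediction. The operator times and the cash-dispenser steps are illustrative assumptions, not values from the lecture.

# Predict task time from a simple model of keystrokes, thinking and
# machine response, in the spirit of the cash-dispenser example above.
KEY_PRESS_TIME = 0.2    # seconds to press one key (assumed)
THINK_TIME = 1.35       # seconds of mental preparation per step (assumed)
SYSTEM_RESPONSE = 1.0   # seconds for the machine to react per step (assumed)

def predicted_task_time(steps):
    """Each step is (number_of_keystrokes, needs_thinking)."""
    total = 0.0
    for keystrokes, needs_thinking in steps:
        total += keystrokes * KEY_PRESS_TIME
        if needs_thinking:
            total += THINK_TIME
        total += SYSTEM_RESPONSE
    return total

# Hypothetical task: insert card, type a 4-digit PIN, choose "withdraw",
# type a 3-digit amount, confirm.
steps = [(1, True), (4, True), (1, True), (3, True), (1, False)]
print(f"Predicted time: {predicted_task_time(steps):.1f} s")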

4 Evaluation Methods for Interactive Systems
Observational Methods - in the field
Users: expert users, typical users, novice users
Query Methods - survey opinions and attitudes (e.g. was the system easy to use, was it enjoyable?); also consider contextual issues
• Interviews
• Questionnaires

5 Usability
Usability defined: usability = efficiency + effectiveness + enjoyment
A single usability parameter cannot be computed, so J. Nielsen proposed 10 usability heuristics (see later).
Heuristics - a set of rules for solving problems other than by an algorithm (Collins English Dictionary, 2nd Ed.)

6 Experimental methods - in the laboratory
• design an experiment for laboratory conditions
• make a testable hypothesis
• select your subjects - sample size
• select the variables - change one at a time
• statistical measures - time, speed, number of events, number of errors (recoverable, fatal) - see the sketch below
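A rough sketch of how one such measure (task completion time) might be compared across two interface variants, assuming Python with SciPy available. The timing data is invented purely for illustration.

from statistics import mean, stdev
from scipy import stats  # assumes SciPy is installed

times_a = [42.1, 38.5, 45.0, 40.2, 39.8, 44.3]  # seconds, interface A (hypothetical)
times_b = [35.4, 33.9, 37.2, 36.8, 34.5, 38.1]  # seconds, interface B (hypothetical)

print(f"A: mean {mean(times_a):.1f} s, sd {stdev(times_a):.1f}")
print(f"B: mean {mean(times_b):.1f} s, sd {stdev(times_b):.1f}")

# Independent-samples t-test: is the difference between the means
# larger than chance variation would explain?
t, p = stats.ttest_ind(times_a, times_b, equal_var=False)
print(f"t = {t:.2f}, p = {p:.3f}")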

7 Observational techniques - in the field
Observe behaviour - either arbitrary activity or set tasks
Task analysis:
• specifying a set of tasks gives insight into usability
• specifying a goal gives insight into the cognitive strategy used
Record actions, time, errors etc.

8 Observational techniques - in the field
Verbal protocol - think aloud
Protocol analysis:
• paper and pencil
• audio recording
• video recording
• computer logging (see the sketch below)
• user notebooks
Automatic protocol analysis tools
Post-event protocol - teach-back or post-task "walkthroughs"
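A rough Python sketch of what "computer logging" could capture: each user action is appended with a timestamp to a file that can later be analysed for times and error counts. The event names and file name are hypothetical.

import csv
import time

LOG_FILE = "session_log.csv"

def log_event(action, detail=""):
    """Append one user action with a timestamp to the session log."""
    with open(LOG_FILE, "a", newline="") as f:
        csv.writer(f).writerow([time.time(), action, detail])

# Fragment of a hypothetical logged session
log_event("menu_open", "File")
log_event("command", "Save As")
log_event("error", "invalid file name")
log_event("command", "Save As")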

9 Query techniques - Attitudinal Data
Interviews (see page 35 et seq.) - design an interview schedule
Questionnaires - question types:
• general
• open-ended
• scalar
• multi-choice

10 Planning an evaluation
Factors to be considered in planning an evaluation:
• purpose - who are the stakeholders?
• laboratory vs field studies
• qualitative vs quantitative measures
• information provided
• immediacy of response
• intrusiveness
• resources

11 Ten usability heuristics by J. Nielsen
Visibility of system status - the system should keep users informed about what is going on.
Match between system and the real world - the system should speak the users' language, using words, phrases and concepts familiar to the user (rather than system-oriented terms).
User control and freedom - users often choose system functions by mistake, so support undo and redo.
Consistency and standards - follow platform conventions (users shouldn't have to wonder whether different words, situations, or actions mean the same thing).

12 Ten usability heuristics
Error prevention - preventing errors is better than good error messages.
Recognition rather than recall - make objects, actions, and options visible (users shouldn't have to remember information).
Flexibility and efficiency of use - accelerators (unseen by novice users) may speed up interaction for expert users; the system should allow users to tailor frequent actions.
Aesthetic and minimalist design - simplicity is beauty.

13 Ten usability heuristics
Help users recognise, diagnose, and recover from errors - express error messages in plain language (no codes), indicate the problem, and suggest a solution (see the sketch below).
Help and documentation - ideally the system can be used without documentation, but most often it is necessary to provide help and documentation. Such information should be easy to search, focused on the user's task, list concrete steps to be carried out and not be too long. Examples are always helpful.
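A rough Python sketch contrasting a coded error with the plain-language, problem-plus-suggestion style this heuristic asks for. The scenario and wording are illustrative assumptions.

def save_file(path, text):
    try:
        with open(path, "w") as f:
            f.write(text)
    except OSError as err:
        # Poor message: "Error 0x80070005" (a code, no problem, no remedy).
        # Better: say what went wrong in plain language and suggest a fix.
        print(f"Could not save '{path}': {err.strerror}. "
              "Check that the folder exists and that you are allowed to write to it.")

save_file("/read-only/notes.txt", "draft")  # hypothetical failing path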

14 Questionnaire Design (see page 35 et seq.)
A simple checklist (yes / no / don't know) applied to individual commands, e.g. copy, paste.
An example of a 6-point rating scale (avoid a middle value), running from "Very useful" to "Of no use".

15 Questionnaire Design
An example of a Likert scale: strongly agree / agree / slightly agree / neutral / slightly disagree / disagree / strongly disagree
An example of a semantic differential scale, anchored by "easy" and "difficult", with the points extremely / slightly / neutral / slightly / extremely between them

16 Questionnaire Design
An example of a ranked order question: "Place the following commands in order of usefulness using the numbers 1 to 4, 1 being the most useful: copy, paste, group, clear." A sketch of scoring such responses follows below.
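A rough Python sketch of turning such questionnaire answers into numbers for analysis. The label-to-score mapping and the sample responses are illustrative assumptions, not data from the lecture.

from statistics import mean

LIKERT = {
    "strongly agree": 7, "agree": 6, "slightly agree": 5, "neutral": 4,
    "slightly disagree": 3, "disagree": 2, "strongly disagree": 1,
}

# Hypothetical answers to one Likert item from five respondents
item_responses = ["agree", "strongly agree", "neutral", "agree", "slightly agree"]
scores = [LIKERT[r] for r in item_responses]
print(f"Likert item: mean score {mean(scores):.2f} on a 1-7 scale")

# Hypothetical answers to the ranked-order question (1 = most useful)
rankings = [
    {"copy": 1, "paste": 2, "group": 4, "clear": 3},
    {"copy": 2, "paste": 1, "group": 3, "clear": 4},
    {"copy": 1, "paste": 2, "group": 3, "clear": 4},
]
for command in ("copy", "paste", "group", "clear"):
    print(f"{command}: average rank {mean(r[command] for r in rankings):.1f}")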

17 Evaluation in the Design Phase
Participatory Design - the user is involved in the whole design life cycle.
A number of methods help convey information between user and designer:
• brainstorming
• storyboarding
• workshops
• pencil & paper exercises
• role playing

18 Evaluating the design
• Cognitive walkthrough
• Heuristic evaluation
• Review-based evaluation
• Model-based evaluation

19 Choosing an evaluation method
Ref: Dix, A., Finlay, J., Abowd, G. and Beale, R. (1994)

20 Choosing an evaluation method

21 Choosing an evaluation method

