
1 Human Computer Interaction
Chapter 9: Evaluation Techniques

2 Evaluation Techniques
Evaluation tests the usability, functionality and acceptability of an interactive system. Its role is to assess the design and test the system, to ensure that it actually behaves as we expect and meets user requirements. Evaluation should be considered at all stages of the design life cycle. It is not possible to perform extensive experimental testing continuously throughout the design, but analytic and informal techniques can be used.

3 Goals of Evaluation Three main goals:
assess the extent and accessibility of the system's functionality
assess users' experience of the interaction
identify specific problems

4 Evaluation through Expert Analysis
Ideally, the first evaluation should be performed before any implementation work has started, so that expensive mistakes can be avoided: the later in the design process an error is discovered, the more costly it is to put right. A number of methods have been proposed to evaluate interactive systems through expert analysis. We consider four approaches: cognitive walkthrough, heuristic evaluation, the use of models, and the use of previous work.

5 Cognitive Walkthrough
Proposed by Polson and colleagues
evaluates the design on how well it supports the user in learning a task
usually performed by an expert in cognitive psychology
the expert 'walks through' the design to identify potential problems, using psychological principles
main focus: how easy the system is to learn
cognitive > relating to the process of acquiring knowledge by reasoning, intuition, or perception

6 Cognitive Walkthrough
For each task, the walkthrough considers:
what impact will the interaction have on the user?
what cognitive processes are required?
what learning problems may occur?
Analysis focuses on goals and knowledge: does the design lead the user to generate the correct goals?

7 Heuristic Evaluation Proposed by Nielsen and Molich.
A heuristic is a guideline, general principle or rule of thumb that can guide a design decision. Heuristic evaluation can be performed on a design specification, so it is useful for evaluating early designs. It can also be used on prototypes, storyboards and fully functioning systems; it is therefore a flexible and relatively cheap approach. The design is examined by experts to see whether the heuristics are violated.
heuristic > a helpful procedure for arriving at a solution, though not necessarily a proof
violate > to act contrary to something, such as a rule or principle

8 Heuristic Evaluation Example heuristics
system behaviour is predictable
system behaviour is consistent
feedback is provided
Heuristic evaluation 'debugs' the design.

9 Heuristic Evaluation Nielsen’s ten heuristics are:
Visibility of system status
Match between system and the real world
User control and freedom
Consistency and standards
Error prevention
Recognition rather than recall
Flexibility and efficiency of use
Aesthetic and minimalist design
Help users recognize, diagnose, and recover from errors
Help and documentation
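In practice, each evaluator records the problems they find against these heuristics, usually with a severity rating so the design team can prioritize fixes. Below is a minimal Python sketch of such a record, assuming Nielsen's commonly used 0-4 severity scale; the slides do not prescribe any particular recording format, so the structure and example findings are illustrative only.

```python
# Sketch: recording heuristic-evaluation findings with severity ratings.
NIELSEN_HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
    "Recognition rather than recall",
    "Flexibility and efficiency of use",
    "Aesthetic and minimalist design",
    "Help users recognize, diagnose, and recover from errors",
    "Help and documentation",
]

findings = []  # one entry per usability problem found by an evaluator

def report(heuristic_index, description, severity):
    """Record one violation; severity: 0 = not a problem .. 4 = catastrophe."""
    findings.append({
        "heuristic": NIELSEN_HEURISTICS[heuristic_index],
        "description": description,
        "severity": severity,
    })

report(0, "No progress indicator while saving a file", 3)
report(5, "Users must remember codes instead of picking from a list", 2)

# List the most severe problems first so fixes can be prioritized.
for f in sorted(findings, key=lambda f: -f["severity"]):
    print(f"[{f['severity']}] {f['heuristic']}: {f['description']}")
```

With several evaluators, findings are typically pooled before severity is rated, since different experts tend to find different problems.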

10 Model-based evaluation
Another expert-based approach is the use of models. Certain cognitive and design models provide a means of combining design specification and evaluation within the same framework. For example, the GOMS (goals, operators, methods and selection) model predicts user performance with a particular interface and can be used to filter design options. Design rationale can also provide useful evaluation information.
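As an illustration of model-based prediction, here is a minimal sketch of the Keystroke-Level Model (KLM), the simplest member of the GOMS family. The operator times are approximate published estimates, and the two operator sequences compared are hypothetical; a real analysis would encode the actual interface's methods and calibrate the times for the users and devices at hand.

```python
# Sketch: predicting expert task time by summing KLM operator times.
KLM_TIMES = {
    "K": 0.20,  # press a key (average skilled typist)
    "P": 1.10,  # point with a mouse to a target on screen
    "H": 0.40,  # move hands between keyboard and mouse
    "M": 1.35,  # mental preparation before an action
    "B": 0.10,  # press or release a mouse button
}

def predict_seconds(operator_sequence):
    """Sum operator times for a task encoded as a string such as 'MHPBB'."""
    return sum(KLM_TIMES[op] for op in operator_sequence)

# Compare two hypothetical designs for deleting a file:
menu_version = "MHPBBMPBB"  # think, reach for mouse, open menu, pick Delete
shortcut_version = "MKK"    # think, then type a two-key shortcut
print(predict_seconds(menu_version))      # ~5.7 s
print(predict_seconds(shortcut_version))  # ~1.75 s
```

Such predictions let a design option be filtered out before any implementation or user testing takes place.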

11 Review-based evaluation
Results from the literature are used to support or refute parts of the design. Care is needed to ensure that the results are transferable to the new design.
refute > disprove

12 Evaluating through User Participation
The techniques considered so far concentrate on evaluating a design or system through analysis by a designer or expert evaluator, rather than testing with actual users. There are a number of different approaches to evaluation through user participation.

13 Styles of evaluation We distinguish between two distinct evaluation styles:
those performed under laboratory conditions
those conducted in the work environment, or 'in the field'

14 Laboratory studies
Advantages:
specialist equipment available
uninterrupted environment
Disadvantages:
lack of context (filing cabinets, calendars, books)
difficult to observe several users cooperating
context > environment

15 Field Studies
Advantages:
natural environment
longitudinal studies possible
Disadvantages:
distraction
noise
distraction > interruption

16 Experimental evaluation
One of the most powerful methods of evaluating a design is to use a controlled experiment. It can be used to study a wide range of different issues at different levels of detail. A number of factors are important to the overall reliability of the experiment and must be considered carefully in the experimental design. These include the participants chosen, the variables tested and manipulated, and the hypothesis tested.
hypothesis > assumption, suggestion

17 Experimental factors
Subjects: who – must be representative
Variables: things to modify and measure
Hypothesis: what you'd like to show
Experimental design: how you are going to do it

18 Variables
Independent variables (IV) are those elements of the experiment that are manipulated to produce different conditions for comparison, e.g. interface style, number of menu items, icon design.
Dependent variables (DV) are the variables that can be measured in the experiment; their value is 'dependent' on the changes made to the independent variable, e.g. time taken to complete a task, number of errors made.

19 Hypothesis A hypothesis is a prediction of the outcome of an experiment.
The aim of the experiment is to show that this prediction is correct.

20 Experimental design In order to produce reliable and generalizable results, an experiment must be carefully designed. The first phase in experimental design is to choose the hypothesis: to decide exactly what you are trying to demonstrate. In doing this you are likely to clarify the independent and dependent variables. A number of experimental conditions are then considered which differ only in the value of some independent variable.
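To make the hypothesis/IV/DV vocabulary concrete, here is a minimal sketch of analyzing one such experiment in Python with SciPy. The two interface conditions, the timing data and the 0.05 significance level are all assumptions for illustration only.

```python
# Sketch: interface style is the independent variable, task completion
# time is the dependent variable; the hypothesis is that design B is faster.
from scipy import stats

# Seconds to complete the same task under two experimental conditions.
times_design_a = [42.1, 38.5, 45.0, 40.2, 39.8, 43.7]
times_design_b = [35.2, 33.9, 36.8, 34.1, 37.5, 32.6]

t_statistic, p_value = stats.ttest_ind(times_design_a, times_design_b)
print(f"t = {t_statistic:.2f}, p = {p_value:.4f}")

# If p falls below the chosen significance level (commonly 0.05), we reject
# the null hypothesis that the two designs yield the same mean time.
if p_value < 0.05:
    print("Difference is statistically significant.")
```

The key point the sketch embodies is that the two conditions differ only in the independent variable, so any significant difference in the dependent variable can be attributed to the design change.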

21 Experimental studies on groups
More difficult than single-user experiments. Problems with:
complexities of human-human communication and group working
choice of task
data gathering
analysis

22 Participant groups
larger number of subjects → more expensive
longer time to 'settle down' … even more variation!
difficult to timetable

23 The task Choosing a suitable task is also difficult; we may want to test a variety of different task types. Options:
creative task, e.g. 'write a short report on …'
decision games, e.g. desert survival task
control task, e.g. ARKola bottling plant

24 Data gathering
Even in a single-user experiment we may use several video cameras; in a group setting this is replicated for each participant.
Problems:
synchronisation
volume!
One solution:
record from each participant's perspective

25 Observational techniques
A particular way to gather information about the actual use of a system is to observe users interacting with it. Usually they are asked to complete a set of predetermined tasks. The evaluator watches and records the users' actions. Users may be asked to elaborate their actions by 'thinking aloud'.

26 Think Aloud
The user is observed performing a task
The user is asked to describe what he is doing and why, what he thinks is happening, etc.
Advantages:
simplicity – requires little expertise
can provide useful insight
can show how the system is actually used

27 Cooperative evaluation
A variation on think aloud is known as cooperative evaluation, in which the user is encouraged to see himself as a collaborator: both user and evaluator can ask each other questions throughout.
Additional advantages:
less constrained and easier to use
the user is encouraged to criticize the system
the evaluator can clarify points of confusion

28 Protocol analysis
paper and pencil – cheap, but limited to writing speed
audio – good for think aloud, but difficult to match with other protocols
video – accurate and realistic, but needs special equipment and is obtrusive
computer logging – relatively easy to get the system to record user actions automatically; tells us what the user is doing on the system
user notebooks – coarse and subjective, but provide useful insights; good for longitudinal studies
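As an illustration of computer logging, here is a minimal sketch of a timestamped action log. The file name and the logged events are hypothetical, and a real system would hook actual UI events rather than calling the function by hand.

```python
# Sketch: appending timestamped user actions to a CSV protocol log.
import csv
import time

LOG_PATH = "user_actions.csv"  # hypothetical log file

def log_action(user_id, action, target):
    """Append one timestamped user action to the protocol log."""
    with open(LOG_PATH, "a", newline="") as f:
        csv.writer(f).writerow([time.time(), user_id, action, target])

log_action("P01", "click", "File menu")
log_action("P01", "select", "Save As...")
```

Because each row carries a timestamp, the resulting log can later be matched against audio or video protocols recorded at the same time.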

29 Automated analysis – EVA
Analyzing protocols, whether video, audio or system logs, is time-consuming, and harder still if there is more than one stream of data. One solution to this problem is to provide automatic analysis tools to support the task. EVA (Experimental Video Annotator) is a system that runs on a multimedia workstation with a direct link to a video recorder.
Post-task walkthrough: the user reflects on the recorded actions after the event; used to fill in the intention behind each action.
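EVA's own interface is not described here, but the core difficulty it addresses, having more than one stream of data, can be sketched generically: merge several timestamped protocol streams into one chronological record. The streams and events below are invented for illustration.

```python
# Sketch: aligning multiple protocol streams on a shared time axis.
import heapq

video_annotations = [(12.4, "video", "user frowns"), (30.1, "video", "user smiles")]
system_log        = [(11.9, "log", "open dialog"), (12.6, "log", "click Cancel")]
audio_notes       = [(12.5, "audio", '"where did it go?"')]

# heapq.merge keeps the combined stream sorted by timestamp,
# provided each input stream is already in chronological order.
for t, stream, event in heapq.merge(video_annotations, system_log, audio_notes):
    print(f"{t:7.1f}s [{stream}] {event}")
```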

30 Query Techniques
Interviews
Questionnaires

31 Interviews
The analyst questions the user on a one-to-one basis, usually based on prepared questions; informal, subjective and relatively cheap.
Advantages:
issues can be explored more fully
can elicit user views and identify unanticipated problems
Disadvantages:
very subjective
time-consuming

32 Questionnaires
A set of fixed questions given to users.
Advantages:
quick, and reaches a large user group
can be analyzed more rigorously
Disadvantages:
less flexible
less probing

33 Questionnaires
Need careful design:
what information is required?
how are answers to be analyzed?
Styles of question:
general
open-ended
scalar
multi-choice
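As an illustration of the "can be analyzed more rigorously" point, here is a minimal sketch that summarizes responses to a scalar (Likert-style) question. The question wording, the 1-5 scale and the responses are invented for illustration.

```python
# Sketch: summarizing a scalar question, e.g. "The system was easy to use"
# rated from 1 (strongly disagree) to 5 (strongly agree).
from statistics import mean, median

responses = [4, 5, 3, 4, 2, 5, 4, 4, 3, 5]  # one rating per respondent

print(f"n = {len(responses)}")
print(f"mean = {mean(responses):.2f}, median = {median(responses)}")

# Distribution across the scale, useful for spotting polarized answers.
for point in range(1, 6):
    count = responses.count(point)
    print(f"{point}: {'#' * count} ({count})")
```

Open-ended questions, by contrast, yield free text that must be coded by hand before any such summary is possible.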

34 Choosing an evaluation method
A range of techniques is available for evaluating a system at all stages in the design process. So how do we decide which methods are most appropriate? There are no hard and fast rules: each method has its particular strengths and weaknesses, and each is useful. A number of factors should be taken into account when selecting evaluation techniques.

35 Choosing an Evaluation Method
when in the process: design vs. implementation
style of evaluation: laboratory vs. field
type of measures: qualitative vs. quantitative
level of information: high level vs. low level
resources available: time, subjects, equipment, expertise

