2 Evaluation Techniques
- tests usability and functionality of the system
- occurs in the laboratory, in the field, and/or in collaboration with users
- evaluates both design and implementation
- should be considered at all stages in the design life cycle
3 Goals of Evaluation
- assess the extent of the system's functionality
- assess the effect of the interface on the user
- identify specific problems
5 Cognitive Walkthrough
Proposed by Polson et al.
- evaluates design on how well it supports the user in learning a task
- usually performed by an expert in cognitive psychology
- expert 'walks through' the design to identify potential problems using psychological principles
- forms used to guide analysis
6 Cognitive Walkthrough (ctd)
For each task, the walkthrough considers:
- what impact will interaction have on the user?
- what cognitive processes are required?
- what learning problems may occur?
Analysis focuses on goals and knowledge: does the design lead the user to generate the correct goals?
7 Heuristic Evaluation
Proposed by Nielsen and Molich.
- usability criteria (heuristics) are identified
- design examined by experts to see if these are violated
Example heuristics:
- system behaviour is predictable
- system behaviour is consistent
- feedback is provided
Heuristic evaluation 'debugs' the design.
8 Review-based evaluation
- results from the literature used to support or refute parts of the design
- care needed to ensure results are transferable to the new design
Model-based evaluation:
- cognitive models used to filter design options, e.g. GOMS prediction of user performance
- design rationale can also provide useful evaluation information
10 Laboratory studies
Advantages:
- specialist equipment available
- uninterrupted environment
Disadvantages:
- lack of context
- difficult to observe several users cooperating
Appropriate:
- if system location is dangerous or impractical
- for constrained single-user systems, to allow controlled manipulation of use
11 Field Studies
Advantages:
- natural environment
- context retained (though observation may alter it)
- longitudinal studies possible
Disadvantages:
- distractions
- noise
Appropriate:
- where context is crucial
- for longitudinal studies
12 Evaluating Implementations
Requires an artefact:
- simulation
- prototype
- full implementation
13 Experimental evaluation
- controlled evaluation of specific aspects of interactive behaviour
- evaluator chooses a hypothesis to be tested
- a number of experimental conditions are considered, which differ only in the value of some controlled variable
- changes in the behavioural measure are attributed to the different conditions
14 Experimental factors
- Subjects: who – representative, sufficient sample
- Variables: things to modify and measure
- Hypothesis: what you'd like to show
- Experimental design: how you are going to do it
15 Variables
- independent variable (IV): characteristic changed to produce different conditions, e.g. interface style, number of menu items
- dependent variable (DV): characteristic measured in the experiment, e.g. time taken, number of errors
16 Hypothesis
- prediction of the outcome, framed in terms of IV and DV
  e.g. "error rate will increase as font size decreases"
- null hypothesis: states no difference between conditions; the aim is to disprove this
  e.g. null hyp. = "no change with font size"
17 Experimental design
within groups design:
- each subject performs the experiment under each condition
- transfer of learning possible
- less costly and less likely to suffer from user variation
between groups design:
- each subject performs under only one condition
- no transfer of learning
- more users required
- variation can bias results
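The pieces above (IV, DV, hypothesis, design) can be sketched as a small between-groups analysis. This is only an illustration: the interface conditions and timing data are hypothetical, and the pooled t statistic shown is one standard way to test the null hypothesis.

```python
# Between-groups sketch: each subject saw only one interface style
# (the IV); task completion time in seconds is the DV.
# All data values are hypothetical, for illustration only.
from statistics import mean, stdev
from math import sqrt

menu_times = [12.1, 14.3, 11.8, 13.5, 12.9]   # condition A: menu interface
cmd_times = [10.2, 11.0, 9.8, 10.7, 11.4]     # condition B: command interface

def t_statistic(a, b):
    """Two-sample t statistic with a pooled (equal-variance) SD."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2) / (na + nb - 2)
    return (mean(a) - mean(b)) / sqrt(pooled_var * (1 / na + 1 / nb))

t = t_statistic(menu_times, cmd_times)
# a large |t| (checked against t-tables with na+nb-2 degrees of
# freedom) is evidence against the null hypothesis of no difference
```

In a within-groups design the same subjects would appear in both lists, and a paired test on the per-subject differences would be used instead.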
18 Analysis of data
Before you start to do any statistics:
- look at the data
- save the original data
Choice of statistical technique depends on:
- type of data
- information required
Type of data:
- discrete – finite number of values
- continuous – any value
19 Analysis – types of test
parametric:
- assume normal distribution
- robust
- powerful
non-parametric:
- do not assume normal distribution
- less powerful
- more reliable
contingency table:
- classify data by discrete attributes
- count number of data items in each group
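The contingency-table approach can be sketched directly: classify discrete data by attribute, count the items in each cell, and compare observed counts with the counts expected under independence (a chi-square statistic). The interface names and counts below are hypothetical.

```python
# Contingency-table sketch: error vs. no-error counts under two
# hypothetical interface conditions.
observed = {
    "menu":    {"error": 10, "no_error": 40},
    "command": {"error": 20, "no_error": 30},
}

rows = list(observed)
cols = ["error", "no_error"]
total = sum(observed[r][c] for r in rows for c in cols)
row_sum = {r: sum(observed[r][c] for c in cols) for r in rows}
col_sum = {c: sum(observed[r][c] for r in rows) for c in cols}

# chi-square statistic: sum over cells of (observed - expected)^2 / expected,
# where expected = row total * column total / grand total
chi_sq = sum(
    (observed[r][c] - row_sum[r] * col_sum[c] / total) ** 2
    / (row_sum[r] * col_sum[c] / total)
    for r in rows for c in cols
)
# compare chi_sq against chi-square tables with
# (rows - 1) * (cols - 1) = 1 degree of freedom
```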
20 Analysis of data (cont.)
What information is required?
- is there a difference?
- how big is the difference?
- how accurate is the estimate?
Parametric and non-parametric tests mainly address the first of these.
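The second and third questions call for an effect size and an interval estimate rather than a test. A rough sketch, with hypothetical timing data and the normal-approximation critical value 1.96 (only a crude stand-in for the proper t value at these small sample sizes):

```python
# Effect size and (approximate) accuracy of the estimate.
# Data values are hypothetical.
from statistics import mean, stdev
from math import sqrt

cond_a = [5.2, 6.1, 5.8, 6.4]
cond_b = [4.1, 4.6, 4.9, 4.4]

diff = mean(cond_a) - mean(cond_b)          # how big is the difference?
se = sqrt(stdev(cond_a) ** 2 / len(cond_a) + stdev(cond_b) ** 2 / len(cond_b))
# how accurate is the estimate? (approximate 95% interval)
ci_low, ci_high = diff - 1.96 * se, diff + 1.96 * se
```

An interval that excludes zero tells the same story as a significant test, but it also shows how large or small the true effect could plausibly be.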
21 Experimental studies on groups
More difficult than single-user experiments.
Problems with:
- subject groups
- choice of task
- data gathering
- analysis
22 Subject groups
- larger number of subjects – more expensive
- longer time to 'settle down' … even more variation!
- difficult to timetable
- so … often only three or four groups
23 The task
- must encourage cooperation
- perhaps involve multiple channels
options:
- creative task, e.g. 'write a short report on …'
- decision games, e.g. desert survival task
- control task, e.g. ARKola bottling plant
24 Data gathering
several video cameras + direct logging of application
problems:
- synchronisation
- sheer volume!
one solution:
- record from each perspective
25 Analysis
N.B. vast variation between groups
solutions:
- within groups experiments
- micro-analysis (e.g. gaps in speech)
- anecdotal and qualitative analysis
- look at interactions between group and media
controlled experiments may 'waste' resources!
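One micro-analysis step, finding gaps in speech, can be sketched from a timestamped transcript. The utterances (speaker, start, end in seconds) and the gap threshold are hypothetical.

```python
# Micro-analysis sketch: locate silences between turns in a
# timestamped transcript. All data is hypothetical.
utterances = [
    ("A", 0.0, 2.4),
    ("B", 2.6, 5.1),
    ("A", 7.8, 9.0),   # long silence before this turn
    ("B", 9.1, 11.0),
]

GAP_THRESHOLD = 1.0  # seconds of silence treated as a notable gap

def find_gaps(turns, threshold):
    """Return (gap_start, gap_length) for silences longer than threshold."""
    gaps = []
    for prev, cur in zip(turns, turns[1:]):
        silence = cur[1] - prev[2]   # next start minus previous end
        if silence > threshold:
            gaps.append((prev[2], silence))
    return gaps

gaps = find_gaps(utterances, GAP_THRESHOLD)
```

Flagged gaps can then be inspected qualitatively, e.g. against the video record, to see what the group was doing during the silence.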
26 Field studies
Experiments dominated by group formation.
Field studies more realistic:
- distributed cognition – work studied in context
- real action is situated action
- physical and social environment both crucial
Contrast:
- psychology – controlled experiment
- sociology and anthropology – open study and rich data
28 Think Aloud
- user observed performing task
- user asked to describe what he is doing and why, what he thinks is happening, etc.
Advantages:
- simplicity – requires little expertise
- can provide useful insight
- can show how the system is actually used
Disadvantages:
- subjective
- selective
- act of describing may alter task performance
29 Cooperative evaluation
- variation on think aloud
- user collaborates in evaluation
- both user and evaluator can ask each other questions throughout
Additional advantages:
- less constrained and easier to use
- user is encouraged to criticize the system
- clarification possible
30 Protocol analysis
- paper and pencil – cheap, limited to writing speed
- audio – good for think aloud, difficult to match with other protocols
- video – accurate and realistic, needs special equipment, obtrusive
- computer logging – automatic and unobtrusive, large amounts of data difficult to analyze
- user notebooks – coarse and subjective, useful insights, good for longitudinal studies
Mixed use in practice:
- audio/video transcription difficult and requires skill
- some automatic support tools available
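Computer logging, the automatic and unobtrusive option above, can be sketched as a timestamped event record. The event names and fields below are hypothetical; a real system would log at whatever granularity the analysis requires.

```python
# Computer-logging sketch: record each user action automatically
# with a timestamp. Event names are hypothetical.
import time

log = []

def record(event, **details):
    """Append a timestamped event record to the session log."""
    log.append({"t": time.time(), "event": event, **details})

record("menu_open", menu="File")
record("menu_select", item="Save")
record("error", message="disk full")

# later analysis can, e.g., count errors per session
error_count = sum(1 for e in log if e["event"] == "error")
```

This also illustrates the slide's caveat: logging is effortless to collect, but even a short session produces far more records than anyone will read by hand.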
31 Automated analysis – EVA
- Workplace project
Post-task walkthrough:
- user reacts to actions after the event
- used to fill in intention
Advantages:
- analyst has time to focus on relevant incidents
- avoids excessive interruption of task
Disadvantages:
- lack of freshness
- may be post-hoc interpretation of events
32 Post-task walkthroughs
- transcript played back to participant for comment
  - immediately – fresh in mind
  - delayed – evaluator has time to identify questions
- useful to identify reasons for actions and alternatives considered
- necessary in cases where think aloud is not possible
34 Interviews
- analyst questions user on a one-to-one basis, usually based on prepared questions
- informal, subjective and relatively cheap
Advantages:
- can be varied to suit context
- issues can be explored more fully
- can elicit user views and identify unanticipated problems
Disadvantages:
- very subjective
- time consuming
35 Questionnaires
Set of fixed questions given to users.
Advantages:
- quick and reaches a large user group
- can be analyzed more rigorously
Disadvantages:
- less flexible
- less probing
36 Questionnaires (ctd)
Need careful design:
- what information is required?
- how are answers to be analyzed?
Styles of question:
- general
- open-ended
- scalar
- multi-choice
- ranked
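Scalar questions are the style most amenable to the rigorous analysis mentioned on the previous slide. A sketch of summarizing one scalar (rating-scale) item, using hypothetical responses on a 1–5 scale:

```python
# Summarizing scalar questionnaire answers.
# Responses on a 1-5 scale are hypothetical.
from statistics import mean, median

responses = [4, 5, 3, 4, 2, 5, 4, 3]

summary = {
    "n": len(responses),
    "mean": mean(responses),
    "median": median(responses),
    # frequency of each point on the scale
    "counts": {v: responses.count(v) for v in range(1, 6)},
}
```

For ordinal scales the median and the count distribution are usually safer summaries than the mean, since the distance between scale points is not guaranteed to be uniform.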
38 Eye tracking
- head- or desk-mounted equipment tracks the position of the eye
- eye movement reflects the amount of cognitive processing a display requires
measurements include:
- fixations: eye maintains a stable position; number and duration indicate level of difficulty with the display
- saccades: rapid eye movement from one point of interest to another
- scan paths: moving straight to a target with a short fixation at the target is optimal
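Separating fixations from saccades in raw gaze data is typically done with a dispersion or velocity threshold. A simplified sketch in the spirit of a dispersion-threshold (I-DT style) approach; the gaze samples and thresholds are hypothetical.

```python
# Fixation-detection sketch: group consecutive (x, y) gaze samples
# whose spread stays within a dispersion threshold. Sample data and
# thresholds are hypothetical.
def detect_fixations(samples, max_dispersion, min_length):
    """Runs of >= min_length samples within max_dispersion count as
    fixations; the jumps between them correspond to saccades."""
    fixations, window = [], []
    for p in samples:
        candidate = window + [p]
        xs = [q[0] for q in candidate]
        ys = [q[1] for q in candidate]
        if (max(xs) - min(xs)) + (max(ys) - min(ys)) <= max_dispersion:
            window = candidate
        else:
            if len(window) >= min_length:
                fixations.append(window)
            window = [p]
    if len(window) >= min_length:
        fixations.append(window)
    return fixations

# two stable clusters separated by one saccade:
gaze = [(100, 100), (102, 101), (101, 99), (99, 100),
        (300, 200), (301, 202), (299, 201), (300, 199)]
fixations = detect_fixations(gaze, max_dispersion=10, min_length=3)
```

The number and duration of the detected fixations then feed the analysis on the slide: many long fixations suggest the display is hard to interpret.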
39 Physiological measurements
- emotional response linked to physical changes
- these may help determine a user's reaction to an interface
measurements include:
- heart activity, including blood pressure, volume and pulse
- activity of sweat glands: Galvanic Skin Response (GSR)
- electrical activity in muscle: electromyogram (EMG)
- electrical activity in brain: electroencephalogram (EEG)
some difficulty in interpreting these physiological responses – more research needed
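As a toy illustration of how one such signal might be used: flag moments where a GSR trace rises well above its resting baseline. The signal values, baseline window, and threshold are all hypothetical, and, as the slide notes, mapping such peaks to specific emotional states is an open research problem.

```python
# GSR sketch: flag samples well above a resting baseline.
# Signal values (in microsiemens) are hypothetical.
signal = [2.0, 2.1, 2.0, 2.2, 3.5, 3.8, 3.1, 2.2, 2.1]
baseline = sum(signal[:3]) / 3          # resting level from first samples

# samples rising more than 1.0 above baseline suggest an arousal event,
# though interpreting the cause requires other evidence (video, logs)
events = [i for i, v in enumerate(signal) if v > baseline + 1.0]
```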
40 Choosing an Evaluation Method
- when in process: design vs. implementation
- style of evaluation: laboratory vs. field
- how objective: subjective vs. objective
- type of measures: qualitative vs. quantitative
- level of information: high level vs. low level
- level of interference: obtrusive vs. unobtrusive
- resources available: time, subjects, equipment, expertise