Slide 1: Evaluation. CS2391 Lecture n+1, Robert Stevens. Slides: http://img.cs.man.ac.uk/stevens

Slide 2: Introduction
You've gathered requirements, designed your system and built the artefact. But does it fulfil the user's requirements?
- Basic usability
- Basic evaluation
- Evaluation styles
- Design evaluation
- Implementation evaluation

Slide 3: Usability Basics
- Usability: allowing users to achieve a goal with efficiency, effectiveness and satisfaction
- Utility is the functionality of a system
- Utility can exist without usability, but not vice versa: worthy, but unhelpful
- We have paradigms of good usability, e.g. the GUI
- We also need theory to know why something is usable
- Really we want principles to guide developers: engineering, not craft

Slide 4: Execution and Evaluation
[Diagram: the interaction framework linking User and System through Input and Output, via four translations: articulation and performance on the execution side, presentation and observation on the evaluation side.]

Slide 5: Execution & Evaluation (2)
- Presentation: how the system renders its state, allowing the user to evaluate that state and alterations to it
- Observation: what the user notices of the presentation; can they see what they need to?
- Articulation: the expression of a user's execution plan
- Performance: the system's execution of that plan, the results of which are presented to the user

Slide 6: Usability Principles
a. Visibility of system status: the system should always keep users informed
b. Match between the system and the real world: the system should speak the user's language
c. User control and freedom: functions chosen by mistake need a clear 'emergency exit'
d. Consistency and standards: avoid ambiguity
e. Error prevention
f. Recognition rather than recall
g. Flexibility and efficiency of use
h. Aesthetic and minimalist design
i. Help users recognise, diagnose and recover from errors
j. Help and documentation

Slide 7: What is Evaluation?
- Do the design and implementation behave as we expect, and fulfil the user's requirements?
- Not just an add-on at the end! Assess the design at various times during the life cycle
- Assess implementation prototypes, alpha and beta versions
- Evaluation saves time and money
- There are many types of evaluation; the trick is to choose the appropriate one
- The purpose is to uncover usability problems

Slide 8: Usability Thoughts
- Recall and recognition
- Making a system easier to use makes it more powerful
- Humans can switch topics fast and think of more than one thing at once; a computer system should be able to do the same
- Complex syntax often hides the task; we need directness of interaction

Slide 9: Styles of Evaluation
Design evaluation:
- Cognitive walkthrough
- Heuristic evaluation
- Review-based evaluation
- The use of models
Implementation evaluation:
- Empirical
- Observational
- Query

Slide 10: Evaluation Styles (2)
- It is cheaper to evaluate a design before the expense of implementation
- Design evaluation tends not to involve end-users, except as consultants; it is often paper-based and involves experts
- Evaluation of an implementation does involve end-users; it is time-consuming, difficult and expensive, and can involve large numbers of end-users
- Design evaluation techniques can also be used to evaluate an implementation

Slide 11: Types of User
- Not all users are computer scientists; different users have different needs
- Remember managers, system administrators and trainers
- Use end-users where possible and appropriate
- It is important to have evaluatees who are representative of the end-users
- Balance between under-use and over-use: users need a reward for their time

Slide 12: Hawthorne Effect
- Users like to please the evaluator; people respond well to having someone interested in them
- Simply by evaluating an artefact, the reported experience of that artefact improves
- An investigation of light levels in factories showed that the investigation itself was the most important factor
- Not much can be done about it: be aware

Slide 13: Goals of Evaluation
- Does the system have the correct functionality? Does it match the user's task? A clerk used to searching by post-code should be able to search by post-code
- Can the functionality be used? What is the effect on the user?
- What are the problems with the system?
- The last is part of the other two, but draws out the negative aspects

Slide 14: Laboratory Techniques
- A usability lab: one-way mirror; video and audio recorders; logging of the system (a minimal logging sketch is given below)
- Lacks context; unnatural for end-users, and natural collaborative work is difficult
- Does allow close study, particularly of a specialist task or a particular UI notion
- Good for single-user tasks
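
As an illustration of what "logging of the system" can mean in practice, here is a minimal sketch (not from the lecture) of a time-stamped interaction logger; the event names, CSV format and file name are assumptions for illustration only.

```python
import csv
import time

class InteractionLogger:
    """Minimal time-stamped event logger for a usability session."""

    def __init__(self, path):
        self.file = open(path, "w", newline="")
        self.writer = csv.writer(self.file)
        self.writer.writerow(["timestamp", "event", "detail"])

    def log(self, event, detail=""):
        # Record wall-clock time so events can later be aligned
        # with the video and audio recordings.
        self.writer.writerow([time.time(), event, detail])

    def close(self):
        self.file.close()

# Hypothetical usage: instrument the interface under test.
logger = InteractionLogger("session01.csv")
logger.log("task_start", "task 1: find a book by ISBN")
logger.log("click", "search_button")
logger.log("task_end", "task 1 complete")
logger.close()
```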

Slide 15: Field Techniques
- See the user in context
- Allows a user to interact with all the people, objects and actions involved in a task
- Collaborative work can take place
- Noisy, difficult to record, etc.
- Can lack the detail possible in a laboratory

Slide 16: Cognitive Walkthrough
Brings psychological theory into an informal, subjective walkthrough. You need:
1. A design: not necessarily complete, but location and wording are helpful
2. A description of the task, which should be representative
3. A list of the actions the user makes to perform the task
4. A description of the users and the experience expected of them
These are given to experts, who step through the actions and assess usability by asking:
1. Is the user trying to perform the task described by the action?
2. Can the user see the object of interaction (button, etc.)?
3. Can the user tell that it is the right action?
4. Once the action is performed, does the user get appropriate feedback? (The end of the execution & evaluation cycle)
An illustrative way of recording such a walkthrough is sketched below.
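
One way to make the walkthrough concrete is to record one assessment per action against the four questions. The sketch below is illustrative only, not part of the lecture; all names in it are invented.

```python
# Illustrative sketch: recording a cognitive walkthrough as one
# record per user action, answering the four questions above.
from dataclasses import dataclass

QUESTIONS = (
    "Is the user trying to perform the task this action belongs to?",
    "Can the user see the object of interaction?",
    "Can the user tell that this is the right action?",
    "Does the user get appropriate feedback afterwards?",
)

@dataclass
class ActionAssessment:
    action: str     # e.g. "press the 'Search' button"
    answers: tuple  # one yes/no answer per question
    notes: str = ""

def report(assessments):
    """Print every action step where an expert answered 'no'."""
    for a in assessments:
        for question, ok in zip(QUESTIONS, a.answers):
            if not ok:
                print(f"Problem at '{a.action}': {question} ({a.notes})")

report([ActionAssessment("press the 'Search' button",
                         (True, False, True, True),
                         "button is below the page fold")])
```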

Slide 17: Heuristic Evaluation
- A set of heuristics (rules of thumb) developed by Jakob Nielsen and Rolf Molich
- Each heuristic is used to critique an interface
- A set of independent experts apply the heuristics
- The problems found follow a Poisson distribution: five experts find about 75% of the problems (see the sketch below)
- Usability questions are used to guide and stimulate
- Essentially a checklist
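
The "five experts find about 75%" figure can be reproduced with the problem-discovery model usually attributed to Nielsen and Landauer, found(n) = N(1 - (1 - λ)^n). The sketch below is illustrative; the per-expert detection probability λ = 0.24 is an assumed value chosen here to match the slide's figure (reported values vary across studies).

```python
# Nielsen-Landauer style discovery model: the proportion of usability
# problems found by n independent evaluators, where lam is the
# probability that a single evaluator finds a given problem.
def proportion_found(n_evaluators, lam=0.24):  # lam is an assumption
    return 1 - (1 - lam) ** n_evaluators

for n in (1, 3, 5, 10):
    print(f"{n:2d} evaluators -> {proportion_found(n):.0%} of problems")
# 5 evaluators -> ~75%; adding further experts gives diminishing returns.
```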

Slide 18: Review-Based Evaluation
- Principles from experimental psychology and the HCI literature are used to provide evaluation criteria
- E.g., menu design, naming of items, icon design, language design and memory attributes
- Cheaper than performing the experiment yourself, but beware of the context in which a study was performed
- Like all expert-based methods, it is about stimulating the basic questions to be asked
- Try to ensure the independence of the experts
- Record performance using scales and comment fields

Slide 19: Empirical Evaluation
- Evaluating the implementation (design evaluation methods can also be used here)
- Empirical studies concentrate on end-users rather than experts
- The controlled experiment technique: measure some attribute while controlling the other attributes of the system
- Various experimental conditions, which differ only in the value of some variable
- Independent (manipulated) and dependent (measured) variables
- Differences in behaviour are attributed to the different values of the independent variable that provide the different conditions (interface style, pointing device, wording, etc.)
- The dependent variable must be measurable in some way: speed, mouse clicks, satisfaction, etc.
- Use both subjective and objective measures

Slide 20: Empirical Techniques (2)
- A hypothesis is framed in terms of the variables: a change in the independent variable causes a change in the dependent variable
- The experiment attempts to establish this relationship by disproving the null hypothesis, i.e. that there is no relationship between the variables
- Statistics are used to show that any differences seen are unlikely to have happened by chance
- Experimental design: between groups and within groups
- Between groups: subjects are assigned to experimental and control groups; the control group ensures it is the independent variable that counts. Each subject does only one condition, avoiding learning effects, but the design is prone to between-subject variation
- Within groups: each subject performs in all conditions; vary the condition order to avoid learning effects
A sketch of both analyses is given below.
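
As a concrete illustration of the two designs, the sketch below analyses invented task-completion times with an independent-samples t-test (between groups) and a paired t-test (within groups). It is not part of the lecture; the data and condition names are made up.

```python
# Analysing task-completion times (seconds) for two interface conditions.
from scipy import stats

menu_times    = [42.1, 38.5, 45.0, 40.2, 39.8, 44.3]  # condition A
command_times = [35.4, 33.9, 38.2, 34.7, 36.1, 37.5]  # condition B

# Between-groups design: different subjects in each condition,
# so use an independent-samples t-test.
t, p = stats.ttest_ind(menu_times, command_times)
print(f"between groups: t = {t:.2f}, p = {p:.3f}")

# Within-groups design: the same subjects did both conditions
# (in counterbalanced order), so use a paired t-test.
t, p = stats.ttest_rel(menu_times, command_times)
print(f"within groups:  t = {t:.2f}, p = {p:.3f}")

# A small p value (conventionally < 0.05) lets us reject the null
# hypothesis that interface style makes no difference.
```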

Slide 21: Empirical Evaluation (3)
- Good for evaluating individual design decisions: colour, dialogue, wording, etc.
- Less good for overall usability: systems and humans are too complex for a controlled experiment
- Difficult to design
- Expensive in time, money and users

Slide 22: Observational Techniques
- Think-aloud and co-operative evaluation
- Observing the user's actions in their work context: the whole task
- Usually pre-determined, representative tasks; users explain what they are doing as they go (think-aloud)
- The experimenter interacts with the participant (subject) to elicit more information
- Everything is recorded (notes, system log, audio, video)
- The protocols are then analysed
- Post-experiment walkthrough

Slide 23: Query-Based Techniques
- Asking the user can be very informative
- Simple, but highly subjective
- Interviews and questionnaires (see earlier lectures)
- Good for large numbers of users and for high-level questions
- Good for exploring alternative strategies, particularly in context
- Less systematic and more subjective than empirical techniques
A questionnaire-scoring sketch is given below.
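
As a small illustration of handling questionnaire data, the sketch below summarises invented Likert-scale responses; the questions and scores are not from the lecture.

```python
# Summarising Likert-scale questionnaire responses
# (1 = strongly disagree ... 5 = strongly agree).
from statistics import mean, median

responses = {
    "The system was easy to learn":       [4, 5, 3, 4, 5, 4],
    "I could recover easily from errors": [2, 3, 2, 1, 3, 2],
}

for question, scores in responses.items():
    # Medians are often preferred for ordinal Likert data.
    print(f"{question}: median {median(scores)}, mean {mean(scores):.1f}")
```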

Slide 24: Summary
- We need to test the appropriateness of the functionality, and also that the functionality can be used
- Efficiency, effectiveness and satisfaction
- Evaluate both the design and its implementation
- Choose your users with care
- Reading: Dix, Finlay, Abowd & Beale, Human-Computer Interaction, Chapter 11