1 Usability Evaluation or, “I can’t figure this out...do I still get the donuts?”

2 Purposes of Evaluation
Which design is better?
Are there any problems with the design as it now stands?
Does the design meet usability targets?

3 When do you evaluate?
Formative evaluation (e.g. scenarios)
– Helps with form of the solution
– Deciding between competing designs
– At early and intermediate design stages
Summative evaluation
– Provides summary of usability
– Often compares new design with existing or alternative solutions
– When design is complete

4 What do you evaluate?
Prototypes
– During design iterations
– Can be very low-fidelity
– “Wizard of Oz” technique
Working systems
– System can be a prototype
– Can also evaluate at end of design cycle

5 Types of Evaluation
The choice depends on the formality needed and on how complete the system is:
– Informal user studies
– Usability studies
– Formal experiments

6 User Studies
Early stages of design
Usually only a few users
Non-structured tasks
Collect comments, observations, suggestions, preferences, ...

7 Usability Studies
Usually more complete prototype
Structured tasks
Collect timings, errors, verbal protocol, ...
Issues:
– Finding users
– Testing environment
– Compensation

8 Formal Experiments
Working system or piece of a system
Number of users determined by the size of the difference you want to detect
Tightly controlled tasks
Precise measurements
Statistical analysis of hypotheses
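Not from the slides, but a worked illustration of the last two points: a minimal Python sketch, with an assumed effect size and invented timing data, of how the required number of users falls out of the smallest difference you want to detect, and how a two-sample t-test then compares two designs.

    # Minimal sketch (assumed data): sample size from the desired effect, then a t-test.
    from scipy import stats
    from statsmodels.stats.power import TTestIndPower

    # Users needed per group to detect a medium effect (Cohen's d = 0.5)
    # with 80% power at alpha = 0.05 -- roughly 64 per group.
    n_per_group = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05, power=0.8)
    print(f"users needed per group: {n_per_group:.0f}")

    # Hypothetical task-completion times (seconds) for designs A and B.
    times_a = [41.2, 38.5, 45.0, 39.8, 42.1, 44.3, 40.7, 37.9]
    times_b = [48.6, 51.2, 46.9, 49.5, 52.3, 47.8, 50.1, 45.6]

    # Two-sample t-test of the hypothesis that mean completion times differ.
    t, p = stats.ttest_ind(times_a, times_b)
    print(f"t = {t:.2f}, p = {p:.4f}")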

9 Ethics
Testing can be a distressing experience
– pressure to perform, errors inevitable
– feelings of inadequacy
– competition with other subjects
Golden rule
– subjects should always be treated with respect

10 Managing Subjects Ethically
Before the test
– Don’t waste the user’s time
    Use pilot tests to debug experiments, questionnaires, etc.
    Have everything ready before the user shows up
– Make users feel comfortable
    Emphasize that it is the system that is being tested, not the user
    Acknowledge that the software may have problems
    Let users know they can stop at any time
– Maintain privacy
    Tell the user that individual test results will be kept completely confidential
– Inform the user
    Explain any monitoring that is being used
    Answer all the user’s questions (but avoid bias)
– Only use volunteers
    The user must sign an informed consent form

11 Managing Subjects Ethically
During the test
– Don’t waste the user’s time
    Never have the user perform unnecessary tasks
– Make users comfortable
    Try to give the user an early success experience
    Keep a relaxed atmosphere in the room (coffee, breaks, etc.)
    Hand out test tasks one at a time
    Never indicate displeasure with the user’s performance
    Avoid disruptions
    Stop the test if it becomes too unpleasant
– Maintain privacy
    Do not allow the user’s management to observe the test

12 Managing Subjects Ethically
After the test
– Make the users feel comfortable
    State that the user has helped you find areas for improvement
– Inform the user
    Answer any questions about the experiment that could not be answered earlier without biasing the results
– Maintain privacy
    Never report results in a way that lets individual users be identified
    Only show videotapes outside the research group with the user’s permission

13 Components of a Study
Informed consent
User familiarization
User questionnaire
Background testing
System exploration
Specific tasks
Post-testing
Debriefing

14 Informed Consent
Form advising users of their rights
Tell them you are studying the system, not them!
Tell them if there are any known risks
Tell them they can stop at any time
Get signature, give them a copy

15 User Familiarization
Make the user comfortable
Tell them what you’re doing without giving everything away
Show them the facilities and equipment
Give them written instructions
Tell them approximately how much time the evaluation will take
Ask if they have any questions

16 User Questionnaire
Obtain demographic information
– Age
– Gender
– Occupation or major
– Computer experience
– Domain experience
– Eyesight
– Handedness

17 Background Testing
You may want to correlate your findings with some standardized test score
Examples
– Spatial ability tests
– Memorization tests
– Domain knowledge tests
Example result: “Users with high spatial ability preferred interface X, while others preferred interface Y”
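As a hedged illustration of what “correlate your findings” might look like, the sketch below invents spatial-ability scores and task times and runs a simple Pearson correlation; the data values and the interpretation are assumptions, not results from any real study.

    # Minimal sketch (hypothetical data): does spatial ability predict speed?
    from scipy.stats import pearsonr

    spatial_score = [12, 25, 31, 18, 27, 22, 35, 15]   # standardized test scores
    task_time_s   = [95, 70, 62, 88, 68, 75, 55, 90]   # completion times on interface X

    r, p = pearsonr(spatial_score, task_time_s)
    # A strongly negative r would suggest higher spatial ability -> faster completion.
    print(f"r = {r:.2f}, p = {p:.3f}")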

18 Pre- and Post-Testing
Some systems have a specific learning goal (e.g. education or training)
Giving users the same test, or the same type of test, both before and after using the system is a way to measure learning
Only used with very well-developed prototypes
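A minimal sketch of that measurement, assuming invented pre- and post-test scores: the same users are tested before and after, and the scores are compared with a paired t-test.

    # Minimal sketch (assumed scores): measuring learning with a pre/post test.
    from scipy.stats import ttest_rel

    pre_scores  = [52, 61, 45, 70, 58, 49, 66, 55]   # before using the system
    post_scores = [68, 72, 60, 81, 70, 63, 79, 64]   # same users, after

    gain = sum(b - a for a, b in zip(pre_scores, post_scores)) / len(pre_scores)
    t, p = ttest_rel(post_scores, pre_scores)
    print(f"mean gain = {gain:.1f} points, t = {t:.2f}, p = {p:.4f}")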

19 System Exploration
User gets “free play” time with the system
Observe what they seem to understand easily and what is troublesome
See if they “find” all the features or parts of the system
Good with verbal protocol

20 Specific Tasks
Give the user a specific goal
Example: buy a burger with extra pickles and no onions using this interface
Observe problems
Record performance and errors
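One possible way to record performance and errors during such tasks is sketched below: a small Python logger that times a task, asks the evaluator for an error count, and appends one CSV row per attempt. The file name, fields, and prompts are assumptions, not anything prescribed by the slides.

    # Minimal sketch: time a task, count errors, append one CSV row per attempt.
    import csv
    import time
    from dataclasses import dataclass, asdict

    @dataclass
    class TaskResult:
        participant: str
        task: str
        seconds: float
        errors: int
        completed: bool

    def run_task(participant: str, task: str) -> TaskResult:
        """Start a timer, wait for the evaluator to press Enter, then record the outcome."""
        start = time.monotonic()
        input(f"[{participant}] {task} -- press Enter when the user finishes ")
        seconds = time.monotonic() - start
        errors = int(input("errors observed: "))
        completed = input("completed successfully? (y/n): ").strip().lower() == "y"
        return TaskResult(participant, task, seconds, errors, completed)

    if __name__ == "__main__":
        result = run_task("P01", "Buy a burger with extra pickles and no onions")
        with open("task_results.csv", "a", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=list(asdict(result)))
            if f.tell() == 0:            # new file: write the header once
                writer.writeheader()
            writer.writerow(asdict(result))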

21 Post-Questionnaire
Get the user’s reaction to the system
Subjective levels of satisfaction, perceived ease of use, usefulness, etc.
Objective comments, thoughts, and questions

22 Debriefing
Talk to the users about the session
Assure them they did well
Give more details about what you’re doing if they are interested
Thank them
Give them donuts

23 Other Evaluation Issues
Recruiting users
Testing facilities
Measurements and observations
Compensation

24 Recruiting Users
Attempt to match the proposed user population
Perhaps divide into several user groups
Techniques
– Posted advertisements
– Internet/email/newsgroup
– Colleagues/friends/classmates
– People already using existing system
Make sure your users are not overly knowledgeable!

25 Testing Facilities
For informal studies, a simple setup is enough
In all cases, privacy is important
Replicate the usage environment?
Evaluator present or not?
More formal studies may use a special usability lab (e.g. McBryde 102 A or C)

26 Measurements & Observations
User comments
General observations
Specific “critical incidents”
Task timing
User errors
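If results were logged as in the hypothetical Specific Tasks sketch, the timing and error measurements above can be summarized per task; the file name and columns below are the same assumed ones.

    # Minimal sketch (assumed CSV layout): per-task mean times and error totals.
    import csv
    from collections import defaultdict
    from statistics import mean

    times = defaultdict(list)
    errors = defaultdict(int)

    with open("task_results.csv", newline="") as f:
        for row in csv.DictReader(f):
            times[row["task"]].append(float(row["seconds"]))
            errors[row["task"]] += int(row["errors"])

    for task, samples in times.items():
        print(f"{task}: mean time {mean(samples):.1f}s, total errors {errors[task]}")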

27 Session Management
May need multiple evaluators
Use checksheets or pre-printed tables for filling in results
Video/audio as backup if something is missed
Don’t ask the user to stop while you catch up!

28 Compensation
Most studies give the user something for their time and effort
Doesn’t have to be monetary – can also be:
– Food
– Extra credit in a class
– Special discounts on the company’s products
– Tour of your facility
– …

