
Slide 1: User Studies Methods (Feb 01, 2007)
Usable Privacy and Security, Carnegie Mellon University, Spring 2007, Cranor/Hong
http://cups.cs.cmu.edu/courses/ups-sp07/

Slide 2: Case Studies
- Chameleon

Slide 3: Case Study: Chameleon
- Design proposal introducing a new user interface metaphor

Slide 4: Case Study: Chameleon, Iterative Design
- Paper prototype -> Visual Basic -> Implementation
- Increasingly refined prototypes
- Evaluation of each prototype

Slide 5: Chameleon Study #1
- Understand feasibility of the basic idea
  - How people used security features
  - Explicit vs. implicit role switching
- Used a paper prototype
- Recruited 10 people from campus
  - Unclear, but presumably typical users without extensive computer experience

Slide 6: Chameleon Study #1
- "We recruited 10 people from around our campus to use the paper prototype while we observed them and listened to their comments about what they found confusing, easy, difficult, helpful, etc."
- "Participants also filled out a web-based questionnaire about their experiences using the prototype"

Slide 7: Chameleon Study #1
- Fairly typical of an early formative study
  - Formative means early stages of design
  - Summative means later stages (timing data)
- Lots of qualitative feedback
  - Useful for early stages
  - Should be able to notice major issues without extensive analysis
- A little unclear what the tasks were
  - Specific tasks to understand usability
  - Freeform tasks to understand utility

Slide 8: Chameleon Study #1
- Web survey useful too
  - Lots of positive and negative comments
  - Always a good idea to do a survey
- Helped flesh out major issues
  - Switching roles needed to be improved
  - User motivation issues
  - Names of roles

Slide 9: Chameleon Study #1
- Comment: good to show alternative designs after such a study
- People are not as good at evaluating a single design; better to show alternatives and have them compare the differences

Slide 10: Chameleon Study #2
- Drilling down on the UI: how people should perform key operations
  - Ex.: moving a file from one role to another
- Roughly three designs per operation
- Within-subjects design (each person tries all)
- How to address learning effects? (see the counterbalancing sketch below)
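
A brief aside on the learning-effects question above: this is a minimal Python sketch, not from the lecture, of one common remedy, counterbalancing the order in which each participant tries the alternative designs. The design labels are made up for illustration.

```python
# Counterbalancing sketch: generate a cyclic Latin square of condition orders
# so that, across participants, each design appears in each serial position
# equally often. This dampens order/learning effects in a within-subjects study.

def rotated_orders(conditions):
    """Return one rotated order per condition; each condition occupies each
    serial position exactly once across the returned orders."""
    n = len(conditions)
    return [[conditions[(i + j) % n] for j in range(n)] for i in range(n)]

designs = ["drag-and-drop", "context menu", "wizard"]  # hypothetical labels
orders = rotated_orders(designs)

# Assign participants to orders round-robin so each order is used equally often.
for participant in range(9):
    print(f"Participant {participant + 1}: {orders[participant % len(orders)]}")
```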

Slide 11: Chameleon Study #3
- Visual Basic prototype
  - A more refined prototype let them study issues in more depth than was possible with paper
- Injected an "attack": a window that appeared to be in one role but was actually in another
  - One issue with security studies is timing: you may want people to become comfortable and then see if they notice and how they react
  - Few participants noticed

Slide 12: Chameleon: General Comments
- Start simple and with the big issues first
  - Progressively refine the prototypes
  - Don't drill down to small issues until needed
- UI design studies should inject an attack
  - See whether people notice
  - Can try various UIs to compare effectiveness

Slide 13: Kazaa File Sharing Study
- Good and Krekelberg, CHI 2003
- Could people understand what files were downloadable by others?
- Found lots of people sharing inbox.dbx
- Found that some people were downloading a fake inbox.dbx file

Slide 14: Kazaa Cognitive Walkthrough
- Cognitive walkthrough: put yourself in the shoes of users and try to use the interface from their perspective
  - Somewhat effective approach; depends on the evaluator's ability to see other perspectives
- Problem 1: Multiple names for similar things
  - My Shared Folder: a folder + all shared files
  - My Media: all shared files, by media type
  - My Kazaa: all shared files, by media type
  - Folder for downloaded files: root folder of all shared files

Slide 15: Kazaa Cognitive Walkthrough
- Problem 2: Downloaded files are also shared files
- Problem 3: Kazaa recursively shares folders

Slide 16: Kazaa Cognitive Walkthrough
- Problem 4: You can select a folder to share, but what files are inside it? An error-prone approach, and riskier still with recursive folder sharing (see the sketch below).
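
A minimal sketch, not from the Good and Krekelberg study, of why Problems 3 and 4 are dangerous in combination: selecting one folder silently shares everything beneath it, including files the user never thought about. The paths and extensions below are hypothetical.

```python
# Enumerate everything that would be exposed by recursively sharing one folder.
import os

def files_shared(shared_root):
    """Return every file under shared_root, i.e. everything a recursive share exposes."""
    exposed = []
    for dirpath, _dirnames, filenames in os.walk(shared_root):
        exposed.extend(os.path.join(dirpath, name) for name in filenames)
    return exposed

# If the user points the client at their home directory (or C:\), mail stores
# and documents come along for the ride.
for path in files_shared(os.path.expanduser("~")):
    if path.lower().endswith((".dbx", ".pst")):  # e.g. Outlook Express inbox.dbx
        print("unintentionally shared:", path)
```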

Slide 17: Kazaa Cognitive Walkthrough
- Note: Kazaa gives a one-time warning if you select an entire hard drive

Slide 18: Kazaa Cognitive Walkthrough
- Problem 5: Inconsistent views
  - Two UIs for doing similar tasks, but they show different information about the state of the system

Slide 19: Cognitive Walkthrough Discussion
- Fairly effective technique
- May be useful to apply multiple times from multiple perspectives
  - A parent who has things to protect
  - A teen who wants to download music
- May have false positives
- Probably best to do a cognitive walkthrough with multiple people, combine the issues, and triage
  - Importance (not a problem -> catastrophe)
  - Cost (trivial -> major rework)

Slide 20: Kazaa File Sharing Study
- 12 users; 10 had used file sharing before
- Task: figure out what files were being shared by Kazaa
  - Download folder was set to C:\ (i.e., all files)
- Results
  - 5 people thought it was "My Shared Folder", which one UI did suggest

Slide 21: Kazaa File Sharing Study
- 12 users; 10 had used file sharing before
- Task: figure out what files were being shared by Kazaa
  - Download folder was set to C:\ (i.e., all files)
- Results
  - 5 people thought it was "My Shared Folder", which one UI did suggest
  - 2 people used Find Files to find all shared files; this UI had no files checked, so no files shared?

Slide 22: Kazaa File Sharing Study
- Results
  - 5 people thought it was "My Shared Folder", which one UI did suggest
  - 2 people used Find Files to find all shared files; this UI had no files checked, so no files shared?
  - 2 people used help, which said "My Shared Folder"
  - 1 person couldn't figure it out at all
  - Only 2 people got it right

Slide 23: Kazaa File Sharing Study
- 12 participants is a little low, though the results are strong enough to indicate big problems (see the check below)
- Could have tried to verify the cognitive walkthrough issues
- Could have tried to test people's ability to configure the system (defaults are important!)
- Interesting point: had to set up the system to prevent any actual sharing of files
  - We've had similar issues with phishing studies
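
As a back-of-the-envelope justification for "strong enough to indicate big problems", here is a sketch, not from the lecture, of an exact binomial check: if a usable interface let most users succeed (the 50% and 80% benchmarks are assumptions for illustration), seeing only 2 successes out of 12 would be very unlikely.

```python
# Exact binomial tail probability using only the standard library.
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

n, successes = 12, 2
for benchmark in (0.5, 0.8):
    print(f"P(at most {successes} of {n} succeed | p = {benchmark}): "
          f"{binom_cdf(successes, n, benchmark):.4f}")
# Both probabilities are small (about 0.02 at p = 0.5, far smaller at p = 0.8),
# so the failures are hard to attribute to small-sample bad luck alone.
```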

Slide 24: Are people still accidentally sharing files?
- A rough-and-ready experiment by your friendly instructor (2006)
- eMule (open source)
  - Combines eDonkey and Kad file-sharing networks
  - Different from FastTrack (the Kazaa file-sharing network)
- eMule stats
  - Downloaded by over 85 million people
  - 5.3 million people / 633 million files on eDonkey
  - 1.7 million people / 300 million files on Kad

Slides 25-34: [no transcript text; presumably screenshots from the eMule experiment]

Slide 35: eMule File Sharing UI

Slide 36: Putting Them Together
- Lessons from Chameleon + Kazaa
- Examples of how to run user studies
  - Not the most rigorous studies, but good enough to demonstrate the main point
- Examples of mental models
  - Design model, user model, system image

Slide 37: Other General Comments
- Should you inform people that it's a security study?
  - Can't get useful results if they are informed
  - Ethics of not informing people: involves some element of deception
  - Phishing studies are often framed as email studies
  - The golden rule is useful here: treat people as you would like to be treated

Slide 38: Heuristic Evaluation
- Mentioned in "Why Johnny Can't Encrypt"
- Similar to cognitive walkthrough
- Helps find usability problems in a UI design
- Can be performed on a working UI or on sketches
- A small set (3-5) of evaluators examine the UI
  - Independently check for compliance with usability principles ("heuristics")
  - Different evaluators will find different problems
  - Evaluators combine their findings afterwards

Slide 39: Why Multiple Evaluators?
- No single evaluator finds every problem
- Good evaluators find both easy and hard ones
- (The sketch below shows the cost-benefit curve behind the usual 3-5 evaluator recommendation.)
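
The curve behind this slide is usually explained with the Nielsen and Landauer cost-benefit model; here is a minimal sketch of it, with the often-quoted value of roughly 0.3 problems found per evaluator treated as an assumption rather than a measured fact about any particular UI.

```python
# Expected proportion of usability problems found by i independent evaluators,
# assuming each evaluator finds a fraction lam of all problems.

def proportion_found(i, lam=0.3):
    return 1 - (1 - lam) ** i

for i in range(1, 11):
    print(f"{i:2d} evaluator(s): about {proportion_found(i):.0%} of problems")
# With lam around 0.3, three to five evaluators already cover roughly 65-85%
# of the problems, which is why adding more evaluators gives diminishing returns.
```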

Slide 40: Heuristic Evaluation Process
- Evaluators go through the UI several times
  - Inspect various dialogs and screens
  - Compare against heuristics and other usability principles
- "Standard" set of heuristics
- Can also create domain-specific heuristics
  - From competitive analysis and user testing of existing products
- Use violations to redesign/fix problems

Slide 41: Heuristic H2-1
- H2-1: Visibility of system status
  - Keep users informed about what is going on
  - Example: pay attention to response time
    - 0.1 sec: no special indicators needed (why?)
    - 1.0 sec: user tends to lose track of the data
    - 10 sec: maximum duration if the user is to stay focused on the action
    - For longer delays, use percent-done progress bars, e.g. "searching database for matches" (see the sketch below)
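
A minimal sketch, not from the slides, of how the response-time thresholds above translate into a feedback policy; the category names returned here are placeholders, not a real toolkit API.

```python
# Map an estimated operation duration to the kind of feedback the UI should give.
def feedback_for(estimated_seconds):
    if estimated_seconds <= 0.1:
        return "none"                   # feels instantaneous; no indicator needed
    if estimated_seconds <= 1.0:
        return "busy cursor"            # brief pause; keep the user's flow going
    if estimated_seconds <= 10:
        return "indeterminate spinner"  # noticeable wait; show activity
    return "percent-done progress bar"  # long wait, e.g. "searching database for matches"

for t in (0.05, 0.5, 5, 30):
    print(f"{t:>5} sec -> {feedback_for(t)}")
```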

Slide 42: Heuristic H2-2
- H2-2: Match between system and the real world
  - Speak the users' language
  - Follow real-world conventions
- Example: the Mac desktop
  - Dragging a disk to the trash should delete it, not eject it
  - Finally fixed in Mac OS X

Slide 43: Heuristic H2-3
- H2-3: User control and freedom
  - "Exits" for mistaken choices, undo, redo
  - Don't force users down fixed paths
    - Like that BART machine…

Slide 44: Heuristic H2-4
- H2-4: Consistency and standards

Slide 45: Heuristic H2-5
- H2-5: Error prevention

Slide 46: Heuristic H2-6
- H2-6: Recognition rather than recall
  - Make objects, actions, options, and directions visible or easily retrievable

Slide 47: Heuristic H2-7
- H2-7: Flexibility and efficiency of use
  - Accelerators for experts (e.g., gestures, keyboard shortcuts)
  - Allow users to tailor frequent actions (e.g., macros)

Slide 48: Heuristic H2-8
- H2-8: Aesthetic and minimalist design
  - No irrelevant information in dialogues

Slide 49: Heuristic H2-9
- H2-9: Help users recognize, diagnose, and recover from errors
  - Error messages in plain language
  - Precisely indicate the problem
  - Constructively suggest a solution


Slide 51: Heuristic H2-10
- H2-10: Help and documentation
  - Easy to search
  - Focused on the user's task
  - Lists concrete steps to carry out
  - Not too large

Slide 52: Phases of Heuristic Evaluation
1) Pre-evaluation training: give evaluators the needed domain knowledge and information on the scenario
2) Evaluation: individuals find problems, then the group combines them
3) Severity: each person rates severity independently, then the ratings are combined
4) Debriefing: discuss the outcome with the design team

Slide 53: How to Perform Heuristic Evaluation
- At least two passes for each evaluator
  - First to get a feel for the flow and scope of the system
  - Second to focus on specific elements
- If the system is walk-up-and-use or the evaluators are domain experts, no assistance is needed; otherwise supply evaluators with scenarios
- Each evaluator produces a list of problems
  - Explain why, with reference to a heuristic or other information
  - Be specific and list each problem separately

Slide 54: Examples
- Typography uses a mix of upper/lower-case formats and fonts
  - Violates "Consistency and standards" (H2-4)
  - Slows users down
  - Probably wouldn't be found by user testing
  - Fix: pick a single format for the entire interface
- Note: agreeing on the heuristic is not as important as the problem itself

Slide 55: Severity Rating
- Used to allocate resources to fix problems
  - Also estimates the need for more usability effort
- A combination of
  - Frequency (one-time or repeating; few people or lots of people)
  - Impact (minimal to lots)
- Should be calculated after all evaluations are in
- Should be done independently by all judges

Slide 56: Severity Ratings (cont.)
- 0: don't agree that this is a usability problem
- 1: cosmetic problem
- 2: minor usability problem
- 3: major usability problem; important to fix
- 4: usability catastrophe; imperative to fix

Slide 57: Debriefing
- Conduct with evaluators, observers, and development team members
- Discuss general characteristics of the UI
- Suggest potential improvements to address major usability problems
- Dev team rates how hard things are to fix
- Make it a brainstorming session
  - Little criticism until the end of the session

Slide 58: Severity Ratings Example
1. [H1-4 Consistency] [Severity 3] [Fix 0] The interface used the string "Save" on the first screen for saving the user's file, but used the string "Write file" on the second screen. Users may be confused by this different terminology for the same function. (A sketch of recording and combining such findings follows below.)
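
Below is a sketch, not prescribed by the lecture, of one way to record findings in the format shown above and combine the judges' independent severity ratings; the field names, the mean-based aggregation, and the 0-4 fix-cost scale are assumptions.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class Finding:
    heuristic: str                  # e.g. "H1-4 Consistency"
    description: str
    severities: list = field(default_factory=list)  # one 0-4 rating per judge
    fix_cost: int = 0               # 0 (trivial) .. 4 (major rework), assumed scale

    @property
    def severity(self):
        return mean(self.severities) if self.severities else 0.0

findings = [
    Finding("H1-4 Consistency",
            '"Save" on the first screen vs. "Write file" on the second.',
            severities=[3, 3, 4], fix_cost=0),
    Finding("H2-1 Visibility of system status",
            "No progress indication during long searches.",
            severities=[2, 1, 2], fix_cost=2),
]

# Triage: most severe problems first, cheaper fixes breaking ties.
for f in sorted(findings, key=lambda f: (-f.severity, f.fix_cost)):
    print(f"[{f.heuristic}] severity {f.severity:.1f}, fix cost {f.fix_cost}: {f.description}")
```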

Slide 59: HE vs. User Testing
- HE is much faster
  - 1-2 hours per evaluator vs. days to weeks
- HE doesn't require interpreting users' actions
- User testing is far more accurate (by definition)
  - Takes into account actual users and tasks
- HE may miss problems and find "false positives"
- Good to alternate between HE and user testing
  - They find different problems
  - Don't waste participants
