
1 Evaluation in HCI Angela Kessell Oct. 13, 2005

5 Evaluation
– Heuristic Evaluation: a “discount usability engineering method”
– Measuring API Usability: usability applied to APIs
– Methodology Matters: Doing Research in the Behavioral and Social Sciences: designing, carrying out, and evaluating human-subjects studies

8 Heuristic Evaluation (Jakob Nielsen)
Most usability engineering methods will contribute substantially to the usability of an interface … if they are actually used.

11 Heuristic Evaluation
What is it? A discount usability engineering method:
– Easy (can be taught in a half-day seminar)
– Fast (about a day for most evaluations)
– Cheap (e.g. $(4,000 + 600i), where i is the number of evaluators; see the cost sketch below)
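As a rough illustration of the cost figure above, here is a minimal Python sketch. The $4,000 fixed cost and $600 per-evaluator cost are the numbers quoted on the slide (mid-1990s dollars, not current rates); the function name is mine.

```python
# Sketch of the heuristic-evaluation cost estimate quoted on the slide:
# a fixed cost (~$4,000) plus a per-evaluator cost (~$600).

def heuristic_evaluation_cost(num_evaluators: int,
                              fixed_cost: float = 4_000.0,
                              cost_per_evaluator: float = 600.0) -> float:
    """Estimated total cost of a heuristic evaluation with i evaluators."""
    return fixed_cost + cost_per_evaluator * num_evaluators

for i in (1, 3, 5):
    print(f"{i} evaluators: ${heuristic_evaluation_cost(i):,.0f}")
```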

13 Heuristic Evaluation
How does it work?
– Evaluators use a checklist of basic usability heuristics
– Evaluators go through the interface twice: a 1st pass to get a feel for the flow and general scope, and a 2nd pass to work through the checklist of usability heuristics and focus on individual elements
– The findings of the evaluators are combined and assessed (see the sketch below)
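A minimal sketch of the “combine the findings” step, assuming each evaluator reports (heuristic, description) pairs. The data structures and example problems are illustrative, not from Nielsen’s paper; the point is that different evaluators catch different problems, so the union approximates the full set.

```python
# Illustrative sketch: pooling the problem lists that individual evaluators
# produce after their two passes. Each unique problem is mapped to the set
# of evaluators who reported it.

from collections import defaultdict

def combine_findings(per_evaluator_findings):
    """Map each unique (heuristic, description) problem to its reporters."""
    reported_by = defaultdict(set)
    for evaluator, findings in per_evaluator_findings.items():
        for problem in findings:
            reported_by[problem].add(evaluator)
    return reported_by

findings = {
    "eval_1": {("Feedback", "No progress indicator on upload")},
    "eval_2": {("Feedback", "No progress indicator on upload"),
               ("Clearly marked exits", "Modal dialog cannot be dismissed")},
}
for problem, who in combine_findings(findings).items():
    print(problem, "->", sorted(who))
```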

15 Heuristic Evaluation
Usability Heuristics (original, unrevised list):
– Simple and natural dialogue
– Speak the users’ language
– Minimize the users’ memory load
– Consistency
– Feedback
– Clearly marked exits
– Shortcuts
– Precise and constructive error messages
– Prevent errors
– Help and documentation
COMMENTS?

16 Heuristic Evaluation
– One expert won’t do; you need 3–5 evaluators
– The exact number needed depends on a cost-benefit analysis (see the sketch below)
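The cost-benefit reasoning behind “3–5 evaluators” is often illustrated with Nielsen and Landauer’s problem-discovery model. The sketch below assumes that model; the single-evaluator discovery rate of 0.31 is the average Nielsen reports across projects and is used here as an assumption, not a universal constant.

```python
# Sketch of the problem-discovery model behind the "3-5 evaluators" advice.
# Assumed model: found(i) = N * (1 - (1 - L)**i), where L is the chance a
# single evaluator spots any given problem; this computes the fraction.

def fraction_found(i: int, single_evaluator_rate: float = 0.31) -> float:
    """Expected fraction of all usability problems found by i evaluators."""
    return 1 - (1 - single_evaluator_rate) ** i

for i in range(1, 8):
    print(f"{i} evaluators find ~{fraction_found(i):.0%} of problems")
# The curve climbs steeply up to ~5 evaluators (~84%), then flattens, which
# is why adding further evaluators rarely pays off against their cost.
```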

17 Heuristic Evaluation
Who are these evaluators?
– Typically not domain experts / real users
– No official “usability specialist” certification exists
– Optimal performance requires “double experts” (evaluators with both usability and domain expertise)

19 Heuristic Evaluation
Debriefing session:
– Conducted in brainstorming mode
– Evaluators rate the severity of all problems identified (a rating-aggregation sketch follows this slide)
– Uses an absolute 0–4 scale:
  0 – I don’t agree that this is a problem at all
  1 – Cosmetic problem only
  2 – Minor problem – low priority
  3 – Major problem – high priority
  4 – Usability catastrophe – imperative to fix
COMMENTS?
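A minimal sketch of how the independent 0–4 severity ratings might be pooled, assuming the common practice of averaging ratings across evaluators (a single evaluator’s rating is unreliable). The problems and scores are made up for illustration.

```python
# Illustrative sketch: combine independent 0-4 severity ratings by taking
# the mean across evaluators, then rank problems by mean severity.

from statistics import mean

ratings = {
    "No undo after delete":       [4, 3, 4],
    "Inconsistent button labels": [2, 1, 2],
    "Jargon in error messages":   [3, 2, 3],
}

for problem, scores in sorted(ratings.items(),
                              key=lambda kv: mean(kv[1]), reverse=True):
    print(f"{mean(scores):.1f}  {problem}")
```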

21 Heuristic Evaluation
How does H.E. differ from user testing?
– Evaluators have checklists
– Evaluators are not the target users
– Evaluators decide on their own how they want to proceed
– The observer can answer evaluators’ questions about the domain or give hints for using the interface
– Evaluators say what they didn’t like and why; the observer doesn’t interpret evaluators’ actions

23 Heuristic Evaluation
What are the shortcomings of H.E.?
– It identifies usability problems without indicating how they are to be fixed: “Ideas for appropriate redesigns have to appear magically in the heads of designers on the basis of their sheer creative powers.”
– It cannot be expected to address all usability issues when the evaluators are not domain experts / actual users

24 Measuring API Usability Steven Clarke

25 Measuring API Usability
– User-centered design approach: understand both your users and the way they work
– Scenario-based design approach: ensures the API reflects the tasks that users want to perform
– Use the Cognitive Dimensions framework

27 Measuring API Usability
The Cognitive Dimensions framework describes:
– What users expect
– What the API actually provides
The framework provides:
– A common vocabulary for developers
– Attention drawn to important aspects of the API
The dimensions: abstraction level, learning style, working framework, work-step unit, progressive evaluation, premature commitment, penetrability, API elaboration, API viscosity, consistency, role expressiveness, domain correspondence
COMMENTS?

29 Measuring API Usability
Use personas:
– Profiles describing the stereotypical behavior of three main developer groups (opportunistic, pragmatic, systematic)
– Compare the API evaluation with the profile requirements (see the sketch below)
COMMENTS?
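A hedged sketch of Clarke’s persona comparison: rate the API on each cognitive dimension, then diff that profile against what each persona needs. The dimension names come from the framework above; the numeric scale, example values, and variable names are assumptions for illustration only.

```python
# Illustrative sketch: compare an API's cognitive-dimension profile against
# persona requirement profiles. Ratings use an assumed 1-5 scale.

api_profile = {"Abstraction level": 4, "Learning style": 2, "Penetrability": 3}

persona_requirements = {
    "Opportunistic": {"Abstraction level": 5, "Learning style": 2, "Penetrability": 2},
    "Systematic":    {"Abstraction level": 2, "Learning style": 4, "Penetrability": 5},
}

for persona, wants in persona_requirements.items():
    # Positive gap: API exceeds the persona's need; negative: falls short.
    gaps = {dim: api_profile[dim] - want
            for dim, want in wants.items() if api_profile[dim] != want}
    print(persona, "mismatches:", gaps or "none")
```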

30 Methodology Matters: Doing Research in the Behavioral and Social Sciences Joseph McGrath

31 Methodology Matters: Doing Research in the Behavioral and Social Sciences
Key points:
– All methods are valuable, but all have limitations/weaknesses
– Offset the weaknesses by using multiple methods

33 Methodology Matters: Doing Research in the Behavioral and Social Sciences
In conducting research, try to maximize:
– Generalizability
– Precision
– Realism
You cannot maximize all three simultaneously.

34 Methodology Matters: Doing Research in the Behavioral and Social Sciences From http://pages.cpsc.ucalgary.ca/%7Esaul/hci_educ_papers/bgbg95/mcgrath-summary.pdf

35 So…
– The first two papers focus on evaluating computer programs / GUIs
– The third paper presents the whole gamut of methodologies available for studying any human behavior

37 But… what’s missing?
– Where are the statistics?
– Are there objective “right” answers in HCI?
– How do we evaluate other kinds of interfaces?
– Other thoughts on what’s missing?

38 How do we evaluate…
“Embodied virtuality” / ubiquitous computing “interfaces” (Aura video: http://www.cs.cmu.edu/~aura/)
Try to pick out one capability presented, and think about how you might evaluate it.

39 Evaluating Aura
– Do we evaluate the whole system at once, or bit by bit?
– Where / what is the interface?
– Is anyone not a target user?

40 (Closing cartoon) From http://www.usability.uk.com/images/cartoons/cart5.htm

