
Empirical Methods in Human-Computer Interaction



Presentation transcript:

1 Empirical Methods in Human-Computer Interaction

2 Empirical methods in HCI
Where do good designs come from?
- Observation
- Experience
- Experiments

3 UCSD: Iterative Design (a cycle: DESIGN → TEST → DESIGN → …)

4 Evolutionary design vs. radical new designs

5-7 Empirical methods in HCI
- Task analysis*: Ethnographic & other observations
- Requirements analysis
- Rapid prototyping, scenarios, story boards
- Simulation/Wizard of Oz studies
- Heuristic evaluation; cognitive walkthroughs (by experts)
- Usability testing & user studies (qualitative & quantitative)
- Controlled experiments
*done first!

8 These methods share some of the same measures. Often, the best projects use several methods in combination! Good design requires iterating between design and observation (or testing).

9 User-Centered System Design Task analysis tells us how people currently accomplish a task. Requirements analysis tells us what a system should do. Usability testing tells us whether a system performs acceptably when a user tries to carry out certain tasks. User-Centered System Design brings these things together.

10 Methods for task analysis (cont.)
- Questionnaires
- Interviews
- Ethnographic observation
- Verbal protocols
- Formal models and notations (GOMS) (Hierarchical task analysis)
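Among the formal models listed above, GOMS has a well-known simplified variant, the Keystroke-Level Model (KLM), which predicts an expert's task time by summing standard operator times published by Card, Moran & Newell. A minimal sketch (the task sequence below is a hypothetical example, not from the slides):

```python
# Keystroke-Level Model (KLM): predict expert task-completion time by
# summing standard operator times (values from Card, Moran & Newell).
KLM_OPERATORS = {
    "K": 0.20,  # keystroke (average skilled typist)
    "P": 1.10,  # point at a target with the mouse
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
    "B": 0.10,  # press or release a mouse button
}

def klm_time(sequence):
    """Total predicted time (seconds) for a sequence of KLM operators."""
    return sum(KLM_OPERATORS[op] for op in sequence)

# Hypothetical task: move hand to mouse, think, point at a menu item, click.
task = ["H", "M", "P", "B", "B"]
print(round(klm_time(task), 2))  # 3.05 seconds
```

Models like this let a designer compare interface variants on paper, before any usability testing.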

11 Verbal protocols
- pioneered by psychologists studying problem-solving
- have people “think out loud” as they do some activity, step by step
- Advantage: can get at some of the invisible steps that people go through on the way to a solution, steps that aren’t evident in their behavior.

12 Task & requirements analysis, usability testing These are pragmatic activities that require observations and systematic analyses. BUT: they’re not the same thing as the scientific method! How do they differ?

13 A note on scientific method: two important steps:
1. Observing and describing
2. Testing theories and hypotheses
HCI specialists get many useful principles and solutions from what they see users do (#1), not only from theories (#2). But they sometimes test theories.

14 Ethnographic observation: very different from controlled observations in the laboratory! The observer looks at what people do in real life, recording data in great detail, and then tells a story rather than quantifying the data.

15-20 Ethnographic observation vs. experiments
Ethnographic studies:
- study behavior taking place naturally
- fewer observations
- very rich observations
- no hypotheses
- results may differ; speculative
- contain confounds
Experiments:
- study behavior during a controlled task
- many observations
- limited observations
- hypothesis-testing
- reliable results; scientific, replicable
- eliminates confounds
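The hypothesis-testing side of this comparison usually comes down to a statistical test on the measurements from each condition. A minimal sketch of one common choice, Welch's two-sample t statistic, on made-up task-completion times for two hypothetical interface designs:

```python
import statistics

def welch_t(a, b):
    """Welch's two-sample t statistic for comparing two group means."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)  # sample variances
    return (ma - mb) / ((va / len(a) + vb / len(b)) ** 0.5)

# Hypothetical task-completion times (seconds) under two designs.
design_a = [41, 38, 45, 39, 42, 40]
design_b = [47, 52, 49, 46, 50, 48]

t = welch_t(design_a, design_b)
print(round(t, 2))  # large negative t: design A is reliably faster here
```

In practice one would use a ready-made routine such as `scipy.stats.ttest_ind` (with `equal_var=False`) to also get the p-value; the point is only that controlled experiments yield data this kind of test can be run on.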

21 Scenarios or story boards

22
- Write a story or dialog of a sample interaction*
- Draw key frames (as in animation)
- Act out functionality; role play
*Not unlike what a telephone speech dialog designer would do…

23 Scenarios or story boards, PROS and CONS:
- don’t require programming
- require readers or “users” to use their imaginations
- may fail to convey the interactive aspects of a design
- may fail to find problems with a design
- are often good for getting started

24 Rapid prototyping
- risky if based on designer’s intuitions
- works well combined w/ user studies
- observe naive users using the prototype
- Examples of prototyping languages: HyperCard, Director, Smalltalk, Logo, LISP, HTML, VoiceXML or Java
- Pro: system need not be finished
- Con: can’t thoroughly test system

25 Wizard of Oz studies - laboratory studies of simulated systems The experimenter intercepts the subject’s input to the computer and may provide responses as if they were coming from the computer.

26 Wizard of Oz - Pros & Cons
- test something without having to build it
- can be difficult to run
- you need a consistent set of response rules to avoid bias
- “system’s” reaction times may be slow
- sometimes subjects catch on
- can control what the user experiences
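The "consistent set of response rules" from the slide above can literally be written down as a lookup table the human wizard works from, so every subject gets the same simulated behavior. A minimal sketch (the keywords and canned replies are hypothetical):

```python
# Wizard-of-Oz sketch: the "system" the subject sees is really a human
# wizard choosing from a fixed response table, which keeps the simulated
# behavior consistent across subjects and helps avoid bias.
RESPONSE_RULES = {
    "weather": "Tomorrow will be sunny with a high of 20 degrees.",
    "time": "It is 3:15 PM.",
}
FALLBACK = "Sorry, I did not understand that."

def wizard_respond(subject_input):
    """Return the canned response whose keyword appears in the input."""
    text = subject_input.lower()
    for keyword, response in RESPONSE_RULES.items():
        if keyword in text:
            return response
    return FALLBACK

print(wizard_respond("What's the weather like tomorrow?"))
```

A real setup would also log each exchange with timestamps, since the wizard's (slow) reaction time is itself one of the confounds the slide warns about.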

27 Usability testing

28
- evaluation of an existing system or prototype
- less formal than a laboratory study, more formal than just asking users what they think
- for instance, watch people use a prototype to do an assigned task (user studies)

29-30 Usability testing assesses:
- Performance (speed, errors, tasks)
- Learnability (How long does it take to get started? to become an expert? What is forgotten between sessions?)
- User satisfaction (both by self report and by behavior)
- Extensibility (can system be tailored?)
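The performance measures above (speed, errors, task completion) reduce to simple summary statistics over logged test sessions. A minimal sketch, where the log format and field names are hypothetical:

```python
import statistics

# Hypothetical usability-test log: one record per user per assigned task.
sessions = [
    {"user": 1, "task": "search", "seconds": 42.0, "errors": 1, "completed": True},
    {"user": 2, "task": "search", "seconds": 55.5, "errors": 3, "completed": True},
    {"user": 3, "task": "search", "seconds": 71.0, "errors": 2, "completed": False},
]

def summarize(records):
    """Speed, error, and completion measures for one set of sessions."""
    times = [r["seconds"] for r in records if r["completed"]]
    return {
        "mean_time_completed": statistics.mean(times),   # speed
        "mean_errors": statistics.mean(r["errors"] for r in records),
        "completion_rate": sum(r["completed"] for r in records) / len(records),
    }

print(summarize(sessions))
```

Learnability would be measured the same way by comparing these numbers across a user's first, second, and later sessions; satisfaction needs a separate self-report instrument.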

31 User studies Systematic user testing, often having each user do the same task(s).

32 User studies (Gomoll, 1990)
- Set up observation (tasks, users, situation)
- Describe the evaluation’s purpose
- Tell user she can quit at any time
- Introduce equipment
- Explain how to “think aloud”
- Explain that you will not provide help
- Describe tasks and system
- Ask for questions
- Conduct the observations (debrief the subject)
- Summarize results

33 Writing for the Internet (Nielsen): How users read on the Web. (Read about the different variables that influence readability; follow the links to the full report of the study.)

