Saul Greenberg
Evaluating Interfaces With Users
Why evaluation is crucial to interface design
General approaches and tradeoffs in evaluation
The role of ethics

Why Bother?
Tied to all parts of the usability engineering lifecycle
Pre-design
-investing in a new, expensive system requires proof of viability
Initial design stages
-develop and evaluate initial design ideas with the user
[diagram: design → implementation → evaluation cycle]

Why Bother? (continued)
Iterative design
-does system behaviour match the user's task requirements?
-are there specific problems with the design?
-can users provide feedback to modify the design?
Acceptance testing
-verify that the human/computer system meets expected performance criteria: ease of learning, usability, user's attitude, performance
-e.g., a first-time user will take 1-3 minutes to learn how to withdraw $50 from the automatic teller
[diagram: design → implementation → evaluation cycle]
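A minimal sketch of how an acceptance criterion like the one above might be checked against observed data. The times, the 3-minute limit, and the 90% pass fraction are illustrative assumptions, not values from the slides.

```python
# Hypothetical acceptance-test check: "a first-time user learns the
# withdrawal task within 3 minutes". All numbers are made up.
import statistics

def meets_criterion(times_sec, limit_sec, required_fraction=0.9):
    """True if at least `required_fraction` of observed task times
    fall within the performance limit."""
    within = sum(1 for t in times_sec if t <= limit_sec)
    return within / len(times_sec) >= required_fraction

# Observed learning times (seconds) for ten first-time users.
times = [95, 120, 150, 80, 170, 140, 110, 160, 130, 100]

print(statistics.mean(times))        # average learning time
print(meets_criterion(times, 180))   # criterion: within 3 minutes
```

A real acceptance test would also state how many users are measured and how the pass threshold was negotiated with the client; the point here is only that the criterion is stated in measurable terms.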

Approaches: Naturalistic
Observation occurs in a realistic setting
-real life, hence ecologically valid
-describes an ongoing process as it evolves over time

Approaches: Experimental
Classical lab study
-study relations by manipulating one or more independent variables
-experimenter controls all environmental factors (nothing else changes)
-observe the effect on one or more dependent variables
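The manipulation/observation structure above can be sketched in code: one independent variable (interface A vs. B), one dependent variable (task completion time), everything else held constant. The data and the use of Welch's t-statistic are illustrative assumptions, not results or methods from the slides.

```python
# Hypothetical two-condition experiment analysis with made-up data.
import math
import statistics

def welch_t(sample_a, sample_b):
    """Welch's t-statistic for two independent samples."""
    ma, mb = statistics.mean(sample_a), statistics.mean(sample_b)
    va, vb = statistics.variance(sample_a), statistics.variance(sample_b)
    se = math.sqrt(va / len(sample_a) + vb / len(sample_b))
    return (ma - mb) / se

# Dependent variable: task completion times (seconds) per condition.
times_a = [41, 38, 45, 40, 43, 39]   # independent variable: interface A
times_b = [52, 49, 55, 50, 48, 53]   # independent variable: interface B

t = welch_t(times_a, times_b)
print(t)   # a large |t| suggests the manipulation affected the outcome
```

In practice one would also compute degrees of freedom and a p-value (e.g., via scipy.stats.ttest_ind), but the shape of the design is the same: vary one thing, measure one thing.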

Tradeoff: Naturalistic vs. Experimental
Internal validity
-do you measure what you set out to measure?
External validity
-degree to which results can be generalized to other situations

                    Naturalistic    Experimental
Internal validity   Low             High
External validity   High            Low

Reliability Concerns
Would the same results be achieved if the test were repeated?
Problem: individual differences
-best user is ~10x faster than the slowest
-best 25% of users are ~2x faster than the slowest 25%
Partial solution
-test a reasonable number and range of users
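A small sketch of why individual differences threaten reliability: with made-up task times, the spread between the fastest and slowest users dwarfs most design effects. The data below are invented for illustration; the slide's 10x and 2x figures describe typical real studies, not this sample.

```python
# Illustrative individual-differences spread with made-up task times.
import statistics

times = sorted([30, 35, 40, 48, 55, 60, 72, 80, 95, 110, 150, 300])
q = len(times) // 4
fastest_q = times[:q]     # best 25% of users
slowest_q = times[-q:]    # slowest 25% of users

ratio_extremes = times[-1] / times[0]
ratio_quartiles = statistics.mean(slowest_q) / statistics.mean(fastest_q)
print(ratio_extremes)     # fastest vs. slowest single user
print(ratio_quartiles)    # fastest vs. slowest quartile
```

With spreads like this, a small sample can easily land on an unrepresentative mix of fast and slow users, which is why a reasonable number and range of participants is the partial fix.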

Validity Concerns
Does the test measure something relevant to the usability of real products in real use, outside the lab?
Some typical validity problems of testing vs. real use
-non-typical users tested
-tasks are not typical tasks
-physical environment differs: quiet lab vs. very noisy open offices vs. interruptions
-social influences differ: motivation towards the experimenter vs. motivation towards the boss
Partial solution
-use real users
-use tasks from task-centered system design
-use an environment similar to the real situation

Ethics
"... and to think that you want me to test it!!!"

Ethics
Testing can be a distressing experience
-pressure to perform, errors inevitable
-feelings of inadequacy
-competition with other subjects
Golden rule
-test participants should always be treated with respect

Managing Participants In An Ethical Manner
Before the test
don't waste the person's time
-use pilot tests to debug experiments, questionnaires, etc.
-have everything ready before the user shows up
make participants feel comfortable
-emphasize that it is the system being tested, not the user
-acknowledge that the software may have problems
-let users know they can stop at any time
maintain privacy
-tell the user that individual test results will be kept completely confidential
inform the user
-explain any monitoring that is being used
-answer all the user's questions (but avoid bias)
only use volunteers
-the user must sign an informed consent form

Managing Participants In An Ethical Manner
During the test
don't waste the person's time
-never have the user perform unnecessary tasks
make test participants comfortable
-try to give the person an early success experience
-keep a relaxed atmosphere in the room (coffee, breaks, etc.)
-hand out test tasks one at a time
-never indicate displeasure with the person's performance
-avoid disruptions
-stop the test if it becomes too unpleasant
maintain privacy
-do not allow the participant's management to observe the test

Managing Participants In An Ethical Manner
After the test
make the person feel comfortable
-state that the participant has helped you find areas of improvement
inform the participant
-answer any questions about the experiment that could have biased the results had they been answered earlier
maintain privacy
-never report results in a way that individuals can be identified
-only show videotapes outside the research group with the participant's permission

You Know Now
Evaluation is crucial for designing, debugging, and verifying interfaces
There is a tradeoff between naturalistic and experimental approaches
-internal and external validity
-reliability
Test participants must be treated with respect
-ethical rules of behaviour