USABILITY AND EVALUATION: Motivations and Methods

Motivations  Define metrics for user performance with new tools, interfaces, visualizations, etc.  Verify scientific, innovative contributions.  Reduce the cost of redesigning a product.

Ideal  Come up with predictive theories like Fitts's Law so we won't need to run user studies at all

Performance  New tools, user interfaces (graphical or not), and visualizations require users to:  perceive,  interpret, and  execute tasks.  Performance is measured in:  Time  Accuracy  Recall  Satisfaction
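Time and accuracy, the first two metrics above, can be computed directly from trial logs. A minimal sketch, with made-up trial data:

```python
from statistics import mean

# Each trial is (completion_time_seconds, success_flag) -- hypothetical log data.
trials = [(12.3, True), (9.8, True), (15.1, False), (11.0, True)]

times = [t for t, _ in trials]
accuracy = sum(ok for _, ok in trials) / len(trials)  # fraction of successful trials
mean_time = mean(times)                               # average completion time (s)
```

Recall and satisfaction, by contrast, typically come from post-test questionnaires rather than logs.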

Overlaps  Cognitive Psychology: the study of how people think, perceive, remember, speak, and solve problems. Adopts a very empirical, scientific study method.  Cultural and Social Anthropology: investigates the effects of social and cultural norms on individual behavior. Field studies are a common research method.  Schools of Information (iSchools), Graphic Design, Communications, Marketing

Usability in HCI  Very empirical: carefully designed controlled experiments, each designed to verify a hypothesis. Example hypothesis: "Users will perform task T faster when they use technique A instead of technique B."

Task  Know thy user!  Know thy task!  Most complex tasks are compositions of simple building-block tasks.  Sorting documents: access individual documents (point, select, click) -> read titles -> categorize (re-label, change location, etc.)

Scenario-Based Usability Tests  Have users carry out the identified tasks within a convincing scenario!  Hard to achieve:  the nature of a controlled experiment requires as few uncontrolled variables as possible, whereas a convincing scenario requires complexity.

Designing and Running an Experiment  Identify the hypothesis  Identify the tasks  Design your tool, interface, or visualization after these stages, or at least revisit your initial design  Identify dependent and independent variables  Within- vs. between-subjects designs  Randomization  Demographics
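The within- vs. between-subjects and randomization steps above can be sketched in code. This is a minimal illustration for a two-condition design; the participant IDs, condition names, and fixed seed are all assumptions for the example:

```python
import random

conditions = ["A", "B"]  # the two techniques being compared

def assign_between(participants, seed=0):
    """Between-subjects: each participant sees exactly one condition,
    with a balanced pool shuffled into a random assignment."""
    rng = random.Random(seed)  # fixed seed here only for reproducibility
    pool = conditions * (len(participants) // len(conditions))
    rng.shuffle(pool)
    return dict(zip(participants, pool))

def orders_within(participants):
    """Within-subjects: every participant sees all conditions; rotate the
    presentation order per participant to counterbalance ordering effects."""
    k = len(conditions)
    return {p: conditions[i % k:] + conditions[:i % k]
            for i, p in enumerate(participants)}

people = ["p1", "p2", "p3", "p4"]
between = assign_between(people)  # one condition per participant
within = orders_within(people)    # all conditions, alternating order
```

Counterbalancing the order (rather than giving everyone A-then-B) prevents learning and fatigue effects from being confounded with the technique.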

Lab Study

Evaluate the results of your evaluation  Statistical analysis  ANOVA  Chi-square tests  Regression  …
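For a comparison of task times across techniques, the one-way ANOVA F statistic listed above can be computed by hand. A minimal stdlib-only sketch; the timing data are invented for illustration:

```python
from statistics import mean

def one_way_anova_f(groups):
    """One-way ANOVA: F = (between-group mean square) / (within-group mean square)."""
    grand = mean(x for g in groups for x in g)
    k = len(groups)                      # number of groups
    n = sum(len(g) for g in groups)      # total observations
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical task completion times (s) under techniques A and B:
f_stat = one_way_anova_f([[10, 12, 11], [14, 15, 16]])
```

In practice one would then look up (or compute) the p-value for this F statistic; libraries such as SciPy package the whole test, but the arithmetic is just the sums of squares shown here.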

Limits of Controlled Studies  Limitations: hard to measure enjoyment, creativity  "our tool let people discover new things … encouraged them to try things that are not recommended by their friends …"  Alternatives:  Qualitative methods:  Think-aloud protocols  Counting a-ha! moments  Longitudinal studies  Interviews  Surveys  Focus groups

Analyzing Qualitative Data  Easier to collect, harder to interpret.  Quantitative analysis can be applied to qualitative data.
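One simple way to apply quantitative analysis to qualitative data is to tally coded observations, such as the a-ha! moments mentioned earlier. A minimal sketch; the coding scheme and coded segments are hypothetical:

```python
from collections import Counter

# Hypothetical codes assigned to think-aloud segments during analysis:
codes = ["confusion", "a-ha", "confusion", "praise", "a-ha", "confusion"]

tally = Counter(codes)                               # frequency of each code
total = sum(tally.values())
proportions = {c: n / total for c, n in tally.items()}  # share of each code
```

Once coded data are reduced to counts like these, the statistical machinery from the previous slide (e.g., chi-square tests on code frequencies) becomes applicable.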

Reporting: Writing the Paper  Whatever you do, what really matters is how you present it.  A quantitative experiment is easier to report.  Make sure you don't draw a "big" conclusion from scant evidence or weak results.  On the other hand, you do have to emphasize the importance of your findings.