Computer Science Department, California Polytechnic State University, San Luis Obispo, CA, U.S.A.
Franz J. Kurfess
CPE/CSC 484: User-Centered Design and Development

2 © Franz J. Kurfess
Logistics
❖ Assignments
  – A4 - Data Collection: trial runs complete
  – A4 presentation schedule: Team 3 today, rest Thu during the lab time
  – A5 - Usability Evaluation: selection of systems to evaluate
❖ Guest presentation on Universal Design and ADA compliance on Thu, May 17
  – Trey Duffy, John Lee, Disability Resource Center
❖ HCI Lab Opening Ceremony on Thu, May 31, 9: :00 am
  – poster boards, demos from 484 teams?
❖ CEng Project Fair Thu, May 31, 4:00 - 7:00 pm
  – final project displays
❖ CSC IAB presentations Fri, June 1, 10: :00
  – 20 min project presentations

3 © Franz J. Kurfess
Logistics
❖ Term Project
  – opening ceremony ("ribbon cutting") on Thu, May 31, 9: :00
  – guests
  – student presentations
❖ Research Activity
  – status update
  – final version due on Thu, May 24
  – presentations in class/lab
  – include your experiences with blog, video, etc. as medium
❖ Talk by Doug Harr, SPLUNK, on "Big Data"
  – Thu, May 17, 11:10 am, Philips Hall, PAC

© Franz J. Kurfess
Copyright Notice
These slides are a revised version of the originals provided with the book "Interaction Design" by Jennifer Preece, Yvonne Rogers, and Helen Sharp (Wiley). I added some material, made some minor modifications, and created a custom show to select a subset.
❖ Slides added or modified by me use a different style, with my name at the bottom

©2011 www.id-book.com
Evaluation studies: From controlled to natural settings
Chapter 14

6 © Franz J. Kurfess
Motivation
❖ Practical examples are often a good source of information
❖ Communication is an interesting domain since it can use different methods and technologies

7 © Franz J. Kurfess
Objectives
❖ Learn from practical projects how design and evaluation are brought together in the development of interactive products
❖ Compare different combinations of design and evaluation methods, and how they are used in practice
❖ Identify examples of design trade-offs and decisions for real-world products

©2011 www.id-book.com
The aims:
– Explain how to do usability testing
– Outline the basics of experimental design
– Describe how to do field studies

©2011 www.id-book.com
Usability testing
– Involves recording performance of typical users doing typical tasks.
– Controlled settings.
– Users are observed and timed.
– Data is recorded on video & key presses are logged.
– The data is used to calculate performance times, and to identify & explain errors.
– User satisfaction is evaluated using questionnaires & interviews.
– Field observations may be used to provide contextual understanding.
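
A minimal sketch of what the logging and timing described above could look like in practice. This is not from the book; Python, the event names, and the convention of having the observer flag errors are all illustrative assumptions.

    import time

    class SessionLog:
        """Per-participant, per-task event log for a usability test session."""

        def __init__(self, participant_id, task_id):
            self.participant_id = participant_id
            self.task_id = task_id
            self.events = []  # list of (timestamp, event_type, detail) tuples

        def record(self, event_type, detail=""):
            self.events.append((time.time(), event_type, detail))

        def task_time(self):
            # Seconds between the 'task_start' and 'task_end' events.
            start = next(t for t, e, _ in self.events if e == "task_start")
            end = next(t for t, e, _ in self.events if e == "task_end")
            return end - start

        def error_count(self):
            # Errors are whatever the observer explicitly flagged as such.
            return sum(1 for _, e, _ in self.events if e == "error")

    # Example use during a session:
    log = SessionLog(participant_id="P03", task_id="checkout")
    log.record("task_start")
    log.record("keypress", "search field")
    log.record("error", "selected wrong menu item")
    log.record("task_end")
    print(f"Task time: {log.task_time():.1f} s, errors: {log.error_count()}")

In a real study the raw log would usually be synchronized with the video recording so that each flagged error can be reviewed and explained afterwards.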

©2011 www.id-book.com
Experiments & usability testing
– Experiments test hypotheses to discover new knowledge by investigating the relationship between two or more things, i.e., variables.
– Usability testing is applied experimentation.
– Developers check that the system is usable by the intended user population for their tasks.
– Experiments may also be done in usability testing.

©2011 www.id-book.com
Usability testing & research

Usability testing:
– Improve products
– Few participants
– Results inform design
– Usually not completely replicable
– Conditions controlled as much as possible
– Procedure planned
– Results reported to developers

Experiments for research:
– Discover knowledge
– Many participants
– Results validated statistically
– Must be replicable
– Strongly controlled conditions
– Experimental design
– Scientific report to scientific community

©2011 www.id-book.com
Usability testing
– Goals and questions focus on how well users perform tasks with the product
– Comparison of products or prototypes is common
– Focus is on:
  – time to complete task
  – number of errors
  – type of errors
– Data collection: video and interaction logging
– Testing is central
– User satisfaction: questionnaires and interviews provide data about users' opinions

©2011 www.id-book.com Usability lab with observers watching a user & assistant

©2011 www.id-book.com Portable equipment for use in the field

©2011 www.id-book.com Mobile head-mounted eye tracker. Picture courtesy of SensoMotoric Instruments (SMI), copyright 2010

©2011 www.id-book.com
Testing conditions
– Usability lab or other controlled space.
– Emphasis on:
  – selecting representative users;
  – developing representative tasks.
– 5-10 users typically selected.
– Tasks usually last no more than 30 minutes.
– The test conditions should be the same for every participant.
– Informed consent form explains procedures and deals with ethical issues.

©2011 www.id-book.com
Some types of data
– Time to complete a task.
– Time to complete a task after a specified time away from the product.
– Number and type of errors per task.
– Number of errors per unit of time.
– Number of navigations to online help or manuals.
– Number of users making a particular error.
– Number of users completing a task successfully.
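
To make the list above concrete, here is a small sketch that computes several of these measures from a per-participant results table. The field names and sample values are invented for illustration; they are not from the book.

    from statistics import mean

    # Invented per-participant results for a single task.
    results = [
        {"id": "P01", "time_s": 182, "errors": 2, "completed": True,  "help_uses": 1},
        {"id": "P02", "time_s": 240, "errors": 5, "completed": False, "help_uses": 3},
        {"id": "P03", "time_s": 150, "errors": 0, "completed": True,  "help_uses": 0},
    ]

    mean_time = mean(r["time_s"] for r in results)                           # time to complete a task
    errors_per_minute = [r["errors"] / (r["time_s"] / 60) for r in results]  # errors per unit of time
    n_successful = sum(r["completed"] for r in results)                      # users completing successfully
    n_used_help = sum(1 for r in results if r["help_uses"] > 0)              # users navigating to help/manuals

    print(f"Mean completion time: {mean_time:.0f} s")
    print(f"Errors per minute:    {[round(e, 2) for e in errors_per_minute]}")
    print(f"Completed the task:   {n_successful}/{len(results)} participants")
    print(f"Consulted help:       {n_used_help}/{len(results)} participants")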

©2011 www.id-book.com
Usability engineering orientation
– Aim is improvement with each version.
– Current level of performance.
– Minimum acceptable level of performance.
– Target level of performance.

©2011 www.id-book.com
How many participants is enough for user testing?
– The number is a practical issue.
– Depends on:
  – schedule for testing;
  – availability of participants;
  – cost of running tests.
– Typically 5-10 participants.
– Some experts argue that testing should continue until no new insights are gained.

©2011 www.id-book.com
Activity: Name 3 features for each that can be tested by usability testing
– iPad

©2011 www.id-book.com
Experiments
– Test hypotheses that predict the relationship between two or more variables.
– The independent variable is manipulated by the researcher.
– The dependent variable depends on the independent variable.
– Typical experimental designs have one or two independent variables.
– Validated statistically & replicable.
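
As a concrete illustration of the statistical validation mentioned above, the sketch below compares task completion times (the dependent variable) under two interface conditions (the independent variable) with an independent-samples t-test. The interface names and timing data are made up, and SciPy is assumed to be available; a real experiment would also check the test's assumptions (e.g., normality and equal variances).

    from scipy import stats

    # Independent variable: interface (A vs. B), manipulated by the experimenter.
    # Dependent variable: task completion time in seconds, measured per participant.
    times_interface_a = [182, 150, 171, 205, 168, 190, 177, 160]
    times_interface_b = [140, 135, 162, 150, 128, 145, 158, 137]

    # Different-participants (between-subjects) data -> independent-samples t-test.
    t_stat, p_value = stats.ttest_ind(times_interface_a, times_interface_b)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

    # A small p-value (conventionally < 0.05) suggests that the difference in mean
    # completion time between the two interfaces is unlikely to be due to chance alone.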

©2011 www.id-book.com
Experimental designs
– Different participants: a single group of participants is allocated randomly to the experimental conditions.
– Same participants: all participants appear in both conditions.
– Matched participants: participants are matched in pairs, e.g., based on expertise, gender, etc.
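
The three designs can also be read as allocation procedures. The sketch below shows one plausible way to assign participants under each design; the participant IDs, condition names, and the expertise scores used for matching are placeholders, not part of the book's material.

    import random

    participants = [f"P{i:02d}" for i in range(1, 13)]
    conditions = ["interface_A", "interface_B"]

    # Different participants (between-subjects): random allocation to one condition each.
    shuffled = participants[:]
    random.shuffle(shuffled)
    half = len(shuffled) // 2
    between = {"interface_A": shuffled[:half], "interface_B": shuffled[half:]}

    # Same participants (within-subjects): everyone does both conditions;
    # the order is counterbalanced to reduce learning and order effects.
    within_order = {
        p: conditions if i % 2 == 0 else list(reversed(conditions))
        for i, p in enumerate(participants)
    }

    # Matched participants: rank by a relevant attribute (a made-up expertise score),
    # pair neighbours, then split each pair across the two conditions.
    expertise = {p: random.randint(1, 10) for p in participants}
    ranked = sorted(participants, key=lambda p: expertise[p])
    pairs = [(ranked[i], ranked[i + 1]) for i in range(0, len(ranked), 2)]
    matched = {"interface_A": [a for a, _ in pairs],
               "interface_B": [b for _, b in pairs]}

    print(between)
    print(within_order["P01"], within_order["P02"])
    print(pairs[:3])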

©2011 www.id-book.com Different, same, and matched participant designs

©2011 www.id-book.com
Field studies
– Field studies are done in natural settings.
– "In the wild" is a term for prototypes being used freely in natural settings.
– Aim to understand what users do naturally and how technology impacts them.
– Field studies are used in product design to:
  – identify opportunities for new technology;
  – determine design requirements;
  – decide how best to introduce new technology;
  – evaluate technology in use.

©2011 www.id-book.com
Data collection & analysis
– Observation & interviews
  – Notes, pictures, recordings
  – Video
  – Logging
– Analysis
  – Data is categorized
  – Categories can be provided by theory, e.g. grounded theory, activity theory

©2011 www.id-book.com
Data presentation
– The aim is to show how the products are being appropriated and integrated into their surroundings.
– Typical presentation forms include: vignettes, excerpts, critical incidents, patterns, and narratives.

©2011 www.id-book.com UbiFit Garden: an "in the wild" study

©2011 www.id-book.com
Key points
– Usability testing is done in controlled conditions.
– Usability testing is an adapted form of experimentation.
– Experiments aim to test hypotheses by manipulating certain variables while keeping others constant.
– The experimenter controls the independent variable(s) but not the dependent variable(s).
– There are three types of experimental design: different-participants, same-participants, and matched-participants.
– Field studies are done in natural environments.
– "In the wild" is a recent term for studies in which a prototype is freely used in a natural setting.
– Typically, observation and interviews are used to collect field study data.
– Data is usually presented as anecdotes, excerpts, critical incidents, patterns, and narratives.