ICS 463, Intro to Human Computer Interaction Design: 8. Evaluation and Data
Dan Suthers

Evaluation
– Formative: early and often; informs design
– Summative: near the end; have we done well?
We focus on formative evaluation.

When and why?
Early in the life cycle:
– understanding the target application
– checking understanding of requirements
– quick filtering of ideas
Middle:
– predicting usability
– comparing alternate designs
– engineering towards a usability target
Later:
– fine-tuning of usability
– verifying conformance to a standard

Preview of Methods of Evaluation
– Collecting users' opinions → attitudes
– Observing and monitoring use → how users interact
– Experiments → hypothesis testing
– Interpretive evaluation → how the artifact is used in natural settings (ecological validity)
– Predictive evaluation → anticipated usability issues

Typical Procedure (formal evaluations)
– Identify questions
– Plan the evaluation
– Pilot the evaluation and revise if needed
– Run the sessions and collect the data
– Analyze the data
– Draw conclusions
– Redesign and revise the system
Details next week (Hix & Hartson).

Dimensions to consider
Evaluation planning should consider:
– characteristics of the users
– types of activities
– environment of use
– nature of the artifact
So should design … from the outset!

Data
Types of data:
– objective versus subjective
– quantitative versus qualitative
What are the independent variables? Dependent variables? Controlled variables? (For example, in a comparison of two menu designs, the design is an independent variable and task completion time a dependent variable.)
Now on to the details of evaluation methods …

Collecting Users' Opinions
Tells us about attitudes.
Caveat: "First rule of usability: don't listen to users, watch what they do!"
Two major methods:
– Interviews – qualitative analysis
– Surveys – quantitative analysis

Interviews
Structured:
– fixed questions, fixed or conditional sequence
– easier to conduct and analyze
– may miss opportunistic information
Semi-structured:
– a set of questions "to get to," with freedom to probe along the way
Flexible:
– no set questions or sequence

Questionnaires and Surveys
Large numbers of respondents, analyzed quantitatively.
Design with your analysis in mind; piloting is important.
Closed questions versus open questions.
Types of closed questions:
– Checklists: background information
– Likert scales: range of agreement or disagreement with a statement
– Semantic differentials: place a rating on a scale between paired adjectives
– Ranked order: e.g., rank features in order of usefulness
(A sketch of coding such responses for analysis follows below.)
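
How closed-question responses might be coded for quantitative analysis: a minimal Python sketch, assuming a 5-point Likert scale; the item names and response data are hypothetical.

    # Hypothetical post-test questionnaire responses, one dict per participant.
    # Likert items are coded 1 (strongly disagree) .. 5 (strongly agree).
    responses = [
        {"easy_to_learn": 4, "felt_in_control": 5},
        {"easy_to_learn": 2, "felt_in_control": 3},
        {"easy_to_learn": 5, "felt_in_control": 4},
    ]

    def likert_summary(responses, item):
        """Mean and range for one Likert item across participants."""
        scores = [r[item] for r in responses]
        return sum(scores) / len(scores), min(scores), max(scores)

    for item in ("easy_to_learn", "felt_in_control"):
        mean, lo, hi = likert_summary(responses, item)
        print(f"{item}: mean={mean:.2f} (range {lo}-{hi})")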

Observing Users
Observing and monitoring use of the artifact:
– in the laboratory
– in a natural setting
→ how users interact with the system
→ usability issues

Direct Observation
Researcher watches use and takes notes.
– The Hawthorne effect (users act differently under observation) may contaminate results
– The record may be incomplete
– Only one chance to capture events
Helpful to have a shorthand and/or forms with which you are fluent.

Indirect Observation
Video logging:
– user(s): body language, gestures
– screen activity
Two uses:
– Exploratory evaluation: review tapes carefully and repeatedly to discover issues
– Formal studies: know what you are looking for!
Interaction logging (software):
– often used together with video logging
– must synchronize all data streams (see the merge sketch below)
The high volume of data can be overwhelming.
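
Synchronizing streams mostly means putting everything on a shared clock. A minimal sketch of merging two time-stamped streams into one timeline; the event tuples are invented for illustration.

    import heapq

    # Hypothetical streams: (seconds since session start, source, description).
    # Each stream is already sorted by its own timestamps.
    video_events = [(12.0, "video", "user frowns"), (47.5, "video", "points at menu")]
    log_events = [(11.8, "log", "open File menu"), (46.9, "log", "click Save")]

    # Merge the sorted streams into a single timeline for side-by-side analysis.
    for t, source, desc in heapq.merge(video_events, log_events):
        print(f"{t:7.1f}s [{source}] {desc}")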

Verbal Protocols
Audio recordings of spoken language:
– spontaneous utterances
– conversation between multiple users
– think-aloud protocol
– post-event (retrospective) protocols
Beware the dangers of introspection and rationalization.
Analyze along with video.

Video/Verbal Analysis
A diversity of approaches.
Task-based:
– how do users approach the problem?
– what difficulties arise in using the software?
– need not be exhaustive: identify interesting cases
Performance-based:
– frequency and timing of categories of actions, errors, and task completion (see the tally sketch below)
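
What a performance-based tally can look like: a minimal sketch that counts frequency and total time per category of action; the coded observations are hypothetical.

    from collections import Counter, defaultdict

    # Hypothetical coded observations: (category, duration in seconds).
    coded = [
        ("menu_search", 8.2),
        ("error_recovery", 21.0),
        ("menu_search", 5.5),
        ("typing", 40.3),
        ("error_recovery", 12.7),
    ]

    freq = Counter(cat for cat, _ in coded)
    total_time = defaultdict(float)
    for cat, dur in coded:
        total_time[cat] += dur

    for cat in freq:
        print(f"{cat}: n={freq[cat]}, total={total_time[cat]:.1f}s")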

More on Analysis of Video/Verbal Data
– Requires a classification scheme, invented or borrowed
– May involve checking inter-rater reliability (see the kappa sketch below)
– Often exhaustive and time-intensive!
Tools are important:
– we transcribe conversation to text merged with the transaction log
– a better approach would be direct analysis of digital video
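
Inter-rater reliability is commonly reported as Cohen's kappa, which corrects raw agreement for agreement expected by chance. A self-contained sketch; the category codes and segment assignments are hypothetical.

    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        """Cohen's kappa for two raters coding the same segments."""
        n = len(rater_a)
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        counts_a, counts_b = Counter(rater_a), Counter(rater_b)
        # Chance agreement: probability both raters independently pick the same category.
        expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
        return (observed - expected) / (1 - expected)

    # Hypothetical codes assigned to ten video segments by two raters.
    a = ["nav", "err", "nav", "help", "nav", "err", "nav", "nav", "help", "err"]
    b = ["nav", "err", "nav", "nav", "nav", "err", "err", "nav", "help", "err"]
    print(f"kappa = {cohens_kappa(a, b):.2f}")  # 0.80 observed agreement vs 0.39 by chance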

Software Instrumentation/Logging
Time-stamped logs:
– key-presses or higher-level actions
– record what happened, but are not replayable
Interaction logging:
– replayable
Synchronize with video for rich, but potentially overwhelming, data.
The analysis issues are similar. (A minimal logging sketch follows below.)
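
A minimal sketch of time-stamped logging of higher-level actions, writing one JSON record per line; the file name and action names are invented for illustration.

    import json
    import time

    LOG_PATH = "session_log.jsonl"  # hypothetical log file, one JSON record per line

    def log_event(action, **details):
        """Append a time-stamped action record to the session log."""
        record = {"t": time.time(), "action": action, **details}
        with open(LOG_PATH, "a") as f:
            f.write(json.dumps(record) + "\n")

    # Hypothetical higher-level actions captured during a session.
    log_event("open_dialog", name="print")
    log_event("click", target="ok_button")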

Methods of Evaluation (recap)
Collecting users' opinions → attitudes
Observing and monitoring use of the artifact (in the laboratory or a natural setting) → how users interact
Experiments:
– testable hypothesis
– comparative, with controlled variables
– quantitative analysis
Interpretive evaluation:
– how the artifact is used in natural settings
– qualitative data and analysis
Predictive evaluation:
– predict usability issues based on a model
– applied to specifications or prototypes

Additional Methods
Experiments:
– testable hypothesis
– comparative, with controlled variables
– quantitative analysis
– we'll get into this next week
Interpretive evaluation:
– how the artifact is used in natural settings
– qualitative data and analysis
Predictive evaluation:
– predict usability issues based on a model
– applied to specifications or prototypes

Assignment 7 (project groups)
The good news: your project has been accepted by The Boss!
The bad news: you have 6 weeks to finish it!

Assignment 7, continued
Write a 2-page plan expressed in terms of the Star model:
– What is the expected product?
– At which phase will you start?
– What is your estimated timeline?
– How will you incorporate evaluation and other user-centered techniques, and use the outcomes to adjust the design?
– What resources do you need?
The plan can be in outline form and will be assessed for its value as an efficient briefing for The Boss.