InfoVis Winter 2011, Chris Culy: Evaluation of Visualizations

Levels: perception and interpretation

- Visual/perceptual: how well can people perceive the distinctions the visualization intends?
- Interpretation: how well can people interpret the visualization?

Levels: Use

- Use in isolation:
  - how accurately can people use the visualization for a particular task in isolation?
  - how quickly can people use the visualization for a particular task in isolation?
- Use as part of a broader goal:
  - how accurately can people use the visualization for a particular task as part of a broader goal?
  - how quickly can people use the visualization for a particular task as part of a broader goal?
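Accuracy and speed at this level fall out directly from logged trials. A minimal sketch, assuming each trial is recorded as (task, correct, seconds); the task names and numbers here are hypothetical:

```python
# Per-task accuracy and mean completion time from hypothetical trial records.
from collections import defaultdict

trials = [
    ("find max", True, 8.2), ("find max", True, 6.9), ("find max", False, 12.4),
    ("compare trends", True, 15.1), ("compare trends", False, 19.8),
]

by_task = defaultdict(list)
for task, correct, seconds in trials:
    by_task[task].append((correct, seconds))

for task, results in by_task.items():
    accuracy = sum(c for c, _ in results) / len(results)
    avg_time = sum(s for _, s in results) / len(results)
    print(f"{task}: accuracy {accuracy:.0%}, mean time {avg_time:.1f} s")
```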

Levels: Satisfaction

- How satisfied are people with the visualization? (e.g. easy/hard, "cool", etc.)
  - ≠ how well they can use it
- How useful is the visualization for what they want to do?
- How well do they "get it"?
- People may use/prefer a more difficult tool:
  - if it's "cool"
  - if they already know it (there's a learning curve for a new tool)
  - if it's cheaper

Goals

- Formative evaluation:
  - goal: get specific information to improve the software
  - done during the development cycle
  - often informal
- Summative evaluation:
  - goal: get general information about how the software performs
  - done at the end of (a) development cycle
  - often (more) formal
  - can also be used to improve the next iteration

Some principles for experiments

- There should be a specific purpose for the experiment: what are you evaluating?
- Experiments should be task based.
- The user should be introduced to the software, especially anything new.
- The user should not be "interfered with" during the experiment.

Some techniques for experiments: "instrumenting" the program

- tracking clicks, etc., and then trying to analyse the patterns
- also for tracking speed and accuracy
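A minimal sketch of what such instrumentation might look like, assuming a hypothetical visualization app whose UI callbacks can call log_event(); the event names are illustrative, not from any particular toolkit:

```python
# Program-side instrumentation: timestamped event log for later analysis.
import csv
import time

class InteractionLogger:
    """Records timestamped user events (clicks, task boundaries, etc.)."""

    def __init__(self):
        self.events = []  # list of (timestamp, event_type, detail)

    def log_event(self, event_type, detail=""):
        self.events.append((time.time(), event_type, detail))

    def task_duration(self):
        """Seconds from first to last logged event: a rough speed measure."""
        if len(self.events) < 2:
            return 0.0
        return self.events[-1][0] - self.events[0][0]

    def save(self, path):
        with open(path, "w", newline="") as f:
            csv.writer(f).writerows(self.events)

# Usage: the app calls these from its click/task handlers.
logger = InteractionLogger()
logger.log_event("task_start", "task 1")
logger.log_event("click", "bar: 2011 Q3")
logger.log_event("task_end", "answer=42")
print(f"Task took {logger.task_duration():.2f} s over {len(logger.events)} events")
```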

Some techniques for experiments: "instrumenting" the user

- various means for measuring perception
- eye tracking shows where people are paying attention
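One simple way to see where attention clusters is to bin fixation points into a coarse grid. A minimal sketch, assuming the eye tracker reports fixations as (x, y) screen pixels; the coordinates and cell size below are hypothetical:

```python
# Bin eye-tracking fixations into a coarse grid heatmap.
from collections import Counter

def fixation_heatmap(fixations, cell=100):
    """Count fixations per grid cell to see where attention concentrates."""
    return Counter((x // cell, y // cell) for x, y in fixations)

# Hypothetical fixations: most fall on one region of the chart.
fixations = [(820, 110), (835, 120), (810, 95), (400, 300), (826, 131)]
for (cx, cy), n in fixation_heatmap(fixations).most_common():
    print(f"cell ({cx},{cy}): {n} fixations")
```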

Some techniques for experiments: observation and user feedback

- Observation, with note taking and timing of the user's actions
- User feedback:
  - pre/post-session questionnaire
  - "think-aloud protocol": the user explains what they're doing and why, as they're doing it
  - explicit questions during the demo (informal)
  - post-session interview ("debriefing")
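Post-session questionnaires are often rating scales that need only simple summarizing. A minimal sketch, assuming 5-point Likert items (1 = strongly disagree, 5 = strongly agree); the item names and responses are hypothetical:

```python
# Summarize a post-session Likert questionnaire per item.
from statistics import mean, median

responses = {
    "easy to learn":       [4, 5, 3, 4, 4, 2],
    "visually clear":      [5, 4, 4, 5, 3, 4],
    "useful for my tasks": [3, 2, 4, 3, 3, 2],
}

for item, scores in responses.items():
    print(f"{item:22s} mean={mean(scores):.2f} median={median(scores)}")
```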

Formal vs. informal

- Formal:
  - several subjects (6+)
  - experimental design (comparison)
  - often statistical analysis of results
  - experiment usually recorded
- Informal ("guerrilla") testing:
  - few subjects (3-4)
  - quick, informal test to get user feedback on something
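In a formal comparison, the statistical analysis often comes down to comparing a task measure between two conditions. A minimal sketch using a two-sample t-test on completion times; it requires scipy, and the timing data is hypothetical:

```python
# Compare task completion times for two visualization designs.
from scipy.stats import ttest_ind

times_design_a = [41.2, 38.5, 45.0, 40.1, 39.7, 43.3]  # seconds per subject
times_design_b = [33.9, 36.2, 31.5, 35.0, 34.4, 37.1]

t_stat, p_value = ttest_ind(times_design_a, times_design_b)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Designs differ in completion time (at the 0.05 level)")
```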

Techniques for deployed software

- "Bug" reports (when is a bug not a bug?)
- Feature requests
- Interviews (or narratives) with "customers"
- Case studies (positive ones are "success stories")
- Observation of "customers" using the software

Interpreting users' reactions

- Users can only react in terms of what they know/assume.
- They don't analyse what they do, so their suggestions tend to be too concrete.
- They often make specific suggestions that aren't necessarily the best solutions, since they don't know what the possibilities are.
- They are honestly trying to be helpful.