Evaluation Adam Bodnar CPSC 533C Monday, April 5, 2004.


Motivation So many techniques, so little evaluation Are they really effective? How effective? When are they effective? Why are they effective?

Papers User Studies: Why, How and When? (Kosara et al., 2003) Navigation Patterns and Usability of Zoomable User Interfaces with and without an Overview (Hornbaek et al., 2002) An Evaluation of Information Visualization in Attention-Limited Environments (Somervell et al., 2002)

From Theory to Practice Can we design an effective colour sequence to illustrate features? A chromatic sequence reveals categories (a); a luminance sequence reveals form (b)
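The contrast on this slide can be sketched in code: a categorical palette varies hue, while a form-revealing sequence varies luminance monotonically. This is a minimal illustrative sketch; the function names and values are assumptions, not taken from the cited study.

```python
# Sketch: a monotone luminance ramp of the kind that reveals form,
# versus hue-varying categorical colours. Illustrative only.

def luminance_ramp(n):
    """n greys with strictly increasing luminance, as (r, g, b) in 0-255."""
    return [(v, v, v) for v in (round(i * 255 / (n - 1)) for i in range(n))]

def relative_luminance(rgb):
    """Approximate relative luminance (ITU-R BT.709 weights, gamma ignored)."""
    r, g, b = (c / 255 for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

ramp = luminance_ramp(5)
lums = [relative_luminance(c) for c in ramp]
# The ramp is perceptually ordered: luminance increases monotonically,
# which is what lets it convey continuous form rather than categories.
assert all(a < b for a, b in zip(lums, lums[1:]))
```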

Comparison of Techniques Can we design an effective texture that conveys 3D shape information better than the current method? Phong shading is default (a) One principal direction texture mapping (b)

Study Within Context Can we effectively integrate semantic depth of field into an application? Multi-layer map viewer Layers can be opaque, semi-transparent, or SDOF No significant results

Other Techniques User studies aren't always the best choice Time-consuming, difficult to run, answer only small questions Field study Observe the user in their native setting Visual designers Replace part of the user test with an expert

What to take away… Good experiments are difficult to design but are worth the effort User studies aren’t always the most appropriate method of evaluation We need to establish evaluation as a standard InfoVis practice

Critique Strengths Promotes evaluation through example Accessible to those without a background in HCI Weaknesses Only good points of studies presented No critique of alternative evaluation techniques

Papers User Studies: Why, How and When? (Kosara et al., 2003) Navigation Patterns and Usability of Zoomable User Interfaces with and without an Overview (Hornbaek et al., 2002) An Evaluation of Information Visualization in Attention-Limited Environments (Somervell et al., 2002)

Experimental Background Interfaces with an overview Details of information space together with an overview of the entire information space Established usability in literature Zoomable user interfaces Organize information in space and scale, and use panning and zooming to navigate Mixed results for usability in literature The usability of overviews for zoomable user interfaces has not been studied

What to Investigate? Question How does the presence or absence of an overview in a zoomable interface affect usability? Hypotheses Subjects will prefer the overview interface The overview interface will be faster for comparison and browsing based tasks

Dataset and Tasks Dataset Two maps based on census data Differ in levels (single vs. multi-level) Tasks Navigation and browsing

Study Design Experimental Design Within 2 x 2 x 2 (interface, task, map) Counterbalanced conditions 32 subjects Measures Quantitative Accuracy, recall, speed, navigation actions Qualitative Preference, satisfaction
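The 2 x 2 x 2 within-subjects design above can be sketched as follows. The factor level names are assumptions for illustration, and the paper's actual counterbalancing scheme may have been more elaborate than this simple alternation.

```python
# Sketch of the 2 x 2 x 2 within-subjects design with counterbalanced
# interface order across the 32 subjects. Level names are illustrative.
from itertools import permutations, product

interfaces = ["overview", "no_overview"]
tasks = ["navigation", "browsing"]
maps_ = ["single_level", "multi_level"]

# All 8 cells of the design; every subject sees every cell.
cells = list(product(interfaces, tasks, maps_))
assert len(cells) == 8

# Counterbalance interface order: alternate which interface comes first,
# so half the 32 subjects start with the overview interface.
orders = list(permutations(interfaces))
assignment = [orders[s % len(orders)] for s in range(32)]
starts_with_overview = sum(1 for o in assignment if o[0] == "overview")
assert starts_with_overview == 16  # balanced across subjects
```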

Results Significant Effects Subjects preferred the interface with an overview (supporting H1) Subjects were faster with the interface without an overview for the multi-layer map (contrary to H2) Other No difference between interfaces in subjects' ability to correctly solve tasks
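The speed result rests on a paired comparison: the same subjects completed tasks under both interfaces, so their times can be analysed with a paired t statistic. The numbers below are made-up illustrative data, not the study's measurements.

```python
# Sketch of a paired t statistic over hypothetical task times (seconds)
# for the same five subjects under both interfaces.
from math import sqrt

with_overview = [30.0, 28.0, 35.0, 40.0, 33.0]     # hypothetical
without_overview = [25.0, 27.0, 30.0, 36.0, 30.0]  # hypothetical

diffs = [a - b for a, b in zip(with_overview, without_overview)]
n = len(diffs)
mean_d = sum(diffs) / n
var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)  # sample variance
t = mean_d / sqrt(var_d / n)  # paired t statistic, df = n - 1

# A positive t here means the overview interface was slower on average.
assert round(t, 2) == 4.81
```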

Study Implications Consider the trade-off between satisfaction and task completion time Unify the overview with the detail window Consider how map design influences usability

Critique Strengths Detailed methodology Real dataset and real test subjects Strong statistical analysis and discussion Weaknesses Investigators created the maps No explanation for small display used in experiment

Papers User Studies: Why, How and When? (Kosara et al., 2003) Navigation Patterns and Usability of Zoomable User Interfaces with and without an Overview (Hornbaek et al., 2002) An Evaluation of Information Visualization in Attention-Limited Environments (Somervell et al., 2002)

What to Investigate? Motivation InfoVis as a secondary display is a practical application but has not been evaluated Questions How quickly and effectively can people interpret information visualization while busily performing other tasks? What are the issues we must consider?

Experimental Setup Primary task Video game Secondary task Multiple choice questions about visualization target Target could be single item or cluster

Study Design Experimental Design Between/Within 2 x 2 x 2 (time, info density, task) Counterbalanced conditions 28 subjects Measures Quantitative Performance, correctness Qualitative None
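Unlike the previous study, this design mixes between- and within-subjects factors, so the 28 subjects must be split into groups for the between factor. Which factor was between subjects, and its levels, are assumptions here for illustration (the results slide mentions 8-second and roughly 1-second display times).

```python
# Sketch of assigning 28 subjects to two equal between-subjects groups
# (assumed here to be display time) while every subject still sees all
# within-subjects cells. Illustrative only.
import random

random.seed(0)  # fixed seed so the split is reproducible
subjects = list(range(28))
random.shuffle(subjects)

groups = {"time_1s": subjects[:14], "time_8s": subjects[14:]}
assert len(groups["time_1s"]) == len(groups["time_8s"]) == 14
```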

Results Significant Effects Subjects performed as well as or better on low-density visualizations vs. high-density visualizations Subjects achieved greater correctness (answering questions) when display time = 8 sec Other No difference in primary task performance before or after the visualization appeared

Study Implications Peripheral visualizations can be introduced without hindering primary task performance Effective interpretation in a dual-task scenario requires more than one second Low information density displays result in performance that is as good as high density displays in a dual-task scenario

Critique Strengths Grounds the experiment in previous work Strong statistical analysis and discussion Weaknesses Lack of real underlying data Only focused on one type of primary task

Conclusion Empirical evaluation can lead to improvements in the design of information visualization Questions?