Visualization Process and Collaboration
Tamara Munzner
Department of Computer Science, University of British Columbia
Dagstuhl Scientific Visualization Workshop, June 2009

Technique-driven work
3D hyperbolic graphs
–H3
dimensionality reduction
–steerable: MDSteer
–GPU accelerated: Glimmer
general multilevel graphs
–layout: TopoLayout
–interaction: Grouse, GrouseFlocks, TugGraph
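The dimensionality-reduction line of work above is built on multidimensional scaling. As a rough illustration of the core idea only (not the actual MDSteer or Glimmer code; both add steering and GPU/multilevel machinery), a minimal stress-minimizing MDS in plain Python might look like:

```python
import math
import random

def stress_mds(D, dims=2, iters=1000, lr=0.02, seed=1):
    """Gradient descent on raw stress: sum over pairs of (||xi - xj|| - D[i][j])**2."""
    rng = random.Random(seed)
    n = len(D)
    # random initial placement
    X = [[rng.uniform(-1.0, 1.0) for _ in range(dims)] for _ in range(n)]
    for _ in range(iters):
        grad = [[0.0] * dims for _ in range(n)]
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                d = math.dist(X[i], X[j]) or 1e-9  # guard divide-by-zero
                coeff = 2.0 * (d - D[i][j]) / d
                for k in range(dims):
                    grad[i][k] += coeff * (X[i][k] - X[j][k])
        for i in range(n):
            for k in range(dims):
                X[i][k] -= lr * grad[i][k]
    return X

# Three points whose target distances (1, 1, 2) embed exactly on a line.
D = [[0, 1, 2],
     [1, 0, 1],
     [2, 1, 0]]
X = stress_mds(D)
print(round(math.dist(X[0], X[2]), 2))
```

Real systems replace the all-pairs gradient (quadratic per iteration) with multilevel and sampling strategies; that is exactly the scalability gap the GPU-accelerated work targets.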

Problem-driven work
evolutionary tree comparison
–TreeJuxtaposer
protein-gene interaction networks
–Cerebral
linguistic graphs
–Constellation

Problem-driven work
web logs
–SessionViewer
large-scale system monitoring
–LiveRAC

Collaboration
sometimes you approach users
sometimes they approach you
–no guarantee of success!
challenges
–learning each other's language
–finding the right people/problems where the needs of both are met
collaboration as dance/negotiation
–initial contact is only the beginning
–continuous decision process: when to end the dance? after initial talk? after further discussion? after getting feet wet with a start on real work? after one project? after many projects?

Research Cycles, Collaboration, and Visualization
4-slide version of hour-long collaboration talk
–research cycles and collaborator roles
–value of collaboration: success stories
–difficulty of collaboration: when to walk away

Research cycles
difficult for one person to cover all roles
collaboration is the obvious way to fill in gaps
Johnson, Moorhead, Munzner, Pfister, Rheingans, and Yoo. NIH/NSF Visualization Research Challenges Report. IEEE CS Press, 2006.

Four process questions
ask them early in the dance/negotiation!
–what is the role of my collaborators?
–is there a real need for my new approach/tool?
–am I addressing a real task?
–does real data exist, and can I get it?

Collaborator roles
left: providers of principles/methodologies
–HCI, cognitive psychology
–computer graphics
–math, statistics
right: providers of driving problems
–domain experts, target app users
middle: fellow vis practitioners
middle: fellow tool builders, outside of vis
–often want vis interface for their tools/algs
–do not take their word for it on needs of real users

Characteristics I look for in collaborators
people with driving problems
–big data
–clear questions
–need for a human in the loop
–enthusiasm/respect for vis possibilities
all collaborators
–have enough time for the project
–research meetings are fun: no laughter is a very bad sign
–(project has funding - ideally...)

Tricky collaboration: sustainability
vis for environmental sustainability simulation
–citizens in communities making policy choices
–facilitator leads workshops
initial focus: high-dimensional dataset
–11 input variables, 3 choices each
–100K output scenarios, with 300 indicators
–existing tool only shows a few outputs at once: hard to understand an entire scenario, impossible to compare scenarios
–goal: show linkages between inputs and outputs
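As a back-of-the-envelope check on the scale described above: 11 inputs with 3 choices each give a full factorial of 3^11 = 177,147 input combinations, so the ~100K scenarios cited are presumably a large subset of that space. A sketch (the choice labels are assumed, not from the dataset):

```python
# Enumerate the input space of 11 variables with 3 choices each.
# Full factorial size is 3**11 = 177,147; the slides cite ~100K
# simulated scenarios, presumably a subset of this space.
from itertools import product

choices = ["low", "medium", "high"]  # assumed labels for illustration
scenarios = list(product(choices, repeat=11))
print(len(scenarios))  # 177147 == 3**11
```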

First prototype
linked views
–needed refining
dimensionality reduction
–too confusing for general public use
–bad match to true dimensionality of dataset

Second prototype
better linked views
–solved interesting aggregation problem
but not deployed
–real goal was policy choices and behavior change
–not to absorb details of how the simulation works!
got the task wrong!

Process model: what can go wrong?
wrong problem: they don't do that
wrong abstraction: you're showing them the wrong thing
wrong encoding/interaction: the way you show it doesn't work
wrong algorithm: your code is too slow
nested model levels:
–domain problem characterization
–data/operation abstraction design
–encoding/interaction technique design
–algorithm design

Different threats to validity at each level:
threat: wrong problem
–validate: observe and interview target users
threat: bad data/operation abstraction
threat: ineffective encoding/interaction technique
–validate: justify encoding/interaction design
threat: slow algorithm
–validate: analyze computational complexity
implement system
–validate: measure system time/memory
–validate: qualitative/quantitative result image analysis [test on any users, informal usability study]
–validate: lab study, measure human time/errors for operation
–validate: test on target users, collect anecdotal evidence of utility
–validate: field study, document human usage of deployed system
–validate: observe adoption rates
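At the innermost (algorithm) level, "measure system time/memory" can start as a simple benchmark harness. A hypothetical sketch (the `layout_step` stand-in is made up, standing in for a real layout algorithm):

```python
# Sketch of algorithm-level validation: wall-clock time and peak
# memory of one computational step, via the standard library.
import time
import tracemalloc

def layout_step(n):
    # stand-in for a real layout pass: an O(n^2) all-pairs computation
    pts = [(i * 0.1, i * 0.2) for i in range(n)]
    s = 0.0
    for x1, y1 in pts:
        for x2, y2 in pts:
            s += (x1 - x2) ** 2 + (y1 - y2) ** 2
    return s

tracemalloc.start()
t0 = time.perf_counter()
layout_step(300)
elapsed = time.perf_counter() - t0
_, peak = tracemalloc.get_traced_memory()
tracemalloc.stop()
print(f"time: {elapsed:.4f}s, peak memory: {peak} bytes")
```

Note this only addresses the algorithm-level threat; the outer levels still need the human-centered validations listed above.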

Studies: different flavors
head-to-head system comparison (HCI)
–H3 vs. 2D web browser
psychophysical characterization (cog psych)
–impact of distortion on visual search
–on visual memory
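A head-to-head comparison like H3 vs. a 2D web browser typically comes down to comparing task times and error rates between conditions. A hypothetical analysis sketch with made-up numbers (not data from the actual study), using Welch's t statistic from scratch:

```python
# Compare mean task-completion times between two interface conditions.
# The time values below are fabricated for illustration.
import math
from statistics import mean, variance

cond_a_times = [41.2, 38.5, 45.1, 39.9, 43.0, 40.4]  # seconds, condition A
cond_b_times = [52.3, 49.8, 55.0, 51.1, 48.7, 53.6]  # seconds, condition B

def welch_t(a, b):
    """Welch's t statistic for two samples with unequal variances."""
    va, vb = variance(a) / len(a), variance(b) / len(b)
    return (mean(a) - mean(b)) / math.sqrt(va + vb)

t = welch_t(cond_a_times, cond_b_times)
print(f"mean A={mean(cond_a_times):.1f}s, mean B={mean(cond_b_times):.1f}s, t={t:.2f}")
```

In practice one would also report degrees of freedom, a p-value, effect size, and the error-rate comparison, not just the statistic.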

Studies: different flavors
characterize technique applicability, derive design guidelines
–stretch and squish vs. pan/zoom navigation
–separate vs. integrated views
–2D points vs. 3D landscapes

Studies: different flavors
requirements analysis (before starting)
–semi-structured interviews
–watch what they do before new tool introduced: current workflow analysis
field study of deployed system (after prototype refined)
–watch them use tool: characterize what they can do now