Methodology Overview: basics in user studies
Lecture / slide deck produced by Saul Greenberg, University of Calgary, Canada
Notice: some material in this deck is used from other sources without permission. Credit to the original source is given where it is known.

Overview
- Why we evaluate in HCI
- What methods are there
- Why we use different methods
- How can we compare methods?

Why Do We Evaluate In HCI?
- Designer: user-centered iterative design
- Researcher: developing a knowledge base
- Customer: selecting among systems
- Manager: assessing effectiveness
- Marketer: building a case for the product
(From Finholt & Olson's CSCW '96 tutorial)

Why Do We Evaluate In HCI?
1. Evaluation as part of the iterative design process:
design → implementation → evaluation (repeated in a cycle)

Why Do We Evaluate In HCI?
A. Pre-design stage: what do people do?
- What are their real-world context and constraints?
- How do they think about their task?
- How can we understand what we need in system functionality?
- Can we validate our requirements analysis?
Evaluation produces:
- key tasks and required functionality
- key contextual factors
- descriptions of work practices and organizational practices
- useful key requirements
- user types…

Why Do We Evaluate In HCI?
B. Initial design stage: evaluate choices of initial design ideas and representations, usually sketches, brainstorming exercises, and paper prototypes
- Is the representation appropriate?
- Does it reflect how people think of their task?
Evaluation produces:
- user reactions to the design
- validation / invalidation of ideas
- a list of conceptual problem areas (conceptual bugs)
- new design ideas

Why Do We Evaluate In HCI?
C. Iterative design stage: iteratively refine / fine-tune the chosen design / representation
- evolve low / medium / high fidelity prototypes and products
- look for usability bugs: can people use this system?
Evaluation produces:
- user reactions to the design
- validation and a list of problem areas (bugs)
- variations in design ideas

Why Do We Evaluate In HCI?
D. Post-design stage
- Acceptance test: did we deliver what we said we would? Verify that the human/computer system meets expected performance criteria: ease of learning, usability, user attitude, time, errors…
  e.g., 9/10 first-time users will successfully download pictures from their camera within 3 minutes, and delete unwanted ones in an additional 3 minutes
- Revisions: what do we need to change?
- Effects: what did we change in the way people do their tasks?
- In the field: do actual users perform as we expected them to?
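
To make the camera example above concrete, here is a minimal sketch (not from the original deck, with entirely hypothetical timing data) of how observed test-session times could be checked against that acceptance criterion:

```python
# Hypothetical per-user task times (minutes) from an acceptance test.
download_times = [2.1, 1.8, 2.9, 3.4, 2.2, 1.5, 2.7, 2.0, 2.6, 2.3]
delete_times   = [1.9, 2.4, 3.1, 2.8, 1.7, 2.2, 3.5, 2.0, 2.5, 1.8]

CRITERION_RATE = 0.9   # 9 out of 10 first-time users
DOWNLOAD_LIMIT = 3.0   # minutes to download pictures
DELETE_LIMIT = 3.0     # additional minutes to delete unwanted ones

# A user passes only if both sub-tasks finished within their limits.
passed = sum(1 for dl, de in zip(download_times, delete_times)
             if dl <= DOWNLOAD_LIMIT and de <= DELETE_LIMIT)
rate = passed / len(download_times)
print(f"{passed}/{len(download_times)} users met the criterion: "
      f"{'PASS' if rate >= CRITERION_RATE else 'FAIL'}")
```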

Why Do We Evaluate In HCI?
D. Post-design stage: acceptance test, revisions, effects, in the field
Evaluation produces:
- testable usability metrics
- end-user reactions
- validation and a list of problem areas (bugs)
- changes in original work practices / requirements

Interface Design and Usability Engineering (originally a process diagram)
Stages: Articulate who the users are and their key tasks → Brainstorm designs → Refined designs → Completed designs
Goals: task-centered system design; participatory design; user-centered design; psychology of everyday things; user involvement; representation & metaphors; graphical screen design; interface guidelines; style guides
Methods: participatory interaction; task scenario walk-throughs; low fidelity prototyping methods; high fidelity prototyping methods; Evaluate via usability testing, heuristic evaluation, field testing
Products: user and task descriptions → throw-away paper prototypes → testable prototypes → alpha/beta systems or complete specification

Why Do We Evaluate In HCI?
2. Evaluation to produce generalized knowledge
- Are there general design principles?
- Are there theories of human behaviour (explanatory, predictive)?
- Can we validate ideas / visions / hypotheses?
Evaluation produces:
- validated theories, principles and guidelines
- evidence supporting / rejecting hypotheses / ideas / visions…

Why Do We Evaluate In HCI?
Design and evaluation are best done together:
- evaluation suggests design; design suggests evaluation
- use evaluation to create as well as critique
Design and evaluation methods must fit development constraints (budget, resources, time, product cost…):
- do triage: what is most important given the constraints?
Design usually needs quick, approximate answers:
- precise results are rarely needed: close enough, good enough, informed guesses…
See the optional reading by Don Norman, "Applying the Behavioural, Cognitive and Social Sciences to Products".

How Can We Think About Methods?
Method definition (Baecker; McGrath): formalized procedures / tools that guide and structure the process of gathering and analyzing information.
Different methods can do different things: each method offers potential opportunities not available by other means, and each method has inherent limitations…

What Methods Are There?
Laboratory tests: require human subjects who act as end users
- Experimental methodologies: highly controlled observations and measurements to answer very specific questions, i.e., hypothesis testing
- Usability testing: mostly qualitative, less controlled observations of users performing tasks
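
As an illustration of the experimental style (not from the original deck), here is a minimal hypothesis-testing sketch comparing task-completion times between two hypothetical interface designs; a real study would also check the test's assumptions and report effect sizes:

```python
# Do users complete a task faster with design A than design B?
# Completion times (seconds) below are made up for illustration.
from scipy import stats

design_a = [41.2, 38.5, 44.0, 36.8, 40.1, 39.7, 42.3, 37.9]
design_b = [47.6, 45.1, 50.3, 43.8, 48.9, 46.2, 44.7, 49.5]

# Independent-samples t-test on the two groups' mean times.
t, p = stats.ttest_ind(design_a, design_b)
print(f"t = {t:.2f}, p = {p:.4f}")
if p < 0.05:
    print("Reject the null hypothesis: mean completion times differ.")
else:
    print("No significant difference detected.")
```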

What Methods Are There?
Interface inspection: done by interface professionals; no end users necessary
- Usability heuristics: several experts analyze an interface against a handful of principles
- Walkthroughs: experts and others analyze an interface by considering, a step at a time, what a user would have to do while performing their task

What Methods Are There?
Field studies: require established end users in their work context
- Ethnography: a field worker immerses themselves in a culture to understand what that culture is doing
- Contextual inquiry: an interview methodology that gains knowledge of what people do in their real-world context
- Deployment studies: providing potential end users with novel technologies to understand how they would use them in their context

What Methods Are There?
Self-reporting: requires established or potential end users
- interviews
- questionnaires
- surveys

What Methods Are There?
Cognitive modeling: requires detailed interface specifications
- Fitts' Law: a mathematical expression that can predict a user's time to select a target
- Keystroke-Level Model (KLM): a low-level description of what users would have to do to perform a task, which can be used to predict how long it would take them
- GOMS: a structured, multi-level description of what users would have to do to perform a task, which can also be used to predict time
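
To make the modeling idea concrete, here is a minimal sketch (not from the deck) combining a Fitts' Law prediction with a keystroke-level estimate; the regression coefficients and operator times are illustrative textbook values, not measurements:

```python
import math

# Illustrative KLM operator times (seconds), after Card, Moran & Newell.
K = 0.28   # press a key (average typist)
H = 0.40   # home hands between keyboard and mouse
M = 1.35   # mental preparation
# KLM's fixed pointing operator P (~1.1 s) is often replaced by a
# Fitts' Law prediction, as below.

def fitts_time(distance, width, a=0.1, b=0.15):
    """Predicted pointing time in seconds, using the Shannon
    formulation of Fitts' Law: MT = a + b * log2(distance/width + 1).
    The constants a and b are device- and user-specific and must be
    fit from data; these defaults are only illustrative."""
    return a + b * math.log2(distance / width + 1)

# Example: predict the time to click a 20-pixel-wide button 300 pixels
# away, then type a 5-character file name.
predicted = M + fitts_time(300, 20) + H + M + 5 * K
print(f"Predicted task time: {predicted:.2f} s")
```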

It is not a question of which method is best, but of which method best answers the question you are asking.

Why Use Different Methods?
All methods:
- enable but also limit what can be gathered and analyzed
- are valuable in certain situations, but weak in others
- have inherent weaknesses and limitations
- can be used to complement each other's strengths and weaknesses
(McGrath, Methodology Matters)

Why Use Different Methods?
- Information requirements differ: pre-design, iterative design, post-design, generalizable knowledge…
- The information produced differs: outputs should match the particular problem / needs
- Relevance: does the method provide information about our question / problem?

Why Use Different Methods?
- Cost/benefit of using a method: the cost of the method should match the benefit gained from the result
- Constraints and pragmatics may force you to choose quick-and-dirty "discount usability" methods

Learning Goals
Learn a toolbox of evaluation methodologies for both research and practice in Human-Computer Interaction. To achieve this, you will:
- investigate, compare and contrast many existing methodologies
- understand how each methodology fits particular interface design and evaluation situations
- practice several of these methodologies on simple problems
- gain first-hand experience with a particular methodology by designing, running, and interpreting a study

You now know:
- Why we evaluate in HCI
- What methods there are
- Why we use different methods

Primary Sources
This slide deck is partly based on concepts as taught by: Finholt & Olson, CSCW 1996 tutorial lecture notes.

Permissions
You are free:
- to Share — to copy, distribute and transmit the work
- to Remix — to adapt the work
Under the following conditions:
- Attribution — You must attribute the work in the manner specified by the author (but not in any way that suggests that they endorse you or your use of the work) by citing: "Lecture materials by Saul Greenberg, University of Calgary, AB, Canada. http://saul.cpsc.ucalgary.ca/saul/pmwiki.php/HCIResources/HCILectures"
- Noncommercial — You may not use this work for commercial purposes, except to assist one's own teaching and training within commercial organizations.
- Share Alike — If you alter, transform, or build upon this work, you may distribute the resulting work only under the same or similar license to this one.
With the understanding that:
- Not all materials have transferable rights — materials from other sources which are included here are cited.
- Waiver — Any of the above conditions can be waived if you get permission from the copyright holder.
- Public Domain — Where the work or any of its elements is in the public domain under applicable law, that status is in no way affected by the license.
- Other Rights — In no way are any of the following rights affected by the license: your fair dealing or fair use rights, or other applicable copyright exceptions and limitations; the author's moral rights; rights other persons may have either in the work itself or in how the work is used, such as publicity or privacy rights.
- Notice — For any reuse or distribution, you must make clear to others the license terms of this work. The best way to do this is with a link to this web page.