Methodology and Explanation XX50125 Lecture 1: Part I. Introduction to Evaluation Methods Part 2. Experiments Dr. Danaë Stanton Fraser.



Methodology and Explanation 2007

Goals of this unit
- To introduce research methods in human-computer interaction
- To explore qualitative and quantitative methods
- To gain experience in putting these methods into practice
- To gain a critical understanding of analytic techniques

Unit structure
- Lectures, each focusing on methods of analysis
- Practical sessions putting these methods into practice

Course text and slides
Rogers, Y., Sharp, H., and Preece, J. Interaction Design: Beyond Human-Computer Interaction. Wiley.
Slides available online at:

Assessment
Two assessed pieces of coursework:
- 1st coursework: report
- 2nd coursework: report and presentation

Evaluation Methods
- Controlled experiments
- Ethnography
- Expert review
- Usability testing
- Surveys/questionnaires
- Data logging
- Interviews and focus groups

Not just desktop computing: what other kinds of technologies are we evaluating?

Technologies under evaluation include:
- Direct manipulation and graphical user interfaces
- Collaborative desktops
- Immersive technologies
- Tangible interfaces
- Wearable, handheld and mobile devices
- Embedded interfaces

Evaluate: which method to choose?
- Design or implementation?
- Laboratory or field studies?
- Subjective or objective?
- Qualitative or quantitative?
- Information provided?
- Immediacy of response?
- Intrusiveness?
- Resources?

Part II. Experiments: A brief guide

Aim
To answer a question, or test a hypothesis that predicts a relationship between two or more events, known as variables. For example: will spatial knowledge be superior after exploring a VR simulation of a building, or after exploring a physical model of the building?

Variables
Such hypotheses are tested by manipulating one or more of the variables. The variable that is manipulated is called the independent variable (the conditions to test this variable are set up independently, before the experiment starts). In the study above, media type (VR vs. model) is the independent variable.
The dependent variable would be accuracy of spatial knowledge (e.g. time to reach point A from point B), because the hypothesis is that the time to carry out the task depends on the medium explored.

Variables and Conditions
To test a hypothesis, the experimenter sets up the experimental conditions, e.g.:
- Condition 1: pre-test, explore model, post-test
- Condition 2: pre-test, explore VR, post-test
Is a control condition needed, against which to compare the results?
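The two conditions above can be written down as a small data structure. A minimal Python sketch (the names are illustrative, not part of the course materials):

```python
# Both conditions share the same pre-test/post-test frame; only the
# exploration phase (the independent variable) differs between them.
conditions = {
    "model": ["pre-test", "explore model", "post-test"],
    "VR":    ["pre-test", "explore VR", "post-test"],
}

for name, phases in conditions.items():
    print(name, "->", " / ".join(phases))
```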

Participants
- Between participants: two drawbacks (number of participants needed, individual differences); advantage: no order effects.
- Within participants: counterbalancing of condition order is required.
- Matched pairs: participants matched on characteristics such as gender and expertise; difficult to match across all variables.
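For a within-participants design with two conditions, counterbalancing can be as simple as alternating the condition order across participants (ABBA-style). A minimal Python sketch; the function name and participant IDs are hypothetical:

```python
# Alternate condition order across participants so that any order
# effect (practice, fatigue) is balanced across the sample.
def counterbalance(participant_ids, conditions=("model", "VR")):
    orders = [conditions, tuple(reversed(conditions))]
    return {pid: orders[i % 2] for i, pid in enumerate(participant_ids)}

assignment = counterbalance([1, 2, 3, 4])
# Odd-numbered positions run model first; even-numbered run VR first.
```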

Practicalities
- Where will the experiment be carried out?
- How will the equipment be set up?
- How will participants be introduced to the study?
- What scripts are needed to standardise the procedure?
- Always include a pilot study.

Data Collection and Analysis
- Performance measures are taken, e.g. response times, number of errors.
- Use graphs.
- Average the data within each condition to examine any differences.
- Statistical tests such as t-tests and ANOVAs can reveal whether the differences are significant; software packages such as SPSS are often used.
- If there is no significant difference, the hypothesis is not supported (strictly, the null hypothesis cannot be rejected; this does not prove the conditions are equivalent).
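As a concrete illustration of the analysis step, the sketch below computes condition means and a pooled-variance independent-samples t statistic (a between-participants comparison) using only the Python standard library. The timing data are made up for illustration; in practice a package such as SPSS, R, or SciPy would also give the p-value:

```python
import statistics as st

def independent_t(group_a, group_b):
    """Student's t statistic for two independent samples,
    assuming equal variances (pooled-variance form)."""
    na, nb = len(group_a), len(group_b)
    mean_a, mean_b = st.mean(group_a), st.mean(group_b)
    var_a, var_b = st.variance(group_a), st.variance(group_b)  # sample variances
    pooled = ((na - 1) * var_a + (nb - 1) * var_b) / (na + nb - 2)
    se = (pooled * (1 / na + 1 / nb)) ** 0.5  # standard error of the difference
    return (mean_a - mean_b) / se

# Illustrative (invented) task-completion times in seconds:
vr_times = [42.0, 39.5, 44.1, 40.2, 43.3]
model_times = [47.8, 51.2, 46.9, 49.5, 50.1]

print("VR mean:", st.mean(vr_times))
print("Model mean:", st.mean(model_times))
t = independent_t(vr_times, model_times)
print("t statistic:", t)
```

The sign of t shows the direction of the difference; its magnitude (checked against the t distribution with na + nb - 2 degrees of freedom) determines significance.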

Data captured
Record quantitative results, but also gather process data, e.g. dialogue turns, gesture and non-verbal behaviour.

Preparation for Practical 1
Topic: examining the use of gestures in collaboration. In preparation for the practical, please read:
1. Kraut, R. E., Fussell, S. R., and Siegel, J. (2003). Visual Information as a Conversational Resource in Collaborative Physical Tasks. Human-Computer Interaction, Vol. 18, pp. 13-49. Lawrence Erlbaum Associates, Inc.
2. Kirk, D. and Stanton Fraser, D. (2005). The Effects of Remote Gesturing on Distance Instruction. Computer Supported Collaborative Learning (CSCL 2005). Taiwan. May.