The product of evaluation is knowledge. This could be knowledge about a design, knowledge about the user or knowledge about the task.

 Two types of evaluation:
◦ Summative evaluation
◦ Formative evaluation
 Is evaluation necessary? When do we need it?
 Quantitative vs. qualitative data

 To increase the quality of research, we must avoid the following effects during observational studies:
◦ Hawthorne effect
◦ Observer effect
◦ Halo effect

 Verbal protocols
 Designing observations:
◦ Writing a verbal protocol
◦ How to conduct a session
◦ Analyzing a protocol/transcript

 We ask users to speak out loud and explain why they perform each action.
 Two components:
◦ Talk aloud: the user verbalizes decisions they would normally make silently.
◦ Think aloud: the user verbalizes whatever thoughts occur during the task.

 Select:
◦ Tasks
◦ Users
◦ Environment
◦ One or two significant functional requirements
 Observe at least three users
 Give users their set of tasks to complete
 Conduct a think-aloud study
 Keep protocols (transcripts) for each user
 Record users’ comments, etc. (p. 141)

 Description of the environment
 List of tasks completed by the user
 Users’ background and demographic details
 Users’ comments, body language, and facial expressions, recorded and written up
 The aspects of the interface that these responses relate to should also be detailed (one possible record structure is sketched below).
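A minimal sketch in Python of how each observation in a session protocol might be stored so that a user's comment stays linked to the task context, the physical action, and the aspect of the interface it relates to. The field names here are illustrative assumptions, not a prescribed schema.

```python
# Sketch of a protocol record for one think-aloud session.
# All field names are illustrative assumptions.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ProtocolEntry:
    timestamp: float   # seconds since the session started
    action: str        # physical action, e.g. "clicked the search icon"
    comment: str       # the user's verbal commentary
    ui_aspect: str     # aspect of the interface the response relates to

@dataclass
class SessionProtocol:
    user_id: str
    environment: str   # description of the test environment
    tasks: List[str]   # tasks the user was asked to complete
    background: str    # background and demographic details
    entries: List[ProtocolEntry] = field(default_factory=list)

    def record(self, timestamp: float, action: str,
               comment: str, ui_aspect: str) -> None:
        self.entries.append(ProtocolEntry(timestamp, action, comment, ui_aspect))
```

Entries stored this way can later be coded and counted, which also supports the frequency analysis described in the transcript-analysis slide.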

 Choose the tasks
 Select users – the wrong users lead to misleading information
 Explain the purpose to the users
 Conduct the evaluation, prompting for example:
◦ What are you looking at now?
◦ What just happened?
◦ What are you going to do next? Why?

 The data you have at the end of a session is known as the transcript; it details the physical actions and verbal commentary the user has made.
 When analyzing a transcript of an evaluation session, the aim is to categorize the comments according to:
◦ Frequency (see the tallying sketch below)
◦ Fundamentality
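As a small illustration of the frequency side of this analysis, the following Python sketch tallies coded transcript comments so the most frequent problem areas surface first. The category labels and comments are hypothetical; in practice they come from coding the session transcript.

```python
# Tally coded transcript comments by category; the data here is hypothetical.
from collections import Counter

coded_comments = [
    ("navigation",  "couldn't find the back button"),
    ("terminology", "what does 'commit' mean here?"),
    ("navigation",  "expected the menu on the left"),
    ("feedback",    "nothing happened when I pressed save"),
]

frequency = Counter(category for category, _ in coded_comments)
for category, count in frequency.most_common():
    print(f"{category}: {count} comment(s)")
```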

 Experiments define a hypothesis.
 Two stages of experiments:
◦ Implementing the experiment
◦ Analyzing the results (a worked sketch follows)
 Advantages: a systematic, repeatable approach to testing based on scientific rigor.
 Disadvantages: consideration is reduced to a few specific variables, the questions can be hard to relate to real-world, holistic problems, and the artificial setting may lack real-world validity.
 In testing we strive for realism.
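As a hedged example of the "analyzing the results" stage, the sketch below compares task-completion times for two interface variants with an independent-samples t-test. The times are hypothetical, and the example assumes SciPy is available; it is one common way to test a simple hypothesis that the designs differ, not the only one.

```python
# Hypothetical task-completion times (seconds) for two interface variants.
from scipy import stats

times_design_a = [42.1, 38.5, 45.0, 40.2, 39.8]
times_design_b = [35.3, 33.9, 36.7, 34.1, 37.2]

# Independent-samples t-test: is the difference in means likely to be real?
t_stat, p_value = stats.ttest_ind(times_design_a, times_design_b)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

A small p-value would support the hypothesis within this artificial setting, although, as the slide notes, such results can still be hard to relate to real-world, holistic use.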

 A cognitive walkthrough (CW) is an approach to formative evaluation conducted without users.
◦ Preparing a CW
◦ Conducting a CW
◦ Experiments in support of design
◦ Dependent and independent variables
◦ Assigning subjects
◦ Statistics
◦ Summary of user experimentation

 Interviews:
◦ Structured interviews
◦ Unstructured interviews
 Questionnaires:
◦ Open questions
◦ Closed questions:
 Simple checklist: yes/no or N/A
 Ranked order: select your preference
 Multi-point rating scale: strongly agree … strongly disagree (a scoring sketch follows)
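For closed questions on a multi-point rating scale, one quick way to summarise responses is to average the scores per item. The sketch below uses hypothetical items and 1–5 ratings (1 = strongly disagree, 5 = strongly agree); it is an assumption about one reasonable summary, not a prescribed scoring method.

```python
# Summarise closed-question ratings; items and scores are hypothetical.
from statistics import mean

responses = {
    "The interface was easy to learn": [4, 5, 3, 4, 5],
    "I always knew where I was in the system": [2, 3, 2, 3, 2],
}

for item, scores in responses.items():
    print(f"{item}: mean = {mean(scores):.1f} (n = {len(scores)})")
```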