COMP5047 Pervasive Computing, 2012: Think-aloud usability experiments (concurrent verbal accounts). Judy Kay, CHAI: Computer Human Adapted Interaction research group.


COMP5047 Pervasive Computing: 2012
Think-aloud usability experiments or concurrent verbal accounts
Judy Kay, CHAI: Computer Human Adapted Interaction research group, School of Information Technologies

Overview
– Empirical methods
– Think-aloud
– Benefits
– Disadvantages
– Naturalised think-aloud

Postconditions for this week
– Describe the uses of Think-Alouds
– Describe the process for conducting one
– Describe their advantages and limitations
– Be able to use Think-Aloud where relevant for your project
– Justify the use of Think-Aloud in the overall testing of a pervasive computing application
– Be ready to do Think-Aloud evaluations for your part of your project

Think-aloud protocols
– Ask the user to “think aloud” as they use the interface
– Often used with video and audio recording
– Otherwise you MUST take notes
– Helps the observer interpret what is going on
– Gives mainly qualitative data
– 3–5 users may be enough (Nielsen) for each stage of refining the prototype
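The "3–5 users" guideline comes from Nielsen's problem-discovery model: the expected proportion of usability problems found by n users is 1 − (1 − L)^n, where L is the probability that a single user reveals a given problem (Nielsen reports L ≈ 0.31 across typical tests, though the true value varies by study and system). A minimal sketch of the calculation:

```python
def proportion_found(n_users: int, l: float = 0.31) -> float:
    """Expected fraction of usability problems uncovered by n_users sessions,
    per Nielsen's model. l is the per-user discovery probability."""
    return 1 - (1 - l) ** n_users

if __name__ == "__main__":
    for n in (1, 3, 5, 10):
        print(f"{n:2d} users -> {proportion_found(n):.0%} of problems")
```

With L = 0.31 this gives roughly 84% of problems after 5 users, which is why a small panel per iteration, repeated across prototype refinements, is usually a better spend than one large study.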

Case study
– Test the usability of the teacher’s data-projection facilities in this classroom
– What is the intended functionality?
– Formulate some concrete tasks and write these as instructions, e.g. “Suppose you currently have the projector screen down but you want to use the whiteboard. So you want to make the screen go up.”

Case study
– Work in your groups to identify 3 tasks relevant to testing the teacher’s use of the projection facilities in this room

Case study
– Call for volunteers to be users for this trial: some who have not used these facilities before, and some who have used them in other places in the university
– Call for volunteers to conduct the trial
– Everyone else will make helpful notes

Classification of think-aloud
– formative versus summative
– predictive versus empirical
– laboratory versus naturalistic
– qualitative versus quantitative
For your project, what will your goal be?
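Even though think-aloud sits on the qualitative side of this classification, a study still yields a few simple quantitative summaries (task completion rate, mean time for successful attempts) that are worth reporting alongside the qualitative notes. A sketch with hypothetical session records (the data and field layout here are illustrative, not from any real study):

```python
# Hypothetical session records: (user, task, completed, seconds).
sessions = [
    ("u1", "raise screen", True, 42.0),
    ("u2", "raise screen", False, 90.0),
    ("u3", "raise screen", True, 35.5),
]

# Completion rate over all attempts, and mean time over successes only
# (failed attempts are usually abandoned, so their times mean little).
completed = [s for s in sessions if s[2]]
completion_rate = len(completed) / len(sessions)
mean_time_success = sum(s[3] for s in completed) / len(completed)

print(f"completion rate: {completion_rate:.0%}")
print(f"mean time (successful): {mean_time_success:.1f}s")
```

Such numbers complement, but never replace, the observer's record of what users said and where they struggled.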

Design cycle (User-Centred Design)
– Define criteria for success
– Define concrete tasks users should be able to do; use these in evaluation
– Prototype construction
– Usability study:
–– Decide just what data to collect
–– Test the design of the experiment for timing (trial it)
–– Recruit users
–– Run the study
Then return to the top of the cycle.

Recruiting users
How representative are they?
– similarity to the intended user population
– age
– gender
– experience in the area
– interest/motivation
– computer literacy
What effect does the user population have on your conclusions?

Stages of running an evaluation
1. Preparation
2. Introduction
3. The test
4. Questionnaire/interview
5. Debriefing
6. Analysis, reflection, summarising, reporting, conclusions for action
Steps 1–5 are done for each user test as it is run; step 6 is applied mainly after several users.

Preparation
– Set up the machine, room and environment, and check all of them
– Check the user instructions
– Do a mental run-through
– Be sure not to waste the user's time because of your lack of preparation!

Introduction
– Welcome the user and explain the purpose of the test:
–– make clear that it is the system being tested, not the user
–– confidentiality
–– anonymity of reporting
–– they may opt out at any time
–– what is recorded
– Invite any questions at this point
– Explain the procedure and, if appropriate, do a demo
– Invite questions again

The test
– The user works through the experiment:
–– record as planned
–– ensure the user feels supported
–– show pleasure at problems identified
–– it is critical to help the user if they are stuck
– Questionnaire/interview: open and closed questions

What data should you collect?
Observe:
– direct/indirect observation
– take notes
– video/audio/software monitor
– software logs for timing
Questionnaire:
– open questions
– closed questions
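For the "software logs for timing" item, even a very small hand-rolled logger is enough for a think-aloud study: record each event (task start/end, observer note, error) with an offset from the start of the session, then dump to CSV for later timing analysis. A minimal sketch, with illustrative event names rather than any particular logging tool's schema:

```python
import csv
import time

class SessionLog:
    """Minimal timestamped event log for a think-aloud session."""

    def __init__(self):
        self.t0 = time.monotonic()          # monotonic clock: immune to wall-clock jumps
        self.events = []                    # (seconds_from_start, kind, detail)

    def record(self, kind: str, detail: str = ""):
        self.events.append((round(time.monotonic() - self.t0, 2), kind, detail))

    def save(self, path: str):
        with open(path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["t_seconds", "event", "detail"])
            writer.writerows(self.events)

# Usage during a session:
log = SessionLog()
log.record("task_start", "raise projector screen")
log.record("note", "user hesitates at console labels")
log.record("task_end", "raise projector screen")
```

Task duration is then just the difference between matching `task_start` and `task_end` timestamps, and the `note` events line up the observer's commentary with the timeline.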

Debriefing
– Thank the user and remind them of the usefulness of the results
– Pause to make sure all data is collected and all notes are written
– You may ask the user to confirm details collected

Pitfalls
– Defining the right concrete tasks:
–– test all key aspects
–– use multiple tasks for the same aspects
– Instructions to the users:
–– do NOT lead the user
–– take particular care not to use words that are identical to terms on the interface

Benefits of think-aloud
– Think-alouds “show what users are doing and why they are doing it while they are doing it in order to avoid later rationalisations” (Nielsen, Usability Engineering, Academic Press, 1993, p. 195)
– Cheap
– Slows users down: studies show users may then make fewer errors due to the extra care on critical elements

Problems of think-aloud
– Not directly quantitative
– Adds cognitive load for users
– Users' “theories” must be interpreted with care
– Slows users down
– Users are aware they are being observed, so they behave accordingly

Facilitating think-aloud
– “What are you thinking now?”
– “What do you think that message means?” (only after the user has noticed the message and is clearly spending time on it)
– Don't help the user, except with “How do you think you can do it?”
– If the user appears surprised: “Is that what you expected to happen?”

Naturalised think-aloud
– Multi-user interaction: two (or more) users work on the task
– The conversation is natural
– The observer collects the dialogue

Problems with observing users
– Hawthorne effect
– People rationalise (“telling more than we can know”)
– Qualitative data

Class activity (foreshadowed)
– Formulate three concrete tasks for testing the console of the lecture theatre
– Write these out, ready for use in an experiment
– After the next part of the seminar, each group will conduct one practice Think-Aloud, with the rest of the class writing feedback on how it was done

Summary
– Relatively inexpensive
– Can identify major flaws, and may indicate the causes of user problems
– May give access to the user's mental model
– Alters the activity, so consider what this does to the meaningfulness of the results