Introducing Evaluation Chapter 12

What is Evaluation?
- Assessing and judging a design
- Reflecting on what is to be achieved
- Assessing how successful the design is
- Identifying required changes

Why? Where? What? When?

Iterative design & evaluation is a continuous process that examines:
- Why: to check users’ requirements.
- What: a conceptual model, early prototypes of a new system and, later, more complete prototypes.
- Where: in natural and laboratory settings.
- When: throughout design; finished products can be evaluated to collect information to inform new products.

Bruce Tognazzini
- Born 1945, American
- Usability consultant and specialist in human-computer interaction
- Worked at Apple for 14 years, Sun Microsystems for 4 years, and WebMD for 4 years
- Published two books
- Publishes the webzine AskTog

What does Bruce Tognazzini think? Why do we need to evaluate? “Iterative design, with its repeating cycle of design and testing, is the only validated methodology in existence that will consistently produce successful results. If you don’t have user-testing as an integral part of your design process you are going to throw buckets of money down the drain.” See AskTog.com for topical discussions about design and evaluation.

Types of evaluation
- Controlled settings involving users, e.g. usability testing and experiments in laboratories and living labs.
- Natural settings involving users, e.g. field studies to see how the product is used in the real world.
- Any settings not involving users, e.g. consultants’ critiques and analytics, used to predict, analyze and model aspects of the interface.

Living labs
- People’s use of technology in their everyday lives can be evaluated in living labs.
- Such evaluations are too difficult to do in a usability lab.
- E.g. the Aware Home was embedded with a complex network of sensors and audio/video recording devices (Abowd et al., 2000).

Usability testing & field studies can complement each other.

Evaluation case studies
- An experiment to investigate a computer game
- An “in the wild” field study of skiers
- Crowdsourcing

Challenge & engagement in a collaborative immersive game
- Physiological measures were used.
- Players were more engaged when playing against another person than when playing against a computer.

What does this data tell you?

Why study skiers in the wild ?

e-skiing system components

Crowdsourcing: when might you use it?

Evaluating an ambient system The Hello Wall is a new kind of system that is designed to explore how people react to its presence. What are the challenges of evaluating systems like this?

Evaluation methods

Method           Controlled settings   Natural settings   Without users
Observing                x                    x
Asking users             x                    x
Asking experts           x                                       x
Testing                  x
Modeling                                                         x
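The method-by-setting matrix above can be encoded directly as a small lookup table. This is just an illustrative sketch: the pairings come from this slide (with the "asking experts" columns inferred from the flattened original), and the names and function are informal, not from the textbook.

```python
# Which evaluation methods apply in which setting (from the slide's matrix).
APPLICABLE = {
    "observing":      {"controlled", "natural"},
    "asking users":   {"controlled", "natural"},
    "asking experts": {"controlled", "without users"},
    "testing":        {"controlled"},
    "modeling":       {"without users"},
}

def methods_for(setting):
    """Return the methods usable in a given setting, alphabetically."""
    return sorted(m for m, settings in APPLICABLE.items() if setting in settings)

print(methods_for("natural"))  # → ['asking users', 'observing']
```

A structure like this makes the slide's point concrete: controlled settings admit the widest range of methods, while settings without users are limited to expert inspection and modeling.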

The language of evaluation
- Analytics
- Analytical evaluation
- Controlled experiment
- Expert review or critique
- Field study
- Formative evaluation
- Heuristic evaluation
- “In the wild” evaluation
- Living laboratory
- Predictive evaluation
- Summative evaluation
- Usability laboratory
- User studies
- Usability testing
- Users or participants

Key points
- Evaluation & design are closely integrated in user-centered design.
- Some of the same techniques are used in evaluation as for establishing requirements, but they are used differently (e.g. observation, interviews & questionnaires).
- There are three types of evaluation: laboratory-based with users, in the field with users, and studies that do not involve users.
- The main methods are: observing, asking users, asking experts, user testing, inspection, modeling users’ task performance, and analytics.
- Dealing with constraints is an important skill for evaluators to develop.

A project for you…
“The Butterfly Ballot: Anatomy of a Disaster” was written by Bruce Tognazzini; you can find it by going to AskTog.com and looking through the 2001 columns. Alternatively, go directly to:

Read Tog’s account and look at the picture of the ballot card. Make a similar ballot card for a class election and ask 10 of your friends to vote using the card. After each person has voted, ask whom they intended to vote for and whether they found the card confusing, and note down their comments. Redesign the card and repeat the test with 10 different people. Report your findings.
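One way to report your findings is to tally, for each round, how many votes were miscast (intended choice differed from the actual vote) and how many participants found the card confusing. The sketch below shows one possible tally, assuming you record each voter as an (intended, actual, confused) tuple; the example data is entirely made up for illustration.

```python
def summarize(round_results):
    """round_results: list of (intended, actual, confused) tuples, one per voter."""
    n = len(round_results)
    miscast = sum(1 for intended, actual, _ in round_results if intended != actual)
    confused = sum(1 for _, _, c in round_results if c)
    return {
        "participants": n,
        "miscast_rate": miscast / n,
        "confusion_rate": confused / n,
    }

# Hypothetical data: 10 voters with the original card, 10 with the redesign.
original = [("A", "A", False), ("B", "C", True), ("A", "A", True),
            ("C", "C", False), ("B", "B", True), ("A", "B", True),
            ("C", "C", False), ("A", "A", False), ("B", "B", True),
            ("C", "A", True)]
redesign = [("A", "A", False), ("B", "B", False), ("A", "A", False),
            ("C", "C", False), ("B", "B", True), ("A", "A", False),
            ("C", "C", False), ("A", "A", False), ("B", "B", False),
            ("C", "C", False)]

print("original:", summarize(original))
print("redesign:", summarize(redesign))
```

Comparing the two summaries gives you concrete before/after numbers to accompany the voters’ comments in your report.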