©2011 www.id-book.com
Chapter 12: Introducing Evaluation

The aims To illustrate how observation, interviews and questionnaires, which you encountered in Chapters 7 and 8, are used in evaluation. To explain the key concepts and terms used in evaluation. To introduce three main evaluation approaches and key evaluation methods within the context of real evaluation studies.

Six evaluation case studies Evaluating early design ideas for a mobile device for rural nurses in India. Evaluating cell phones for different markets. Evaluating affective issues: challenge and engagement in a collaborative immersive game. Improving a design: the HutchWorld patient support system. Multiple methods help ensure good usability: the Olympic Messaging System (OMS). Evaluating a new kind of interaction: an ambient system.

Why, what, where and when to evaluate Iterative design & evaluation is a continuous process that examines: Why: to check that users can use the product and that they like it. What: a conceptual model, early prototypes of a new system and later, more complete prototypes. Where: in natural and laboratory settings. When: throughout design; finished products can be evaluated to collect information to inform new products. Designers need to check that they understand users’ requirements.
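The why/what/where/when decisions above can be captured as a simple checklist record. The sketch below is illustrative only; the `EvaluationPlan` name and its fields are hypothetical, not something the chapter defines:

```python
from dataclasses import dataclass

@dataclass
class EvaluationPlan:
    """Illustrative record of the why/what/where/when decisions for one study."""
    why: str    # goal, e.g. checking that users can use the product and like it
    what: str   # artifact under test: conceptual model, early or later prototype
    where: str  # natural setting or laboratory
    when: str   # point in the design cycle; finished products can also be evaluated

# Example: an early-stage lab study of a prototype
plan = EvaluationPlan(
    why="Check that users can complete core tasks",
    what="Early paper prototype",
    where="Usability laboratory",
    when="Early in design, before later, more complete prototypes",
)
print(plan.where)  # Usability laboratory
```

Writing the four answers down before a study starts is one lightweight way to keep iterative design and evaluation aligned with users' requirements.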

Bruce Tognazzini tells you why you need to evaluate “Iterative design, with its repeating cycle of design and testing, is the only validated methodology in existence that will consistently produce successful results. If you don’t have user-testing as an integral part of your design process you are going to throw buckets of money down the drain.” See AskTog.com for topical discussions about design and evaluation.

The language of evaluation: analytical evaluation, controlled experiment, field study, formative evaluation, heuristic evaluation, predictive evaluation, summative evaluation, usability laboratory, user studies, usability studies, usability testing, user testing.

Evaluation approaches: usability testing, field studies, analytical evaluation, combining approaches, and opportunistic evaluations.

Characteristics of approaches

            Usability testing    Field studies    Analytical
Users       do task              natural          not involved
Location    controlled           natural          anywhere
When        prototype            early            prototype
Data        quantitative         qualitative      problems
Feedback    measures & errors    descriptions     problems
Type        applied              naturalistic     expert

Evaluation approaches and methods

Method            Usability testing    Field studies    Analytical
Observing         x                    x
Asking users      x                    x
Asking experts                         x                x
Testing           x
Modeling                                                x
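The matrix above can be encoded as a small lookup table for checking which methods fit a chosen approach. This is a hypothetical sketch; the `METHODS` mapping and `methods_for` helper are illustrative names, not part of the chapter:

```python
# Mapping of evaluation methods to the approaches that use them,
# following the matrix above (illustrative encoding, not from the chapter).
METHODS = {
    "observing":      {"usability testing", "field studies"},
    "asking users":   {"usability testing", "field studies"},
    "asking experts": {"field studies", "analytical"},
    "testing":        {"usability testing"},
    "modeling":       {"analytical"},
}

def methods_for(approach: str) -> set[str]:
    """Return the set of methods applicable to a given evaluation approach."""
    return {m for m, approaches in METHODS.items() if approach in approaches}

print(sorted(methods_for("analytical")))  # ['asking experts', 'modeling']
```

Because approaches share methods, combining approaches in one study (triangulation, discussed in the key points) amounts to drawing methods from more than one column.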

Evaluation to design a mobile record system for Indian AMWs A field study used observations and interviews to refine the requirements. The system would replace a paper one and had to be easy to use in rural environments. Basic information would be recorded: the identity of each household, the head of house, the number of members, and the age and medical history of members, etc.

Could these icons be used in other cultures? For more interesting examples of mobile designs for the developing world, see Gary Marsden’s home page:

Evaluating cell phones for different world markets An already existing product was used as a prototype for a new market, evaluated with observation and interviews. Many practical problems needed to be overcome: can you name some? Go to Nokia’s website and select a phone, or imagine evaluating this one in a country that Nokia serves.

Challenge & engagement in a collaborative immersive game Physiological measures were used. Players were more engaged when playing against another person than when playing against a computer. What were the precautionary measures that the evaluators had to take?

What does this data tell you?

The HutchWorld patient support system This virtual world supports communication among cancer patients. Privacy, logistics, patients’ feelings, etc. had to be taken into account. Designers and patients speak different languages. Participants in this world can design their own avatar. Look at the “My appearance” slide that follows. How would you evaluate it?

My Appearance

Multiple methods to evaluate the 1984 OMS Early tests of printed scenarios & user guides. Early simulations of the telephone keypad. An Olympian joined the team to provide feedback. Interviews & demos with Olympians outside the US. Overseas interface tests with friends and family. Free coffee and donut tests. Usability tests with 100 participants. A ‘try to destroy it’ test. A pre-Olympic field test at an international event. Reliability testing of the system under heavy traffic.

Something to think about Why was the design of the OMS a landmark in interaction design? Today cell phones replace the need for the OMS. What are some of the benefits and losses of cell phones in this context? How might you compensate for the losses that you thought of?

Evaluating an ambient system The Hello Wall is a new kind of system that is designed to explore how people react to its presence. What are the challenges of evaluating systems like this?

Key points Evaluation & design are closely integrated in user-centered design. Some of the same techniques are used in evaluation as for establishing requirements but they are used differently (e.g. observation interviews & questionnaires). Three main evaluation approaches are: usability testing, field studies, and analytical evaluation. The main methods are:observing, asking users, asking experts, user testing, inspection, and modeling users’ task performance. Different evaluation approaches and methods are often combined in one study. Triangulation involves using a combination of techniques to gain different perspectives, or analyzing data using different techniques. Dealing with constraints is an important skill for evaluators to develop.