Chapter 11: An Evaluation Framework
Group 4: Tony Masi, Sam Esswein, Brian Rood, & Chris Troisi


Introduction
Evaluation helps ensure that the product meets users' needs. Recall HutchWorld and the Olympic Messaging System (OMS) from Chapter 10.
What to evaluate? Usability and the user experience.

Chapter Goals
- Key concepts and terms for discussing evaluation
- A description of evaluation paradigms and techniques
- Conceptual, practical, and ethical issues
- The DECIDE framework for evaluation

Evaluation Paradigms
Key terms: evaluation paradigms, user studies.
There are four core evaluation paradigms:
- "quick and dirty" evaluation
- usability testing
- field studies
- predictive evaluation

Key Terms
- User studies involve looking at how people behave, in their natural environments or in the laboratory, with both old technologies and new ones.
- An evaluation paradigm is the set of beliefs that guides a particular type of evaluation.

"Quick and Dirty" Evaluation
"Quick and dirty" evaluation describes the common practice in which designers informally get feedback from users or consultants to confirm that their ideas are in line with users' needs and are liked. Quick and dirty evaluations can be done at any time; the emphasis is on fast input to the design process rather than carefully documented findings.

Usability Testing
Usability testing involves recording typical users' performance on typical tasks in controlled settings. As the users perform these tasks they are watched and recorded on video, and their key presses are logged. These data are used to calculate performance times, identify errors, and help explain why the users did what they did. User satisfaction questionnaires and interviews are used to elicit users' opinions. Recall HutchWorld and the OMS.
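
To make that concrete, here is a minimal sketch, assuming a hypothetical timestamped event log (the event names and sample data are invented), of how performance times and error counts might be computed from logged key presses and events:

```python
from datetime import datetime

# Hypothetical session log: (ISO timestamp, event, detail) tuples.
log = [
    ("2011-03-01T10:00:00", "task_start", "book_ticket"),
    ("2011-03-01T10:00:42", "error", "wrong_menu"),
    ("2011-03-01T10:02:15", "task_end", "book_ticket"),
]

def summarize(log, task):
    """Return completion time (seconds) and error count for one task."""
    start = end = None
    errors = 0
    for stamp, event, detail in log:
        t = datetime.fromisoformat(stamp)
        if event == "task_start" and detail == task:
            start = t
        elif event == "task_end" and detail == task:
            end = t
        elif event == "error" and start is not None and end is None:
            errors += 1
    if start is None or end is None:
        return None  # task never started or never finished
    return {"seconds": (end - start).total_seconds(), "errors": errors}

print(summarize(log, "book_ticket"))  # {'seconds': 135.0, 'errors': 1}
```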

Field Studies
Field studies are done in natural settings. The aim is to understand what users do naturally and how technology impacts them. In product design, field studies can be used to:
- identify opportunities for new technology
- determine design requirements
- decide how best to introduce new technology
- evaluate technology in use

Field Studies: Two Approaches
- Outsider: observing and recording what happens as an outsider looking in.
- Insider: a participant in the study who explores the details of what happens in a particular setting.

Predictive Evaluation
Experts apply their knowledge of typical users, often guided by heuristics (design principles used in practice), to predict usability problems. Another approach involves theoretically based models. A key feature of predictive evaluation is that users need not be present, which makes it relatively quick and inexpensive.

Key Aspects of Each Evaluation Paradigm
Table 11.1 (page 344) compares the paradigms on:
- the role of users
- who controls the process, and the relationships during evaluation
- the location
- when it is most useful to evaluate
- the type of data collected and how it is analyzed
- how findings are fed back into the design process
- the philosophy that underlies each paradigm

Evaluation Techniques
- Observing users
- Asking users their opinions
- Asking experts their opinions
- Testing users' performance
- Modeling users' task performance to predict the efficacy of a user interface

Observing Users
Techniques: notes, audio, video, and interaction logs.

Asking Users Their Opinions
Typical questions:
- What do you think about the product?
- Does it do what you want?
- Do you like it?
- Does the aesthetic design appeal to you?
- Did you encounter problems?
- Would you use it again?
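
Opinions are often gathered with standardized questionnaires. As an illustration (not from this chapter), here is a minimal sketch of scoring the widely used System Usability Scale (SUS), which folds ten 1-5 ratings into a 0-100 score; the sample responses are made up:

```python
def sus_score(responses):
    """Score a 10-item SUS questionnaire (each response is 1-5).

    Odd-numbered items contribute (response - 1), even-numbered items
    contribute (5 - response); the total is scaled by 2.5 to 0-100.
    """
    assert len(responses) == 10
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

# One participant's (made-up) answers to the ten items.
print(sus_score([4, 2, 5, 1, 4, 2, 5, 1, 4, 2]))  # -> 85.0
```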

Asking Experts Their Opinions
Experts use heuristics to step through tasks, typically role-playing typical users, to identify problems. Asking experts is quick and inexpensive compared with laboratory and field evaluations.

User Testing
- Recall the HutchWorld example.
- Usually conducted in a controlled environment.
- Users perform well-defined tasks.
- Data can be collected and statistically analyzed (see the sketch below).
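
For example, completion times from two design variants can be compared with a simple significance test. A minimal sketch using only the standard library (the timing data are made up, and a real analysis would also report a p-value and effect size):

```python
from math import sqrt
from statistics import mean, variance

# Made-up completion times (seconds) for the same task on two designs.
old_design = [34.1, 29.8, 41.2, 37.5, 30.9]
new_design = [27.3, 25.1, 33.0, 29.4, 26.2]

def welch_t(a, b):
    """Welch's t statistic for comparing two independent sample means."""
    return (mean(a) - mean(b)) / sqrt(variance(a) / len(a) + variance(b) / len(b))

print(f"old: {mean(old_design):.1f} s, new: {mean(new_design):.1f} s")
print(f"Welch's t = {welch_t(old_design, new_design):.2f}")
```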

Modeling Users' Task Performance
Human-computer interaction is modeled in order to predict the efficiency of, and problems with, a design. This approach is most successful for systems with limited functionality. (See the table on page 347.)
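
A classic model of this kind is the Keystroke-Level Model (KLM) from the GOMS family, which predicts expert, error-free task time by summing standard operator times. A minimal sketch using commonly cited operator estimates; the task breakdown is invented:

```python
# Commonly cited KLM operator times in seconds: K = average keystroke,
# P = point with mouse, B = mouse button press, H = home hands on
# keyboard/mouse, M = mental preparation.
KLM = {"K": 0.28, "P": 1.10, "B": 0.10, "H": 0.40, "M": 1.35}

def predict_time(operators):
    """Predicted expert, error-free execution time for an operator sequence."""
    return sum(KLM[op] for op in operators)

# Hypothetical task: think, move hand to mouse, point at a field,
# click, then type a 6-character code.
task = ["M", "H", "P", "B"] + ["K"] * 6
print(f"{predict_time(task):.2f} s")  # -> 4.63 s
```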

The DECIDE Framework
- Determine the goals the evaluation addresses.
- Explore the specific questions to be answered.
- Choose the evaluation paradigm and techniques to answer the questions.
- Identify the practical issues.
- Decide how to deal with the ethical issues.
- Evaluate, interpret, and present the data.
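
Because DECIDE functions as a checklist, an evaluation plan can be written down as a simple structure. A minimal sketch whose fields mirror the six items (the example content is invented):

```python
from dataclasses import dataclass

@dataclass
class EvaluationPlan:
    goals: list              # Determine the goals
    questions: list          # Explore the questions
    paradigm: str            # Choose the evaluation paradigm...
    techniques: list         # ...and techniques
    practical_issues: list   # Identify the practical issues
    ethical_issues: list     # Decide how to deal with ethics
    analysis_plan: str       # Evaluate, interpret, and present the data

plan = EvaluationPlan(
    goals=["Improve the usability of the e-ticket interface"],
    questions=["Why do customers prefer paper tickets to e-tickets?"],
    paradigm="usability testing",
    techniques=["user testing", "satisfaction questionnaire"],
    practical_issues=["recruit typical users", "stay on budget and schedule"],
    ethical_issues=["informed consent form", "anonymize personal data"],
    analysis_plan="Report completion times, errors, and questionnaire scores.",
)
print(plan.paradigm)
```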

Determining the Goals
What are the goals of the evaluation? Who wants it and why? The goals influence the paradigm for the study. Some example goals:
- Check that evaluators have understood users' needs.
- Check that the final interface is consistent.
- Investigate how technology affects working practices.
- Improve the usability of an existing product.

Explore the Questions
All evaluations need goals and questions to guide them so that time is not wasted on ill-defined studies. For example, the goal of finding out why many customers prefer to purchase paper airline tickets rather than e-tickets can be broken down into sub-questions:
- What are customers' attitudes toward these new tickets?
- Are they concerned about security?
- Is the interface for obtaining them poor?

Choose the Evaluation Paradigm and Techniques
- The evaluation paradigm determines which types of techniques are used.
- Choosing involves trade-offs.
- Techniques are often combined, as in HutchWorld.

Identifying Practical Issues
For example, how to:
- select users
- stay on budget
- stay on schedule
- select evaluators
- select equipment

Decide on Ethical Issues
Consider people's rights, and develop an informed consent form. Participants have a right to:
- know the goals of the study
- know what will happen to the findings
- privacy of personal information
- not be quoted without their agreement
- leave when they wish
A guiding rule: "Do unto others only what you would not mind being done to you."

Evaluate, Interpret, and Present Data
Consider:
- Reliability: does the technique produce the same results on separate occasions?
- Validity: does the technique measure what it is intended to measure?
- Biases: is anything distorting the results?
- Scope: how widely can the findings be applied?
- Ecological validity: does the evaluation environment itself distort the results?

Pilot Studies
A pilot study is a small trial run of the main study. Pilot studies are always useful for testing the plans for an evaluation before launching the main study, and evaluators often run several of them.