An evaluation framework


An evaluation framework

The aims
- Explain key evaluation concepts & terms.
- Describe the evaluation paradigms & techniques used in interaction design.
- Discuss the conceptual, practical and ethical issues that must be considered when planning evaluations.
- Introduce the DECIDE framework.

Some definitions/contrasts
- Evaluation study design
- Methods/techniques for data gathering
- Methods/techniques for data analysis
- Research vs. industry
- Quantitative vs. qualitative
- Objective vs. subjective
- Experimental vs. field studies

Evaluation paradigm
Any kind of evaluation is guided explicitly or implicitly by a set of beliefs, which are often underpinned by theory. These beliefs and the methods associated with them are known as an 'evaluation paradigm'.

User studies
User studies involve looking at how people behave in their natural environments, or in the laboratory, both with old technologies and with new ones.

Four evaluation paradigms
- 'quick and dirty'
- usability testing
- field studies
- predictive evaluation
See the tables on pages 344 and 347.

Quick and dirty
'Quick & dirty' evaluation describes the common practice in which designers informally get feedback from users or consultants to confirm that their ideas are in line with users' needs and are liked. Quick & dirty evaluations can be done at any time. The emphasis is on fast input to the design process rather than carefully documented findings.

Usability testing
Usability testing involves recording typical users' performance on typical tasks in controlled settings. Field observations may also be used. As the users perform these tasks they are watched & recorded on video, and their key presses are logged. This data is used to calculate performance times, identify errors & help explain why the users did what they did. User satisfaction questionnaires & interviews are used to elicit users' opinions.
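To make the measurement step concrete, here is a minimal sketch in Python of deriving performance times and error counts from a logged session. The event-log format, task name, and participant IDs are hypothetical, invented purely for illustration:

```python
# Minimal sketch: computing task completion times and error counts from a
# usability-test event log. The record format (participant, task, event,
# timestamp in seconds) is an assumption, not a standard.
from collections import defaultdict

log = [
    ("P1", "buy-ticket", "task_start", 0.0),
    ("P1", "buy-ticket", "error", 42.5),      # e.g., opened the wrong menu
    ("P1", "buy-ticket", "task_end", 95.0),
    ("P2", "buy-ticket", "task_start", 0.0),
    ("P2", "buy-ticket", "task_end", 61.0),
]

starts, durations = {}, {}
errors = defaultdict(int)

for participant, task, event, t in log:
    key = (participant, task)
    if event == "task_start":
        starts[key] = t
    elif event == "task_end":
        durations[key] = t - starts[key]
    elif event == "error":
        errors[key] += 1

for key, secs in durations.items():
    print(f"{key}: completed in {secs:.1f}s with {errors[key]} error(s)")
```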

Field studies
Field studies are done in natural settings. The aim is to understand what users do naturally and how technology impacts them. In product design, field studies can be used to:
- identify opportunities for new technology
- determine design requirements
- decide how best to introduce new technology
- evaluate technology in use.

Predictive evaluation
- Experts apply their knowledge of typical users, often guided by heuristics, to predict usability problems.
- Another approach involves theoretically based models.
- A key feature of predictive evaluation is that users need not be present.
- Relatively quick & inexpensive.

Overview of techniques
- observing users (chapter 12)
- asking users their opinions (chapter 13)
- asking experts their opinions (chapter 13)
- testing users' performance (chapter 14)
- modeling users' task performance (chapter 14)

DECIDE: A framework to guide evaluation
- Determine the goals the evaluation addresses.
- Explore the specific questions to be answered.
- Choose the evaluation paradigm and techniques to answer the questions.
- Identify the practical issues.
- Decide how to deal with the ethical issues.
- Evaluate, interpret and present the data.
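For readers who like to see a framework operationalized, the six steps could be captured as a simple plan structure in code. This is purely an illustrative sketch: the class, field names, and example values are all invented and are not part of the DECIDE framework itself:

```python
# Illustrative only: a DECIDE evaluation plan as a checklist-style
# data structure. All field names and example values are invented.
from dataclasses import dataclass

@dataclass
class DecidePlan:
    determine_goals: list[str]
    explore_questions: list[str]
    choose_approach: str
    identify_practical_issues: list[str]
    decide_ethical_issues: list[str]
    evaluate_data_plan: str

plan = DecidePlan(
    determine_goals=["Improve the usability of an existing product"],
    explore_questions=["Why do customers prefer paper tickets to e-tickets?"],
    choose_approach="usability testing + satisfaction questionnaires",
    identify_practical_issues=["recruit typical users", "stay on budget"],
    decide_ethical_issues=["informed consent form", "anonymize recordings"],
    evaluate_data_plan="task times & error rates; check reliability/validity",
)
print(plan)
```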

Determine the goals
What are the high-level goals of the evaluation? Who wants it and why? The goals influence the paradigm for the study. Some examples of goals:
- Identify the best metaphor on which to base the design.
- Check to ensure that the final interface is consistent.
- Investigate how technology affects working practices.
- Improve the usability of an existing product.

Explore the questions
All evaluations need goals & questions to guide them so time is not wasted on ill-defined studies. For example, the goal of finding out why many customers prefer to purchase paper airline tickets rather than e-tickets can be broken down into sub-questions:
- What are customers' attitudes to these new tickets?
- Are they concerned about security?
- Is the interface for obtaining them poor?
What questions might you ask about the design of a cell phone?

Choose the evaluation paradigm & techniques
The evaluation paradigm strongly influences the techniques used, and how data is analyzed and presented. For example, field studies do not involve testing or modeling.

Identify practical issues
For example, how to:
- select users
- stay on budget
- stay on schedule
- find evaluators
- select equipment

Decide on ethical issues
Develop an informed consent form. Participants have a right to:
- know the goals of the study
- know what will happen to the findings
- privacy of personal information
- not be quoted without their agreement
- leave when they wish
- be treated politely

Evaluate, interpret & present data
How data is analyzed & presented depends on the paradigm and techniques used. The following also need to be considered:
- Reliability: can the study be replicated?
- Validity: is it measuring what you thought?
- Biases: is the process creating biases?
- Scope: can the findings be generalized?
- Ecological validity: is the environment of the study influencing it? (e.g., the Hawthorne effect)

Pilot studies
A small trial run of the main study. The aim is to make sure your plan is viable. Pilot studies check:
- that you can conduct the procedure
- that interview scripts, questionnaires, experiments, etc. work appropriately
It's worth doing several to iron out problems before doing the main study. Ask colleagues if you can't spare real users.

Key points
- An evaluation paradigm is an approach that is influenced by particular theories and philosophies.
- Five categories of techniques were identified: observing users, asking users, asking experts, user testing, modeling users.
- The DECIDE framework has six parts:
  - Determine the overall goals
  - Explore the questions that satisfy the goals
  - Choose the paradigm and techniques
  - Identify the practical issues
  - Decide on the ethical issues
  - Evaluate ways to analyze & present data
- Do a pilot study.

Observing users

The aims
- Discuss the benefits & challenges of different types of observation.
- Describe how to observe as an on-looker, a participant, & an ethnographer.
- Discuss how to collect, analyze & present observational data.
- Examine think-aloud, diary studies & logging.
- Provide you with the means to conduct observation and to critique observation studies.

What and when to observe
Goals & questions determine the paradigms and techniques used. Observation is valuable any time during design: quick & dirty observations early in design, for example. Observation can be done in the field (i.e., field studies) and in controlled environments (i.e., usability studies). Observers can be:
- outsiders looking on
- participants, i.e., participant observers
- ethnographers

Frameworks to guide observation
A simple framework:
- The person. Who?
- The place. Where?
- The thing. What?
The Goetz and LeCompte (1984) framework:
- Who is present? What is their role?
- What is happening?
- When does the activity occur?
- Where is it happening?
- Why is it happening?
- How is the activity organized?

The Robson (1993) framework:
- Space. What is the physical space like?
- Actors. Who is involved?
- Activities. What are they doing?
- Objects. What objects are present?
- Acts. What are individuals doing?
- Events. What kind of event is it?
- Goals. What are they trying to accomplish?
- Feelings. What is the mood of the group and of individuals?

You need to consider
- Goals & questions
- Which framework & techniques
- How to collect data
- Which equipment to use
- How to gain acceptance
- How to handle sensitive issues
- Whether and how to involve informants
- How to analyze the data
- Whether to triangulate

Observing as an outsider
- As in usability testing
- More objective than participant observation
- In a usability lab, equipment is in place
- Recording is continuous
- Analysis & observation are almost simultaneous
- Care is needed to avoid drowning in data
- Analysis can be coarse or fine grained
- Video clips can be powerful for telling the story

Participant observation & ethnography
- There is debate about the differences between them.
- Participant observation is a key component of ethnography.
- You must get the co-operation of the people observed.
- Informants are useful.
- Data analysis is continuous.
- It is an interpretivist technique.
- Questions get refined as understanding grows.
- Reports usually contain examples.

Data collection techniques
- Notes & still camera
- Audio & still camera
- Video
- Tracking users: diaries and interaction logging
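Of these, interaction logging is the most readily automated. Below is a minimal sketch of a timestamped event logger that appends JSON lines to a file; the event names, fields, and log path are hypothetical, chosen only to show the shape of such a logger:

```python
# Minimal sketch of interaction logging: appending timestamped user events
# to a JSON-lines file. Event names and the log path are invented.
import json
import time

LOG_PATH = "interaction_log.jsonl"

def log_event(participant: str, event: str, **details) -> None:
    """Append one timestamped interaction event to the log file."""
    record = {"t": time.time(), "participant": participant,
              "event": event, **details}
    with open(LOG_PATH, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")

# Example usage during a session:
log_event("P1", "click", target="search_button")
log_event("P1", "keypress", key="Enter")
```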

Data analysis
- Qualitative data: interpreted & used to tell the 'story' about what was observed.
- Qualitative data: categorized using techniques such as content analysis.
- Quantitative data: collected from interaction & video logs; presented as values, tables, charts, graphs and treated statistically.

Interpretive data analysis
- Look for key events that drive the group's activity
- Look for patterns of behavior
- Test data sources against each other: triangulate
- Report findings in a convincing and honest way
- Produce 'rich' or 'thick' descriptions
- Include quotes, pictures, and anecdotes
- Software tools can be useful, e.g., NUD*IST, Ethnograph (see URL resource list for examples)

Looking for patterns
- Critical incident analysis
- Content analysis
- Discourse analysis
- Quantitative analysis, i.e., statistics
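At its simplest, the counting step of content analysis tallies how often each category appears in coded observation notes. A minimal sketch follows, where the category codes and notes are invented and the coding itself is assumed to have been done by a human analyst:

```python
# Minimal sketch of the counting step of content analysis: tallying
# category codes a human coder assigned to observation notes.
# Codes and notes are invented for illustration.
from collections import Counter

coded_notes = [
    ("navigation_problem", "User could not find the search field"),
    ("terminology", "User unsure what 'e-ticket' means"),
    ("navigation_problem", "User backtracked three times in the menu"),
    ("satisfaction", "User said the layout felt friendly"),
]

counts = Counter(code for code, _ in coded_notes)
for code, n in counts.most_common():
    print(f"{code}: {n}")
```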

Key points
- Observe from outside or as a participant.
- Analyzing video and data logs can be time-consuming.
- In participant observation, collections of comments, incidents, and artifacts are made.
- Ethnography is a philosophy with a set of techniques that include participant observation and interviews.
- Ethnographers immerse themselves in the culture that they study.