CS305: HCI in SW Development: Evaluation (Return to…)

We've had an introduction to evaluation. Now for more details on…

Topics to Cover
- Types of evaluation: paradigm, categories of techniques, specific techniques
- How to design an evaluation: ID-book's DECIDE framework
- Using experts for inspections and walkthroughs

Reminder: Formative vs. Summative Eval.
- Summative: after the item is complete. Does it "meet spec"?
- Formative: as something is being developed; begin as early as possible, for the purpose of affecting the process and the item being developed.

High-level Categories of Techniques
- observing users
- asking users for their opinions
- asking experts for their opinions
- testing users' performance
- modeling users' task performance

Evaluation paradigm
Any kind of evaluation is guided, explicitly or implicitly, by a set of beliefs, which are often underpinned by theory. These beliefs and the methods associated with them are known as an "evaluation paradigm."

Four evaluation paradigms
- "quick and dirty"
- usability testing
- field studies
- predictive evaluation

Quick and dirty
"Quick & dirty" evaluation describes the common practice in which designers informally get feedback from users or consultants to confirm that their ideas are in line with users' needs and are liked. Quick & dirty evaluations can be done at any time. The emphasis is on fast input to the design process rather than carefully documented findings.

Quick and Dirty cont'd
Applies to various other approaches. E.g., it could use either:
- a small number of experts (i.e., predictive evaluation)
- a small number of subjects (i.e., usability testing)
How many subjects? Not as many as you think! See Nielsen's data on why 5 users is probably enough: http://www.useit.com/alertbox/20000319.html
- 15 users find essentially 100% of problems; 5 users find about 85%.
- Better to run three studies with 5 users each than one with 15? (A back-of-the-envelope version of Nielsen's curve is sketched below.)
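
A minimal sketch of the model behind Nielsen's numbers. The code and function name are ours, not from the slides; the discovery rate L ~ 0.31 is the average reported in Nielsen & Landauer's data.

    # Expected fraction of usability problems found by n test users:
    # found(n) = 1 - (1 - L)**n, where L is the probability that a
    # single user reveals a given problem (L ~ 0.31 on average in
    # Nielsen & Landauer's projects).

    def proportion_found(n_users, discovery_rate=0.31):
        """Expected fraction of problems uncovered by n_users."""
        return 1 - (1 - discovery_rate) ** n_users

    for n in (1, 3, 5, 10, 15):
        print(f"{n:2d} users -> {proportion_found(n):.0%} of problems")
    # 5 users -> 84% (the slide's ~85%); 15 users -> ~100%

This also shows why three studies of 5 users can beat one study of 15: each small study is run on a revised design, so later users find new problems rather than rediscovering known ones.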

Usability testing
Usability testing involves recording typical users' performance on typical tasks in controlled settings. Field observations may also be used. As the users perform these tasks they are watched and recorded on video, and their key presses are logged. These data are used to calculate performance times, identify errors, and help explain why the users did what they did. User satisfaction questionnaires and interviews are used to elicit users' opinions. (A minimal sketch of this kind of log analysis follows.)
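
To make the data-analysis step concrete, here is a minimal sketch that computes per-task completion times and error counts from a timestamped event log. The log format, field names, and numbers are assumptions for illustration, not from any specific logging tool.

    from collections import defaultdict

    # Illustrative event log: (seconds, participant, task, event).
    log = [
        (0.0,  "P1", "book_ticket", "task_start"),
        (12.4, "P1", "book_ticket", "error"),      # e.g., wrong menu chosen
        (95.2, "P1", "book_ticket", "task_end"),
        (0.0,  "P2", "book_ticket", "task_start"),
        (61.7, "P2", "book_ticket", "task_end"),
    ]

    starts, durations = {}, {}
    errors = defaultdict(int)
    for ts, who, task, event in log:
        key = (who, task)
        if event == "task_start":
            starts[key] = ts
        elif event == "task_end":
            durations[key] = ts - starts[key]   # completion time = end - start
        elif event == "error":
            errors[key] += 1

    for key, t in durations.items():
        print(f"{key}: {t:.1f}s, {errors[key]} error(s)")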

Usability Engineering
Term coined by staff at Digital Equipment Corp. around 1986. Concerned with techniques for planning, achieving, and verifying objectives for system usability:
- Measurable goals must be defined early.
- Goals must be assessed repeatedly (note "verifying" above).
Definition by Christine Faulkner (2000): "UE is an approach to the development of software and systems which involves user participation from the outset and guarantees the usefulness of the product through the use of a usability specification and metrics."
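
As an invented illustration of "measurable goals defined early and assessed repeatedly," here is a minimal sketch of a usability specification checked in code. Every metric name and number below is an assumption for illustration, not from the slides or from Faulkner's book.

    # Usability specification: worst acceptable level and planned target
    # per metric, re-checked against measurements on each iteration.
    spec = {
        # metric: (worst acceptable, planned target); lower is better
        "mean_task_time_s": (120.0, 60.0),
        "errors_per_task":  (3.0, 1.0),
    }

    measured = {"mean_task_time_s": 74.2, "errors_per_task": 1.4}

    for metric, (worst, planned) in spec.items():
        value = measured[metric]
        if value > worst:
            status = "FAIL: below worst acceptable level"
        elif value > planned:
            status = "acceptable, not yet at planned target"
        else:
            status = "meets planned target"
        print(f"{metric}: {value} -> {status}")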

Field studies
Field studies are done in natural settings. The aim is to understand what users do naturally and how technology impacts them. In product design, field studies can be used to:
- identify opportunities for new technology
- determine design requirements
- decide how best to introduce new technology
- evaluate technology in use

Predictive evaluation
Experts apply their knowledge of typical users, often guided by heuristics, to predict usability problems:
- Heuristic evaluation
- Walkthroughs
Another approach involves theoretically based models for predicting times and errors: GOMS and the Fitts' Law formula.
A key feature of predictive evaluation is that users need not be present, which makes it relatively quick and inexpensive.
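
To make the modeling approach concrete, here is a minimal sketch of the Shannon formulation of Fitts' Law, MT = a + b * log2(D/W + 1). The constants a and b are device-dependent and must be fitted empirically, so the defaults below are placeholders, not real measurements.

    import math

    def fitts_movement_time(distance, width, a=0.1, b=0.15):
        """Predicted pointing time in seconds (Shannon formulation):
        MT = a + b * log2(D/W + 1). The constants a and b are
        device-dependent and fitted from data; these defaults are
        placeholders for illustration only."""
        return a + b * math.log2(distance / width + 1)

    # A distant, small target is predicted to take longer than a near, large one:
    print(fitts_movement_time(distance=800, width=20))  # ~0.90 s
    print(fitts_movement_time(distance=100, width=80))  # ~0.28 s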

How to Plan an Evaluation?
ID-book's DECIDE framework:
- captures many important practical issues
- works with all categories of study

DECIDE: A framework to guide evaluation
- Determine the goals the evaluation addresses.
- Explore the specific questions to be answered.
- Choose the evaluation paradigm and techniques to answer the questions.
- Identify the practical issues.
- Decide how to deal with the ethical issues.
- Evaluate, interpret, and present the data.

Determine the goals
What are the high-level goals of the evaluation? Who wants it and why? The goals influence the paradigm for the study. Some examples of goals:
- Identify the best metaphor on which to base the design.
- Check to ensure that the final interface is consistent.
- Investigate how technology affects working practices.
- Improve the usability of an existing product.

Explore the questions
All evaluations need goals and questions to guide them so time is not wasted on ill-defined studies. For example, the goal of finding out why many customers prefer to purchase paper airline tickets rather than e-tickets can be broken down into sub-questions:
- What are customers' attitudes to these new tickets?
- Are they concerned about security?
- Is the interface for obtaining them poor?
What questions might you ask about the design of a cell phone?

Choose the evaluation paradigm & techniques
The evaluation paradigm strongly influences the techniques used and how data is analyzed and presented. E.g., field studies do not involve testing or modeling.

Identify practical issues
For example, how to:
- select users
- stay on budget
- stay on schedule
- find evaluators
- select equipment

Decide on ethical issues
Develop an "informed consent" agreement; see example(s) in the text, on the Web site, etc. Participants have a right to:
- know the goals of the study
- know what will happen to the findings
- privacy of personal information
- not be quoted without their agreement
- leave when they wish
- be treated politely

Evaluate, interpret & present data
How data is analyzed and presented depends on the paradigm and techniques used. The following also need to be considered:
- Reliability: can the study be replicated?
- Validity: is it measuring what you thought?
- Biases: is the process creating biases?
- Scope: can the findings be generalized?
- Ecological validity: is the environment of the study influencing it? (e.g., the Hawthorne effect)

Pilot studies
A small trial run of the main study; the aim is to make sure your plan is viable. Pilot studies check:
- that you can conduct the procedure
- that interview scripts, questionnaires, experiments, etc. work appropriately
It's worth doing several to iron out problems before doing the main study. Ask colleagues if you can't spare real users.

Key points
- An evaluation paradigm is an approach that is influenced by particular theories and philosophies.
- Five categories of techniques were identified: observing users, asking users, asking experts, user testing, modeling users.
- The DECIDE framework has six parts:
  - Determine the overall goals
  - Explore the questions that satisfy the goals
  - Choose the paradigm and techniques
  - Identify the practical issues
  - Decide on the ethical issues
  - Evaluate ways to analyze & present data
- Do a pilot study

Applying this…
In class we looked at the method described in the article "Prototyping for Tiny Fingers". We examined it as follows:
- Use the DECIDE framework to analyze it.
- Which paradigms are involved?
- Does the study report address each aspect of DECIDE?