User-Centered Design and Development Instructor: Franz J. Kurfess Computer Science Dept. Cal Poly San Luis Obispo FJK 2005.


1 User-Centered Design and Development Instructor: Franz J. Kurfess Computer Science Dept. Cal Poly San Luis Obispo FJK 2005

2 Copyright Notice These slides are a revised version of the originals provided with the book “Interaction Design” by Jennifer Preece, Yvonne Rogers, and Helen Sharp, Wiley, 2002. I added some material, made some minor modifications, and created a custom show to select a subset. –Slides added or modified by me are marked with my initials (FJK), unless I forgot it … FJK 2005

3 484-W07 Quarter The slides I use in class are in the Custom Show “484-W07”; it is a subset of the whole collection in this file. Week 6 contains slides from Chapters 10 and 11 of the textbook. The original slides for this chapter were in much better shape, and I mainly made changes to better match our emphasis. FJK 2005

4 Chapter 11 Usability Evaluation Framework FJK 2005

5 Chapter Overview Evaluation paradigms and techniques DECIDE evaluation framework FJK 2005

6 Motivation Since it can be difficult to put together a customized evaluation plan for each project, it is often useful to follow a template or framework. Evaluation paradigms have been identified over time as practical combinations of methods.

7 FJK 2005 Objectives Explain key evaluation concepts & terms. Describe the evaluation paradigms & techniques used in interaction design. Discuss the conceptual, practical and ethical issues that must be considered when planning evaluations. Introduce the DECIDE framework.

8 An evaluation framework

10 FJK 2005 Evaluation paradigm Any kind of evaluation is guided explicitly or implicitly by a set of beliefs –these beliefs are often supported by theory The beliefs and the methods associated with them are known as an ‘evaluation paradigm’

11 User studies User studies look at how people behave –in their natural environments, –or in the laboratory, –with old technologies and with new ones

12 Four evaluation paradigms –‘quick and dirty’ –usability testing –field studies –predictive evaluation

13 Quick and dirty ‘Quick & dirty’ evaluation describes a common practice –designers informally get feedback from users or consultants to confirm that their ideas are in line with users’ needs and are liked. Quick & dirty evaluations can be done at any time. The emphasis is on fast input to the design process rather than carefully documented findings.

14 Usability Testing recording the performance of typical users –on typical tasks in controlled settings –field observations may also be used users are watched –recorded on video –their activities are logged (mouse movements, key presses) evaluation –calculation of performance times –identification of errors –explanation of why the users did what they did user satisfaction –questionnaires and interviews are used to elicit the opinions of users
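As a concrete illustration of the logging and evaluation steps above, the sketch below reduces an interaction log to a completion time and an error count. It is a minimal sketch in Python; the log format, event names, and timestamps are all invented for the example.

```python
# Minimal sketch: reducing a (hypothetical) usability-test interaction log
# to performance time and error count for one task.
from datetime import datetime

# Assumed log format: one (timestamp, participant, event) record per action.
log = [
    ("2007-02-12 10:00:05", "P1", "task_start"),
    ("2007-02-12 10:00:21", "P1", "key_press"),
    ("2007-02-12 10:00:34", "P1", "error"),     # e.g. wrong menu item chosen
    ("2007-02-12 10:01:02", "P1", "task_end"),
]

def task_time_and_errors(records):
    """Return (completion time in seconds, error count) for one task."""
    fmt = "%Y-%m-%d %H:%M:%S"
    start = end = None
    errors = 0
    for ts, _participant, event in records:
        t = datetime.strptime(ts, fmt)
        if event == "task_start":
            start = t
        elif event == "task_end":
            end = t
        elif event == "error":
            errors += 1
    return (end - start).total_seconds(), errors

seconds, errors = task_time_and_errors(log)
print(f"completion time: {seconds:.0f} s, errors: {errors}")  # 57 s, 1 error
```

In a real study the log would come from instrumentation (video capture, key logging), and the explanation of why users did what they did would come from think-aloud data or interviews, not from the log alone.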

15 Field Studies done in natural settings to understand what users do naturally and how technology impacts them. In product design, field studies can be used to: - identify opportunities for new technology - determine design requirements - decide how best to introduce new technology - evaluate technology in use

16 Predictive Evaluation experts apply their knowledge of typical users to predict usability problems –often guided by heuristics another approach involves theoretical models users need not be present relatively quick & inexpensive
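The slide does not name a particular theoretical model, but one classic example is the Keystroke-Level Model of Card, Moran & Newell, which predicts expert task time as a sum of standard operator times. The sketch below is illustrative Python; the operator values are the usual textbook estimates and would be calibrated for a real user population.

```python
# Minimal sketch of a Keystroke-Level Model (KLM) prediction.
# Operator times are the classic textbook estimates, in seconds.
KLM_OPERATORS = {
    "K": 0.20,  # press a key or button (average skilled typist)
    "P": 1.10,  # point at a target with a mouse
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def predict_seconds(sequence):
    """Predict expert task time for a string of KLM operators, e.g. 'MPKK'."""
    return sum(KLM_OPERATORS[op] for op in sequence)

# Hypothetical task: think (M), point at a field (P), home to the
# keyboard (H), then type four characters (KKKK).
print(predict_seconds("MPH" + "K" * 4))  # -> 3.65 (seconds)
```

Such predictions fit the paradigm's strengths: no users need be present and design alternatives can be compared quickly, but they cover only error-free expert performance.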

17 Overview of Techniques –observing users –asking users about their opinions –asking experts about their opinions –testing the performance of users –modeling the task performance of users

18 DECIDE: A framework to guide evaluation Determine the goals the evaluation addresses. Explore the specific questions to be answered. Choose the evaluation paradigm and techniques to answer the questions. Identify the practical issues. Decide how to deal with the ethical issues. Evaluate, interpret and present the data.
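As a way to make the framework concrete, an evaluation plan can be written down as a six-item checklist. The sketch below is hypothetical Python; every field value is an invented example (drawing on the e-ticket goal from the following slides), not part of the framework itself.

```python
# Minimal sketch: the six DECIDE items as a checklist for one evaluation plan.
# All field contents are invented examples.
decide_plan = {
    "Determine the goals": "improve the usability of the e-ticket interface",
    "Explore the questions": [
        "Are customers concerned about security?",
        "Is the interface for obtaining e-tickets poor?",
    ],
    "Choose paradigm & techniques": "usability testing + satisfaction questionnaire",
    "Identify practical issues": ["recruit 8 typical users", "stay on budget"],
    "Decide on ethical issues": "informed consent form; anonymize personal data",
    "Evaluate the data": "task times and error rates; check reliability, validity",
}

for step, detail in decide_plan.items():
    print(f"{step}: {detail}")
```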

19 Determine the Goals What are the high-level goals of the evaluation? Who wants it and why? The goals influence the paradigm for the study. Some examples of goals: - Identify the best metaphor on which to base the design. - Check to ensure that the final interface is consistent. - Investigate how technology affects working practices. - Improve the usability of an existing product.

20 Explore the Questions All evaluations need goals & questions to guide them so time is not wasted on ill-defined studies. For example, the goal of finding out why many customers prefer to purchase paper airline tickets rather than e-tickets can be broken down into sub-questions: - What are customers’ attitudes to these new tickets? - Are they concerned about security? - Is the interface for obtaining them poor? What questions might you ask about the design of a cell phone?

21 Choose the Evaluation Paradigm & Techniques The evaluation paradigm strongly influences the techniques used, how data is analyzed and presented. –E.g. field studies do not involve testing or modeling

22 Identify Practical Issues For example, how to: –select users –stay on budget –stay on schedule –find evaluators –select equipment

23 Decide on Ethical Issues Develop an informed consent form. Participants have a right to: - know the goals of the study - know what will happen to the findings - privacy of personal information - not be quoted without their agreement - leave when they wish - be treated politely

24 Evaluate, Interpret and Present Data How data is analyzed and presented depends on the paradigm and techniques used. The following also need to be considered: - Reliability: can the study be replicated? - Validity: is it measuring what you thought? - Biases: is the process creating biases? - Scope: can the findings be generalized? - Ecological validity: is the environment of the study influencing it? e.g. the Hawthorne effect
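As a small illustration of the analysis step (the numbers are invented, and the summary statistics shown are just one common choice), per-participant task times might be summarized before presentation:

```python
# Minimal sketch: summarizing hypothetical task-time data before presenting it.
from statistics import mean, stdev

# Invented completion times (seconds) for one task, eight participants.
times = [57, 64, 49, 80, 61, 55, 73, 66]

print(f"n={len(times)}  mean={mean(times):.1f} s  sd={stdev(times):.1f} s")

# A large spread relative to the mean is one cue to question reliability
# (would a replication give similar numbers?) and to look for biases or
# outliers before generalizing beyond this group of participants.
```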

25 Pilot studies A small trial run of the main study. The aim is to make sure your plan is viable. Pilot studies check: –that you can conduct the procedure –that interview scripts, questionnaires, experiments, etc. work appropriately It’s worth doing several to iron out problems before doing the main study. Ask colleagues if you don’t have access to real users –danger: bias

26 Key points An evaluation paradigm is an approach that is influenced by particular theories and philosophies. Five categories of techniques were identified: observing users, asking users, asking experts, user testing, modeling users. The DECIDE framework has six parts: - Determine the overall goals - Explore the questions that satisfy the goals - Choose the paradigm and techniques - Identify the practical issues - Decide on the ethical issues - Evaluate ways to analyze & present data Do a pilot study

27 Activity: DECIDE Framework apply the DECIDE framework to one of the projects in this class –design and development tools tutorial –Learning Commons storyboard

28 A project for you … Find an evaluation study from the list of URLs on this site or one of your own choice. Use the DECIDE framework to analyze it. Which paradigms are involved? Does the study report address each aspect of DECIDE? Is triangulation used? If so which techniques? On a scale of 1-5, where 1 = poor and 5 = excellent, how would you rate this study?

