
Design and Evaluation Methods Chap 3

► Technology-oriented vs. User- or Customer-oriented ► Understanding customer needs and desires

Design and Evaluation Methods ► Overview of Design and Evaluation ► Front-end Analysis ► Iterative Design and Testing ► Final Test and Evaluation ► Conclusion

Overview of Design and Evaluation ► Cost/Benefit Analysis of Human Factors Contributions ► Human Factors in the Product Design Lifecycle ► User-centered Design ► Sources for Design Work

Cost/Benefit Analysis of Human Factors Contributions (1/2) ► Costs: Personnel, Materials (Tab 3.1, Tab 3.2) ► Benefits: Mayhew (1992)  Increased sales  Decreased cost of providing training  Decreased customer support costs  Decreased development costs  Decreased maintenance costs  Increased user productivity  Decreased user errors  Improved quality of service  Decreased training time  Decreased user turnover

Cost/Benefit Analysis of Human Factors Contributions (2/2) ► Benefits: health or safety related ► Total Benefits  Estimate the relevant variables without the human factors intervention (A)  Estimate the same variables with HF (B)  Total benefit = (B) - (A)
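As a rough numeric illustration of the (B) - (A) estimate above, here is a minimal Python sketch; every variable name and dollar figure in it is a hypothetical example, not data from the text.

```python
# Minimal sketch of the total-benefit estimate: for each relevant variable,
# estimate its value without the human factors (HF) effort (A) and with it (B);
# total benefit = sum of (B - A). All figures below are hypothetical.
without_hf = {                  # (A): estimated annual values without HF work
    "sales": 1_000_000,
    "user_productivity": 400_000,
    "training_savings": 0,
}
with_hf = {                     # (B): the same variables estimated with HF work
    "sales": 1_060_000,
    "user_productivity": 430_000,
    "training_savings": 25_000,
}

total_benefit = sum(with_hf[k] - without_hf[k] for k in without_hf)
hf_cost = 50_000                # personnel and materials (cf. Tab 3.1, Tab 3.2)

print(f"Total benefit:  ${total_benefit:,}")             # $115,000
print(f"Benefit - cost: ${total_benefit - hf_cost:,}")   # $65,000
```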

Human Factors in the Product Design Lifecycle ► Must be involved as early as possible  Multidisciplinary design team members ► Product life cycle (six major stages)  Front-end analysis  Iterative design and test  System production  Implementation and evaluation  System operation and maintenance  System disposal

User-Centered Design (1/2) ► User-centered design: center the design process around the user ► How  Adequately determining user needs  Involving the user at all stages of the design process ► Subfield: Usability engineering  software design

User-Centered Design (2/2) ► Four general approaches  Early focus on the user and tasks  Empirical measurement: focus on quantitative performance data  Iterative design using prototypes (rapid changes)  Participatory design: users as part of the design team

Sources for Design Work ► Data Compendiums (summaries, digests)  Condensed and categorized databases ► Human Factors Design Standards  Precise recommendations that relate to specific areas or topics ► Human Factors Principles and Guidelines  Cover a wide range of topics  Guides rather than hard-and-fast rules  Require careful consideration and application

Front-End Analysis ► User Analysis ► Environment Analysis ► Function and Task Analysis ► How to Perform a Task Analysis ► Collect Task Data ► Summarize Task Data ► Analyze Task Data ► Identify User Preferences and Requirements

User Analysis (1/2) ► User population  Most important: regular users or operators  Potential users (wider range of users) ► Complete description  Age, gender, education level  Reading ability, physical size, physical ability  Familiarity with the product, task-relevant skills  etc.

User Analysis (2/2) ► Personas (vs. list of characteristics)  Persona: hypothetical person  Not real people, but represent key characteristics of the user population  Specific (even have a name)  For most applications: 3 or 4 personas

Environment Analysis ► These depend on the specific environment:  User characteristics  Activities, basic tasks

Function and Task Analysis (1/3) ► Goal, Function, Task  Goal: end condition or reason for performing the tasks  Function: general transformations (of information and system state) to achieve the goal  Task: specific activities to carry out a function

Function and Task Analysis (2/3) ► Function analysis  Analysis of the basic functions performed by the “system”  System: human-machine, human-software, human-equipment-environment, etc. ► Task analysis  Systematically describing human interaction with a system to understand how to match the demands of the system to human capabilities

Function and Task Analysis (3/3) ► Task analysis (cont.)  Preliminary task analysis: activity analysis  At the beginning of the design process: preliminary task analysis  As the design progresses: more extensive task analysis

How to Perform a Task Analysis (1/2) ► Four steps of a task analysis  Define the analysis purpose and identify the type of data required  Collect task data  Summarize task data  Analyze task data

How to Perform a Task Analysis (2/2) ► Define Purpose and Required Data  Define purpose: focus the analysis on the end use of the data  Information gathered depends on: purpose, type of the task (physical task, cognitive task)  Type of information: 1. hierarchical relationships 2. information flow 3. task sequence 4. location and environmental conditions

Collect Task Data (1/7) ► Observation ► Think-Aloud Verbal Protocol ► Task Performance with Questioning ► Unstructured and Structured Interviews ► Surveys and Questionnaires

Collect Task Data (2/7) ► Observation  Perform under typical scenarios  Identify different methods for accomplishing a goal  Not sufficient for primarily cognitive tasks

Collect Task Data (3/7) ► Think-Aloud Verbal Protocol  Underlying goals, strategies, decisions, other cognitive components  Verbal protocol: verbalizations regarding task performance  Verbal protocol analysis  Type of verbal protocol − Concurrent (difficult, procedural information) − Retrospective (useful, explanations) − Prospective: imagine performing the task

Collect Task Data (4/7) ► Task Performance with Questioning  Advantage: may cue users to verbalize goals, …  Disadvantage: disruptive  Retrospective analysis of video-tapes: effective − combines think-aloud verbalization − information the user fails to provide can be requested − can pause and ask questions

Collect Task Data (5/7) ► Unstructured and Structured Interviews  Begin with short unstructured interviews − How they go about the activities − Preferences, strategies − Where they fail to achieve their goals, make errors, …  Question probes: when, how, why it is (and is not) done  Focus group − 6-10 users led by a facilitator − Facilitator: familiar with the task & system, neutral

Collect Task Data (6/7) ► Surveys and Questionnaires  After obtaining preliminary descriptions of activities or basic tasks  Affirm the accuracy of the information  Determine how frequently the task is performed  Identify preferences or biases

Collect Task Data (7/7) ► Beyond the Limitations  Should focus on the basic user goals and needs, not on how they are carried out using the existing products  Evaluate: the underlying characteristics of the environment and the control requirements of the system

Summarize Task Data ► Lists, Outlines, and Matrices (Tab 3.3) ► Hierarchies  Hierarchical task analysis (HTA) (Fig 3.1)  GOMS: goals, operators, methods, and selection rules ► Flow Charts, Timelines, and Maps  Operational sequence diagram (OSD) (Fig 3.2)
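To make the HTA idea concrete, the sketch below shows one way a task hierarchy could be represented and printed in Python; the "withdraw cash" tasks and plan are an invented example, not the contents of Fig 3.1.

```python
# A minimal sketch of a hierarchical task analysis (HTA) as a nested data structure.
# Task names and the plan are hypothetical, not taken from Fig 3.1.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Task:
    name: str
    plan: str = ""                       # when/in what order the subtasks are done
    subtasks: List["Task"] = field(default_factory=list)

hta = Task(
    name="0. Withdraw cash from an ATM",
    plan="Do 1-2-3-4 in order",
    subtasks=[
        Task("1. Insert card and authenticate"),
        Task("2. Select withdrawal amount"),
        Task("3. Take the cash"),
        Task("4. Retrieve card and receipt"),
    ],
)

def print_hta(task: Task, depth: int = 0) -> None:
    """Print the hierarchy as an indented outline."""
    suffix = f"  [plan: {task.plan}]" if task.plan else ""
    print("  " * depth + task.name + suffix)
    for sub in task.subtasks:
        print_hta(sub, depth + 1)

print_hta(hta)
```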

Analyze Task Data (1/2) ► Network Analysis (Fig 3.3)  Matrix representation of information flows between functions  Identify clusters of related functions ► Workload Analysis ► Simulation and Modeling ► Safety Analysis
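The sketch below illustrates the network-analysis step: a matrix of information flows between functions is symmetrized and pairs with heavy traffic are flagged as candidate clusters. The function names, flow counts, and threshold are made-up examples, not the contents of Fig 3.3.

```python
# Illustrative network analysis: matrix of information flows between functions,
# then flag strongly coupled pairs as candidates for clustering together.
import numpy as np

functions = ["enter order", "check stock", "bill customer", "ship goods"]
# flows[i][j] = number of information items passed from function i to function j
flows = np.array([
    [0, 5, 2, 0],
    [4, 0, 0, 3],
    [1, 0, 0, 1],
    [0, 2, 1, 0],
])

traffic = flows + flows.T        # total exchanges between each pair of functions
threshold = 4                    # arbitrary cutoff for this example
for i in range(len(functions)):
    for j in range(i + 1, len(functions)):
        if traffic[i, j] >= threshold:
            print(f"cluster candidate: {functions[i]} <-> {functions[j]} "
                  f"({traffic[i, j]} exchanges)")
```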

Analyze Task Data (2/2) ► Scenario Specification  Scenario: a situation and a specific set of tasks that represent an important use of the system or product  Creating a scenario: only the tasks that directly serve users’ goals are retained  Daily use scenarios: common sets of tasks that occur daily  Necessary use scenarios: infrequent but critical sets of tasks that must be performed

Identify User Preferences and Requirements ► Can be quite extensive

Iterative Design and Testing ► Providing Input for System Specifications ► Organization Design ► Prototypes ► Heuristic Evaluation ► Usability Testing

Providing Input for System Specifications (1/7) ► System Specifications  The overall objectives the system supports − What must be done to achieve the user’s goals, not how to do it − Reflect the user’s goals, not the technology  Performance requirements and features − Determine the means: help the users achieve their goals − What the system must be able to do & under what conditions  Design constraints

Providing Input for System Specifications (2/7) ► System Specifications  The overall objectives the system supports  Performance requirements and features  Design constraints − Constraints limit the possible design alternatives (the various candidate solutions) ► Human factors  Take a systems design approach: analyzing the entire human-machine system

Providing Input for System Specifications (3/7) ► Quality Function Deployment: relative importance of potential system features  “House of quality” decision matrix (Fig 3.4)  Weighting: importance of the objectives − 9: very important; 3: somewhat important; 1: marginally important  Rating: how well each feature serves each objective
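The following sketch shows the arithmetic behind the house-of-quality matrix: each feature's relative importance is the sum over objectives of (objective weight × rating). The objectives, features, and ratings are hypothetical, not the contents of Fig 3.4; only the 9/3/1 weighting scheme is taken from the slide.

```python
# Illustrative QFD scoring using the 9/3/1 importance weights from the slide.
objectives = {"easy to learn": 9, "fast data entry": 3, "low purchase cost": 1}

# ratings[feature][objective] = how well the feature serves that objective (0-5)
ratings = {
    "on-screen tutorial": {"easy to learn": 5, "fast data entry": 1, "low purchase cost": 2},
    "keyboard shortcuts": {"easy to learn": 1, "fast data entry": 5, "low purchase cost": 4},
}

for feature, r in ratings.items():
    importance = sum(objectives[obj] * r[obj] for obj in objectives)
    print(f"{feature}: relative importance = {importance}")
# on-screen tutorial: 9*5 + 3*1 + 1*2 = 50
# keyboard shortcuts: 9*1 + 3*5 + 1*4 = 28
```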

Providing Input for System Specifications (4/7) ► Cost/Benefit Analysis (Fig 3.4)  Rows: features  Columns: design alternatives  Weight: the result of the QFD  Rating: how well each design alternative addresses the feature  Benefit: weighted sum  Cost/Benefit ratio  Lowest Cost/Benefit ratio: most valuable alternative
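Continuing the hypothetical example, the sketch below carries the QFD weights into the cost/benefit matrix: benefit is the weighted sum of ratings for each design alternative, and the alternative with the lowest cost/benefit ratio is preferred. All numbers are illustrative.

```python
# Illustrative cost/benefit comparison of two design alternatives.
feature_weights = {"on-screen tutorial": 50, "keyboard shortcuts": 28}  # from the QFD step

# For each alternative: how well it addresses each feature (0-5) and its cost.
alternatives = {
    "design A": {"ratings": {"on-screen tutorial": 4, "keyboard shortcuts": 2}, "cost": 120},
    "design B": {"ratings": {"on-screen tutorial": 2, "keyboard shortcuts": 5}, "cost": 90},
}

for name, alt in alternatives.items():
    benefit = sum(feature_weights[f] * alt["ratings"][f] for f in feature_weights)
    ratio = alt["cost"] / benefit      # lower cost/benefit ratio = more valuable
    print(f"{name}: benefit = {benefit}, cost/benefit = {ratio:.3f}")
# design A: benefit = 50*4 + 28*2 = 256, ratio ~ 0.469
# design B: benefit = 50*2 + 28*5 = 240, ratio = 0.375  -> preferred
```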

Providing Input for System Specifications (5/7) ► Tradeoff Analysis  Small-scale study: which design alternative results in the best performance (trade studies)  Modeling or performance estimates (by the designer)  Multiple factors: advantages and disadvantages  Decision matrix − Rows: features (factors) − Columns: different means of implementation − Fails to consider global issues: how the features interact as a group (a product is more than the sum of its features) − Complemented with, e.g., scenario specification

Providing Input for System Specifications (6/7) ► Human Factors Criteria Identification  Usability requirements  Specify characteristics: relevant to human performance and safety ► Functional Allocation  System (automatic), Person (manual), or combination  Assign a function to the more “capable” system component (Fig 3.5)  Left with a coherent set of tasks that can be understood

Providing Input for System Specifications (7/7) ► Support Materials Development  Should begin with the front-end analysis  Manuals, assembly instructions, owner’s manuals, training programs, etc.  Materials: compatible with the characteristics and limitations of the human users − Critical information: maximize the likelihood that users read it, understand it, and comply with it

Prototypes (1/2) ► Mock-up vs. prototype  Mock-up: very crude approximation of the final product (e.g. foam, cardboard)  Prototype: more of the look and feel of the final product, but not yet fully functioning

Prototypes (2/2) ► Advantages of using prototypes  Confirming insights gathered during the front-end analysis  Support of the design team in making ideas concrete  Support of the design team by providing a communication medium  Support for heuristic evaluation  Support for usability testing (something to react to and use)

Heuristic Evaluation ► Analytically considering:  Characteristics of the product/system: do they meet human factors criteria? ► Which aspects are evaluated against:  Human factors criteria (requirement specifications)  Other human factors standards and guidelines ► Evaluated by whom: usability experts  Multiple evaluators: at least 3, preferably 5  Each evaluator inspects in isolation, then they communicate and aggregate their findings
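A tiny sketch of the "inspect in isolation, then aggregate" step: each evaluator's independently found issues are merged and counted so the team can see which problems were reported most often. The issue labels are invented examples.

```python
# Aggregate heuristic-evaluation findings from several independent evaluators.
from collections import Counter

findings = {
    "evaluator_1": {"no undo on delete", "inconsistent button labels"},
    "evaluator_2": {"no undo on delete", "error messages lack guidance"},
    "evaluator_3": {"inconsistent button labels", "no undo on delete"},
}

counts = Counter(issue for issues in findings.values() for issue in issues)
for issue, n in counts.most_common():
    print(f"{issue}: reported by {n} of {len(findings)} evaluators")
```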

Usability Testing ► Usability:  The degree to which the system is easy to use  User friendly ► Variables relevant to usability:  Learnability: easy to learn, rapidly start  Efficiency: high level of productivity (once learned)  Memorability: easy to remember  Errors: low error rate, easily recover from errors  Satisfaction: pleasant to use, subjectively satisfied in using it, like it
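As a simple illustration of turning usability-test sessions into numbers for these variables, the sketch below computes a completion rate, mean task time (efficiency), error rate, and mean satisfaction rating; the session data and scales are hypothetical examples.

```python
# Summarize hypothetical usability-test sessions along several of the
# usability variables listed above.
sessions = [
    # (task completed?, time in seconds, error count, satisfaction rating 1-7)
    (True, 95, 1, 6),
    (True, 120, 0, 5),
    (False, 240, 4, 3),
    (True, 80, 1, 6),
]

completion_rate = sum(done for done, *_ in sessions) / len(sessions)
mean_time = sum(t for _, t, _, _ in sessions) / len(sessions)          # efficiency
mean_errors = sum(e for _, _, e, _ in sessions) / len(sessions)        # error rate
mean_satisfaction = sum(s for *_, s in sessions) / len(sessions)       # satisfaction

print(f"completion rate:   {completion_rate:.0%}")      # 75%
print(f"mean task time:    {mean_time:.0f} s")          # 134 s
print(f"mean errors/task:  {mean_errors:.1f}")          # 1.5
print(f"mean satisfaction: {mean_satisfaction:.1f}/7")  # 5.0
```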

Final Test and Evaluation ► Involving users ► Data collected:  Acceptability  Usability  Performance of the user or human-machine system

Conclusion ► Techniques for creating user-centered systems  To understand user needs  To design systems to meet those needs ► Critical step  Provide human factors criteria for design