
Slide 1: Design and evaluation methods: Objectives
- Design life cycle: HF input and neglect
- Levels of system design: Going beyond the interface
- Sources of design information: Where to go for help
- Understanding user needs: task analysis
- Design for user needs, not technology: HF tools for user-centered design
- Key tools for design and evaluation
  - Decision matrix techniques
  - Task analysis

Slide 2: Product design life cycle: HF input and neglect
1. Front-end analysis
2. Iterative design and testing
3. System production
4. Implementation and evaluation
5. System operation and maintenance
6. System disposal

Slide 3: Product design life cycle: HF input and neglect
1. Front-end analysis
2. Iterative design and testing
3. System production
4. Implementation and evaluation
5. System operation and maintenance
6. System disposal
Annotations on the life cycle:
- Most critical stage: neglect here leads to ineffective HF
- Integrate with other constraints
- Heuristic evaluation
- Human factors principles and guidelines
- Theories are imperfect; test and iterate locally and globally
- HF applies to all stages of the life cycle

Slide 4: User-centered design
- Not a user-designed system, but a system designed to support users' needs
- Early focus on tasks
- Empirical measurement and testing
- Iterative design based on test results
- Participatory design: involve the user
- Preference DOES NOT EQUAL performance

Slide 5: Levels of systems design
- Interface
- Job/task
- Organizational
- Interface adjustments vs. programs of change
- Avoid "painting the corpse"

Slide 6: Sources of human factors design information
- Data compendiums
- Human factors standards
- Human factors principles and guidelines
- Human factors research papers (Web of Science)

Slide 7: Interface design principles for conceptual design and heuristic evaluation
- Provide a good conceptual model
- Make things visible; don't hide functionality
- Use natural mappings
- Provide feedback
- Heuristic evaluation requires more than one evaluator

Slide 8: Front-end analysis activities
- Address the questions: Who? What functions and tasks? What conditions? What preferences? Design constraints? Human factors criteria?

Slide 9: Task analysis
- Task analysis is a critical component of user-centered system design (know the users and what they need to do)
- Task analysis = the study of the cognitive or physical activity required of a person or team to achieve a goal
- The particular type and contents of a task analysis depend on project goals

Slide 10: Overall considerations and process of a task analysis
1. Define the purpose of the analysis and the type of data to be collected
2. Collect task data
3. Summarize task data
4. Analyze task data

Slide 11: Purpose of task analysis
- Avoid designing for current tasks, because they may simply reflect a poor system
- Analysis of system requirements and constraints
- Go beyond current tasks

Slide 12: Day 2

Slide 13: Purpose of task analysis
- Predict system performance
- Specify interface design requirements
- Define procedures and manuals
- Identify training requirements
- Guide the overall conceptual design of the system
- The types of data collected in a task analysis depend on the resources and purpose of the analysis

Slide 14: Types of task data: cognitive and behavioral
Attend to the cognitive nature of the task if there is:
- Complex decision making or inference
- A requirement for conceptual knowledge
- Complex rules/procedures

Slide 15: Types of task data
- Hierarchical relationships (see the data-structure sketch after this list)
  - Function, task, and sub-task relationships
- Information flows
  - Individuals and their roles and responsibilities
  - Skills and knowledge of the individuals
  - Flow of information
  - Information needed and information generated by each individual
- Task sequence
  - Goal or intent of the task
  - Sequential relationships (what tasks must precede or follow)
  - Trigger or event that starts a task sequence
  - Duration of the task
- Location and environmental conditions
  - Paths that people take to get from one place to another
  - Physical structures, such as walls, partitions, and desks
  - Tools and their locations
  - Conditions under which the tasks are performed
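To make the hierarchical and task-sequence data above concrete, here is a minimal sketch in Python of one way to record function/task/sub-task relationships together with goal, trigger, and duration attributes, and to print them as an indented outline. The structure and the example tasks are illustrative assumptions, not a prescribed schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Task:
    """One node in a hierarchical task decomposition (function -> task -> sub-task)."""
    name: str
    goal: str = ""                # goal or intent of the task
    trigger: str = ""             # event that starts the task sequence
    duration_min: float = 0.0     # estimated duration in minutes
    subtasks: List["Task"] = field(default_factory=list)

def outline(task: Task, depth: int = 0) -> None:
    """Print the decomposition as an indented outline (one way to summarize task data)."""
    print("  " * depth + f"- {task.name} ({task.duration_min} min)")
    for sub in task.subtasks:
        outline(sub, depth + 1)

# Illustrative fragment of a decomposition (all names and numbers are made up)
prepare = Task(
    name="Prepare infusion",
    goal="Deliver the correct medication dose",
    trigger="Physician order received",
    duration_min=12,
    subtasks=[
        Task("Verify order", duration_min=2),
        Task("Program pump", duration_min=4,
             subtasks=[Task("Enter dose", duration_min=1),
                       Task("Confirm rate", duration_min=1)]),
        Task("Connect and start", duration_min=3),
    ],
)
outline(prepare)
```

A structure like this also captures the task-sequence attributes (goal, trigger, duration) alongside the hierarchy, so the same record can feed both the summary outline and later timeline or workload analysis.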

Slide 16: Task data collection methods
- Observe users with the existing system
- Critical incident analysis (e.g., WWII flying accidents)
- Surveys and questionnaires
- Focus groups/structured brainstorming
- Interviews
- Think-aloud protocols and probe questions during observation

Slide 17: Task data collection: Critical incident analysis
- Focus on events/critical incidents that demanded the expertise of the person
- Identify the sequence of events that led to the difficult situation
- What was done? How was it done? Why was it done?
- Specific environmental cues that guided the decision
- Specific knowledge that guided the decision
- Identify specific events and knowledge that define expertise

Slide 18: Summarize task data
- Lists, matrices, and outlines
- Hierarchical decomposition
- Flow charts, transition diagrams, timelines, and maps

Slide 19: Analyze task data
- Network analysis
- Workload analysis (a timeline sketch follows this list)
- Simulation and modeling
- Safety analysis
- Scenario specification
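As one concrete reading of "workload analysis", the sketch below implements a simple timeline-based estimate: the ratio of time demanded by tasks to time available in each analysis window, flagging windows where demand exceeds capacity. This is a common simplification offered as an assumption here, not the specific method the slides prescribe; the task times are made up.

```python
# Timeline workload analysis: workload = time required / time available per window.
# Task start times and durations (minutes) are illustrative assumptions.
tasks = [(0, 4), (2, 3), (5, 6), (10, 2), (11, 5)]  # (start, duration)
window = 5    # analysis window length in minutes
horizon = 20  # total period analyzed

for w_start in range(0, horizon, window):
    w_end = w_start + window
    demand = 0.0
    for start, dur in tasks:
        end = start + dur
        # portion of this task that falls inside the current window
        demand += max(0, min(end, w_end) - max(start, w_start))
    load = demand / window
    flag = "  <-- overload" if load > 1.0 else ""
    print(f"{w_start:2d}-{w_end:2d} min: workload = {load:.2f}{flag}")
```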

Slide 20: Key concepts in conducting a task analysis
- Define the purpose and the resources available for the analysis, and select an appropriate approach
- Go beyond the current tasks
- Be flexible and consider alternate data collection, representation, and analysis methods

Slide 21: Iterative design and evaluation
- Heuristic evaluation (each reviewer identifies roughly 35% of the problems; see the sketch after this list)
- Workload and modeling
- Safety analyses
- Cost/benefit and trade-off analyses
- Usability testing
- Operational test and evaluation
- Very different from research experiments
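The figure of roughly 35% per reviewer (Nielsen's classic estimate) is why heuristic evaluation relies on several evaluators. Assuming, as a simplification, that each evaluator independently finds any given problem with probability p = 0.35, the expected proportion found by n evaluators is 1 - (1 - p)^n; the short sketch below works out the first few values.

```python
# Expected share of usability problems found by n evaluators, assuming each
# independently detects a given problem with probability p. p = 0.35 is the
# commonly cited estimate; independence is a simplifying assumption.
p = 0.35
for n in range(1, 6):
    coverage = 1 - (1 - p) ** n
    print(f"{n} evaluator(s): ~{coverage:.0%} of problems found")
# ~35%, ~58%, ~73%, ~82%, ~88% -- diminishing returns as evaluators are added
```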

Slide 22: Decision matrices
- Quality Function Deployment (House of Quality) analysis (a worked example of the weighted-matrix logic follows this list)
  - Rows = user objectives; weightings = importance
  - Columns = product features
  - Identifies the most useful or beneficial features
- Cost/benefit analysis
  - Rows = features; weightings = benefit of feature
  - Columns = design alternatives
  - Identifies the relative benefit of a design
- Trade-off analysis
  - Rows = product feature
  - Columns = alternate implementations
  - Identifies the best implementation of a particular feature
- Use scenarios from the task analysis to provide the holistic view that is missing from matrix analysis techniques
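To make the shared matrix logic concrete (row items carry importance weights, column alternatives receive scores, and the weighted column totals identify the preferred option), here is a minimal sketch in Python. The objectives, weights, and scores are invented for illustration and do not come from the lecture.

```python
# Weighted decision matrix: rows = user objectives with importance weights,
# columns = design alternatives scored 1-5 against each objective.
# All names and numbers are illustrative assumptions.
objectives = {                 # objective -> importance weight
    "Easy to learn": 5,
    "Fast task completion": 3,
    "Low error rate": 4,
}
alternatives = {               # alternative -> scores, in the same order as objectives
    "Design A": [4, 2, 5],
    "Design B": [3, 4, 3],
    "Design C": [5, 3, 2],
}
weights = list(objectives.values())
for name, scores in alternatives.items():
    total = sum(w * s for w, s in zip(weights, scores))
    print(f"{name}: weighted score = {total}")
# The highest weighted score suggests the best alternative on paper; scenarios
# from the task analysis should still be used to sanity-check the result holistically.
```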

Slide 23: Key concepts
- Early influence in the design life cycle, with multiple iterations, has the greatest impact
- Addressing only the interface will not make the best product
- Task analysis identifies user needs
- There are many sources of design information: go beyond intuition
- Iterative design and testing is critical for success
- Decision matrix analyses are a critical tool to accommodate diverse design constraints and focus on user needs

