
1 SIE 515 Design Evaluation Lecture 7

2 Today's Schedule
- Goals of design evaluation
- Evaluation by expert analysis (six approaches)
- Evaluation through user-based participation (five approaches)
- Laboratory vs. field studies
- Discussion of design project

3 Goals of Design Evaluation
Two components of evaluation:
- Assessing the system design and checking user requirements
- Done throughout the design cycle (feedback loop)
Three goals of design evaluation:
- Assess system functionality
- Assess the user experience
- Identify problems

4 Two Forms of Evaluation
Evaluation through expert analysis:
- Goal = guide initial specifications and development
- No user input
User-based evaluation (analysis):
- Input from users based on real-time use of the system
- Occurs later in the design cycle

5 Expert Evaluation
- Conducted when it is not possible to perform user-based evaluations.
- A usability expert assesses the design for core user groups based on specific metrics.
- Goal = guide initial specifications and identify design problems.

6 Approaches to Expert Analysis
- Cognitive walkthroughs: applying cognitive principles to user interaction.
- Heuristic evaluation: usability based on rules of thumb.
- Use of models: merging formal cognitive models with design principles.
- Use of prior research: existing results inform current design.
- Guidelines review: conformance with internal guidelines documents.
- Consistency inspection: across multiple application interfaces.

7 Expert Analysis
Cognitive walkthroughs: applying cognitive principles to user interaction
- Evaluates each step of the interaction needed to perform a task
- Supports learning through user-directed exploration
Heuristic evaluation: usability based on rules of thumb
- Critiques the design using accepted principles, guidelines and standards
- Five evaluators typically identify about 75% of all usability problems (see the sketch below)
- Problems identified are rated on a severity scale
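The 75% figure reflects Nielsen and Landauer's problem-discovery curve, in which the proportion of problems found by n independent evaluators is 1 - (1 - λ)^n. A minimal Python sketch follows; the per-evaluator discovery rate λ is not given in the lecture, so the value 0.24 is only an illustrative assumption chosen to roughly reproduce the 75%-at-five figure.

# Sketch of the problem-discovery curve behind the "five evaluators, ~75%"
# rule of thumb (Nielsen & Landauer). lam is the assumed probability that a
# single evaluator finds any given problem; 0.24 is illustrative only and
# varies in practice with the system and the evaluators' skill.

def proportion_found(n_evaluators: int, lam: float = 0.24) -> float:
    """Expected fraction of usability problems found by n independent evaluators."""
    return 1 - (1 - lam) ** n_evaluators

for n in range(1, 8):
    print(f"{n} evaluator(s): {proportion_found(n):.0%} of problems found")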

8 Nielsen’s Ten Heuristics
1. Visibility of system status
2. Match between system and the real world
3. User control and freedom
4. Consistency and standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility and efficiency of use
8. Aesthetic and minimalist design
9. Help users recognize, diagnose and recover from errors
10. Help and documentation

9 Other Expert Evaluation
Model-based evaluation: merging formal cognitive models with design principles
- Examples = GOMS models, dialog models and network models (see the KLM sketch below)
Use of prior research: existing results inform current design
- Often sufficient to make usability design decisions
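As an illustration of model-based prediction, here is a minimal Keystroke-Level Model (KLM) sketch, the simplest member of the GOMS family. The operator times are the commonly cited Card, Moran and Newell averages; the example task sequence is hypothetical and is not taken from the lecture.

# Minimal Keystroke-Level Model (KLM) sketch, one member of the GOMS family.
# Operator times are the commonly cited Card, Moran & Newell averages.

KLM_TIMES = {
    "K": 0.20,   # press a key or button (expert typist)
    "P": 1.10,   # point with a mouse to a target on screen
    "H": 0.40,   # home hands between keyboard and mouse
    "M": 1.35,   # mental preparation before an action
    "R": 0.0,    # system response time (task-specific; zero here)
}

def klm_estimate(operators: str) -> float:
    """Sum operator times to predict expert, error-free task time in seconds."""
    return sum(KLM_TIMES[op] for op in operators)

# Hypothetical task: home to the mouse, point at a field, click,
# think, then type a four-character code.
print(f"Predicted time: {klm_estimate('HPKM' + 'K' * 4):.2f} seconds")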

10 Evaluation Through User Participation
- Observational methods
- Think-aloud procedures
- Query techniques: interviews and questionnaires
- Physiological monitoring
- Experimental methods: using empirical results from human testing

11 Laboratory User Evaluation
Good experimental control, less real-world validity

12 Field Study User Evaluation
- Realistic context, but less experimental control
- Good for longitudinal investigations

13 Usability Lab
Photo: usability lab test with participant and observer seated at a workstation, separated by a one-way mirror.

14 Observational Methods
- Based on observations of the user interacting with the system on a pre-defined task
- Most common user-based evaluation technique
- Must have consistency between observations
- Observational methods do not provide information about the user's decision-making processes
- Combine with a description of user actions (think-aloud technique)

15 Observational Techniques
- External observation: being present and viewing a group but not participating.
- Non-participant observation: secretly observing a group.
- Participant observation: joining the routines of a group and observing action.
- Covert observation: totally immersing yourself in a group culture and not identifying yourself.

16 Think Aloud Techniques
- User talks through every action
- Data provide insight into the user's information-processing capacity and mental model
- Cooperative evaluation: the user acts as an evaluator instead of a participant; users and experimenters can ask questions
- Protocol analysis = scoring of the verbal record (see the toy example below)
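A toy sketch of what scoring a verbal record can look like: each think-aloud utterance has already been assigned a code by an analyst, and the script simply tallies the codes. The coding categories and transcript segments are hypothetical, not from the lecture.

from collections import Counter

# Hypothetical coded think-aloud protocol: (utterance, analyst-assigned code).
coded_protocol = [
    ("I'll click the gear icon to find settings",      "goal_statement"),
    ("Hmm, where did the save button go?",             "confusion"),
    ("Oh, it's hidden under this menu",                "discovery"),
    ("That error message doesn't tell me what to fix", "confusion"),
]

# Scoring here is reduced to counting how often each code occurs.
counts = Counter(code for _, code in coded_protocol)
for code, n in counts.most_common():
    print(f"{code}: {n}")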

17 Query Techniques
Query evaluation = direct input from users
- Problem: data are subjective and hard to generalize
- Most common query techniques = interviews and questionnaires
- Interviews should be focused and based on specific questions
- Questionnaires provide more control but less flexibility

18 Physiological Evaluation
- Most objective and direct approach for obtaining user data.
- Best when combined with behavioral measures.
- Problems = expensive equipment and complex technology; data are hard to analyze.

19 Eye-Tracking
(Images: equipment and sample output)

20 Galvanic Skin Conduction
(Images: equipment and sample output)

21 Electrophysiological Sensors
(Images: equipment and sample output)

22 fMRI
(Images: equipment and sample output)

23 For Next Class
Assignment 7:
- Task 1: Read Shneiderman Chapter 4, Section 4.7 onward.
- Task 2: Answer the following questions: What are the most important contributions from the reading? How do these factors relate to good design? Give an example.
- Task 3: Identify questions or issues that you would like to talk more about during class or on the participation blog.

