Usability Testing of Interaction Components: Taking the Message Exchange as a Measure of Usability
Willem-Paul Brinkman, Brunel University, London
Reinder Haakma, Philips Research Laboratories Eindhoven
Don Bouwhuis, Eindhoven University of Technology

Topics
 Introduction
 Evaluation method
 Experimental validation
 Comparison with other evaluation methods

Introduction
 Component-based software engineering
 Empirical usability testing
 A single device with multiple components

Layered communication

Layered Protocol Theory
[Diagram: a user interacts with a calculator built from two layered components. The Editor controls the equation (e.g. "15+23="); the Add Processor evaluates it and controls the results.]
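As a minimal sketch of this layered view (hypothetical code, not from the talk): each component receives messages from the layer below, sends feedback back down, and passes a completed message upward.

```python
# Hypothetical sketch of the layered calculator from Layered Protocol Theory.

class AddProcessor:
    """Higher layer: evaluates a completed equation (control over results)."""
    def receive(self, equation: str) -> str:
        left, right = equation.rstrip("=").split("+")
        return str(int(left) + int(right))

class Editor:
    """Lower layer: collects keystrokes into an equation (control over the equation)."""
    def __init__(self, upper: AddProcessor):
        self.upper = upper
        self.buffer = ""
    def receive(self, key: str) -> str:
        self.buffer += key
        if key == "=":                      # equation complete: pass it upward
            equation, self.buffer = self.buffer, ""
            return self.upper.receive(equation)
        return self.buffer                  # echo: feedback to the user

editor = Editor(AddProcessor())
for key in "15+23=":
    display = editor.receive(key)
print(display)  # 38
```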

Evaluation method
Aim: to evaluate the usability of a component based on the message exchange between the user and that specific component

Evaluation method: test procedure
 Normal procedures of a usability test
 A user task that requires interaction with the components under investigation
 Users must complete the task successfully

Evaluation method
A component-specific, objective performance measure:
1. Messages received + weight factor (a common currency)
2. Comparison with an ideal user (a common point of reference)
The usability of individual components in a single device can thus be compared, and the components prioritized for potential improvement.

Assigning weight factors to represent the user's effort in the case of an ideal user
[Diagram: message exchange across the Right Mouse Button Menu and Properties components, with weight factors in braces: Click {1}, Click {1}, Call <> {2}, Set <Fill colour red, no border> {7}.]

Total effort value
Total effort = Σᵢ (MRᵢ · Wᵢ)
MRᵢ · Wᵢ: message received × its weight factor
[Diagram: the messages Click {1}, Click {1}, and Call <> {2} exchanged across the Right Mouse Button Menu and Properties components; effort values are summed per component, e.g. 5 + 2 = 7.]
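A minimal sketch of this calculation (the log format and the grouping of messages into components are assumptions; the message names and weights echo the slide's example):

```python
# Total effort = sum over all received messages of (message x weight factor).
# Hypothetical log: component name -> list of (message, weight factor) pairs.
log = {
    "Right Mouse Button Menu": [("Click", 1), ("Click", 1)],
    "Properties": [("Call <>", 2)],
}

def total_effort(component_log):
    return sum(w for messages in component_log.values() for _msg, w in messages)

print(total_effort(log))  # 4 with these illustrative weights
```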

Assigning weight factors in the case of a real user
Correction for the inefficiency of higher- and lower-level components
[Diagram: component stack with Visual Drawing Objects, Properties, and Right Mouse Button Menu.]

Assigning weight factors in the case of a real user
Inefficiency of lower-level components: more messages are needed to pass a message upward than ideally required
Correction: assign weight factors as if the lower components operate optimally
[Diagram: Visual Drawing Objects, Properties, and Right Mouse Button Menu components.]

Assigning weight factors in the case of a real user
Inefficiency of higher-level components: more messages are requested than ideally required
UE = (Σᵢ MRᵢ · Wᵢ / #MSU_real) × #MSU_ideal
UE: user effort
MRᵢ · Wᵢ: message received × its weight factor
#MSU_real: number of messages sent upward by the real user
#MSU_ideal: number of messages sent upward by the ideal user
[Diagram: Visual Drawing Objects, Properties, and Right Mouse Button Menu components.]
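A minimal sketch of the corrected measure, assuming the upward-message counts come from the interaction logs (variable names are mine, not from the talk):

```python
def user_effort(received, msu_real, msu_ideal):
    """UE = (sum of message x weight / #MSU_real) x #MSU_ideal.

    received  : (message, weight) pairs this component received from the real user
    msu_real  : number of messages actually sent upward
    msu_ideal : number of messages an ideal user would have sent upward
    """
    raw = sum(w for _msg, w in received)
    # Scale the raw effort so that inefficiency elsewhere in the stack
    # is not charged to this component.
    return raw / msu_real * msu_ideal

# Illustrative numbers only:
print(user_effort([("Click", 1), ("Click", 1), ("Call <>", 2)],
                  msu_real=2, msu_ideal=1))  # 2.0
```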

Ideal user versus real user
Extra user effort = user effort − total effort
Total effort: the total effort an ideal user would make
User effort: the total effort the real user made
Extra user effort: the extra effort the real user made
Calculate this for each component, then prioritize.
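Putting the pieces together, a hedged sketch of the prioritization step (all numbers are illustrative, not from the experiment):

```python
# Extra user effort = user effort (real user) - total effort (ideal user).
components = {
    "Function Selector": {"user_effort": 9.0, "ideal_effort": 5.0},
    "Send Text Message": {"user_effort": 6.5, "ideal_effort": 6.0},
}

extra = {name: v["user_effort"] - v["ideal_effort"] for name, v in components.items()}
for name, value in sorted(extra.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: extra user effort = {value:.1f}")
# Components at the top of the list are the first candidates for redesign.
```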

Experimental validation
 40 users
 4 mobile telephones
 2 components were manipulated according to Cognitive Complexity Theory:
  Function Selector: broad/shallow versus narrow/deep
  Short Text Messages: simple versus complex

Architecture
[Diagram: the mobile telephone contains the two components under study, Send Text Message and Function Selector.]

Experimental validation: Function Selector
[Screenshots: the broad/shallow versus the narrow/deep menu version.]

Experimental validation: Send Text Message
[Screenshots: the simple versus the complex version.]

Results
[Chart: extra user effort for the four mobile telephones.]

Results
Partial correlation between extra user effort regarding the two components and other usability measures:

Measure                              Function Selector   Send Text Message
Objective
  Extra keystrokes                    0.64**              0.44**
  Task duration                       0.63**              0.39**
Perceived
  Overall ease-of-use                -0.43**             -0.26*
  Overall satisfaction               -0.25*              -0.22
  Component-specific ease-of-use     -0.55**             -0.34**
  Component-specific satisfaction    -0.41**             -0.37**

*p < .05. **p < .01.
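For readers who want to reproduce this kind of analysis, a sketch of a partial correlation in plain NumPy; the slide does not say which variable was partialled out, so the control variable below is an assumption, and the data are simulated (40 users, matching the experiment's sample size):

```python
import numpy as np

def partial_corr(x, y, control):
    """Correlation of x and y after removing the linear effect of control."""
    def residuals(v, c):
        design = np.column_stack([np.ones(len(c)), c])
        beta, *_ = np.linalg.lstsq(design, v, rcond=None)
        return v - design @ beta
    x, y, control = (np.asarray(a, dtype=float) for a in (x, y, control))
    return np.corrcoef(residuals(x, control), residuals(y, control))[0, 1]

# Illustrative data: extra effort on the Function Selector (fs) and the
# Send Text Message component (stm), plus task duration.
rng = np.random.default_rng(0)
fs, stm = rng.normal(size=40), rng.normal(size=40)
duration = 0.6 * fs + 0.2 * stm + rng.normal(scale=0.5, size=40)
print(partial_corr(fs, duration, control=stm))  # correlation with stm held constant
```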

Comparison with other evaluation methods
Overall measures (examples: keystrokes, task duration, overall perceived usability)
 Relatively easy to obtain
 Unsuitable for evaluating individual components

Comparison with other evaluation methods
Sequential data analysis
 Based only on lower-level events
 Requires pre-processing: selection, abstraction, and re-coding
 The relation between a higher-level component and a compound message is less direct
 The components' status is not recorded

Comparison with other evaluation methods
GOMS
 Helps to understand the problem
 Looks only at error-free task execution
 Considers the system only at the lowest-level layer

Comparison with other evaluation methods
Thinking-aloud, cognitive walkthrough, and heuristic evaluation
 Quicker
 Subject to the evaluator effect (reliability)

Summary
 The usability of individual components can be tested
 Take the message exchange and assign weight factors to the messages (a common currency)
 Compare the ideal user with real users (a common reference point)

Questions?
Thanks for your attention