User Centred Design Overview

User Centred Design (UCD)

Human-centred design processes for interactive systems, ISO 13407 (1999), states: "Human-centred design is an approach to interactive system development that focuses specifically on making systems usable. It is a multi-disciplinary activity."

In UCD, all "development proceeds with the user as the center of focus." (Jeffrey Rubin, Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests)

Rubin depicts the User-Centered Design process as follows:
– The users are in the center of a double circle.
– The inner ring contains: Context, Objectives, Environment and Goals.
– The outer ring contains: Task Detail, Task Content, Task Organisation and Task Flow.

User-Centered Design (UCD) is a user interface design process that focuses on usability goals, user characteristics, environment, tasks, and workflow in the design of an interface. UCD follows a series of well-defined methods and techniques for the analysis, design, and evaluation of mainstream hardware, software, and web interfaces. The UCD process is iterative, with design and evaluation steps built in from the first stage of a project through to implementation.

[Diagram: Rubin's User-Centred Design model – the user at the centre; inner ring: Context, Objectives, Environment, Goals; outer ring: Task Detail, Task Content, Task Organisation, Task Flow.]

User Centred Design Principles

Early focus on users and tasks
– Structured and systematic information gathering (consistent across the board)
– Designers trained by experts before conducting data collection sessions

Empirical measurement and testing of product usage
– Focus on ease of learning and ease of use
– Testing of prototypes with actual users

Iterative design
– Product designed, modified and tested repeatedly
– Allow for the complete overhaul and rethinking of design by early testing of conceptual models and design ideas

Source: Jeffrey Rubin, Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests
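The iterative design principle can be pictured as a simple loop: design, test with representative users, refine, and repeat until the usability goals are met. The TypeScript sketch below is purely illustrative; the names (Prototype, runUsabilityTest, refine) are invented for this example and do not come from Rubin.

```typescript
// Illustrative sketch of the iterative UCD cycle: design, test with
// users, refine, and repeat until usability goals are met.
// All names here are hypothetical placeholders for the example.

interface UsabilityFinding {
  description: string;
  severity: "minor" | "major" | "critical";
}

interface Prototype {
  version: number;
  openIssues: UsabilityFinding[];
}

// Stand-in for a real test session with representative users.
function runUsabilityTest(p: Prototype): UsabilityFinding[] {
  return p.openIssues; // in practice: observe users, record problems
}

function refine(p: Prototype, findings: UsabilityFinding[]): Prototype {
  // Address the reported problems and produce the next design iteration.
  return { version: p.version + 1, openIssues: findings.slice(1) };
}

function iterativeDesign(initial: Prototype, maxIterations = 5): Prototype {
  let current = initial;
  for (let i = 0; i < maxIterations; i++) {
    const findings = runUsabilityTest(current);
    if (findings.length === 0) break; // usability goals met
    current = refine(current, findings);
  }
  return current;
}

console.log(iterativeDesign({ version: 1, openIssues: [
  { description: "Users cannot find the save action", severity: "major" },
] }));
```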

Jakob Nielsen's Ten Usability Heuristics (1–4)

Visibility of system status
– The system should always keep users informed about what is going on, through appropriate feedback within reasonable time.

Match between system and the real world
– The system should speak the users' language, with words, phrases and concepts familiar to the user, rather than system-oriented terms. Follow real-world conventions, making information appear in a natural and logical order.

User control and freedom
– Users often choose system functions by mistake and will need a clearly marked "emergency exit" to leave the unwanted state without having to go through an extended dialogue. Support undo and redo.

Consistency and standards
– Users should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions.
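Two of these heuristics map directly onto implementation patterns: report status while work is in progress, and keep a history so that actions can be undone and redone. The sketch below is a minimal, framework-free illustration; the StatusReporter and UndoStack names are assumptions for the example, not a standard API.

```typescript
// Visibility of system status: keep the user informed while work runs.
// User control and freedom: keep a history so actions can be undone.
// All type and function names here are illustrative, not a standard API.

type Status = "idle" | "working" | "done" | "failed";

class StatusReporter {
  private listeners: ((s: Status, detail?: string) => void)[] = [];
  onChange(fn: (s: Status, detail?: string) => void) { this.listeners.push(fn); }
  report(s: Status, detail?: string) { this.listeners.forEach(fn => fn(s, detail)); }
}

class UndoStack<T> {
  private past: T[] = [];
  private future: T[] = [];
  snapshot(state: T): void { this.past.push(state); this.future = []; }
  undo(current: T): T | undefined {
    const previous = this.past.pop();
    if (previous !== undefined) this.future.push(current);
    return previous;
  }
  redo(current: T): T | undefined {
    const next = this.future.pop();
    if (next !== undefined) this.past.push(current);
    return next;
  }
}

// Example: report status during an edit, then undo it.
const status = new StatusReporter();
status.onChange((s, d) => console.log(`status: ${s}${d ? " - " + d : ""}`));

const history = new UndoStack<string>();
let doc = "draft";
status.report("working", "applying edit");
history.snapshot(doc);          // remember the state before the change
doc = "draft, revised";
status.report("done");
doc = history.undo(doc) ?? doc; // the clearly marked "emergency exit"
console.log(doc);               // "draft"
```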

Jakob Nielsen's Ten Usability Heuristics (5–8)

Error prevention
– Even better than good error messages is a careful design which prevents a problem from occurring in the first place. Either eliminate error-prone conditions or check for them and present users with a confirmation option before they commit to the action.

Recognition rather than recall
– Minimize the user's memory load by making objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.

Flexibility and efficiency of use
– Accelerators, unseen by the novice user, may often speed up the interaction for the expert user such that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions.

Aesthetic and minimalist design
– Dialogues should not contain information which is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility.
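Error prevention and flexibility of use can also be shown in code: guard destructive actions with a confirmation, and offer an accelerator that lets expert users bypass it. The following sketch is illustrative only; the function names and the "Ctrl+D" binding are assumptions made for this example.

```typescript
// Error prevention: ask for confirmation before a destructive action.
// Flexibility and efficiency of use: an accelerator lets expert users
// perform the same action without the confirmation step.
// The names below (ConfirmPrompt, deleteItems, "Ctrl+D") are illustrative only.

interface ConfirmPrompt {
  ask(question: string): boolean; // a real UI would show a dialog here
}

function deleteItems(ids: string[], prompt: ConfirmPrompt, skipConfirm = false): string[] {
  // Check for the error-prone condition before the user commits to it.
  if (!skipConfirm && !prompt.ask(`Delete ${ids.length} item(s)?`)) {
    return []; // the user backed out: nothing is deleted
  }
  return ids; // ids that were actually removed
}

// A tiny accelerator table: experts can bind "Ctrl+D" to delete without
// the confirmation dialogue, while novices keep the safer default path.
const accelerators = new Map<string, () => void>();
accelerators.set("Ctrl+D", () => {
  deleteItems(["item-42"], { ask: () => true }, /* skipConfirm */ true);
});
accelerators.get("Ctrl+D")?.(); // expert path: no interruption

// Novice path: confirmation is always shown.
const alwaysAsk: ConfirmPrompt = { ask: q => { console.log(q); return false; } };
console.log(deleteItems(["item-42"], alwaysAsk)); // [] - the user was protected
```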

Jakob Nielsen's Ten Usability Heuristics (9–10)

Help users recognize, diagnose, and recover from errors
– Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.

Help and documentation
– Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user's task, list concrete steps to be carried out, and not be too large.
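One common way to apply the error-message heuristic is to keep a catalogue that maps internal error codes to plain-language descriptions and recovery suggestions. The sketch below illustrates the idea; the codes and messages are invented for the example.

```typescript
// Help users recognise, diagnose, and recover from errors: translate raw
// error codes into plain language with a concrete suggestion.
// The codes and messages below are invented for illustration.

interface FriendlyError {
  problem: string;    // precisely indicates what went wrong
  suggestion: string; // constructively suggests a way to recover
}

const errorCatalogue: Record<string, FriendlyError> = {
  ERR_DISK_FULL: {
    problem: "The file could not be saved because the disk is full.",
    suggestion: "Free some space or choose a different location, then save again.",
  },
  ERR_NET_TIMEOUT: {
    problem: "The server did not respond in time.",
    suggestion: "Check your connection and retry; your work has not been lost.",
  },
};

function describeError(code: string): FriendlyError {
  return (
    errorCatalogue[code] ?? {
      problem: "Something went wrong while completing the action.",
      suggestion: "Try again, or consult Help for this task.",
    }
  );
}

console.log(describeError("ERR_DISK_FULL").problem);   // no codes shown to the user
console.log(describeError("ERR_UNKNOWN").suggestion);
```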

What Is Task Analysis?

Task analysis is used to identify the sequence of tasks required to complete an activity. It involves breaking tasks down into discrete steps and noting the order in which they occur.

Task analysis is most useful when designing a new system: the computerised version should mimic the manual system in the type and ordering of tasks as closely as possible, so that users can easily make the transition to the new computerised system. It should also expose areas of low productivity which a computerised system could automate.

Any of the following methods can be used to collect information about a system:
– Observation (ethnography)
– Interviews with users
– User logging
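The output of a task analysis can be recorded as a hierarchical breakdown of steps. The sketch below shows one possible structure; the field names and the invoice example are assumptions made for illustration, not a prescribed notation.

```typescript
// A minimal way to record the output of a task analysis: each task is a
// named step with an ordered list of sub-steps, mirroring how the manual
// activity is actually carried out. The structure is an assumption for
// illustration only.

interface TaskStep {
  name: string;
  informationUsed?: string;   // what information is needed at this step
  errorProne?: boolean;       // noted during observation or interviews
  subSteps?: TaskStep[];      // breakdown into discrete, ordered steps
}

const processInvoice: TaskStep = {
  name: "Process a customer invoice",
  subSteps: [
    { name: "Retrieve customer record", informationUsed: "customer number" },
    { name: "Check order against delivery note", errorProne: true },
    { name: "Enter invoice details", informationUsed: "order lines, prices" },
    { name: "File paper copy", errorProne: false },
  ],
};

// Flatten the breakdown to see the sequence a new system must support.
function listSteps(task: TaskStep, depth = 0): string[] {
  const here = `${"  ".repeat(depth)}${task.name}`;
  return [here, ...(task.subSteps ?? []).flatMap(s => listSteps(s, depth + 1))];
}

console.log(listSteps(processInvoice).join("\n"));
```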

Task List Development

– Identify the steps involved in the task.
– What information is used for each task?
– What affects or causes error in task performance?
– What are the good features of the system? What are the bad features?
– What skills are required for each task?
– How are these skills acquired? How are these skills maintained?
– How do external factors (time constraints, environment) affect the task?
– How could you 'improve' the experience of undertaking the task?

User Classification

There are three main types of user:
– Novice
– Knowledgeable / intermittent user
– Expert / frequent user

Users can, however, be classified in any other way that is appropriate to the system being built:
– Some users may have keyboard skills, others not.
– Some users may have knowledge of other similar systems, others not.

The advantage of classification is that generalisations can be made about users and their needs. This does not necessarily mean that the best system has been designed for every individual; it means that the system has been designed to fit the generalisations for each user group.
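A classification like the one above can be captured directly in the design documentation or in code that tailors the interface. The sketch below is illustrative; the groupings, the extra attributes and the derived needs are examples of the generalisations described, not a fixed taxonomy.

```typescript
// User classes are specific to the system being built; the groupings and
// attributes below are examples, not a fixed taxonomy.

type UserClass = "novice" | "intermittent" | "expert";

interface UserProfile {
  userClass: UserClass;
  hasKeyboardSkills: boolean;   // one possible extra classification axis
  knowsSimilarSystems: boolean; // another axis appropriate to some systems
}

// Generalisations made from the classification - a best fit for the
// group, not necessarily the ideal for any one individual.
const needsByClass: Record<UserClass, string[]> = {
  novice: ["step-by-step prompting", "full feedback", "help always visible"],
  intermittent: ["consistent structure", "good help facilities", "good documentation"],
  expert: ["keyboard shortcuts", "brief feedback", "few confirmation screens"],
};

function describeNeeds(profile: UserProfile): string[] {
  return needsByClass[profile.userClass];
}

console.log(describeNeeds({ userClass: "expert", hasKeyboardSkills: true, knowsSimilarSystems: true }));
```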

User Classification: Novice

For the novice user of a system, progress is slow because of the limitations of working memory, and chunking is almost entirely absent. Systems used by novices require more feedback and more opportunities for closure.

User Classification: Guidelines for Novice Users

– All initiatives should come from the computer – the novice may not know what is to be done.
– Each required input should be brief – the shorter it is, the more likely it is to be remembered.
– Input procedures should be consistent with user expectations – humans search for patterns and will generalise.
– No special training should be necessary – especially true for web or multimedia, where the user is 'on their own'.
– All system messages should be clear – in the language of the user, not the designer.
– User decisions should be made from a small set of options – the more of a selection you offer, the harder it is to choose.
– Users should control the pace of interaction – they need to understand the system and feel that they control it, and not the reverse.
– User decision making should be a response to a specific request for action – e.g. "Save? (y/n)".
– Help should always be available – tutor, book or online.
– There should be sufficient feedback – closure.
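Taken together, these guidelines describe an interaction in which the system leads, options are few, and confirmation and help are always at hand. The sketch below illustrates that shape; the dialogue functions are stand-ins invented for the example.

```typescript
// A sketch of a novice-oriented interaction: the computer takes the
// initiative, offers a small set of options, asks for explicit
// confirmation, and always offers help. The prompt/confirm/showHelp
// functions are stand-ins for a real user interface.

interface NoviceDialogue {
  prompt(message: string, options: string[]): string; // small option set
  confirm(message: string): boolean;                  // e.g. "Save? (y/n)"
  showHelp(topic: string): void;                      // help always available
}

function saveDocument(ui: NoviceDialogue): void {
  // The initiative comes from the system, not the user.
  const choice = ui.prompt("What would you like to do?", ["Save", "Discard", "Help"]);

  if (choice === "Help") {
    ui.showHelp("saving your work");
    return;
  }
  // Decision making is a response to a specific request for action.
  if (choice === "Save" && ui.confirm("Save your changes now?")) {
    console.log("Saved.");            // clear feedback gives closure
  } else {
    console.log("Nothing was saved.");
  }
}

// Console-based stand-in so the sketch runs without a real interface.
saveDocument({
  prompt: (m, opts) => { console.log(m, opts.join(" / ")); return "Save"; },
  confirm: m => { console.log(m); return true; },
  showHelp: t => console.log(`Help: ${t}`),
});
```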

User Classification: Knowledgeable / Intermittent and Expert Users

Knowledgeable / intermittent users
– These users need consistent structures, good help facilities and good documentation.

Expert / frequent users
– Experts expect fast response times and require only brief feedback.
– Experts organise their knowledge according to a higher-level conceptual structure; they can recall more than novices because their knowledge is chunked.
– Expert users will look for keyboard shortcuts and abbreviated command sequences.
– Experts can find constant confirmation screens irritating – use these only when important.
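The contrast between novice and expert needs often shows up as simple conditions in the interface logic: brief feedback and fewer confirmations for experts, fuller feedback for everyone else. The sketch below illustrates this; the messages and the rule for when to confirm are assumptions made for the example.

```typescript
// Adapting feedback and confirmations to the user class: experts get
// brief feedback and are only interrupted for important confirmations.
// The function names and messages are illustrative assumptions.

type UserClass = "novice" | "intermittent" | "expert";

function feedbackFor(userClass: UserClass, action: string): string {
  // Experts need only brief feedback; novices get a fuller explanation.
  return userClass === "expert"
    ? `${action}: done`
    : `The action "${action}" completed successfully. You can undo it from the Edit menu.`;
}

function needsConfirmation(userClass: UserClass, destructive: boolean): boolean {
  // Constant confirmation screens irritate experts - confirm only when
  // the action is important (destructive); always confirm for novices.
  return userClass === "expert" ? destructive : true;
}

console.log(feedbackFor("expert", "Delete 3 files"));   // "Delete 3 files: done"
console.log(needsConfirmation("expert", false));        // false - no interruption
console.log(needsConfirmation("novice", false));        // true
```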