Multimedia Specification Design and Production
2013 / Semester 1 / Week 9
Lecturer: Dr. Nikos Gazepidis


2 Learning outcomes
- Evaluation (more about this topic)
- Evaluation methods
- Empirical evaluation (necessary steps)
- Analytical evaluation

3 Reading list
1. Notes for Lecture week_8: introduction to evaluation
2. Faulkner Chapter 6, pp 137 – 146 (stop before 6.5.1), 6.6 – 6.16, pp 156 – 173, and Chapter 7, pp

4 Evaluation
1. Central to user-centred iterative development and carried out throughout the development process (although developers often feel they can undertake it by themselves)
2. Linked to every other activity in the design cycle
3. Developers are often tempted to skip it because it adds to development time and costs money and effort

5 EVALUATION provides the opportunity to:
- help to ensure that the system is usable
- help to ensure that the final system is what users want
- end up cheaper than fixing problems identified later on

Evaluation is the process of gathering data so we can answer the questions:
- why you are doing it
- what you hope to achieve
within the inevitable practical constraints:
- availability of facilities/equipment
- easy access to users
- expertise
- time
- budget
- ethical issues

6 An introduction to Evaluation: Terminology

Evaluation … the process of systematically gathering data at various stages within the development process, which can be used to improve the designers’ understanding of the users’ requirements and amend the design to meet users’ needs. It can employ a range of techniques, some involving users directly, at different stages, to examine different aspects of the design.

7 Evaluation methods
1. Empirical – based on user experience
2. Analytical – based on the expert view
3. Heuristic

8 1. Empirical evaluation: necessary steps
1. Start with a clear understanding of what questions need to be answered – i.e. what the evaluation aims to find out – and set appropriate targets
2. Develop the evaluation activity (see the sketch after this list):
   - selecting participants to perform tasks
   - developing tasks for participants to perform (benchmark and representative tasks) – give the participants focused activities
   - determining the protocol and procedures for the evaluation sessions
   - pilot testing may be necessary to improve the experiment
3. Direct the evaluation sessions
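The task-development step is often captured as a small test plan before any sessions are run. Below is a minimal sketch in Python, assuming a hypothetical BenchmarkTask record; the task descriptions and target figures are invented for illustration, not taken from the lecture.

```python
from dataclasses import dataclass

@dataclass
class BenchmarkTask:
    """One focused activity given to every participant (illustrative fields only)."""
    task_id: str
    instructions: str      # what the participant is asked to do
    target_time_s: float   # performance target taken from the usability specification
    max_errors: int        # acceptable number of errors for the task

# Hypothetical benchmark tasks for a multimedia application
TASKS = [
    BenchmarkTask("T1", "Locate and play the week 9 lecture video",
                  target_time_s=30, max_errors=1),
    BenchmarkTask("T2", "Add a bookmark at the 5-minute mark of the video",
                  target_time_s=20, max_errors=0),
]
```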

9 1. Empirical evaluation: necessary steps
4. Generate data by:
   - quantitative methods: benchmark tasks, user questionnaires
   - qualitative methods: concurrent verbal protocol, retrospective verbal protocol, critical incident reporting, structured interviews
5. Collect data, in order to have evidence on which to base the evaluation:
   - real-time note-taking
   - audiotaping
   - videotaping
   - internal instrumentation of the interface (see the sketch after this list)
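"Internal instrumentation of the interface" simply means the application records its own usage events during a session, from which quantitative measures such as task times and error counts can later be derived. A minimal sketch, assuming a hypothetical SessionLogger and invented event names:

```python
import json
import time

class SessionLogger:
    """Records timestamped interface events so task times and error counts
    can be computed after the evaluation session."""

    def __init__(self, participant_id: str):
        self.participant_id = participant_id
        self.events = []

    def log(self, event: str, **details):
        self.events.append({"t": time.time(), "event": event, **details})

    def save(self, path: str):
        with open(path, "w") as f:
            json.dump({"participant": self.participant_id,
                       "events": self.events}, f, indent=2)

# Usage during a session (hypothetical participant and events)
logger = SessionLogger("P03")
logger.log("task_start", task_id="T1")
logger.log("error", task_id="T1", kind="wrong_menu")
logger.log("task_end", task_id="T1")
logger.save("p03_session.json")
```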

10 1. Empirical evaluation: necessary steps
6. Analyze the data, comparing results with the targets in the usability specification (see the sketch after this list)
7. Draw conclusions to form a resolution of each design problem
8. Depending on the outcome of the evaluation, it may be necessary to redesign and implement the revisions
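Comparing results with the usability specification can be as simple as checking each measured value against its target and flagging every miss as a design problem to resolve. A minimal sketch, with the metric names, targets and measured figures assumed purely for illustration:

```python
# Targets from a (hypothetical) usability specification, per benchmark task
targets = {"T1": {"time_s": 30, "errors": 1},
           "T2": {"time_s": 20, "errors": 0}}

# Measured results aggregated across participants (invented data)
measured = {"T1": {"time_s": 42.5, "errors": 2},
            "T2": {"time_s": 18.0, "errors": 0}}

def compare(targets, measured):
    """Flag each metric that misses its target so it becomes a design
    problem to be resolved (steps 7 and 8)."""
    problems = []
    for task, goals in targets.items():
        for metric, limit in goals.items():
            value = measured[task][metric]
            if value > limit:
                problems.append(f"{task}: {metric} = {value} exceeds target {limit}")
    return problems

for problem in compare(targets, measured):
    print(problem)
```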

11 2. Analytical evaluation

The following section of notes has been compiled from papers on Jakob Nielsen’s website. Faulkner Chapter 7 provides more detail about the various methods briefly outlined below.

Usability inspection is a set of methods that are all based on having evaluators inspect and analyse a user interface. Typically, usability inspection is aimed at finding usability problems in the design, though some methods can also evaluate the overall usability of an entire system. Inspection methods can be applied early in the interaction development lifecycle, allowing feedback, iteration and improvement.

12 2. Analytical evaluation

Heuristic evaluation is the most informal method and involves having usability specialists judge whether each dialogue element follows established usability principles (the heuristics).

Heuristic estimation is a variant in which the inspectors are asked to estimate the relative usability of two (or more) designs in quantitative terms (typically expected user performance).

13 2. Analytical evaluation

Cognitive walkthrough uses a more detailed procedure to simulate a user's problem-solving process at each step through the dialogue, checking whether the simulated user's goals and memory content can be assumed to lead to the next correct action (see the sketch below).

Pluralistic walkthrough uses group meetings where users, developers, and human factors people step through a scenario, discussing each dialogue element.
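The step-by-step questioning of a cognitive walkthrough is usually captured on a simple record form. A minimal sketch of one such record, with the questions paraphrased from the commonly used walkthrough procedure and the example action invented:

```python
from dataclasses import dataclass

@dataclass
class WalkthroughStep:
    """One action in the task sequence, with the analyst's answers to the
    walkthrough questions (True means the simulated user succeeds)."""
    action: str
    will_user_try_correct_action: bool    # does the user's goal suggest this action?
    is_control_visible: bool              # can the user find the control?
    will_user_connect_control_to_goal: bool
    is_feedback_understandable: bool      # does feedback show progress toward the goal?

    def problems(self):
        checks = {
            "user may not try the correct action": self.will_user_try_correct_action,
            "control is not visible": self.is_control_visible,
            "control not obviously linked to the goal": self.will_user_connect_control_to_goal,
            "feedback is unclear": self.is_feedback_understandable,
        }
        return [issue for issue, ok in checks.items() if not ok]

# Hypothetical step in a "save a playlist" scenario
step = WalkthroughStep("Open the File menu and choose 'Save playlist'",
                       True, True, False, True)
print(step.problems())  # ['control not obviously linked to the goal']
```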

14 2. Analytical evaluation

Feature inspection lists the sequences of features used to accomplish typical tasks and checks for long sequences, cumbersome steps, steps that would not be natural for users to try, and steps that require extensive knowledge or experience, in order to assess a proposed feature set.

Consistency inspection has designers who represent multiple other projects inspect an interface to see whether it does things in the same way as their own designs.

15 2. Analytical evaluation

Standards inspection has an expert on an interface standard inspect the interface for compliance.

Formal usability inspection combines individual and group inspections in a six-step procedure with strictly defined roles, mixing elements of both heuristic evaluation and a simplified form of cognitive walkthrough.

16 2. Analytical evaluation

Heuristic evaluation, heuristic estimation, cognitive walkthrough, feature inspection, and standards inspection normally have the interface inspected by a single evaluator at a time. In contrast, pluralistic walkthrough and consistency inspection are group inspection methods.

Many usability inspection methods are so easy to apply that regular developers can serve as evaluators, though better results are normally achieved when using usability specialists.

17 3. Heuristic evaluation

Heuristics – broad-based rules or principles derived from theoretical knowledge (e.g. cognitive psychology) and practical experience.

- Heuristic evaluation is the most popular usability inspection method
- Heuristics can be used to inform the design, as well as providing a checklist for evaluation
- Heuristic evaluation allows quick, cheap and easy evaluation of a user interface design, and is therefore known as a ‘discount usability engineering’ method
- Heuristic evaluation aims to identify usability problems which can then be fixed within the iterative design process
- Heuristic evaluation involves a small set of evaluators examining an interface and judging its compliance with recognized usability principles (see the sketch below)
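In practice, each evaluator's findings are noted against the heuristics and then merged across the small set of evaluators so problems can be prioritised for the iterative design process. A minimal sketch, using a few of Nielsen's widely published heuristics as the checklist; the findings and severity ratings are invented:

```python
from collections import defaultdict

# A subset of Nielsen's usability heuristics, used here as the checklist
HEURISTICS = {
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
}

# Findings per evaluator: (heuristic violated, problem description, severity 0-4)
findings = {
    "Evaluator A": [("Visibility of system status",
                     "No progress indicator while the video buffers", 3)],
    "Evaluator B": [("Visibility of system status",
                     "No progress indicator while the video buffers", 2),
                    ("Consistency and standards",
                     "Two different icons are used for 'play'", 2)],
}

def merge(findings):
    """Group problems by heuristic and keep the highest severity reported,
    so fixes can be prioritised within the iterative design process."""
    merged = defaultdict(dict)
    for evaluator, items in findings.items():
        for heuristic, problem, severity in items:
            assert heuristic in HEURISTICS, f"unknown heuristic: {heuristic}"
            merged[heuristic][problem] = max(merged[heuristic].get(problem, 0), severity)
    return dict(merged)

print(merge(findings))
```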