Evaluation

• Formative
  – There are many times throughout the software development lifecycle when a designer needs answers to questions that check whether his or her ideas match those of the user(s). Such evaluation is known as formative evaluation because it (hopefully) helps shape the product. User-centred design places a premium on formative evaluation methods.
• Summative
  – Summative evaluation, in contrast, takes place after the product has been developed.

Context of Formative Evaluation

• Evaluation is concerned with gathering data about the usability of a design or product by a specific group of users for a particular activity within a definite environment or work context.
• Regardless of the type of evaluation, it is important to consider:
  – users: the characteristics of the users
  – activities: the types of activities they will carry out
  – environment: the environment of the study (controlled laboratory? field study?)
  – nature: the nature of the artefact or system being evaluated (sketches? prototype? full system?)

Reasons for Evaluation

• Understanding the real world
  – particularly important during requirements gathering
• Comparing designs
  – there are rarely designs without alternatives
  – valuable throughout the development process
• Engineering towards a target
  – often expressed in the form of a metric
• Checking conformance to a standard

Classification of Evaluation Methods

• Observation and Monitoring
  – data collection by note-taking, keyboard logging, video capture
• Experimentation and Benchmarking
  – statement of hypothesis, control of variables
• Collecting users' opinions
  – surveys, questionnaires, interviews
• Interpreting situated events
• Predicting usability

Observation and Monitoring - Direct Observation Protocol

• Usually informal in field studies, more formal in controlled laboratories
• Data collection by direct observation and note-taking
  – users in "natural" surroundings
  – "objectivity" may be compromised by the point of view of the observer
  – users may behave differently while being watched (Hawthorne effect)
  – an ethnographic, participatory approach is an alternative

Observation and Monitoring - Indirect Observation Protocol

• Data collection by remote note-taking, keyboard logging, video capture
  – users need to be briefed fully; a policy must be decided upon and agreed about what to do if they get "stuck"; tasks must be justified and prioritised (easiest first)
  – video capture permits post-event "debriefing" and avoids the Hawthorne effect (however, users may behave differently in an unnatural environment)
  – with data-logging, vast amounts of low-level data are collected; difficult and expensive to analyse
  – the interaction of variables may be more relevant than any single one (lack of context)
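The data-logging point above can be made concrete with a minimal sketch (a hypothetical illustration, not part of the original slides): even a few user actions produce a stream of timestamped low-level records, and nothing in the log itself says which task the user was performing.

```python
import json
import time

class InteractionLogger:
    """Minimal data-logger sketch: appends timestamped UI events as JSON lines."""

    def __init__(self):
        self.records = []

    def log(self, event_type, detail):
        # Record a single low-level event with a wall-clock timestamp.
        self.records.append({
            "t": time.time(),
            "event": event_type,
            "detail": detail,
        })

    def dump(self):
        # One JSON object per line: cheap to write and easy to filter,
        # but hard to interpret without task context, which is exactly
        # the analysis problem noted above.
        return "\n".join(json.dumps(r) for r in self.records)

logger = InteractionLogger()
logger.log("keypress", "s")
logger.log("keypress", "a")
logger.log("click", "Save button")
print(len(logger.records))  # 3 low-level events from 3 user actions
```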

Experimentation and Benchmarking

• "Scientific" and "engineering" approach
• Utilises standard scientific investigation techniques
• Selection of benchmarking criteria is critical... and sometimes difficult (e.g., for OODBMS)
• Control of variables, especially user groups, may lead to "artificial" experimental bases
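As an illustration of the experimental approach (using made-up data, not figures from the slides), comparing task-completion times for two competing designs can be done with a simple two-sample test; Welch's t statistic is sketched below using only the standard library.

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    va, vb = variance(a), variance(b)  # sample variances (n - 1 denominator)
    na, nb = len(a), len(b)
    return (mean(a) - mean(b)) / math.sqrt(va / na + vb / nb)

# Hypothetical benchmark: task-completion times (seconds) for two designs.
design_a = [41.0, 38.5, 44.2, 40.1, 39.7, 42.3]
design_b = [47.8, 45.1, 50.3, 46.9, 48.2, 44.7]

t = welch_t(design_a, design_b)
print(round(t, 2))  # negative t: design_a times are lower
```

In a real study the statistic would be compared against a t distribution with the Welch-Satterthwaite degrees of freedom, and the controlled variables (user group, task order) would matter as much as the arithmetic.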

Collecting Users' Opinions

• Surveys
  – critical mass and breadth of the survey are critical for statistical reliability
  – sampling techniques need to be well grounded in theory and practice
  – questions must be consistently formulated, clear, and must not "lead" to specific answers
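The sampling point above can be sketched in a few lines (a hypothetical participant pool, for illustration only): simple random sampling without replacement gives every user the same chance of selection, and a fixed seed keeps the draw reproducible for the study record.

```python
import random

# Hypothetical pool of registered users; in practice this would come
# from a user database or sign-up list.
population = [f"user{i:03d}" for i in range(200)]

# Simple random sampling: every user has an equal chance of selection.
rng = random.Random(42)
sample = rng.sample(population, 20)

print(len(sample))       # 20 participants drawn
print(len(set(sample)))  # 20 distinct: sampling is without replacement
```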

Collecting Users' Opinions - Verbal Protocol

• (Individual) Interviews
  – can be conducted during or after user interaction
     during: immediate impressions are recorded
     during: may be distracting during complex tasks
     after: no distraction from the task at hand
     after: may lead to misleading results (short-term memory loss, "history rewritten", etc.)
  – can be "structured" or not
     a structured interview is like a personal questionnaire, with prepared questions

Collecting Users' Opinions

• Questionnaires
  – "open" (free-form reply) or "closed" (answers "yes/no" or chosen from a wider range of possible answers)
     the latter is better for quantitative analysis
  – important to use clear, comprehensive and unambiguous terminology, quantified where possible
     e.g., "daily?", "weekly?", "monthly?" rather than "seldom" and "often"; and there should always be a "never" option
  – needs to allow for "negative" feedback
  – all form fill-in guidelines apply!
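A closed question designed along these lines tallies directly into frequencies; the sketch below uses made-up responses to a hypothetical "How often do you use feature X?" question, with quantified options and an explicit "never" category as the guideline suggests.

```python
from collections import Counter

# Quantified closed-question options (daily/weekly/monthly) rather than
# vague "seldom"/"often", with "never" always available.
SCALE = ["daily", "weekly", "monthly", "never"]

# Hypothetical responses from ten participants.
responses = ["weekly", "daily", "never", "weekly", "monthly",
             "weekly", "daily", "never", "weekly", "monthly"]

counts = Counter(responses)
for option in SCALE:
    n = counts.get(option, 0)
    print(f"{option:>8}: {n:2d} ({100 * n / len(responses):.0f}%)")
```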

Relationship between Types of Evaluation and Reasons for Evaluation

(The original slide showed a matrix relating the four reasons for evaluation (understanding the real world, comparing designs, engineering towards a target, and checking standards conformance) to the five classes of evaluation method (observing and monitoring, collecting users' opinions, experiments etc., interpretive, and predictive), with a "Y" marking each combination for which the method class is suitable. The cell-by-cell assignments are not recoverable from the transcript.)