Lecture 12: Evaluation and Monitoring of Collaborative Virtual Environments/Groupware Dr. Xiangyu WANG.


Groupware Usability
Gutwin and Greenberg (2000) define groupware usability as "... the degree to which a groupware system supports the mechanics of collaboration for a particular set of tasks."

Facts
Groupware/CVEs are traditionally considered difficult to evaluate because of the effects of multiple people and the social and organizational context. Researchers and developers have employed a range of techniques, including scientific, engineering, and social science methodologies.

Pinelle and Gutwin (2000) reviewed 45 papers from the ACM Conference on Computer Supported Cooperative Work.
–Almost one-third of the groupware systems were not evaluated in any formal way.
–Only about one-quarter of the articles included evaluations in a real-world setting.
–A wide variety of evaluation techniques are in use.
Their main conclusions are:
–More attention must be paid to evaluating groupware systems.
–There is room for additional evaluation techniques that are simple and low in cost.

Classifying groupware/CVEs evaluations
–Type of evaluation
–Characteristics of the evaluation
–Data collection techniques
–Placement of the evaluation in the software development cycle
–Type of conclusions drawn from the evaluation

Classifying groupware/CVEs evaluations
Type of evaluation
–The major differentiating characteristics between the strategies are the level of experimental manipulation and the evaluation setting.

Classifying groupware/CVEs evaluations
Characteristics of the evaluation: evaluations are further classified according to the rigor of the experimental manipulation and the type and rigor of measurements.
–Quantitative vs. qualitative
–Manipulation: formal/rigorous, minimal manipulation, or no manipulation
–Measurement: formal/rigorous vs. informal

Classifying groupware/CVEs evaluations
Techniques for data collection:
–User observation
–Interview
–Discussion
–Questionnaire
–Qualitative work measures
–Quantitative work measures
–Collection of archival material

Classifying groupware/CVEs evaluations
Placement of evaluation in the software lifecycle
–Grudin [14] stresses the importance of evaluation over a period of time following groupware implementation. He also argues that evaluations of partial prototypes in laboratory settings are not able to address complex social and organizational issues.

Classifying groupware/CVEs evaluations
Six potential placements of the evaluation were considered:
–Periodic evaluations throughout the development process
–Continuous evaluation throughout development
–Evaluation of a prototype
–Evaluation of a finished piece of software
–Periodic evaluations after software implementation
–Continuous evaluation after software implementation

Classifying groupware/CVEs evaluations
Focus of the evaluation. Types of evaluation focus include:
–Organizational impact / impact on work practices
–End product produced through using the software
–Efficiency of task performance using the software
–User satisfaction with the software
–Task support provided by the software
–Specific features of the groupware interface
–Patterns of system use
–User interaction while using the software

Evaluation Methods
–Heuristic Evaluation
–Controlled Experiments
–Survey Methods: Surveys & Questionnaires
–Ethnographic Methods
–Logging & Automated Metrics

Heuristic Evaluation (HE)
A well-accepted and widely used discount evaluation technique for single-user systems is Heuristic Evaluation (HE). HE involves a group of usability evaluators inspecting the system to identify usability issues. The issues are identified with respect to a set of usability guidelines, or heuristics. HE is popular because it is cheap, doesn't necessarily require representatives from the user community, and is effective at finding usability problems.

Heuristic Evaluation (HE)
How to perform a Heuristic Evaluation study:
–Orientation: the purpose of the orientation session is to educate and train the evaluators.
–Inspection: during the evaluation process, the evaluators are asked to use and explore the interface. While using the interface, they are asked to identify usability problems that violate the heuristics.
–Debriefing: the purpose of the debriefing session is to analyze the usability problems that have been identified. Debriefing should include all evaluators, usability team members, and observers. The evaluators should go through the problems that they have found and classify them first by which heuristic they violate, and second by the severity of the problem. Problem severity is gauged by taking into account the frequency, impact, and persistence of the problem.
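The debriefing step above can be sketched in code. This is a minimal illustration, not part of the HE method as stated: the 0–4 rating scale, the equal weighting of the three factors, and the `UsabilityProblem` helper are all illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class UsabilityProblem:
    """One problem found during inspection, rated in debriefing."""
    description: str
    heuristic: str     # which groupware heuristic is violated
    frequency: int     # how often users hit the problem (0-4, assumed scale)
    impact: int        # how hard it is to overcome (0-4, assumed scale)
    persistence: int   # does it recur once learned? (0-4, assumed scale)

    def severity(self) -> float:
        # Equal weighting of the three factors is an assumption,
        # not a formula prescribed by the lecture.
        return (self.frequency + self.impact + self.persistence) / 3

problems = [
    UsabilityProblem("Single fixed-shape telepointer per user",
                     "gestural communication", 3, 2, 4),
    UsabilityProblem("Video gestures pull attention from the workspace",
                     "gestural communication", 2, 2, 2),
]

# Debriefing output: problems sorted worst-first, labelled by heuristic
for p in sorted(problems, key=lambda p: p.severity(), reverse=True):
    print(f"{p.severity():.1f}  [{p.heuristic}] {p.description}")
```

Sorting worst-first gives the team a triage order for fixes; grouping by the `heuristic` field reproduces the classification step described above.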

Heuristic Evaluation (HE)
Below is a list of the groupware heuristics, with a brief description of the behaviour the heuristic relates to in a face-to-face setting.
–Provide the means for intentional and appropriate verbal communication. The dominant form of communication in groups is verbal conversation. Conversations are used to establish a common understanding of the tasks and activities that the group is participating in.
–Provide the means for intentional and appropriate gestural communication
–Provide consequential communication of an individual's embodiment
–Provide consequential communication of shared artifacts
–Provide protection
–Management of tightly and loosely coupled collaboration
–Allow people to coordinate their actions
–Facilitate finding collaborators and establishing contact

Heuristic Evaluation Results
Provide the means for intentional and appropriate verbal communication:
–A chat window and the audio channel.
Provide the means for intentional and appropriate gestural communication:
–Illustrative and emblem gestures are supported by the video stream, although taking in these gestures through the video comes at the cost of moving attention away from the workspace area.
–The pointing-hand telepointer icon can be seen on the whiteboard. The telepointer is a poor expression tool, though: it is a fixed shape and there is only one of them for each user. It falls far short of the expressiveness of a pair of human hands.
…

Heuristic Evaluation Source
Heuristic Evaluation of Groupware, by Gregor McEwan. URL: EG.html

Survey Methods
–Observations
–Interviews (visiting, telephone, indirect)
–Questionnaire
–Diary
–…

Questionnaires
Before conducting a study: practicalities
–Time
–Cost
–Range
–Questions (type, wording, order)
–Demands reflection
–Instructions
–Pictures, anonymous cards
–Language, knowledge

Checklist for constructing a questionnaire
–Be concrete
–Think about effects due to the order of posing your questions (context)
–Avoid ambiguity
–Watch for hidden assumptions
–Be careful with yes/no questions

Question types
–Open questions
–List
–Category
–Ranking
–Scale
–Quantity
–Table

Appearance and layout
–Word processed
–Clear instructions
–Spacing between questions
–Keep response boxes in line (left/right)
–Guide the respondent the right way
–Promise anonymity and confidentiality

Administering the questionnaire
–Electronic, snail mail, or face-to-face
–Self-addressed envelope
–Instruction, information letter
–Anonymous, confidential

Piloting the questionnaire
–How long did it take you to complete?
–Were the instructions clear?
–Were any of the questions unclear or ambiguous? If so, will you say which and why?
–Did you object to answering any of the questions?
–In your opinion, has any major topic been omitted?
–Was the layout of the questionnaire clear/attractive?
–Any comments?

Ethnographic Methods
Ethnography has been adapted from sociology and anthropology, where it is a method of observing human interactions in social settings and activities.

Ethnographic Methods
There are several reasons why ethnography is of vital importance to good interface design, including these:
–An ethnographic study is a powerful assessment of users' needs.
–It uncovers the true nature of the user's job: a goal of an ethnographic study is to uncover all the tasks and relationships that combine to form a user's job.
–The open-ended and unbiased nature of ethnography allows for discovery: the unassuming nature of ethnography can often yield unexpected revelations about how a system is used.
Drawbacks:
–Time requirements
–Presentation of results
–Scale

Logging and Automated Metrics
Logging can be manual or automated. Automated logging involves having the computer collect statistics about the detailed use of a system. Typically, an interface log will contain statistics about:
–the frequency with which each user has used each feature in the program
–the frequency with which various events of interest have occurred
Statistics showing the frequency of use of commands and other system features can be used to optimize frequently used features and to identify features that are rarely used or not used at all. In addition, patterns of use can be analyzed from the logging data. Statistics showing the frequency of various events, such as error situations and the use of online help, can be used to improve the usability of future releases of the system.
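A minimal automated log of the kind described above can be sketched as timestamped event records plus frequency counters. The `UsageLog` class, its method names, and the feature/event names below are hypothetical illustrations, not from any particular groupware system.

```python
import time
from collections import Counter

class UsageLog:
    """Collects per-feature and per-event usage statistics (illustrative)."""

    def __init__(self):
        self.features = Counter()   # feature name -> use count
        self.events = Counter()     # event type -> occurrence count
        self.records = []           # raw timestamped log entries

    def log_feature(self, user, feature):
        self.features[feature] += 1
        self.records.append((time.time(), user, "feature", feature))

    def log_event(self, event):
        # e.g. "error" or "help-opened": events whose frequency hints
        # at usability problems in the current release
        self.events[event] += 1
        self.records.append((time.time(), None, "event", event))

    def rarely_used(self, threshold=1):
        """Features used at most `threshold` times: candidates for review."""
        return [f for f, n in self.features.items() if n <= threshold]

log = UsageLog()
log.log_feature("alice", "whiteboard")
log.log_feature("bob", "whiteboard")
log.log_feature("alice", "chat")
log.log_event("error")
print(log.features.most_common())   # most-used features first
print(log.rarely_used())            # features used at most once
```

The same counters support both uses named in the slide: `most_common()` highlights features worth optimizing, while `rarely_used()` and the event counts flag features and error situations to revisit in the next release.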

Next week: final design presentation
This is a group presentation; make sure you all present a part of it. Each group will have 15 minutes. The format is free; PowerPoint slides would be best.
–Talk about the design concepts/brief and show screenshots of the final designs.
–Talk about the design collaboration process.
Your presentation will help us understand whether you have gained knowledge of design collaboration and are aware of the issues related to distributed design collaboration. So use this chance to show us that you understand design collaboration and what kind of features a collaborative virtual environment should have.