A METHODOLOGY TO EVALUATE THE USABILITY OF DIGITAL SOCIALIZATION IN ‘VIRTUAL’ ENGINEERING DESIGN Amjad El-Tayeh, Nuno Gil, Jim Freeman Centre for Research in the Management of Projects (CRMP) Manchester Business School

2 Problem Statement

Socialization (knowledge creation theory):
- Conversion of individual tacit knowledge into group tacit knowledge without codifying it (Nonaka 1990)
- Processes: storytelling, apprenticeship, meetings, job rotation, physical socialization (Nonaka 1990)

Challenges for physical socialization in project-based environments:
- People work together temporarily
- Individuals come from different occupational backgrounds
- Global teams mean people are increasingly NOT co-located

How can digital socialization be facilitated in project-based environments?

3 Aim

Develop a methodology to evaluate the usability of prototypes for supporting digital socialization within geographically dispersed, or 'virtual', engineering design teams.

4 Methodology

- Develop IDRAK, a proof-of-concept Rich Internet Application to promote digital socialization in design
- Use IDRAK to play the Delta Design game (Bucciarelli 1994), which simulates an engineering design team undertaking the project feasibility stage
- Evaluate the usability of IDRAK to support the work of 'virtual' teams
- Measure the usability of IDRAK in terms of satisfaction, efficiency, and effectiveness
- Compare digital and board game results

5 IDRAK: Proof-of-Concept I

IDRAK user interface snapshot (1):
- Navigation Bar
- Social Proxy (passive)
- Social Proxy (active)
- Digital ID
- Knowledge Map

6 IDRAK: Proof-of-Concept II

IDRAK user interface snapshot (2):
- Instant Dialogue
- Knowledge Repository

7 Validation Methodology

Delta Design:
- Original board-based exercise developed at MIT
- Four individuals (architect, project manager, structural engineer, thermal engineer) must develop a two-dimensional concept
- Players place triangular red and blue tiles on a diamond-shaped grid
- Players iteratively play the design output against the design brief and specifications

[Delta Design game window]

8 Validation

On testing day:
- Set of instructions provided ex-ante
- Each user allocated one PC
- Teams play for 120 min
- No verbal talk allowed

Data collection methods:
- 12 experiments (MBA students, MSc students, Arup team)
- Content coding of digital dialogues:
  - Work Talk
  - Work Coordination
  - Social Talk
  - IDRAK Related
- Usability questionnaire
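The content-coding step above amounts to tallying each chat message under one of the four categories. A minimal sketch in Python follows; the message log and speaker names are hypothetical stand-ins for the study's actual dialogue data, and only the four category labels come from the slide.

```python
from collections import Counter

# Hypothetical coded dialogue log: (speaker, category) pairs. Each chat
# message is assumed to have been assigned exactly one of the four codes
# named on the slide by a human coder.
coded_messages = [
    ("architect", "Work Talk"),
    ("project_manager", "Work Coordination"),
    ("architect", "Social Talk"),
    ("thermal_engineer", "Work Talk"),
    ("structural_engineer", "IDRAK Related"),
    ("project_manager", "Work Talk"),
]

# Tally messages per category and convert to shares of the whole dialogue.
counts = Counter(category for _, category in coded_messages)
total = sum(counts.values())
shares = {category: n / total for category, n in counts.items()}

print(counts)
print(shares)
```

Category shares like these make dialogues from different teams comparable even when teams exchanged different numbers of messages.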

9 Effectiveness

- Measure the quality of the design outcome in relation to the design brief and specs
- Statistically compare design output from 12 digital and 28 board experiments

10 Analysis Technique

Conduct independent-samples t-tests to assess the comparative performance of 'virtual' and co-located engineering design teams. Relevant data plots indicate that the underlying distributions conformed to assumptions of normality.
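The test statistic behind the slide's analysis can be sketched with only the Python standard library. The two score lists below are made-up illustrations, not the experiments' data; in practice `scipy.stats.ttest_ind` would be used, since it also returns the p-value.

```python
import statistics as st

def t_statistic(a, b):
    """Student's pooled-variance independent-samples t statistic."""
    na, nb = len(a), len(b)
    # Pool the two sample variances, weighted by degrees of freedom.
    pooled_var = ((na - 1) * st.variance(a) + (nb - 1) * st.variance(b)) / (na + nb - 2)
    # Difference in means divided by its standard error.
    return (st.mean(a) - st.mean(b)) / (pooled_var * (1 / na + 1 / nb)) ** 0.5

# Made-up scores standing in for one design criterion measured in the
# board and digital experiments.
board = [71.2, 69.8, 70.5, 72.0, 70.9, 71.5]
digital = [68.0, 74.5, 66.2, 75.8, 70.1]

print(t_statistic(board, digital))
```

The pooled-variance form assumes roughly equal variances in the two groups; given the findings about higher digital variability, Welch's unequal-variance variant would be a defensible alternative.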

11 T-test Statistics

Design criteria and targets by discipline (the slide tabulates, for each criterion, the mean and standard deviation from the board exercise, 28 experiments, and the digital exercise, 12 experiments, together with T-values):

- Architecture: Functional Internal Area (FIA), target: maximum; Blue Deltas (% of total) (MCD) (§), target: upper limit
- Thermal Engineering: Average Internal Temperature Range (AITR) (*); Minimum Individual Delta Temperature Range (min IDTR); Maximum Individual Delta Temperature Range (max IDTR) (*)
- Structural Engineering: Maximum Load at Anchor Points (MLAP), target: upper limit; Maximum Internal Moment (MIM), target: upper limit
- Project Management: Total Budget (TB) (*), target: upper limit; Building Time (BT), target: shortest

(*) p < 0.05

12 Findings

- Digital results tend to exhibit higher variability than board-based results: the performance of 'virtual' engineering design teams is less consistent than that of co-located teams
- No statistically significant differences in the mean performance of architects and structural engineers
- Thermal engineers participating in virtual teams tend to perform less well than those in co-located teams
- 'Virtual' project managers can be more successful than co-located project managers in ensuring that design outcomes meet their own key criteria (total budget and building time)
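The slides report higher variability for the digital results but do not say how variability was compared. One simple check is the ratio of sample variances (an F-ratio); the data below are illustrative only.

```python
import statistics as st

def variance_ratio(a, b):
    """Ratio of sample variances. Values well above 1 suggest the first
    sample is more variable than the second; for a formal test, compare
    against an F distribution with len(a)-1 and len(b)-1 degrees of
    freedom."""
    return st.variance(a) / st.variance(b)

# Illustrative scores: the digital sample is deliberately more spread out.
digital = [60.0, 80.0, 55.0, 85.0, 70.0]
board = [69.0, 71.0, 70.0, 72.0, 68.0]

print(variance_ratio(digital, board))
```

With small, unequal sample sizes like 12 vs. 28, a robust alternative such as Levene's test would give a more reliable verdict than the raw F-ratio.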

13 Contributions

- A conceptual framework
  - Grounded in empirical findings and extant theory
- Proof-of-concept (IDRAK)
  - Illustrates an original implementation of the conceptual framework in a digital environment
- Novel methodology to validate the usability of prototypes
  - Evaluates the usability of proof-of-concept prototypes to support digital socialization in engineering design

14 Limitations

- Laboratory game simulation is not reality
  - The real world is much more complex than a lab setting
  - Engineering design involves more than four people
  - Decision-making involves many more variables and problems
  - The exercise itself only mimics the demands of work at the early design stage
- IDRAK does not include a reputation system
- Only one channel of communication (IDRAK) was permitted

15 Questions Thank You