

Human Factor Evaluation for Marine Education Using Neuroscience Tools. N. Nikitakos, Professor; D. Papachristos, Ph.D. candidate. Dept. of Shipping, Trade and Transport, University of the Aegean. MASSEP 2013, May 2013

CONTENTS: Human Factors Evaluation, Research Methodology, Case Study

HUMAN FACTORS EVALUATION

Maritime education: user satisfaction, objective criteria, satisfaction phenomena. Human Factor Evaluation (1)

A mixed approach to Human Factor evaluation of ship's bridge equipment: usability and educational evaluation of ship bridge interactive systems; neuroscience tools (gaze tracking and speech recording) for measuring users' emotional responses; usability testing. Human Factor Evaluation (2)

Human Factors Evaluation (3) Neuroscience tools

Human Factors evaluation in ship manipulation systems design (interactive technologies): ergonomics, with few applications in industry so far; cognitive ergonomics. Human Factor Evaluation (4)

Usability has been defined by ISO 9241 as “the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use” Human Factor Evaluation (5)

Effectiveness means the accuracy and completeness with which users achieve specified goals. Efficiency means the resources expended in relation to the accuracy and completeness with which users achieve goals. Satisfaction means freedom from discomfort and positive attitudes towards the use of the product. Human Factor Evaluation (6)
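As an illustration, the three ISO 9241 components can be operationalised as simple ratios; a minimal sketch in Python (the function names, task counts and timings are hypothetical, not taken from the study):

```python
def effectiveness(goals_achieved: int, goals_specified: int) -> float:
    """Accuracy/completeness: share of specified goals the user achieved."""
    return goals_achieved / goals_specified

def efficiency(goals_achieved: int, minutes_spent: float) -> float:
    """Resources expended relative to achievement: here, goals per minute."""
    return goals_achieved / minutes_spent

# Hypothetical trainee session: 8 of 10 scenario goals achieved in 20 minutes.
print(effectiveness(8, 10))  # 0.8
print(efficiency(8, 20.0))   # 0.4
```

Satisfaction, the third component, is not a ratio of this kind; it is captured by questionnaires and interviews, as in the methodology below.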

RESEARCH METHODOLOGY

Defining satisfaction involves the following parameters, which are investigated here: the software and the educational scenarios; and system usability, both of the system per se (total functionality) and of the individual training and technical characteristics that complete the teaching act. Research Methodology (1)

The research questions (RQ) set by the suggested research framework are as follows: RQ-1: Is there a relationship between the user's visual attention and the user's satisfaction, either with the software or with the scenario? RQ-2: Is there a relationship between the training characteristics, user satisfaction and, by extension, visual attention? Research Methodology (2)

The suggested research method aims at interpreting, determining and evaluating the data of the biometric tool, in combination with the results of the conventional (qualitative and quantitative) methods, based on the factors (relationships) that may influence the user's satisfaction. Research Methodology (3)

Research Methodology (4): Measurements (software, scenario); quantitative data (questionnaires); qualitative data (interviews); relationships between parameters/factors; interpretation procedure

CASE STUDY

The case study aims at the following: evaluation of user satisfaction with the ECDIS software and scenario, and educational evaluation of ECDIS from the user's point of view (opinions). Case study (1)

First (random) sampling (January 2012 to May 2012), in the Information Technologies Laboratory of the National Marine Training Centre of Piraeus; 31 marine officers; video recording of ~23 minutes per student. Case study (2)

Case study (3): Experiment setup: ECDIS operation, ECDIS lab room, eye tracker, "Face Analysis" software station

Stage 1: Information about the experiment; presentation of the acceptance document to the user-trainee (estimated duration: minutes)
Stage 2: Completion of a user profile and of the assessment survey on educational and technical characteristics (questionnaire, T3) by the trainee (estimated duration: minutes)
Stage 3: Equipment installation (gaze-tracking device) and parameter configuration (T1)
Stage 4: Video recording (T1) combined with the researcher filling in a work sheet (T3) (estimated duration: minutes)
Stage 5: Completion of the process (device disconnection) through a semi-structured interview and questionnaire (T2, T3, T4) with the user (estimated at 5 – 10 minutes)
Case study (4)

Tool-1 (T1): optical data registration, conducted by the "Face Analysis" software together with a web camera mounted on the computer running the subject of the research (ECDIS)
Tool-2 (T2): a microphone for voice recording (interview)
Tool-3 (T3): questionnaires used for opinions/attitudes/expectations/self-evaluation and observation
Tool-4 (T4): the SUS usability assessment tool
Case study (5)
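The SUS tool (T4) yields a 0–100 usability score from ten 1–5 Likert items; the standard scoring rule (odd-numbered items contribute r − 1, even-numbered items 5 − r, and the sum is scaled by 2.5) can be sketched as:

```python
def sus_score(responses):
    """System Usability Scale score (0-100) from ten Likert responses (1-5).

    Odd-numbered items contribute (r - 1), even-numbered items (5 - r);
    the summed contributions are scaled by 2.5.
    """
    if len(responses) != 10:
        raise ValueError("SUS needs exactly 10 item responses")
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

# Example: strongly positive odd items, strongly negative even items
# (the best possible pattern, since even items are negatively worded).
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
```

A neutral respondent (all 3s) scores 50.0, which is why SUS scores are usually read against a benchmark rather than as percentages.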

Case study (6): Biometric tool parameter interpretation ("Face Analysis"):
Eye gaze vector, horizontal: values > 0 mean gaze out of the screen
Eye gaze vector, vertical: values → -18 mean viewing the centre of the screen
Quality parameter (eye-gaze tracking): ~0 attention on the screen; ~1 and > 1 no attention
Distance from monitor: > 1 close to the screen; < 1 away from the screen
Head roll angle (HR): values > 10 degrees, high mobility; values < 10 degrees, attention (depending on the scenario)
(Figure: eye and head pose diagram showing Eye Level EL, Horizontal Level HL, and head roll angle HR between them)
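A rough sketch of how these threshold interpretations could be applied per recorded frame (the function and labels are hypothetical; the thresholds are the ones quoted on the slide):

```python
def classify_frame(gaze_h: float, quality: float, head_roll_deg: float) -> str:
    """Map 'Face Analysis'-style parameters to a coarse attention label,
    using the threshold values quoted on the slide."""
    if gaze_h > 0:            # horizontal gaze value > 0: gaze out of screen
        return "out of screen"
    if quality >= 1:          # quality parameter ~1 or above: no attention
        return "no attention"
    if head_roll_deg > 10:    # head roll above 10 degrees: high mobility
        return "high mobility"
    return "attention on screen"

print(classify_frame(-3.0, 0.1, 4.0))  # attention on screen
```

Aggregating such labels over the ~23-minute recordings would give the per-student gaze parameter used in the correlations below.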

The experimental data come from three sources (analysed with SPSS and Excel): questionnaires (including the SUS tool), optical data (gaze tracking), and interviews (voice recordings). Case study (7)

The results show: a relationship between the gaze parameter and users' usability assessments; the gaze parameter depends on the SUS score, indicating that attention increases with the assessment of the ECDIS software (RQ-1); a relationship between the SUS score and the training characteristics (total assessment, time schedule), and a strong relationship between ECDIS satisfaction and scenario satisfaction (RQ-2), suggesting that scenario operation depends on the software environment (navigation, interface); high usability for the ECDIS software (questionnaire evaluation, SUS tool) and a high score for the training programme evaluation (National Marine Training Centre of Piraeus). Case study (8)

Case study (9): Sample's structure
Marine Officers        Male (29)     Female (2)
Age … years            16 (51.6%)    2 (6.45%)
Age … years            6 (19.35%)    0
Age >45 years          7 (22.5%)     0
Officer rank A'        13 (41.9%)    0
Officer rank B'        4 (12.9%)     0
Officer rank C'        12 (38.7%)    2 (6.45%)
Eye diseases           10 (32.2%)    0
Eye operation          1 (3.2%)      0

Case study (10): ECDIS – Training program evaluation (n = 31)
Columns: Training Program (Total Assessment, Time Schedule, Educational Goals); ECDIS (Navigation, Interface, Multimedia)
Very high: 15 (48.3%), 18 (58.06%), 19 (61.25%), 15 (51.6%), 12 (38.7%)
High: 7 (22.6%), 1 (3.22%), 9 (19.35%), 13 (41.9%), 16 (51.6%)
Medium: 6 (19.35%), 4 (12.9%), 2 (6.45%), 7 (19.35%), 6 (19.35%), 3 (9.6%)
Low: 3 (9.6%), 2 (6.45%), 0, 0, 0, 0
Total: 31 (100%)

Case study (11): Correlations between variables of the research tools (Spearman's rho, sig. 2-tailed; all correlations positive)
1. SUS score – ECDIS Satisfaction
2. SUS score – Interface
3. SUS score – Gaze
4. ECDIS satisfaction – Total Assessment
5. ECDIS satisfaction – Time Schedule
6. ECDIS satisfaction – Navigation
7. ECDIS satisfaction – Interface
8. ECDIS satisfaction – Scenario Satisfaction
9. Scenario Satisfaction – Navigation
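Spearman's rho, the statistic behind these correlations, is the Pearson correlation computed on the ranks of the data. A self-contained sketch (the sample SUS and satisfaction vectors are hypothetical, not the study's data):

```python
def _ranks(values):
    """Average ranks (1-based), assigning tied values their mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        mean_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = mean_rank
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman's rho: Pearson correlation of the rank vectors."""
    rx, ry = _ranks(x), _ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical SUS scores vs. ECDIS satisfaction ratings for five trainees:
print(round(spearman_rho([70, 85, 60, 90, 75], [3, 5, 2, 5, 4]), 3))  # 0.975
```

In practice the same value comes from `scipy.stats.spearmanr`, which additionally returns the 2-tailed significance reported in the table.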

Thank you