Research Part 3. Interface: Baseline, Alternate Independent, Cognitively Engineered. Problem Scenario: Ownship, Firing. Question Complexity: Low, High. Cognitive Design Principles.


Research Part 3

Interface: Baseline | Alternate Independent | Cognitively Engineered
Problem Scenario: Ownship | Firing
Question Complexity: Low | High

Cognitive Design Principles Used:
1. Automation of Unwanted Workload
2. Use of Alert Messages & Color Coding
3. Appropriate Data Grouping
4. Use of Metaphor
5. Display Name Related to Function
6. Consistent Meaningful Grouping
7. Use of Status Indicators
8. Necessary Information Only
9. Judicious Redundancy
10. Multiple Coding of Data to Promote Cognitive Flexibility

Scoring (summed into a Condition Total and an Interface Total):
0 = Absence of Cognitive Design Principle (CDP)
1 = Partial Presence of CDP
2 = Full Presence of CDP
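Read as a score sheet, the rubric rates each of the ten principles per interface on the 0–2 scale above. A minimal sketch of how the interface totals fall out, with placeholder ratings rather than the study's actual scores:

```python
# Sketch of the CDP scoring rubric: each principle is rated
# 0 (absent), 1 (partial presence), or 2 (full presence) per interface.
# The ratings below are placeholders, NOT the study's actual scores.

PRINCIPLES = [
    "Automation of Unwanted Workload",
    "Use of Alert Messages & Color Coding",
    "Appropriate Data Grouping",
    "Use of Metaphor",
    "Display Name Related to Function",
    "Consistent Meaningful Grouping",
    "Use of Status Indicators",
    "Necessary Information Only",
    "Judicious Redundancy",
    "Multiple Coding of Data to Promote Cognitive Flexibility",
]

ratings = {  # one placeholder score per principle, in order
    "Baseline":               [0, 1, 0, 0, 1, 0, 1, 0, 0, 0],
    "Alternate Independent":  [1, 1, 1, 0, 1, 1, 1, 1, 0, 1],
    "Cognitively Engineered": [2, 2, 2, 2, 2, 2, 2, 2, 2, 2],
}

# Interface total = sum of the ten principle scores (maximum 2 x 10 = 20).
for interface, scores in ratings.items():
    assert len(scores) == len(PRINCIPLES)
    print(f"{interface}: total = {sum(scores)} / {2 * len(PRINCIPLES)}")
```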

Research Model
Problem Domain: Anti-Submarine Warfare

Independent Variables:
1. Interface (three levels): 1.1 Baseline; 1.2 Cognitively engineered; 1.3 Alternate independent
2. Problem (two levels): 2.1 Ownship; 2.2 Firing
3. Question type (two levels): 3.1 Yes/No; 3.2 Multiple Choice

Intervening Variable: Participants' Cognitive Processes

Control Variables: 1. Environment; 2. Equipment; 3. Experimenter; 4. Instrumentation

Dependent Variables:
1. Objective: 1.1 Reaction Time; 1.2 Accuracy
2. Subjective: 2.1 Workload; 2.2 Preference
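The independent variables cross into a 3 × 2 × 2 factorial structure. As an illustration only, a short Python sketch enumerating the twelve resulting conditions:

```python
from itertools import product

# The three independent variables and their levels, as listed above.
interfaces = ["Baseline", "Cognitively Engineered", "Alternate Independent"]
problems = ["Ownship", "Firing"]
question_types = ["Yes/No", "Multiple Choice"]

# Crossing the levels yields 3 x 2 x 2 = 12 experimental conditions;
# each condition yields the objective measures (reaction time, accuracy)
# and the subjective measures (workload, preference).
conditions = list(product(interfaces, problems, question_types))
for i, (interface, problem, qtype) in enumerate(conditions, start=1):
    print(f"{i:2d}. {interface} / {problem} / {qtype}")
```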

Experimental Schedule
- Oct. 15 – Meet with students to introduce the domain (submarines) and the workload tools (approximately one and a half hours).
- Oct. 22 – Day 1 of experiment
- Oct. 29 – Day 2 of experiment
- Nov. 5 – Day 3 of experiment

Six different groups of students will participate in the experiment at the following times:

Grp.   Day 1 (Oct. 22)   Day 2 (Oct. 29)   Day 3 (Nov. 5)
1      4:30              4:30              4:30
2      6:00              6:00              6:00
3      7:30              7:30              7:30
4      4:30              6:00              7:30
5      6:00              7:30              4:30
6      7:30              4:30              6:00

Experimental Design
The introduction is the same treatment for all groups, given in class. Each module then consists of a treatment followed by an observation (O, subjective and objective measures).

Group   Module 1   Module 2   Module 3
1       A, O-1     B, O-2     C, O-3
2       A, O-1     C, O-2     B, O-3
3       B, O-1     A, O-2     C, O-3
4       B, O-1     C, O-2     A, O-3
5       C, O-1     A, O-2     B, O-3
6       C, O-1     B, O-2     A, O-3

Codes: A = Cognitively Engineered Interface; B = Baseline Interface; C = Alternate Independent Interface; O = Observations (subjective and objective measures)
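The assignment is a full counterbalance: the six groups cover all six orderings of the three interfaces, so each interface appears in each module position exactly twice. A brief sketch that regenerates and checks that property (a plausible reconstruction, not the authors' tooling):

```python
from itertools import permutations

INTERFACES = {"A": "Cognitively Engineered",
              "B": "Baseline",
              "C": "Alternate Independent"}

# All six orderings of A, B, C, in the group order used above.
orders = sorted(permutations("ABC"))

for group, order in enumerate(orders, start=1):
    schedule = "  ".join(f"Module {m + 1}: {code}, O-{m + 1}"
                         for m, code in enumerate(order))
    print(f"Group {group}:  {schedule}")

# Counterbalancing check: each interface occupies each module
# position exactly twice across the six groups.
for position in range(3):
    counts = {code: sum(1 for o in orders if o[position] == code)
              for code in "ABC"}
    assert all(count == 2 for count in counts.values()), counts
```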

Trial Events: Ownship Task

Yes/No question:
1. Press Return to start.
2. The ownship display picture appears with the question "Is Ownship Ready?" (Yes/No).
3. Press Return when you know the answer.
4. Enter Y for Yes or N for No and press Return.

Multiple-choice question:
1. Press Return to start.
2. The ownship display picture appears with the question "Why Isn't Ownship Ready?"
3. Press Return when you know the answer.
4. Enter a number indicating your answer and press Return.

A new trial then begins.
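As an illustration of this flow, a minimal console stand-in for one Yes/No ownship trial, with reaction time taken from display onset to the "know the answer" keypress; the function and variable names are invented for the sketch:

```python
import time

def run_ownship_trial(display: str, correct_answer: str):
    """One Yes/No ownship trial: show the display, time the answer."""
    input("Press Return to start")
    print(display)                    # the ownship display picture
    print("Is Ownship Ready?")
    start = time.monotonic()
    input("Press Return when you know the answer")
    reaction_time = time.monotonic() - start          # objective measure 1
    answer = input("Enter Y for Yes or N for No: ").strip().upper()
    accuracy = answer == correct_answer               # objective measure 2
    return reaction_time, accuracy

# Example: one trial whose correct answer is "No" (ownship not ready).
rt, correct = run_ownship_trial("[ownship display]", correct_answer="N")
print(f"Reaction time = {rt:.2f} s, correct = {correct}")
```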

Task Load Index (NASA-TLX)
- Developed at NASA Ames Research Center
- A subjective, multi-dimensional rating procedure
- Provides an overall workload score

Rating Scale Description: MENTAL DEMAND: How much mental and perceptual activity was required (e.g. thinking, deciding, calculating, remembering, looking, searching, etc.)? Was the task easy or demanding, simple or complex, exacting or forgiving?

Rating Scale Description: TEMPORAL DEMAND: How much time pressure did you feel due to the rate or pace at which the task elements occurred? Was the pace slow and leisurely or rapid and frantic?

Rating Scale Description: OWN PERFORMANCE: How successful do you think you were in accomplishing the goals of the task(s)? How satisfied were you with your performance in accomplishing these goals?

Rating Scale Description: EFFORT: How hard did you have to work (mentally and physically) to accomplish your level of performance?
FRUSTRATION: How insecure, discouraged, irritated, stressed and annoyed versus secure, gratified, content, relaxed and complacent did you feel during the task?

TLX Rating Scales
Task or Mission Segment: Please rate the task or mission segment by putting one "X" on each of the six scales at the point which matches your experience. Point the mouse at the location where you wish to place the "X".

Mental Demand:   Low – High
Temporal Demand: Low – High
Performance:     Good – Poor
Effort:          Low – High
Frustration:     Low – High

Click the mouse when finished.
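For reference, a sketch of how an overall workload score could be computed from these ratings. It uses the raw-TLX convention (unweighted mean of 0–100 subscale ratings) over the five scales shown here; the full NASA-TLX also includes a Physical Demand scale and weights each scale by 15 pairwise comparisons:

```python
# Raw-TLX sketch: each scale is rated 0-100 (Low to High; Performance
# runs Good to Poor), and the overall workload score is the unweighted
# mean of the subscale ratings. The full NASA-TLX instead weights each
# scale by how often it is chosen in 15 pairwise comparisons.

SCALES = ["Mental Demand", "Temporal Demand", "Performance",
          "Effort", "Frustration"]

def overall_workload(ratings: dict) -> float:
    """Unweighted (raw-TLX) overall workload score."""
    return sum(ratings[scale] for scale in SCALES) / len(SCALES)

# Placeholder ratings for one participant on one condition.
example = {"Mental Demand": 70, "Temporal Demand": 55,
           "Performance": 30, "Effort": 65, "Frustration": 40}
print(f"Overall workload: {overall_workload(example):.1f}")
```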

Preference Question: “Rank the three interfaces (by letter code) in their order of quality, with quality defined as how well you could perform the task within each interface.” (A = cognitively engineered, B = baseline, C = alternate independent)
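One plausible way to summarize the resulting rankings (the slides do not specify the analysis) is a mean rank per interface; the data below are placeholders:

```python
# Each participant ranks the interfaces 1 (best) to 3 (worst).
# The rankings below are placeholders, not collected data.
rankings = [
    {"A": 1, "B": 3, "C": 2},   # participant 1
    {"A": 1, "B": 2, "C": 3},   # participant 2
    {"A": 2, "B": 3, "C": 1},   # participant 3
]

labels = {"A": "cognitively engineered",
          "B": "baseline",
          "C": "alternate independent"}

# Lower mean rank = more preferred.
for code, name in labels.items():
    mean_rank = sum(r[code] for r in rankings) / len(rankings)
    print(f"{code} ({name}): mean rank = {mean_rank:.2f}")
```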